Science.gov

Sample records for advanced statistical tools

  1. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2010-12-01

later in this section. 2) San Luis Obispo. Extracted features were also provided for MTADS EM61, MTADS magnetics, EM61 cart, and TEMTADS data sets from...subsequent training of statistical classifiers using these features. Results of discrimination studies at Camp Sibert and San Luis Obispo have shown...Comparison of classification performance: Figures 10 through 13 show receiver operating characteristics for data sets acquired at San Luis Obispo. Subplot

  2. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2011-09-01

    AD_________________ Award Number: W81XWH-10-1-0870 TITLE: Advanced Prosthetic Gait Training Tool...Advanced Prosthetic Gait Training Tool 5b. GRANT NUMBER W81XWH-10-1-0870 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER Rajankumar...produce a computer-based Advanced Prosthetic Gait Training Tool to aid in the training of clinicians at military treatment facilities providing care for

  3. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2015-12-01

study is to produce a computer-based Advanced Prosthetic Gait Training Tool to aid in the training of clinicians at military treatment facilities...providing care for wounded service members. In Phase I of the effort, significant work was completed at the University of Iowa Center for Computer-Aided...Gait Training Tool Introduction The objective of our study is to produce a computer-based Advanced Prosthetic Gait Training Tool (APGTT) to aid in

  4. Advanced Welding Tool

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere, including the airless environment of space. The developers chose induction (magnetic) heating because it avoids causing deformation and can be used with almost any type of thermoplastic material. An induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When the welder is energized, alternating current produces inductive heating on the screen, causing the adjacent plastic surfaces to melt and flow into the mesh, creating a bond over the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on operation of the Induction Toroid Welder to fix canoes. Whitewater canoeing poses the problem of frequent gashes that are difficult to repair, mainly because many canoes are made of plastics. The commercial Induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic melting screen. The welder has a broad range of applications in the automobile, appliance, aerospace, and construction industries.

  5. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution From Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will perform an ANOVA to check its significance.
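
    A minimal sketch (not the Excel toolset itself, and with hypothetical data values) of two of the calculations described above: descriptive statistics for a user-entered data set, and the normal-distribution estimate that returns the value corresponding to a given cumulative probability from a sample mean and standard deviation.

```python
# Hedged sketch of two calculations from the description above; the data are hypothetical.
import numpy as np
from scipy import stats

x = np.array([4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.6])  # user-entered data set x(i)

# Descriptive statistics
desc = {
    "n": x.size,
    "mean": x.mean(),
    "std": x.std(ddof=1),   # sample standard deviation
    "median": np.median(x),
    "min": x.min(),
    "max": x.max(),
}

# Normal Distribution Estimates: the value whose cumulative probability is p,
# given the sample mean and standard deviation of the normal distribution.
p = 0.95
value_at_p = stats.norm.ppf(p, loc=desc["mean"], scale=desc["std"])

print(desc)
print(f"Value at cumulative probability {p}: {value_at_p:.3f}")
```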

  6. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  7. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  8. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay
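
    The suite described above combines wavelet spectra with artificial neural networks, which is not reproduced here; the sketch below only illustrates, on synthetic data, the preliminary step of estimating wave power in the Pc3 band (22-100 mHz) from a 1 Hz magnetometer series using a standard spectral estimate.

```python
# Illustrative sketch only: Pc3-band (22-100 mHz) power from a 1 Hz time series.
# The signal below is synthetic (a 40 mHz wave plus noise), not mission data.
import numpy as np
from scipy.signal import welch

fs = 1.0                                 # sampling rate, Hz (assumed)
t = np.arange(0, 3600, 1 / fs)           # one hour of data
rng = np.random.default_rng(0)
b = 0.5 * np.sin(2 * np.pi * 0.04 * t) + rng.normal(0, 0.2, t.size)

f, psd = welch(b, fs=fs, nperseg=1024)   # power spectral density estimate
pc3 = (f >= 0.022) & (f <= 0.100)        # Pc3 band, 22-100 mHz
pc3_power = np.trapz(psd[pc3], f[pc3])   # integrated band power
print(f"Pc3 band power: {pc3_power:.4f}")
```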

  9. STATWIZ - AN ELECTRONIC STATISTICAL TOOL (ABSTRACT)

    EPA Science Inventory

    StatWiz is a web-based, interactive, and dynamic statistical tool for researchers. It will allow researchers to input information and/or data and then receive experimental design options, or outputs from data analysis. StatWiz is envisioned as an expert system that will walk rese...

  10. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  11. Advanced Human Factors Engineering Tool Technologies.

    DTIC Science & Technology

    1987-03-20

identified the types of tools they would like to see developed to fill the existing technology gaps. The advanced tools were categorized using an...the prototype phase of development were considered candidates for inclusion. The advanced tools were next categorized using an eight point...role, application, status and cost. Decision criteria were then developed as the basis for the tradeoff process to aid in tool selection. To

  12. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity using grey-scale binning of the SEM image. The backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
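
    As an illustration of the Newton-Raphson iterations mentioned above, the sketch below solves a deliberately simplified energy balance for a flame temperature. The heat release, product mass, and heat-capacity values are hypothetical and stand in for the spreadsheets' full thermodynamic property calculations.

```python
# Hedged sketch of a Newton-Raphson iteration applied to a simplified energy balance:
# find the temperature T at which the sensible heat of the products equals the heat
# released by the fuel. All numbers below are hypothetical.
def energy_balance(T, T_ref=298.0, q_release=2.0e6, m_products=1.2, cp=1.15e3):
    """Residual (J) of the balance m*cp*(T - T_ref) - q_release = 0."""
    return m_products * cp * (T - T_ref) - q_release

def d_energy_balance(T, m_products=1.2, cp=1.15e3):
    """Analytical derivative of the residual with respect to T."""
    return m_products * cp

def newton_raphson(f, dfdx, x0, tol=1e-6, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

T_flame = newton_raphson(energy_balance, d_energy_balance, x0=1000.0)
print(f"Simplified adiabatic flame temperature: {T_flame:.1f} K")
```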

  13. Recent Advances in Algal Genetic Tool Development

    SciTech Connect

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  14. Rapid medical advances challenge the tooling industry.

    PubMed

    Conley, B

    2008-01-01

    The requirement for greater performance in smaller spaces has increased demands for product and process innovation in tubing and other medical products. In turn, these developments have placed greater demands on the producers of the advanced tooling for these products. Tooling manufacturers must now continuously design equipment with much tighter tolerances for more sophisticated coextrusions and for newer generations of multilumen and multilayer tubing.

  15. Using Tree Diagrams as an Assessment Tool in Statistics Education

    ERIC Educational Resources Information Center

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed in statistics, despite their importance. This article first presents the rationale and method…

  16. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  17. Deterministic and Advanced Statistical Modeling of Wind-Driven Sea

    DTIC Science & Technology

    2015-07-06

    Period covered: 01/09/2010-06/07/2015. Title: Deterministic and Advanced Statistical Modeling of Wind-Driven Sea...Technical Report: Deterministic and advanced statistical modeling of wind-driven sea, Vladimir Zakharov, Andrei Pushkarev, Waves and Solitons LLC, 1719 W...Development of accurate and fast advanced statistical and dynamical nonlinear models of ocean surface waves, based on first physical principles, which will

  18. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  19. Advanced genetic tools for plant biotechnology.

    PubMed

    Liu, Wusheng; Yuan, Joshua S; Stewart, C Neal

    2013-11-01

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  20. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  1. Data Torturing and the Misuse of Statistical Tools

    SciTech Connect

    Abate, Marcey L.

    1999-08-16

    Statistical concepts, methods, and tools are often used in the implementation of statistical thinking. Unfortunately, statistical tools are all too often misused by not applying them in the context of statistical thinking that focuses on processes, variation, and data. The consequences of this misuse may be "data torturing," or going beyond reasonable interpretation of the facts due to a misunderstanding of the processes creating the data or the misinterpretation of variability in the data. In the hope of averting future misuse and data torturing, examples are provided where the application of common statistical tools, in the absence of statistical thinking, provides deceptive results by not adequately representing the underlying process and variability. For each of the examples, a discussion is provided on how applying the concepts of statistical thinking may have prevented the data torturing. The lessons learned from these examples will provide an increased awareness of the potential for many statistical methods to mislead and a better understanding of how statistical thinking broadens and increases the effectiveness of statistical tools.
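
    The sketch below is not taken from the report; it simply reproduces the flavor of one classic form of data torturing: screening many unrelated variables against an outcome, where roughly 5% will appear "significant" at the 0.05 level by chance alone.

```python
# Small illustration of how spurious findings arise from unguided screening:
# 200 factors that are truly unrelated to the outcome are tested one by one.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_obs, n_factors = 50, 200
outcome = rng.normal(size=n_obs)
factors = rng.normal(size=(n_factors, n_obs))   # all unrelated to the outcome

p_values = []
for factor in factors:
    _, p = stats.pearsonr(factor, outcome)
    p_values.append(p)

false_hits = sum(p < 0.05 for p in p_values)
print(f"{false_hits} of {n_factors} unrelated factors appear 'significant' at p < 0.05")
```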

  2. A Hierarchical Statistic Methodology for Advanced Memory System Evaluation

    SciTech Connect

    Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.

    1999-04-12

    Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed. The performance of these advanced memory systems, however, varies with applications and problem sizes. How to reach an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it also identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and parallel computer systems is possible and should be further explored.

  3. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  4. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  5. Statistical spectroscopic tools for biomarker discovery and systems medicine.

    PubMed

    Robinette, Steven L; Lindon, John C; Nicholson, Jeremy K

    2013-06-04

    Metabolic profiling based on comparative, statistical analysis of NMR spectroscopic and mass spectrometric data from complex biological samples has contributed to increased understanding of the role of small molecules in affecting and indicating biological processes. To enable this research, the development of statistical spectroscopy has been marked by early beginnings in applying pattern recognition to nuclear magnetic resonance data and the introduction of statistical total correlation spectroscopy (STOCSY) as a tool for biomarker identification in the past decade. Extensions of statistical spectroscopy now compose a family of related tools used for compound identification, data preprocessing, and metabolic pathway analysis. In this Perspective, we review the theory and current state of research in statistical spectroscopy and discuss the growing applications of these tools to medicine and systems biology. We also provide perspectives on how recent institutional initiatives are providing new platforms for the development and application of statistical spectroscopy tools and driving the development of integrated "systems medicine" approaches in which clinical decision making is supported by statistical and computational analysis of metabolic, phenotypic, and physiological data.
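
    A minimal sketch of the core STOCSY idea on synthetic spectra: correlate the intensity at one "driver" spectral position against every other position across samples, so that peaks belonging to the same molecule emerge as highly correlated. The peak positions, data, and simplifications below are stand-ins, not the published method.

```python
# Hedged STOCSY-style sketch: correlation of a driver spectral variable against
# all other variables across a set of synthetic 1D spectra.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_points = 40, 500
concentration = rng.uniform(0.5, 2.0, n_samples)       # hidden metabolite level

spectra = rng.normal(0, 0.05, (n_samples, n_points))    # baseline noise
for pos in (120, 310):                                   # two peaks of the same metabolite
    spectra[:, pos] += concentration

driver = 120                                             # index of the driver peak
d = spectra[:, driver]
# Pearson correlation of the driver column with every spectral column
corr = ((spectra - spectra.mean(0)) * (d - d.mean())[:, None]).sum(0) / (
    spectra.std(0) * d.std() * n_samples
)
print("Most correlated positions:", np.argsort(corr)[-3:])  # expect 120 and 310 near the top
```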

  6. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  7. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of the following factors are included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  8. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  9. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary formulation of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  10. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are becoming more common; existing methods are limited, and more systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced using the SPCT method, plotting results for three successful projects and three failed projects, are reviewed, with success and failure being defined by the owner.

  11. Novel statistical tools for monitoring the safety of marketed drugs.

    PubMed

    Almenoff, J S; Pattishall, E N; Gibbs, T G; DuMouchel, W; Evans, S J W; Yuen, N

    2007-08-01

    Robust tools for monitoring the safety of marketed therapeutic products are of paramount importance to public health. In recent years, innovative statistical approaches have been developed to screen large post-marketing safety databases for adverse events (AEs) that occur with disproportionate frequency. These methods, known variously as quantitative signal detection, disproportionality analysis, or safety data mining, facilitate the identification of new safety issues or possible harmful effects of a product. In this article, we describe the statistical concepts behind these methods, as well as their practical application to monitoring the safety of pharmaceutical products using spontaneous AE reports. We also provide examples of how these tools can be used to identify novel drug interactions and demographic risk factors for adverse drug reactions. Challenges, controversies, and frontiers for future research are discussed.
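
    As one concrete example of a disproportionality statistic, the sketch below computes a proportional reporting ratio (PRR) and an approximate confidence interval from a hypothetical 2x2 table of spontaneous reports. The counts are invented, and the paper's actual methods and databases are not reproduced.

```python
# Illustrative PRR calculation for a drug-event pair from a 2x2 table of report counts.
import math

# Hypothetical report counts
a = 25      # reports with the drug of interest AND the event of interest
b = 975     # reports with the drug, other events
c = 300     # reports with other drugs AND the event
d = 99700   # reports with other drugs, other events

prr = (a / (a + b)) / (c / (c + d))

# Approximate 95% confidence interval on the log scale
se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
lower = math.exp(math.log(prr) - 1.96 * se_log)
upper = math.exp(math.log(prr) + 1.96 * se_log)
print(f"PRR = {prr:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```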

  12. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine if cryogenic treatment improved the life and cost effectiveness of perishable cutting tools over other treatments or coatings. Test results showed that in five of seven of the perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as when switching manufacturers of the cutting tool. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool is improperly heat treated during its origination, and (3) cryogenic treatment was only effective on those tools made from less sophisticated high speed tool steels. As a part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  13. Advanced Human Factors Engineering Tool Technologies.

    DTIC Science & Technology

    1988-03-01

representing the government, the military, academe, and private industry were surveyed to identify those tools that are most frequently used or viewed...tools by HFE researchers and practitioners within the academic, industrial, and military settings...E. Human Factors Engineering Tools Questionnaire...F. Listing of Industry, Government, and Academe

  14. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  15. Surface evaluation by estimation of fractal dimension and statistical tools.

    PubMed

    Hotar, Vlastimil; Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the structured data complexity and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and with software modification is able to analyse data in a form of sequence (signals, surface roughness), 2D images, and dividing lines. The methodology had not been tested for a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. The paper intends to analyse data measured by a surface roughness tester. The methodology shown compares standard and nonstandard parameters, searches the optimal parameters for a complete analysis, and specifies the sensitivity to directionality of samples for these types of surfaces. The text presents application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (statistical tool).
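
    A rough sketch, under assumptions not taken from the paper, of one common way to estimate a fractal dimension of a roughness profile: box counting on the profile curve embedded in a unit square, with the dimension taken as the slope of a log-log fit. The profile below is synthetic.

```python
# Assumed box-counting approach for a 1D roughness profile (not the authors' code).
import numpy as np

rng = np.random.default_rng(7)
profile = np.cumsum(rng.normal(size=4096))           # synthetic rough profile

# Normalize x and y into the unit square
x = np.linspace(0, 1, profile.size, endpoint=False)
y = (profile - profile.min()) / (profile.max() - profile.min() + 1e-12)

sizes = [2 ** k for k in range(2, 9)]                 # boxes per side: 4 ... 256
counts = []
for n in sizes:
    ix = np.minimum((x * n).astype(int), n - 1)
    iy = np.minimum((y * n).astype(int), n - 1)
    counts.append(len(set(zip(ix.tolist(), iy.tolist()))))  # occupied boxes

# N(boxes) ~ n^D, so D is the slope of log(count) vs log(boxes per side)
slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print(f"Estimated box-counting dimension: {slope:.2f}")   # between 1 and 2 for a rough curve
```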

  16. Surface Evaluation by Estimation of Fractal Dimension and Statistical Tools

    PubMed Central

    Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the structured data complexity and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and with software modification is able to analyse data in a form of sequence (signals, surface roughness), 2D images, and dividing lines. The methodology had not been tested for a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. The paper intends to analyse data measured by a surface roughness tester. The methodology shown compares standard and nonstandard parameters, searches the optimal parameters for a complete analysis, and specifies the sensitivity to directionality of samples for these types of surfaces. The text presents application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (statistical tool). PMID:25250380

  17. New statistical tools for analyzing the structure of animal groups.

    PubMed

    Cavagna, Andrea; Cimarelli, Alessio; Giardina, Irene; Orlandi, Alberto; Parisi, Giorgio; Procaccini, Andrea; Santagati, Raffaele; Stefanini, Fabio

    2008-01-01

    The statistical characterization of the spatial structure of large animal groups has been very limited so far, mainly due to a lack of empirical data, especially in three dimensions (3D). Here we focus on the case of large flocks of starlings (Sturnus vulgaris) in the field. We reconstruct the 3D positions of individual birds within flocks of up to a few thousand elements; in this respect our data constitute a unique set. We perform a statistical analysis of flocks' structure by using two quantities that are new to the field of collective animal behaviour, namely the conditional density and the pair correlation function. These tools were originally developed in the context of condensed matter theory. We explain the meaning of these two quantities, how to measure them in a reliable way, and why they are useful in assessing the density fluctuations and the statistical correlations across the group. We show that the border-to-centre density gradient displayed by starling flocks gives rise to an anomalous behaviour of the conditional density. We also find that the pair correlation function has a structure incompatible with a crystalline arrangement of birds. In fact, our results suggest that flocks are somewhat intermediate between the liquid and the gas phase of physical systems.
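
    A schematic sketch of the pair correlation function g(r) for a 3D point set, the condensed-matter tool referred to above. Edge effects are ignored for brevity and the point positions are synthetic, so this is not the authors' estimator or data.

```python
# Simplified pair correlation function g(r) for points in a cubic volume.
import numpy as np

rng = np.random.default_rng(3)
L = 10.0                                    # side of a cubic volume (arbitrary units)
points = rng.uniform(0, L, size=(500, 3))   # synthetic "bird" positions

# All pairwise distances (upper triangle only)
diff = points[:, None, :] - points[None, :, :]
dist = np.sqrt((diff ** 2).sum(-1))
iu = np.triu_indices(len(points), k=1)
r = dist[iu]

edges = np.linspace(0.1, L / 2, 40)
counts, edges = np.histogram(r, bins=edges)
shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
density = len(points) / L ** 3

# g(r) = observed pair count per shell / expected count for a uniform (ideal-gas) set
expected = len(points) * density * shell_vol / 2.0
g = counts / expected
r_mid = 0.5 * (edges[1:] + edges[:-1])
print(np.round(g[:5], 2))   # ~1 for a uniform set; structure appears as peaks or dips
```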

  18. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467

  19. Advanced REACH Tool: A Bayesian Model for Occupational Exposure Assessment

    PubMed Central

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W.; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
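
    A very simplified, hypothetical sketch of the Bayesian idea behind the ART: a prior for the log geometric-mean exposure (standing in here for the calibrated mechanistic model) is updated with context-specific measurements, and the credible interval narrows as data accumulate. The ART's actual multilevel model is far richer than this.

```python
# Conjugate normal-normal update on the log scale; all values are hypothetical.
import numpy as np

mu0, tau0 = np.log(0.5), 0.8   # prior mean and SD of mu = log geometric-mean exposure (mg/m^3)
sigma = 1.0                    # assumed known log-scale measurement SD

measurements = np.array([0.9, 0.6, 1.4, 0.8])   # measured exposures, mg/m^3 (hypothetical)
y = np.log(measurements)
n = y.size

tau_post = 1.0 / np.sqrt(1 / tau0 ** 2 + n / sigma ** 2)
mu_post = tau_post ** 2 * (mu0 / tau0 ** 2 + y.sum() / sigma ** 2)

gm = np.exp(mu_post)
ci = np.exp(mu_post + np.array([-1.96, 1.96]) * tau_post)
print(f"Posterior geometric mean: {gm:.2f} mg/m^3 "
      f"(95% credible interval {ci[0]:.2f}-{ci[1]:.2f})")
```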

  20. Statistical tools for prognostics and health management of complex systems

    SciTech Connect

    Collins, David H; Huzurbazar, Aparna V; Anderson-Cook, Christine M

    2010-01-01

    Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
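
    A hedged illustration of two of the predictions mentioned above, the probability of being operational at a future time and the expected remaining life, under an assumed Weibull failure-time model with hypothetical parameters; the suite's actual component and system models are not reproduced.

```python
# Conditional survival and expected remaining life under an assumed Weibull model.
import numpy as np
from scipy import integrate, stats

shape, scale = 2.5, 8000.0      # hypothetical Weibull parameters (hours)
t_now = 3000.0                  # component age so far, hours
horizon = 2000.0                # look-ahead window, hours

weib = stats.weibull_min(shape, scale=scale)
surv = weib.sf                  # survival function S(t)

# P(survive to t_now + horizon | survived to t_now)
p_future = surv(t_now + horizon) / surv(t_now)

# Expected remaining life: integral of the conditional survival function
remaining, _ = integrate.quad(lambda s: surv(t_now + s) / surv(t_now), 0, np.inf)

print(f"P(operational in {horizon:.0f} h): {p_future:.3f}")
print(f"Expected remaining life: {remaining:.0f} h")
```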

  1. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  2. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  3. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  4. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  5. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  6. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  7. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  8. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  9. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  10. Enhanced bio-manufacturing through advanced multivariate statistical technologies.

    PubMed

    Martin, E B; Morris, A J

    2002-11-13

    The paper describes the interrogation of data from a reaction vessel producing an active pharmaceutical ingredient (API) using advanced multivariate statistical techniques. Due to the limited number of batches available, data augmentation was used to increase the number of batches, thereby enabling the extraction of more subtle process behaviour from the data. A second methodology investigated was that of multi-group modelling. This allowed between-cluster variability to be removed, thus allowing attention to focus on within-process variability. The paper describes how the different approaches enabled a better understanding of the factors causing the onset of an impurity formation, as well as demonstrating the power of multivariate statistical data analysis techniques to provide an enhanced understanding of the process.

  11. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the ability of National Oceanic and Atmospheric Administration (NOAA) NWS field offices to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. The LCAT main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs, which could be easily incorporated into their own analysis and/or delivery systems. Presently we have identified five existing requirements for local climate: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation. The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially
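
    A simplified sketch, on synthetic annual data, of two of the trend estimates named above: an ordinary least-squares slope and a running-mean ("Optimal Climate Normals"-style) smoother with a fixed window. The hinge fit and the tool's ensemble averaging are not reproduced here, and the window length is an arbitrary choice for illustration.

```python
# Two simple trend estimates on a synthetic annual temperature series (hypothetical data).
import numpy as np

rng = np.random.default_rng(11)
years = np.arange(1971, 2011)
temps = 12.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.4, years.size)  # degC

# (1) Least-squares linear trend
slope, intercept = np.polyfit(years, temps, 1)
print(f"Linear trend: {slope * 10:.2f} degC per decade")

# (2) Running mean over an averaging period (fixed at 10 years here for illustration)
window = 10
running_normal = np.convolve(temps, np.ones(window) / window, mode="valid")
print(f"Most recent {window}-year normal: {running_normal[-1]:.2f} degC")
```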

  12. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  13. Advanced Mathematical Tools in Metrology III

    NASA Astrophysics Data System (ADS)

    Ciarlini, P.

    The Table of Contents for the book is as follows: * Foreword * Invited Papers * The ISO Guide to the Expression of Uncertainty in Measurement: A Bridge between Statistics and Metrology * Bootstrap Algorithms and Applications * The TTRSs: 13 Oriented Constraints for Dimensioning, Tolerancing & Inspection * Graded Reference Data Sets and Performance Profiles for Testing Software Used in Metrology * Uncertainty in Chemical Measurement * Mathematical Methods for Data Analysis in Medical Applications * High-Dimensional Empirical Linear Prediction * Wavelet Methods in Signal Processing * Software Problems in Calibration Services: A Case Study * Robust Alternatives to Least Squares * Gaining Information from Biomagnetic Measurements * Full Papers * Increase of Information in the Course of Measurement * A Framework for Model Validation and Software Testing in Regression * Certification of Algorithms for Determination of Signal Extreme Values during Measurement * A Method for Evaluating Trends in Ozone-Concentration Data and Its Application to Data from the UK Rural Ozone Monitoring Network * Identification of Signal Components by Stochastic Modelling in Measurements of Evoked Magnetic Fields from Peripheral Nerves * High Precision 3D-Calibration of Cylindrical Standards * Magnetic Dipole Estimations for MCG-Data * Transfer Functions of Discrete Spline Filters * An Approximation Method for the Linearization of Tridimensional Metrology Problems * Regularization Algorithms for Image Reconstruction from Projections * Quality of Experimental Data in Hydrodynamic Research * Stochastic Drift Models for the Determination of Calibration Intervals * Short Communications * Projection Method for Lidar Measurement * Photon Flux Measurements by Regularised Solution of Integral Equations * Correct Solutions of Fit Problems in Different Experimental Situations * An Algorithm for the Nonlinear TLS Problem in Polynomial Fitting * Designing Axially Symmetric Electromechanical Systems of

  14. Toward Understanding the Role of Technological Tools in Statistical Learning.

    ERIC Educational Resources Information Center

    Ben-Zvi, Dani

    2000-01-01

    Begins with some context setting on new views of statistics and statistical education reflected in the introduction of exploratory data analysis (EDA) into the statistics curriculum. Introduces a detailed example of an EDA learning activity in the middle school that makes use of the power of the spreadsheet to mediate students' construction of…

  15. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  16. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2012-08-01

    this problem with a fingerprinting algorithm that inverts for target location and orientation while holding polarizations fixed at their library values... "Cross-Domain Multitask Learning with Latent Probit Models," Proc. Int. Conf. Machine Learning (ICML), 2012. L. Beran, S.D. Billings and D. Oldenburg

  17. Automated Reshelving Statistics as a Tool in Reference Collection Management.

    ERIC Educational Resources Information Center

    Welch, Jeanie M.; Cauble, Lynn A.; Little, Lara B.

    1997-01-01

    Discusses implementation of the automated recording of reshelving statistics for print reference materials and the use of these statistics in reference-collection development and management, especially in making acquisitions and weeding decisions, based on experiences at the University of North Carolina, Charlotte. (Author/LRW)

  18. Children's Services Statistical Neighbour Benchmarking Tool. Practitioner User Guide

    ERIC Educational Resources Information Center

    National Foundation for Educational Research, 2007

    2007-01-01

    Statistical neighbour models provide one method for benchmarking progress. For each local authority (LA), these models designate a number of other LAs deemed to have similar characteristics. These designated LAs are known as statistical neighbours. Any LA may compare its performance (as measured by various indicators) against its statistical…

  19. Tools for Assessing Readability of Statistics Teaching Materials

    ERIC Educational Resources Information Center

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  20. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  1. Machine Tool Advanced Skills Technology Program (MAST). Overview and Methodology.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology Program (MAST) is a geographical partnership of six of the nation's best two-year colleges located in the six states that have about one-third of the density of metals-related industries in the United States. The purpose of the MAST grant is to develop and implement a national training model to overcome…

  2. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
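
    The decomposition of within-chip variation described above can be illustrated with a toy calculation. The sketch below is not the authors' methodology; it simply splits simulated ILD-thickness measurements into die-to-die, systematic within-die, and random components:

```python
# Minimal illustrative sketch (not the authors' methodology): decompose
# within-die ILD-thickness measurements into a die-to-die component, a
# systematic within-die (pattern-dependent) component, and a random residual.
# The layout and thickness values below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_dies, n_sites = 20, 50                        # dies per wafer, sites per die

pattern = np.linspace(-40.0, 40.0, n_sites)     # systematic within-die signature (nm)
die_offsets = rng.normal(0.0, 15.0, n_dies)     # die-to-die variation (nm)
noise = rng.normal(0.0, 5.0, (n_dies, n_sites)) # random variation (nm)
thickness = 800.0 + die_offsets[:, None] + pattern[None, :] + noise

grand_mean = thickness.mean()
die_component = thickness.mean(axis=1) - grand_mean    # per-die offsets
site_component = thickness.mean(axis=0) - grand_mean   # systematic within-die pattern
residual = thickness - grand_mean - die_component[:, None] - site_component[None, :]

print("die-to-die sigma      :", die_component.std(ddof=1))
print("within-die systematic :", site_component.std(ddof=1))
print("random residual sigma :", residual.std(ddof=1))
```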

  3. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  4. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  5. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  6. Knowledge, Models and Tools in Support of Advanced Distance Learning

    DTIC Science & Technology

    2006-06-01

    authoring iRides simulations and training, Rivets, is a fast C++ program that has been compiled for three Unix-type operating systems: Linux, Silicon...School instructors to introduce core concepts of the tool in advance of teaching about expected value theory. 4.0 Rivets - Linux-based Authoring of...Simulations and Instruction: Functioning versions of Rivets, a descendant of the classic RIDES program, have been developed for Linux and for the Macintosh

  7. Bacteriophage-based tools: recent advances and novel applications

    PubMed Central

    O'Sullivan, Lisa; Buttimer, Colin; McAuliffe, Olivia; Bolton, Declan; Coffey, Aidan

    2016-01-01

    Bacteriophages (phages) are viruses that infect bacterial hosts, and since their discovery over a century ago they have been primarily exploited to control bacterial populations and to serve as tools in molecular biology. In this commentary, we highlight recent diverse advances in the field of phage research, going beyond bacterial control using whole phage, to areas including biocontrol using phage-derived enzybiotics, diagnostics, drug discovery, novel drug delivery systems and bionanotechnology. PMID:27990274

  8. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 nm standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was
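
    The arc geometry described above (distance travelled per forecast hour within a 30-degree sector about the upwind direction) can be sketched in a few lines. This is an illustrative calculation with hypothetical wind inputs, not the AMU code:

```python
# Minimal geometric sketch (not the AMU implementation): given an average
# upper-level wind, compute the threat-arc radii and the 30-degree sector edges
# upwind of a point of interest. Wind values are hypothetical.
def anvil_arcs(wind_dir_deg, wind_speed_kt, hours=(1, 2, 3), half_sector_deg=15.0):
    """Return (hour, radius in n mi, sector-edge bearings) for each forecast hour.

    wind_dir_deg is the meteorological direction the wind blows FROM, so the
    threat sector lies upwind of the point of interest, centred on that bearing.
    """
    arcs = []
    for h in hours:
        radius_nm = wind_speed_kt * h                     # knots x hours = nautical miles
        edges = ((wind_dir_deg - half_sector_deg) % 360.0,
                 (wind_dir_deg + half_sector_deg) % 360.0)
        arcs.append((h, radius_nm, edges))
    return arcs

# Example: 300- to 150-mb layer-average wind from 250 degrees at 35 kt
for hour, radius, (left, right) in anvil_arcs(250.0, 35.0):
    print(f"{hour} h arc: {radius:.0f} n mi, sector {left:.0f}-{right:.0f} deg")
```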

  9. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  10. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools for application to advanced fault-tolerant aerospace systems were evaluated. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (the difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.

  11. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.

  12. Statistical Analysis of Noisy Signals Using Classification Tools

    SciTech Connect

    Thompson, Sandra E.; Heredia-Langner, Alejandro; Johnson, Timothy J.; Foster, Nancy S.; Valentine, Nancy B.; Amonette, James E.

    2005-06-04

    The potential use of chemicals, biotoxins and biological pathogens is a threat to military and police forces as well as the general public. Rapid identification of these agents is made difficult by the noisy nature of the signal that can be obtained from portable, in-field sensors. In previously published articles, we created a flowchart that illustrated a method for triaging bacterial identification by combining standard statistical techniques for discrimination and identification with mid-infrared spectroscopic data. The present work documents the process of characterizing and eliminating the sources of the noise and outlines how multidisciplinary teams are necessary to accomplish that goal.

  13. Rapid development and optimization of tablet manufacturing using statistical tools.

    PubMed

    Fernández, Eutimio Gustavo; Cordero, Silvia; Benítez, Malvina; Perdomo, Iraelio; Morón, Yohandro; Morales, Ada Esther; Arce, Milagros Gaudencia; Cuesta, Ernesto; Lugones, Juan; Fernández, Maritza; Gil, Arturo; Valdés, Rodolfo; Fernández, Mirna

    2008-01-01

    The purpose of this paper was to develop a statistical methodology to optimize tablet manufacturing considering drug chemical and physical properties applying a crossed experimental design. The assessed model drug was dried ferrous sulphate and the variables were the hardness and the relative proportions of three excipients, binder, filler and disintegrant. Granule properties were modeled as a function of excipient proportions and tablet parameters were defined by the excipient proportion and hardness. The desirability function was applied to achieve optimal values for excipient proportions and hardness. In conclusion, crossed experimental design using hardness as the only process variable is an efficient strategy to quickly determine the optimal design process for tablet manufacturing. This method can be applied for any tablet manufacturing method.
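
    The desirability-function step mentioned above can be illustrated with a small sketch. The response models and limits below are hypothetical stand-ins, not the fitted models from the paper:

```python
# Minimal sketch of a Derringer-Suich desirability calculation. The quadratic
# response surfaces and acceptance limits below are hypothetical placeholders.
import numpy as np

def d_larger_is_better(y, low, high):
    """Desirability rising from 0 at 'low' to 1 at 'high'."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def d_smaller_is_better(y, low, high):
    """Desirability falling from 1 at 'low' to 0 at 'high'."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Hypothetical fitted models: dissolution (%) and friability (%) as functions of
# disintegrant proportion x (fraction of the excipient blend) and hardness h (kp).
def dissolution(x, h):
    return 55.0 + 120.0 * x - 1.5 * h

def friability(x, h):
    return 1.2 - 0.08 * h + 0.5 * x

best = None
for x in np.linspace(0.0, 0.3, 31):
    for h in np.linspace(4.0, 12.0, 33):
        d1 = d_larger_is_better(dissolution(x, h), 70.0, 95.0)
        d2 = d_smaller_is_better(friability(x, h), 0.1, 1.0)
        overall = np.sqrt(d1 * d2)            # geometric mean of desirabilities
        if best is None or overall > best[0]:
            best = (overall, x, h)

print("max desirability %.2f at disintegrant=%.2f, hardness=%.1f kp" % best)
```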

  14. Statistical, economic and other tools for assessing natural aggregate

    USGS Publications Warehouse

    Bliss, J.D.; Moyle, P.R.; Bolm, K.S.

    2003-01-01

    Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers and those who make decisions about land allocation, which may have long-term implications concerning cost and the availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classification and statistic distributions or, in other words, models with geological, geotechnical and economic variables or interrelationships between these variables. These models can be used with subjective estimates, if needed, to estimate how much aggregate may be present in a region or country using distributions generated by Monte Carlo computer simulations.
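
    A toy version of the Monte Carlo aggregation described above is sketched below; the deposit-count and tonnage distributions are hypothetical, not the USGS models:

```python
# Minimal sketch of the kind of Monte Carlo aggregation described above.
# The Poisson deposit count and lognormal tonnage model are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 20_000

# Subjective estimate of the number of undiscovered deposits in the tract.
n_deposits = rng.poisson(lam=6.0, size=n_trials)

# Lognormal tonnage model for an individual sand-and-gravel deposit (million tonnes).
totals = np.array([
    rng.lognormal(mean=1.0, sigma=0.8, size=k).sum() for k in n_deposits
])

# Exceedance convention: P90 = amount exceeded with 90% probability (10th percentile).
print("P90, P50, P10 total resource (Mt):",
      np.percentile(totals, [10, 50, 90]).round(1))
```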

  15. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    SciTech Connect

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  16. Statistical Considerations of Data Processing in Giovanni Online Tool

    NASA Technical Reports Server (NTRS)

    Suhung, Shen; Leptoukh, G.; Acker, J.; Berrick, S.

    2005-01-01

    The GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni) is a web-based interface for the rapid visualization and analysis of gridded data from a number of remote sensing instruments. The GES DISC currently employs several Giovanni instances to analyze various products, such as Ocean-Giovanni for ocean products from SeaWiFS and MODIS-Aqua; TOMS & OMI Giovanni for atmospheric chemical trace gases from TOMS and OMI, and MOVAS for aerosols from MODIS, etc. (http://giovanni.gsfc.nasa.gov). Foremost among the Giovanni statistical functions is data averaging. Two aspects of this function are addressed here. The first deals with the accuracy of averaging gridded mapped products vs. averaging from the ungridded Level 2 data. Some mapped products contain mean values only; others contain additional statistics, such as number of pixels (NP) for each grid, standard deviation, etc. Since NP varies spatially and temporally, averaging with or without weighting by NP will be different. In this paper, we address differences of various weighting algorithms for some datasets utilized in Giovanni. The second aspect is related to different averaging methods affecting data quality and interpretation for data with non-normal distribution. The present study demonstrates results of different spatial averaging methods using gridded SeaWiFS Level 3 mapped monthly chlorophyll a data. Spatial averages were calculated using three different methods: arithmetic mean (AVG), geometric mean (GEO), and maximum likelihood estimator (MLE). Biogeochemical data, such as chlorophyll a, are usually considered to have a log-normal distribution. The study determined that differences between methods tend to increase with increasing size of a selected coastal area, with no significant differences in most open oceans. The GEO method consistently produces values lower than AVG and MLE. The AVG method produces values larger than MLE in some cases, but smaller in other cases. Further
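
    The three spatial-averaging estimators compared in the study can be reproduced on synthetic log-normal data (not SeaWiFS chlorophyll) as follows:

```python
# Minimal sketch comparing the three averaging estimators mentioned above for
# log-normally distributed data. The chlorophyll-a values are synthetic.
import numpy as np

rng = np.random.default_rng(7)
chl = rng.lognormal(mean=-1.0, sigma=1.0, size=500)   # mg m^-3, hypothetical

avg = chl.mean()                                      # arithmetic mean (AVG)
geo = np.exp(np.log(chl).mean())                      # geometric mean (GEO)
# Lognormal maximum likelihood estimate (MLE) of the population mean:
mu_hat, sig2_hat = np.log(chl).mean(), np.log(chl).var(ddof=0)
mle = np.exp(mu_hat + 0.5 * sig2_hat)

print(f"AVG={avg:.3f}  GEO={geo:.3f}  MLE={mle:.3f}")
# GEO comes out systematically smallest, consistent with the study's finding.
```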

  17. Advances in Mass Spectrometric Tools for Probing Neuropeptides

    NASA Astrophysics Data System (ADS)

    Buchberger, Amanda; Yu, Qing; Li, Lingjun

    2015-07-01

    Neuropeptides are important mediators in the functionality of the brain and other neurological organs. Because neuropeptides exist in a wide range of concentrations, appropriate characterization methods are needed to provide dynamic, chemical, and spatial information. Mass spectrometry and compatible tools have been a popular choice in analyzing neuropeptides. There have been several advances and challenges, both of which are the focus of this review. Discussions range from sample collection to bioinformatic tools, although avenues such as quantitation and imaging are included. Further development of the presented methods for neuropeptidomic mass spectrometric analysis is inevitable, which will lead to a further understanding of the complex interplay of neuropeptides and other signaling molecules in the nervous system.

  18. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) offer higher efficiency and larger production rates, and can be operated in deeper wells than other geothermal artificial lift systems. Enhanced Geothermal Systems (EGS) applications recommend lifting 300°C geothermal water at an 80 kg/s flow rate in a maximum 10-5/8-inch diameter wellbore to improve the cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow type centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing the prediction to experimental data of an existing ESP product.

  19. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  20. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre; the matrixes are resin, metal and ceramic. Advanced composite materials have higher specific strength and higher specific modulus than the first generation of glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, anticorrosion, heat insulation, sound insulation, shock absorption, and high- and low-temperature resistance. They are used for radomes, inlets, airfoils (fuel tanks included), flaps, ailerons, vertical and horizontal tails, air brakes, skins, baseboards, etc. Hardness can reach 62-65 HRC. When drilling unidirectional laminates, hole quality is greatly affected by the fibre laminate direction of carbon fibre reinforced composites because of their anisotropy. Burrs and splits appear at the exit because of stress concentration; in addition, delamination occurs and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing and splitting are caused by the great stress induced by high thrust force. A poorly sharpened cutting edge also lowers cutting performance and raises the drilling force. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes and the difficulty of making them in composites have also increased. This requires high-performance drills that do not introduce defects and have a long tool life. It has become a trend to develop superhard material tools and tools with special geometry for drilling

  2. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  3. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  4. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  5. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the US Department of Energy's (DOE's) Advanced Vehicle Simulator (ADVISOR), written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  6. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research.

    PubMed

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. The research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped the research activities throughout the globe. Accurate results and inferences are not possible without proper validation with various statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge about the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analysis, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and also to highlight the common areas where the greatest number of statistical errors are committed, so as to adopt better statistical practices.

  7. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes

    PubMed Central

    Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.

    2015-01-01

    Carbohydrates are biological blocks participating in diverse and crucial processes both at cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information not supported by any experimental evidence for the functions of proteins that are encoded by these genes. In this article, we present novel instruments for statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparison of their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB web-site at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239

  8. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives: To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method: We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As in the previous year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor separately evaluated the set of 1,594 articles, and the evaluation results were merged to retain 15 articles for peer review. Results: The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data-mining methods applied to large-scale datasets of past transplants. The objective is to identify chances of survival. Conclusions: The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  9. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-02-24

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some that are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the higher also is the risk for the therapist to lose control of the session and lose the patient to his or her own dark side. To avoid harming the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to get access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing--returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence: love, power, and sexuality in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient and we invite a broad cooperation in scientifically testing the efficiency of the advanced holistic

  10. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  11. Sandia Advanced MEMS Design Tools, Version 2.0

    SciTech Connect

    Allen, Jim; McBrayer, John; Miller, Sam; Rodgers, Steve; Montague, Steve; Sniegowski, Jeff; Jakubczak, Jay; Yarberry, Vic; Barnes, Steve; Priddy, Brian; Reyes, David; Westling, Belinda

    2002-06-13

    Sandia Advanced MEMS Design Tools is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Provide enabling educational information (including pictures, videos, technical information) c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library) d) Facilitate the process of having MEMS fabricated at SNL e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  12. Sandia Advanced MEMS Design Tools v. 3.0

    SciTech Connect

    Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.; Priddy, Brian; Westlin, Belinda; Young, Andrew

    2016-08-25

    This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: Revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Provide enabling educational information (including pictures, videos, technical information) c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library) d) Facilitate the process of having MEMS fabricated at Sandia National Laboratories e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  13. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated dynamically or in a steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.
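
    Equivalent System Mass (ESM), one of the outputs listed above, is commonly defined as a roll-up of hardware mass plus volume, power, cooling, and crew-time penalties. A minimal sketch of such a roll-up, with placeholder equivalency factors rather than ALSSAT's values, is:

```python
# Minimal sketch of an Equivalent System Mass (ESM) roll-up of the kind ALSSAT
# reports. The equivalency factors and subsystem numbers are placeholders, not
# ALSSAT data.
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_per_yr, duration_yr,
                           veq=66.7, peq=237.0, ceq=60.0, cteq=0.465):
    """ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq (all terms expressed in kg)."""
    return (mass_kg
            + volume_m3 * veq                      # volume penalty (kg/m^3)
            + power_kw * peq                       # power penalty (kg/kW)
            + cooling_kw * ceq                     # cooling penalty (kg/kW)
            + crewtime_hr_per_yr * duration_yr * cteq)  # crew-time penalty (kg/CM-hr)

# Hypothetical water-recovery subsystem for a 600-day surface stay
print(equivalent_system_mass(mass_kg=1500, volume_m3=4.2, power_kw=1.1,
                             cooling_kw=1.1, crewtime_hr_per_yr=120,
                             duration_yr=600 / 365.25))
```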

  14. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; & Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: A check has been added to catch invalid block names. DRC features: Added username/password validation, added a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: The documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  15. Advanced Infusion Techniques with 3-D Printed Tooling

    SciTech Connect

    Nuttall, David; Elliott, Amy; Post, Brian K.; Love, Lonnie J.

    2016-05-10

    The manufacturing of tooling for large, contoured surfaces for fiber-layup applications requires significant effort to understand the geometry and then to subtractively manufacture the tool. Traditional methods for the auto industry use clay that is hand sculpted. In the marine pleasure craft industry, the exterior of the model is formed from a foam lay-up that is either hand cut or machined to create smooth lines. Engineers and researchers at Oak Ridge National Laboratory's Manufacturing Demonstration Facility (ORNL MDF) collaborated with Magnum Venus Products (MVP) in the development of a process for reproducing legacy whitewater adventure craft via digital scanning and large scale 3-D printed layup molds. The process entailed 3D scanning a legacy canoe form, converting that form to a CAD model, additively manufacturing (3-D printing) the mold tool, and subtractively finishing the mold's transfer surfaces. Future work will include applying a gelcoat to the mold transfer surface and infusing using vacuum-assisted resin transfer molding (VARTM) principles to create a watertight vessel. The outlined steps were performed on a specific canoe geometry found by MVP's principal participant. The intent of utilizing this geometry is to develop an energy efficient and marketable process for replicating complex shapes, specifically focusing on this particular watercraft, and provide a finished product for demonstration to the composites industry. The culminating part produced through this agreement has been slated for public presentation and potential demonstration at the 2016 CAMX (Composites and Advanced Materials eXpo) exposition in Anaheim, CA. Phase I of this collaborative research and development agreement (MDF-15-68) was conducted under CRADA NFE-15-05575 and was initiated on May 7, 2015, with an introduction to the MVP product line, and concluded in March of 2016 with the printing and processing of a canoe mold. The project partner Magnum Venus Products (MVP) is

  16. The statistical multifragmentation model: Origins and recent advances

    NASA Astrophysics Data System (ADS)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions made over 10 years before, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular to the more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  17. Advanced statistics: applying statistical process control techniques to emergency medicine: a primer for providers.

    PubMed

    Callahan, Charles D; Griffen, David L

    2003-08-01

    Emergency medicine faces unique challenges in the effort to improve efficiency and effectiveness. Increased patient volumes, decreased emergency department (ED) supply, and an increased emphasis on the ED as a diagnostic center have contributed to poor customer satisfaction and process failures such as diversion/bypass. Statistical process control (SPC) techniques developed in industry offer an empirically based means to understand our work processes and manage by fact. Emphasizing that meaningful quality improvement can occur only when it is exercised by "front-line" providers, this primer presents robust yet accessible SPC concepts and techniques for use in today's ED.
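
    One of the SPC techniques referenced above, the individuals (XmR) control chart, can be sketched as follows; the daily ED wait times are simulated, not real data:

```python
# Minimal sketch of an individuals (XmR) control chart with 3-sigma limits
# estimated from the average moving range. The daily ED door-to-provider times
# are simulated, not real data.
import numpy as np

rng = np.random.default_rng(3)
minutes = rng.normal(45.0, 8.0, size=30)     # 30 days of mean wait times
minutes[24:] += 40.0                         # simulate a late process shift

moving_range = np.abs(np.diff(minutes))
center = minutes.mean()
sigma_hat = moving_range.mean() / 1.128      # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((minutes > ucl) | (minutes < lcl))[0]
print(f"CL={center:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
print("days signalling special-cause variation:", out_of_control + 1)
```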

  18. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and has a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
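
    Of the three BSA techniques named above, the frequency ratio is the simplest to illustrate. A minimal sketch on synthetic raster data (not the Malaysian pilot data) is:

```python
# Minimal sketch of the frequency-ratio calculation that underlies one of the
# three BSA techniques named above. The class raster and hazard inventory are
# synthetic, not the study's data.
import numpy as np

rng = np.random.default_rng(11)
slope_class = rng.integers(1, 5, size=(200, 200))         # conditioning-factor classes 1-4
landslide = rng.random((200, 200)) < 0.02 * slope_class   # inventory, steeper = likelier

def frequency_ratio(classes, inventory):
    """FR(c) = (hazard pixels in class c / all hazard pixels)
               / (class-c pixels / all pixels)."""
    fr = {}
    total_pixels = classes.size
    total_events = inventory.sum()
    for c in np.unique(classes):
        in_class = classes == c
        events_in_class = inventory[in_class].sum()
        fr[int(c)] = ((events_in_class / total_events)
                      / (in_class.sum() / total_pixels))
    return fr

for cls, ratio in frequency_ratio(slope_class, landslide).items():
    print(f"class {cls}: FR = {ratio:.2f}")   # FR > 1 indicates higher susceptibility
```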

  19. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results, as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.
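
    A minimal sketch of the plain logistic-regression form of such a tool is shown below, using synthetic bank-section predictors (the paper's Locally Weighted variant and actual variables are not reproduced):

```python
# Minimal sketch of a logistic-regression erosion-probability model. The
# predictors, generating process, and coefficients are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 300
bank_height = rng.uniform(1.0, 6.0, n)        # m
flow_shear = rng.uniform(5.0, 60.0, n)        # N m^-2
vegetation = rng.uniform(0.0, 1.0, n)         # cover fraction

# Hypothetical generating process: erosion more likely with tall banks,
# high shear stress, and sparse vegetation cover.
logit = -4.0 + 0.6 * bank_height + 0.05 * flow_shear - 2.5 * vegetation
erosion = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([bank_height, flow_shear, vegetation])
model = LogisticRegression().fit(X, erosion)

# Predicted erosion probability for a new river section
new_section = np.array([[4.5, 40.0, 0.2]])
print("erosion probability:", model.predict_proba(new_section)[0, 1].round(2))
```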

  20. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  1. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    ERIC Educational Resources Information Center

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  2. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  3. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  4. Functional toxicology: tools to advance the future of toxicity testing

    PubMed Central

    Gaytán, Brandon D.; Vulpe, Chris D.

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds—information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  5. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    NASA Astrophysics Data System (ADS)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a part of the subject matter in the higher education system environment. Even though there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management framework serving as a knowledge portal for guidance, especially in relation to the infrastructure requirements of SLT in servicing the community of users (CoU), such as educators, students and other parties who are interested in using this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement of a knowledge portal to help the CoU manage statistical knowledge in acquiring, storing, disseminating and applying the statistical knowledge for their specific purposes. Furthermore, by having this infrastructure requirement of a knowledge portal model of SLT as guidance in promoting knowledge of best practice among the CoU, it can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in the education system environment.

  6. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool and its challenges, in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D provides many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the rendering through raytracing of the volume and regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of

  7. Classification of human colonic tissues using FTIR spectra and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.

    2010-04-01

    One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for early detection of cancer. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microscopy (MSP), which has shown good potential over the last 20 years in the fields of medical diagnostics and early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database: normal and cancer tissues as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to human colonic FTIR spectra in order to differentiate among the mentioned subgroups' tissues. Good classification accuracy between the normal, polyp and cancer groups was achieved, with approximately an 85% success rate. Our results showed that there is great potential for developing FTIR microspectroscopy as a simple, reagent-free, viable tool for early detection of colon cancer, in particular the early stages of premalignancy among benign colonic polyps.
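
    A minimal sketch of a PCA-plus-LDA classification pipeline of the kind described above, assuming a synthetic stand-in for the FTIR spectra matrix; the dimensions and labels are invented and only illustrate how such a pipeline is typically cross-validated.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        # Synthetic stand-in for FTIR spectra: 230 samples x 900 wavenumbers,
        # with labels for five hypothetical tissue subgroups.
        rng = np.random.default_rng(2)
        spectra = rng.normal(size=(230, 900))
        labels = rng.integers(0, 5, size=230)   # normal, cancer, mild/moderate/severe polyp

        # PCA reduces the spectral dimensionality before LDA separates the subgroups.
        classifier = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
        scores = cross_val_score(classifier, spectra, labels, cv=5)
        print("mean cross-validated accuracy:", scores.mean())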

  8. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from

  9. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics for interpreting the results of a database search or for the management of the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing and easy graphical representation, and their general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework with traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics were shown to be very useful for trying many parameters in a rapid fashion without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications of self-learning and continuing education.

  10. Informing the judgments of fingerprint analysts using quality metric and statistical assessment tools.

    PubMed

    Langenburg, Glenn; Champod, Christophe; Genessay, Thibault

    2012-06-10

    The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison. The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.

  11. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes…

  12. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques (frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models) are applied in the newly proposed ArcMAP tool. This tool is programmed in Python with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
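
    As a companion to the frequency ratio example given for the earlier record of this paper, the sketch below illustrates the weight-of-evidence (WoE) calculation for a single binary evidential layer; the layers are synthetic NumPy arrays and the code is not part of the BSM tool.

        import numpy as np

        def weights_of_evidence(evidence, hazard):
            """Positive and negative weights for one binary evidential layer.

            evidence : boolean array, True where the factor class is present
            hazard   : boolean array, True where the hazard occurred
            """
            eps = 1e-9   # guard against division by zero in sparse classes
            p_e_given_h = (evidence & hazard).sum() / max(hazard.sum(), 1)
            p_e_given_nh = (evidence & ~hazard).sum() / max((~hazard).sum(), 1)
            w_plus = np.log((p_e_given_h + eps) / (p_e_given_nh + eps))
            w_minus = np.log((1 - p_e_given_h + eps) / (1 - p_e_given_nh + eps))
            return w_plus, w_minus

        # Hypothetical layers: proximity-to-fault class versus a landslide inventory
        rng = np.random.default_rng(3)
        near_fault = rng.random(10_000) < 0.3
        landslide = rng.random(10_000) < 0.05
        print(weights_of_evidence(near_fault, landslide))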

  13. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    PubMed

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever-increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written to text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources are made much easier. Hence this tool will help to better understand the influence of contamination characteristics on actinide biokinetics. Our approach can aid
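
    The curve-fitting step mentioned above can be pictured in a few lines; the sketch below uses Python (rather than the R employed by the authors) and invented retention values to fit a simple two-compartment biokinetic model to time-activity data.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical retention data: fraction of administered actinide activity in a
        # compartment at several days post-contamination (values invented for illustration).
        days = np.array([1, 3, 7, 14, 30, 60], dtype=float)
        fraction_retained = np.array([0.80, 0.65, 0.48, 0.33, 0.18, 0.08])

        def biexponential(t, a, k1, k2):
            """Simple two-compartment retention model: fast and slow clearance."""
            return a * np.exp(-k1 * t) + (1 - a) * np.exp(-k2 * t)

        params, _ = curve_fit(biexponential, days, fraction_retained, p0=(0.5, 0.2, 0.01))
        print("fitted a, k1, k2:", params)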

  14. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern the solar-terrestrial space, and to develop qualitative and quantitative models of the magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band, vector-sensing radio receivers and multi-spacecraft plasma diagnostics should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray - LOFAR - is a new fully digital radio telescope designed for frequencies between 30 MHz and 240 MHz located in Europe. The three new LOFAR stations will be installed in Poland by summer 2015. The LOFAR facilities in Poland will be distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan and Baldy near Olsztyn. They will all be connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). They will work most of the time as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate the ionospheric plasma properties. This two-point, ground-based and topside ionosphere-located space plasma diagnostic can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments which is planned to be launched into the near-Earth environment. In order to improve and validate the description of large-scale and small-scale ionospheric structures, we will use the GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the

  15. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2016-01-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a statistical methodology is proposed to predict the probability of the presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the logistic regression methodology. It is developed in two forms, logistic regression and locally weighted logistic regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use and accurate and can be applied to any region and river.

  16. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  17. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  18. A Tool Preference Choice Method for RNA Secondary Structure Prediction by SVM with Statistical Tests

    PubMed Central

    Hor, Chiou-Yi; Yang, Chang-Biau; Chang, Chia-Hung; Tseng, Chiou-Ting; Chen, Hung-Hsin

    2013-01-01

    The prediction of RNA secondary structures has drawn much attention from both biologists and computer scientists. Many useful tools have been developed for this purpose. These tools have their individual strengths and weaknesses. As a result, we propose a tool choice method based on support vector machines (SVM) which integrates three prediction tools: pknotsRG, RNAStructure, and NUPACK. Our method first extracts features from the target RNA sequence and adopts two information-theoretic feature selection methods for feature ranking. We propose a method to combine feature selection and classifier fusion in an incremental manner. Our test data set contains 720 RNA sequences, where 225 pseudoknotted RNA sequences are obtained from PseudoBase and 495 nested RNA sequences are obtained from RNA SSTRAND. The method serves as a preprocessing step in analyzing RNA sequences before the RNA secondary structure prediction tools are employed. In addition, the performance of various configurations is subjected to statistical tests to examine its significance. The best base-pair accuracy achieved is 75.5%, which is obtained by the proposed incremental method and is significantly higher than 68.8%, which is associated with the best predictor, pknotsRG. PMID:23641141
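
    To make the tool-choice idea concrete, the sketch below trains an SVM to pick the best of the three predictors from sequence-derived features; the feature matrix and labels are random placeholders, and the authors' two-stage feature ranking is reduced here to a single mutual-information selector.

        import numpy as np
        from sklearn.feature_selection import SelectKBest, mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Hypothetical stand-in: features extracted from RNA sequences (length,
        # GC content, dinucleotide frequencies, ...) and a label naming which of
        # the three predictors (pknotsRG / RNAStructure / NUPACK) scored best.
        rng = np.random.default_rng(4)
        features = rng.normal(size=(720, 40))
        best_tool = rng.integers(0, 3, size=720)

        chooser = make_pipeline(StandardScaler(),
                                SelectKBest(mutual_info_classif, k=15),
                                SVC(kernel="rbf"))
        print(cross_val_score(chooser, features, best_tool, cv=5).mean())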

  19. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.

    1990-01-01

    A prototype is described that can serve as a scientific-modeling software tool to facilitate the development of useful scientific models. The prototype is developed for applications to planetary modeling, and specific examples are given that relate to the atmosphere of Titan. The scientific modeling tool employs a high-level domain-specific modeling language, several data-display facilities, and a library of experimental datasets and scientific equations. The planetary modeling prototype links uncomputed physical variables to computed variables with computational transformations based on a backchaining procedure. The system - implemented in LISP with an object-oriented knowledge-representation tool - is run on a workstation that provides interface with several models. The prototype is expected to form the basis for a sophisticated modeling tool that can permit active experimentation.

  20. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2015-06-01

    Riverbank erosion affects river morphology and local habitat and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a combined deterministic and statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed methodology is easy to use and accurate and can be applied to any region and river.

  1. AeroStat: NASA Giovanni Tool for Statistical Intercomparison of Aerosols

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Petrenko, M.; Leptoukh, G. G.; Lynnes, C.; Da Silva, D.; Hegde, M.; Ichoku, C. M.

    2011-12-01

    Giovanni is NASA's interactive online visualization and analysis tool for exploring very large global Earth science datasets. One of the new Giovanni analytical and statistical tools is called AeroStat, and it is designed to perform direct statistical intercomparison of global aerosol parameters. Currently, we incorporate the MAPSS (Multi-sensor Aerosol Products Sampling System) data, which provide spatio-temporal statistics for multiple spaceborne Level 2 aerosol products (MODIS Terra, MODIS Aqua, MISR, POLDER, OMI and CALIOP) sampled over AERONET ground stations. The dataset period, 1997-2011 (up to date), is long enough to encompass a number of scientifically challenging cases of long-term global aerosol validation from multiple sensors. AeroStat allows users to easily visualize and analyze in detail the statistical properties of such cases, including data collected from multiple sensors and the quality assurance (QA) properties of these data. One of the goals of AeroStat is also to provide a collaborative research environment, where aerosol scientists can share pertinent research workflow information, including data cases of interest, algorithms, best practices, and known errors, with the broader science community and enable other users of the system to easily reproduce and independently verify their results. Furthermore, AeroStat provides easy access to data provenance (data lineage) and quality information, which allows for convenient tracing of scientific results back to their original input data, thus further ensuring the reliability of these results. Case studies will be presented to show the described functionality and capabilities of AeroStat, and possible directions of future development.

  2. XML based tools for assessing potential impact of advanced technology space validation

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Weisbin, Charles

    2004-01-01

    A hierarchical XML database and related analysis tools are being developed by the New Millennium Program to provide guidance on the relative impact, to future NASA missions, of advanced technologies under consideration for developmental funding.

  3. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  4. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  5. [EpiInfo as a research and teaching tool in epidemiology and statistics: strengths and weaknesses].

    PubMed

    Mannocci, Alice; Bontempi, Claudio; Giraldi, Guglielmo; Chiaradia, Giacomina; de Waure, Chiara; Sferrazza, Antonella; Ricciardi, Walter; Boccia, Antonio; La Torre, Giuseppe

    2012-01-01

    EpiInfo is free software developed in 1988 by the Centers for Disease Control and Prevention (CDC) in Atlanta to facilitate field epidemiological investigations and statistical analysis. The aim of this study was to assess whether the software represents, in the Italian biomedical field, an effective analytical research tool and a practical and simple epidemiology and biostatistics teaching tool. A questionnaire consisting of 20 multiple-choice and open questions was administered to 300 healthcare workers, including doctors, biologists, nurses, medical students and interns, at the end of a CME course in epidemiology and biostatistics. Sixty-four percent of participants were aged between 26 and 45 years, 52% were women and 73% were unmarried. Results show that women are more likely to utilize EpiInfo in their research activities than men (p = 0.023), as are individuals aged 26-45 years compared with the older and younger age groups (p = 0.023) and unmarried participants compared with those married (p = 0.010). Thirty-one percent of respondents consider EpiInfo to be more than adequate for analysis of their research data and 52% consider it to be sufficiently so. The inclusion of an EpiInfo course in statistics and epidemiology modules facilitates the understanding of theoretical concepts and allows researchers to more easily perform some of their clinical/epidemiological research activities.

  6. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  7. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  8. Analyzing Planck and low redshift data sets with advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi
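
    Approximate Bayesian Computation itself is easy to demonstrate on a toy problem; the rejection-ABC sketch below infers a Gaussian mean from summary statistics alone, without writing down a likelihood, and is unrelated to the Planck or DES data discussed above.

        import numpy as np

        # Toy rejection ABC: infer the mean of a Gaussian by simulating data sets and
        # accepting parameter draws whose summary statistic is close to the observed one.
        rng = np.random.default_rng(5)
        observed = rng.normal(loc=0.7, scale=1.0, size=500)
        obs_summary = observed.mean()

        accepted = []
        tolerance = 0.05
        for _ in range(20_000):
            theta = rng.uniform(-2, 2)                                  # draw from the prior
            simulated = rng.normal(loc=theta, scale=1.0, size=500)
            if abs(simulated.mean() - obs_summary) < tolerance:         # distance on summaries
                accepted.append(theta)

        print("approximate posterior mean:", np.mean(accepted))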

  9. Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.

    2015-02-01

    This paper discusses a methodology to evaluate the precision and the accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in few papers, where generally the assumption is made that errors follow a normal distribution. In fact, this hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process. Finally, non-parametric statistical models are applied in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL is tested here, i.e. the VMX-450 Mobile Laser Scanning System. The test area is the historic city centre of Trento (Italy), selected in order to assess the system performance in dealing with a challenging and historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a large lack of symmetry, which leads to the conclusion that the standard normal parameters are not adequate to assess this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
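
    The contrast between Gaussian and non-parametric error descriptions can be shown in a few lines; the sketch below compares mean/standard deviation with median, normalized MAD and percentile bounds on a synthetic, asymmetrically contaminated error sample (not the Trento data set).

        import numpy as np

        # Hypothetical check-point errors (metres) between MMS points and reference
        # survey data; an asymmetric tail is simulated on purpose.
        rng = np.random.default_rng(6)
        errors = np.concatenate([rng.normal(0.0, 0.02, 950), rng.exponential(0.15, 50)])

        # Gaussian description (sensitive to the skewed tail)
        print("mean +/- std:", errors.mean(), errors.std())

        # Non-parametric description: median, normalized MAD, and percentile bounds
        median = np.median(errors)
        nmad = 1.4826 * np.median(np.abs(errors - median))
        p2_5, p97_5 = np.percentile(errors, [2.5, 97.5])
        print("median, NMAD, 95% interval:", median, nmad, (p2_5, p97_5))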

  10. A 3D Interactive Multi-object Segmentation Tool using Local Robust Statistics Driven Active Contours

    PubMed Central

    Gao, Yi; Kikinis, Ron; Bouix, Sylvain; Shenton, Martha; Tannenbaum, Allen

    2012-01-01

    Extracting anatomically and functionally significant structures is one of the important tasks for both the theoretical study of medical image analysis and the clinical and practical community. In the past, much work has been dedicated only to algorithmic development. Nevertheless, for clinical end users, a well-designed algorithm with interactive software is necessary for an algorithm to be utilized in their daily work. Furthermore, the software should preferably be open source in order to be used and validated not only by the authors but also by the entire community. Therefore, the contribution of the present work is twofold: first, we propose a new robust-statistics-based conformal metric and the conformal area driven multiple active contour framework, to simultaneously extract multiple targets from MR and CT medical imagery in 3D. Second, an open source, graphically interactive 3D segmentation tool based on the aforementioned contour evolution is implemented and is publicly available for end users on multiple platforms. In using this software for the segmentation task, the process is initiated by user-drawn strokes (seeds) in the target region in the image. Then, the local robust statistics are used to describe the object features, and such features are learned adaptively from the seeds under a non-parametric estimation scheme. Subsequently, several active contours evolve simultaneously with their interactions being motivated by the principles of action and reaction: this not only guarantees mutual exclusiveness among the contours, but also no longer relies upon the assumption that the multiple objects fill the entire image domain, which was tacitly or explicitly assumed in many previous works. In doing so, the contours interact and converge to equilibrium at the desired positions of the desired multiple objects. Furthermore, with the aim of not only validating the algorithm and the software, but also demonstrating how the tool is to be used, we

  11. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License.
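
    A few of the summary statistics AMAS reports can be reproduced directly in plain Python; the toy alignment below is invented and the snippet is not drawn from the AMAS source code.

        # Minimal re-implementation of a handful of alignment statistics for a toy
        # nucleotide alignment held as equal-length strings.
        alignment = {
            "taxon_A": "ATGCCG-TAA",
            "taxon_B": "ATGCCGNTAA",
            "taxon_C": "ATGACG-TAC",
        }

        seqs = list(alignment.values())
        n_taxa, length = len(seqs), len(seqs[0])
        cells = n_taxa * length
        missing = sum(seq.count("-") + seq.count("N") + seq.count("?") for seq in seqs)
        gc = sum(seq.count("G") + seq.count("C") for seq in seqs)
        at = sum(seq.count("A") + seq.count("T") for seq in seqs)
        variable_sites = sum(len({s[i] for s in seqs} - {"-", "N", "?"}) > 1
                             for i in range(length))

        print(f"taxa={n_taxa} length={length} missing={missing / cells:.1%} "
              f"GC={gc / (gc + at):.1%} variable sites={variable_sites}")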

  12. AMAS: a fast tool for alignment manipulation and computing of summary statistics

    PubMed Central

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python’s core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  13. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
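
    The basic diagnostic, a power spectrum of a mode coordinate taken from a trajectory, can be sketched as follows; the time step, frequencies and amplitudes are hypothetical and the signal is synthetic rather than a real 1,2-difluoroethane or 2-chloroethyl trajectory.

        import numpy as np

        # Power spectrum of a synthetic bond-length time series: a sharp, isolated peak
        # suggests weak mode coupling, while a broad, diffuse spectrum suggests strong
        # IVR and statistical behaviour.
        dt = 0.5e-15                        # time step in seconds (hypothetical)
        t = np.arange(50_000) * dt
        bond = (0.05 * np.sin(2 * np.pi * 9.0e13 * t)       # C-H stretch-like mode
                + 0.01 * np.sin(2 * np.pi * 3.0e13 * t)     # coupled low-frequency mode
                + 0.002 * np.random.default_rng(7).normal(size=t.size))

        spectrum = np.abs(np.fft.rfft(bond - bond.mean())) ** 2
        freqs = np.fft.rfftfreq(bond.size, d=dt)
        print("dominant frequency (Hz):", freqs[spectrum.argmax()])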

  14. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used for developing a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
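
    The regression step can be illustrated with invented numbers; the coefficient values below merely stand in for the I-kaz 3D coefficient, which is not reproduced here, and the fit shows how a calibration line would be inverted for wear monitoring.

        import numpy as np
        from scipy.stats import linregress

        # Invented example values: a signal-derived coefficient measured at increasing
        # flank wear (VB). The real study derives its coefficient from cutting force
        # signals; these numbers only illustrate the regression/calibration step.
        vb = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])       # flank wear, mm
        coefficient = np.array([8.2, 7.1, 6.4, 5.0, 4.3, 3.1])    # hypothetical values

        fit = linregress(vb, coefficient)
        print("slope:", fit.slope, "intercept:", fit.intercept, "r:", fit.rvalue)

        # Inverting the fitted line estimates wear from a newly measured coefficient.
        estimated_vb = (5.5 - fit.intercept) / fit.slope
        print("estimated VB for coefficient 5.5:", round(estimated_vb, 3), "mm")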

  15. The use of machine learning and nonlinear statistical tools for ADME prediction.

    PubMed

    Sakiyama, Yojiro

    2009-02-01

    Absorption, distribution, metabolism and excretion (ADME)-related failure of drug candidates is a major issue for the pharmaceutical industry today. Prediction of ADME by in silico tools has now become an inevitable paradigm to reduce cost and enhance efficiency in pharmaceutical research. Recently, machine learning as well as nonlinear statistical tools have been widely applied to predict routine ADME end points. To achieve accurate and reliable predictions, it is a prerequisite to understand the concepts, mechanisms and limitations of these tools. Here, we have devised a small synthetic nonlinear data set to help understand the mechanism of machine learning by 2D visualisation. We applied six machine learning methods to four different data sets. The methods include the Naive Bayes classifier, classification and regression trees, random forests, Gaussian processes, support vector machines and k nearest neighbours. The results demonstrated that ensemble learning and kernel machines displayed greater accuracy of prediction than classical methods irrespective of the data set size. The importance of interaction with the engineering field is also addressed. The results described here provide insights into the mechanism of machine learning, which will enable appropriate usage in the future.
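
    A toy comparison in the spirit of the study, using scikit-learn and an invented nonlinear end point, is sketched below; it only illustrates why an ensemble method can outperform a simple neighbour-based classifier on interaction effects and does not reproduce the author's data sets or methods.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        # Synthetic nonlinear "ADME-like" end point: permeable vs non-permeable,
        # depending on an interaction of two descriptors (all values invented).
        rng = np.random.default_rng(10)
        descriptors = rng.normal(size=(500, 10))
        permeable = (descriptors[:, 0] * descriptors[:, 1] > 0).astype(int)

        for name, model in [("random forest", RandomForestClassifier(n_estimators=200)),
                            ("k nearest neighbour", KNeighborsClassifier(5))]:
            acc = cross_val_score(model, descriptors, permeable, cv=5).mean()
            print(name, round(acc, 3))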

  16. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  17. Advanced tools and framework for historical film restoration

    NASA Astrophysics Data System (ADS)

    Croci, Simone; Aydın, Tunç Ozan; Stefanoski, Nikolce; Gross, Markus; Smolic, Aljosa

    2017-01-01

    Digital restoration of film content that has historical value is crucial for the preservation of cultural heritage. Also, digital restoration is not only a relevant application area of various video processing technologies that have been developed in computer graphics literature but also involves a multitude of unresolved research challenges. Currently, the digital restoration workflow is highly labor intensive and often heavily relies on expert knowledge. We revisit some key steps of this workflow and propose semiautomatic methods for performing them. To do that we build upon state-of-the-art video processing techniques by adding the components necessary for enabling (i) restoration of chemically degraded colors of the film stock, (ii) removal of excessive film grain through spatiotemporal filtering, and (iii) contrast recovery by transferring contrast from the negative film stock to the positive. We show that when applied individually our tools produce compelling results and when applied in concert significantly improve the degraded input content. Building on a conceptual framework of film restoration ensures the best possible combination of tools and use of available materials.

  18. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three chamber III-Nitride epi cluster tool for low-cost, high volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world’s largest wafer capacity HVPE chamber suitable for repeatable high volume III-Nitride template and device manufacturing. Applied Materials’ experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding was used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable for not just the LED industry, but GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rates of 3 µm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD at a much cheaper overall cost. This is a unique development as the HVPE process has been known for decades, but never successfully commercially developed for high volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN based LED wafers and industry

  19. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
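
    The core pair-wise statistic, a kernel-weighted Pearson correlation evaluated around one wind speed-direction cell, can be sketched as follows; the observations, kernel widths and pollutant names are hypothetical and the code is not taken from the authors' implementation.

        import numpy as np

        def weighted_pearson(x, y, w):
            """Pearson correlation of x and y with observation weights w."""
            w = w / w.sum()
            mx, my = np.sum(w * x), np.sum(w * y)
            cov = np.sum(w * (x - mx) * (y - my))
            return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

        # Hypothetical hourly observations: wind speed/direction plus two pollutants.
        rng = np.random.default_rng(9)
        ws = rng.uniform(0, 10, 2000)
        wd = rng.uniform(0, 360, 2000)
        nox = rng.lognormal(3.0, 0.4, 2000)
        pm25 = 0.3 * nox + rng.lognormal(1.5, 0.3, 2000)

        # Gaussian kernel weights centred on one wind-speed/direction cell (5 m/s, 90 deg)
        dd = np.minimum(np.abs(wd - 90), 360 - np.abs(wd - 90))   # circular distance
        weights = (np.exp(-((ws - 5) ** 2) / (2 * 1.0 ** 2))
                   * np.exp(-(dd ** 2) / (2 * 20.0 ** 2)))
        print("locally weighted correlation:", weighted_pearson(nox, pm25, weights))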

  20. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and the use of antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  1. Advanced statistical process control: controlling sub-0.18-μm lithography and other processes

    NASA Astrophysics Data System (ADS)

    Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.

    2001-08-01

    Feed-forward, as a method to control the Lithography process for Critical Dimensions and Overlay, is well known in the semiconductor industry. However, the control provided by simple averaging feed-forward methodologies is not sufficient to support the complexity of a sub-0.18-μm lithography process. Also, simple feed-forward techniques are not applicable to logic and ASIC production due to the many different products and lithography-chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are generally called APC, Advanced Process Control applications. Today, there are as many APC methods as there are engineers involved. To meet the stringent requirements of 0.18-μm production, we selected a method that is described in SPIE 3998-48 (March 2000) by Terrence Zavecz and Rene Blanquies from Yield Dynamics Inc. This method is called PPC, Predictive Process Control, and employs a methodology of collecting measurement results and the modeled bias attributes of exposure tools, reticles and the incoming process in a signatures database. With PPC, before each lot exposure, the signatures of the lithography tool, the reticle and the incoming process are used to predict the setup of the lot process and the expected lot results. Benefits derived from such an implementation are very clear; there is no limitation on the number of products or lithography-chemistry combinations and the technique avoids the short memory of conventional APC techniques. ... and what's next? (Rob Morton, Philips assignee to International Sematech). The next part of the paper will try to answer this question. Observing that CMP and metal deposition significantly influence CDs and overlay results, and even Contact Etch can have a significant influence on Metal 5 overlay, we developed a more general PPC for lithography. Starting with the existing lithography PPC applications database, the authors extended the
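
    The signature-database idea behind PPC can be illustrated with a toy sketch. Everything below (the class name, keys, and units) is hypothetical and is not the Yield Dynamics implementation; it only shows how stored exposure-tool, reticle and incoming-process bias signatures could be summed to predict the setup correction for the next lot.

```python
# Minimal sketch (illustrative names and structure, not the PPC product described here)
# of a signature database used to predict a lot's setup correction from exposure-tool,
# reticle and incoming-process bias signatures.
from collections import defaultdict

class SignatureDB:
    def __init__(self):
        # sums/counts per signature kind and key -> running mean bias (e.g. overlay offset, nm)
        self.sums = defaultdict(lambda: defaultdict(float))
        self.counts = defaultdict(lambda: defaultdict(int))

    def update(self, kind, key, measured_bias):
        self.sums[kind][key] += measured_bias
        self.counts[kind][key] += 1

    def signature(self, kind, key):
        n = self.counts[kind][key]
        return self.sums[kind][key] / n if n else 0.0

    def predict(self, tool, reticle, process):
        # Predicted setup correction = sum of the stored component signatures.
        return (self.signature("tool", tool)
                + self.signature("reticle", reticle)
                + self.signature("process", process))

db = SignatureDB()
db.update("tool", "scanner_A", 3.0)
db.update("reticle", "metal5_rev2", -1.5)
db.update("process", "post_CMP_lot", 0.8)
print(db.predict("scanner_A", "metal5_rev2", "post_CMP_lot"))  # 2.3
```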

  2. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  3. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
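
    The workflow the tool automates (select a regional predictand, correlate it with gridded predictor fields, extract a predictor region, fit a model) can be mimicked offline. The sketch below uses Python and synthetic data purely for illustration; the actual tool is built in R/Shiny and offers a wider set of models and datasets.

```python
# Sketch of the workflow the tool automates (illustrative Python, not the Shiny app):
# correlate a regional predictand with a gridded predictor field, then fit a model
# on a predictor extracted from the highest-correlation region.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_years, nlat, nlon = 35, 20, 30

# Hypothetical data: an SST anomaly field (predictor) whose box [5:8, 10:14] shares a
# common signal with a regional rainfall index (predictand).
signal = rng.normal(size=n_years)
sst = rng.normal(size=(n_years, nlat, nlon))
sst[:, 5:8, 10:14] += signal[:, None, None]
rain = signal + rng.normal(0, 0.3, n_years)

# Correlation map between the predictand and every grid cell of the predictor field.
sst2d = sst.reshape(n_years, -1)
corr = np.array([np.corrcoef(rain, sst2d[:, k])[0, 1] for k in range(sst2d.shape[1])])
corr_map = corr.reshape(nlat, nlon)

# "Draw a polygon" over the strongest-correlation cells and average them into one predictor.
mask = np.abs(corr_map) > 0.5
predictor = sst[:, mask].mean(axis=1).reshape(-1, 1)

# Fit one of the offered model classes (here a random forest) on the earlier years
# and predict the last five.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(predictor[:-5], rain[:-5])
print(np.round(model.predict(predictor[-5:]), 2), np.round(rain[-5:], 2))
```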

  4. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

  5. Atomic force microscopy as an advanced tool in neuroscience

    PubMed Central

    Jembrek, Maja Jazvinšćak; Šimić, Goran; Hof, Patrick R.; Šegota, Suzana

    2015-01-01

    This review highlights relevant issues about applications and improvements of atomic force microscopy (AFM) toward a better understanding of neurodegenerative changes at the molecular level with the hope of contributing to the development of effective therapeutic strategies for neurodegenerative illnesses. The basic principles of AFM are briefly discussed in terms of evaluation of experimental data, including the newest PeakForce Quantitative Nanomechanical Mapping (QNM) and the evaluation of Young’s modulus as the crucial elasticity parameter. AFM topography, revealed in imaging mode, can be used to monitor changes in live neurons over time, representing a valuable tool for high-resolution detection and monitoring of neuronal morphology. The mechanical properties of living cells can be quantified by force spectroscopy as well as by these newer AFM modes. A variety of applications are described, and their relevance for specific research areas is discussed. In addition, imaging as well as non-imaging modes can provide specific information, not only about the structural and mechanical properties of neuronal membranes, but also on the cytoplasm, cell nucleus, and particularly cytoskeletal components. Moreover, these newer AFM modes are able to provide detailed insight into physical structure and biochemical interactions in both physiological and pathophysiological conditions. PMID:28123795

  6. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, has been shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods, such as factorial discriminant analysis and partial least squares discriminant analysis, is on par with that of more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
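
    As a rough illustration of the kind of comparison reported here, the snippet below cross-validates a linear discriminant model against a decision tree on synthetic multi-class "spectra". The data and accuracy values are stand-ins and do not reproduce the study's Raman models.

```python
# Hedged illustration (synthetic data, not the study's spectra) of comparing a linear
# discriminant model with a decision tree for classifying spectra from several tissue types.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_class, n_wavenumbers, n_classes = 40, 100, 5

# Synthetic "Raman spectra": each class has its own mean spectrum plus noise.
means = rng.normal(size=(n_classes, n_wavenumbers))
X = np.vstack([m + 0.8 * rng.normal(size=(n_per_class, n_wavenumbers)) for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("Decision tree", DecisionTreeClassifier(random_state=0))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```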

  7. Statistical Dimensioning of Nutrient Loading Reduction: LLR Assessment Tool for Lake Managers.

    PubMed

    Kotamäki, Niina; Pätynen, Anita; Taskinen, Antti; Huttula, Timo; Malve, Olli

    2015-08-01

    Implementation of the EU Water Framework Directive (WFD) has posed a great challenge for river basin management planning. Assessing the water quality of lakes and coastal waters as well as setting the accepted nutrient loading levels requires appropriate decision-supporting tools and models. Uncertainty that is inevitably related to the assessment results and arises from several sources calls for more precise quantification and consideration. In this study, we present a modeling tool, called lake load response (LLR), which can be used for statistical dimensioning of the nutrient loading reduction. LLR calculates the reduction that is needed to achieve good ecological status in a lake in terms of total nutrients and chlorophyll a (chl-a) concentration. We show that by combining an empirical nutrient retention model with a hierarchical chl-a model, the national lake monitoring data can be used more efficiently for predictions for a single lake. To estimate the uncertainties, we separate the residual variability and the parameter uncertainty of the modeling results with the probabilistic Bayesian modeling framework. LLR has been developed to answer the urgent need for fast and simple assessment methods, especially when implementing the WFD at such an extensive scale as in Finland. With a case study for a eutrophic Finnish lake, we demonstrate how the model can be utilized to set the target loadings and to see how the uncertainties are quantified and how they accumulate within the modeling chain.
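
    The separation of parameter uncertainty from residual variability can be illustrated with a much simpler stand-in than the LLR hierarchical model: a log-log regression of chlorophyll-a on total phosphorus, with bootstrap draws for the parameters and added residual noise for the predictive spread. All numbers below are hypothetical.

```python
# Illustrative sketch (not the LLR model itself) of separating parameter uncertainty
# from residual variability when predicting chlorophyll-a from nutrient loading.
import numpy as np

rng = np.random.default_rng(3)
n = 60
log_tp = rng.uniform(1.0, 2.5, n)                 # log10 total phosphorus (hypothetical units)
log_chla = -0.5 + 1.1 * log_tp + rng.normal(0, 0.2, n)

# Ordinary least squares fit and residual standard deviation (residual variability).
A = np.column_stack([np.ones(n), log_tp])
beta_hat, *_ = np.linalg.lstsq(A, log_chla, rcond=None)
resid_sd = np.std(log_chla - A @ beta_hat, ddof=2)

# Bootstrap the fit to approximate parameter uncertainty, then add residual draws.
target = np.array([1.0, 1.8])                     # prediction at log10 TP = 1.8
preds = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    b, *_ = np.linalg.lstsq(A[idx], log_chla[idx], rcond=None)
    preds.append(target @ b + rng.normal(0, resid_sd))
lo, hi = np.percentile(preds, [5, 95])
print(f"90% predictive interval for log10 chl-a: [{lo:.2f}, {hi:.2f}]")
```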

  8. Electrochemical Processing Tools for Advanced Copper Interconnects: An Introduction

    NASA Astrophysics Data System (ADS)

    Datta, Madhav

    The change from vacuum-deposited aluminum to electroplated copper in 1997 brought about a paradigm shift in interconnect technology and in chip making [1]. Since then, most of the leading chip manufacturers have converted to electroplated Cu technology for chip interconnects. Cu interconnects are fabricated by the dual Damascene process, a metallization patterning process in which two insulator (dielectric) levels are patterned, filled with copper, and planarized to create a metal layer consisting of vias and lines. The process steps consist of laying a sandwich of two levels of insulator and etch stop layers that are patterned as holes for vias and troughs for lines. They are then filled with a single metallization step. Finally, the excess material is removed, and the wafer is planarized by chemical mechanical polishing (CMP). While finer details of the exact sequence of fabrication steps vary, the end result of forming a metal layer remains the same, in which vias are formed in the lower layer and trenches are formed in the upper layer. Electroplating enables deposition of Cu in via holes and overlying trenches in a single step, thus eliminating a via/line interface and significantly reducing the cycle time. For these reasons, and because of relatively inexpensive tooling, electroplating is a cost-effective and efficient process for Cu interconnects [2, 3]. Compared with vacuum deposition processes, electroplated Cu provides improved superfilling capabilities and exhibits abnormal grain growth phenomena. These properties contribute significantly to improved reliability of Cu interconnects. With the proper choice of additives and plating conditions, void-free, seam-free Damascene deposits are obtained, which eliminates surface-like fast diffusion paths for Cu electromigration.

  9. Advancing alternate tools: why science education needs CRP and CRT

    NASA Astrophysics Data System (ADS)

    Dodo Seriki, Vanessa

    2016-09-01

    Ridgeway and Yerrick's paper, Whose banner are we waving?: exploring STEM partnerships for marginalized urban youth, unearthed the tensions that existed between a local community "expert" and a group of students and their facilitator in an afterschool program. Those of us who work with youth who are traditionally marginalized understand the importance of teaching in culturally relevant ways, but far too often—as Ridgeway and Yerrick shared—community partners have beliefs, motives, and ideologies that are incompatible with the program's mission and goals. Nevertheless, we often enter partnerships assuming that the other party understands the needs of the students or community; understands how in U.S. society White is normative while all others are deficient; and understands how to engage with students in culturally relevant ways. This forum addresses the underlying assumption, described in the Ridgeway and Yerrick article, that educators—despite their background and experiences—are able to teach in culturally relevant ways. Additionally, I assert, based on the findings in the article, that just as Ladson-Billings and Tate (Teach Coll Rec 97(1):47-68, 1995) asserted, race in U.S. society, as a scholarly pursuit, was undertheorized. The same is true of science education; race in science education is undertheorized and the use of culturally relevant pedagogy and critical race theory as a pedagogical model and analytical tool, respectively, in science education is minimal. The increased use of both would impact our understanding of who does science, and how to broaden participation among people of color.

  10. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  11. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypothesis pertaining to various aspects of geostatistical investigations has been slow in developing. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
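
    One common way to adapt a classical t test to correlated observations is to replace the nominal sample size with an effective sample size derived from the lag-1 autocorrelation. The sketch below illustrates that idea; it is a generic AR(1)-based adjustment, not the specific procedures developed in the study.

```python
# Hedged sketch of the idea behind modifying a classical t test for correlated data:
# shrink the effective sample size using the lag-1 autocorrelation (AR(1) approximation).
import numpy as np
from scipy import stats

def effective_n(x):
    x = np.asarray(x, dtype=float)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    r1 = max(min(r1, 0.99), -0.99)
    return len(x) * (1 - r1) / (1 + r1)

def correlated_one_sample_t(x, mu0=0.0):
    n_eff = effective_n(x)
    t = (np.mean(x) - mu0) / (np.std(x, ddof=1) / np.sqrt(n_eff))
    p = 2 * stats.t.sf(abs(t), df=n_eff - 1)
    return t, p, n_eff

rng = np.random.default_rng(4)
# AR(1) series with true mean 0.3: nominal n = 200, but strongly autocorrelated.
x = np.zeros(200)
for i in range(1, 200):
    x[i] = 0.8 * x[i - 1] + rng.normal()
x += 0.3
print(correlated_one_sample_t(x))
```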

  12. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  13. Utility of the advanced chronic kidney disease patient management tools: case studies.

    PubMed

    Patwardhan, Meenal B; Matchar, David B; Samsa, Gregory P; Haley, William E

    2008-01-01

    Appropriate management of advanced chronic kidney disease (CKD) delays or limits its progression. The Advanced CKD Patient Management Toolkit was developed using a process-improvement technique to assist patient management and address CKD-specific management issues. We pilot tested the toolkit in 2 community nephrology practices, assessed the utility of individual tools, and evaluated the impact on conformance to an advanced CKD guideline through patient chart abstraction. Tool use was distinct in the 2 sites and depended on the site champion's involvement, the extent of process reconfiguration demanded by a tool, and its perceived value. Baseline conformance varied across guideline recommendations (averaged 54%). Posttrial conformance increased in all clinical areas (averaged 59%). Valuable features of the toolkit in real-world settings were its ability to: facilitate tool selection, direct implementation efforts in response to a baseline performance audit, and allow selection of tool versions and customizing them. Our results suggest that systematically created, multifaceted, and customizable tools can promote guideline conformance.

  14. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  15. Degrees of separation as a statistical tool for evaluating candidate genes.

    PubMed

    Nelson, Ronald M; Pettersson, Mats E

    2014-12-01

    Selection of candidate genes is an important step in the exploration of complex genetic architecture. The number of gene networks available is increasing and these can provide information to help with candidate gene selection. It is currently common to use the degree of connectedness in gene networks as validation in Genome Wide Association (GWA) and Quantitative Trait Locus (QTL) mapping studies. However, this practice can produce misleading results if not validated properly. Here we present a method and tool for validating the gene pairs from GWA studies given the context of the network they co-occur in. It ensures that proposed interactions and gene associations are not statistical artefacts inherent to the specific gene network architecture. The CandidateBacon package provides an easy and efficient method to calculate the average degree of separation (DoS) between pairs of genes in currently available gene networks. We show how these empirical estimates of average connectedness are used to validate candidate gene pairs. Validation of interacting genes by comparing their connectedness with the average connectedness in the gene network will provide support for said interactions by utilising the growing amount of gene network information available.
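
    The degree-of-separation comparison can be sketched with a generic graph library. The example below (which is not the CandidateBacon package) contrasts the average shortest-path distance of a few hypothetical candidate gene pairs with that of randomly drawn pairs in a random stand-in network.

```python
# Illustrative sketch (not the CandidateBacon package) of comparing the average degree
# of separation between candidate gene pairs with that of random pairs in a network.
import random
import networkx as nx

# Hypothetical gene interaction network.
G = nx.erdos_renyi_graph(n=500, p=0.01, seed=0)
genes = list(G.nodes)

def avg_separation(pairs, graph):
    dists = [nx.shortest_path_length(graph, a, b)
             for a, b in pairs if nx.has_path(graph, a, b)]
    return sum(dists) / len(dists) if dists else float("nan")

candidate_pairs = [(1, 2), (3, 4), (5, 6)]          # hypothetical GWA candidate pairs
random.seed(0)
random_pairs = [tuple(random.sample(genes, 2)) for _ in range(200)]

print("candidates:", avg_separation(candidate_pairs, G))
print("background:", avg_separation(random_pairs, G))
```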

  16. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
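
    A minimal numerical illustration of the Taguchi workflow is given below: an L4(2^3) orthogonal array, a larger-is-better signal-to-noise ratio, and main-effect estimates for each factor. The factor levels and yield values are invented for demonstration only.

```python
# Hedged sketch of the Taguchi approach: an L4(2^3) orthogonal array, a larger-is-better
# signal-to-noise ratio, and main-effect estimates per factor level (made-up responses).
import numpy as np

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 0/1).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Replicate responses (e.g. product yield) for each run of the array.
y = np.array([[78, 80], [85, 83], [90, 88], [70, 72]], dtype=float)

def sn_larger_is_better(values):
    return -10 * np.log10(np.mean(1.0 / values**2))

sn = np.array([sn_larger_is_better(row) for row in y])

# Main effect of each factor = mean S/N at level 1 minus mean S/N at level 0.
for f in range(L4.shape[1]):
    effect = sn[L4[:, f] == 1].mean() - sn[L4[:, f] == 0].mean()
    print(f"factor {f}: S/N main effect = {effect:.2f} dB")
```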

  17. Chemical indices and methods of multivariate statistics as a tool for odor classification.

    PubMed

    Mahlke, Ingo T; Thiesen, Peter H; Niemeyer, Bernd

    2007-04-01

    Industrial and agricultural off-gas streams are comprised of numerous volatile compounds, many of which have substantially different odorous properties. State-of-the-art waste-gas treatment includes the characterization of these molecules and is directed at, if possible, either the avoidance of such odorants during processing or the use of existing standardized air purification techniques like bioscrubbing or afterburning, which, however, often show low efficiency in ecological and economic terms. Selective odor separation from the off-gas streams could ease many of these disadvantages but is not yet widely applicable. Thus, the aim of this paper is to identify possible model substances in selective odor separation research from 155 volatile molecules mainly originating from livestock facilities, fat refineries, and cocoa and coffee production by knowledge-based methods. All compounds are examined with regard to their structure and information content using topological and information-theoretical indices. The resulting data are fitted into an observation matrix, and similarities between the substances are computed. Principal component analysis and k-means cluster analysis are conducted, showing that clustering of the index data can depict odor information that correlates well with molecular composition and molecular shape. Quantitative molecular description, along with the application of such statistical means, therefore provides a good tool for classifying malodorant structural properties with no thermodynamic data needed. The approximate look-alike shape of odorous compounds within the clusters suggests a fair choice of possible model molecules.
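
    The statistical core of this workflow (standardize the index matrix, project with PCA, cluster with k-means) is easy to sketch. The snippet below uses random numbers in place of the 155 molecules' topological and information-theoretical indices, so the cluster memberships are purely illustrative.

```python
# Illustrative sketch of the statistical part of the workflow: standardize a matrix of
# topological/information-theoretic descriptors, project with PCA, and cluster with k-means.
# The descriptor values here are random stand-ins, not the 155 odorants from the study.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_molecules, n_indices = 155, 12
descriptors = rng.normal(size=(n_molecules, n_indices))   # hypothetical index values

X = StandardScaler().fit_transform(descriptors)
scores = PCA(n_components=3).fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)

# Molecules sharing a cluster label are candidates for a common model substance.
print(np.bincount(labels))
```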

  18. Temporal Aspects of Surface Water Quality Variation Using Robust Statistical Tools

    PubMed Central

    Mustapha, Adamu; Aris, Ahmad Zaharin; Ramli, Mohammad Firuz; Juahir, Hafizan

    2012-01-01

    Robust statistical tools were applied to the water quality datasets with the aim of determining the most significant parameters and their contribution towards temporal water quality variation. Surface water samples were collected from four different sampling points during dry and wet seasons and analyzed for their physicochemical constituents. Discriminant analysis (DA) provided better results, with great discriminatory ability, using five parameters (P < 0.05) for the dry season, affording more than 96% correct assignation, and five and six parameters for forward and backward stepwise modes on the wet season data (P < 0.05), affording 68.20% and 82% correct assignation, respectively. Partial correlation results revealed that there are strong (rp = 0.829) and moderate (rp = 0.614) relationships between five-day biochemical oxygen demand (BOD5) and chemical oxygen demand (COD), and between total solids (TS) and dissolved solids (DS), controlling for the linear effect of nitrogen in the form of ammonia (NH3) and conductivity for the dry and wet seasons, respectively. Multiple linear regression identified the contribution of each variable with significant values r = 0.988, R2 = 0.976 and r = 0.970, R2 = 0.942 (P < 0.05) for the dry and wet seasons, respectively. A repeated-measures t-test confirmed that the surface water quality varies significantly between the seasons with significant value P < 0.05. PMID:22919302

  19. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  20. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only provided with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.

  2. Raman spectroscopy coupled with advanced statistics for differentiating menstrual and peripheral blood.

    PubMed

    Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K

    2014-01-01

    Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically (Raman mapping) acquired from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time-consuming and expensive.

  3. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  4. Monte Carlo Simulations in Statistical Physics -- From Basic Principles to Advanced Applications

    NASA Astrophysics Data System (ADS)

    Janke, Wolfhard

    2013-08-01

    This chapter starts with an overview of Monte Carlo computer simulation methodologies which are illustrated for the simple case of the Ising model. After reviewing importance sampling schemes based on Markov chains and standard local update rules (Metropolis, Glauber, heat-bath), nonlocal cluster-update algorithms are explained which drastically reduce the problem of critical slowing down at second-order phase transitions and thus improve the performance of simulations. How this can be quantified is explained in the section on statistical error analyses of simulation data including the effect of temporal correlations and autocorrelation times. Histogram reweighting methods are explained in the next section. Eventually, more advanced generalized ensemble methods (simulated and parallel tempering, multicanonical ensemble, Wang-Landau method) are discussed which are particularly important for simulations of first-order phase transitions and, in general, of systems with rare-event states. The setup of scaling and finite-size scaling analyses is the content of the following section. The chapter concludes with two advanced applications to complex physical systems. The first example deals with a quenched, diluted ferromagnet, and in the second application we consider the adsorption properties of macromolecules such as polymers and proteins to solid substrates. Such systems often require especially tailored algorithms for their efficient and successful simulation.
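
    The starting point of the chapter, the Metropolis single-spin-flip update for the 2D Ising model, can be written in a few lines. The sketch below is a minimal illustration (ferromagnetic coupling J = 1, periodic boundaries) rather than a production simulation with proper equilibration and error analysis.

```python
# Minimal sketch of the Metropolis single-spin-flip update for the 2D Ising model,
# the textbook starting point used before introducing cluster algorithms.
import numpy as np

def metropolis_sweep(spins, beta, rng):
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb          # energy change of flipping spin (i, j), J = 1
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(6)
L, beta = 32, 0.44                            # beta near the critical coupling ~0.4407
spins = rng.choice([-1, 1], size=(L, L))
for sweep in range(200):
    metropolis_sweep(spins, beta, rng)
print("magnetization per spin:", abs(spins.mean()))
```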

  5. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extension of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius

  6. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future.

  7. Measuring political commitment and opportunities to advance food and nutrition security: piloting a rapid assessment tool.

    PubMed

    Fox, Ashley M; Balarajan, Yarlini; Cheng, Chloe; Reich, Michael R

    2015-06-01

    Lack of political commitment has been identified as a primary reason for the low priority that food and nutrition interventions receive from national governments relative to the high disease burden caused by malnutrition. Researchers have identified a number of factors that contribute to food and nutrition's 'low-priority cycle' on national policy agendas, but few tools exist to rapidly measure political commitment and identify opportunities to advance food and nutrition on the policy agenda. This article presents a theory-based rapid assessment approach to gauging countries' level of political commitment to food and nutrition security and identifying opportunities to advance food and nutrition on the policy agenda. The rapid assessment tool was piloted among food and nutrition policymakers and planners in 10 low- and middle-income countries in April to June 2013. Food and nutrition commitment and policy opportunity scores were calculated for each country and strategies to advance food and nutrition on policy agendas were designed for each country. The article finds that, in a majority of countries, political leaders had verbally and symbolically committed to addressing food and nutrition, but adequate financial resources were not allocated to implement specific programmes. In addition, whereas the low cohesion of the policy community has been viewed a major underlying cause of the low-priority status of food and nutrition, the analysis finds that policy community cohesion and having a well thought-out policy alternative were present in most countries. This tool may be useful to policymakers and planners providing information that can be used to benchmark and/or evaluate advocacy efforts to advance reforms in the food and nutrition sector; furthermore, the results can help identify specific strategies that can be employed to move the food and nutrition agenda forward. This tool complements others that have been recently developed to measure national commitment to

  8. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  9. Mnemonic Aids during Tests: Worthless Frivolity or Effective Tool in Statistics Education?

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.; Gorman, Jennifer

    2012-01-01

    Researchers have explored many pedagogical approaches in an effort to assist students in finding understanding and comfort in required statistics courses. This study investigates the impact of mnemonic aids used during tests on students' statistics course performance in particular. In addition, the present study explores several hypotheses that…

  10. Statistically Optimal Approximations of Astronomical Signals: Implications to Classification and Advanced Study of Variable Stars

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.

    2016-06-01

    We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observations, which is a common case for large photometric surveys. These methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as "running" or "sub-interval" local approximations, were previously self-reviewed in (2003ASPC..292..391A). For an approximation of the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree and initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods self-reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial Fit (TP) or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows one to determine the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars were studied by our group using these methods in the frame of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).
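
    The idea of a trigonometric polynomial fit of statistically optimal degree can be illustrated as follows: fit TP models of increasing degree to an irregularly sampled, phased light curve and keep the degree preferred by an information criterion. The sketch uses the BIC and synthetic data; the authors' programs use their own optimality criterion and period-correction machinery.

```python
# Hedged sketch of fitting trigonometric polynomials of increasing degree to an
# irregularly sampled, phased light curve and choosing the degree by an information
# criterion (BIC here, as a stand-in for the authors' optimality criterion).
import numpy as np

rng = np.random.default_rng(7)
n, period = 180, 0.57
t = np.sort(rng.uniform(0, 30, n))                        # irregular observation times
phase = (t / period) % 1.0
mag = (12.0 + 0.3 * np.cos(2 * np.pi * phase) + 0.1 * np.sin(4 * np.pi * phase)
       + rng.normal(0, 0.02, n))

def tp_design(phase, degree):
    cols = [np.ones_like(phase)]
    for k in range(1, degree + 1):
        cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
    return np.column_stack(cols)

best = None
for degree in range(1, 9):
    A = tp_design(phase, degree)
    coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
    rss = np.sum((mag - A @ coef) ** 2)
    k_params = A.shape[1]
    bic = n * np.log(rss / n) + k_params * np.log(n)
    if best is None or bic < best[0]:
        best = (bic, degree)
print("statistically preferred TP degree:", best[1])
```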

  11. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  12. CONFERENCE NOTE: International Workshop on Advanced Mathematical Tools in Metrology, Villa Gualino, Torino, Italy, 20 22 October 1993

    NASA Astrophysics Data System (ADS)

    1993-01-01

    Preliminary Programme. The three-day programme features approximately twenty-five invited contributions. Participants may present a poster on the topic "Applications for Industrial Measurements", concerning applied mathematics, software development and computer-based measurements.
    20 October: Two plenary talks on mathematical methods and metrological applications. "Numerical Methods and Modelling": partial differential equations and integral equations; methods of identification and validation; algorithms for approximation; geometrical shape determination of industrial solids; Round Table.
    21 October: "Data Analysis": spectral analysis and wavelets; calibration of precision instrumentation; comparison measurement of standards; statistical methods in metrology; robust estimation and outliers; applications of the bootstrap method; Round Table.
    22 October (in cooperation with SIMAI and ASP): "Applications for Industrial Measurements": data acquisition; measurement software, standard computational modules and their validation; Round Table; industrial presentations; discussion of poster presentations; conclusions.
    Lecturers: mathematicians from the international metrological community; mathematicians from Italian universities (Politecnico of Torino, Milano, Università di Genova, Milano, Padova, Roma, Trento); scientists and mathematicians from national standards institutes and the Italian National Research Council.
    The workshop will be of interest to people in universities, research centres and industry who are involved in measurement and need advanced mathematical tools to solve their problems, and to those who work in the development of these mathematical tools. Metrology is concerned with measurement at the highest level of precision. Advances in metrology depend on many factors: improvements in scientific and technical knowledge, instrumentation quality, better use of advanced mathematical tools and the development of new tools. In some countries, metrological institutions have a tradition of

  13. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries or even rotationally-asymmetric shapes which result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools which can effectively evaluate the optical wave propagation through 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack advanced multi-objective global optimization capability which allows the user to explicitly view the trade-offs between all design objectives such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools and examples which minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.

  14. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  15. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers were interviewed, as were seven of the students' parents. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these

  16. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
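
    A worked example of the Bayes' theorem pillar (with hypothetical test characteristics) shows why such a course stresses prevalence: the same test yields very different post-test probabilities at different base rates.

```python
# Small worked example of applying Bayes' theorem to a diagnostic test (numbers hypothetical):
# the probability of disease given a positive test depends strongly on prevalence.
def positive_predictive_value(prevalence, sensitivity, specificity):
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1.0 - specificity
    p_pos = prevalence * p_pos_given_disease + (1 - prevalence) * p_pos_given_healthy
    return prevalence * p_pos_given_disease / p_pos

for prev in (0.001, 0.01, 0.1):
    ppv = positive_predictive_value(prev, sensitivity=0.95, specificity=0.95)
    print(f"prevalence {prev:.1%} -> P(disease | positive test) = {ppv:.1%}")
```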

  18. Cellulosome-based, Clostridium-derived multi-functional enzyme complexes for advanced biotechnology tool development: advances and applications.

    PubMed

    Hyeon, Jeong Eun; Jeon, Sang Duck; Han, Sung Ok

    2013-11-01

    The cellulosome is one of nature's most elegant and elaborate nanomachines and a key biological and biotechnological macromolecule that can be used as a multi-functional protein complex tool. Each protein module in the cellulosome system is potentially useful in an advanced biotechnology application. The high-affinity interactions between the cohesin and dockerin domains can be used in protein-based biosensors to improve both sensitivity and selectivity. The scaffolding protein includes a carbohydrate-binding module (CBM) that attaches strongly to cellulose substrates and facilitates the purification of proteins fused with the dockerin module through a one-step CBM purification method. Although the surface layer homology (SLH) domain of CbpA is not present in other strains, replacement of the cell surface anchoring domain allows a foreign protein to be displayed on the surface of other strains. The development of a hydrolysis enzyme complex is a useful strategy for consolidated bioprocessing (CBP), endowing microorganisms with biomass hydrolysis activity. Thus, the development of various configurations of multi-functional protein complexes for use as tools in whole-cell biocatalyst systems has drawn considerable attention as an attractive strategy for bioprocess applications. This review provides a detailed summary of the current achievements in Clostridium-derived multi-functional complex development and the impact of these complexes in various areas of biotechnology.

  19. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS; Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  20. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  1. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    One of the problems in information storage and retrieval systems of technical documents is the interpretation of words used to index documents. Semantic tools, defined as channels for the communication of word meanings between technical experts, document indexers, and searchers, provide one method of dealing with the problem of multiple…

  2. Statistical description of the macrostructure of diamond-containing powder tool materials

    NASA Astrophysics Data System (ADS)

    Vinokurov, G. G.; Sharin, P. P.; Popov, O. N.

    2015-12-01

    The macrostructure of diamond-containing tool material has been investigated. The potentials of application of a cluster theory for processing a digital metallographic image of a diamond-containing powder material are substantiated. It is proposed to consider agglomerates of diamond grains to estimate the heterogeneity of a two-phase macrostructure.

  3. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.
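
    The sketch below illustrates, in the most generic terms, the idea of propagating submodel parameter uncertainty through a process model by Monte Carlo sampling; the one-line "process model", the parameter distributions, and the use of NumPy are assumptions made for illustration and do not represent the FOQUS code or its API.

        # Minimal Monte Carlo uncertainty-propagation sketch (hypothetical model).
        import numpy as np

        rng = np.random.default_rng(0)

        def capture_efficiency(k, dH):
            """Toy surrogate: efficiency as a function of a rate constant k and a
            heat of reaction dH (arbitrary units); a stand-in for a flowsheet model."""
            return 0.9 * (1.0 - np.exp(-k)) * np.exp(-0.01 * (dH - 60.0) ** 2)

        # Uncertain submodel parameters, represented by assumed distributions.
        k_samples = rng.lognormal(mean=0.5, sigma=0.2, size=10_000)
        dH_samples = rng.normal(loc=60.0, scale=5.0, size=10_000)

        eff = capture_efficiency(k_samples, dH_samples)
        print("mean efficiency: %.3f" % eff.mean())
        print("95%% interval: [%.3f, %.3f]" % tuple(np.percentile(eff, [2.5, 97.5])))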

  4. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  5. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  6. Genetic susceptibility and gastric cancer risk: the importance of meta-analyses as a statistical tool.

    PubMed

    García-González, María Asunción; Lanas, Angel

    2014-01-01

    Gastric cancer (GC) is a complex disease and a worldwide health burden due to its high prevalence and poor prognosis. A deeper knowledge of the factors involved in the development and progression of GC could help to identify subpopulations at risk that therefore require surveillance or early treatment strategies. Current research is based on the study of genetic variants that confer a higher risk of GC and their interactions with environmental exposure. Recently, meta-analysis has emerged as an important statistical method involving pooling of data from individual association studies to increase statistical power and obtain more conclusive results. Given the importance of chronic inflammation in the process of gastric carcinogenesis, the present article reviews the most recent meta-analyses of the contribution of cytokine gene polymorphisms to GC risk.
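
    The pooling step that underlies such meta-analyses can be sketched as a fixed-effect, inverse-variance weighting of log odds ratios; the per-study odds ratios and confidence intervals below are invented for illustration and are not drawn from the studies reviewed.

        # Fixed-effect inverse-variance pooling of hypothetical per-study odds ratios.
        import numpy as np

        odds_ratios = np.array([1.30, 1.10, 1.45, 0.95])
        ci_lower = np.array([1.05, 0.85, 1.10, 0.70])
        ci_upper = np.array([1.61, 1.42, 1.91, 1.29])

        log_or = np.log(odds_ratios)
        se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)  # SE on the log scale
        w = 1.0 / se ** 2                                        # inverse-variance weights

        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        print("pooled OR = %.2f (95%% CI %.2f-%.2f)" % (
            np.exp(pooled),
            np.exp(pooled - 1.96 * pooled_se),
            np.exp(pooled + 1.96 * pooled_se)))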

  7. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of the AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic-scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation to recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  8. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  9. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  10. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequential carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology, which were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component is then the product of total fuel consumption, the percentage reduction of the component, and the energy audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other
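
    The savings calculation described above reduces to a simple product; the sketch below applies it with entirely hypothetical inputs (the report's actual consumption totals, audit proportions, and survey-based reduction estimates are not reproduced here).

        # Component savings = total fuel consumption x audit share x fractional reduction.
        total_fuel_gal = 25_000_000_000   # annual fleet fuel consumption, gallons (hypothetical)
        audit_share_drag = 0.20           # fraction of fuel use attributed to drag (hypothetical)
        drag_reduction = 0.05             # fractional drag reduction from advanced tooling (hypothetical)

        savings_gal = total_fuel_gal * audit_share_drag * drag_reduction
        print("estimated savings: %.1f million gallons/year" % (savings_gal / 1e6))
        # -> 250.0 million gallons/year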

  11. genipe: an automated genome-wide imputation pipeline with automatic reporting and statistical tools.

    PubMed

    Lemieux Perreault, Louis-Philippe; Legault, Marc-André; Asselin, Géraldine; Dubé, Marie-Pierre

    2016-12-01

    Genotype imputation is now commonly performed following genome-wide genotyping experiments. Imputation increases the density of analyzed genotypes in the dataset, enabling fine-mapping across the genome. However, the process of imputation using the most recent publicly available reference datasets can require considerable computation power and the management of hundreds of large intermediate files. We have developed genipe, a complete genome-wide imputation pipeline which includes automatic reporting, imputed data indexing and management, and a suite of statistical tests for imputed data commonly used in genetic epidemiology (Sequence Kernel Association Test, Cox proportional hazards for survival analysis, and linear mixed models for repeated measurements in longitudinal studies).
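
    As an illustration of one of the statistical tests named above, the sketch below fits a Cox proportional hazards model to an imputed allele dosage using the pandas and lifelines packages on simulated data; it reflects generic usage of those packages, not genipe's own implementation or command-line interface.

        # Hedged sketch: per-variant Cox proportional hazards test on simulated data.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n = 500
        df = pd.DataFrame({
            "dosage": rng.uniform(0, 2, n),    # imputed allele dosage in [0, 2]
            "age": rng.normal(60, 8, n),       # covariate
            "time": rng.exponential(10, n),    # follow-up time (hypothetical units)
            "event": rng.integers(0, 2, n),    # 1 = event observed, 0 = censored
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        print(cph.summary[["coef", "exp(coef)", "p"]])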

  12. Design and contents of an advanced distance-based statistics course for a PhD in nursing program.

    PubMed

    Azuero, Andres; Wilbanks, Bryan; Pryor, Erica

    2013-01-01

    Doctoral nursing students and researchers are expected to understand, critique, and conduct research that uses advanced quantitative methodology. The authors describe the design and contents of a distance-based course in multivariate statistics for PhD students in nursing and health administration, compare the design to recommendations found in the literature for distance-based statistics education, and compare the course contents to a tabulation of the methodologies used in a sample of recently published quantitative dissertations in nursing. The authors conclude with a discussion based on these comparisons as well as with experiences in course implementation and directions for future course development.

  13. Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996. Statistics in Brief.

    ERIC Educational Resources Information Center

    Heaviside, Sheila; And Others

    The "Survey of Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996" collected information from 911 regular United States public elementary and secondary schools regarding the availability and use of advanced telecommunications, and in particular, access to the Internet, plans to obtain Internet access, use of…

  14. Computational AstroStatistics: fast and efficient tools for analysing huge astronomical data sources

    NASA Astrophysics Data System (ADS)

    Nichol, Robert C.; Chong, S.; Connolly, A. J.; Davies, S.; Genovese, C.; Hopkins, A. M.; Miller, C. J.; Moore, A. W.; Pelleg, D.; Richards, G. T.; Schneider, J.; Szapudi, I.; Wasserman, L.

    I present here a review of past and present multi-disciplinary research of the Pittsburgh Computational AstroStatistics (PiCA) group. This group is dedicated to developing fast and efficient statistical algorithms for analysing huge astronomical data sources. I begin with a short review of multi-resolutional kd-trees, which are the building blocks for many of our algorithms, for example quick range queries and fast N-point correlation functions. I will present new results from the use of Mixture Models (Connolly et al. 2000) in density estimation of multi-color data from the Sloan Digital Sky Survey (SDSS), specifically the selection of quasars and the automated identification of X-ray sources. I will also present a brief overview of the False Discovery Rate (FDR) procedure (Miller et al. 2001) and show how it has been used in the detection of "Baryon Wiggles" in the local galaxy power spectrum and source identification in radio data. Finally, I will look forward to new research on an automated Bayes Network anomaly detector and the possible use of the Locally Linear Embedding algorithm (LLE; Roweis & Saul 2000) for spectral classification of SDSS spectra.
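
    A minimal example of the kd-tree operations mentioned above (a range query and the pair counting that underlies N-point correlation functions) is sketched below using scipy's cKDTree on synthetic positions; it is not the PiCA group's code.

        # kd-tree range query and pair counting on synthetic 2-D positions.
        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(42)
        points = rng.uniform(0.0, 1.0, size=(100_000, 2))

        tree = cKDTree(points)

        # Range query: indices of all points within radius r of the first point.
        neighbours = tree.query_ball_point(points[0], r=0.01)

        # Pair counting against itself within radius r: the same count a
        # brute-force O(N^2) loop would give, obtained in roughly O(N log N).
        pair_count = tree.count_neighbors(tree, r=0.01)
        print(len(neighbours), pair_count)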

  15. COLLABORATIVE RESEARCH:USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    SciTech Connect

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  16. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-01-01

    Laser vision: lidar as a transformative tool to advance critical zone science. Observation and quantification of the Earth surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of Critical Zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and ecosphere shape and maintain the "zone of life", extending from the groundwater to the vegetation canopy. Lidar holds promise as a transdisciplinary CZ research tool by simultaneously allowing for quantification of topographic, vegetative, and hydrological data. Researchers are just beginning to utilize lidar datasets to answer synergistic questions in CZ science, such as how landforms and soils develop in space and time as a function of the local climate, biota, hydrologic properties, and lithology. This review's objective is to demonstrate the transformative potential of lidar by critically assessing both challenges and opportunities for transdisciplinary lidar applications. A review of 147 peer-reviewed studies utilizing lidar showed that 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % have an interdisciplinary focus. We find that using lidar to its full potential will require numerous advances across CZ applications, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically-based models and complementary in situ and remote-sensing observations. We provide a five-year vision to utilize and advocate for the expanded use of lidar datasets to benefit CZ science applications.

  17. Completion of the Edward Air Force Base Statistical Guidance Wind Tool

    NASA Technical Reports Server (NTRS)

    Dreher, Joseph G.

    2008-01-01

    The goal of this task was to develop a GUI using EAFB wind tower data similar to the KSC SLF peak wind tool that is already in operations at SMG. In 2004, MSFC personnel began work to replicate the KSC SLF tool using several wind towers at EAFB. They completed the analysis and QC of the data, but due to higher priority work did not start development of the GUI. MSFC personnel calculated wind climatologies and probabilities of 10-minute peak wind occurrence based on the 2-minute average wind speed for several EAFB wind towers. Once the data were QC'ed and analyzed, the climatologies were calculated following the methodology outlined in Lambert (2003). The climatologies were calculated for each tower and month, and then were stratified by hour, direction (10° sectors), and direction (45° sectors)/hour. For all climatologies, MSFC calculated the mean, standard deviation, and observation counts of the 2-minute average and 10-minute peak wind speeds. MSFC personnel also calculated empirical and modeled probabilities of meeting or exceeding specific 10-minute peak wind speeds using PDFs. The empirical PDFs were asymmetrical and bounded on the left by the 2-minute average wind speed. They calculated the parametric PDFs by fitting the GEV distribution to the empirical distributions. Parametric PDFs were calculated in order to smooth and interpolate over variations in the observed values due to possible under-sampling of certain peak winds and to estimate probabilities associated with average winds outside the observed range. MSFC calculated the individual probabilities of meeting or exceeding specific 10-minute peak wind speeds by integrating the area under each curve. The probabilities assist SMG forecasters in assessing the shuttle FR for various 2-minute average wind speeds. The AMU obtained the processed EAFB data from Dr. Lee Burns of MSFC and reformatted them for input to Excel PivotTables, which allow users to display different values with point
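
    The GEV-based exceedance step described above can be sketched with scipy.stats.genextreme on synthetic peak-wind samples; the data, threshold, and fitted parameters below are invented for illustration and do not reproduce the MSFC climatologies.

        # Fit a GEV to synthetic 10-minute peak winds and compute an exceedance probability.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Hypothetical peak wind speeds (kt) for one 2-minute average-speed bin.
        peaks = 15 + rng.gumbel(loc=0.0, scale=3.0, size=500)

        shape, loc, scale = stats.genextreme.fit(peaks)

        threshold = 25.0   # knots
        p_exceed = stats.genextreme.sf(threshold, shape, loc=loc, scale=scale)
        print("P(peak >= %.0f kt) = %.3f" % (threshold, p_exceed))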

  18. PECA: a novel statistical tool for deconvoluting time-dependent gene expression regulation.

    PubMed

    Teo, Guoshou; Vogel, Christine; Ghosh, Debashis; Kim, Sinae; Choi, Hyungwon

    2014-01-03

    Protein expression varies as a result of intricate regulation of synthesis and degradation of messenger RNAs (mRNA) and proteins. Studies of dynamic regulation typically rely on time-course data sets of mRNA and protein expression, yet there are no statistical methods that integrate these multiomics data and deconvolute individual regulatory processes of gene expression control underlying the observed concentration changes. To address this challenge, we developed Protein Expression Control Analysis (PECA), a method to quantitatively dissect protein expression variation into the contributions of mRNA synthesis/degradation and protein synthesis/degradation, termed RNA-level and protein-level regulation respectively. PECA computes the rate ratios of synthesis versus degradation as the statistical summary of expression control during a given time interval at each molecular level and computes the probability that the rate ratio changed between adjacent time intervals, indicating regulation change at the time point. Along with the associated false-discovery rates, PECA gives the complete description of dynamic expression control, that is, which proteins were up- or down-regulated at each molecular level and each time point. Using PECA, we analyzed two yeast data sets monitoring the cellular response to hyperosmotic and oxidative stress. The rate ratio profiles reported by PECA highlighted a large magnitude of RNA-level up-regulation of stress response genes in the early response and concordant protein-level regulation with time delay. However, the contributions of RNA- and protein-level regulation and their temporal patterns were different between the two data sets. We also observed several cases where protein-level regulation counterbalanced transcriptomic changes in the early stress response to maintain the stability of protein concentrations, suggesting that proteostasis is a proteome-wide phenomenon mediated by post-transcriptional regulation.

  19. Exon array data analysis using Affymetrix power tools and R statistical software

    PubMed Central

    2011-01-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform. PMID:21498550

  20. Exon array data analysis using Affymetrix power tools and R statistical software.

    PubMed

    Lockstone, Helen E

    2011-11-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform.

  1. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.

    PubMed

    Rashid, Mamoon; Stingl, Ulrich

    2015-12-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology.

  2. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  3. Molecular tools for functional genomics in filamentous fungi: recent advances and new strategies.

    PubMed

    Jiang, Dewei; Zhu, Wei; Wang, Yunchuan; Sun, Chang; Zhang, Ke-Qin; Yang, Jinkui

    2013-12-01

    Advances in genetic transformation techniques have made important contributions to molecular genetics. Various molecular tools and strategies have been developed for functional genomic analysis of filamentous fungi since the first DNA transformation was successfully achieved in Neurospora crassa in 1973. Increasing amounts of genomic data regarding filamentous fungi are continuously reported and large-scale functional studies have become common in a wide range of fungal species. In this review, various molecular tools used in filamentous fungi are compared and discussed, including methods for genetic transformation (e.g., protoplast transformation, electroporation, and microinjection), the construction of random mutant libraries (e.g., restriction enzyme mediated integration, transposon arrayed gene knockout, and Agrobacterium tumefaciens mediated transformation), and the analysis of gene function (e.g., RNA interference and transcription activator-like effector nucleases). We also focused on practical strategies that could enhance the efficiency of genetic manipulation in filamentous fungi, such as choosing a proper screening system and marker genes, assembling target-cassettes or vectors effectively, and transforming into strains that are deficient in the nonhomologous end joining pathway. In summary, we present an up-to-date review on the different molecular tools and latest strategies that have been successfully used in functional genomics in filamentous fungi.

  4. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a decision-support data-derived tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of the MisTec. The MisTec was initially designed for a manned Mars mission, but it can be adapted to support other high-technology long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  5. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is forwarded, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  6. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
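
    A small illustration of the kernel idea, assuming a simulated genotype matrix: form a linear (GRM-style) similarity kernel and confirm positive semidefiniteness through its eigenvalues. This is a generic sketch, not a kernel from the companion paper.

        # Linear genomic-similarity kernel on simulated genotypes, with a PSD check.
        import numpy as np

        rng = np.random.default_rng(3)
        n_subjects, n_snps = 50, 1_000
        genotypes = rng.binomial(2, 0.3, size=(n_subjects, n_snps)).astype(float)

        # Centre and scale each SNP, then form the kernel K = Z Z^T / m.
        freqs = genotypes.mean(axis=0) / 2.0
        Z = (genotypes - 2 * freqs) / np.sqrt(2 * freqs * (1 - freqs))
        K = Z @ Z.T / n_snps

        eigvals = np.linalg.eigvalsh(K)
        print("smallest eigenvalue: %.2e (non-negative up to rounding, so K is PSD)"
              % eigvals.min())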

  7. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    SciTech Connect

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-05-01

    The statistics of the measured magnetic data of 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive designs for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section and the quadrupole and sextupole cross sections have 180° and 120° symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.

  8. Complex Spine Pathology Simulator: An Innovative Tool for Advanced Spine Surgery Training.

    PubMed

    Gragnaniello, Cristian; Abou-Hamden, Amal; Mortini, Pietro; Colombo, Elena V; Bailo, Michele; Seex, Kevin A; Litvack, Zachary; Caputy, Anthony J; Gagliardi, Filippo

    2016-11-01

    Background: Technical advancements in spine surgery have made possible the treatment of increasingly complex pathologies with less morbidity. Time constraints in surgeons' training have made it necessary to develop new training models for spine pathology. Objective: To describe the application of a novel compound, Stratathane resin ST-504 derived polymer (SRSDP), that can be injected at different spinal target locations to mimic spinal epidural, subdural extra-axial, and intra-axial pathologies for use in advanced surgical training. Material and Methods: Fresh-frozen thoracolumbar and cervical spine segments of human and sheep cadavers were used to study the model. SRSDP is initially liquid after mixing, allowing it to be injected into target areas where it expands and solidifies, mimicking the entire spectrum of spinal pathologies. Results: Different polymer concentrations have been codified to vary adhesiveness, texture, spread capability, deformability, and radiologic visibility. Polymer injection was performed under fluoroscopic guidance through pathology-specific injection sites that avoided compromising the surgical approach for subsequent excision of the artificial lesion. Inflation of a balloon catheter of the desired size was used to displace stiff cadaveric neurovascular structures to mimic pathology-related mass effect. Conclusion: The traditional cadaveric training models principally only allow surgeons to practice the surgical approach. The complex spine pathology simulator is a novel educational tool that in a user-friendly, low-cost fashion allows trainees to practice advanced technical skills in the removal of complex spine pathology, potentially shortening some of the aspects of the learning curve of operative skills that may otherwise take many years to acquire.

  9. DNA technological progress toward advanced diagnostic tools to support human hookworm control.

    PubMed

    Gasser, R B; Cantacessi, C; Loukas, A

    2008-01-01

    Blood-feeding hookworms are parasitic nematodes of major human health importance. Currently, it is estimated that 740 million people are infected worldwide, and more than 80 million of them are severely affected clinically by hookworm disease. In spite of the health problems caused and the advances toward the development of vaccines against some hookworms, limited attention has been paid to the need for improved, practical methods of diagnosis. Accurate diagnosis and genetic characterization of hookworms is central to their effective control. While traditional diagnostic methods have considerable limitations, there has been some progress toward the development of molecular-diagnostic tools. The present article provides a brief background on hookworm disease of humans, reviews the main methods that have been used for diagnosis and describes progress in establishing polymerase chain reaction (PCR)-based methods for the specific diagnosis of hookworm infection and the genetic characterisation of the causative agents. This progress provides a foundation for the rapid development of practical, highly sensitive and specific diagnostic and analytical tools to be used in improved hookworm prevention and control programmes.

  10. MATISSE: Multi-purpose Advanced Tool for Instruments for the Solar System Exploration .

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Antonelli, L. A.

    In planetary sciences, designing, assembling, and launching onboard instruments are only preliminary steps toward the final aim of converting data into scientific knowledge, as the real challenge is the data analysis and interpretation. Up to now, data have generally been stored in "old style" archives, i.e., common FTP servers where the user can manually search for data by browsing directories organized in time order. However, as the datasets to be stored and searched become particularly large, this latter task absorbs a great part of the time, subtracting time from the real scientific work. In order to reduce the time spent searching for and analyzing data, MATISSE (Multi-purpose Advanced Tool for Instruments for the Solar System Exploration), a new set of software tools developed together with the scientific teams of the instruments involved, is under development at ASDC (ASI Science Data Center), whose experience in space mission data management is well known (e.g., verrecchia07, pittori09, giommi09, massaro11); its features and aims are presented here.

  11. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B. A.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-06-01

    Observation and quantification of the Earth's surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of critical zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and biosphere shape and maintain the "zone of life", which extends from the top of unweathered bedrock to the top of the vegetation canopy. Fundamental to CZ science is the development of transdisciplinary theories and tools that transcend disciplines and inform each other's work, capture new levels of complexity, and create new intellectual outcomes and spaces. Researchers are just beginning to use lidar data sets to answer synergistic, transdisciplinary questions in CZ science, such as how CZ processes co-evolve over long timescales and interact over shorter timescales to create thresholds, shifts in states and fluxes of water, energy, and carbon. The objective of this review is to elucidate the transformative potential of lidar for CZ science to simultaneously allow for quantification of topographic, vegetative, and hydrological processes. A review of 147 peer-reviewed lidar studies highlights a lack of lidar applications for CZ studies as 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % had an interdisciplinary focus. A handful of exemplar transdisciplinary studies demonstrate that lidar data sets that are well integrated with other observations can lead to fundamental advances in CZ science, such as identification of feedbacks between hydrological and ecological processes over hillslope scales and the synergistic co-evolution of landscape-scale CZ structure due to interactions amongst carbon, energy, and water cycles

  12. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  13. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry-driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling
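
    Of the three approaches compared, only the ordinary t test is easy to sketch without Bioconductor; the example below applies it feature-by-feature to a simulated triplicate data set with missing values, simply to show how missingness erodes testability. The limma and rank product methods themselves are not reproduced here.

        # Per-feature t tests on simulated triplicate data with missing values.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        n_features = 2_000
        ctrl = rng.normal(0.0, 1.0, size=(n_features, 3))
        treat = rng.normal(0.0, 1.0, size=(n_features, 3))
        treat[:100] += 2.0            # the first 100 features truly change

        # Introduce roughly 50% missing values, as in the sparse data sets above.
        for arr in (ctrl, treat):
            arr[rng.random(arr.shape) < 0.5] = np.nan

        # Features left with fewer than two valid values in a group give NaN
        # p-values and silently drop out of the count below.
        t, p = stats.ttest_ind(ctrl, treat, axis=1, nan_policy="omit")
        print("features with p < 0.01:", int(np.sum(p < 0.01)))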

  14. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  15. A Kernel of Truth: Statistical Advances in Polygenic Variance Component Models for Complex Human Pedigrees

    PubMed Central

    Blangero, John; Diego, Vincent P.; Dyer, Thomas D.; Almeida, Marcio; Peralta, Juan; Kent, Jack W.; Williams, Jeff T.; Almasy, Laura; Göring, Harald H. H.

    2014-01-01

    Statistical genetic analysis of quantitative traits in large pedigrees is a formidable computational task due to the necessity of taking the non-independence among relatives into account. With the growing awareness that rare sequence variants may be important in human quantitative variation, heritability and association study designs involving large pedigrees will increase in frequency due to the greater chance of observing multiple copies of rare variants amongst related individuals. Therefore, it is important to have statistical genetic test procedures that utilize all available information for extracting evidence regarding genetic association. Optimal testing for marker/phenotype association involves the exact calculation of the likelihood ratio statistic, which requires the repeated inversion of potentially large matrices. In a whole genome sequence association context, such computation may be prohibitive. Toward this end, we have developed a rapid and efficient eigensimplification of the likelihood that makes analysis of family data commensurate with the analysis of a comparable sample of unrelated individuals. Our theoretical results, which are based on a spectral representation of the likelihood, yield simple exact expressions for the expected likelihood ratio test statistic (ELRT) for pedigrees of arbitrary size and complexity. For heritability, the ELRT is −∑ ln[1 + ĥ²(λ_gi − 1)], where ĥ² and λ_gi are, respectively, the heritability and the eigenvalues of the pedigree-derived genetic relationship kernel (GRK). For association analysis of sequence variants, the ELRT is given by ELRT[h_q² > 0 : unrelateds] − (ELRT[h_t² > 0 : pedigrees] − ELRT[h_r² > 0 : pedigrees]), where h_t², h_q², and h_r² are the total, quantitative-trait-nucleotide, and residual heritabilities, respectively. Using these results, fast and accurate analytical power analyses are possible, eliminating the need for computer simulation. Additional benefits of eigensimplification include a simple method for
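
    The heritability expression quoted above can be evaluated directly once the eigenvalues of the GRK are known; the sketch below does so on a simulated kernel with an assumed heritability of 0.4 (illustrative only, not the authors' software).

        # Expected LRT for heritability: -sum(log(1 + h2 * (eigenvalue - 1))).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 200
        # Simulated positive semidefinite relationship kernel with unit diagonal.
        A = rng.normal(size=(n, 2 * n))
        K = A @ A.T
        d = np.sqrt(np.diag(K))
        K = K / np.outer(d, d)

        lam = np.linalg.eigvalsh(K)    # eigenvalues of the simulated GRK
        h2 = 0.4                       # assumed heritability

        elrt = -np.sum(np.log1p(h2 * (lam - 1.0)))
        print("expected LRT statistic for h2 = %.1f: %.1f" % (h2, elrt))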

  16. Live-site UXO classification studies using advanced EMI and statistical models

    NASA Astrophysics Data System (ADS)

    Shamatava, I.; Shubitidze, F.; Fernandez, J. P.; Bijamov, A.; Barrowes, B. E.; O'Neill, K.

    2011-06-01

    In this paper we present the inversion and classification performance of the advanced EMI inversion, processing and discrimination schemes developed by our group when applied to the ESTCP Live-Site UXO Discrimination Study carried out at the former Camp Butner in North Carolina. The advanced models combine: 1) the joint diagonalization (JD) algorithm to estimate the number of potential anomalies from the measured data without inversion, 2) the ortho-normalized volume magnetic source (ONVMS) to represent targets' EMI responses and extract their intrinsic "feature vectors," and 3) the Gaussian mixture algorithm to classify buried objects as targets of interest or not starting from the extracted discrimination features. The studies are conducted using cued datasets collected with the next-generation TEMTADS and MetalMapper (MM) sensor systems. For the cued TEMTADS datasets we first estimate the data quality and the number of targets contributing to each signal using the JD technique. Once we know the number of targets we proceed to invert the data using a standard non-linear optimization technique in order to determine intrinsic parameters such as the total ONVMS for each potential target. Finally we classify the targets using a library-matching technique. The MetalMapper data are all inverted as multi-target scenarios, and the resulting intrinsic parameters are grouped using an unsupervised Gaussian mixture approach. The potential targets of interest are a 37-mm projectile, an M48 fuze, and a 105-mm projectile. During the analysis we requested the ground truth for a few selected anomalies to assist in the classification task. Our results were scored independently by the Institute for Defense Analyses, who revealed that our advanced models produce superb classification when starting from either TEMTADS or MM cued datasets.
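
    The unsupervised grouping step can be sketched with scikit-learn's GaussianMixture on made-up two-dimensional feature vectors; the ONVMS feature extraction, joint diagonalization, and library matching are not reproduced, and the cluster layout below is purely illustrative.

        # Gaussian mixture grouping of synthetic "intrinsic feature" vectors.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(9)
        features = np.vstack([
            rng.normal([2.0, 0.5], 0.1, size=(30, 2)),    # one target-like cluster
            rng.normal([4.0, 1.5], 0.1, size=(20, 2)),    # a second target-like cluster
            rng.normal([0.5, 0.2], 0.4, size=(200, 2)),   # diffuse clutter / non-targets
        ])

        gmm = GaussianMixture(n_components=3, random_state=0).fit(features)
        labels = gmm.predict(features)
        print("items per cluster:", np.bincount(labels))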

  17. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative was conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio, over the period August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year-by-year basis. All publications resulting from the project are listed at the end of this report.

  18. Advancing the Science of Spatial Neglect Rehabilitation: An Improved Statistical Approach with Mixed Linear Modeling

    PubMed Central

    Goedert, Kelly M.; Boston, Raymond C.; Barrett, A. M.

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients may suffer from missing or unbalanced data and small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not address these issues well. Here we review an alternative, mixed linear modeling (MLM), which is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity, it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field, and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect. PMID
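
    A small sketch of the modeling idea follows: neglect severity over time with a random intercept and slope per patient, which tolerates missing visits in a way repeated-measures ANOVA does not. The data layout and effect sizes are synthetic assumptions, not the authors' dataset.

```python
# Minimal sketch: mixed linear model of recovery trajectories with random
# intercepts and slopes per patient. Synthetic, illustrative data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for p in range(20):
    base  = rng.normal(60, 15)           # heterogeneous baseline severity
    slope = rng.normal(-4, 2)            # heterogeneous recovery trajectory
    treated = p % 2
    for w in range(5):
        if rng.random() < 0.15:           # missing visits are tolerated by MLM
            continue
        score = base + (slope - 2.0 * treated) * w + rng.normal(0, 3)
        rows.append(dict(patient=p, week=w, treated=treated, score=score))
df = pd.DataFrame(rows)

model = smf.mixedlm("score ~ week * treated", df, groups="patient",
                    re_formula="~week")   # random intercept + random slope
print(model.fit().summary())
```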

  19. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    assume that the NSMS can be approximated by a series of expansion functions F_m(·), i.e., as a sum of M expansion terms over m = 1, ..., M (Eq. 31) ... [the quantity measured by] a receiver coil is the electromotive force given by the negative of the time derivative of the secondary magnetic flux through the coil ... A support vector machine learns from data: when fed a series ...

  20. StatXFinder: a web-based self-directed tool that provides appropriate statistical test selection for biomedical researchers in their scientific studies.

    PubMed

    Suner, Aslı; Karakülah, Gökhan; Koşaner, Özgün; Dicle, Oğuz

    2015-01-01

    The improper use of statistical methods is common in analyzing and interpreting research data in biological and medical sciences. The objective of this study was to develop a decision support tool encompassing the commonly used statistical tests in biomedical research by combining and updating the present decision trees for appropriate statistical test selection. First, the decision trees in textbooks, published articles, and online resources were scrutinized, and a more comprehensive unified one was devised via the integration of 10 distinct decision trees. The questions in the decision steps were also revised by simplifying them and enriching them with examples. Then, our decision tree was implemented in the web environment and the tool, titled StatXFinder, was developed. Finally, usability and satisfaction questionnaires were applied to the users of the tool, and StatXFinder was reorganized in line with the feedback obtained from these questionnaires. StatXFinder provides users with decision support in the selection of 85 distinct parametric and non-parametric statistical tests by directing 44 different yes-no questions. The accuracy rate of the statistical test recommendations obtained by 36 participants on the applied cases was 83.3 % for "difficult" tests and 88.9 % for "easy" tests. The mean system usability score of the tool was found to be 87.43 ± 10.01 (minimum: 70, maximum: 100). No statistically significant difference in total system usability score was found across participants' attributes (p value >0.05). The User Satisfaction Questionnaire showed that 97.2 % of the participants appreciated the tool, and almost all of the participants (35 of 36) would recommend the tool to others. In conclusion, StatXFinder can be utilized as an instructional and guiding tool for biomedical researchers with limited statistics knowledge. StatXFinder is freely available at http://webb.deu.edu.tr/tb/statxfinder.

  1. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
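
    The simplest method mentioned above, regressing squared phenotypic differences of relative pairs on estimated identity-by-descent sharing at a locus, can be sketched in a few lines. The data below are synthetic and the effect size is an assumption; a significantly negative slope is the signal of a linked QTL.

```python
# Minimal sketch: Haseman-Elston-style regression of squared sib-pair phenotype
# differences on estimated IBD sharing at a marker. Synthetic data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_pairs = 400
ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])  # sib-pair IBD
# Expected squared difference shrinks when the pair shares more alleles IBD at the QTL.
sq_diff = 2.0 * (1.0 - 0.6 * ibd) + rng.gamma(shape=2.0, scale=0.5, size=n_pairs)

X = sm.add_constant(ibd)
fit = sm.OLS(sq_diff, X).fit()
print(fit.params, fit.pvalues)   # slope should come out significantly negative
```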

  2. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
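
    The analysis chain described (distributed test points, a response-surface fit, and ANOVA on the fitted terms) can be illustrated briefly. This sketch is not the paper's simulation: the conditions are drawn uniformly rather than from a D-optimal design, and the response model is an assumed quadratic.

```python
# Minimal sketch: fit a quadratic response surface to single points at many
# conditions (distributed testing) and examine an ANOVA table of the terms.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
x1 = rng.uniform(-1, 1, 40)                       # one observation per condition
x2 = rng.uniform(-1, 1, 40)
y = 5 + 2*x1 - 1.5*x2 + 0.8*x1*x2 + 1.2*x1**2 + rng.normal(0, 0.3, 40)
df = pd.DataFrame(dict(x1=x1, x2=x2, y=y))

rsm = smf.ols("y ~ x1 + x2 + x1:x2 + I(x1**2) + I(x2**2)", df).fit()
print(sm.stats.anova_lm(rsm, typ=2))              # significance of each surface term
print(rsm.params)                                 # parametric model of the response
```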

  3. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  4. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  5. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  6. Recent advances in statistical methods for the estimation of sediment and nutrient transport in rivers

    NASA Astrophysics Data System (ADS)

    Colin, T. A.

    1995-07-01

    This paper reviews advances in methods for estimating fluvial transport of suspended sediment and nutrients. Research from the past four years, mostly dealing with estimating monthly and annual loads, is emphasized. However, because this topic has not appeared in previous IUGG reports, some research prior to 1990 is included. The motivation for studying sediment transport has shifted during the past few decades. In addition to its role in filling reservoirs and channels, sediment is increasingly recognized as an important part of fluvial ecosystems and estuarine wetlands. Many groups want information about sediment transport [Bollman, 1992]: Scientists trying to understand benthic biology and catchment hydrology; citizens and policy-makers concerned about environmental impacts (e.g. impacts of logging [Beschta, 1978] or snow-fences [Sturges, 1992]); government regulators considering the effectiveness of programs to protect in-stream habitat and downstream waterbodies; and resource managers seeking to restore wetlands.

  7. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  8. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  9. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post 2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second or upper stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  10. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
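
    The core statistical-optimization step, combining an observed profile with a background profile through their error covariance matrices, can be sketched generically. This is a toy illustration, not the OPSv5.6 or the new dynamic algorithm; the uncertainty profiles, correlation length, and background model are assumptions.

```python
# Minimal sketch: optimally combine observed (y) and background (x_b) bending-angle
# profiles using background (B) and observation (R) error covariance matrices.
import numpy as np

z = np.linspace(40e3, 80e3, 41)                        # impact heights [m]
corr = np.exp(-np.abs(z[:, None] - z[None, :]) / 6e3)  # assumed vertical correlation

x_b = 1e-3 * np.exp(-(z - 40e3) / 7.5e3)               # assumed background profile [rad]
sigma_b = 0.05 * x_b                                   # assumed 5% background uncertainty
sigma_o = 1e-7 + 0.02 * x_b                            # assumed noise floor + 2% of signal
B = np.outer(sigma_b, sigma_b) * corr
R = np.diag(sigma_o**2)

rng = np.random.default_rng(4)
y = 1.02 * x_b + rng.normal(0, sigma_o)                # noisy "observed" profile

x_opt = x_b + B @ np.linalg.solve(B + R, y - x_b)      # statistically optimized profile
print(x_opt[:5])
```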

  11. Dynamic statistical optimization of GNSS radio occultation bending angles: an advanced algorithm and its performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-01-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAMP and COSMIC measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction in random errors (standard deviations) of optimized bending angles down to about two-thirds of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.

  12. Improved equilibrium reconstructions by advanced statistical weighting of the internal magnetic measurements.

    PubMed

    Murari, A; Gelfusa, M; Peluso, E; Gaudio, P; Mazon, D; Hawkes, N; Point, G; Alper, B; Eich, T

    2014-12-01

    In a Tokamak the configuration of the magnetic fields remains the key element to improve performance and to maximise the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally, in the least-squares minimisation phase of the algorithms used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method to choose the weights, to be given to the internal measurements of the magnetic fields for improved equilibrium reconstructions, is presented in this paper. The approach is based on various statistical indicators applied to the residuals, i.e., the differences between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since an inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe.
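
    The weighting idea can be illustrated on a toy linear least-squares problem: derive per-diagnostic weights from the statistics of each diagnostic's residuals and refit, instead of weighting every channel identically. This is not the equilibrium code itself; the two diagnostic groups, noise levels, and iteration count are assumptions.

```python
# Minimal sketch: iteratively reweighted least squares where each diagnostic group
# is weighted by the inverse RMS of its own residuals. Toy problem only.
import numpy as np

rng = np.random.default_rng(5)
A_mag = rng.normal(size=(40, 3))          # many magnetic-probe channels
A_pol = rng.normal(size=(5, 3))           # few, highly significant polarimetry channels
x_true = np.array([1.0, -2.0, 0.5])
y = np.concatenate([A_mag @ x_true + rng.normal(0, 0.20, 40),
                    A_pol @ x_true + rng.normal(0, 0.02, 5)])
A = np.vstack([A_mag, A_pol])
groups = np.array(["mag"] * 40 + ["pol"] * 5)

w = np.ones_like(y)                        # start from uniform weights
for _ in range(3):
    x_hat, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
    resid = y - A @ x_hat
    for g in ("mag", "pol"):               # residual-based weight per diagnostic
        w[groups == g] = 1.0 / max(np.std(resid[groups == g]), 1e-12)

print(x_hat)
```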

  13. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    PubMed Central

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance. PMID:26346869
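
    A short sketch of the kind of data-driven model surveyed here follows: support vector regression scored with the two metrics the review uses, R and the standard error of estimate (SEE). The predictors and the underlying relationship are synthetic assumptions, not any published dataset.

```python
# Minimal sketch: SVR-based VO2max prediction evaluated with R and SEE.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 300
age = rng.uniform(18, 60, n)
bmi = rng.uniform(18, 35, n)
run_time = rng.uniform(8, 20, n)           # hypothetical field-test time [min]
vo2max = 60 - 0.25*age - 0.6*(bmi - 22) - 1.2*(run_time - 12) + rng.normal(0, 2.5, n)

X = np.column_stack([age, bmi, run_time])
X_tr, X_te, y_tr, y_te = train_test_split(X, vo2max, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5)).fit(X_tr, y_tr)
pred = model.predict(X_te)
R = np.corrcoef(y_te, pred)[0, 1]
SEE = np.sqrt(np.mean((y_te - pred) ** 2))
print(f"R = {R:.2f}, SEE = {SEE:.2f} ml/kg/min")
```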

  14. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances.

    PubMed

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance.

  15. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which were available to us after the models were developed. The cross-validation performance on the training set [Correct Classification Rate (CCR) = 0.72] was much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound within the set were then identified and its ERα binding potential was predicted by its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR
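
    The biosimilarity read-across step can be sketched simply: score similarity between compounds from their bioassay response profiles and predict binding from the most biosimilar training compounds. The profiles below are random placeholders, and the Jaccard-style index is an assumed stand-in for the study's prioritized similarity index.

```python
# Minimal sketch: nearest-neighbour prediction of ER-alpha binding from
# bioassay-response biosimilarity. Synthetic, illustrative data only.
import numpy as np

rng = np.random.default_rng(7)
n_train, n_test, n_assays = 100, 10, 50
train_profiles = rng.integers(0, 2, size=(n_train, n_assays))   # 1 = active in assay
train_labels   = rng.integers(0, 2, size=n_train)                # 1 = ER-alpha binder
test_profiles  = rng.integers(0, 2, size=(n_test, n_assays))

def biosimilarity(a, b):
    """Jaccard-style overlap of active assays (an assumed similarity index)."""
    both = np.sum((a == 1) & (b == 1))
    either = np.sum((a == 1) | (b == 1))
    return both / either if either else 0.0

k = 5
for profile in test_profiles:
    sims = np.array([biosimilarity(profile, t) for t in train_profiles])
    neighbours = np.argsort(sims)[-k:]                 # k most biosimilar training compounds
    print(int(train_labels[neighbours].mean() >= 0.5)) # majority-vote prediction
```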

  16. Bayesian statistics as a new tool for spectral analysis - I. Application for the determination of basic parameters of massive stars

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2015-11-01

    Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
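
    The Bayesian idea, evaluating the likelihood of all observed lines over a grid of model parameters and normalizing to a posterior, can be sketched with a stand-in model. The line-depth function, grid ranges, and noise level below are made-up assumptions, not a real synthetic-spectrum grid.

```python
# Minimal sketch: grid posterior over (Teff, log g) from a Gaussian likelihood
# that uses all "lines" at once. Toy model only.
import numpy as np

rng = np.random.default_rng(8)

def model_line_depths(teff, logg, n_lines=40):
    """Hypothetical stand-in for interpolating a synthetic-spectrum grid."""
    idx = np.arange(n_lines)
    return 0.5 + 0.3*np.sin(idx*teff/4000.0) - 0.1*logg*np.cos(idx/3.0)

true_teff, true_logg, sigma = 18000.0, 3.9, 0.02
observed = model_line_depths(true_teff, true_logg) + rng.normal(0, sigma, 40)

teff_grid = np.linspace(15000, 21000, 61)
logg_grid = np.linspace(3.0, 4.5, 31)
log_post = np.empty((teff_grid.size, logg_grid.size))
for i, t in enumerate(teff_grid):
    for j, g in enumerate(logg_grid):
        resid = observed - model_line_depths(t, g)
        log_post[i, j] = -0.5 * np.sum((resid / sigma) ** 2)   # flat prior assumed

post = np.exp(log_post - log_post.max())
post /= post.sum()
best = np.unravel_index(post.argmax(), post.shape)
print(teff_grid[best[0]], logg_grid[best[1]])   # should recover values near the truth
```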

  17. The Advanced Dementia Prognostic Tool (ADEPT): A Risk Score to Estimate Survival in Nursing Home Residents with Advanced Dementia

    PubMed Central

    Mitchell, Susan L.; Miller, Susan C.; Teno, Joan M.; Davis, Roger B.; Shaffer, Michele L.

    2010-01-01

    Context Estimating life expectancy is challenging in advanced dementia. Objectives To create a risk score to estimate survival in nursing home (NH) residents with advanced dementia. Methods This was a retrospective cohort study performed in the setting of all licensed US NHs. Residents with advanced dementia living in US NHs in 2002 were identified using Minimum Data Set (MDS) assessments. Mortality data from Medicare files were used to determine 12-month survival. Independent variables were selected from the MDS. Cox proportional hazards regression was used to model survival. The accuracy of the final model was assessed using the area under the receiver operating characteristic curve (AUROC). To develop a risk score, points were assigned to variables in the final model based on parameter estimates. Residents meeting hospice eligibility guidelines for dementia, based on MDS data, were identified. The AUROC assessed the accuracy of hospice guidelines to predict six-month survival. Results Over 12 months, 40.6% of residents with advanced dementia (n=22,405) died. Twelve variables best predicted survival: length of stay, age, male, dyspnea, pressure ulcers, total functional dependence, bedfast, insufficient intake, bowel incontinence, body mass index, weight loss, and congestive heart failure. The AUROC for the final model was 0.68. The risk score ranged from 0–32 points (higher scores indicate worse survival). Only 15.9% of residents met hospice eligibility guidelines for which the AUROC predicting six-month survival was 0.53. Conclusion A mortality risk score derived from MDS data predicted six-month survival in advanced dementia with moderate accuracy. The predictive ability of hospice guidelines, simulated with MDS data, was poor. PMID:20621437
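
    To make the evaluation step concrete, the sketch below scores a points-based mortality risk on a simulated cohort with the area under the ROC curve. The point values, characteristics, and outcome model are invented for illustration; they are not the published ADEPT weights.

```python
# Minimal sketch: a points-based risk score evaluated with AUROC on a
# hypothetical cohort. Illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 500
X = rng.integers(0, 2, size=(n, 6))              # hypothetical 0/1 resident characteristics
points = np.array([3, 2, 2, 4, 1, 3])            # assumed points per characteristic
risk_score = X @ points

# Simulate 6-month mortality with probability increasing in the score.
p_death = 1 / (1 + np.exp(-(risk_score - 7) / 2))
died = rng.random(n) < p_death

print("AUROC:", round(roc_auc_score(died, risk_score), 2))
```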

  18. Advanced Differential Radar Interferometry (A-DInSAR) as integrative tool for a structural geological analysis

    NASA Astrophysics Data System (ADS)

    Crippa, B.; Calcagni, L.; Rossi, G.; Sternai, P.

    2009-04-01

    Advanced Differential SAR Interferometry (A-DInSAR) is a technique for monitoring large-coverage surface deformations using a stack of interferograms generated from several complex SLC SAR images acquired over the same target area at different times. This work describes the results of a procedure to calculate terrain motion velocity on highly correlated pixels (E. Biescas, M. Crosetto, M. Agudo, O. Monserrat and B. Crippa: Two Radar Interferometric Approaches to Monitor Slow and Fast Land Deformation, 2007) in two areas, Gemona (Friuli, northern Italy) and Pollino (Calabria, southern Italy), and presents some considerations based on successful examples of the present analysis. The pixels whose displacement velocity is calculated are selected according to the dispersion index value (DA) or to coherence values along the stack of interferograms. The A-DInSAR technique yields highly reliable velocity values of the vertical displacement. These values concern the movement of surfaces of about 80 m² at the maximum resolution, and the minimum velocity that can be recognized is of the order of mm/yr. Because of the high versatility of the technology, the large size of the area that can be analyzed (about 10,000 km²), and the high precision and reliability of the results obtained, we think radar interferometry can provide important information about the structural context of the studied area that would otherwise be very difficult to recognize. We therefore propose radar interferometry as a valid investigation tool whose results should be considered an important complement to the data collected in fieldwork.

  19. Capability index--a statistical process control tool to aid in udder health control in dairy herds.

    PubMed

    Niza-Ribeiro, J; Noordhuizen, J P T M; Menezes, J C

    2004-08-01

    Bulk milk somatic cell count (BMSCC) averages have been used to evaluate udder health at both the individual and the herd level, as well as milk quality and hygiene. The authors show that the BMSCC average is not the best tool to be used in udder health control programs and that it can be replaced with advantage by the capability index (Cpk). The Cpk is a statistical process control tool traditionally used by engineers to validate, monitor, and predict the expected behavior of processes or machines. The BMSCC data of 13 consecutive months of production from 414 dairy herds, as well as SCC from all cows in the DHI program from 264 herds in the same period, were collected. The Cpk and the annual BMSCC average (AAVG) of all the herds were calculated. Comparing herd performance, as described by the Cpk and the AAVG, against the European Union (EU) official limit for BMSCC of 400,000 cells/mL showed that the Cpk accurately classified the compliance of the 414 farms, whereas the AAVG misclassified 166 (40%) of the 414 selected farms. The annual prevalence of subclinical mastitis (SMP) of each herd was calculated with individual SCC data from the same 13-mo period. Cows with more than 200,000 SCC/mL were considered as having subclinical mastitis. A logistic regression model to relate the Cpk and the herd's subclinical mastitis prevalence was calculated. The model is: SMPe = 0.475·e^(−0.5286·Cpk). The validation of the model was carried out by evaluating the relation between the observed SMP and the predicted SMPe, in terms of the linear correlation coefficient (R²) and the mean difference between SMP and SMPe (i.e., mean square error of prediction). The validation suggests that our model can be used to estimate the herd's SMP with the herd's Cpk. The Cpk equation relates the herd's BMSCC with the EU official SCC limit, thus the logistic regression model enables the adoption of critical limits for subclinical mastitis, taking into consideration the legal standard for SCC.
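
    The two quantities in the abstract, the capability index of monthly BMSCC against the EU limit and the fitted prevalence model SMPe = 0.475·e^(−0.5286·Cpk), can be computed directly. The monthly BMSCC values below are illustrative herd data, not from the study.

```python
# Minimal sketch: one-sided capability index of BMSCC against the 400,000 cells/mL
# limit, and the expected subclinical mastitis prevalence from the fitted model.
import numpy as np

bmscc = np.array([210, 250, 190, 300, 270, 230, 220, 260, 240, 280, 250, 235, 245],
                 dtype=float) * 1000            # 13 monthly BMSCC values [cells/mL]
usl = 400_000                                   # EU upper limit for BMSCC

cpk = (usl - bmscc.mean()) / (3 * bmscc.std(ddof=1))   # upper (one-sided) capability index
smp_expected = 0.475 * np.exp(-0.5286 * cpk)           # model from the abstract

print(f"Cpk = {cpk:.2f}, expected subclinical mastitis prevalence = {smp_expected:.1%}")
```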

  20. Design and optimization of disintegrating pellets of MCC by non-aqueous extrusion process using statistical tools.

    PubMed

    Gurram, Rajesh Kumar; Gandra, Suchithra; Shastri, Nalini R

    2016-03-10

    The objective of the study was to design and optimize a disintegrating pellet formulation of microcrystalline cellulose by a non-aqueous extrusion process for a water-sensitive drug using various statistical tools. Aspirin was used as a model drug. Disintegrating matrix pellets of aspirin using propylene glycol as a non-aqueous granulation liquid and croscarmellose as a disintegrant were developed. A Plackett-Burman design was initially conducted to screen and identify the significant factors. Final optimization of the formula was performed by response surface methodology using a central composite design. The critical attributes of the pellet dosage forms (dependent variables), namely disintegration time, sphericity and yield, were predicted with adequate accuracy based on the regression model. Pareto charts and contour charts were studied to understand the influence of factors and predict the responses. A design space was constructed to meet the desirable targets of the responses in terms of disintegration time <5 min, maximum yield, sphericity >0.95 and friability <1.7%. The optimized matrix pellets were enteric coated using Eudragit L 100. The drug release from the enteric-coated pellets after 30 min in the basic media was ~93% when compared to ~77% from the marketed pellets. The delayed-release pellets stored at 25°C/60% RH were stable for a period of 10 months. In conclusion, it can be stated that the developed process for disintegrating pellets using non-aqueous granulating agents can be used as an alternative technique for various water-sensitive drugs, circumventing the application of volatile organic solvents in conventional drug layering on inert cores. The scope of this study can be further extended to hydrophobic drugs, which may benefit from the rapid disintegration property and the use of various hydrophilic excipients used in the optimized pellet formulation to enhance dissolution and in turn improve bioavailability.

  1. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote the interest of the public in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, as these make the documentaries even more attractive. However, special care must be taken in order to guarantee that the information contained in them is serious and objective. In this sense, additional value is provided when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been entirely developed by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  2. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  3. Severe Impairment Rating Scale: A Useful and Brief Cognitive Assessment Tool for Advanced Dementia for Nursing Home Residents.

    PubMed

    Yeo, Cindy; Lim, Wee Shiong; Chan, Mark; Ho, Xin Qin; Anthony, Philomena Vasantha; Han, Huey Charn; Chong, Mei Sian

    2016-02-01

    To investigate the utility of the Severe Impairment Rating Scale (SIRS) as a cognitive assessment tool among nursing home residents with advanced dementia, we conducted a cross-sectional study of 96 residents in 3 nursing homes with Functional Assessment Staging Test (FAST) stage 6a and above. We compared the discriminatory ability of SIRS with the Chinese version of Mini-Mental State Examination, Abbreviated Mental Test, and Clock Drawing Test. Among the cognitive tests, SIRS showed the least "floor" effect and had the best capacity to distinguish very severe (FAST stages 7d-f) dementia (area under the curve 0.80 vs 0.46-0.76 for the other tools). The SIRS had the best correlation with FAST staging (r = -.59, P < .01) and, unlike the other 3 tools, exhibited only minimal change in correlation when adjusted for education and ethnicity. Our results support the utility of SIRS as a brief cognitive assessment tool for advanced dementia in the nursing home setting.

  4. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about: (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP); (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2; (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, script library); and (4) future work.

  5. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET) through simulated service testing of pre-cracked panels.

  6. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  7. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  8. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of great importance; it could prevent large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides and thus prevent considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means, were applied to the spectra after manipulation. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
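
    The classification chain used here, dimensionality reduction by PCA followed by LDA, can be sketched with a standard pipeline and cross-validation. The surrogate "spectra" below are random curves standing in for FTIR-ATR measurements; the class shifts and noise level are assumptions.

```python
# Minimal sketch: PCA (3 components) + LDA classification of surrogate spectra,
# evaluated with 5-fold cross-validation. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
n_per_genus, n_wavenumbers = 30, 500            # e.g., points across 800-1775 cm^-1
X_parts, labels = [], []
for label, shift in enumerate([0.0, 0.3, 0.6]): # three "genera"
    base = np.sin(np.linspace(0, 20, n_wavenumbers) + shift)
    X_parts.append(base + rng.normal(0, 0.4, size=(n_per_genus, n_wavenumbers)))
    labels += [label] * n_per_genus
X = np.vstack(X_parts)
y = np.array(labels)

clf = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean().round(3))
```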

  9. Characterization and detection of Vero cells infected with Herpes Simplex Virus type 1 using Raman spectroscopy and advanced statistical methods.

    PubMed

    Salman, A; Shufan, E; Zeiri, L; Huleihel, M

    2014-07-01

    Herpes viruses are involved in a variety of human disorders. Herpes Simplex Virus type 1 (HSV-1) is the most common among the herpes viruses and is primarily involved in human cutaneous disorders. Although the symptoms of infection by this virus are usually minimal, in some cases HSV-1 might cause serious infections in the eyes and the brain leading to blindness and even death. A drug, acyclovir, is available to counter this virus. The drug is most effective when used during the early stages of the infection, which makes early detection and identification of these viral infections highly important for successful treatment. In the present study we evaluated the potential of Raman spectroscopy as a sensitive, rapid, and reliable method for the detection and identification of HSV-1 viral infections in cell cultures. Using Raman spectroscopy followed by advanced statistical methods enabled us, with sensitivity approaching 100%, to differentiate between a control group of Vero cells and another group of Vero cells that had been infected with HSV-1. Cell sites that were "rich in membrane" gave the best results in the differentiation between the two categories. The major changes were observed in the 1195-1726 cm(-1) range of the Raman spectrum. The features in this range are attributed mainly to proteins, lipids, and nucleic acids.

  10. CRISPR/Cas9: an advanced tool for editing plant genomes.

    PubMed

    Samanta, Milan Kumar; Dey, Avishek; Gayen, Srimonta

    2016-10-01

    To meet current challenges in agriculture, genome editing using sequence-specific nucleases (SSNs) is a powerful tool for basic and applied plant biology research. Here, we describe the principle and application of available genome editing tools, including zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs) and the clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated protein 9 (Cas9) system. Among these SSNs, CRISPR/Cas9 is the most recently characterized and rapidly developing genome editing technology, and has been successfully utilized in a wide variety of organisms. This review specifically illustrates the power of CRISPR/Cas9 as a tool for plant genome engineering, and describes the strengths and weaknesses of the CRISPR/Cas9 technology compared to two well-established genome editing tools, ZFNs and TALENs.

  11. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the sub-20 nm technology node. To meet tight defect specifications, many mask shops select repair tools according to defect type. Normally, pattern defects are repaired with an e-beam repair tool, and soft defects such as particles are repaired with a nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it relies on a chemical reaction between gas and electrons, while a nanomachining tool, which uses a physical interaction between a nano-tip and the defect, cannot normally be applied to repair clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film initially adheres strongly to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film weakens as energy accumulates while masks are exposed in a scanner tool, owing to the different coefficients of thermal expansion of the materials. Therefore, whenever a mask requires a re-pellicle process, every deposited repair point has to be inspected to confirm whether the deposited film is damaged; if it is, the repair must be performed again, which makes the overall process longer and more complex. In this paper, the basic theory and principle of recovering clear defects with a nanomachining tool are introduced, and evaluation results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. The results obtained using nanomachining are also compared with those obtained using an e-beam repair tool, including cleaning durability evaluated through accumulated cleaning processes. In addition, we discuss the phase shift issue and the solution to the image placement error caused by phase error.

  12. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  13. Implementation of a professional portfolio: a tool to demonstrate professional development for advanced practice.

    PubMed

    Chamblee, Tracy B; Dale, Juanita Conkin; Drews, Barbie; Spahis, Joanna; Hardin, Teri

    2015-01-01

    The literature has a gap related to professional development for APRNs. In the United States, many health care organizations use clinical advancement programs for registered nurses, but APRNs are not often included in these programs. If APRNs are included, advancement opportunities are very limited. At CMC, implementation of a professional portfolio resulted in increased satisfaction among APPs regarding their ability to showcase professional growth and expertise, as well as the uniqueness of their advanced practice. Use of the professional portfolio led to improved recognition by APS and organizational leaders of APP performance excellence during the annual performance evaluation, as well as improved recognition among APP colleagues in terms of nominations for honors and awards.

  14. New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools

    NASA Astrophysics Data System (ADS)

    Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo

    1999-09-01

    As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in their scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC's traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be fully evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility, and has also resulted in reduced cycle time for new equipment introduction.

  15. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature-established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot-noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent agreement.
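
    As a rough illustration of the modeling approach described above, the sketch below (Python, using NumPy and SciPy) estimates an intermittency factor from a noise threshold, fits a four-parameter beta distribution to the turbulent samples, and reconstructs global moments from the mixture; the data, threshold value, and function names are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np
    from scipy import stats

    def intermittency_beta_model(mass_fraction, noise_threshold):
        """Split samples into 'noise/pure air' and 'turbulent' parts, estimate
        intermittency, and fit a four-parameter beta distribution to the
        turbulent part (illustrative sketch; names and threshold are assumptions)."""
        samples = np.asarray(mass_fraction)
        turbulent = samples[samples > noise_threshold]
        gamma = turbulent.size / samples.size          # intermittency factor

        # scipy's beta supports shapes (a, b) plus loc/scale -> four parameters
        a, b, loc, scale = stats.beta.fit(turbulent)
        beta_mean, beta_var = stats.beta.stats(a, b, loc=loc, scale=scale, moments="mv")

        # Global moments as an intermittency-weighted mixture with a zero-valued
        # non-turbulent state (the noise distribution is idealised as zero here).
        global_mean = gamma * beta_mean
        global_var = gamma * (beta_var + beta_mean**2) - global_mean**2
        return gamma, (a, b, loc, scale), global_mean, global_var

    # Synthetic data standing in for Rayleigh-scattering mass-fraction samples
    rng = np.random.default_rng(0)
    fake = np.concatenate([rng.beta(2, 5, 4000) * 0.3,            # "turbulent"
                           np.abs(rng.normal(0, 0.005, 6000))])   # "noise"
    print(intermittency_beta_model(fake, noise_threshold=0.02))
    ```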

  16. Statistical tools for managing the Ambikapur aquifer in central India for sustainable hydrological development of the region

    NASA Astrophysics Data System (ADS)

    Sharma, S. K.

    2009-04-01

    Despite India's tremendous progress on all fronts after independence in 1947, it remains one of the poorest nations in the world in terms of per capita income and energy consumption, which are considered gauges of the economic situation of any country; for India, these are nearly one tenth of those of the developed nations. If the economic condition of its people is to be raised, the country has to boost its agricultural production, which is largely monsoon dependent, and to exploit its conventional and unconventional energy sources at a very rapid growth rate. Worldwide, 70% of the water withdrawn for human use goes to agriculture, 22% to industry and 8% to domestic services; in India, a low-income country, the split is 82% for agriculture, 10% for industry and 8% for domestic services. Therefore, India needs new sources of water to reduce the risk of dependency on the monsoon for the sustainable development of the country. It is in this connection that the Ambikapur Basin in central India has been studied for sustainable water withdrawal. At present, the crops in the Ambikapur region are totally monsoon dependent. However, with the initiatives of the State Government, 25 boreholes in an area of about 25 square kilometers have been drilled to a depth of 500 m and completed in the Gondwana sandstone. The water quality and the discharge rates have been established to sustain the crops of the area, which are the only livelihood of the local people, in case the monsoon fails. The hydraulic properties of the aquifer, such as transmissivity (T) and the coefficient of storage (S), were determined following the graphic method of Jacob and Theis. The rate of discharge (Q) of the pumped well was estimated at 4.05 × 10^3 cubic meters per second and the values of other parameters like T at
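
    As a sketch of the Jacob straight-line (Cooper-Jacob) analysis referred to above, the following Python snippet estimates transmissivity and storativity from time-drawdown data; the pumping-test values and variable names are synthetic and purely illustrative, not data from the Ambikapur study.

    ```python
    import numpy as np

    def cooper_jacob(times_s, drawdowns_m, pumping_rate_m3s, radius_m):
        """Estimate transmissivity T and storativity S from pumping-test data
        with the Cooper-Jacob (Theis straight-line) approximation."""
        log_t = np.log10(times_s)
        slope, intercept = np.polyfit(log_t, drawdowns_m, 1)  # s = slope*log10(t)+c

        delta_s = slope                        # drawdown per log cycle of time [m]
        T = 2.303 * pumping_rate_m3s / (4.0 * np.pi * delta_s)   # [m^2/s]
        t0 = 10 ** (-intercept / slope)        # time where fitted drawdown is zero
        S = 2.25 * T * t0 / radius_m**2        # storativity [-]
        return T, S

    # Synthetic example: observation well 30 m from a well pumped at 0.004 m^3/s
    t = np.array([600, 1200, 2400, 4800, 9600.0])     # s
    s = np.array([0.30, 0.42, 0.54, 0.66, 0.78])       # m
    print(cooper_jacob(t, s, pumping_rate_m3s=4.0e-3, radius_m=30.0))
    ```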

  17. Exposure to Alcoholism in the Family: United States, 1988. Advance Data from Vital and Health Statistics of the National Center for Health Statistics. Number 205.

    ERIC Educational Resources Information Center

    Schoenborn, Charlotte A.

    This report is based on data from the 1988 National Health Interview Survey on Alcohol (NHIS-Alcohol), part of the ongoing National Health Interview Survey conducted by the National Center for Health Statistics. Interviews for the NHIS are conducted in person by staff of the United States Bureau of the Census. Information is collected on each…

  18. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  19. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  20. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  1. Genetic tools for advancement of Synechococcus sp. PCC 7002 as a cyanobacterial chassis

    DOE PAGES

    Ruffing, Anne M.; Jensen, Travis J.; Strickland, Lucas M.

    2016-11-10

    Successful implementation of modified cyanobacteria as hosts for industrial applications requires the development of a cyanobacterial chassis. The cyanobacterium Synechococcus sp. PCC 7002 embodies key attributes for an industrial host, including a fast growth rate and high salt, light, and temperature tolerances. Here, this study addresses key limitations in the advancement of Synechococcus sp. PCC 7002 as an industrial chassis.

  2. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  3. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made in overproducing natural chemicals and producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals belonging to fuels, building block chemicals, and specialty chemicals, in particular focusing on those reported in the last three years. The review aims to provide the current landscape of systems metabolic engineering and to suggest directions for addressing future challenges towards successfully establishing processes for the bio-based production of fuels and chemicals from renewable resources.

  4. Towards the characterization of noise sources in a supersonic three-stream jet using advanced analysis tools

    NASA Astrophysics Data System (ADS)

    Ruscher, Christopher; Gogineni, Sivaram

    2016-11-01

    Strict noise regulations set by governing bodies currently make supersonic commercial aviation impractical. One of the many challenges in developing practical supersonic commercial aircraft is the noise produced by the engine's exhaust jet. A promising method of jet noise reduction for supersonic applications is the addition of extra exhaust streams. Data for an axisymmetric three-stream nozzle were generated using the Naval Research Laboratory's JENRE code. These data will be compared to experimental results obtained by NASA for validation purposes. Once the simulation results show satisfactory agreement with the experiments, advanced analysis tools will be applied to the simulation data to characterize potential noise sources. The tools to be applied include methods based on proper orthogonal decomposition, wavelet decomposition, and stochastic estimation. Additionally, techniques such as empirical mode decomposition and momentum potential theory will be applied to the data as well.
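
    One of the named analysis tools, proper orthogonal decomposition, can be sketched compactly; the snapshot-POD example below (Python/NumPy) works on synthetic data and is only an illustration of the general technique, not of the JENRE post-processing chain.

    ```python
    import numpy as np

    def snapshot_pod(snapshots):
        """Proper orthogonal decomposition of a snapshot matrix (n_points, n_times)
        via the thin SVD; returns spatial modes, relative modal energies, and
        temporal coefficients. Variable names are illustrative assumptions."""
        mean_field = snapshots.mean(axis=1, keepdims=True)
        fluct = snapshots - mean_field                      # remove temporal mean
        modes, sing_vals, time_coeffs = np.linalg.svd(fluct, full_matrices=False)
        energy = sing_vals**2 / np.sum(sing_vals**2)        # relative modal energy
        return modes, energy, time_coeffs

    # Synthetic pressure-like data: 500 spatial points, 200 snapshots
    rng = np.random.default_rng(1)
    x = np.linspace(0, 2 * np.pi, 500)[:, None]
    t = np.linspace(0, 10, 200)[None, :]
    field = np.sin(x) * np.cos(5 * t) + 0.3 * np.sin(2 * x) * np.cos(9 * t)
    field += 0.05 * rng.normal(size=field.shape)
    modes, energy, coeffs = snapshot_pod(field)
    print("energy captured by first two modes:", energy[:2].sum())
    ```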

  5. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    PubMed

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTOs). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTOs. This review examines existing practices in GM plant field testing, such as the approaches to randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTOs are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed for deciding on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will
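
    The continuous-versus-count distinction drawn above can be illustrated with a short Python sketch using statsmodels: an ordinary ANOVA for a continuous response (pH) and a Poisson GLM for insect counts, both with block as a covariate. The data are synthetic and the column names are assumptions, not taken from the review.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Synthetic field-trial data: insect counts and soil pH in 4 blocks x 2 treatments
    rng = np.random.default_rng(2)
    blocks = np.repeat(["B1", "B2", "B3", "B4"], 10)
    treat = np.tile(np.repeat(["GM", "control"], 5), 4)
    counts = rng.poisson(lam=np.where(treat == "GM", 6, 8))
    ph = rng.normal(6.5, 0.3, size=40)
    df = pd.DataFrame({"block": blocks, "treatment": treat, "count": counts, "pH": ph})

    # Continuous response: classical ANOVA (ordinary least squares fit)
    anova_fit = smf.ols("pH ~ treatment + block", data=df).fit()
    print(sm.stats.anova_lm(anova_fit, typ=2))

    # Count response: Poisson GLM with a log link, block included as covariate
    glm_fit = smf.glm("count ~ treatment + block",
                      data=df, family=sm.families.Poisson()).fit()
    print(glm_fit.summary())
    ```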

  6. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses major accomplishments of a 3-year project under the DOE's NEER Program. The project has developed innovative and automated algorithms, codes, and tools for solving the discrete ordinates particle transport method efficiently in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  7. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  8. Advances in Omics and Bioinformatics Tools for Systems Analyses of Plant Functions

    PubMed Central

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-01-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances. PMID:22156726

  9. Genetic tools for advancement of Synechococcus sp. PCC 7002 as a cyanobacterial chassis

    SciTech Connect

    Ruffing, Anne M.; Jensen, Travis J.; Strickland, Lucas M.

    2016-11-10

    Successful implementation of modified cyanobacteria as hosts for industrial applications requires the development of a cyanobacterial chassis. The cyanobacterium Synechococcus sp. PCC 7002 embodies key attributes for an industrial host, including a fast growth rate and high salt, light, and temperature tolerances. Here, this study addresses key limitations in the advancement of Synechococcus sp. PCC 7002 as an industrial chassis.

  10. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
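
    For one of the candidate back ends named above, MongoDB, a footprint store and geospatial discovery query might look like the following pymongo sketch; the connection string, collection name, schema, and coordinates are hypothetical, since the abstract does not publish the TCIS schema.

    ```python
    from pymongo import MongoClient, GEOSPHERE

    # Hypothetical connection and collection names for illustration only
    client = MongoClient("mongodb://localhost:27017")
    footprints = client["tcis_demo"]["swath_footprints"]

    # Store each Level 2 swath footprint as a GeoJSON polygon and index it
    footprints.create_index([("footprint", GEOSPHERE)])
    footprints.insert_one({
        "granule": "example_granule_001",
        "footprint": {
            "type": "Polygon",
            "coordinates": [[[-80.0, 20.0], [-70.0, 20.0],
                             [-70.0, 30.0], [-80.0, 30.0], [-80.0, 20.0]]],
        },
    })

    # Discovery query: which swaths intersect a storm-centred search box?
    search_box = {
        "type": "Polygon",
        "coordinates": [[[-76.0, 22.0], [-72.0, 22.0],
                         [-72.0, 26.0], [-76.0, 26.0], [-76.0, 22.0]]],
    }
    hits = footprints.find({"footprint": {"$geoIntersects": {"$geometry": search_box}}})
    for doc in hits:
        print(doc["granule"])
    ```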

  11. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    PubMed Central

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  12. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    NASA Astrophysics Data System (ADS)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-09-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective.

  13. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.

  14. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K; ELIZABETH EDWARDS, E; FRANK LOFFLER, F; Brian02 Looney, B

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that provide indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools to enable direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities in a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on relevant enzymes for functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  15. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    SciTech Connect

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.; Guo, Xinxin; Hohimer, Ryan E.; Pomiak, Yekaterina G.

    2012-12-31

    This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest priority actions for the human decision maker. M-DART represents a significant advancement over today’s grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of the M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  16. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
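
    The two goodness-of-fit measures quoted above are easy to reproduce; the Python sketch below computes the Willmott index of agreement and a mean relative error for paired observed/predicted values. The sample numbers are made up for illustration and are not results from the paper.

    ```python
    import numpy as np

    def willmott_d(observed, predicted):
        """Willmott index of agreement (d close to 1 means close agreement)."""
        o, p = np.asarray(observed, float), np.asarray(predicted, float)
        o_bar = o.mean()
        return 1.0 - np.sum((p - o) ** 2) / np.sum((np.abs(p - o_bar) + np.abs(o - o_bar)) ** 2)

    def relative_error(observed, predicted):
        """Mean absolute relative error between prediction and observation."""
        o, p = np.asarray(observed, float), np.asarray(predicted, float)
        return np.mean(np.abs(p - o) / np.abs(o))

    # Illustrative paired values (not data from the study)
    obs = np.array([42.0, 40.5, 38.2, 35.9, 33.1])
    pred = np.array([41.2, 40.9, 37.8, 36.4, 33.6])
    print(willmott_d(obs, pred), relative_error(obs, pred))
    ```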

  17. Multi-parametric analysis and registration of brain tumors: constructing statistical atlases and diagnostic tools of predictive value.

    PubMed

    Davatzikos, Christos; Zacharaki, Evangelia I; Gooya, Ali; Clark, Vanessa

    2011-01-01

    We discuss computer-based image analysis algorithms of multi-parametric MRI of brain tumors, aiming to assist in early diagnosis of infiltrating brain tumors, and to construct statistical atlases summarizing population-based characteristics of brain tumors. These methods combine machine learning, deformable registration, multi-parametric segmentation, and biophysical modeling of brain tumors.

  18. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms

    PubMed Central

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-01-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTOs). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTOs. This review examines existing practices in GM plant field testing, such as the approaches to randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTOs are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed for deciding on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and

  19. Pantograph catenary dynamic optimisation based on advanced multibody and finite element co-simulation tools

    NASA Astrophysics Data System (ADS)

    Massat, Jean-Pierre; Laurent, Christophe; Bianchi, Jean-Philippe; Balmès, Etienne

    2014-05-01

    This paper presents recent developments undertaken by the SNCF Innovation & Research Department on numerical modelling of pantograph-catenary interaction. It aims at describing an efficient co-simulation process between finite element (FE) and multibody (MB) modelling methods. FE catenary models are coupled with a fully flexible MB representation of the pantograph with pneumatic actuation. These advanced functionalities allow new kinds of numerical analyses, such as dynamic improvements based on innovative pneumatic suspensions or assessment of crash risks in crossing areas, that demonstrate the powerful capabilities of this computing approach.

  20. The Advanced Light Source: A new tool for research in atomic and molecular physics

    NASA Astrophysics Data System (ADS)

    Schlachter, F.; Robinson, A.

    1991-04-01

    The Advanced Light Source at the Lawrence Berkeley Laboratory will be the world's brightest synchrotron radiation source in the extreme ultraviolet and soft x-ray regions of the spectrum when it begins operation in 1993. It will be available as a national user facility to researchers in a broad range of disciplines, including materials science, atomic and molecular physics, chemistry, biology, imaging, and technology. The high brightness of the ALS will be particularly well suited to high-resolution studies of tenuous targets, such as excited atoms, ions, and clusters.

  1. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the Pest Management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper will review the applications that can be utilized by the pest management professional and discuss the advanced techniques that may be required in conjunction with thermal imaging to locate insect and other pest infestations, moisture within structures, the verification of data and the special challenges associated with the inspection process.

  2. Advancing of Russian ChemBioGrid by bringing Data Management tools into collaborative environment.

    PubMed

    Zhuchkov, Alexey; Tverdokhlebov, Nikolay; Kravchenko, Alexander

    2006-01-01

    Virtual organizations of researchers need effective tools to work collaboratively with huge sets of heterogeneous data distributed over HealthGrid. This paper describes a mechanism for supporting Digital Libraries in a High-Performance Computing environment based on Grid technology. The proposed approach provides the ability to assemble heterogeneous data from distributed sources into integrated virtual collections by using OGSA-DAI. The core of the concept is a Repository of Meta-Descriptions, which are sets of metadata that define personal and collaborative virtual collections on the basis of virtualized information resources. The Repository is kept in the native XML database Sedna and is maintained by Grid Data Services.

  3. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group to go to for pre-phase A and phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole due, in part, to their rapid turnaround time in analyzing concepts and their ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of each vehicle to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta-v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on rapid turnaround time and often needs to fulfill customer requests within a limited schedule or with little advance notice. Because it works in this fast-paced environment, the ETO team has developed finely honed skills and methods to maximize its delivery capability and meet customer needs. This paper describes the interfaces between the three primary disciplines used in the design process - weights and sizing, trajectory, and structural analysis - as well as the approach each discipline employs to streamline its particular piece of the design process.

  4. Analytical tools employed to determine pharmaceutical compounds in wastewaters after application of advanced oxidation processes.

    PubMed

    Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan

    2016-12-01

    Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.

  5. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed Central

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido

    2015-01-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, enabled us to generate a model overlapping the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. PMID:26427894

  6. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, enabled us to generate a model overlapping the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed.

  7. Regions of Unusual Statistical Properties as Tools in the Search for Horizontally Transferred Genes in Escherichia coli

    NASA Astrophysics Data System (ADS)

    Putonti, Catherine; Chumakov, Sergei; Chavez, Arturo; Luo, Yi; Graur, Dan; Fox, George E.; Fofanov, Yuriy

    2006-09-01

    The observed diversity of statistical characteristics along genomic sequences is the result of the influences of a variety of ongoing processes including horizontal gene transfer, gene loss, genome rearrangements, and evolution. The rate at which various processes affect the genome typically varies between different genomic regions. Thus, variations in statistical properties seen in different regions of a genome are often associated with its evolution and functional organization. Analysis of such properties is therefore relevant to many ongoing biomedical research efforts. Similarity Plot, or S-plot, is a Windows-based application for large-scale comparisons and 2D visualization of similarities between genomic sequences. This application combines two approaches widely used in genomics: window analysis of statistical characteristics along genomes and dot-plot visual representation. S-plot is effective in detecting highly similar regions between two genomes. Within a single genome, S-plot has the ability to identify highly dissimilar regions displaying unusual compositional properties. The application was used to perform a comparative analysis of 50+ microbial genomes as well as many eukaryote genomes including human, rat, mouse, and Drosophila. We illustrate the uses of S-plot in a comparison involving Escherichia coli K12 and E. coli O157:H7.
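
    A much-simplified stand-in for the window analysis combined with dot-plot visualization described above is sketched below in Python: per-window k-mer frequency profiles for two sequences and a correlation-based similarity matrix between them. The window size, k-mer length, and metric are illustrative assumptions, not the actual S-plot implementation.

    ```python
    import itertools
    import numpy as np

    def window_kmer_profiles(seq, window=5000, k=4):
        """Frequency vector of all k-mers in consecutive windows of a sequence."""
        kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
        index = {km: i for i, km in enumerate(kmers)}
        profiles = []
        for start in range(0, len(seq) - window + 1, window):
            win = seq[start:start + window]
            counts = np.zeros(len(kmers))
            for i in range(len(win) - k + 1):
                j = index.get(win[i:i + k])
                if j is not None:          # skip k-mers containing N etc.
                    counts[j] += 1
            profiles.append(counts / max(counts.sum(), 1))
        return np.array(profiles)

    def similarity_matrix(profiles_a, profiles_b):
        """Pairwise correlation between window profiles of two genomes; high
        values mark similar regions, low rows/columns mark unusual regions."""
        a = profiles_a - profiles_a.mean(axis=1, keepdims=True)
        b = profiles_b - profiles_b.mean(axis=1, keepdims=True)
        a /= np.linalg.norm(a, axis=1, keepdims=True)
        b /= np.linalg.norm(b, axis=1, keepdims=True)
        return a @ b.T

    # Toy sequences standing in for two genomes
    seq_a = "ACGT" * 5000
    seq_b = "ACGT" * 2500 + "TTTT" * 2500
    print(similarity_matrix(window_kmer_profiles(seq_a), window_kmer_profiles(seq_b)))
    ```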

  8. Suite of tools for statistical N-gram language modeling for pattern mining in whole genome sequences.

    PubMed

    Ganapathiraju, Madhavi K; Mitchell, Asia D; Thahir, Mohamed; Motwani, Kamiya; Ananthasubramanian, Seshan

    2012-12-01

    Genome sequences contain a number of patterns that have biomedical significance. Repetitive sequences of various kinds are a primary component of most of the genomic sequence patterns. We extended the suffix-array based Biological Language Modeling Toolkit to compute n-gram frequencies as well as n-gram language-model based perplexity in windows over the whole genome sequence to find biologically relevant patterns. We present the suite of tools and their application for analysis on whole human genome sequence.
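
    A minimal sketch of the idea, in Python, is shown below: n-gram counting over a genomic string and a perplexity score for a window under a simple add-alpha background model. This is a deliberately simplified illustration; the actual toolkit is suffix-array based and uses proper smoothed language models.

    ```python
    from collections import Counter
    import math

    def ngram_counts(sequence, n):
        """Counts of overlapping n-grams in a genomic string (e.g. 'ACGT...')."""
        return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

    def window_perplexity(window, background_counts, n, alpha=1.0):
        """Perplexity of a window under an add-alpha n-gram background model."""
        total = sum(background_counts.values())
        vocab = 4 ** n                                   # A, C, G, T alphabet
        log_prob, m = 0.0, 0
        for i in range(len(window) - n + 1):
            gram = window[i:i + n]
            p = (background_counts.get(gram, 0) + alpha) / (total + alpha * vocab)
            log_prob += math.log(p)
            m += 1
        return math.exp(-log_prob / m)

    genome = "ACGT" * 2500 + "AAAAAAAAAA" * 100          # toy sequence with a repeat
    bg = ngram_counts(genome, 3)
    print(window_perplexity("ACGTACGTACGTACGT", bg, 3))   # low perplexity
    print(window_perplexity("AAAAAAAAAAAAAAAA", bg, 3))   # differs from background less
    ```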

  9. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  10. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  11. Recent advances in developing molecular tools for targeted genome engineering of mammalian cells.

    PubMed

    Lim, Kwang-il

    2015-01-01

    Various biological molecules naturally existing in diversified species, including fungi, bacteria, and bacteriophage, have functionalities for DNA binding and processing. These biological molecules have recently been actively engineered for use in customized genome editing of mammalian cells, as the molecule-encoding DNA sequence information and the underlying mechanisms of how the molecules work are unveiled. Excitingly, multiple novel methods based on the newly constructed artificial molecular tools have enabled modifications of specific endogenous genetic elements in the genome context at efficiencies that are much higher than those of conventional homologous recombination based methods. This minireview introduces the most recently spotlighted molecular genome engineering tools, with their key features and ongoing modifications for better performance. Such ongoing efforts have mainly focused on the removal of the inherent DNA sequence recognition rigidity from the original molecular platforms, the addition of newly tailored targeting functions into the engineered molecules, and the enhancement of their targeting specificity. Effective targeted genome engineering of mammalian cells will enable not only sophisticated genetic studies in the context of the genome, but also widely applicable universal therapeutics based on the pinpointing and correction of disease-causing genetic elements within the genome in the near future.

  12. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  13. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands a reliable, systematic, and computationally effective retrieval algorithm. In this paper, we present Neuron-Miner, which is a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built upon a hashing (search and retrieval) technique employing multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and preserve the inherent neuron neighborhoods while encoding them with compact binary codewords. We further introduce an inverse-coding formulation within HF to effectively mitigate pairwise neuron similarity comparisons, thus allowing scalability to massive databases with little additional time overhead. The proposed hashing tool has superior approximation of the true neuromorphological neighborhood with better retrieval and ranking performance in comparison to existing generalized hashing methods. This is exhaustively validated by quantifying the results over 31,266 neuron reconstructions from the NeuroMorpho.org dataset, curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner would assist neuroscientists in objectively understanding the relationship between neuronal structure and function for applications in comparative anatomy or diagnosis.
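
    To illustrate the general encode-then-rank workflow (compact binary codewords plus Hamming-distance retrieval), the Python sketch below substitutes random-hyperplane locality-sensitive hashing for the paper's tree-based Hashing Forests; the feature dimensions and data are synthetic assumptions.

    ```python
    import numpy as np

    def random_hyperplane_codes(features, n_bits=64, seed=0):
        """Binary codewords for feature vectors via random-hyperplane LSH.
        Note: Neuron-Miner's Hashing Forests use unsupervised random trees; this
        simpler scheme only illustrates the encode-then-rank idea."""
        rng = np.random.default_rng(seed)
        planes = rng.normal(size=(features.shape[1], n_bits))
        return (features @ planes > 0).astype(np.uint8)

    def hamming_rank(query_code, database_codes):
        """Rank database entries by Hamming distance to the query codeword."""
        dists = np.count_nonzero(database_codes != query_code, axis=1)
        return np.argsort(dists), dists

    # Toy morphometric features (e.g. branch counts, total length), standardized
    rng = np.random.default_rng(3)
    db_features = rng.normal(size=(1000, 16))
    codes = random_hyperplane_codes(db_features)
    order, dists = hamming_rank(codes[42], codes)
    print(order[:5], dists[order[:5]])      # nearest neighbours of entry 42
    ```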

  14. Using explanatory crop models to develop simple tools for Advanced Life Support system studies

    NASA Technical Reports Server (NTRS)

    Cavazzoni, J.

    2004-01-01

    System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions. c2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
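
    A minimal sketch of the kind of reduced model described above is a least-squares fit of a two-variable quadratic response surface to output sampled from an explanatory model; the Python example below uses made-up inputs (e.g., light level and CO2 concentration) and is purely illustrative, not the published Advanced Life Support model.

    ```python
    import numpy as np

    def fit_quadratic_surface(x1, x2, y):
        """Least-squares fit of a two-variable quadratic polynomial response surface."""
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        predict = lambda a, b: np.column_stack(
            [np.ones_like(a), a, b, a * b, a**2, b**2]) @ coeffs
        return coeffs, predict

    # Synthetic "explanatory model" output sampled on a grid of two inputs
    ppfd, co2 = np.meshgrid(np.linspace(200, 800, 7), np.linspace(330, 1200, 7))
    ppfd, co2 = ppfd.ravel(), co2.ravel()
    response = 5 + 0.02 * ppfd + 0.01 * co2 - 1e-5 * ppfd**2 + 2e-6 * ppfd * co2
    coeffs, predict = fit_quadratic_surface(ppfd, co2, response)
    print(predict(np.array([500.0]), np.array([700.0])))   # interpolated estimate
    ```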

  15. The advanced light source — a new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, A. S.

    1991-03-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory in Berkeley, California, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Undulators will generate high-brightness, partially coherent, plane polarized, soft x-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV. Wigglers and bend magnets will generate high fluxes of x-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms.

  16. The advanced light source: A new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, A. S.

    1990-09-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory in Berkeley, California, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Undulators will generate high-brightness, partially coherent, plane polarized, soft-x-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV. Wigglers and bend magnets will generate high fluxes of x-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms.

  17. Virtual charge state separator as an advanced tool coupling measurements and simulations

    NASA Astrophysics Data System (ADS)

    Yaramyshev, S.; Vormann, H.; Adonin, A.; Barth, W.; Dahl, L.; Gerhard, P.; Groening, L.; Hollinger, R.; Maier, M.; Mickat, S.; Orzhekhovskaya, A.

    2015-05-01

    A new low energy beam transport for a multicharge uranium beam will be built at the GSI High Current Injector (HSI). All uranium charge states coming from the new ion source will be injected into the GSI heavy ion high current HSI Radio Frequency Quadrupole (RFQ), but only the design ions U4+ will be accelerated to the final RFQ energy. Detailed knowledge of the injected beam current and emittance for the pure design U4+ ions is necessary for proper beam line design, commissioning and operation, while measurements are possible only for the full beam including all charge states. Detailed measurements of the beam current and emittance are performed behind the first quadrupole triplet of the beam line. A dedicated algorithm, based on a combination of measurements and the results of advanced beam dynamics simulations, provides for the extraction of beam current and emittance values for only the U4+ component of the beam. The proposed methods and obtained results are presented.

  18. Advanced Vibration Analysis Tools and New Strategies for Robust Design of Turbine Engine Rotors

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2002-01-01

    The adverse effects of small, random structural irregularities among the blades, called mistuning, can result in blade forced-response amplitudes and stresses that are much larger than those predicted for a perfectly tuned rotor. Manufacturing tolerances, deviations in material properties, or nonuniform operational wear cause mistuning; therefore, mistuning is unavoidable. Furthermore, even a small mistuning can have a dramatic effect on the vibratory behavior of a rotor because it can lead to spatial localization of the vibration energy. As a result, certain blades may experience forced-response amplitudes and stresses that are substantially larger than those predicted by an analysis of the nominal (tuned) design. Unfortunately, these random uncertainties in blade properties, and the immense computational effort involved in obtaining statistically reliable design data, combine to make this aspect of rotor design cumbersome.
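
    The localization phenomenon described above can be reproduced with a textbook-style lumped-parameter model: a cyclic chain of blades with randomly perturbed stiffness, whose mode shapes are examined with an inverse participation ratio. The Python sketch below is illustrative only and is not one of the NASA analysis tools.

    ```python
    import numpy as np

    def mistuned_bladed_disk_modes(n_blades=24, coupling=0.05, sigma=0.02, seed=0):
        """Cyclic chain of blades with random stiffness mistuning.
        Returns natural frequencies and an inverse participation ratio per mode:
        values of order 1/n_blades indicate extended modes, values approaching 1
        indicate strong localization of vibration energy on a few blades."""
        rng = np.random.default_rng(seed)
        k_blade = 1.0 + sigma * rng.standard_normal(n_blades)   # mistuned stiffness
        K = np.diag(k_blade + 2.0 * coupling)
        for i in range(n_blades):                                # cyclic coupling terms
            K[i, (i + 1) % n_blades] -= coupling
            K[i, (i - 1) % n_blades] -= coupling
        eigvals, eigvecs = np.linalg.eigh(K)                     # unit blade masses
        freqs = np.sqrt(eigvals)
        ipr = np.sum(eigvecs**4, axis=0) / np.sum(eigvecs**2, axis=0) ** 2
        return freqs, ipr

    freqs, ipr = mistuned_bladed_disk_modes(sigma=0.05)
    print("largest modal inverse participation ratio:", ipr.max())
    ```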

  19. Microfluidic chips with multi-junctions: an advanced tool in recovering proteins from inclusion bodies.

    PubMed

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2015-01-01

    Active recombinant proteins are used for studying the biological functions of genes and for the development of therapeutic drugs. Overexpression of recombinant proteins in bacteria often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. Protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, the conventional refolding method of dialysis or dilution is time-consuming and recovered active protein yields are often low, and a cumbersome trial-and-error process is required to achieve success. To circumvent these difficulties, we used controllable diffusion through laminar flow in microchannels to regulate the denaturant concentration. This method largely aims at reducing protein aggregation during the refolding procedure. This Commentary introduces the principles of the protein refolding method using microfluidic chips and the advantage of our results as a tool for rapid and efficient recovery of active recombinant proteins from inclusion bodies.

  20. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success, by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  1. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online.

    PubMed

    Posada, David

    2006-07-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at http://darwin.uvigo.es/software/modeltest_server.html.
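
    The criterion-based selection step that ModelTest performs on user-supplied likelihood scores can be sketched in a few lines. In the Python example below, the candidate models, their maximized log-likelihoods, their free-parameter counts and the alignment length are all made-up illustrative values; the hierarchical likelihood ratio tests offered by the server are not reproduced here.

        # Minimal sketch of information-criterion model selection as used by
        # ModelTest-style tools. The log-likelihoods, parameter counts and the
        # alignment length below are made-up illustrative values.
        import math

        n_sites = 1000  # alignment length (sample size for BIC), assumed
        candidates = {  # model: (maximized log-likelihood, free parameters)
            "JC69":  (-5123.4, 0),
            "HKY85": (-5038.9, 4),
            "GTR":   (-5030.2, 8),
            "GTR+G": (-4987.6, 9),
        }

        def aic(lnL, k):
            return -2.0 * lnL + 2.0 * k

        def bic(lnL, k, n):
            return -2.0 * lnL + k * math.log(n)

        for crit, score in [("AIC", lambda m: aic(*candidates[m])),
                            ("BIC", lambda m: bic(*candidates[m], n_sites))]:
            best = min(candidates, key=score)
            # Akaike/Schwarz weights quantify model-selection uncertainty.
            scores = {m: score(m) for m in candidates}
            dmin = min(scores.values())
            weights = {m: math.exp(-0.5 * (s - dmin)) for m, s in scores.items()}
            total = sum(weights.values())
            print(crit, "best:", best,
                  "weights:", {m: round(w / total, 3) for m, w in weights.items()})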

  2. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online

    PubMed Central

    Posada, David

    2006-01-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at http://darwin.uvigo.es/software/modeltest_server.html. PMID:16845102

  3. Advancing spaceborne tools for the characterization of planetary ionospheres and circumstellar environments

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan Streets

    This work explores remote sensing of planetary atmospheres and their circumstellar surroundings. The terrestrial ionosphere is a highly variable space plasma embedded in the thermosphere. Generated by solar radiation and predominantly composed of oxygen ions at high altitudes, the ionosphere is dynamically and chemically coupled to the neutral atmosphere. Variations in ionospheric plasma density impact radio astronomy and communications. Inverting observations of 83.4 nm photons resonantly scattered by singly ionized oxygen holds promise for remotely sensing the ionospheric plasma density. This hypothesis was tested by comparing 83.4 nm limb profiles recorded by the Remote Atmospheric and Ionospheric Detection System aboard the International Space Station to a forward model driven by coincident plasma densities measured independently via ground-based incoherent scatter radar. A comparison study of two separate radar overflights with different limb profile morphologies found agreement between the forward model and measured limb profiles. A new implementation of Chapman parameter retrieval via Markov chain Monte Carlo techniques quantifies the precision of the plasma densities inferred from 83.4 nm emission profiles. This first study demonstrates the utility of 83.4 nm emission for ionospheric remote sensing. Future visible and ultraviolet spectroscopy will characterize the composition of exoplanet atmospheres; therefore, the second study advances technologies for the direct imaging and spectroscopy of exoplanets. Such spectroscopy requires the development of new technologies to separate relatively dim exoplanet light from parent star light. High-contrast observations at short wavelengths require spaceborne telescopes to circumvent atmospheric aberrations. The Planet Imaging Concept Testbed Using a Rocket Experiment (PICTURE) team designed a suborbital sounding rocket payload to demonstrate visible light high-contrast imaging with a visible nulling coronagraph

  4. Multistate Statistical Modeling: A Tool to Build a Lung Cancer Microsimulation Model That Includes Parameter Uncertainty and Patient Heterogeneity.

    PubMed

    Bongers, Mathilda L; de Ruysscher, Dirk; Oberije, Cary; Lambin, Philippe; Uyl-de Groot, Carin A; Coupé, V M H

    2016-01-01

    With the shift toward individualized treatment, cost-effectiveness models need to incorporate patient and tumor characteristics that may be relevant to treatment planning. In this study, we used multistate statistical modeling to inform a microsimulation model for cost-effectiveness analysis of individualized radiotherapy in lung cancer. The model tracks clinical events over time and takes patient and tumor features into account. Four clinical states were included in the model: alive without progression, local recurrence, metastasis, and death. Individual patients were simulated by repeatedly sampling a patient profile, consisting of patient and tumor characteristics. The transitioning of patients between the health states is governed by personalized time-dependent hazard rates, which were obtained from multistate statistical modeling (MSSM). The model simulations for both the individualized and conventional radiotherapy strategies demonstrated internal and external validity. Therefore, MSSM is a useful technique for obtaining the correlated individualized transition rates that are required for the quantification of a microsimulation model. Moreover, we have used the hazard ratios, their 95% confidence intervals, and their covariance to quantify the parameter uncertainty of the model in a correlated way. The obtained model will be used to evaluate the cost-effectiveness of individualized radiotherapy treatment planning, including the uncertainty of input parameters. We discuss the model-building process and the strengths and weaknesses of using MSSM in a microsimulation model for individualized radiotherapy in lung cancer.
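
    A minimal sketch of the kind of four-state microsimulation described above is given below in Python. The per-cycle transition probabilities are fixed, made-up constants used only for illustration; in the actual model they would be personalized, time-dependent hazards obtained from the multistate statistical modeling.

        # Minimal sketch of a four-state microsimulation of the kind described
        # above. The per-cycle transition probabilities are made-up constants;
        # in the actual model they would be personalized, time-dependent hazards
        # obtained from multistate statistical modeling.
        import numpy as np

        rng = np.random.default_rng(1)
        states = ["alive_no_progression", "local_recurrence", "metastasis", "dead"]

        # Row = current state, column = next state (monthly cycle); rows sum to 1.
        P = np.array([
            [0.96, 0.02, 0.01, 0.01],   # alive without progression
            [0.00, 0.90, 0.06, 0.04],   # local recurrence
            [0.00, 0.00, 0.92, 0.08],   # metastasis
            [0.00, 0.00, 0.00, 1.00],   # death (absorbing)
        ])

        def simulate_patient(n_cycles=60):
            """Track one simulated patient over n_cycles monthly cycles."""
            state, trajectory = 0, [0]
            for _ in range(n_cycles):
                state = rng.choice(4, p=P[state])
                trajectory.append(state)
            return trajectory

        cohort = [simulate_patient() for _ in range(5000)]
        alive_at_5y = np.mean([traj[-1] != 3 for traj in cohort])
        print("simulated 5-year overall survival:", round(alive_at_5y, 3))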

  5. Multivariate Statistical Analysis: a tool for groundwater quality assessment in the hydrogeologic region of the Ring of Cenotes, Yucatan, Mexico.

    NASA Astrophysics Data System (ADS)

    Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.

    2014-12-01

    The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have already been polluted; protecting this resource is therefore essential, because it is the only source of potable water for the entire state. Through the assessment of groundwater quality we can gain some knowledge about the main processes governing water chemistry, as well as about the spatial patterns that are important for establishing protection zones. In this work multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data of the study area. Results of the principal component analysis show that the main sources of variation in the data are sea water intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters seems to be random, but it is consistent with sea water intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.

  6. The advanced light source at Lawrence Berkeley laboratory: a new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, Alfred S.; Robinson, Arthur L.

    1991-04-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Based on a low-emittance electron storage ring optimized to operate at 1.5 GeV, the ALS will have 10 long straight sections available for insertion devices (undulators and wigglers) and 24 high-quality bend-magnet ports. The short pulse width (30-50 ps) will be ideal for time-resolved measurements. Undulators will generate high-brightness partially coherent soft X-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV; this radiation is plane polarized. Wigglers and bend magnets will extend the spectrum by generating high fluxes of X-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms. The high brightness will open new areas of research in the materials sciences, such as spatially resolved spectroscopy (spectromicroscopy), and in biology, such as X-ray microscopy with element-specific sensitivity; the high flux will allow measurements in atomic physics and chemistry to be made with tenuous gas-phase targets. Technological applications could include lithography and nano-fabrication.

  7. Advances in developing molecular-diagnostic tools for strongyloid nematodes of equids: fundamental and applied implications.

    PubMed

    Gasser, Robin B; Hung, Guo-Chiuan; Chilton, Neil B; Beveridge, Ian

    2004-02-01

    Infections of equids with parasitic nematodes of the order Strongylida (subfamilies Strongylinae and Cyathostominae) are of major veterinary importance. In recent decades, the widespread use of drugs against these parasites has led to problems of resistance within the Cyathostominae, and to an increase in their prevalence and intensity of infection. Novel control strategies, based on improved knowledge of parasite biology and epidemiology, have thus become important. However, there are substantial limitations in the understanding of fundamental biological and systematic aspects of these parasites, which have been due largely to limitations in their specific identification and diagnosis using traditional, morphological approaches. Recently, there has been progress in the development of DNA-based approaches for the specific identification of strongyloids of equids for systematic studies and disease diagnosis. The present article briefly reviews information on the classification, biology, pathogenesis and epidemiology of equine strongyloids and the diagnosis of infections, highlights knowledge gaps in these areas, and describes recent advances in the use of molecular techniques for the genetic characterisation, specific identification and differentiation of strongyloids of equids as a basis for fundamental investigations of their systematics, population biology and ecology.

  8. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with a solid sting operated at different Reynolds and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely the Menter Shear Stress Transport (SST), the basic k-epsilon, and the Spalart-Allmaras (SA) models, are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  9. Terahertz pulsed imaging as an advanced characterisation tool for film coatings--a review.

    PubMed

    Haaser, Miriam; Gordon, Keith C; Strachan, Clare J; Rades, Thomas

    2013-12-05

    Solid dosage forms are the pharmaceutical drug delivery systems of choice for oral drug delivery. These solid dosage forms are often coated to modify the physico-chemical properties of the active pharmaceutical ingredients (APIs), in particular to alter release kinetics. Since the product performance of coated dosage forms is a function of their critical coating attributes, including coating thickness, uniformity, and density, more advanced quality control techniques than weight gain are required. A recently introduced non-destructive method to quantitatively characterise coating quality is terahertz pulsed imaging (TPI). The ability of terahertz radiation to penetrate many pharmaceutical materials enables structural features of coated solid dosage forms to be probed at depth, which is not readily achievable with other established imaging techniques, e.g. near-infrared (NIR) and Raman spectroscopy. In this review TPI is introduced and various applications of the technique in pharmaceutical coating analysis are discussed. These include evaluation of coating thickness, uniformity, surface morphology, density, defects and buried structures as well as correlation between TPI measurements and drug release performance, coating process monitoring and scale up. Furthermore, challenges and limitations of the technique are discussed.

  10. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.
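
    The standard survival-analysis side of such a tool (a Kaplan-Meier curve plus a Cox regression) can be sketched with the third-party lifelines library, assumed available, on synthetic data; the sketch below is not the OSA application itself and does not include its ANN-based discrete-time models.

        # Sketch of the standard survival-analysis steps mentioned above
        # (survival curve and a Cox regression) using the lifelines library
        # on synthetic data; this is not the OSA application itself.
        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(2)
        n = 300
        age = rng.normal(60, 10, n)
        treated = rng.integers(0, 2, n)
        # Synthetic event times whose hazard depends on age and treatment.
        baseline = rng.exponential(36, n)
        time = baseline * np.exp(-0.02 * (age - 60) + 0.4 * treated)
        event = (time < 60).astype(int)          # administrative censoring at 60 months
        time = np.minimum(time, 60)

        df = pd.DataFrame({"time": time, "event": event, "age": age, "treated": treated})

        kmf = KaplanMeierFitter()
        kmf.fit(df["time"], event_observed=df["event"])
        print("median survival (months):", kmf.median_survival_time_)

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        cph.print_summary()                      # hazard ratios for age and treatment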

  11. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  12. A new tool to give hospitalists feedback to improve interprofessional teamwork and advance patient care.

    PubMed

    Chesluk, Benjamin J; Bernabeo, Elizabeth; Hess, Brian; Lynn, Lorna A; Reddy, Siddharta; Holmboe, Eric S

    2012-11-01

    Teamwork is a vital skill for health care professionals, but the fragmented systems within which they work frequently do not recognize or support good teamwork. The American Board of Internal Medicine has developed and is testing the Teamwork Effectiveness Assessment Module (TEAM), a tool for physicians to evaluate how they perform as part of an interprofessional patient care team. The assessment provides hospitalist physicians with feedback data drawn from their own work of caring for patients, in a way that is intended to support immediate, concrete change efforts to improve the quality of patient care. Our approach demonstrates the value of looking at teamwork in the real world of health care, that is, as it occurs in the actual contexts in which providers work together to care for patients. The assessment of individual physicians' teamwork competencies may play a role in the larger effort to bring disparate health professions together in a system that supports and rewards a team approach in the hope of improving patient care.

  13. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.

  14. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer

    PubMed Central

    2013-01-01

    Background In radiation oncology recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. Methods After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volumes. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Results Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm margin; however, this might be due to very rapid growth and/or late detection of the tumor progression. Conclusion The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach to and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition. PMID:24499557
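
    The central overlap metric described above, the percentage of the segmented recurrence volume lying inside the 80%-isodose volume, reduces to counting voxels in two binary masks. The Python sketch below illustrates this on a synthetic dose grid and a synthetic spherical recurrence; the geometry and dose fall-off are assumptions for illustration only.

        # Minimal sketch of the overlap metric described above: the percentage of
        # the segmented recurrence volume lying inside the 80%-isodose volume.
        # The dose grid and the spherical recurrence below are synthetic.
        import numpy as np

        shape = (60, 60, 60)
        z, y, x = np.indices(shape)

        # Synthetic dose distribution: 100% at the centre, falling off radially.
        r = np.sqrt((x - 30) ** 2 + (y - 30) ** 2 + (z - 30) ** 2)
        dose = 100.0 * np.exp(-r / 25.0)

        # Synthetic recurrence: a small sphere offset from the dose maximum.
        recurrence = np.sqrt((x - 38) ** 2 + (y - 30) ** 2 + (z - 30) ** 2) < 6

        isodose80 = dose >= 80.0
        overlap_voxels = np.logical_and(recurrence, isodose80).sum()
        fraction_inside = overlap_voxels / recurrence.sum()
        print(f"{100 * fraction_inside:.1f}% of the recurrence volume lies "
              f"inside the 80% isodose volume")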

  15. Randomized Controlled Trial of a Video Decision Support Tool for Cardiopulmonary Resuscitation Decision Making in Advanced Cancer

    PubMed Central

    Volandes, Angelo E.; Paasche-Orlow, Michael K.; Mitchell, Susan L.; El-Jawahri, Areej; Davis, Aretha Delight; Barry, Michael J.; Hartshorn, Kevan L.; Jackson, Vicki Ann; Gillick, Muriel R.; Walker-Corkery, Elizabeth S.; Chang, Yuchiao; López, Lenny; Kemeny, Margaret; Bulone, Linda; Mann, Eileen; Misra, Sumi; Peachey, Matt; Abbo, Elmer D.; Eichler, April F.; Epstein, Andrew S.; Noy, Ariela; Levin, Tomer T.; Temel, Jennifer S.

    2013-01-01

    Purpose Decision making regarding cardiopulmonary resuscitation (CPR) is challenging. This study examined the effect of a video decision support tool on CPR preferences among patients with advanced cancer. Patients and Methods We performed a randomized controlled trial of 150 patients with advanced cancer from four oncology centers. Participants in the control arm (n = 80) listened to a verbal narrative describing CPR and the likelihood of successful resuscitation. Participants in the intervention arm (n = 70) listened to the identical narrative and viewed a 3-minute video depicting a patient on a ventilator and CPR being performed on a simulated patient. The primary outcome was participants' preference for or against CPR measured immediately after exposure to either modality. Secondary outcomes were participants' knowledge of CPR (score range of 0 to 4, with higher score indicating more knowledge) and comfort with video. Results The mean age of participants was 62 years (standard deviation, 11 years); 49% were women, 44% were African American or Latino, and 47% had lung or colon cancer. After the verbal narrative, in the control arm, 38 participants (48%) wanted CPR, 41 (51%) wanted no CPR, and one (1%) was uncertain. In contrast, in the intervention arm, 14 participants (20%) wanted CPR, 55 (79%) wanted no CPR, and 1 (1%) was uncertain (unadjusted odds ratio, 3.5; 95% CI, 1.7 to 7.2; P < .001). Mean knowledge scores were higher in the intervention arm than in the control arm (3.3 ± 1.0 v 2.6 ± 1.3, respectively; P < .001), and 65 participants (93%) in the intervention arm were comfortable watching the video. Conclusion Participants with advanced cancer who viewed a video of CPR were less likely to opt for CPR than those who listened to a verbal narrative. PMID:23233708

  16. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  17. Earth remote sensing as an effective tool for the development of advanced innovative educational technologies

    NASA Astrophysics Data System (ADS)

    Mayorova, Vera; Mayorov, Kirill

    2009-11-01

    The current educational system faces a contradiction between the fundamental character of engineering education and the need to extend applied learning, which requires new training methods that balance academic and practical knowledge. As a result, a number of innovations are being developed and implemented in the educational process, aimed at optimizing the quality of the entire educational system. Among the wide range of innovative educational technologies, an especially important subset involves learning through hands-on scientific and technical projects. The purpose of this paper is to describe the implementation of educational technologies based on the development of small satellites, as well as the use of Earth remote sensing data acquired from these satellites. The increased public attention to education through Earth remote sensing reflects the concern that, although great progress has been made in developing new methods of Earth imagery and remote sensing data acquisition, the question of practical applications of this kind of data remains open. It is important to develop a new way of thinking in the next generation, so that they understand that they are the masters of their own planet and are responsible for its state. They should want, and be able, to use a powerful set of tools based on current and future Earth remote sensing. For example, NASA sponsors the "Classroom of the Future" project. The Universities Space Research Association in the United States provides a mechanism through which US universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology, and to promote education in these areas. It also aims at understanding the Earth as a system and promoting the role of humankind in the destiny of their own planet. The Association has founded a Journal of Earth System

  18. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder

    PubMed Central

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating the diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in the saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for the interpretation of results obtained by laboratory diagnostic methods. PMID:26380376
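
    The two unsupervised methods named above, hierarchical cluster analysis with Ward's linkage and PCA, can be sketched in a few lines of Python on a synthetic patient-by-variable matrix; the random data below only mimic the dimensions of the study (97 subjects, 16 variables) and carry none of its content.

        # Minimal sketch of the two unsupervised methods named above (hierarchical
        # cluster analysis with Ward linkage, and PCA) applied to a synthetic
        # patient-by-variable matrix; the data themselves are illustrative.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        n_patients, n_variables = 97, 16       # matches the data-set size in the abstract
        X = rng.normal(size=(n_patients, n_variables))

        # Standardize so variables on different scales contribute equally.
        Xs = StandardScaler().fit_transform(X)

        # Hierarchical cluster analysis (Ward linkage), cut into 3 clusters.
        Z = linkage(Xs, method="ward")
        clusters = fcluster(Z, t=3, criterion="maxclust")
        print("patients per cluster:", np.bincount(clusters)[1:])

        # PCA: how much variance do the first few components explain?
        pca = PCA(n_components=3).fit(Xs)
        print("explained variance ratios:", pca.explained_variance_ratio_.round(3))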

  19. From Enormous 3D Maps of the Universe to Astrophysical and Cosmological Constraints: Statistical Tools for Realizing the Promise of 21 cm Cosmology

    NASA Astrophysics Data System (ADS)

    Dillon, Joshua S.; Tegmark, Max

    2015-01-01

    21 cm cosmology promises to provide an exquisite probe of astrophysics and cosmology during the cosmic dark ages and the epoch of reionization. An enormous volume of the universe, previously inaccessible, can be directly mapped by looking for the faint signal from the hyperfine transition of neutral hydrogen. One day, 21 cm tomography could even eclipse the CMB as the most precise test of our cosmological models. Realizing that promise, however, has proven extremely challenging. We're looking for a small signal buried under foregrounds orders of magnitude stronger. We know that we're going to need very sensitive, and thus very large, low frequency interferometers. Those large interferometers produce vast quantities of data, which must be carefully analyzed. In this talk, I will present my Ph.D. work at MIT on the development and application of rigorous, fast, and robust statistical tools for extracting that cosmological signal while maintaining a thorough understanding of the error properties of those measurements. These tools reduce vast quantities of interferometric data into statistics, like the power spectrum, that can be directly compared with theory and simulation, all while minimizing the amount of cosmological information lost. I will also present results from applying those techniques to data from the Murchison Widefield Array and will discuss the exciting science they will enable with the upcoming Hydrogen Epoch of Reionization Array.

  20. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder.

    PubMed

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating the diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in the saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for the interpretation of results obtained by laboratory diagnostic methods.

  1. High-throughput manufacturing of size-tuned liposomes by a new microfluidics method using enhanced statistical tools for characterization.

    PubMed

    Kastner, Elisabeth; Kaur, Randip; Lowry, Deborah; Moghaddam, Behfar; Wilkinson, Alexander; Perrie, Yvonne

    2014-12-30

    Microfluidics has recently emerged as a new method of manufacturing liposomes, which allows for reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters in a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used for increased process understanding and the development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes and correlated strongly with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidic process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate while maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modelling. Mathematical modelling identified the FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics.
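
    The kind of design-of-experiments regression described above, relating liposome size to the two factors FRR and TFR, can be sketched as a small least-squares response-surface fit. The factor settings and size "measurements" in the Python example below are fabricated for illustration and are not data from the study.

        # Minimal sketch of a design-of-experiments style regression relating
        # liposome size to flow rate ratio (FRR) and total flow rate (TFR).
        # The "measurements" below are made-up values, not data from the paper.
        import numpy as np

        # Factor settings (FRR, TFR in mL/min) and a fabricated size response (nm).
        frr = np.array([1, 1, 3, 3, 5, 5, 1, 3, 5], dtype=float)
        tfr = np.array([2, 6, 2, 6, 2, 6, 4, 4, 4], dtype=float)
        size = np.array([130, 125, 75, 70, 52, 50, 128, 72, 51], dtype=float)

        # Response-surface model: size ~ b0 + b1*FRR + b2*TFR + b3*FRR*TFR
        A = np.column_stack([np.ones_like(frr), frr, tfr, frr * tfr])
        coef, *_ = np.linalg.lstsq(A, size, rcond=None)
        print("coefficients (b0, FRR, TFR, FRR*TFR):", coef.round(2))

        # Predict size for a new operating point, e.g. FRR = 5, TFR = 8 mL/min.
        new = np.array([1.0, 5.0, 8.0, 5.0 * 8.0])
        print("predicted size at FRR=5, TFR=8:", round(float(new @ coef), 1), "nm")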

  2. Heavy metal assessment using geochemical and statistical tools in the surface sediments of Vembanad Lake, Southwest Coast of India.

    PubMed

    Selvam, A Paneer; Priya, S Laxmi; Banerjee, Kakolee; Hariharan, G; Purvaja, R; Ramesh, R

    2012-10-01

    The geochemical distribution and enrichment of ten heavy metals in the surface sediments of Vembanad Lake, southwest coast of India, were evaluated. Sediment samples from 47 stations in the lake were collected during the dry and wet seasons in 2008 and examined for heavy metal content (Al, Fe, Mn, Cr, Zn, Ni, Pb, Cu, Co, Cd), organic carbon, and sediment texture. Statistically significant spatial variation was observed among all sediment variables, but seasonal variation was negligible. Correlation analysis showed that the metal content of the sediments was mainly regulated by organic carbon, Fe oxy-hydroxides, and grain size. Principal component analysis was used to reduce the 14 sediment variables to three factors that reveal distinct origins or accumulation mechanisms controlling the chemical composition in the study area. The pollution intensity of Vembanad Lake was measured using the enrichment factor and the pollution load index. Severe to moderately severe enrichment of Cd and Zn in the northern estuary, with minor enrichment of Pb and Cr, was observed, which reflects the intensity of the anthropogenic inputs related to industrial discharge into this system. The pollution load index results reveal that the sediment was heavily polluted in the northern arm and moderately polluted at the extreme end and port region of the southern arm of the lake. A comparison with the sediment quality guideline quotient was also made, indicating that there may be some ecotoxicological risk to benthic organisms in these sediments.
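
    The two indices used above are simple ratios: the enrichment factor normalizes a metal concentration to a conservative element (Al here) in both sample and background, and the pollution load index is the geometric mean of the contamination factors. The Python sketch below illustrates both; all concentration and background values are assumed, not the study's data.

        # Minimal sketch of the two pollution indices named above: the enrichment
        # factor (EF, normalized here to Al) and the pollution load index (PLI,
        # the geometric mean of contamination factors). All concentration and
        # background values are illustrative, not the study's data.
        import numpy as np

        # Measured concentrations in one sediment sample (mg/kg), illustrative.
        sample = {"Al": 65000.0, "Cd": 1.8, "Zn": 310.0, "Pb": 42.0, "Cr": 95.0}
        # Background (crustal/baseline) concentrations, illustrative.
        background = {"Al": 80000.0, "Cd": 0.3, "Zn": 95.0, "Pb": 20.0, "Cr": 90.0}

        def enrichment_factor(metal):
            """EF = (C_metal / C_Al)_sample / (C_metal / C_Al)_background."""
            return (sample[metal] / sample["Al"]) / (background[metal] / background["Al"])

        metals = ["Cd", "Zn", "Pb", "Cr"]
        for m in metals:
            print(f"EF({m}) = {enrichment_factor(m):.2f}")

        # Contamination factors and the Tomlinson pollution load index.
        cf = np.array([sample[m] / background[m] for m in metals])
        pli = cf.prod() ** (1.0 / len(cf))
        print(f"PLI = {pli:.2f}  (> 1 suggests overall pollution of the site)")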

  3. A semi-automated software tool to study treadmill locomotion in the rat: from experiment videos to statistical gait analysis.

    PubMed

    Gravel, P; Tremblay, M; Leblond, H; Rossignol, S; de Guise, J A

    2010-07-15

    A computer-aided method for the tracking of morphological markers in fluoroscopic images of a rat walking on a treadmill is presented and validated. The markers correspond to bone articulations in a hind leg and are used to define the hip, knee, ankle and metatarsophalangeal joints. The method allows a user to identify, using a computer mouse, about 20% of the marker positions in a video and interpolate their trajectories from frame-to-frame. This results in a seven-fold speed improvement in detecting markers. This also eliminates confusion problems due to legs crossing and blurred images. The video images are corrected for geometric distortions from the X-ray camera, wavelet denoised, to preserve the sharpness of minute bone structures, and contrast enhanced. From those images, the marker positions across video frames are extracted, corrected for rat "solid body" motions on the treadmill, and used to compute the positional and angular gait patterns. Robust Bootstrap estimates of those gait patterns and their prediction and confidence bands are finally generated. The gait patterns are invaluable tools to study the locomotion of healthy animals or the complex process of locomotion recovery in animals with injuries. The method could, in principle, be adapted to analyze the locomotion of other animals as long as a fluoroscopic imager and a treadmill are available.

  4. Application of modern tests for stationarity to single-trial MEG data: transferring powerful statistical tools from econometrics to neuroscience.

    PubMed

    Kipiński, Lech; König, Reinhard; Sielużycki, Cezary; Kordecki, Wojciech

    2011-10-01

    Stationarity is a crucial yet rarely questioned assumption in the analysis of time series of magneto- (MEG) or electroencephalography (EEG). One key drawback of the commonly used tests for stationarity of encephalographic time series is the fact that conclusions on stationarity are only indirectly inferred, either from the Gaussianity of the time series (e.g. the Shapiro-Wilk or Kolmogorov-Smirnov test) or from its randomness and the absence of trend using very simple time-series models (e.g. the sign and trend tests by Bendat and Piersol). We present a novel approach to the analysis of the stationarity of MEG and EEG time series by applying modern statistical methods which were specifically developed in econometrics to verify the hypothesis that a time series is stationary. We report our findings from the application of three different tests of stationarity--the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test for trend or mean stationarity, the Phillips-Perron (PP) test for the presence of a unit root and the White test for homoscedasticity--on an illustrative set of MEG data. For five stimulation sessions, we found, already for short epochs of 250 and 500 ms duration, that although the majority of the studied epochs of single MEG trials were usually mean-stationary (KPSS test and PP test), they were classified as nonstationary due to their heteroscedasticity (White test). We also observed that the presence of external auditory stimulation did not significantly affect the findings regarding the stationarity of the data. We conclude that the combination of these tests allows a refined analysis of the stationarity of MEG and EEG time series.
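
    A sketch of this three-test check on a single synthetic "epoch" is given below using the statsmodels package (assumed available). KPSS and White's test are provided by statsmodels; a Phillips-Perron implementation is not, so an augmented Dickey-Fuller (ADF) unit-root test is used here as a stand-in for the unit-root check.

        # Sketch of the three-test stationarity check described above, applied to
        # one synthetic "single-trial epoch". KPSS and White's test come from
        # statsmodels; an ADF unit-root test stands in for the Phillips-Perron test.
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.tsa.stattools import kpss, adfuller
        from statsmodels.stats.diagnostic import het_white

        rng = np.random.default_rng(4)
        n = 500                                   # e.g. a 500 ms epoch at 1 kHz
        t = np.arange(n)
        # Mean-stationary but heteroscedastic signal: noise variance grows with time.
        x = rng.standard_normal(n) * (1.0 + t / n)

        kpss_stat, kpss_p, _, _ = kpss(x, regression="c", nlags="auto")
        print(f"KPSS (mean stationarity): p = {kpss_p:.3f}  (small p -> nonstationary)")

        adf_stat, adf_p, *_ = adfuller(x)
        print(f"ADF (unit root, PP stand-in): p = {adf_p:.3f}  (small p -> no unit root)")

        # White's test: regress the signal on time, test residuals for heteroscedasticity.
        exog = sm.add_constant(t.astype(float))
        resid = sm.OLS(x, exog).fit().resid
        _, white_p, _, _ = het_white(resid, exog)
        print(f"White (homoscedasticity): p = {white_p:.3f}  (small p -> heteroscedastic)")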

  5. M-TraCE: a new tool for high-resolution computation and statistical elaboration of backward trajectories on the Italian domain

    NASA Astrophysics Data System (ADS)

    Vitali, Lina; Righini, Gaia; Piersanti, Antonio; Cremona, Giuseppe; Pace, Giandomenico; Ciancarella, Luisella

    2016-11-01

    Air backward trajectory calculations are commonly used in a variety of atmospheric analyses, in particular for source attribution evaluation. The accuracy of backward trajectory analysis is mainly determined by the quality and the spatial and temporal resolution of the underlying meteorological data set, especially in cases of complex terrain. This work describes a new tool for the calculation and the statistical elaboration of backward trajectories. To take advantage of the high-resolution meteorological database of the Italian national air quality model MINNI, a dedicated set of procedures was implemented under the name of M-TraCE (MINNI module for Trajectories Calculation and statistical Elaboration) to calculate and process the backward trajectories of air masses reaching a site of interest. Some outcomes from the application of the developed methodology to the Italian Network of Special Purpose Monitoring Stations are shown to assess its strengths for the meteorological characterization of air quality monitoring stations. M-TraCE has demonstrated its capability to provide a detailed statistical assessment of transport patterns and of the region of influence of the site under investigation, which is fundamental for correctly interpreting pollutant measurements and ascertaining the official classification of the monitoring site based on meta-data information. Moreover, M-TraCE has shown its usefulness in supporting other assessments, i.e., the spatial representativeness of a monitoring site, focussing specifically on the analysis of the effects due to meteorological variables.

  6. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).

  7. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
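
    The binary-mixing mass balance mentioned above reduces, for a fixed pair of end members, to a one-parameter least-squares problem: find the fraction x of the andesitic end member that best reproduces the hybrid composition element by element. The Python sketch below illustrates this; all concentrations are made-up values, not analyses from the study.

        # Minimal least-squares sketch of the binary mixing mass balance mentioned
        # above: estimate the fraction x of the andesitic end member in a mixed
        # (comingled) lava, assuming C_mix = x*C_andesite + (1-x)*C_dacite for each
        # element. All concentrations below are made-up illustrative values (wt%).
        import numpy as np

        elements   = ["SiO2", "MgO", "CaO", "FeO*", "K2O"]
        c_dacite   = np.array([66.0, 1.5, 3.8, 3.5, 2.8])
        c_andesite = np.array([58.0, 4.5, 6.9, 6.5, 1.6])
        c_mixed    = np.array([63.5, 2.4, 4.7, 4.4, 2.4])   # hybrid lava

        # One-parameter least squares: x minimizes ||c_mixed - (x*c_a + (1-x)*c_d)||^2
        d = c_andesite - c_dacite
        x = float(np.dot(c_mixed - c_dacite, d) / np.dot(d, d))
        residual = c_mixed - (x * c_andesite + (1 - x) * c_dacite)

        print(f"estimated andesitic fraction x = {x:.2f}")
        print("per-element misfit:", dict(zip(elements, residual.round(2))))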

  8. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate degree programs in systems engineering and the computing sciences (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  9. Topology for statistical modeling of petascale data.

    SciTech Connect

    Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Recent advances in mathematical criminology. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Rodríguez, Nancy

    2015-03-01

    The use of mathematical tools has long proved useful in gaining understanding of complex systems in physics [1]. Recently, many researchers have realized that there is an analogy between emerging phenomena in complex social systems and complex physical or biological systems [4,5,12]. This realization has particularly benefited the modeling and understanding of crime, a ubiquitous phenomenon that is far from being understood. In fact, when one is interested in the bulk behavior of patterns that emerge from small and seemingly unrelated interactions, as well as decisions that occur at the individual level, the mathematical tools that have been developed in statistical physics, game theory, network theory, dynamical systems, and partial differential equations can be useful in shedding light on the dynamics of these patterns [2-4,6,12].

  12. The ERP PCA Toolkit: an open source program for advanced statistical analysis of event-related potential data.

    PubMed

    Dien, Joseph

    2010-03-15

    This article presents an open source Matlab program, the ERP PCA (EP) Toolkit, for facilitating the multivariate decomposition and analysis of event-related potential data. This program is intended to supplement existing ERP analysis programs by providing functions for conducting artifact correction, robust averaging, referencing and baseline correction, data editing and visualization, principal components analysis, and robust inferential statistical analysis. This program subserves three major goals: (1) optimizing the analysis of noisy data, such as clinical or developmental data; (2) facilitating the multivariate decomposition of ERP data into its constituent components; (3) increasing the transparency of analysis operations by providing direct visualization of the corresponding waveforms.

  13. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    SciTech Connect

    Marzouk, Youssef

    2016-08-31

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
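
    The basic Bayesian-inference step that the project sets out to accelerate can be illustrated with a toy random-walk Metropolis sampler for a one-parameter model; the Python sketch below is purely illustrative and contains none of the project's adaptive, surrogate-based or dimension-reduction machinery.

        # Toy random-walk Metropolis sampler illustrating the basic Bayesian
        # inference step that the project aims to accelerate and scale; the model
        # (one unknown rate parameter, Gaussian likelihood) is purely illustrative.
        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic data from a simple model y = k * t + noise, true k = 2.5.
        t = np.linspace(0.0, 1.0, 20)
        y = 2.5 * t + rng.normal(0.0, 0.1, t.size)
        sigma = 0.1

        def log_posterior(k):
            # Flat prior on k; Gaussian likelihood with known noise level.
            return -0.5 * np.sum((y - k * t) ** 2) / sigma**2

        samples, k = [], 0.0
        for _ in range(20000):
            proposal = k + 0.1 * rng.standard_normal()
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(k):
                k = proposal
            samples.append(k)

        burned = np.array(samples[5000:])
        print("posterior mean of k:", burned.mean().round(3),
              "+/-", burned.std().round(3))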

  14. Application of metabolite profiling tools and time-of-flight mass spectrometry in the identification of transformation products of iopromide and iopamidol during advanced oxidation.

    PubMed

    Singh, Randolph R; Lester, Yaal; Linden, Karl G; Love, Nancy G; Atilla-Gokcumen, G Ekin; Aga, Diana S

    2015-03-03

    The efficiency of wastewater treatment systems in removing pharmaceuticals is often assessed on the basis of the decrease in the concentration of the parent compound. However, what is perceived as "removal" during treatment may not necessarily mean mineralization of the pharmaceutical compound but simply conversion into different transformation products (TPs). Using liquid chromatography coupled to a quadrupole time-of-flight mass spectrometer (LC-QToF-MS), we demonstrated the conversion of iopromide in wastewater to at least 14 TPs after an advanced oxidation process (AOP) using UV (fluence = 1500 mJ/cm(2)) and H2O2 (10 mg/L). Due to the complexity of the wastewater matrix, the initial experiments were performed using a high concentration (10 mg/L) of iopromide in order to facilitate the identification of TPs. Despite the high concentration of iopromide used, cursory inspection of UV and mass spectra revealed only four TPs in the chromatograms of the post-AOP samples. However, the use of the METLIN database and statistics-based profiling tools commonly used in metabolomics proved effective in discriminating between background signals and TPs derived from iopromide. High-resolution mass data allowed the prediction of molecular formulas of putative TPs with errors below 5 ppm relative to the observed m/z. Tandem mass spectrometry (MS/MS) data and isotope pattern comparisons provided the information needed to elucidate the structures of the iopromide TPs. The presence of the proposed iopromide TPs was determined in unspiked wastewater from a municipal wastewater treatment plant, but neither iopromide nor its TPs were detected. Using the analogous structural modifications and oxidation that result from the AOP treatment of iopromide, the potential TPs of iopamidol (a compound structurally similar to iopromide) were predicted. The same mass fragmentation pattern observed for the iopromide TPs was applied to the predicted iopamidol TPs. LC-QToF-MS revealed the presence of two iopamidol

  15. Robotic-locomotor training as a tool to reduce neuromuscular abnormality in spinal cord injury: the application of system identification and advanced longitudinal modeling.

    PubMed

    Mirbagheri, Mehdi M; Kindig, Matthew; Niu, Xun; Varoqui, Deborah; Conaway, Petra

    2013-06-01

    In this study, the effect of the LOKOMAT, a robotic-assisted locomotor training system, on the reduction of neuromuscular abnormalities associated with spasticity was examined, for the first time in the spinal cord injury (SCI) population. Twenty-three individuals with chronic incomplete SCI received 1-hour training sessions in the LOKOMAT three times per week, with up to 45 minutes of training per session; a matched control group received no intervention. The neuromuscular properties of the spastic ankle were then evaluated prior to training and after 1, 2, and 4 weeks of training. A parallel-cascade system identification technique was used to determine the reflex and intrinsic stiffness of the ankle joint as a function of ankle position at each time point. The slope of the stiffness vs. joint angle curve, i.e., the modulation of stiffness with joint position, was then calculated and tracked over the four-week period. Growth Mixture Modeling (GMM), an advanced statistical method, was then used to classify subjects into subgroups based on similar trends in the recovery pattern of the slope over time, and Random Coefficient Regression (RCR) was used to model the recovery patterns within each subgroup. All groups showed significant reductions in both reflex and intrinsic slope over time, but subjects in classes with higher baseline values of the slope showed larger improvements over the four weeks of training. These findings suggest that LOKOMAT training may also be useful for reducing the abnormal modulation of neuromuscular properties that arises as a secondary effect after SCI. This can advise clinicians as to which patients may benefit the most from LOKOMAT training prior to beginning the training. Further, this study shows that system identification and GMM/RCR can serve as powerful tools to quantify and track spasticity over time in the SCI population.
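
    A minimal sketch of the two quantitative steps described above, on synthetic data: a least-squares fit gives the slope of stiffness versus joint angle, and a Gaussian mixture (standing in for the full GMM/RCR analysis) groups subjects by their slope trajectories.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Step 1: slope of the stiffness-vs-angle relation for one (synthetic) subject/visit.
angles = np.linspace(-10, 20, 7)                      # ankle angle (degrees)
stiffness = 30 + 2.5 * angles + rng.normal(0, 3, 7)   # synthetic reflex stiffness
slope = np.polyfit(angles, stiffness, 1)[0]
print(f"stiffness slope = {slope:.2f} per degree")

# Step 2: group subjects by their slope trajectories (baseline, weeks 1, 2, 4),
# a simplified stand-in for Growth Mixture Modeling.
trajectories = np.abs(rng.normal(1.0, 0.4, (23, 1))) * np.linspace(1.0, 0.6, 4)
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(trajectories)
print("subjects per class:", np.bincount(labels))
```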

  16. GeneBase 1.1: a tool to summarize data from NCBI gene datasets and its application to an update of human gene statistics.

    PubMed

    Piovesan, Allison; Caracausi, Maria; Antonaros, Francesca; Pelleri, Maria Chiara; Vitale, Lorenza

    2016-01-01

    We release GeneBase 1.1, a local tool with a graphical interface useful for parsing, structuring and indexing data from the National Center for Biotechnology Information (NCBI) Gene data bank. Compared to its predecessor GeneBase (1.0), GeneBase 1.1 now allows dynamic calculation and summarization in terms of median, mean, standard deviation and total for many quantitative parameters associated with genes, gene transcripts and gene features (exons, introns, coding sequences, untranslated regions). GeneBase 1.1 thus offers the opportunity to perform analyses of the main gene structure parameters also following the search for any set of genes with the desired characteristics, allowing unique functionalities not provided by the NCBI Gene itself. In order to show the potential of our tool for local parsing, structuring and dynamic summarizing of publicly available databases for data retrieval, analysis and testing of biological hypotheses, we provide as a sample application a revised set of statistics for human nuclear genes, gene transcripts and gene features. In contrast with previous estimations strongly underestimating the length of human genes, a 'mean' human protein-coding gene is 67 kbp long, has eleven 309 bp long exons and ten 6355 bp long introns. Median, mean and extreme values are provided for many other features offering an updated reference source for human genome studies, data useful to set parameters for bioinformatic tools and interesting clues to the biomedical meaning of the gene features themselves. Database URL: http://apollo11.isto.unibo.it/software/.
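
    The kind of dynamic summarization GeneBase 1.1 offers (median, mean, standard deviation and total per feature class) corresponds to a grouped aggregation; the sketch below applies pandas to a made-up feature table rather than the NCBI Gene records themselves.

```python
import pandas as pd

# Hypothetical per-feature table: one row per exon/intron with its length in bp.
features = pd.DataFrame({
    "gene":      ["A", "A", "A", "B", "B", "B"],
    "feature":   ["exon", "exon", "intron", "exon", "intron", "intron"],
    "length_bp": [310, 295, 6200, 410, 5900, 7100],
})

summary = (features.groupby("feature")["length_bp"]
           .agg(["median", "mean", "std", "sum", "count"]))
print(summary)
```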

  17. GeneBase 1.1: a tool to summarize data from NCBI gene datasets and its application to an update of human gene statistics

    PubMed Central

    Piovesan, Allison; Caracausi, Maria; Antonaros, Francesca; Pelleri, Maria Chiara; Vitale, Lorenza

    2016-01-01

    We release GeneBase 1.1, a local tool with a graphical interface useful for parsing, structuring and indexing data from the National Center for Biotechnology Information (NCBI) Gene data bank. Compared to its predecessor GeneBase (1.0), GeneBase 1.1 now allows dynamic calculation and summarization in terms of median, mean, standard deviation and total for many quantitative parameters associated with genes, gene transcripts and gene features (exons, introns, coding sequences, untranslated regions). GeneBase 1.1 thus offers the opportunity to perform analyses of the main gene structure parameters also following the search for any set of genes with the desired characteristics, allowing unique functionalities not provided by the NCBI Gene itself. In order to show the potential of our tool for local parsing, structuring and dynamic summarizing of publicly available databases for data retrieval, analysis and testing of biological hypotheses, we provide as a sample application a revised set of statistics for human nuclear genes, gene transcripts and gene features. In contrast with previous estimations strongly underestimating the length of human genes, a ‘mean’ human protein-coding gene is 67 kbp long, has eleven 309 bp long exons and ten 6355 bp long introns. Median, mean and extreme values are provided for many other features offering an updated reference source for human genome studies, data useful to set parameters for bioinformatic tools and interesting clues to the biomedical meaning of the gene features themselves. Database URL: http://apollo11.isto.unibo.it/software/ PMID:28025344

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Cancer Data and Statistics Tools

    MedlinePlus


  20. Image navigation and registration performance assessment tool set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Astrophysics Data System (ADS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-05-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
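
    The 3-sigma metrics defined above are high percentiles of accumulated registration errors; a minimal sketch with synthetic errors (units and magnitudes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic navigation errors (microradians) accumulated over a 24-hour period.
nav_errors = np.abs(rng.normal(0.0, 5.0, size=100_000))

three_sigma = np.percentile(nav_errors, 99.73)   # estimate of the 99.73rd percentile
print(f"24-hour 3-sigma NAV error = {three_sigma:.2f} urad")
```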

  1. Image Navigation and Registration Performance Assessment Tool Set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.

  2. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    USGS Publications Warehouse

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8 and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  3. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    SciTech Connect

    Liu, H; Liang, X; Kalbasi, A; Lin, A; Ahn, P; Both, S

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward-calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
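
    The head-to-head plan comparison described above amounts to paired significance tests on matched DVH indicators; the SciPy sketch below uses placeholder dose values, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical mean-dose values (Gy) for one OAR across 10 verification scans.
dose_V  = np.array([26.1, 24.8, 25.5, 27.0, 26.4, 25.9, 24.7, 26.8, 25.2, 26.0])
dose_CP = np.array([26.0, 25.0, 25.3, 27.2, 26.1, 26.1, 24.9, 26.6, 25.4, 25.8])

t, p = stats.ttest_rel(dose_V, dose_CP)   # paired t-test, V vs. CP plans
print(f"t = {t:.2f}, p = {p:.3f} -> "
      f"{'no significant difference' if p > 0.05 else 'significant difference'}")
```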

  4. Analysis of the two-dimensional turbulence in pure electron plasmas by means of advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Romé, M.; Lepreti, F.; Maero, G.; Pozzoli, R.; Vecchio, A.; Carbone, V.

    2013-03-01

    Highly magnetized, pure electron plasmas confined in a Penning-Malmberg trap allow one to perform experiments on the two-dimensional (2D) fluid dynamics under conditions where non-ideal effects are almost negligible. Recent results on the freely decaying 2D turbulence obtained from experiments with electron plasmas performed in the Penning-Malmberg trap ELTRAP are presented. The analysis has been applied to experimental sequences with different types of initial density distributions. The dynamical properties of the system have been investigated by means of wavelet transforms and Proper Orthogonal Decomposition (POD). The wavelet analysis shows that most of the enstrophy is contained at spatial scales corresponding to the typical size of the persistent vortices in the 2D electron plasma flow. The POD analysis allows one to identify the coherent structures which give the dominant contribution to the plasma evolution. The statistical properties of the turbulence have been investigated by means of Probability Density Functions (PDFs) and structure functions of spatial vorticity increments. The analysis evidences how the shape and evolution of the dominant coherent structures and the intermittency properties of the turbulence strongly depend on the initial conditions for the electron density.
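
    One of the statistical diagnostics named above, the structure function of vorticity increments S_p(r) = <|w(x+r) - w(x)|^p>, can be sketched for a synthetic one-dimensional vorticity signal:

```python
import numpy as np

rng = np.random.default_rng(3)
vorticity = np.cumsum(rng.normal(size=4096))      # synthetic 1-D vorticity signal

def structure_function(field, r, p):
    """S_p(r): mean of |field(x+r) - field(x)|**p over all positions x."""
    increments = field[r:] - field[:-r]
    return np.mean(np.abs(increments) ** p)

for r in (1, 4, 16, 64):
    print(r, [round(structure_function(vorticity, r, p), 2) for p in (2, 4)])
```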

  5. Predict! Teaching Statistics Using Informational Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  6. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    SciTech Connect

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi; Zhang, Dingkang

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  7. The use of a gas chromatography-sensor system combined with advanced statistical methods, towards the diagnosis of urological malignancies

    PubMed Central

    Aggio, Raphael B. M.; de Lacy Costello, Ben; White, Paul; Khalid, Tanzeela; Ratcliffe, Norman M.; Persad, Raj; Probert, Chris S. J.

    2016-01-01

    Prostate cancer is one of the most common cancers. Serum prostate-specific antigen (PSA) is used to aid the selection of men undergoing biopsies. Its use remains controversial. We propose a GC-sensor algorithm system for classifying urine samples from patients with urological symptoms. This pilot study includes 155 men presenting to urology clinics; 58 were diagnosed with prostate cancer, 24 with bladder cancer and 73 with haematuria and/or poor stream, without cancer. Principal component analysis (PCA) was applied to assess the discrimination achieved, while linear discriminant analysis (LDA) and support vector machine (SVM) were used as statistical models for sample classification. Leave-one-out cross-validation (LOOCV), repeated 10-fold cross-validation (10FoldCV), repeated double cross-validation (DoubleCV) and Monte Carlo permutations were applied to assess performance. Significant separation was found between prostate cancer and control samples, bladder cancer and controls and between bladder and prostate cancer samples. For prostate cancer diagnosis, the GC/SVM system classified samples with 95% sensitivity and 96% specificity after LOOCV. For bladder cancer diagnosis, the SVM reported 96% sensitivity and 100% specificity after LOOCV, while the DoubleCV reported 87% sensitivity and 99% specificity, with SVM showing 78% and 98% sensitivity between prostate and bladder cancer samples. Evaluation of the results of the Monte Carlo permutation of class labels obtained chance-like accuracy values around 50%, suggesting the observed results for bladder cancer and prostate cancer detection are not due to overfitting. The results of the pilot study presented here indicate that the GC system is able to successfully identify patterns that allow classification of urine samples from patients with urological cancers. An accurate diagnosis based on urine samples would reduce the number of negative prostate biopsies performed, and the frequency of surveillance cystoscopy
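
    A minimal sketch of the leave-one-out SVM classification scheme reported above, using scikit-learn on random placeholder features rather than the GC-sensor data:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 12))          # placeholder sensor features
y = np.array([0] * 30 + [1] * 30)      # 0 = control, 1 = cancer (synthetic labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())   # one accuracy value per held-out sample
print(f"LOOCV accuracy = {acc.mean():.2f}")
```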

  8. Delineation and evaluation of hydrologic-landscape regions in the United States using geographic information system tools and multivariate statistical analyses.

    PubMed

    Wolock, David M; Winter, Thomas C; McMahon, Gerard

    2004-01-01

    Hydrologic-landscape regions in the United States were delineated by using geographic information system (GIS) tools combined with principal components and cluster analyses. The GIS and statistical analyses were applied to land-surface form, geologic texture (permeability of the soil and bedrock), and climate variables that describe the physical and climatic setting of 43,931 small (approximately 200 km2) watersheds in the United States. (The term "watersheds" is defined in this paper as the drainage areas of tributary streams, headwater streams, and stream segments lying between two confluences.) The analyses grouped the watersheds into 20 noncontiguous regions based on similarities in land-surface form, geologic texture, and climate characteristics. The percentage of explained variance (R-squared value) in an analysis of variance was used to compare the hydrologic-landscape regions to 19 square geometric regions and the 21 U.S. Environmental Protection Agency level-II ecoregions. Hydrologic-landscape regions generally were better than ecoregions at delineating regions of distinct land-surface form and geologic texture. Hydrologic-landscape regions and ecoregions were equally effective at defining regions in terms of climate, land cover, and water-quality characteristics. For about half of the landscape, climate, and water-quality characteristics, the R-squared values of square geometric regions were as high as hydrologic-landscape regions or ecoregions.
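
    The delineation workflow above, principal components followed by clustering of watershed attributes, can be sketched in a few lines of scikit-learn; the attribute table here is synthetic:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Synthetic stand-ins for land-surface form, geologic texture and climate variables.
watersheds = rng.normal(size=(1000, 6))

scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(watersheds))
regions = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(regions))            # watershed count per candidate region
```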

  9. Large ensemble modeling of the last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert

    2016-05-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ˜ 20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
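
    The score-weighted simple averaging described above can be sketched directly; the conversion of misfit scores to weights below is one common, purely illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(6)
n_runs = 625
sea_level_rise = rng.normal(3.3, 0.8, n_runs)     # synthetic per-run equivalent SLR (m)
misfit = rng.gamma(2.0, 1.0, n_runs)              # synthetic aggregate model-data misfit

weights = np.exp(-0.5 * misfit)                   # illustrative score-to-weight mapping
weights /= weights.sum()

mean_slr = np.sum(weights * sea_level_rise)
# Weighted 5-95% envelope from the weighted empirical CDF.
order = np.argsort(sea_level_rise)
cdf = np.cumsum(weights[order])
lo, hi = np.interp([0.05, 0.95], cdf, sea_level_rise[order])
print(f"weighted mean = {mean_slr:.2f} m, 5-95% envelope = [{lo:.2f}, {hi:.2f}] m")
```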

  10. Elements of Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Sachs, Ivo; Sen, Siddhartha; Sexton, James

    2006-05-01

    This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. The book presents analytical and numerical techniques in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984; covers a wide range of applications including magnetic systems, turbulence, astrophysics, and biology; and contains a concise introduction to Markov processes and molecular dynamics.

  11. Planning Research on Student Services: Variety in Research Tools.

    ERIC Educational Resources Information Center

    Hom, Willard C.

    This paper discusses the seven types of research tools that have potential for advancing knowledge about student services in California Community Colleges. The seven tools are the following: literature review, data validation, survey research, case study, quasi-experiment, meta-analysis, and statistical modeling. The report gives reasons why each…

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  13. JUST in time health emergency interventions: an innovative approach to training the citizen for emergency situations using virtual reality techniques and advanced IT tools (the VR Tool).

    PubMed

    Manganas, A; Tsiknakis, M; Leisch, E; Ponder, M; Molet, T; Herbelin, B; Magnetat-Thalmann, N; Thalmann, D; Fato, M; Schenone, A

    2004-01-01

    This paper reports the results of the second of the two systems developed by JUST, a collaborative project supported by the European Union under the Information Society Technologies (IST) Programme. The most innovative content of the project has been the design and development of a complementary training course for non-professional health emergency operators, which supports the traditional learning phase, and which purports to improve the retention capability of the trainees. This was achieved with the use of advanced information technology techniques, which provide adequate support and can help to overcome the present weaknesses of the existing training mechanisms.

  14. Human in vitro 3D co-culture model to engineer vascularized bone-mimicking tissues combining computational tools and statistical experimental approach.

    PubMed

    Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo

    2016-01-01

    The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and to screen in high throughput the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2, 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside hydrogels through computational simulations and we analyzed microvascular network features including total network length/area and vascular branch number/length. In particular, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue.
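
    The full-factorial screening behind the "Design of Experiment" approach above can be enumerated directly; factor levels are taken from the abstract, and the response evaluation is left as a placeholder.

```python
from itertools import product

ec_density  = ["3 Mcells/ml", "5 Mcells/ml"]
cell_ratio  = ["1:1:0", "10:1:0", "10:1:1"]
medium      = ["endothelial", "endothelial+Ang-1", "1:1 endothelial/osteo"]
hydrogel    = ["100% fibrin", "60% fibrin + 40% collagen"]
geometry_mm = ["2x2x2", "2x2x5"]

design = list(product(ec_density, cell_ratio, medium, hydrogel, geometry_mm))
print(f"{len(design)} full-factorial conditions")   # 2*3*3*2*2 = 72
for condition in design[:3]:
    print(condition)   # for each condition, measure e.g. total network length here
```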

  15. Use of advanced earth observation tools for the analyses of recent surface changes in Kalahari pans and Namibian coastal lagoons

    NASA Astrophysics Data System (ADS)

    Behling, Robert; Milewski, Robert; Chabrillat, Sabine; Völkel, Jörg

    2016-04-01

    The remote sensing analyses in the BMBF-SPACES collaborative project Geoarchives - Signals of Climate and Landscape Change preserved in Southern African Geoarchives - focus on the use of recent and upcoming Earth Observation Tools for the study of climate and land use changes and their impact on the ecosystem. It aims at demonstrating the potential of recently available advanced optical remote sensing imagery with its extended spectral coverage and temporal resolution for the identification and mapping of sediment features associated with paleo-environmental archives as well as their recent dynamics. In this study we focus on the analyses of two ecosystems of major interest, the Kalahari salt pans as well as the lagoons at Namibia's west coast, that present high dynamics caused by combined hydrological and surface processes linked to climatic events. Multitemporal remote sensing techniques allow us to derive the recent surface dynamics of the salt pans and also provide opportunities to get a detailed understanding of the spatiotemporal development of the coastal lagoons. Furthermore, spaceborne hyperspectral analysis can give insight into the current surface mineralogy of the salt pans on a physical basis and provide the intra-pan distribution of evaporites. The soils and sediments of the Kalahari salt pans such as the Omongwa pan are a potentially significant storage of global carbon and also function as an important terrestrial climate archive. Thus far the surface distribution of evaporites has been assessed only mono-temporally and on a coarse regional scale, and the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. For the salt pan analyses a change detection is applied using the Iterative-reweighted Multivariate Alteration Detection (IR-MAD) method to identify and investigate surface changes based on a Landsat time-series covering the period 1984-2015. Furthermore, the current spatial distribution of

  16. Undergraduate experiments on statistical optics

    NASA Astrophysics Data System (ADS)

    Scholz, Ruediger; Friege, Gunnar; Weber, Kim-Alessandro

    2016-09-01

    Since the pioneering experiments of Forrester et al (1955 Phys. Rev. 99 1691) and Hanbury Brown and Twiss (1956 Nature 177 27; Nature 178 1046), along with the introduction of the laser in the 1960s, the systematic analysis of random fluctuations of optical fields has developed to become an indispensable part of physical optics for gaining insight into features of the fields. In 1985 Joseph W Goodman prefaced his textbook on statistical optics with a strong commitment to the ‘tools of probability and statistics’ (Goodman 2000 Statistical Optics (New York: John Wiley & Sons Inc.)) in the education of advanced optics. Since then a wide range of novel undergraduate optical counting experiments and corresponding pedagogical approaches have been introduced to underpin the rapid growth of the interest in coherence and photon statistics. We propose low-cost experimental steps that are a fair way off ‘real’ quantum optics, but that give deep insight into random optical fluctuation phenomena: (1) the introduction of statistical methods into undergraduate university optical lab work, and (2) the connection between the photoelectrical signal and the characteristics of the light source. We describe three experiments and theoretical approaches which may be used to pave the way for a well-balanced growth of knowledge, providing students with an opportunity to enhance their abilities to adapt the ‘tools of probability and statistics’.
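
    One of the low-cost exercises alluded to above, comparing the counting statistics of a laser-like and a thermal-like source, can be simulated in a few lines; the intensities are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
mean_counts, n_intervals = 20.0, 100_000

# Coherent (laser-like) source: Poissonian counts, variance close to the mean.
coherent = rng.poisson(mean_counts, n_intervals)

# Thermal-like source: exponentially fluctuating intensity gives super-Poissonian counts.
thermal = rng.poisson(rng.exponential(mean_counts, n_intervals))

for name, counts in (("coherent", coherent), ("thermal", thermal)):
    print(f"{name:9s} mean={counts.mean():.1f}  variance={counts.var():.1f}")
```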

  17. Using Hypertext To Develop an Algorithmic Approach to Teaching Statistics.

    ERIC Educational Resources Information Center

    Halavin, James; Sommer, Charles

    Hypertext and its more advanced form Hypermedia represent a powerful authoring tool with great potential for allowing statistics teachers to develop documents to assist students in an algorithmic fashion. An introduction to the use of Hypertext is presented, with an example of its use. Hypertext is an approach to information management in which…

  18. Archaeological applications of laser-induced breakdown spectroscopy: an example from the Coso Volcanic Field, California, using advanced statistical signal processing analysis

    SciTech Connect

    Remus, Jeremiah J.; Gottfried, Jennifer L.; Harmon, Russell S.; Draucker, Anne; Baron, Dirk; Yohe, Robert

    2010-05-01

    of the classifier setup considered in this study include the training/testing routine (a 27-fold leave-one-sample-out setup versus a simple split of the data into separate sets for training and evaluation), the number of latent variables used in the regression model, and whether PLSDA operating on the entire broadband LIBS spectrum is superior to that using only a selected subset of LIBS emission lines. The results point to the robustness of the PLSDA technique and suggest that LIBS analysis combined with the appropriate statistical signal processing has the potential to be a useful tool for chemical analysis of archaeological artifacts and geological specimens.
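
    A minimal sketch of PLS discriminant analysis with leave-one-sample-out evaluation, regressing a class indicator on the spectrum and thresholding the prediction; the spectra below are random placeholders for the LIBS data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(8)
X = rng.normal(size=(54, 2048))            # placeholder broadband spectra
y = np.repeat([0, 1], 27)                  # two source classes (synthetic labels)

correct = 0
for train, test in LeaveOneOut().split(X):
    pls = PLSRegression(n_components=5).fit(X[train], y[train])
    predicted = pls.predict(X[test]).ravel()[0] > 0.5   # threshold the PLS score
    correct += int(predicted == y[test][0])
print(f"leave-one-sample-out accuracy = {correct / len(y):.2f}")
```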

  19. Using Advanced Monitoring Tools to Evaluate PM2.5 in San Joaquin Valley

    EPA Science Inventory

    One of the primary data deficiencies that prevent the advance of policy relevant research on particulate matter, ozone, and associated precursors is the lack of measurement data and knowledge on the true vertical profile and synoptic-scale spatial distributions of the pollutants....

  20. Bridging the Gap 10 Years Later: A Tool and Technique to Analyze and Evaluate Advanced Academic Curricular Units

    ERIC Educational Resources Information Center

    Beasley, Jennifer G.; Briggs, Christine; Pennington, Leighann

    2017-01-01

    The need for a shared vision concerning exemplary curricula for academically advanced learners must be a priority in the field of education. With the advent of the Common Core State Standards adoption in many states, a new conversation has been ignited over meeting the needs of students with gifts and talents for whom the "standard"…

  1. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  2. Impact of gastrointestinal parasitic nematodes of sheep, and the role of advanced molecular tools for exploring epidemiology and drug resistance - an Australian perspective.

    PubMed

    Roeber, Florian; Jex, Aaron R; Gasser, Robin B

    2013-05-27

    Parasitic nematodes (roundworms) of small ruminants and other livestock have major economic impacts worldwide. Despite the impact of the diseases caused by these nematodes and the discovery of new therapeutic agents (anthelmintics), there has been relatively limited progress in the development of practical molecular tools to study the epidemiology of these nematodes. Specific diagnosis underpins parasite control, and the detection and monitoring of anthelmintic resistance in livestock parasites, presently a major concern around the world. The purpose of the present article is to provide a concise account of the biology and knowledge of the epidemiology of the gastrointestinal nematodes (order Strongylida), from an Australian perspective, and to emphasize the importance of utilizing advanced molecular tools for the specific diagnosis of nematode infections for refined investigations of parasite epidemiology and drug resistance detection in combination with conventional methods. It also gives a perspective on the possibility of harnessing genetic, genomic and bioinformatic technologies to better understand parasites and control parasitic diseases.

  3. Impact of gastrointestinal parasitic nematodes of sheep, and the role of advanced molecular tools for exploring epidemiology and drug resistance - an Australian perspective

    PubMed Central

    2013-01-01

    Parasitic nematodes (roundworms) of small ruminants and other livestock have major economic impacts worldwide. Despite the impact of the diseases caused by these nematodes and the discovery of new therapeutic agents (anthelmintics), there has been relatively limited progress in the development of practical molecular tools to study the epidemiology of these nematodes. Specific diagnosis underpins parasite control, and the detection and monitoring of anthelmintic resistance in livestock parasites, presently a major concern around the world. The purpose of the present article is to provide a concise account of the biology and knowledge of the epidemiology of the gastrointestinal nematodes (order Strongylida), from an Australian perspective, and to emphasize the importance of utilizing advanced molecular tools for the specific diagnosis of nematode infections for refined investigations of parasite epidemiology and drug resistance detection in combination with conventional methods. It also gives a perspective on the possibility of harnessing genetic, genomic and bioinformatic technologies to better understand parasites and control parasitic diseases. PMID:23711194

  4. Bridging Innovation and Outreach to Overcome Global Gaps in Radiation Oncology Through Information and Communication Tools, Trainee Advancement, Engaging Industry, Attention to Ethical Challenges, and Political Advocacy.

    PubMed

    Dad, Luqman; Royce, Trevor J; Morris, Zachary; Moran, Meena; Pawlicki, Todd; Khuntia, Deepak; Hardenbergh, Patricia; Cummings, Bernard; Mayr, Nina; Hu, Kenneth

    2017-04-01

    An evolving paradigm in global outreach in radiation oncology has been the implementation of a more region-specific, needs-based approach to help close the gap in radiation services to low- and middle-income countries through the use of innovative tools in information and communication technology. This report highlights 4 information and communication technology tools in action today: (1) the NCCN Framework for Resource Stratification of NCCN guidelines, (2) ASTRO e-Contouring, (3) i.treatsafely.org, and (4) ChartRounds.com. We also render special consideration to matters related to global outreach that we believe require distinct attention to help us meet the goals established by the 2011 United Nations' Declaration on noncommunicable diseases: (1) trainee advancement toward careers in global health, (2) ethical challenges of international outreach, (3) critical importance of political advocacy, and (4) collaboration with Industry.

  5. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...

  6. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  7. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data are transformed into scientifically meaningful results and interpretations typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and with matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to

  8. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool

    PubMed Central

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962

  9. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool.

    PubMed

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery.

  10. Technologic advances in surgery for brain tumors: tools of the trade in the modern neurosurgical operating room.

    PubMed

    McPherson, Christopher M; Sawaya, Raymond

    2005-09-01

    Surgery is an essential part of the oncologic treatment of patients with brain tumors. Surgery is necessary for histologic diagnosis, and the cytoreduction of tumor mass has been shown to improve patient survival time and quality of life. Ultimately, the goal of any oncologic neurosurgery is to achieve maximal safe resection. Over the years, many technologic adjuncts have been developed to assist the surgeon in achieving this goal. In this article, we review the technologic advances of modern neurosurgery that are helping to reach this goal.

  11. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    NASA Astrophysics Data System (ADS)

    Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars

    2014-05-01

    This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through experimental setup in live multi-domain dense wavelength-division multiplexing systems between two national research networks: SURFnet in Holland and NORDUnet in Denmark.

  12. Nighttime activity of moving objects, their mapping and statistic making, on the example of applying thermal imaging and advanced image processing to the research of nocturnal mammals

    NASA Astrophysics Data System (ADS)

    Pregowski, Piotr; Owadowska, Edyta; Pietrzak, Jan; Zwolenik, Slawomir

    2005-09-01

    The paper presents a method of acquiring a new form of statistical information about changes in a scene overseen by a thermal imaging camera in a static configuration. This type of imager reaches uniquely high efficiency during nighttime surveillance and targeting. The technical issue we solved arose from the problem of how to verify the hypothesis that small nocturnal rodents, such as bank voles, use common paths inside their range and that these paths form a common, rather stable system. Such research has been especially difficult because these mammals are secretive, move at varying speeds, and, owing to their low contrast against natural surroundings such as leaves or grass, are nearly impossible to observe by other means from a distance of a few meters. The main advantages of the elaborated method proved to be both adequately filtered long thermal movies for manual analysis and the automatic creation of synthetic images that present maps of otherwise invisible paths and the activity of their usage. An additional log file describing objects and their displacements as ".txt" records allows various, more detailed studies of animal behavior. The obtained results proved that this original method delivers a new, non-invasive, powerful and dynamic approach to solving various ecological problems. Networks of uncooled thermal imagers, whose availability has significantly increased, with data transmission to digital centers allow moving, particularly heat-generating, objects to be investigated in complete darkness far more widely and efficiently than before. Thus, although our system was elaborated for ecological studies, a similar one can be considered as a tool for selected tasks in optical security.

  13. Recent advances in elementary flux modes and yield space analysis as useful tools in metabolic network studies.

    PubMed

    Horvat, Predrag; Koller, Martin; Braunegg, Gerhart

    2015-09-01

    A review of the use of elementary flux modes (EFMs) and their applications in metabolic engineering, together with yield space analysis (YSA), is presented. EFMs are an invaluable tool in mathematical modeling of biochemical processes. They are described from their inception in 1994, followed by various improvements of their computation in later years. YSA constitutes another valuable tool for metabolic network modeling, and is presented in detail along with EFMs in this article. The application of these techniques is discussed for several case studies of metabolic network modeling provided in respective original articles. The article concludes with some case studies in which the application of EFMs and YSA turned out to be most useful, such as the analysis of intracellular polyhydroxyalkanoate (PHA) formation and consumption in Cupriavidus necator, including the constraint-based description of the steady-state flux cone of the strain's metabolic network, the profound analysis of a continuous five-stage bioreactor cascade for PHA production by C. necator using EFMs and, finally, the study of metabolic fluxes in the metabolic network of C. necator cultivated on glycerol.

  14. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  15. A Demonstration of Advanced Safety Analysis Tools and Methods Applied to Large Break LOCA and Fuel Analysis for PWRs

    SciTech Connect

    Szilard, Ronaldo Henriques; Smith, Curtis Lee; Martineau, Richard Charles

    2016-03-01

    The U.S. Nuclear Regulatory Commission (NRC) is currently proposing a rulemaking designated as 10 CFR 50.46c to revise the loss-of-coolant accident (LOCA)/emergency core cooling system acceptance criteria to include the effects of higher burnup on fuel/cladding performance. We propose a demonstration problem of a representative four-loop PWR plant to study the impact of this new rule in the US nuclear fleet. Within the scope of evaluation for the 10 CFR 50.46c rule, aspects of safety, operations, and economics are considered in the industry application demonstration presented in this paper. An advanced safety analysis approach is used, by integrating the probabilistic element with deterministic methods for LOCA analysis, a novel approach to solving these types of multi-physics, multi-scale problems.

  16. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    SciTech Connect

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space- and time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
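
    In the spirit of the first tool described above, the core Monte Carlo sampling step, drawing exponential free paths and selecting an interaction type by its relative probability, can be sketched in a few lines; the attenuation data are made up.

```python
import numpy as np

rng = np.random.default_rng(9)
mu_total = 0.2                            # hypothetical total attenuation coefficient (1/cm)
p_compton, p_photoelectric = 0.7, 0.3     # illustrative relative interaction probabilities

n_photons, depth_cm = 100_000, 10.0
free_paths = rng.exponential(1.0 / mu_total, n_photons)   # sampled distance to first interaction
interacting = free_paths < depth_cm

kinds = rng.choice(["compton", "photoelectric"], size=interacting.sum(),
                   p=[p_compton, p_photoelectric])
print(f"{interacting.mean():.1%} of photons interact within {depth_cm} cm")
print({k: int(v) for k, v in zip(*np.unique(kinds, return_counts=True))})
```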

  17. Process variation monitoring (PVM) by wafer inspection tool as a complementary method to CD-SEM for mapping field CDU on advanced production devices

    NASA Astrophysics Data System (ADS)

    Kim, Dae Jong; Yoo, Hyung Won; Kim, Chul Hong; Lee, Hak Kwon; Kim, Sung Su; Bae, Koon Ho; Spielberg, Hedvi; Lee, Yun Ho; Levi, Shimon; Bustan, Yariv; Rozentsvige, Moshe

    2010-03-01

    As design rules shrink, Critical Dimension Uniformity (CDU) and Line Edge Roughness (LER) have a dramatic effect on printed final lines, and hence the need to control these parameters increases. Sources of CDU and LER variations include scanner auto-focus accuracy and stability, layer stack thickness, composition variations, and exposure variations. Process variations in advanced VLSI production designs, specifically in memory devices, attributed to CDU and LER affect cell-to-cell parametric variations. These variations significantly impact device performance and die yield. Traditionally, measurements of LER are performed by CD-SEM or OCD metrology tools. Typically, these measurements require a relatively long time to set up and cover only selected points of the wafer area. In this paper we present the results of collaborative work between the Process Diagnostic & Control Business Unit of Applied Materials and Hynix Semiconductor Inc. on the implementation of a method complementary to the CD-SEM and OCD tools, to monitor defect density and post-litho-develop CDU and LER on production wafers. The method, referred to as Process Variation Monitoring (PVM), is based on measuring variations in the scattered light from periodic structures. The application is demonstrated using the Applied Materials DUV bright field (BF) wafer inspection tool under optimized illumination and collection conditions. The UVision™ tool has already passed a successful feasibility study on DRAM products with 66 nm and 54 nm design rules. The tool has shown high sensitivity to variations across an FEM wafer in both exposure and focus axes. In this article we show how PVM can help detect field-to-field variations on DRAM wafers with a 44 nm design rule during normal production runs. The complex die layout and the shrinking cell dimensions require high sensitivity to local variations within dies or fields. During a normal scan of production wafers, local process variations are translated into GL (Grey Level) values
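
    The core of the PVM idea, mapping field-to-field grey-level (GL) variations extracted from inspection images of periodic structures, can be illustrated with a simple sketch. The field size, synthetic image, and deviation metric below are assumptions for illustration and are not Applied Materials' algorithm.

```python
import numpy as np

def field_gl_variation_map(image, field_shape):
    """Split an inspection image into exposure fields and map the mean grey
    level (GL) of each field relative to the wafer-wide mean.

    image       : 2-D numpy array of grey-level values
    field_shape : (rows, cols) of one exposure field in pixels
    Returns one relative-deviation value per field.
    """
    fr, fc = field_shape
    n_r, n_c = image.shape[0] // fr, image.shape[1] // fc
    means = np.empty((n_r, n_c))
    for i in range(n_r):
        for j in range(n_c):
            means[i, j] = image[i*fr:(i+1)*fr, j*fc:(j+1)*fc].mean()
    wafer_mean = means.mean()
    # Relative deviation per field; large deviations flag focus/exposure drift.
    return (means - wafer_mean) / wafer_mean

# Illustration on synthetic data: one "drifted" field stands out.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 2.0, size=(512, 512))
img[256:384, 128:256] += 5.0   # simulate a CD shift changing the scattered light
print(np.round(field_gl_variation_map(img, (128, 128)), 3))
```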

  18. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the study program in industrial processes at the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008 by implementing quality tools; monitoring of essential indicators is determined, flow charts are…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Cancer-related inflammation as predicting tool for treatment outcome in locally advanced and metastatic non-small cell lung cancer

    PubMed Central

    Korsic, Marta; Mursic, Davorka; Samarzija, Miroslav; Cucevic, Branka; Roglic, Mihovil; Jakopovic, Marko

    2016-01-01

    Background: Lung cancer is the leading cause of cancer deaths, and non-small cell lung cancer (NSCLC) represents 80% of all cases. In most cases it is diagnosed in a locally advanced or metastatic stage, when platinum-based doublet chemotherapy is the established therapeutic option for the majority of patients. Predictive factors to identify the patients who will benefit the most from chemotherapy are not clearly defined. The objective of this study was to explore the predictive value of pre-treatment C-reactive protein (CRP), fibrinogen, and their interaction for the response to frontline chemotherapy. Methods: In this retrospective cohort study, 170 patients with locally advanced and metastatic NSCLC were included. The relationship between the baseline levels of CRP and fibrinogen and the response to frontline chemotherapy was assessed. Results: We found that pre-treatment CRP and fibrinogen values were statistically significantly correlated. Chemotherapy and CRP, fibrinogen, and their interaction were independently significantly associated with the disease control rate at re-evaluation. There was a statistically significant difference in median pre-treatment CRP level between the patients with disease control or progression at re-evaluation, 13.8 vs. 30.0 mg/L respectively, P=0.026. Using the Johnson-Neyman technique, we found that in patients with an initial fibrinogen value below 3.5 g/L, the CRP level was significantly associated with disease control or progression of the disease. Above this fibrinogen value the association of CRP and disease control was lost. Conclusions: The findings from this study support the growing evidence of the relationship between inflammation and cancer, where an elevated pre-treatment CRP level has negative predictive significance for the response to frontline NSCLC chemotherapy. PMID:27499936
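
    The Johnson-Neyman-style probing described here can be sketched as a logistic regression with a CRP x fibrinogen interaction, evaluating the conditional CRP effect across fibrinogen values. The simulated data and the variable names (crp, fib, control) are assumptions for illustration, not the study's dataset or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame mirroring the variables named in the abstract:
#   crp (mg/L), fib (g/L), control (1 = disease control, 0 = progression).
rng = np.random.default_rng(1)
n = 170
df = pd.DataFrame({
    "crp": rng.gamma(2.0, 10.0, n),
    "fib": rng.normal(4.0, 1.0, n),
})
logit_true = 1.0 - 0.04 * df.crp + 0.008 * df.crp * (df.fib - 3.5)
df["control"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

# Logistic model with a CRP x fibrinogen interaction term.
fit = smf.logit("control ~ crp * fib", data=df).fit(disp=False)
b = fit.params
V = fit.cov_params()

# Johnson-Neyman style probing: conditional effect of CRP at each fibrinogen value.
for fib0 in np.arange(2.0, 6.5, 0.5):
    slope = b["crp"] + b["crp:fib"] * fib0
    var = (V.loc["crp", "crp"] + 2 * fib0 * V.loc["crp", "crp:fib"]
           + fib0**2 * V.loc["crp:fib", "crp:fib"])
    z = slope / np.sqrt(var)
    print(f"fibrinogen={fib0:.1f} g/L  CRP effect={slope:+.4f}  |z|={abs(z):.2f}")
```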

  12. Application of ICH Q9 Quality Risk Management Tools for Advanced Development of Hot Melt Coated Multiparticulate Systems.

    PubMed

    Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh

    2017-01-01

    This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the sustainability of product development and supports the regulatory aspects.
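
    A minimal sketch of the failure mode, effects, and criticality prioritization step (ranking by Risk Priority Number) is given below; the listed failure modes and scores are hypothetical and not taken from the cited study.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int     # 1-10
    occurrence: int   # 1-10
    detection: int    # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        """Risk Priority Number used to rank failure modes."""
        return self.severity * self.occurrence * self.detection

# Hypothetical hot-melt-coating failure modes, for illustration only.
modes = [
    FailureMode("Spray rate drift -> incomplete coating", 8, 5, 4),
    FailureMode("Atomization pressure too low -> agglomerates", 7, 4, 3),
    FailureMode("Air flow rate unstable -> particle size shift", 6, 6, 5),
    FailureMode("Polysorbate 65 content off-spec -> slow release", 9, 3, 6),
]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN={m.rpn:4d}  {m.name}")
```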

  13. Teaching Statistics without Sadistics.

    ERIC Educational Resources Information Center

    Forte, James A.

    1995-01-01

    Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…

  14. Regional Arctic System Model (RASM): A Tool to Advance Understanding and Prediction of Arctic Climate Change at Process Scales

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Osinski, R.; Brunke, M.; Cassano, J. J.; Clement Kinney, J. L.; Craig, A.; Duvivier, A.; Fisel, B. J.; Gutowski, W. J., Jr.; Hamman, J.; Hughes, M.; Nijssen, B.; Zeng, X.

    2014-12-01

    The Arctic is undergoing rapid climatic changes, which are some of the most coordinated changes currently occurring anywhere on Earth. They are exemplified by the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with and feedbacks between atmosphere, ocean and land. While historical reconstructions from Global Climate and Global Earth System Models (GC/ESMs) are in broad agreement with these changes, the rate of change in the GC/ESMs remains outpaced by observations. Reasons for that stem from a combination of coarse model resolution, inadequate parameterizations, unrepresented processes and a limited knowledge of physical and other real world interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the GC/ESM limitations in simulating observed seasonal to decadal variability and trends in the sea ice cover and climate. RASM is a high resolution, fully coupled, pan-Arctic climate model that uses the Community Earth System Model (CESM) framework. It uses the Los Alamos Sea Ice Model (CICE) and Parallel Ocean Program (POP) configured at an eddy-permitting resolution of 1/12° as well as the Weather Research and Forecasting (WRF) and Variable Infiltration Capacity (VIC) models at 50 km resolution. All RASM components are coupled via the CESM flux coupler (CPL7) at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled earth system model which, due to the additional constraints from lateral boundary conditions and nudging within a regional model domain, facilitates detailed comparisons with observational statistics that are not possible with GC/ESMs. In this talk, we will emphasize the utility of RASM for understanding sensitivity to the variable parameter space, the importance of critical processes, and coupled feedbacks, and ultimately for reducing uncertainty in arctic climate change projections.

  15. The R.E.D. tools: advances in RESP and ESP charge derivation and force field library building.

    PubMed

    Dupradeau, François-Yves; Pigache, Adrien; Zaffran, Thomas; Savineau, Corentin; Lelong, Rodolphe; Grivel, Nicolas; Lelong, Dimitri; Rosanski, Wilfried; Cieplak, Piotr

    2010-07-28

    Deriving atomic charges and building a force field library for a new molecule are key steps when developing a force field required for conducting structural and energy-based analysis using molecular mechanics. Derivation of popular RESP charges for a set of residues is a complex and error-prone procedure because it depends on numerous input parameters. To overcome these problems, the R.E.D. Tools (RESP and ESP charge Derive) have been developed to perform charge derivation in an automatic and straightforward way. The R.E.D. program handles chemical elements up to bromine in the periodic table. It interfaces different quantum mechanical programs employed for geometry optimization and computing molecular electrostatic potential(s), and performs charge fitting using the RESP program. By defining tight optimization criteria and by controlling the molecular orientation of each optimized geometry, charge values are reproduced on any computer platform with an accuracy of 0.0001 e. The charges can be fitted using multiple conformations, making them suitable for molecular dynamics simulations. R.E.D. also allows defining charge constraints during multiple-molecule charge fitting, which are used to derive charges for molecular fragments. Finally, R.E.D. incorporates charges into a force field library, readily usable in molecular dynamics computer packages. For complex cases, such as a set of homologous molecules belonging to a common family, an entire force field topology database is generated. Currently, the atomic charges and force field libraries have been developed for more than fifty model systems and stored in the RESP ESP charge DDataBase. Selected results related to non-polarizable charge models are presented and discussed.
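
    The charge-fitting step can be illustrated with a simplified, single-stage restrained least-squares fit of point charges to an electrostatic potential under a total-charge constraint. This is a sketch of a RESP-type fit for illustration, not the R.E.D./RESP programs themselves; the restraint here is a simple quadratic penalty rather than the hyperbolic RESP restraint.

```python
import numpy as np

def esp_fit(grid_xyz, esp_values, atom_xyz, total_charge=0.0, restraint=0.0005):
    """Least-squares fit of atomic point charges to a molecular electrostatic
    potential (atomic units), with a quadratic restraint toward zero charge and
    a Lagrange constraint on the total molecular charge."""
    n_atoms = atom_xyz.shape[0]
    # 1/r design matrix: potential at grid point i from a unit charge on atom j.
    dist = np.linalg.norm(grid_xyz[:, None, :] - atom_xyz[None, :, :], axis=2)
    A = 1.0 / dist
    # Normal equations with restraint, bordered by the total-charge constraint.
    lhs = np.zeros((n_atoms + 1, n_atoms + 1))
    lhs[:n_atoms, :n_atoms] = A.T @ A + restraint * np.eye(n_atoms)
    lhs[:n_atoms, n_atoms] = 1.0
    lhs[n_atoms, :n_atoms] = 1.0
    rhs = np.concatenate([A.T @ esp_values, [total_charge]])
    return np.linalg.solve(lhs, rhs)[:n_atoms]

# Tiny illustration: recover the charges of a fictitious 3-atom molecule.
atoms = np.array([[0.0, 0.0, 0.0], [1.8, 0.0, 0.0], [-1.8, 0.0, 0.0]])
true_q = np.array([-0.6, 0.3, 0.3])
rng = np.random.default_rng(2)
grid = rng.normal(0.0, 4.0, size=(2000, 3))
grid = grid[np.min(np.linalg.norm(grid[:, None] - atoms[None], axis=2), axis=1) > 2.5]
esp = (1.0 / np.linalg.norm(grid[:, None] - atoms[None], axis=2)) @ true_q
print(np.round(esp_fit(grid, esp, atoms), 4))   # ~[-0.6, 0.3, 0.3]
```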

  16. Development and implementation of a portable grating interferometer system as a standard tool for testing optics at the Advanced Photon Source beamline 1-BM.

    PubMed

    Assoufid, Lahsen; Shi, Xianbo; Marathe, Shashidhara; Benda, Erika; Wojcik, Michael J; Lang, Keenan; Xu, Ruqing; Liu, Wenjun; Macrander, Albert T; Tischler, Jon Z

    2016-05-01

    We developed a portable X-ray grating interferometer setup as a standard tool for testing optics at the Advanced Photon Source (APS) beamline 1-BM. The interferometer can be operated in phase-stepping, Moiré, or single-grating harmonic imaging mode with 1-D or 2-D gratings. All of the interferometer motions are motorized; hence, it is much easier and quicker to switch between the different modes of operation. A novel aspect of this new instrument is its designed portability. While the setup is designed to be primarily used as a standard tool for testing optics at 1-BM, it could be potentially deployed at other APS beamlines for beam coherence and wavefront characterization or imaging. The design of the interferometer system is described in detail and coherence measurements obtained at the APS 34-ID-E beamline are presented. The coherence was probed in two directions using a 2-D checkerboard, a linear, and a circular grating at X-ray energies of 8 keV, 11 keV, and 18 keV.
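
    The phase-stepping mode of such an interferometer is typically analyzed with a per-pixel Fourier fit over the grating positions. A minimal sketch of that retrieval (absorption, differential phase, and visibility maps) follows, using a synthetic scan; it is not the APS 1-BM software.

```python
import numpy as np

def phase_stepping_analysis(stack):
    """Retrieve absorption, differential-phase and visibility maps from a
    phase-stepping scan. `stack` has shape (n_steps, ny, nx) with the grating
    stepped over exactly one period."""
    f = np.fft.fft(stack, axis=0)
    a0 = np.abs(f[0]) / stack.shape[0]        # mean intensity (absorption image)
    a1 = 2.0 * np.abs(f[1]) / stack.shape[0]  # first-harmonic amplitude
    phase = np.angle(f[1])                    # first-harmonic phase (differential phase)
    visibility = a1 / a0
    return a0, phase, visibility

# Synthetic 8-step scan of a 64x64 field with a weak phase gradient.
n_steps, ny, nx = 8, 64, 64
steps = 2 * np.pi * np.arange(n_steps) / n_steps
true_phase = np.linspace(0.0, 1.0, nx)[None, :] * np.ones((ny, 1))
stack = 100.0 * (1.0 + 0.3 * np.cos(steps[:, None, None] + true_phase[None]))
a0, phi, vis = phase_stepping_analysis(stack)
print(a0.mean(), vis.mean())                  # ~100 and ~0.3
```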

  17. Evaluation of contaminant removal of reverse osmosis and advanced oxidation in full-scale operation by combining passive sampling with chemical analysis and bioanalytical tools.

    PubMed

    Escher, Beate I; Lawrence, Michael; Macova, Miroslava; Mueller, Jochen F; Poussade, Yvan; Robillot, Cedric; Roux, Annalie; Gernjak, Wolfgang

    2011-06-15

    Advanced water treatment of secondary treated effluent requires stringent quality control to achieve a water quality suitable for augmenting drinking water supplies. The removal of micropollutants such as pesticides, industrial chemicals, endocrine disrupting chemicals (EDC), pharmaceuticals, and personal care products (PPCP) is paramount. As the concentrations of individual contaminants are typically low, frequent analytical screening is both laborious and costly. We propose and validate an approach for continuous monitoring by applying passive sampling with Empore disks in vessels that were designed to slow down the water flow, and thus uptake kinetics, and ensure that the uptake is only marginally dependent on the chemicals' physicochemical properties over a relatively narrow molecular size range. This design not only assured integrative sampling over 27 days for a broad range of chemicals but also permitted the use of a suite of bioanalytical tools as sum parameters, representative of mixtures of chemicals with a common mode of toxic action. Bioassays proved to be more sensitive than chemical analysis to assess the removal of organic micropollutants by reverse osmosis, followed by UV/H₂O₂ treatment, as many individual compounds fell below the quantification limit of chemical analysis, yet still contributed to the observed mixture toxicity. Nonetheless in several cases, the responses in the bioassays were also below their quantification limits and therefore only three bioassays were evaluated here, representing nonspecific toxicity and two specific end points for estrogenicity and photosynthesis inhibition. Chemical analytical techniques were able to quantify 32 pesticides, 62 PCPPs, and 12 EDCs in reverse osmosis concentrate. However, these chemicals could explain only 1% of the nonspecific toxicity in the Microtox assay in the reverse osmosis concentrate and 0.0025% in the treated water. Likewise only 1% of the estrogenic effect in the E-SCREEN could be

  18. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    -Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance especially in terms of minimizing PB compared to the single objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally-demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the
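
    The two calibration objective functions named here, Nash-Sutcliffe efficiency (E) and Percent Bias (PB), are straightforward to compute. A minimal sketch with made-up streamflow values follows; it is not the dissertation's calibration code.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency E: 1 is a perfect fit, 0 means the model is
    no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias PB: positive values indicate underestimation of the
    observed volume, negative values indicate overestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Illustration with made-up daily streamflow (m^3/s) at one outlet.
observed  = [12.1, 15.4, 30.2, 22.8, 18.0, 14.3, 11.9]
simulated = [11.0, 16.2, 27.5, 24.0, 19.1, 13.0, 12.5]
print(f"E  = {nash_sutcliffe(observed, simulated):.3f}")
print(f"PB = {percent_bias(observed, simulated):.1f} %")
```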

  19. Topology for Statistical Modeling of Petascale Data

    SciTech Connect

    Pascucci, Valerio; Levine, Joshua; Gyulassy, Attila; Bremer, P. -T.

    2013-10-31

    Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, the approach of the entire team involving all three institutions is based on the complementary techniques of combinatorial topology and statistical modelling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modelling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. The overall technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modelling, and (3) new integrated topological and statistical methods. Roughly speaking, the division of labor between our three groups (Sandia Labs in Livermore, Texas A&M in College Station, and U Utah in Salt Lake City) is as follows: the Sandia group focuses on statistical methods and their formulation in algebraic terms, and finds the application problems (and data sets) most relevant to this project; the Texas A&M group develops new algebraic geometry algorithms, in particular with fewnomial theory; and the Utah group develops new algorithms in computational topology via Discrete Morse Theory. However, we hasten to point out that our three groups stay in tight contact via videoconference every two weeks, so there is much synergy of ideas between the groups. The remainder of this document focuses on the contributions that had greater direct involvement from the team at the University of Utah in Salt Lake City.

  20. Using the open-source statistical language R to analyze the dichotomous Rasch model.

    PubMed

    Li, Yuelin

    2006-08-01

    R, an open-source statistical language and data analysis tool, is gaining popularity among psychologists currently teaching statistics. R is especially suitable for teaching advanced topics, such as fitting the dichotomous Rasch model--a topic that involves transforming complicated mathematical formulas into statistical computations. This article describes R's use as a teaching tool and a data analysis software program in the analysis of the Rasch model in item response theory. It also explains the theory behind, as well as an educator's goals for, fitting the Rasch model with joint maximum likelihood estimation. This article also summarizes the R syntax for parameter estimation and the calculation of fit statistics. The results produced by R are compared with the results obtained from MINISTEP and the output of a conditional logit model. The use of R is encouraged because it is free, supported by a network of peer researchers, and covers both basic and advanced topics in statistics frequently used by psychologists.
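
    As a rough illustration of the joint maximum likelihood estimation discussed in the article (written here in Python rather than R), the sketch below alternates Newton updates for person abilities and item difficulties; it omits the bias correction and fit statistics the article covers.

```python
import numpy as np

def rasch_jml(X, n_iter=100, tol=1e-6):
    """Joint maximum likelihood (JML) estimation of the dichotomous Rasch model.
    X is a persons-by-items 0/1 matrix; returns (theta, beta) for the persons
    kept (those without perfect or zero raw scores) and for all items."""
    X = np.asarray(X, float)
    # Persons with all-correct or all-incorrect patterns have no finite estimate.
    keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < X.shape[1])
    X = X[keep]
    theta = np.zeros(X.shape[0])     # person abilities
    beta = np.zeros(X.shape[1])      # item difficulties
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        info = p * (1.0 - p)
        theta_new = theta + (X - p).sum(axis=1) / info.sum(axis=1)   # Newton step, items fixed
        p = 1.0 / (1.0 + np.exp(-(theta_new[:, None] - beta[None, :])))
        info = p * (1.0 - p)
        beta_new = beta - (X - p).sum(axis=0) / info.sum(axis=0)     # Newton step, persons fixed
        beta_new -= beta_new.mean()                                  # anchor the scale
        done = max(np.abs(theta_new - theta).max(), np.abs(beta_new - beta).max()) < tol
        theta, beta = theta_new, beta_new
        if done:
            break
    return theta, beta

# Simulated responses: 200 persons, 10 items.
rng = np.random.default_rng(3)
true_theta = rng.normal(0.0, 1.0, 200)
true_beta = np.linspace(-1.5, 1.5, 10)
data = rng.binomial(1, 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_beta[None, :]))))
theta_hat, beta_hat = rasch_jml(data)
print(np.round(beta_hat, 2))   # close to the generating difficulties (up to JML bias)
```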

  1. Geostatistics as a tool to improve sampling and statistical analysis in wetlands: a case study on dynamics of organic matter distribution in the Pantanal of Mato Grosso, Brazil.

    PubMed

    Nogueira, F; Couto, E G; Bernardi, C J

    2002-11-01

    The Pantanal of Mato Grosso presents distinct landscape units: permanently, occasionally and periodically flooded areas. In the last of these, sampling is especially difficult due to the high heterogeneity occurring between and within strata. This paper presents a comparison of different methodological approaches, showing that they can decisively influence what is known about the dynamics of organic matter distribution. In order to understand the role of the flood pulse in the distribution dynamics of organic matter in such a wetland area of the Pantanal, we assumed that there is spatial dependence between sampling points. This assumption departs from the classical statistical principle that focuses on randomness, and it allowed a larger volume of information to be obtained from a smaller sampling effort, i.e., better performance with savings in time and money.
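
    The spatial-dependence assumption at the heart of the geostatistical approach is usually examined with an empirical semivariogram. A minimal sketch follows, assuming 2-D sampling coordinates and a single measured variable; it is illustrative and not the authors' analysis.

```python
import numpy as np

def empirical_semivariogram(coords, values, n_bins=12, max_dist=None):
    """Classical (Matheron) semivariogram estimator:
    gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs separated by roughly h."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # count each pair once
    d, sq = d[iu], sq[iu]
    if max_dist is None:
        max_dist = d.max() / 2.0
    edges = np.linspace(0.0, max_dist, n_bins + 1)
    centers, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (d > lo) & (d <= hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(sq[mask].mean())
    return np.array(centers), np.array(gamma)

# Synthetic spatially correlated organic-matter values on a 15x15 grid.
rng = np.random.default_rng(4)
xy = np.array([(i, j) for i in range(15) for j in range(15)], float)
field = np.sin(xy[:, 0] / 4.0) + np.cos(xy[:, 1] / 5.0) + rng.normal(0, 0.2, len(xy))
h, g = empirical_semivariogram(xy, field)
for lag, gam in zip(h, g):
    print(f"lag {lag:5.2f}  gamma {gam:.3f}")
```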

  2. Biological indicators, tools to verify the effect of sterilisation processes - position paper prepared on behalf of group 1 (biological methods and statistical analysis).

    PubMed

    Haberer, K; van Doorne, H

    2011-11-01

    Biological indicators (BIs) are test systems containing viable microorganisms (usually spores of bacteria) providing a defined challenge to a specified sterilisation process. General chapter 5.1.2 of the European Pharmacopoeia [1] (Ph. Eur.) sets specifications for BIs and gives some guidance for their use. As shown in this text, the approach followed by the Ph. Eur. as well as by ISO standards is outdated and could nowadays create some confusion among users of the pharmacopoeia. It is the objective of this paper to provide the theoretical background of BIs as tools for the design and qualification of reliable moist heat sterilisation processes. The principles laid down in this article will form the basis of a future draft of a revised chapter on BIs in Pharmeuropa.

  3. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with and feedbacks between atmosphere, ocean and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. Reasons for that relate to a combination of coarse resolution, inadequate parameterizations, under-represented processes and a limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the ESM limitations in simulating observed variability and trends in arctic surface climate. RASM is a high resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12° and the atmosphere and land hydrology model components at 50 km resolution, which are all coupled at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled ESM which, due to the constraints from boundary conditions, facilitates detailed comparisons with observational statistics that are not possible with ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region, regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution. RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space dependent sub-grid parameterizations to better

  4. From models to advanced 4D visualization tools: Developing a comprehensive framework for collaborative research in physical modelling and hazard assessment of volcanic phenomena

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, T.; Barsotti, S.; de Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.; Tarquini, S.

    2009-04-01

    The use of numerical models in volcanological research and volcanic hazard assessment is indispensable to cope with the variety of processes and interactions characterizing magma evolution and eruption dynamics, which are dominated by non-linear phenomena and cannot be modelled at full scale in the laboratory. However, new multidisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures, so that new tools are needed for sharing knowledge, software and datasets among scientists. Additionally, the need of communicating the results from complex, physical-based models to the public and authorities requires a further effort to present them in an effective and easy way, while highlighting the strengths and limitations of the approach. Finally, availability of Geographic Information System (GIS) data represents an issue when numerical models have to be applied to real volcanoes for impact studies. We are carrying out several initiatives, started during former and ongoing national and European projects, to develop an electronic infrastructure for promoting information transfer in this field of research. In particular, a web portal, based on a dynamic Content Manager System (CMS), is under construction to host and present physical models and their applications in an extensive way (which is usually not possible in research papers), share numerical codes and simulation datasets, and discuss model validation and calibration tests. Moreover, advanced 4D visualization tools have been developed to present model results in a synthetic and effective form. Finally, a web interface to GIS databases has been implemented to share and navigate geographic data. Within this framework, it will be possible to integrate physical model outcomes into a geographic context and access them via an interactive web engine such as Google-Earth.

  5. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high level launch system studies from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaged the process and capability to apply to smaller more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and as an additional development driver, have begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer in order to reduce that large trade space to a select few alternatives that should best fit the customer's payload needs.

  6. Response surface methodology: A non-conventional statistical tool to maximize the throughput of Streptomyces species biomass and their bioactive metabolites.

    PubMed

    Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai

    2017-01-27

    Among diverse actinobacteria, Streptomyces is a renowned ongoing source for the production of a large number of secondary metabolites, furnishing immeasurable pharmacological and biological activities. Hence, to meet the demand for new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach for the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, which is usually achieved by a range of classical techniques including one-factor-at-a-time (OFAT). However, the major drawbacks of conventional optimization methods have directed the use of statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimizations for the production of assorted natural substances from Streptomyces, with very promising results. This review summarizes some of the recent RSM-based studies for the enhanced production of antibiotics, enzymes and probiotics using Streptomyces, with the intention of highlighting the significance of Streptomyces as well as RSM to the research community and industries.
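
    A typical RSM workflow fits a second-order polynomial to a designed experiment and locates its stationary point. The sketch below uses a hypothetical two-factor central composite design with made-up yield values, not data from the review; x1 and x2 stand in for coded factors such as a carbon-source level and pH.

```python
import numpy as np

# Hypothetical two-factor central composite design (coded units) and responses.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],
              [0, 0], [0, 0], [0, 0], [0, 0], [0, 0]])
y = np.array([62.1, 70.4, 65.0, 68.2, 60.5, 69.8, 63.3, 66.0,
              74.2, 73.8, 74.5, 73.9, 74.1])

# Second-order response surface:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
design = np.column_stack([np.ones(len(y)), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
b0, b1, b2, b11, b22, b12 = coef

# Stationary point of the fitted quadratic surface (candidate optimum).
B = np.array([[2 * b11, b12], [b12, 2 * b22]])
stationary = np.linalg.solve(B, [-b1, -b2])
print("coefficients:", np.round(coef, 3))
print("stationary point (coded units):", np.round(stationary, 3))
```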

  7. Robot Tools

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Mecanotron, now a division of Robotics and Automation Corporation, developed a quick-change welding method called the Automatic Robotics Tool-change System (ARTS) under Marshall Space Flight Center and Rockwell International contracts. The ARTS system has six tool positions ranging from coarse sanding disks and abrasive wheels to cloth polishing wheels with motors of various horsepower. The system is used by fabricators of plastic body parts for the auto industry, by Texas Instruments for making radar domes, and for advanced composites at Aerospatiale in France.

  8. A Case for Adjusting Subjectively Rated Scores in the Advanced Placement Tests. Program Statistics Research. Technical Report No. 94-5.

    ERIC Educational Resources Information Center

    Longford, Nicholas T.

    A case is presented for adjusting the scores for free response items in the Advanced Placement (AP) tests. Using information about the rating process from the reliability studies, administrations of the AP test for three subject areas, psychology, computer science, and English language and composition, are analyzed. In the reliability studies, 299…

  9. AMOVA ["Accumulative Manifold Validation Analysis"]: An Advanced Statistical Methodology Designed to Measure and Test the Validity, Reliability, and Overall Efficacy of Inquiry-Based Psychometric Instruments

    ERIC Educational Resources Information Center

    Osler, James Edward, II

    2015-01-01

    This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…

  10. Statistical Parametric Mapping of HR-pQCT Images: A Tool for Population-Based Local Comparisons of Micro-Scale Bone Features.

    PubMed

    Carballido-Gamio, Julio; Bonaretti, Serena; Kazakia, Galateia J; Khosla, Sundeep; Majumdar, Sharmila; Lang, Thomas F; Burghardt, Andrew J

    2017-04-01

    HR-pQCT enables in vivo multi-parametric assessments of bone microstructure in the distal radius and distal tibia. Conventional HR-pQCT image analysis approaches summarize bone parameters into global scalars, discarding relevant spatial information. In this work, we demonstrate the feasibility and reliability of statistical parametric mapping (SPM) techniques for HR-pQCT studies, which enable population-based local comparisons of bone properties. We present voxel-based morphometry (VBM) to assess trabecular and cortical bone voxel-based features, and a surface-based framework to assess cortical bone features both in cross-sectional and longitudinal studies. In addition, we present tensor-based morphometry (TBM) to assess trabecular and cortical bone structural changes. The SPM techniques were evaluated based on scan-rescan HR-pQCT acquisitions with repositioning of the distal radius and distal tibia of 30 subjects. For VBM and surface-based SPM purposes, all scans were spatially normalized to common radial and tibial templates, while for TBM purposes, rescans (follow-up) were spatially normalized to their corresponding scans (baseline). VBM was evaluated based on maps of local bone volume fraction (BV/TV), homogenized volumetric bone mineral density (vBMD), and homogenized strain energy density (SED) derived from micro-finite element analysis; while the cortical bone framework was evaluated based on surface maps of cortical bone thickness, vBMD, and SED. Voxel-wise and vertex-wise comparisons of bone features were done between the groups of baseline and follow-up scans. TBM was evaluated based on mean square errors of determinants of Jacobians at baseline bone voxels. In both anatomical sites, voxel- and vertex-wise uni- and multi-parametric comparisons yielded non-significant differences, and TBM showed no artefactual bone loss or apposition. The presented SPM techniques demonstrated robust specificity thus warranting their application in future clinical HR
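
    A voxel-wise comparison of spatially normalized parameter maps, the basic SPM operation described here, can be sketched as a paired t-test at every voxel. The example below uses synthetic scan-rescan data and omits smoothing and multiple-comparison correction, so it is an illustration rather than the authors' pipeline.

```python
import numpy as np
from scipy import stats

def voxelwise_paired_t(baseline, followup):
    """Voxel-wise paired t-test between two sets of spatially normalized
    parameter maps (e.g., BV/TV or vBMD). Arrays have shape
    (n_subjects, nx, ny, nz). Returns t and two-sided p maps."""
    diff = followup - baseline
    n = diff.shape[0]
    mean = diff.mean(axis=0)
    sd = diff.std(axis=0, ddof=1)
    t = mean / (sd / np.sqrt(n))
    p = 2.0 * stats.t.sf(np.abs(t), df=n - 1)
    return t, p

# Scan-rescan style illustration: 30 subjects, tiny 8x8x8 grids, no true change.
rng = np.random.default_rng(5)
base = rng.normal(0.15, 0.02, size=(30, 8, 8, 8))     # e.g. BV/TV maps
re = base + rng.normal(0.0, 0.005, size=base.shape)   # repositioned rescan
t_map, p_map = voxelwise_paired_t(base, re)
print("fraction of voxels with p < 0.05:", (p_map < 0.05).mean())
```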

  11. The Your Disease Risk Index for colorectal cancer is an inaccurate risk stratification tool for advanced colorectal neoplasia at screening colonoscopy.

    PubMed

    Schroy, Paul C; Coe, Alison M; Mylvaganam, Shamini R; Ahn, Lynne B; Lydotes, Maria A; Robinson, Patricia A; Davis, Julie T; Chen, Clara A; Ashba, Jacqueline; Atkinson, Michael L; Colditz, Graham A; Heeren, Timothy C

    2012-08-01

    Tailoring the use of screening colonoscopy based on the risk of advanced colorectal neoplasia (ACN) could optimize the cost-effectiveness of colorectal cancer (CRC) screening. Our goal was to assess the accuracy of the Your Disease Risk (YDR) CRC risk index for stratifying average risk patients into low- versus intermediate/high-risk categories for ACN. The YDR risk assessment tool was administered to 3,317 asymptomatic average risk patients 50 to 79 years of age just before their screening colonoscopy. Associations between YDR-derived relative risk (RR) scores and ACN prevalence were examined using logistic regression and χ(2) analyses. ACN was defined as a tubular adenoma ≥1 cm, tubulovillous or villous adenoma of any size, and the presence of high-grade dysplasia or cancer. The overall prevalence of ACN was 5.6%. Although YDR-derived RR scores were linearly associated with ACN after adjusting for age and gender (P = 0.033), the index was unable to discriminate "below average" from "above/average" risk patients [OR, 1.01; 95% confidence interval (CI), 0.75-1.37]. Considerable overlap in rates of ACN was also observed between the different YDR risk categories in our age- and gender-stratified analyses. The YDR index lacks accuracy for stratifying average risk patients into low- versus intermediate/high-risk categories for ACN.

  12. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2014-10-01

    capture sequences was provided by MPL to CCAD and OGAL. CCAD’s work focused on imposing these sequences on the Santos™ digital human avatar. An initial...levels of the patients. In addition, the differences in ability to detect variations in gait conditions for skinned avatar vs. line-skeletal avatar

  13. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2013-10-01

    or unclothed avatars, stick figures, or even skeletal models to support their analyses. The system will also allow trainees to isolate specific...CCAD’s work focused on imposing these sequences on the Santos digital human avatar. An initial user interface for the training application was also...ability to detect variations in gait conditions for skinned avatar vs. line-skeletal avatar, concurrent (side-by-side) image representation vs

  14. A strategy for preclinical formulation development using GastroPlus as pharmacokinetic simulation tool and a statistical screening design applied to a dog study.

    PubMed

    Kuentz, Martin; Nick, Sonja; Parrott, Neil; Röthlisberger, Dieter

    2006-01-01

    showed no clear biopharmaceutical superiority over a solid capsule formulation on the average of both dose strengths in fasted and fed dogs. Despite the substantial variability of the in vivo data, the factorial screening design indicated a marginally significant interaction between dose level and feeding status. This can be viewed as a flag for the planning of further studies, since a potential effect of one factor may depend on the level of the other. In summary, the GastroPlus simulation together with the statistically designed dog study provided a thorough biopharmaceutical assessment of the new CNS drug. Based on these findings, it was decided to develop a standard granulate in capsules for phase I studies. More sophisticated formulation options were abandoned, and so the clinical formulation development was conducted in a cost-efficient way.

  15. Evaluation of Long-term Outcomes of Correction of Severe Blepharoptosis with Advancement of External Levator Muscle Complex: Descriptive Statistical Analysis of the Results

    PubMed Central

    INNOCENTI, ALESSANDRO; MORI, FRANCESCO; MELITA, DARIO; DREASSI, EMANUELA; CIANCIO, FRANCESCO; INNOCENTI, MARCO

    2017-01-01

    Aim: Evaluation of long-term results after aponeurotic blepharoptosis correction with external levator muscle complex advancement. Patients and Methods: We carried out a retrospective study with medical record review of 20 patients (40 eyes) affected by bilateral moderate and severe aponeurotic ptosis who underwent primary surgery between January 2010 and December 2013. Criteria for outcome evaluation included 3-year postoperative follow-up of upper margin reflex distance (uMRD) and symmetry. Results: 3-year postoperative follow-up showed 17 cases (85%) of successful correction of ptosis, while three cases (15%) showed partial success. Two eyes showed hypocorrection, while one eye was overcorrected. Symmetry was maintained in all patients except for the oldest. Conclusion: External superior levator advancement is an effective procedure for moderate and severe aponeurotic blepharoptosis correction, and it establishes good long-term eyelid position and symmetry. PMID:28064228

  16. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, type of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events and exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications.

  17. Stupid statistics!

    PubMed

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
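
    The matrix formulation of weighted linear least squares, including the variance-covariance matrix used for error propagation and experiment design, can be sketched as follows; the straight-line calibration data are illustrative only.

```python
import numpy as np

def weighted_linear_lsq(X, y, sigma):
    """General linear least squares in matrix form: beta = (X'WX)^-1 X'Wy with
    W = diag(1/sigma^2). Returns the estimates and their variance-covariance
    matrix, from which parameter uncertainties and correlations follow."""
    W = np.diag(1.0 / np.asarray(sigma, float) ** 2)
    cov = np.linalg.inv(X.T @ W @ X)       # variance-covariance of the estimates
    beta = cov @ X.T @ W @ y
    return beta, cov

# Straight-line calibration y = a + b*x with heteroscedastic errors.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])
sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3, 0.3])
X = np.column_stack([np.ones_like(x), x])
beta, cov = weighted_linear_lsq(X, y, sigma)
se = np.sqrt(np.diag(cov))
corr = cov[0, 1] / (se[0] * se[1])          # parameter correlation (error propagation)
print(f"intercept = {beta[0]:.3f} +/- {se[0]:.3f}")
print(f"slope     = {beta[1]:.3f} +/- {se[1]:.3f}   corr = {corr:.2f}")
```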

  18. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  19. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  20. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
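
    The statistical step of such a model, deriving rain-rate parameters from extreme value statistics and converting them to attenuation, can be sketched by fitting a Gumbel distribution to annual-maximum rain rates and applying a power-law specific attenuation. The data, the percentile choice, and the k and alpha coefficients below are illustrative assumptions, not Manning's formalism.

```python
import numpy as np
from scipy import stats

# Hypothetical annual-maximum one-minute rain rates (mm/h) at one site,
# standing in for a National Weather Service rainfall record.
annual_max_rain = np.array([38., 52., 47., 61., 44., 70., 55., 49., 66., 58.,
                            42., 75., 51., 63., 46., 59., 68., 54., 48., 72.])

# Fit a Gumbel (extreme value type I) distribution to the annual maxima.
loc, scale = stats.gumbel_r.fit(annual_max_rain)

# Take the 99th-percentile annual maximum as an illustrative design rain rate.
r_design = stats.gumbel_r.ppf(0.99, loc=loc, scale=scale)

# Convert to specific attenuation with the usual power law gamma = k * R^alpha;
# k and alpha are rough 20 GHz values assumed purely for illustration.
k, alpha = 0.075, 1.10
gamma_db_per_km = k * r_design ** alpha
print(f"design rain rate ~ {r_design:.1f} mm/h, "
      f"specific attenuation ~ {gamma_db_per_km:.2f} dB/km")
```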

  1. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
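
    A miniature version of such a MOS-method comparison, a screening linear regression baseline against two tree-based machine learning methods on synthetic NWP-style features, might look like the sketch below. The feature names, data, and error values are illustrative only and do not reproduce the study's experiment.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for NWP forecast features (wind speed, direction, a
# stability-like index) and observed hourly generation, purely illustrative.
rng = np.random.default_rng(6)
n = 4000
ws = rng.weibull(2.0, n) * 8.0                        # forecast wind speed (m/s)
wd = rng.uniform(0, 360, n)                           # forecast wind direction (deg)
stab = rng.normal(0, 1, n)                            # stability-like index
gen = np.clip(ws - 3.0, 0, 12) ** 3 / 12**3 * 100     # idealized power curve (% capacity)
gen = gen * (1 + 0.1 * np.sin(np.radians(wd))) + 5 * stab + rng.normal(0, 5, n)
X = np.column_stack([ws, wd, stab])

# Simple hold-out evaluation of three MOS-style corrections.
split = 3000
models = {
    "screening linear regression (baseline)": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "gradient boosted trees": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X[:split], gen[:split])
    mae = mean_absolute_error(gen[split:], model.predict(X[split:]))
    print(f"{name:40s} MAE = {mae:.2f} % of capacity")
```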

  2. Evaluation of dietary assessment tools used to assess the diet of adults participating in the Communities Advancing the Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort

    PubMed Central

    Fialkowski, Marie K.; McCrory, Megan A.; Roberts, Sparkle M.; Tracy, J. Kathleen; Grattan, Lynn M.

    2011-01-01

    Background: Accurate assessment of dietary intake is essential for researchers and public health practitioners to make advancements in health. This is especially important in Native Americans, who display disease prevalence rates that are dramatically higher than the general U.S. population. Objective: The objective of this study was to evaluate three dietary assessment tools: 1) dietary records, 2) a food frequency questionnaire (FFQ), and 3) a shellfish assessment survey (SAS) among Native American adults from the Communities Advancing Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort. Design: CoASTAL comprised randomly selected individuals from three tribal registries of Pacific Northwest Tribal Nations. This cross-sectional study used data from the baseline of CoASTAL and was restricted to the non-pregnant adults (18+ yr) who completed the SAS (n=500), an FFQ (n=518), dietary records (n=444), weight measures (n=493), and height measures (n=496). Paired t-tests, Pearson correlation coefficients, and percent agreement were used to evaluate the dietary records and the FFQ with and without accounting for plausibility of reported energy intake (rEI). Sensitivity and specificity as well as Spearman correlation coefficients were used to evaluate the SAS and the FFQ compared to dietary records. Results: Statistically significant correlations between the FFQ and dietary records for selected nutrients were not the same by gender. Accounting for plausibility of rEI for the dietary records and the FFQ improved the strength of the correlations for percent energy from protein, percent energy from carbohydrate, and calcium for both men and women. In addition, significant associations between rEI (dietary records and FFQ) and weight were more apparent when using only rEI considered plausible. The SAS was found to assess shellfish consumption similarly to the FFQ. Conclusion: These results support the benefit of multiple measures of diet, including regional
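
    The agreement statistics named in the design (paired t-tests and Pearson correlations between two instruments measuring the same nutrient) can be sketched as below. The calcium values are simulated placeholders, not CoASTAL data.

```python
# Hedged sketch: paired t-test and Pearson correlation between FFQ and
# dietary-record estimates of the same nutrient, using simulated values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
record_calcium = rng.normal(900, 200, 60)                # mg/day from records
ffq_calcium = record_calcium + rng.normal(50, 150, 60)   # mg/day from FFQ

t_stat, p_paired = stats.ttest_rel(ffq_calcium, record_calcium)
r, p_corr = stats.pearsonr(ffq_calcium, record_calcium)

print(f"paired t-test: t = {t_stat:.2f}, p = {p_paired:.3f}")
print(f"Pearson r = {r:.2f} (p = {p_corr:.3g})")
```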

  3. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to handle the entire process effectively and efficiently. This would help maximize the time dedicated to answering research questions, and minimize the time and expense spent on data processing, quality control and station management. Cross-sharing stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and a web service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows:
    - Each next-generation station measures all parameters needed for flux computations.
    - A field microcomputer calculates final, fully corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.
    - Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file.
    - Multiple flux stations are linked into an automated, time-synchronized network.
    - The flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts.
    - The PI can assign rights and allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions.
    - Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from

  4. Magnetospheric ULF wave studies in the frame of Swarm mission: new advanced tools for automated detection of pulsations in magnetic and electric field observations

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Georgiou, Marina; Giamini, Sigiava A.; Sandberg, Ingmar; Haagmans, Roger

    2014-05-01

    The rekindling of the interest in space science in the last 15 years has led to many successful satellite missions in the Earth's magnetosphere and topside ionosphere, which were able to provide the scientific community with high-quality data on the magnetic and electric fields surrounding our planet. This data pool will be further enriched by the measurements of ESA's Swarm mission, a constellation of three satellites in different polar orbits, flying at altitudes from 400 to 550 km, which was launched on the 22nd of November 2013. Aiming at the best scientific exploitation of this corpus of accumulated data, we have developed a set of analysis tools that can cope with measurements of various spacecraft, at various regions of the magnetosphere and in the topside ionosphere. Our algorithms are based on a combination of wavelet spectral methods and artificial neural network techniques and are suited for the detection of waves and wave-like disturbances as well as the extraction of several physical parameters. Our recent work demonstrates the applicability of our developed analysis tools, both for individual case studies and statistical analysis of ultra low frequency (ULF) waves. We provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz), Pc4 (7-22 mHz) and Pc5 (1-7 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA, GIMA and IMAGE magnetometer networks. Our study shows that the same wave event, characterized by increased activity in the high end of the Pc3 band, was simultaneously observed by all three satellite missions and by certain stations of ground networks. This observation provides a strong argument in favour of the
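
    A minimal stand-in for the band-limited wave detection described above is sketched below: it flags intervals of enhanced power in the Pc3 band (22-100 mHz) in a 1-Hz magnetometer series using a Fourier spectrogram instead of the wavelet and neural-network machinery of the actual tools. The signal, threshold, and window lengths are assumptions.

```python
# Hedged sketch: flag intervals of enhanced Pc3 (22-100 mHz) power in a
# synthetic 1-Hz magnetometer series using a simple spectrogram.
import numpy as np
from scipy import signal

fs = 1.0                                   # 1 sample per second
t = np.arange(0, 7200, 1 / fs)
series = np.random.default_rng(3).normal(0, 0.2, t.size)
series[3000:4000] += np.sin(2 * np.pi * 0.05 * t[3000:4000])   # 50 mHz wave packet

f, seg_t, Sxx = signal.spectrogram(series, fs=fs, nperseg=512, noverlap=256)
pc3 = (f >= 0.022) & (f <= 0.100)
band_power = Sxx[pc3].sum(axis=0)

threshold = np.median(band_power) * 5      # crude detection threshold
for ti, p in zip(seg_t, band_power):
    if p > threshold:
        print(f"enhanced Pc3 power near t = {ti:.0f} s (band power {p:.2f})")
```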

  5. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web-based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open-source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  6. Pedagogical Utilization and Assessment of the Statistics Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  7. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating the high-frequency vibration spectra of complex structural systems, and it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
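
    The problem-solution step of SEA reduces to a small linear power balance. The sketch below solves the standard two-subsystem balance P = omega * L * E for the subsystem energies; the loss and coupling factors are illustrative values, not parameters from the program described above.

```python
# Hedged sketch of the SEA problem-solution step for two coupled subsystems:
# solve the steady-state power balance P = omega * L * E for energies E.
import numpy as np

omega = 2 * np.pi * 1000.0          # band centre frequency, rad/s
eta1, eta2 = 0.01, 0.02             # damping loss factors (illustrative)
eta12, eta21 = 0.003, 0.001         # coupling loss factors (illustrative)
P_in = np.array([1.0, 0.0])         # input power (W) into each subsystem

L = np.array([[eta1 + eta12, -eta21],
              [-eta12,       eta2 + eta21]])

E = np.linalg.solve(omega * L, P_in)   # subsystem vibrational energies (J)
print("subsystem energies:", E)
```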

  8. Tools & techniques--statistics: propensity score techniques.

    PubMed

    da Costa, Bruno R; Gahl, Brigitta; Jüni, Peter

    2014-10-01

    Propensity score (PS) techniques are useful if the number of potential confounding pretreatment variables is large and the number of analysed outcome events is rather small so that conventional multivariable adjustment is hardly feasible. Only pretreatment characteristics should be chosen to derive PS, and only when they are probably associated with outcome. A careful visual inspection of PS will help to identify areas of no or minimal overlap, which suggests residual confounding, and trimming of the data according to the distribution of PS will help to minimise residual confounding. Standardised differences in pretreatment characteristics provide a useful check of the success of the PS technique employed. As with conventional multivariable adjustment, PS techniques cannot account for confounding variables that are not or are only imperfectly measured, and no PS technique is a substitute for an adequately designed randomised trial.
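
    The workflow described here (derive the PS from pretreatment variables, trim to the region of overlap, and check standardized differences) can be sketched as follows. The covariates, the trimming rule, and the data are simulated assumptions for illustration only.

```python
# Hedged sketch: propensity scores by logistic regression, trimming to the
# common-support region, and a standardized-difference balance check.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 1000
age = rng.normal(65, 10, n)                      # pretreatment covariates
ef = rng.normal(50, 12, n)
logit = -10 + 0.12 * age + 0.02 * ef
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, ef])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Trim to the overlap (common support) of the two PS distributions
lo, hi = ps[treated].min(), ps[~treated].max()
keep = (ps >= lo) & (ps <= hi)

def std_diff(x, g):
    """Standardized difference of covariate x between groups g and ~g."""
    s = np.sqrt((x[g].var(ddof=1) + x[~g].var(ddof=1)) / 2)
    return (x[g].mean() - x[~g].mean()) / s

print(f"kept {keep.sum()} of {n} patients after trimming")
print(f"standardized difference in age: {std_diff(age[keep], treated[keep]):.2f}")
```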

  9. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  10. Detection of Vero Cells Infected with Herpes Simplex Types 1 and 2 and Varicella Zoster Viruses Using Raman Spectroscopy and Advanced Statistical Methods

    PubMed Central

    Huleihel, Mahmoud; Shufan, Elad; Zeiri, Leila; Salman, Ahmad

    2016-01-01

    Of the eight members of the herpes family of viruses, HSV1, HSV2, and varicella zoster are the most common and are mainly involved in cutaneous disorders. These viruses usually are not life-threatening, but in some cases they might cause serious infections to the eyes and the brain that can lead to blindness and possibly death. An effective drug (acyclovir and its derivatives) is available against these viruses. Therefore, early detection and identification of these viral infections is highly important for an effective treatment. Raman spectroscopy, which has been widely used in recent years in medicine and biology, was used as a powerful spectroscopic tool for the detection and identification of these viral infections in cell culture, due to its sensitivity, rapidity and reliability. Our results showed that it was possible to differentiate, with a 97% identification success rate, the uninfected Vero cells that served as a control from the Vero cells that were infected with HSV-1, HSV-2, and VZV. For that, linear discriminant analysis (LDA) was performed on the Raman spectra after principal component analysis (PCA) with a leave-one-out (LOO) approach. Raman spectroscopy in tandem with PCA and LDA enables differentiation among the different herpes viral infections of Vero cells within a time span of a few minutes with a high accuracy rate. Understanding cell molecular changes due to herpes viral infections using Raman spectroscopy may help in early detection and effective treatment. PMID:27078266
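
    The classification scheme named in the abstract (PCA followed by LDA, validated with leave-one-out) is sketched below with scikit-learn. The spectra are random placeholders with injected class-specific features, not real Raman measurements.

```python
# Hedged sketch: PCA + linear discriminant analysis with leave-one-out
# cross-validation on placeholder "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 800))          # 40 cells x 800 Raman shift channels
y = np.repeat(["control", "HSV-1", "HSV-2", "VZV"], 10)
X[10:20, 100:110] += 2.0                # fake class-specific spectral features
X[20:30, 300:310] += 2.0
X[30:40, 500:510] += 2.0

clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {acc.mean():.2%}")
```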

  11. Detection of Vero Cells Infected with Herpes Simplex Types 1 and 2 and Varicella Zoster Viruses Using Raman Spectroscopy and Advanced Statistical Methods.

    PubMed

    Huleihel, Mahmoud; Shufan, Elad; Zeiri, Leila; Salman, Ahmad

    2016-01-01

    Of the eight members of the herpes family of viruses, HSV1, HSV2, and varicella zoster are the most common and are mainly involved in cutaneous disorders. These viruses usually are not life-threatening, but in some cases they might cause serious infections to the eyes and the brain that can lead to blindness and possibly death. An effective drug (acyclovir and its derivatives) is available against these viruses. Therefore, early detection and identification of these viral infections is highly important for an effective treatment. Raman spectroscopy, which has been widely used in recent years in medicine and biology, was used as a powerful spectroscopic tool for the detection and identification of these viral infections in cell culture, due to its sensitivity, rapidity and reliability. Our results showed that it was possible to differentiate, with a 97% identification success rate, the uninfected Vero cells that served as a control from the Vero cells that were infected with HSV-1, HSV-2, and VZV. For that, linear discriminant analysis (LDA) was performed on the Raman spectra after principal component analysis (PCA) with a leave-one-out (LOO) approach. Raman spectroscopy in tandem with PCA and LDA enables differentiation among the different herpes viral infections of Vero cells within a time span of a few minutes with a high accuracy rate. Understanding cell molecular changes due to herpes viral infections using Raman spectroscopy may help in early detection and effective treatment.

  12. [Statistics quantum satis].

    PubMed

    Pestana, Dinis

    2013-01-01

    Statistics is a privileged tool in building knowledge from information, since the purpose is to extract, from the limited information in a sample, conclusions that apply to the whole population. The pervasive use of statistical software (which always provides an answer, whether the question is adequate or not), and the abuse of statistics to confer a scientific flavour on so much bad science, have had a pernicious effect, fostering some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since using bad data, too many data, and statistics to enquire into irrelevant questions is a source of bad science, namely because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.

  13. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
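
    One of the routines mentioned above, the power spectrum of a convergence map, can be sketched with plain numpy as below, rather than through the LensTools API itself. The map is Gaussian noise standing in for a real kappa map, and the field size, binning, and normalization are assumptions of this sketch.

```python
# Hedged sketch: azimuthally averaged flat-sky power spectrum of a
# placeholder convergence map, using numpy only.
import numpy as np

npix, side_deg = 256, 3.5                          # pixels and field size
kappa = np.random.default_rng(0).normal(0, 0.02, (npix, npix))

side_rad = np.radians(side_deg)
fk = np.fft.fftshift(np.fft.fft2(kappa)) * (side_rad / npix) ** 2
power2d = np.abs(fk) ** 2 / side_rad ** 2          # flat-sky P(ell)

ell = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(npix, d=side_rad / npix))
lx, ly = np.meshgrid(ell, ell)
ell_r = np.hypot(lx, ly)

bins = np.linspace(100, 1e4, 20)                   # multipole bins
idx = np.digitize(ell_r.ravel(), bins)
p_of_ell = [power2d.ravel()[idx == i].mean() for i in range(1, bins.size)]
print(p_of_ell[:5])
```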

  14. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

    Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in videotapes and recorded in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  15. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  16. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.
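
    One of the classification tools named at the end, a Gaussian mixture model fitted by maximum likelihood (EM), can be sketched as below. The one-dimensional data and component count are assumptions for illustration.

```python
# Hedged sketch: fit a two-component Gaussian mixture by maximum likelihood
# and read off posterior component probabilities for a new point.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-2, 1.0, 300),
                    rng.normal(3, 0.5, 200)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
print("component means:  ", gmm.means_.ravel())
print("component weights:", gmm.weights_)
print("P(component | x=0):", gmm.predict_proba([[0.0]]).ravel())
```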

  17. National Center for Health Statistics

    MedlinePlus

    Portal for NCHS data, tools and publications on population surveys, including the National Health and Nutrition Examination Survey, the National Health Interview Survey, the National Survey of Family Growth, vital records, and the National Vital Statistics System.

  18. Quick Statistics

    MedlinePlus

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  19. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
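
    As a much simplified stand-in for the nonlinear Markov filtering mentioned above, the sketch below tracks a first-order Gauss-Markov fade process with a scalar Kalman filter driven by noisy beacon measurements. The fade model, correlation time, and noise levels are assumptions, not ACTS parameters.

```python
# Hedged sketch: scalar Kalman filter tracking an AR(1) (Gauss-Markov)
# rain-fade process from noisy beacon measurements.
import numpy as np

rng = np.random.default_rng(2)
n, dt, tau = 600, 1.0, 120.0          # samples, step (s), fade correlation time (s)
phi = np.exp(-dt / tau)               # AR(1) transition coefficient
q, r = 0.05, 0.5                      # process and measurement noise variances

true = np.zeros(n)
for k in range(1, n):                 # simulate the fade (dB)
    true[k] = phi * true[k - 1] + rng.normal(0, np.sqrt(q))
meas = true + rng.normal(0, np.sqrt(r), n)

est, p = 0.0, 1.0
for k in range(n):                    # predict / update cycle
    est, p = phi * est, phi ** 2 * p + q
    gain = p / (p + r)
    est, p = est + gain * (meas[k] - est), (1 - gain) * p

print(f"final fade estimate {est:.2f} dB vs true {true[-1]:.2f} dB")
```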

  20. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to…

  1. Screening and analysis of potential anti-tumor components from the stipe of Ganoderma sinense using high-performance liquid chromatography/time-of-flight mass spectrometry with multivariate statistical tool.

    PubMed

    Chan, Kar-Man; Yue, Grace Gar-Lee; Li, Ping; Wong, Eric Chun-Wai; Lee, Julia Kin-Ming; Kennelly, Edward J; Lau, Clara Bik-San

    2017-03-03

    According to the Chinese Pharmacopoeia 2015 edition, Ganoderma (Lingzhi) is a species complex that comprises Ganoderma lucidum and Ganoderma sinense. The bioactivity and chemical composition of G. lucidum have been studied extensively, and it has been shown to possess antitumor activities in pharmacological studies. In contrast, G. sinense has not been studied in great detail. Our previous studies found that the stipe of G. sinense exhibited more potent antitumor activity than the pileus. To identify the antitumor compounds in the stipe of G. sinense, we studied its chemical components by merging the bioactivity results with liquid chromatography-mass spectrometry-based chemometrics. The stipe of G. sinense was extracted with water, followed by ethanol precipitation and liquid-liquid partition. The resulting residue was fractionated using column chromatography. The antitumor activity of these fractions was analysed using the MTT assay in murine breast tumor 4T1 cells, and their chemical components were studied using LC-QTOF-MS with multivariate statistical tools. The chemometric and MS/MS analysis correlated bioactivity with five known cytotoxic compounds, 4-hydroxyphenylacetate, 9-oxo-(10E,12E)-octadecadienoic acid, 3-phenyl-2-propenoic acid, 13-oxo-(9E,11E)-octadecadienoic acid and lingzhine C, from the stipe of G. sinense. To the best of our knowledge, 4-hydroxyphenylacetate, 3-phenyl-2-propenoic acid and lingzhine C are reported here for the first time in G. sinense. These five compounds will be investigated for their antitumor activities in the future.

  2. Relativistic statistical arbitrage

    NASA Astrophysics Data System (ADS)

    Wissner-Gross, A. D.; Freer, C. E.

    2010-11-01

    Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage.

  3. Self-imposed evaluation of the Helmholtz Research School MICMoR as a tool for quality assurance and advancement of a structured graduate programme

    NASA Astrophysics Data System (ADS)

    Elija Bleher, Bärbel; Schmid, Hans Peter; Scholz, Beate

    2015-04-01

    The Helmholtz Research School MICMoR (Mechanisms and Interactions of Climate Change in Mountain Regions) offers a structured graduate programme for doctoral students in the field of climate change research. It is hosted by the Institute of Meteorology and Climate Research (KIT/IMK-IFU) in Garmisch-Partenkirchen, in collaboration with 7 Bavarian partner universities and research institutions. Hence, MICMoR brings together a considerably large network with currently 20 doctoral students and 55 scientists. MICMoR offers scientific and professional skills training, provides a state-of-the-art supervision concept, and fosters international exchange and interdisciplinary collaboration. In order to develop and advance its programme, MICMoR has committed itself to a self-imposed mid-term review in its third year, to monitor the extent to which its original objectives have been reached, and to explore and identify where MICMoR has room for improvement. The evaluation especially focused on recruitment, supervision, training, networking and cooperation. Carried out by an external expert (Beate Scholz from scholz ctc), the evaluation was based on a mixed methods approach, i.e. combining a quantitative survey involving all doctoral candidates as well as their supervisors and focus groups with different MICMoR stakeholders. The evaluation has brought forward some highly interesting results, pinpointing challenges and opportunities of setting up a structured doctoral programme. Overall, the evaluation proved to be a useful tool for evidence-based programme and policy planning, and demonstrated a high level of satisfaction of supervisors and fellows. Supervision, with facets ranging from disciplinary feedback to career advice, is demanding and requires strong commitment and adequate human resources development by all parties involved. Thus, MICMoR plans to offer mentor coaching and calls on supervisors and mentors to form a community of learners with their doctoral students. To

  4. Development of PowerMap: a software package for statistical power calculation in neuroimaging studies.

    PubMed

    Joyce, Karen E; Hayasaka, Satoru

    2012-10-01

    Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge in imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided, a regression analysis, an ANOVA (analysis of variance), and a two-sample T-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aide for future neuroimaging research.
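
    To illustrate the basic quantity that PowerMap extends to whole images with random-field and FDR corrections, the sketch below runs an ordinary single-test two-sample power calculation with statsmodels; the effect size and the voxel-wise alpha are illustrative assumptions.

```python
# Hedged sketch: single-test two-sample power calculation, the building block
# that image-wide tools such as PowerMap generalize with multiplicity corrections.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.001, power=0.8)
print(f"~{n_per_group:.0f} subjects per group for d = 0.5 at alpha = 0.001")
```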

  5. A New Method For Advanced Virtual Design Of Stamping Tools For Automotive Industry: Application To Nodular Cast Iron EN-GJS-600-3

    NASA Astrophysics Data System (ADS)

    Ben-Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François; Rezaï-Aria, Farhad

    2011-05-01

    This contribution presents an approach combining stamping process simulation and structural analysis in order to improve the design and optimize the tool fatigue life. The method consists of simulating the stamping process via AutoForm® (or any FEM code) while considering the tool as a perfectly rigid body. The estimated contact pressure is then used as the boundary condition for an FEM structural loading analysis. The result of this analysis is used for life prediction of the tool using an S-N fatigue curve. If the prescribed tool life requirements are not satisfied, the critical region of the tool is redesigned and the whole simulation procedure is repeated. This optimization method is applied to the cast iron EN-GJS-600-3 as a candidate stamping tool material. The room-temperature fatigue S-N curves of this alloy were established in the laboratory through uniaxial push/pull cyclic experiments on cylindrical specimens at a load ratio R (σmin/σmax) = -2.
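
    The final life-prediction step can be sketched with a Basquin-type S-N relation, inverted to give cycles to failure at a stress amplitude taken from the structural analysis. The Basquin coefficients below are placeholders, not measured EN-GJS-600-3 values.

```python
# Hedged sketch: Basquin S-N life prediction, sigma_a = sigma_f * (2N)^b,
# inverted for the number of cycles N at a given stress amplitude.
sigma_f, b = 900.0, -0.09        # hypothetical fatigue strength coefficient (MPa) and exponent

def cycles_to_failure(stress_amplitude_mpa: float) -> float:
    """Invert sigma_a = sigma_f * (2N)^b for the number of cycles N."""
    return 0.5 * (stress_amplitude_mpa / sigma_f) ** (1.0 / b)

for s in (250.0, 300.0, 350.0):  # contact-pressure-derived stress amplitudes (MPa)
    print(f"sigma_a = {s:.0f} MPa -> N = {cycles_to_failure(s):.2e} cycles")
```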

  6. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  7. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except for some very rare occasions, such data generally represent a small part of the population, or selected outcomes of the data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques for making valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as performing tests of hypotheses on the parameters. However, prediction of future responses and determining the prediction distributions are also part of statistical inference. Both Classical (Frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  8. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except for some very rare occasions, such data generally represent a small part of the population, or selected outcomes of the data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques for making valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as performing tests of hypotheses on the parameters. However, prediction of future responses and determining the prediction distributions are also part of statistical inference. Both Classical (Frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  9. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    ERIC Educational Resources Information Center

    Kees, Martin; Schor, Clifton

    1981-01-01

    An inexpensive computer graphics system (Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating, with graphical and statistical techniques, and treating, with lenses, prisms, and orthoptics, various anomalies of binocular vision. (MLW)

  10. [Descriptive statistics].

    PubMed

    Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    Descriptive statistics is the branch of statistics that gives recommendations on how to summarize research data clearly and simply in tables, figures, charts, or graphs. Before performing a descriptive analysis it is paramount to summarize its goal or goals, and to identify the measurement scales of the different variables recorded in the study. Tables or charts aim to provide timely information on the results of an investigation. Graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a chart, graph, or image must be based on the study objectives. Usually it is not recommended to use more than seven in an article, depending also on its length.

  11. Order Statistics and Nonparametric Statistics.

    DTIC Science & Technology

    2014-09-26

    Topics investigated include the following: probability that a fuze will fire; moving order statistics; distribution theory and properties of the...problem posed by an Army scientist: a fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of
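
    Under the simplest assumption, that detonators function independently within time t with a common probability p, the stated fuze question reduces to a binomial tail probability, as sketched below. The values of n and p are illustrative.

```python
# Hedged sketch: P(at least n-1 of n detonators function within time t),
# assuming independent detonators with common success probability p.
from math import comb

def p_fire(n: int, p: float, allowed_failures: int = 1) -> float:
    """P(at least n - allowed_failures of n detonators function)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n - allowed_failures, n + 1))

print(p_fire(n=5, p=0.9, allowed_failures=1))   # at least 4 of 5 function
print(p_fire(n=5, p=0.9, allowed_failures=2))   # at least 3 of 5 function
```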

  12. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  13. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of the fracture in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds are subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  14. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  15. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL (NUDAPT): FACILITATING ADVANCEMENTS IN URBAN METEOROLOGY AND CLIMATE MODELING WITH COMMUNITY-BASED URBAN DATABASES

    EPA Science Inventory

    We discuss the initial design and application of the National Urban Database and Access Portal Tool (NUDAPT). This new project is sponsored by the USEPA and involves collaborations and contributions from many groups from federal and state agencies, and from private and academic i...

  16. What Are the Key Statistics about Gallbladder Cancer?

    MedlinePlus

    The American Cancer Society’s estimates ... advanced it is when it is found. For statistics on survival rates, see “Survival statistics for gallbladder ...”

  17. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there are also considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  18. Advancing the Assessment of Dynamic Psychological Processes

    PubMed Central

    Wright, Aidan G. C.; Hopwood, Christopher J.

    2016-01-01

    Most commonly used clinical assessment tools cannot fully capture the dynamic psychological processes often hypothesized as core mechanisms of psychopathology and psychotherapy. There is therefore a gap between our theories of problems and interventions for those problems and the tools we use to understand clients. The purpose of this special issue is to connect theory about clinical dynamics to practice by focusing on methods for collecting dynamic data, statistical models for analyzing dynamic data, and conceptual schemes for implementing dynamic data in applied settings. In this introductory article, we argue for the importance of assessing dynamic processes, highlight recent advances in assessment science that enable their measurement, review challenges in using these advances in applied practice, and adumbrate the articles in this issue. PMID:27313187

  19. Drug-induced sleep endoscopy as a selection tool for mandibular advancement therapy by oral device in patients with mild to moderate obstructive sleep apnoea.

    PubMed

    De Corso, E; Bastanza, G; Della Marca, G; Grippaudo, C; Rizzotto, G; Marchese, M R; Fiorita, A; Sergi, B; Meucci, D; Di Nardo, W; Paludetti, G; Scarano, E

    2015-12-01

    Nowadays oral appliance therapy is recognised as an effective therapy for many patients with primary snoring and mild to moderate obstructive sleep apnoea (OSA), as well as those with more severe OSA who cannot tolerate positive airway pressure (PAP) therapies. For this reason, it is important to focus on objective criteria to indicate which subjects may benefit from treatment with a mandibular advancement device (MAD). Various anthropometric and polysomnographic predictors have been described in the literature, whereas there are still controversies about the role of drug-induced sleep endoscopy (DISE) and the bimanual advancement manoeuvre as predictors of treatment outcome with an oral device. Herein, we report our experience in the treatment of mild to moderate OSA by an oral appliance selected by DISE. We performed a single-institution, longitudinal prospective evaluation of a consecutive group of patients with mild to moderate obstructive sleep apnoea syndrome who underwent DISE. During sleep endoscopy, a gentle mandibular advancement manoeuvre of less than 5 mm was performed. In 30 of 65 patients (46.2%) the improvement of airway patency was unsuccessful, whereas in 35 of 65 patients (53.8%) the improvement was successful and the patients were considered suitable for oral device application. Because 7 of 35 patients were excluded due to conditions interfering with oral appliance therapy, we finally treated 28 patients. After 3 months of treatment, we observed a significant improvement in the mean Epworth index [7.35 ± 2.8 versus 4.1 ± 2.2 (p < 0.05)], in mean AHI [21.4 ± 6 versus 8.85 ± 6.9 events per hour (p < 0.05)] and in mean ODI [18.6 ± 8 versus 7 ± 5.8 events per hour (p < 0.05)]. We observed that the apnoea/hypopnoea index (AHI) improved by up to 50% from baseline in 71.4% of patients selected after DISE for MAD therapy. In the current study, mandibular advancement splint therapy was successfully prescribed on the basis not only of severity of disease, as

  20. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  1. JUST in time health emergency interventions: an innovative approach to training the citizen for emergency situations using virtual reality techniques and advanced IT tools (the Web-CD).

    PubMed

    Manganas, A; Tsiknakis, M; Leisch, E; Karefilaki, L; Monsieurs, K; Bossaert, L L; Giorgini, F

    2004-01-01

    This paper reports the results of the first of the two systems developed by JUST, a collaborative project supported by the European Union under the Information Society Technologies (IST) Programme. The most innovative content of the project has been the design and development of a complementary training course for non-professional health emergency operators, which supports the traditional learning phase, and which purports to improve the retention capability of the trainees. This was achieved with the use of advanced information technology techniques, which provide adequate support and can help to overcome the present weaknesses of the existing training mechanisms.

  2. Lithographic measurement of EUV flare in the 0.3-NA Micro Exposure Tool optic at the Advanced Light Source

    SciTech Connect

    Cain, Jason P.; Naulleau, Patrick; Spanos, Costas J.

    2005-01-01

    The level of flare present in a 0.3-NA EUV optic (the MET optic) at the Advanced Light Source at Lawrence Berkeley National Laboratory is measured using a lithographic method. Photoresist behavior at high exposure doses makes analysis difficult. Flare measurement analysis under scanning electron microscopy (SEM) and optical microscopy is compared, and optical microscopy is found to be a more reliable technique. In addition, the measured results are compared with predictions based on surface roughness measurement of the MET optical elements. When the fields in the exposure matrix are spaced far enough apart to avoid influence from surrounding fields and the data is corrected for imperfect mask contrast and aerial image proximity effects, the results match predicted values quite well. The amount of flare present in this optic ranges from 4.7% for 2 {micro}m features to 6.8% for 500 nm features.

  3. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    SciTech Connect

    Wurtz, R.; Kaplan, A.

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing them. PSD advances rely on improvements to the implemented algorithm, and can draw on conventional statistical-classifier and machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
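
    The recommended reporting, an ROC curve and the acceptance at a fixed gamma rejection rate, can be sketched as below. The discrimination scores are simulated Gaussians, not detector data, and the GRR target is an illustrative choice.

```python
# Hedged sketch: ROC curve for a PSD-like classifier and the neutron
# acceptance (true positive rate) at a fixed gamma rejection rate.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(11)
gamma_scores = rng.normal(0.0, 1.0, 5000)        # class 0: gammas
neutron_scores = rng.normal(2.0, 1.0, 1000)      # class 1: neutrons
y = np.r_[np.zeros(5000), np.ones(1000)]
scores = np.r_[gamma_scores, neutron_scores]

fpr, tpr, thresholds = roc_curve(y, scores)

grr_target = 0.999                               # gamma rejection rate
idx = np.searchsorted(fpr, 1 - grr_target, side="right") - 1
print(f"neutron acceptance at GRR = {grr_target}: {tpr[idx]:.2%}")
```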

  4. Coping with Advanced Cancer

    MedlinePlus


  5. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.

  6. Plastic Surgery Statistics

    MedlinePlus

    Plastic surgery procedural statistics, including the Plastic Surgery Statistics reports for 2005 and 2016, the Stats Report 2016, and data from the National Clearinghouse of ...

  7. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    USGS Publications Warehouse

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
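
    The TOPMODEL concepts folded into SWAT-TOP, the topographic wetness index and an exponential dependence of baseflow on the catchment storage deficit, can be sketched as below. The parameter values are illustrative, not calibrated East River basin values.

```python
# Hedged sketch of two TOPMODEL ingredients: the topographic wetness index
# ln(a / tan(beta)) and exponential baseflow recession Qb = Q0 * exp(-S / m).
import numpy as np

upslope_area = np.array([50.0, 500.0, 5000.0])   # contributing area per unit contour (m)
slope = np.array([0.20, 0.10, 0.02])             # local slope, tan(beta)
twi = np.log(upslope_area / slope)               # topographic wetness index
print("topographic wetness index:", np.round(twi, 2))

Q0, m = 2.0, 0.03                                # baseflow at zero deficit (mm/h), decay parameter (m)
storage_deficit = np.array([0.00, 0.05, 0.10])   # catchment-average deficit (m)
baseflow = Q0 * np.exp(-storage_deficit / m)
print("baseflow (mm/h):", np.round(baseflow, 3))
```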

  8. Advances in molecular tools for the use of Zygosaccharomyces bailii as host for biotechnological productions and construction of the first auxotrophic mutant.

    PubMed

    Dato, Laura; Branduardi, Paola; Passolunghi, Simone; Cattaneo, Davide; Riboldi, Luca; Frascotti, Gianni; Valli, Minoska; Porro, Danilo

    2010-11-01

    The nonconventional yeast Zygosaccharomyces bailii has been proposed as a new host for biotechnological processes due to convenient properties such as its resistance to high sugar concentrations, relatively high temperatures and, especially, acidic environments. We describe a series of new expression vectors specific for Z. bailii and the resulting improvements in production levels. By exploiting the sequences of the endogenous plasmid pSB2, 2 μm-like multicopy vectors were obtained, giving a fivefold increase in production. A specific integrative vector was developed which led to 100% stability in the absence of selective pressure; a multiple-integration vector was constructed, based on an rRNA gene unit portion cloned and sequenced for this purpose, driving the insertion of up to 80 copies of the foreign construct. Moreover, we show the construction of the first stable auxotrophic mutant of Z. bailii, obtained by targeted gene deletion applied to ZbLEU2. The development of molecular tools for the manipulation of Z. bailii has now reached a level that may be compatible with its industrial exploitation; the production of organic acids is a prominent field of application.

  9. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    NASA Astrophysics Data System (ADS)

    Chen, Ji; Wu, Yiping

    2012-02-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes (surface runoff, baseflow, groundwater re-evaporation, and deep aquifer percolation) are modeled using a group of empirical equations. These empirical equations usually constrain the simulation capability of the relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters reflecting the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model provides a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study demonstrates that an established hydrologic model can be further improved by integrating the features of another model, offering a possible way to enhance our understanding of the workings of catchments.

  10. Statistical Challenges of Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Digital sky surveys, data from orbiting telescopes, and advances in computation have increased the quantity and quality of astronomical data by several orders of magnitude in recent years. Making sense of this wealth of data requires sophisticated statistical and data analytic techniques. Fortunately, statistical methodologies have similarly made great strides in recent years. Powerful synergies thus emerge when astronomers and statisticians join in examining astrostatistical problems and approaches. The volume focuses on several themes: · The increasing power of Bayesian approaches to modeling astronomical data · The growth of enormous databases, leading to an emerging federated Virtual Observatory, and their impact on modern astronomical research · Statistical modeling of critical datasets, such as galaxy clustering and fluctuations in the microwave background radiation, leading to a new era of precision cosmology · Methodologies for uncovering clusters and patterns in multivariate data · The characterization of multiscale patterns in imaging and time series data. As in earlier volumes in this series, research contributions discussing topics in one field are joined with commentary from scholars in the other. Short contributed papers covering dozens of astrostatistical topics are also included.

  11. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students’ intuition and enhance their learning. PMID:21451741

  12. Cell electrospinning: a novel tool for functionalising fibres, scaffolds and membranes with living cells and other advanced materials for regenerative biology and medicine.

    PubMed

    Jayasinghe, Suwan N

    2013-04-21

    Recent years have seen rising interest in approaches for directly generating fibers and scaffolds for exploration in the health sciences. In this review the author briefly highlights the many approaches explored to date for generating such structures, underlining their advantages and disadvantages and their contribution to the biomedical sciences in particular. Such structures have been demonstrated to have implications in both the laboratory and the clinic, as they mimic the native extracellular matrix. Interestingly, until very recently the only materials investigated for generating fibrous architectures were natural or synthetic polymers, with or without the addition of functional molecule(s). Although such constructs have been demonstrated to have many applications, they arguably lack the one component most important for directly reconstructing a three-dimensional functional tissue, namely living cells. Recent findings have therefore demonstrated the ability to directly form cell-laden fibers and scaffolds in useful quantities, from which functional three-dimensional living tissues can be conceived. These developments have far-reaching ramifications for many areas of research and development, ranging from tissue engineering and regenerative medicine, to a novel approach for analyzing cell behavior and function in real time in three dimensions, to the advanced, controlled and targeted delivery of experimental and/or medical cells and/or genes for localized treatment. At present these developments have passed all in vitro and in vivo mouse-model-based challenge trials and are now spearheading their journey towards initiating human clinical trials.

  13. Concept Maps in Introductory Statistics

    ERIC Educational Resources Information Center

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.

  14. The Statistical Handbook on Technology.

    ERIC Educational Resources Information Center

    Berinstein, Paula

    This volume tells stories about the tools we use, but these narratives are told in numbers rather than in words. Organized by various aspects of society, each chapter uses tables and statistics to examine budgets, costs, sales, trade, employment, patents, prices, usage, access, and consumption. In each chapter, each major topic is…

  15. Rotary earth boring tool

    SciTech Connect

    Dismukes, N.B.

    1983-09-27

    The present invention provides a nonstalling system for advancing a boring tool in situations where the inclination of the bore hole with respect to the vertical is such that the force of gravity does not provide effective forward thrust. A hydraulically powered marine screw propeller adjacent the boring tool provides the necessary thrust for the drilling operation. Pressurized drilling fluid provides the required hydraulic energy. The characteristics of the marine screw propeller are such that it provides maximum thrust at maximum rotative speed, but should the tool begin to stall, the forward thrust drops to zero, preventing stalling.

  16. The General Comments on HIV adopted by the African Commission on Human and Peoples' Rights as a tool to advance the sexual and reproductive rights of women in Africa.

    PubMed

    Durojaye, Ebenezer

    2014-12-01

    The present article examines the contents and importance of the General Comments adopted by the African Commission on Human and Peoples' Rights on Article 14 (1) (d) and (e) of the Protocol to the African Charter on the Rights of Women in Africa as a tool for advancing women's rights in the context of HIV. Given that discriminatory practices in all facets of life have continued to limit African women's enjoyment of their sexual and reproductive rights and render them susceptible to HIV infection, it becomes vital that African governments adopt appropriate measures to address this challenge. The provisions of the Protocol on the Rights of Women in Africa present great opportunities for this to be realized. The radical and progressive provisions of the Protocol will be of no use to women unless policymakers and other stakeholders have a clear understanding of them and are able to implement them effectively. The adoption of the General Comments is a welcome development, and states and civil society groups must make full use of them to advance women's rights.

  17. A storm modeling system as an advanced tool in prediction of well organized slowly moving convective cloud system and early warning of severe weather risk

    NASA Astrophysics Data System (ADS)

    Spiridonov, Vlado; Curic, Mladjen

    2015-02-01

    Short-range prediction of precipitation is a critical input to flood prediction and hence to the accuracy of flood warnings. Since most intense precipitation comes from convective clouds, the primary aim is to forecast these small-scale atmospheric processes. One characteristic pattern of an organized group of convective clouds consists of a line of deep convection that results in the repeated passage of heavy-rain-producing convective cells along the line, in this case over the northwestern part of Macedonia. This slowly moving convective system produced extreme local rainfall and hailfall over the urban area of Skopje. A three-dimensional cloud model is used to simulate the main storm characteristics (e.g., structure, intensity, evolution) and the main physical processes responsible for the initiation of heavy rainfall and hailfall. The model produced significantly more realistic and spatially accurate forecasts of the convective rainfall event than is possible with the current operational system. The output provides a good starting point for developing tools such as flooding indices and potential-risk maps for interpreting and presenting the predictions, so that weather services can enhance their operational flood prediction capabilities and warnings of severe weather risk. Even for the single case studied, the convective-scale model has shown significant benefits in several respects (initiation of convection, storm structure and evolution, and precipitation). The storm-scale model (1 km grid spacing) is capable of producing significantly more realistic and spatially accurate forecasts of convective rainfall events than is possible with current operational systems based on a model with 15 km grid spacing.

  18. Molecular tools for chemical biotechnology

    PubMed Central

    Galanie, Stephanie; Siddiqui, Michael S.; Smolke, Christina D.

    2013-01-01

    Biotechnological production of high value chemical products increasingly involves engineering in vivo multi-enzyme pathways and host metabolism. Recent approaches to these engineering objectives have made use of molecular tools to advance de novo pathway identification, tunable enzyme expression, and rapid pathway construction. Molecular tools also enable optimization of single enzymes and entire genomes through diversity generation and screening, whole cell analytics, and synthetic metabolic control networks. In this review, we focus on advanced molecular tools and their applications to engineered pathways in host organisms, highlighting the degree to which each tool is generalizable. PMID:23528237

  19. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    DTIC Science & Technology

    1991-04-01

    Table-of-contents excerpt listing categories of reliability, maintainability, supportability, and testability (R/M/S/T) software tools, including reliability prediction tools, reliability modeling tools, fault tree analysis tools, thermal analysis tools, and structural reliability evaluation tools; among the reliability prediction tools named is ARM (Advanced Reliability Modeling).

  20. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  1. Modeling Statistics of Fish Patchiness and Predicting Associated Influence on Statistics of Acoustic Echoes

    DTIC Science & Technology

    2015-09-30

    active sonar. Toward this goal, fundamental advances in the understanding of fish behavior, especially in aggregations, will be made under conditions...relevant to the echo statistics problem. OBJECTIVES: To develop new models of the behavior of fish aggregations, including the fission/fusion process...and to describe the echo statistics associated with the random fish behavior using existing formulations of echo statistics.

  2. Jetting tool

    SciTech Connect

    Szarka, D.D.; Schwegman, S.L.

    1991-07-09

    This patent describes an apparatus for hydraulically jetting a well tool disposed in a well, the well tool having a sliding member. It comprises positioner means for operably engaging the sliding member of the well tool; and a jetting means, connected at a rotatable connection to the positioner means so that the jetting means is rotatable relative to the positioner means and the well tool, for hydraulically jetting the well tool as the jetting means is rotated relative thereto.

  3. Statistical modeling and analysis of laser-evoked potentials of electrocorticogram recordings from awake humans.

    PubMed

    Chen, Zhe; Ohara, Shinji; Cao, Jianting; Vialatte, François; Lenz, Fred A; Cichocki, Andrzej

    2007-01-01

    This article is devoted to statistical modeling and analysis of electrocorticogram (ECoG) signals induced by painful cutaneous laser stimuli, which were recorded from implanted electrodes in awake humans. Specifically, with the statistical tools of factor analysis and independent component analysis, the pain-induced laser-evoked potentials (LEPs) were extracted and investigated under different controlled conditions. With the help of wavelet analysis, quantitative and qualitative analyses were conducted regarding the LEPs' attributes of power, amplitude, and latency, in both averaging and single-trial experiments. Statistical hypothesis tests were also applied in various experimental setups. The experimental results reported herein also confirm previous findings in the neurophysiology literature. In addition, single-trial analysis has revealed many new observations that might be of interest to neuroscientists or clinical neurophysiologists. These promising results provide convincing evidence that advanced signal processing and statistical analysis may open new avenues for future studies of such ECoG or other relevant biomedical recordings.
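
    As a rough illustration of the kind of source separation mentioned above (independent component analysis), the following Python sketch applies scikit-learn's FastICA to synthetic multichannel data; the channel count, sources, and noise level are invented, and this is not the authors' ECoG analysis pipeline.

      # Minimal ICA sketch on synthetic "multichannel" data (assumes scikit-learn).
      # Illustrates source separation in general, not the authors' LEP analysis.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 2000)

      # Two synthetic sources: an evoked-potential-like transient and ongoing background.
      evoked = np.exp(-((t - 0.3) / 0.02) ** 2)      # transient response
      background = np.sin(2 * np.pi * 10 * t)        # ongoing 10 Hz rhythm
      sources = np.c_[evoked, background]

      # Mix the sources into 4 hypothetical channels and add noise.
      mixing = rng.normal(size=(2, 4))
      channels = sources @ mixing + 0.05 * rng.normal(size=(len(t), 4))

      # Unmix with FastICA; components are recovered up to sign and ordering.
      ica = FastICA(n_components=2, random_state=0)
      components = ica.fit_transform(channels)
      print(components.shape)  # (2000, 2): estimated independent components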

  4. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model) we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both to test for possible disagreements (which could indicate new physics) and to improve parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books which are reported in the references. The reader should refer to those.
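
    One concrete example of combining data sets for parameter determination is the textbook case of two independent Gaussian measurements of the same parameter, merged by inverse-variance weighting; the sketch below uses hypothetical numbers and is not tied to any particular cosmological survey or likelihood code.

      # Inverse-variance combination of independent Gaussian measurements of one
      # parameter -- a standard statistical result, shown with invented numbers.
      import numpy as np

      def combine(measurements):
          """measurements: iterable of (value, sigma). Returns (combined value, combined sigma)."""
          values, sigmas = np.array(measurements).T
          weights = 1.0 / sigmas**2
          combined = np.sum(weights * values) / np.sum(weights)
          return combined, np.sqrt(1.0 / np.sum(weights))

      # Hypothetical constraints on a single parameter from two different data sets.
      value, sigma = combine([(0.70, 0.03), (0.73, 0.02)])
      print(f"combined: {value:.3f} +/- {sigma:.3f}")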

  5. ESET: [Eagles Student Evaluation of Teaching]--An Online Andragogical Student Ratings of Instruction Tool That Is an In-Depth Systemic Statistical Mechanism Designed to Inform, Enhance, and Empower Higher Education

    ERIC Educational Resources Information Center

    Osler, James Edward, II; Mansaray, Mahmud A.

    2015-01-01

    This paper seeks to provide an epistemological rationale for the establishment of ESET (an acronym for: "Eagles Student Evaluation of Teaching") as a novel universal SRI [Student Ratings of Instruction] tool. Colleges and Universities in the United States use Student Ratings of Instruction [SRI] for course evaluation purposes (Osler and…

  6. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  7. MQSA National Statistics

    MedlinePlus

    ... but should level off with time. Archived Scorecard Statistics: 2017 Scorecard Statistics, 2016 Scorecard Statistics (Archived), 2015 ...

  8. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  9. Statistical physics "Beyond equilibrium"

    SciTech Connect

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  10. Tool Carrier

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Tool organizer accommodates a selection of hand tools on a waist or thigh belt or alternately on wall, work bench, or car trunk mountings. Tool caddy is widely used by industrial maintenance personnel, TV technicians, mechanics, artists, draftsmen, hobbyists and homeowners. Its innovative feature is rows of flexible vinyl "fingers" like the bristles of a hairbrush which mesh together to hold the tool securely in place yet allow easy insertion or withdrawal. Product is no longer commercially available.

  11. Statistics used in current nursing research.

    PubMed

    Zellner, Kathleen; Boerst, Connie J; Tabb, Wil

    2007-02-01

    Undergraduate nursing research courses should emphasize the statistics most commonly used in the nursing literature to strengthen students' and beginning researchers' understanding of them. To determine the most commonly used statistics, we reviewed all quantitative research articles published in 13 nursing journals in 2000. The findings supported Beitz's categorization of kinds of statistics. Ten primary statistics used in 80% of nursing research published in 2000 were identified. We recommend that the appropriate use of those top 10 statistics be emphasized in undergraduate nursing education and that the nursing profession continue to advocate for the use of methods (e.g., power analysis, odds ratio) that may contribute to the advancement of nursing research.
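
    As a small illustration of one of the methods named above, the sketch below computes an odds ratio with an approximate 95% confidence interval from a 2x2 table; the counts are hypothetical and chosen only for illustration.

      # Odds ratio with an approximate 95% CI from a 2x2 table (hypothetical counts).
      import math

      def odds_ratio(a, b, c, d):
          """2x2 table: a, b = exposed with/without outcome; c, d = unexposed with/without."""
          or_ = (a * d) / (b * c)
          se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
          lo = math.exp(math.log(or_) - 1.96 * se_log)
          hi = math.exp(math.log(or_) + 1.96 * se_log)
          return or_, (lo, hi)

      print(odds_ratio(30, 70, 15, 85))   # OR about 2.4, 95% CI roughly (1.2, 4.9)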

  12. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  13. Percussion tool

    DOEpatents

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer and which imparts reciprocal movement to the reciprocally moveable hammer.

  14. Taking a statistical approach

    SciTech Connect

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. Geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
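
    The spatial-structure idea described above (samples close together are more similar than samples far apart) is usually quantified with an empirical semivariogram, gamma(h) = 0.5 * mean[(z_i - z_j)^2] over point pairs separated by roughly distance h. The Python sketch below computes this on synthetic data; it is illustrative only and is not a full kriging workflow.

      # Empirical semivariogram on synthetic scattered samples (illustration only).
      import numpy as np

      rng = np.random.default_rng(1)
      coords = rng.uniform(0, 100, size=(200, 2))                          # hypothetical sample locations (m)
      values = np.sin(coords[:, 0] / 15.0) + 0.2 * rng.normal(size=200)    # smooth spatial field plus noise

      # Pairwise separation distances and squared value differences.
      diff_xy = coords[:, None, :] - coords[None, :, :]
      dists = np.sqrt((diff_xy ** 2).sum(-1))
      sq_diff = (values[:, None] - values[None, :]) ** 2

      # Average 0.5 * squared difference within distance (lag) bins.
      bins = np.arange(0, 60, 10)
      for lo, hi in zip(bins[:-1], bins[1:]):
          mask = (dists > lo) & (dists <= hi) & (dists > 0)
          gamma = 0.5 * sq_diff[mask].mean()
          print(f"lag {lo:2.0f}-{hi:2.0f} m: gamma = {gamma:.3f}")  # rises with lag if spatially correlated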

  15. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the available rooftop area. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives, and building life-cycle costs including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
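
    As a rough sketch of how the life-cycle metrics named above (simple payback, discounted payback, net present value, and savings-to-investment ratio) can be computed from a uniform annual savings stream, the Python below uses hypothetical inputs and is not the tool's own calculation method.

      # Hedged sketch of common life-cycle cost metrics from a uniform annual savings stream.
      def financial_metrics(install_cost, annual_savings, discount_rate, years):
          simple_payback = install_cost / annual_savings
          # Present value of each year's savings.
          pv = [annual_savings / (1 + discount_rate) ** t for t in range(1, years + 1)]
          npv = sum(pv) - install_cost
          sir = sum(pv) / install_cost                  # savings-to-investment ratio
          # Discounted payback: first year the cumulative present value covers the cost.
          cum, discounted_payback = 0.0, None
          for t, p in enumerate(pv, start=1):
              cum += p
              if cum >= install_cost:
                  discounted_payback = t
                  break
          return simple_payback, discounted_payback, npv, sir

      # Hypothetical project: $20,000 installed cost, $2,500/yr savings, 5% discount rate, 25 years.
      print(financial_metrics(install_cost=20000.0, annual_savings=2500.0,
                              discount_rate=0.05, years=25))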

  16. Perspectives on statistics education: observations from statistical consulting in an academic nursing environment.

    PubMed

    Hayat, Matthew J; Schmiege, Sarah J; Cook, Paul F

    2014-04-01

    Statistics knowledge is essential for understanding the nursing and health care literature, as well as for applying rigorous science in nursing research. Statistical consultants providing services to faculty and students in an academic nursing program have the opportunity to identify gaps and challenges in statistics education for nursing students. This information may be useful to curriculum committees and statistics educators. This article aims to provide perspective on statistics education stemming from the experiences of three experienced statistics educators who regularly collaborate and consult with nurse investigators. The authors share their knowledge and express their views about data management, data screening and manipulation, statistical software, types of scientific investigation, and advanced statistical topics not covered in the usual coursework. The suggestions provided promote a call for data to study these topics. Relevant data about statistics education can assist educators in developing comprehensive statistics coursework for nursing students.

  17. THE ATMOSPHERIC MODEL EVALUATION TOOL

    EPA Science Inventory

    This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...
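
    Typical paired model-versus-observation statistics used in this kind of evaluation include mean bias, root-mean-square error, and correlation; the sketch below computes them on synthetic arrays and is illustrative only, not the evaluation tool's own code.

      # Common paired model-vs-observation statistics on synthetic data (illustration only).
      import numpy as np

      rng = np.random.default_rng(2)
      observed = rng.uniform(10, 40, size=100)                 # e.g. hypothetical hourly observations
      modeled = observed + rng.normal(1.5, 4.0, size=100)      # model with a slight positive bias

      mean_bias = np.mean(modeled - observed)
      rmse = np.sqrt(np.mean((modeled - observed) ** 2))
      corr = np.corrcoef(modeled, observed)[0, 1]
      print(f"bias={mean_bias:.2f}  rmse={rmse:.2f}  r={corr:.2f}")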

  18. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  19. Florida Library Directory with Statistics, 2001.

    ERIC Educational Resources Information Center

    Taylor-Furbee, Sondra, Comp.; Kellenberger, Betsy, Comp.

    The annual "Florida Library Directory with Statistics" is intended to be a tool for library staff to present vital statistical information on budgets, collections, and services to local, state, and national policymakers. As with previous editions, this 2001 edition includes the names, addresses, telephone numbers, and other information…

  20. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…