Sample records for statistical mechanical tools

  1. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  2. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
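
    As a concrete illustration of the class of deterministic thermostats the abstract refers to (not the authors' own scheme), the standard Nosé-Hoover equations extend Hamiltonian dynamics with a single feedback variable ξ that steers the kinetic energy towards its canonical value:

        \[ \dot{q}_i = \frac{p_i}{m_i}, \qquad \dot{p}_i = F_i(q) - \xi\, p_i, \qquad \dot{\xi} = \frac{1}{Q}\left(\sum_i \frac{p_i^2}{m_i} - N_f k_B T\right) \]

    where Q is the thermostat "mass" and N_f the number of degrees of freedom; the dynamics is constructed so that the canonical measure is invariant, which is exactly the property the proposed principle generalizes.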

  3. New Statistics for Testing Differential Expression of Pathways from Microarray Data

    NASA Astrophysics Data System (ADS)

    Siu, Hoicheong; Dong, Hua; Jin, Li; Xiong, Momiao

    Exploring biological meaning from microarray data is very important but remains a great challenge. Here, we developed three new statistics: the linear combination test, the quadratic test and the de-correlation test, to identify differentially expressed pathways from gene expression profiles. We applied our statistics to two rheumatoid arthritis datasets. Notably, our results reveal three significant pathways and 275 genes in common between the two datasets. The pathways we found are meaningful for uncovering the disease mechanisms of rheumatoid arthritis, which implies that our statistics are a powerful tool for functional analysis of gene expression data.

  4. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
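
    For orientation, the κ-generalized framework mentioned here is built on Kaniadakis' κ-exponential; in the income-distribution literature the model's survival function is usually quoted in the form below (a sketch of the standard expressions, not a substitute for the paper's derivation):

        \[ \exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa}, \qquad P(X > x) = \exp_\kappa\!\left(-\left(\frac{x}{\beta}\right)^{\alpha}\right) \]

    which reduces to the ordinary stretched exponential as κ → 0 and develops a Pareto power-law tail proportional to x^{-α/κ} for large incomes.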

  5. Simulation of Medical Imaging Systems: Emission and Transmission Tomography

    NASA Astrophysics Data System (ADS)

    Harrison, Robert L.

    Simulation is an important tool in medical imaging research. In patient scans the true underlying anatomy and physiology is unknown. We have no way of knowing in a given scan how various factors are confounding the data: statistical noise; biological variability; patient motion; scattered radiation, dead time, and other data contaminants. Simulation allows us to isolate a single factor of interest, for instance when researchers perform multiple simulations of the same imaging situation to determine the effect of statistical noise or biological variability. Simulations are also increasingly used as a design optimization tool for tomographic scanners. This article gives an overview of the mechanics of emission and transmission tomography simulation, reviews some of the publicly available simulation tools, and discusses trade-offs between the accuracy and efficiency of simulations.

  6. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative computing infrastructure project specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with the technical and human facilities needed to wrap, manage, and use specialized noncommercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of both human and material resources. The project is also open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface for choosing applications, introducing data, and selecting appropriate constraints in an intuitive way. The application is then executed, in real time whenever possible; the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of the results (graphical representation of data, statistical analysis, or other manipulations), is shown via the same web interface or can be downloaded to the user's computer.

  7. A new approach to fracture modelling in reservoirs using deterministic, genetic and statistical models of fracture growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rawnsley, K.; Swaby, P.

    1996-08-01

    It is increasingly acknowledged that in order to understand and forecast the behavior of fracture influenced reservoirs we must attempt to reproduce the fracture system geometry and use this as a basis for fluid flow calculation. This article aims to present a recently developed fracture modelling prototype designed specifically for use in hydrocarbon reservoir environments. The prototype "FRAME" (FRActure Modelling Environment) aims to provide a tool which will allow the generation of realistic 3D fracture systems within a reservoir model, constrained to the known geology of the reservoir by both mechanical and statistical considerations, and which can be used as a basis for fluid flow calculation. Two newly developed modelling techniques are used. The first is an interactive tool which allows complex fault surfaces and their associated deformations to be reproduced. The second is a "genetic" model which grows fracture patterns from seeds using conceptual models of fracture development. The user defines the mechanical input and can retrieve all the statistics of the growing fractures to allow comparison to assumed statistical distributions for the reservoir fractures. Input parameters include growth rate, fracture interaction characteristics, orientation maps and density maps. More traditional statistical stochastic fracture models are also incorporated. FRAME is designed to allow the geologist to input hard or soft data including seismically defined surfaces, well fractures, outcrop models, analogue or numerical mechanical models or geological "feeling". The geologist is not restricted to "a priori" models of fracture patterns that may not correspond to the data.

  8. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, is a complex component system. Applying mathematical statistics methods to research on the compatibility of TCM formulae has great significance for promoting the modernization of traditional Chinese medicines and for improving the clinical efficacy and optimization of formulae. As a tool for quantitative analysis, data inference and the exploration of inherent rules of substances, mathematical statistics methods can reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. By reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy and changes of chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.

  9. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.
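
    As a textbook illustration of the second-moment reliability statistics mentioned above (not the paper's PFEM formulation): for a limit state g = R − S with independent, normally distributed resistance R and load effect S,

        \[ \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^2 + \sigma_S^2}}, \qquad P_f = \Phi(-\beta) \]

    where β is the reliability index and Φ the standard normal CDF; the PFEM supplies the means and variances entering such expressions for more general limit states.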

  10. Insights into teaching quantum mechanics in secondary and lower undergraduate education

    NASA Astrophysics Data System (ADS)

    Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.

    2017-06-01

    This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.

  11. An analysis of a large dataset on immigrant integration in Spain. The Statistical Mechanics perspective on Social Action

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

    2014-02-01

    How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both observed growth behaviors. A linear theory, which ignores the possibility of interaction effects, would instead underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.

  12. Statistical mechanics of competitive resource allocation using agent-based models

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.

    2015-01-01

    Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
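
    A minimal sketch of one of the reviewed models, the Minority Game, can make the setting concrete (parameter values are illustrative and not taken from the article):

        # Minimal Minority Game sketch (illustrative parameters, not from the review).
        import numpy as np

        rng = np.random.default_rng(0)
        N, M, S, T = 301, 3, 2, 2000          # agents (odd), memory, strategies/agent, steps
        P = 2 ** M                             # number of distinct histories
        strategies = rng.choice([-1, 1], size=(N, S, P))   # fixed lookup tables
        scores = np.zeros((N, S))              # virtual scores of each strategy
        history = rng.integers(P)              # current history encoded as an integer
        attendance = []

        for t in range(T):
            best = scores.argmax(axis=1)                       # each agent plays its best strategy
            actions = strategies[np.arange(N), best, history]  # -1 or +1
            A = actions.sum()
            minority = -np.sign(A)                             # the minority side wins
            attendance.append(A)
            # reward strategies that would have chosen the minority side
            scores += (strategies[:, :, history] == minority)
            history = ((history << 1) | (1 if minority > 0 else 0)) % P

        print("volatility sigma^2/N:", np.var(attendance) / N)

    Varying the ratio P/N in such a simulation reproduces the phase transition and volatility behavior that the statistical-mechanics treatments reviewed here explain analytically.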

  13. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated with systems with density fluctuations, leading to observable effects on phase transitions. Using current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematically equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to physical intuition when modelling complex systems. An example of the application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  14. Occupational injuries in automobile repair workers.

    PubMed

    Vyas, Heer; Das, Subir; Mehta, Shashank

    2011-01-01

    Mechanics are exposed to varied work stressors such as hot noisy environments, strenuous postures, improperly designed tools and machinery and poor psycho-social environments which may exert an influence on their health and safety. The study aimed to examine the occupational injury patterns and identify work stressors associated with injury amongst automobile mechanics. A descriptive ergonomic checklist and questionnaire on general health and psycho-social issues were administered to male workers (N=153). The relative risk factors and correlation statistics were used to identify the work stressors associated with occupational injury. 63% of the workers reported injuries. Cuts were the chief injuries being reported. Poor work environment, machinery and tool characteristics, suffering from poor health and psycho-social stressors were associated with injury occurrence amongst automobile repair workers.

  15. Statistical Tools And Artificial Intelligence Approaches To Predict Fracture In Bulk Forming Processes

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, R.; Ingarao, G.; Fonti, V.

    2007-05-01

    The crucial task in the prevention of ductile fracture is the availability of a tool for predicting the occurrence of such defects. The technical literature presents extensive investigation of this topic, and many contributions have been made by authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function which depends on stress and strain paths, and ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback related to the use of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e. it works very well for certain processes but may give poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is a tool of general reliability, i.e. one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the use of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence or absence based on a set of stress and strain path data. The proposed approach uses the experimental data available, for a given material, on fracture occurrence in different processes. In more detail, the approach consists of the analysis of experimental tests in which fracture occurs, followed by numerical simulations of such processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build up a proper data set, which was utilized both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method were compared with those obtained from an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.
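
    Purely as an illustration of the kind of comparison described (a hedged sketch with synthetic data, not the authors' data set or network architecture), one can train a statistical classifier and a small neural network on the same stress-strain path features and compare their fracture predictions:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Synthetic stand-in for stress-strain path features (e.g. triaxiality, peak strain).
        X = rng.normal(size=(400, 2))
        y = (1.2 * X[:, 0] + 0.8 * X[:, 1] ** 2 + rng.normal(0, 0.3, 400) > 1.0).astype(int)  # fracture yes/no

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        stat_model = LogisticRegression().fit(X_tr, y_tr)            # "statistical tool"
        ann_model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
        print("statistical model accuracy:", stat_model.score(X_te, y_te))
        print("neural network accuracy:   ", ann_model.score(X_te, y_te))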

  16. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003) If...

  17. Computer aided drug design

    NASA Astrophysics Data System (ADS)

    Jain, A.

    2017-08-01

    Computer-based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, thereby saving time as well as cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing and storing models of complex molecular structures, which can help to interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, together with statistical analysis, is a powerful tool for medicinal chemists to design and synthesize therapeutic, effective drugs with minimum side effects.

  18. Graphical Tests for Power Comparison of Competing Designs.

    PubMed

    Hofmann, H; Follett, L; Majumder, M; Cook, D

    2012-12-01

    Lineups have been established as tools for visual testing similar to standard statistical inference tests, allowing us to evaluate the validity of graphical findings in an objective manner. In simulation studies lineups have been shown to be efficient: the power of visual tests is comparable to classical tests while being much less stringent in terms of the distributional assumptions made. This makes lineups versatile, yet powerful, tools in situations where conditions for regular statistical tests are not or cannot be met. In this paper we introduce lineups as a tool for evaluating the power of competing graphical designs. We highlight some of the theoretical properties and then show results from two studies evaluating competing designs: both studies are designed to go to the limits of our perceptual abilities to highlight differences between designs. We use both accuracy and speed of evaluation as measures of a successful design. The first study compares the choice of coordinate system: polar versus Cartesian coordinates. The results show strong support in favor of Cartesian coordinates for finding fast and accurate answers when spotting patterns. The second study is aimed at finding shift differences between distributions. Both studies are motivated by data problems that we have recently encountered, and explore using simulated data to evaluate the plot designs under controlled conditions. Amazon Mechanical Turk (MTurk) is used to conduct the studies. The lineups provide an effective mechanism for objectively evaluating plot designs.
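
    A sketch of the lineup protocol itself may help: the plot of the observed data is hidden among null plots generated under the null hypothesis (here by permuting y); layout and data below are illustrative, not taken from the paper's studies.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 1, 80)
        y = 0.6 * x + rng.normal(0, 0.3, 80)    # observed data with a weak trend

        m = 20                                   # lineup size
        true_panel = rng.integers(m)             # position of the real plot, kept secret
        fig, axes = plt.subplots(4, 5, figsize=(10, 8), sharex=True, sharey=True)
        for i, ax in enumerate(axes.flat):
            yy = y if i == true_panel else rng.permutation(y)   # null plots break the x-y association
            ax.scatter(x, yy, s=8)
            ax.set_title(str(i + 1), fontsize=8)
        plt.tight_layout()
        plt.show()
        # A viewer who reliably picks panel `true_panel + 1` provides evidence of real structure;
        # under the null hypothesis the pick is correct with probability 1/20.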

  19. Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Ohzeki, Masayuki

    2013-09-01

    In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limit of the surface code for restoring the quantum state, the accuracy threshold, has a close connection with the phase transition in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system with complicated interactions whose signs change depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can be used to reveal one of the most interesting parts of quantum information processing. We show how to import this special tool from statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we present another interesting technique that exploits quantum nature: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely to solve an optimization problem. The most typical instance is the traveling salesman problem of finding the minimum-length tour visiting all cities. In quantum annealing, we introduce quantum fluctuations to drive a system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we wish to solve. Inducing the quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations so as to efficiently reach the ground state. Such a generic framework is called quantum annealing. Its most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for some optimization problems. We introduce several recent attempts to overcome this weak point using developments in statistical mechanics. Through both topics, we shed light on the birth of an interdisciplinary field between quantum mechanics and statistical mechanics.
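
    For reference, the quantum-annealing construction described here is usually written with a time-dependent Hamiltonian of the standard transverse-field Ising form (a generic sketch, not the chapter's specific notation):

        \[ H(s) = s\, H_{\mathrm{problem}} + (1-s)\, H_{\mathrm{driver}}, \qquad H_{\mathrm{problem}} = -\sum_{i<j} J_{ij}\, \sigma_i^z \sigma_j^z - \sum_i h_i\, \sigma_i^z, \qquad H_{\mathrm{driver}} = -\Gamma \sum_i \sigma_i^x \]

    where s is swept from 0 to 1 so that the transverse-field quantum fluctuations are gradually removed and, if the sweep is slow enough (adiabatic theorem), the system ends in the ground state encoding the optimal solution.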

  20. Does drug price-regulation affect healthcare expenditures?

    PubMed

    Ben-Aharon, Omer; Shavit, Oren; Magnezi, Racheli

    2017-09-01

    Increasing health costs in developed countries are a major concern for decision makers. A variety of cost containment tools are used to control this trend, including maximum price regulation and reimbursement methods for health technologies. Information regarding expenditure-related outcomes of these tools is not available. To evaluate the association between different cost-regulating mechanisms and national health expenditures in selected countries. Price-regulating and reimbursement mechanisms for prescription drugs among OECD countries were reviewed. National health expenditure indices for 2008-2012 were extracted from OECD statistical sources. Possible associations between characteristics of different systems for regulation of drug prices and reimbursement and health expenditures were examined. In most countries, reimbursement mechanisms are part of publicly financed plans. Maximum price regulation is composed of reference-pricing, either of the same drug in other countries, or of therapeutic alternatives within the country, as well as value-based pricing (VBP). No association was found between price regulation or reimbursement mechanisms and healthcare costs. However, VBP may present a more effective mechanism, leading to reduced costs in the long term. Maximum price and reimbursement mechanism regulations were not found to be associated with cost containment of national health expenditures. VBP may have the potential to do so over the long term.

  1. Origin of Pareto-like spatial distributions in ecosystems.

    PubMed

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that these patch statistics are a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.
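
    The "linear scaling with cluster size" invoked above corresponds, generically, to birth-death transition rates of the form below (a schematic sketch of the mapping, with λ and μ as effective per-individual rates, not the paper's calibrated parameters):

        \[ w(n \to n+1) = \lambda\, n, \qquad w(n \to n-1) = \mu\, n \]

    so that the expected relative growth of a colony is independent of its size, which is the content of the law of proportionate effect underlying the Pareto-like size statistics.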

  2. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
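
    Since methylation levels live on [0, 1], "capturing the bounded nature" of the data typically means using distributions such as the beta; the following minimal scipy sketch (illustrative only, not the specific models compared in the paper) fits a beta distribution to bounded data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        beta_values = rng.beta(a=2.0, b=8.0, size=1000)    # stand-in for methylation beta-values in [0, 1]

        # Fit a beta distribution with the support fixed to [0, 1].
        a_hat, b_hat, loc, scale = stats.beta.fit(beta_values, floc=0, fscale=1)
        print(f"fitted shape parameters: a = {a_hat:.2f}, b = {b_hat:.2f}")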

  3. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  4. A toolbox for determining subdiffusive mechanisms

    NASA Astrophysics Data System (ADS)

    Meroz, Yasmine; Sokolov, Igor M.

    2015-04-01

    Subdiffusive processes have become a field of great interest in the last decades, due to mounting experimental evidence of subdiffusive behavior in complex systems, and especially in biological systems. Different physical scenarios leading to subdiffusion differ in the details of the dynamics. These differences are what allow one to theoretically reconstruct the underlying physics from the results of observations, and they are the topic of this review. We review the main statistical analyses available today to distinguish between these scenarios, categorizing them according to the relevant characteristics. We collect the available tools and statistical tests, presenting them within a broader perspective. We also consider possible complications such as the subordination of subdiffusive mechanisms. Due to the advances in single particle tracking experiments in recent years, we focus on the relevant case where the available experimental data are scant, at the level of single trajectories.
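
    One of the basic single-trajectory statistics such a toolbox relies on is the time-averaged mean-squared displacement and its power-law exponent; a minimal sketch follows (illustrative only, the review covers many further and more discriminating tests):

        import numpy as np

        def time_averaged_msd(x, max_lag):
            """delta^2(tau) = <(x(t+tau) - x(t))^2>_t for lags 1..max_lag."""
            return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in range(1, max_lag + 1)])

        rng = np.random.default_rng(2)
        x = np.cumsum(rng.normal(size=10_000))      # ordinary Brownian trajectory, alpha = 1
        lags = np.arange(1, 101)
        msd = time_averaged_msd(x, 100)
        alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)   # slope of the log-log fit
        print(f"estimated anomalous exponent alpha ~ {alpha:.2f}")   # ~1 for Brownian motion, <1 for subdiffusion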

  5. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates.

    PubMed

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-02-23

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with eventual changes of the mechanical properties of joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints.

  6. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates

    PubMed Central

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-01-01

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with eventual changes of the mechanical properties of joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints. PMID:28773246

  7. Tool geometry and damage mechanisms influencing CNC turning efficiency of Ti6Al4V

    NASA Astrophysics Data System (ADS)

    Suresh, Sangeeth; Hamid, Darulihsan Abdul; Yazid, M. Z. A.; Nasuha, Nurdiyanah; Ain, Siti Nurul

    2017-12-01

    Ti6Al4V, or Grade 5 titanium alloy, is widely used in the aerospace, medical, automotive and fabrication industries due to its distinctive combination of mechanical and physical properties. Ti6Al4V has always been difficult to machine, ironically because of the same mix of properties mentioned earlier. Machining Ti6Al4V has resulted in shorter cutting tool life, which has led to objectionable surface integrity and rapid failure of the machined parts. However, the proven functional relevance of this material has prompted extensive research into the optimization of machining parameters and cutting tool characteristics. Cutting tool geometry plays a vital role in ensuring dimensional and geometric accuracy in machined parts. In this study, an experimental investigation is carried out to optimize the nose radius and relief angles of the cutting tools and their interaction with different levels of machining parameters. The low elastic modulus and thermal conductivity of Ti6Al4V contribute to rapid tool damage, and the impact of these properties on tool tip damage is studied. An experimental design approach is used in the CNC turning of Ti6Al4V to statistically analyze and propose optimum levels of input parameters to lengthen tool life and enhance the surface characteristics of the machined parts. A larger tool nose radius with a straight flank, combined with low feed rates, resulted in desirable surface integrity. The presence of a relief angle proved to aggravate tool damage as well as dimensional instability in the CNC turning of Ti6Al4V.

  8. New statistical potential for quality assessment of protein models and a survey of energy functions

    PubMed Central

    2010-01-01

    Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility, on torsion angles, accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
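
    The knowledge-based potentials surveyed here are generally obtained by an inverse-Boltzmann construction; in generic form (the paper's residue-level potential adds its own shuffled reference state and side-chain orientation terms):

        \[ E(a, b, r) = -k_B T \,\ln \frac{P_{\mathrm{obs}}(a, b, r)}{P_{\mathrm{ref}}(r)} \]

    where P_obs is the distance distribution of the interaction-center pair (a, b) observed in known structures and P_ref the corresponding distribution in the chosen reference state, so the choice of reference state directly shapes the energy scale, consistent with the critical role reported in the Conclusions.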

  9. Student engagement in pharmacology courses using online learning tools.

    PubMed

    Karaksha, Abdullah; Grant, Gary; Anoopkumar-Dukie, Shailendra; Nirthanan, S Niru; Davey, Andrew K

    2013-08-12

    To assess factors influencing student engagement with e-tools used as a learning supplement to the standard curriculum in pharmacology courses. A suite of 148 e-tools (interactive online teaching materials encompassing the basic mechanisms of action for different drug classes) was designed and implemented across 2 semesters for third-year pharmacy students. Student engagement and use of this new teaching strategy were assessed using a survey instrument and usage statistics for the material. Use of e-tools during semester 1 was low, a finding attributable to a majority (75%) of students either being unaware of or forgetting about the embedded e-tools and a few (20%) lacking interest in accessing additional learning materials. In contrast to semester 1, e-tool use significantly increased in semester 2 with the use of frequent reminders and announcements (p<0.001). The provision of online teaching and learning resources was only effective in increasing student engagement after the implementation of a "marketing strategy" that included e-mail reminders and motivation.

  10. Student Engagement in Pharmacology Courses Using Online Learning Tools

    PubMed Central

    Karaksha, Abdullah; Grant, Gary; Anoopkumar-Dukie, Shailendra; Nirthanan, S. Niru

    2013-01-01

    Objective. To assess factors influencing student engagement with e-tools used as a learning supplement to the standard curriculum in pharmacology courses. Design. A suite of 148 e-tools (interactive online teaching materials encompassing the basic mechanisms of action for different drug classes) was designed and implemented across 2 semesters for third-year pharmacy students. Assessment. Student engagement and use of this new teaching strategy were assessed using a survey instrument and usage statistics for the material. Use of e-tools during semester 1 was low, a finding attributable to a majority (75%) of students either being unaware of or forgetting about the embedded e-tools and a few (20%) lacking interest in accessing additional learning materials. In contrast to semester 1, e-tool use significantly increased in semester 2 with the use of frequent reminders and announcements (p<0.001). Conclusion. The provision of online teaching and learning resources was only effective in increasing student engagement after the implementation of a “marketing strategy” that included e-mail reminders and motivation. PMID:23966728

  11. Developing the WCRF International/University of Bristol Methodology for Identifying and Carrying Out Systematic Reviews of Mechanisms of Exposure-Cancer Associations.

    PubMed

    Lewis, Sarah J; Gardner, Mike; Higgins, Julian; Holly, Jeff M P; Gaunt, Tom R; Perks, Claire M; Turner, Suzanne D; Rinaldi, Sabina; Thomas, Steve; Harrison, Sean; Lennon, Rosie J; Tan, Vanessa; Borwick, Cath; Emmett, Pauline; Jeffreys, Mona; Northstone, Kate; Mitrou, Giota; Wiseman, Martin; Thompson, Rachel; Martin, Richard M

    2017-11-01

    Background: Human, animal, and cell experimental studies; human biomarker studies; and genetic studies complement epidemiologic findings and can offer insights into biological plausibility and pathways between exposure and disease, but methods for synthesizing such studies are lacking. We, therefore, developed a methodology for identifying mechanisms and carrying out systematic reviews of mechanistic studies that underpin exposure-cancer associations. Methods: A multidisciplinary team with expertise in informatics, statistics, epidemiology, systematic reviews, cancer biology, and nutrition was assembled. Five 1-day workshops were held to brainstorm ideas; in the intervening periods we carried out searches and applied our methods to a case study to test our ideas. Results: We have developed a two-stage framework, the first stage of which is designed to identify mechanisms underpinning a specific exposure-disease relationship; the second stage is a targeted systematic review of studies on a specific mechanism. As part of the methodology, we also developed an online tool for text mining for mechanism prioritization (TeMMPo) and a new graph for displaying related but heterogeneous data from epidemiologic studies (the Albatross plot). Conclusions: We have developed novel tools for identifying mechanisms and carrying out systematic reviews of mechanistic studies of exposure-disease relationships. In doing so, we have outlined how we have overcome the challenges that we faced and provided researchers with practical guides for conducting mechanistic systematic reviews. Impact: The aforementioned methodology and tools will allow potential mechanisms to be identified and the strength of the evidence underlying a particular mechanism to be assessed. Cancer Epidemiol Biomarkers Prev; 26(11); 1667-75. ©2017 AACR . ©2017 American Association for Cancer Research.

  12. BEAT: Bioinformatics Exon Array Tool to store, analyze and visualize Affymetrix GeneChip Human Exon Array data from disease experiments

    PubMed Central

    2012-01-01

    Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase Chain Reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Last generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon arrays datasets. It combines a data warehouse approach with some rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession Ids, Gene Ontology terms and biochemical pathways annotations are integrated with exon and gene level expression plots. The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968

  13. Characterizing the lung tissue mechanical properties using a micromechanical model of alveolar sac

    NASA Astrophysics Data System (ADS)

    Karami, Elham; Seify, Behzad; Moghadas, Hadi; Sabsalinejad, Masoomeh; Lee, Ting-Yim; Samani, Abbas

    2017-03-01

    According to statistics, lung disease is among the leading causes of death worldwide. As such, many research groups are developing powerful tools for understanding, diagnosis and treatment of various lung diseases. Recently, biomechanical modeling has emerged as an effective tool for better understanding of human physiology, disease diagnosis and computer assisted medical intervention. Mechanical properties of lung tissue are important requirements for methods developed for lung disease diagnosis and medical intervention. As such, the main objective of this study is to develop an effective tool for estimating the mechanical properties of normal and pathological lung parenchyma tissue based on its microstructure. For this purpose, a micromechanical model of the lung tissue was developed using finite element (FE) method, and the model was demonstrated to have application in estimating the mechanical properties of lung alveolar wall. The proposed model was developed by assembling truncated octahedron tissue units resembling the alveoli. A compression test was simulated using finite element method on the created geometry and the hyper-elastic parameters of the alveoli wall were calculated using reported alveolar wall stress-strain data and an inverse optimization framework. Preliminary results indicate that the proposed model can be potentially used to reconstruct microstructural images of lung tissue using macro-scale tissue response for normal and different pathological conditions. Such images can be used for effective diagnosis of lung diseases such as Chronic Obstructive Pulmonary Disease (COPD).
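
    A hedged, much-simplified sketch of the inverse optimization framework described above (a one-parameter incompressible neo-Hookean law standing in for the paper's finite-element alveolar-sac model, with hypothetical data):

        import numpy as np
        from scipy.optimize import least_squares

        def neo_hookean_nominal_stress(mu, stretch):
            # Uniaxial nominal stress of an incompressible neo-Hookean solid.
            return mu * (stretch - stretch ** -2)

        # Hypothetical "measured" stress-stretch data (for demonstration only).
        stretch = np.linspace(1.0, 1.3, 10)
        measured = neo_hookean_nominal_stress(2.5, stretch) + np.random.default_rng(3).normal(0, 0.01, 10)

        residual = lambda mu: neo_hookean_nominal_stress(mu[0], stretch) - measured
        fit = least_squares(residual, x0=[1.0])
        print("estimated shear modulus mu:", fit.x[0])

    In the paper the forward model inside such a loop is the finite-element compression test on the truncated-octahedron geometry, and the fitted quantities are the hyperelastic parameters of the alveolar wall.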

  14. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogeneous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, both to save money and to recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.
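
    For reference, the first-order partial correlation used in such analyses removes the influence of a third variable z from the correlation between x and y:

        \[ \rho_{xy\cdot z} = \frac{\rho_{xy} - \rho_{xz}\,\rho_{yz}}{\sqrt{(1-\rho_{xz}^2)(1-\rho_{yz}^2)}} \]

    values close to zero then indicate that an apparent x-y correlation is explained by their common dependence on z.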

  15. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept, not through abstract and logical analysis but by simply increasing his technical experiences with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.
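
    The "exact formula for the electrical conductivity" referred to here is the Kubo formula; in its classical Green-Kubo form it reads (quoted in standard textbook notation, not from the review itself):

        \[ \sigma = \frac{1}{V k_B T} \int_0^{\infty} \langle J(t)\, J(0) \rangle \, dt \]

    expressing the conductivity as the time integral of the equilibrium current autocorrelation function, the prototypical linear-response result whose physical content the review sets out to defend.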

  16. Thermodynamics of Biological Processes

    PubMed Central

    Garcia, Hernan G.; Kondev, Jane; Orme, Nigel; Theriot, Julie A.; Phillips, Rob

    2012-01-01

    There is a long and rich tradition of using ideas from both equilibrium thermodynamics and its microscopic partner theory of equilibrium statistical mechanics in biology. In this chapter, we provide some background on the origins of the seemingly unreasonable effectiveness of ideas from both thermodynamics and statistical mechanics in biology. After describing these foundational issues, we turn to a series of case studies, primarily focused on binding, that are intended to illustrate the broad biological reach of equilibrium thinking in biology. These case studies include ligand-gated ion channels, thermodynamic models of transcription, and recent applications to the problem of bacterial chemotaxis. As part of the description of these case studies, we explore a number of different uses of the famed Monod–Wyman–Changeux (MWC) model as a generic tool for providing a mathematical characterization of two-state systems. These case studies should provide a template for tailoring equilibrium ideas to other problems of biological interest. PMID:21333788
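
    In one common formulation of the MWC model mentioned above (a generic two-state form, with notation that may differ from the chapter's), the probability that an n-site receptor is in its active (R) state at ligand concentration c is

        \[ p_{\mathrm{active}}(c) = \frac{\left(1 + c/K_R\right)^n}{\left(1 + c/K_R\right)^n + L\left(1 + c/K_T\right)^n} \]

    where K_R and K_T are the ligand dissociation constants of the active and inactive conformations and L is their equilibrium ratio in the absence of ligand.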

  17. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  18. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  19. Provably unbounded memory advantage in stochastic simulation using quantum mechanics

    NASA Astrophysics Data System (ADS)

    Garner, Andrew J. P.; Liu, Qing; Thompson, Jayne; Vedral, Vlatko; Gu, Mile

    2017-10-01

    Simulating the stochastic evolution of real quantities on a digital computer requires a trade-off between the precision to which these quantities are approximated, and the memory required to store them. The statistical accuracy of the simulation is thus generally limited by the internal memory available to the simulator. Here, using tools from computational mechanics, we show that quantum processors with a fixed finite memory can simulate stochastic processes of real variables to arbitrarily high precision. This demonstrates a provable, unbounded memory advantage that a quantum simulator can exhibit over its best possible classical counterpart.

  20. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    Experience in developing integrated optical devices, nonlinear magnetic-optic materials, high frequency modulators, computer-aided modeling and sophisticated... high-level presentation and distributed control models for integrating heterogeneous mechanical engineering applications and tools. The design is focused... statistically accurate worst case device models for circuit simulation. Present methods of worst case device design are ad hoc and do not allow the

  1. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
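
    For context, the classical MS-CG force-matching functional that the paper generalizes to quantum Boltzmann statistics can be written schematically as (standard classical form; the quantum version replaces the ensemble average by imaginary-time path-integral averages):

        \[ \chi^2[\mathbf{F}] = \frac{1}{3N} \left\langle \sum_{I=1}^{N} \left| \mathbf{f}_I(\mathbf{r}^n) - \mathbf{F}_I\!\left(M_R^N(\mathbf{r}^n)\right) \right|^2 \right\rangle \]

    minimized over trial CG force fields F, where f_I are the fine-grained forces mapped onto CG site I and M_R^N is the mapping from atomistic to CG coordinates.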

  2. On modelling the interaction between two rotating bodies with statistically distributed features: an application to dressing of grinding wheels

    NASA Astrophysics Data System (ADS)

    Spampinato, A.; Axinte, D. A.

    2017-12-01

    The mechanisms of interaction between bodies with statistically arranged features present characteristics common to different abrasive processes, such as dressing of abrasive tools. In contrast with the current empirical approach used to estimate the results of operations based on attritive interactions, the method we present in this paper allows us to predict the output forces and the topography of a simulated grinding wheel for a set of specific operational parameters (speed ratio and radial feed-rate), providing a thorough understanding of the complex mechanisms regulating these processes. In modelling the dressing mechanisms, the abrasive characteristics of both bodies (grain size, geometry, inter-space and protrusion) are first simulated; thus, their interaction is simulated in terms of grain collisions. Exploiting a specifically designed contact/impact evaluation algorithm, the model simulates the collisional effects of the dresser abrasives on the grinding wheel topography (grain fracture/break-out). The method has been tested for the case of a diamond rotary dresser, predicting output forces within less than 10% error and obtaining experimentally validated grinding wheel topographies. The study provides a fundamental understanding of the dressing operation, enabling the improvement of its performance in an industrial scenario, while being of general interest in modelling collision-based processes involving statistically distributed elements.

  3. Efficient exploration of pan-cancer networks by generalized covariance selection and interactive web content

    PubMed Central

    Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven

    2015-01-01

    Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
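
    As a minimal, single-platform sketch of the sparse inverse covariance selection idea, the snippet below uses scikit-learn's GraphicalLasso on a samples-by-features matrix and reads direct links off the nonzero off-diagonal entries of the estimated precision matrix. The published method generalizes this to multiple data types and cancers, which is not reproduced here; the toy data are invented.

        import numpy as np
        from sklearn.covariance import GraphicalLasso

        # Toy sparse inverse covariance (graphical lasso) sketch: rows are samples
        # (e.g. tumours), columns are features (e.g. gene expression levels).
        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 20))
        X[:, 1] += 0.8 * X[:, 0]          # induce one direct dependency
        X[:, 2] += 0.8 * X[:, 1]

        model = GraphicalLasso(alpha=0.1).fit(X)
        precision = model.precision_       # nonzero off-diagonal entries ~ direct links
        links = np.argwhere(np.abs(np.triu(precision, k=1)) > 1e-3)
        print("candidate direct links:", links.tolist())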

  4. Formation of a deposit on workpiece surface in polishing nonmetallic materials

    NASA Astrophysics Data System (ADS)

    Filatov, Yu. D.; Monteil, G.; Sidorko, V. I.; Filatov, O. Y.

    2013-05-01

    Over the last decades, the theory of machining nonmetallic materials has seen serious advances in applying fundamental scientific approaches to the grinding and polishing technologies for high-quality precision surfaces of electronic components, optical systems, and decorative articles made of natural and synthetic stone [1-9]. These achievements include a cluster model of material removal in polishing dielectric workpieces [1-3, 6-7] and a physical-statistical model of formation of debris (wear) particles and removal thereof from a workpiece surface [8-10]. The aforesaid models made it possible to calculate, without recourse to Preston's linear law, the removal rate in polishing nonmetallic materials and the wear intensity for bound-abrasive tools. Equally important for the investigation of the workpiece surface generation mechanism and formation of debris particles are the kinetic functions of surface roughness and reflectance of glass and quartz workpiece surfaces, which have been established directly in the course of polishing. During the in situ inspection of a workpiece surface by laser ellipsometry [11] and reflectometry [12] it was found that the periodic change of the light reflection coefficient of a workpiece surface being polished is attributed to the formation of fragments of a deposit consisting of work material particles (debris particles) and tool wear particles [13, 14]. The subsequent studies of the mechanism of interaction between the debris particles and wear particles in the tool-workpiece contact zone, which were carried out based on classical concepts [15, 16], yielded some unexpected results. It was demonstrated that electrically charged debris and wear particles, which are located in the coolant-filled gap between a tool and a workpiece, move along closed circular trajectories enclosed in spheres measuring less than one fifth of the gap thickness. This implies that the probability of the debris and wear particles reaching the tool and workpiece surfaces and, especially, getting localized on the surfaces is extremely low, which contradicts the results of experimental examination of these surfaces. Based on the quantum-mechanical description of the process of scattering of the debris and wear particles that are as small as 3-4 nm in the tool-workpiece contact zone, the mechanism of formation of a workpiece microrelief and the mechanism of formation of a debris-particle deposit on the tool surface were clarified [17-21]. However, the mechanism of formation of the deposit fragments and their discrete arrangement on the workpiece surface in the process of polishing with a bound-abrasive tool has not been studied yet.

  5. Uterine Cancer Statistics

    MedlinePlus

    Uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations Tool makes ...

  6. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    PubMed

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but also that this information can be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.

  7. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    PubMed

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named the ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
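
    The sketch below is not IBiSA_tools itself but a generic illustration, using networkx, of how an ion-binding state graph could be assembled from per-frame state labels and written to the standard GML format mentioned in the abstract; the state labels and attributes are invented.

        import networkx as nx
        from collections import Counter

        # Sketch: build an "ion-binding state graph" from a time series of binding
        # states (one label per MD frame) and write it in GML for e.g. Cytoscape.
        # The state labels below are invented for illustration.
        states = ["S0-S2", "S0-S2", "S1-S3", "S1-S3", "S2-S4", "S1-S3", "S2-S4"]

        transitions = Counter(zip(states[:-1], states[1:]))
        G = nx.DiGraph()
        for state, count in Counter(states).items():
            G.add_node(state, occupancy=count / len(states))
        for (src, dst), count in transitions.items():
            if src != dst:
                G.add_edge(src, dst, weight=count)

        nx.write_gml(G, "ion_binding_states.gml")   # standard GML output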

  8. A study of the mechanical vibrations of a table-top extreme ultraviolet interference nanolithography tool.

    PubMed

    Prezioso, S; De Marco, P; Zuppella, P; Santucci, S; Ottaviano, L

    2010-04-01

    A prototype low cost table-top extreme ultraviolet (EUV) laser source (1.5 ns pulse duration, λ = 46.9 nm) was successfully employed as a laboratory-scale interference nanolithography (INL) tool. Interference patterns were obtained with a simple Lloyd's mirror setup. Periodic structures on polymethylmethacrylate/Si substrates were produced on large areas (8 mm²) with resolutions from 400 to 22.5 nm half pitch (the smallest resolution achieved so far with table-top EUV laser sources). The mechanical vibrations affecting both the laser source and Lloyd's setup were studied to determine if and how they affect the lateral resolution of the lithographic system. The vibration dynamics was described by a statistical model based on the assumption that the instantaneous position of the vibrating mechanical parts follows a normal distribution. An algorithm was developed to simulate the process of sample irradiation under different vibrations. The comparison between simulations and experiments allowed us to estimate the characteristic amplitude of vibrations, which was deduced to be lower than 50 nm. The same algorithm was used to reproduce the expected pattern profiles in the λ/4 half pitch physical resolution limit. In that limit, a nonzero pattern modulation amplitude was obtained from the simulations, comparable to the peak-to-valley height (2-3 nm) measured for the 45 nm spaced fringes, indicating that the mechanical vibrations affecting the INL tool do not represent a limit in scaling down the resolution.
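
    A minimal Monte Carlo sketch of the statistical vibration model described above: each pulse exposes a sinusoidal fringe pattern shifted by a normally distributed offset, and the accumulated dose shows how the fringe modulation degrades. The period, pulse count, and rms amplitude are illustrative values, not the paper's.

        import numpy as np

        # Sketch: estimate how normally distributed vibrations smear a two-beam
        # interference (Lloyd's mirror) exposure. All parameters are assumed.
        period = 45.0          # fringe period, nm
        sigma = 20.0           # rms vibration amplitude, nm (quantity to estimate)
        x = np.linspace(0.0, 4 * period, 1000)

        rng = np.random.default_rng(2)
        n_pulses = 500
        dose = np.zeros_like(x)
        for _ in range(n_pulses):
            shift = rng.normal(0.0, sigma)                   # instantaneous offset
            dose += 0.5 * (1 + np.cos(2 * np.pi * (x - shift) / period))

        dose /= n_pulses
        modulation = (dose.max() - dose.min()) / (dose.max() + dose.min())
        print(f"fringe modulation after vibration blur: {modulation:.2f}")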

  9. Einstein's Approach to Statistical Mechanics: The 1902-04 Papers

    NASA Astrophysics Data System (ADS)

    Peliti, Luca; Rechtman, Raúl

    2017-05-01

    We summarize the papers published by Einstein in the Annalen der Physik in the years 1902-1904 on the derivation of the properties of thermal equilibrium on the basis of the mechanical equations of motion and of the calculus of probabilities. We point out the line of thought that led Einstein to an especially economical foundation of the discipline, and to focus on fluctuations of the energy as a possible tool for establishing the validity of this foundation. We also sketch a comparison of Einstein's approach with that of Gibbs, suggesting that although they obtained similar results, they had different motivations and interpreted them in very different ways.

  10. History, rare, and multiple events of mechanical unfolding of repeat proteins

    NASA Astrophysics Data System (ADS)

    Sumbul, Fidan; Marchesi, Arin; Rico, Felix

    2018-03-01

    Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest imposes important limitations on studying unfolding history and may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty about the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable complex dockerin/cohesin III, revealing its advantages and limitations to assess the unfolding history and to investigate rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust, reproducible, and provides larger statistics than conventional unspecific methods. We show that the method is optimal to reveal the history of unfolding from the very first domain and to detect rare events, while being more limited to assess intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, which is difficult when using unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.

  11. Toward an objective assessment of technical skills: a national survey of surgical program directors in Saudi Arabia.

    PubMed

    Alkhayal, Abdullah; Aldhukair, Shahla; Alselaim, Nahar; Aldekhayel, Salah; Alhabdan, Sultan; Altaweel, Waleed; Magzoub, Mohi Elden; Zamakhshary, Mohammed

    2012-01-01

    After almost a decade of implementing competency-based programs in postgraduate training programs, the assessment of technical skills remains more subjective than objective. National data on the assessment of technical skills during surgical training are lacking. We conducted this study to document the assessment tools for technical skills currently used in different surgical specialties, their relationship with remediation, the recommended tools from the program directors' perspective, and program directors' attitudes toward the available objective tools to assess technical skills. This study was a cross-sectional survey of surgical program directors (PDs). The survey was initially developed using a focus group and was then sent to 116 PDs. The survey contains demographic information about the program, the objective assessment tools used, and the reason for not using assessment tools. The last section discusses the recommended tools to be used from the PDs' perspective and the PDs' attitude and motivation to apply these tools in each program. The associations between the responses to the assessment questions and remediation were statistically evaluated. Seventy-one (61%) participants responded. Of the respondents, 59% mentioned using only nonstandardized, subjective, direct observation for technical skills assessment. Sixty percent use only summative evaluation, whereas 15% perform only formative evaluations of their residents, and the remaining 22% conduct both summative and formative evaluations of their residents' technical skills. Operative portfolios are kept by 53% of programs. The percentage of programs with mechanisms for remediation is 29% (19 of 65). The survey showed that surgical training programs use different tools to assess surgical skills competency. Having a clear remediation mechanism was highly associated with reporting remediation, which reflects the capability to detect struggling residents. Surgical training leadership should invest more in standardizing the assessment of surgical skills.

  12. Optimization of metabolite detection by quantum mechanics simulations in magnetic resonance spectroscopy.

    PubMed

    Gambarota, Giulio

    2017-07-15

    Magnetic resonance spectroscopy (MRS) is a well established modality for investigating tissue metabolism in vivo. In recent years, many efforts by the scientific community have been directed towards the improvement of metabolite detection and quantitation. Quantum mechanics simulations allow for investigations of the MR signal behaviour of metabolites; thus, they provide an essential tool in the optimization of metabolite detection. In this review, we will examine quantum mechanics simulations based on the density matrix formalism. The density matrix was introduced by von Neumann in 1927 to take into account statistical effects within the theory of quantum mechanics. We will discuss the main steps of the density matrix simulation of an arbitrary spin system and show some examples for the strongly coupled two spin system. Copyright © 2016 Elsevier Inc. All rights reserved.
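
    As a toy companion to the density-matrix formalism discussed here, the sketch below builds the thermal equilibrium density matrix of a weakly coupled two-spin system with numpy/scipy; the frequencies, coupling constant, and inverse temperature are arbitrary illustration values, not parameters from the review.

        import numpy as np
        from scipy.linalg import expm

        # Sketch: thermal (equilibrium) density matrix of a weakly coupled two-spin
        # system, the starting point of density-matrix MRS simulations.
        sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)
        I2 = np.eye(2, dtype=complex)

        w1, w2, J = 2 * np.pi * 100.0, 2 * np.pi * 140.0, 2 * np.pi * 7.0   # rad/s
        Iz1, Iz2 = np.kron(sz, I2), np.kron(I2, sz)
        H = w1 * Iz1 + w2 * Iz2 + J * (Iz1 @ Iz2)        # weak-coupling Hamiltonian

        beta = 1.0e-6                                    # high-temperature regime
        rho = expm(-beta * H)
        rho /= np.trace(rho)                             # normalized density matrix
        print("longitudinal polarizations:",
              np.trace(rho @ Iz1).real, np.trace(rho @ Iz2).real)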

  13. miRNet - dissecting miRNA-target interactions and functional associations through network-based visual analysis

    PubMed Central

    Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo

    2016-01-01

    MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNA functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful, fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848

  14. Temperature in and out of equilibrium: A review of concepts, tools and attempts

    NASA Astrophysics Data System (ADS)

    Puglisi, A.; Sarracino, A.; Vulpiani, A.

    2017-11-01

    We review the general aspects of the concept of temperature in equilibrium and non-equilibrium statistical mechanics. Although temperature is an old and well-established notion, it still presents controversial facets. After a short historical survey of the key role of temperature in thermodynamics and statistical mechanics, we tackle a series of issues which have been recently reconsidered. In particular, we discuss different definitions and their relevance for energy fluctuations. The interest in such a topic has been triggered by the recent observation of negative temperatures in condensed matter experiments. Moreover, the ability to manipulate systems at the micro- and nano-scale urges us to understand and clarify some aspects related to the statistical properties of small systems (as the issue of temperature's "fluctuations"). We also discuss the notion of temperature in a dynamical context, within the theory of linear response for Hamiltonian systems at equilibrium and stochastic models with detailed balance, and the generalized fluctuation-response relations, which provide a hint for an extension of the definition of temperature in far-from-equilibrium systems. To conclude, we consider non-Hamiltonian systems, such as granular materials, turbulence and active matter, where a general theoretical framework is still lacking.

  15. Machinability of titanium metal matrix composites (Ti-MMCs)

    NASA Astrophysics Data System (ADS)

    Aramesh, Maryam

    Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials, especially in the area of tool life estimation and tool wear. From the researchers' point of view, polycrystalline diamond (PCD) tools appear by far to be the best choice for machining MMCs. However, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, as the second hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction and a high melting point. Yet, so far CBN tools have not been studied during machining of Ti-MMCs. In this work, a comprehensive study has been performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights in the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool life capacity of cutting tools is also very crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on the survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs. This statistical model takes into account the machining time in addition to the effect of cutting parameters. The resulting estimates showed very good agreement with the experimental results. Moreover, a more advanced model was constructed by adding the tool wear as another variable to the previous model. A new model was thus proposed for estimating the remaining life of worn inserts under different cutting conditions, using the current tool wear data as an input. The predictions of this model were validated against experiments and were consistent with the measured data.
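
    A much-simplified sketch of a parametric tool-life fit in the spirit of survival analysis: a Weibull distribution fitted with scipy to tool-life times at a single cutting condition, from which a B10 life is read off. The data are invented, and the thesis model additionally incorporates cutting parameters and the current wear state as covariates.

        import numpy as np
        from scipy.stats import weibull_min

        # Sketch: fit a Weibull model to tool-life times measured at one cutting
        # condition and read off a reliability-based life estimate.
        tool_life_min = np.array([12.5, 14.1, 10.8, 16.0, 13.3, 11.7, 15.2, 12.9])

        shape, loc, scale = weibull_min.fit(tool_life_min, floc=0.0)
        b10_life = weibull_min.ppf(0.10, shape, loc=loc, scale=scale)   # 10% failures
        print(f"Weibull shape={shape:.2f}, scale={scale:.1f} min, B10 life={b10_life:.1f} min")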

  16. In silico environmental chemical science: properties and processes from statistical and computational modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.

    2017-01-01

    Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
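
    A minimal sketch of the QSAR calibration idea: regress a measured property on descriptors of the kind that could come from molecular modelling. The descriptor names, values, and target property below are invented for illustration.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Sketch of QSAR calibration with computed descriptors: each row holds
        # descriptors for one compound (e.g. E_LUMO, logP, a bond dissociation
        # energy), y is the measured property (e.g. a log degradation rate constant).
        X = np.array([[-0.9, 2.1, 85.0],
                      [-1.3, 3.4, 78.0],
                      [-0.5, 1.2, 92.0],
                      [-1.1, 2.8, 80.0],
                      [-0.7, 1.9, 88.0]])
        y = np.array([-2.1, -0.8, -3.0, -1.2, -2.5])

        qsar = LinearRegression().fit(X, y)
        print("descriptor weights:", np.round(qsar.coef_, 3))
        print("predicted log k for a new compound:",
              qsar.predict([[-1.0, 2.5, 82.0]])[0])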

  17. An argument for mechanism-based statistical inference in cancer

    PubMed Central

    Ochs, Michael; Price, Nathan D.; Tomasetti, Cristian; Younes, Laurent

    2015-01-01

    Cancer is perhaps the prototypical systems disease, and as such has been the focus of extensive study in quantitative systems biology. However, translating these programs into personalized clinical care remains elusive and incomplete. In this perspective, we argue that realizing this agenda—in particular, predicting disease phenotypes, progression and treatment response for individuals—requires going well beyond standard computational and bioinformatics tools and algorithms. It entails designing global mathematical models over network-scale configurations of genomic states and molecular concentrations, and learning the model parameters from limited available samples of high-dimensional and integrative omics data. As such, any plausible design should accommodate: biological mechanism, necessary for both feasible learning and interpretable decision making; stochasticity, to deal with uncertainty and observed variation at many scales; and a capacity for statistical inference at the patient level. This program, which requires a close, sustained collaboration between mathematicians and biologists, is illustrated in several contexts, including learning bio-markers, metabolism, cell signaling, network inference and tumorigenesis. PMID:25381197

  18. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.

    PubMed

    Liu, Xinzijian; Liu, Jian

    2018-03-14

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  19. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems

    NASA Astrophysics Data System (ADS)

    Liu, Xinzijian; Liu, Jian

    2018-03-01

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  20. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for non-Gaussian problems with error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distribution as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
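
    As background for the filtering layer such a toolkit builds on, a minimal scalar Kalman filter is sketched below (constant state, noisy measurements); it is not the desensitized or sigma-point formulation described in the abstract, and all noise values are assumed.

        import numpy as np

        # Minimal scalar Kalman filter sketch: estimate a constant state from
        # noisy measurements (identity dynamics).
        rng = np.random.default_rng(3)
        truth = 5.0
        z = truth + rng.normal(0.0, 1.0, 50)      # measurements, R = 1.0

        x_hat, P = 0.0, 100.0                     # initial estimate and covariance
        Q, R = 1e-4, 1.0                          # process and measurement noise
        for meas in z:
            P += Q                                # predict step
            K = P / (P + R)                       # Kalman gain
            x_hat += K * (meas - x_hat)           # update with measurement residual
            P *= (1.0 - K)
        print(f"estimate {x_hat:.2f} vs truth {truth}")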

  1. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning.

    PubMed

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-06-17

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults.
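
    A small sketch of the kind of time-domain statistical features that are typically extracted from vibration signals before (or alongside) deep feature learning; the synthetic signal and feature set are illustrative and not the paper's exact pipeline.

        import numpy as np
        from scipy.stats import kurtosis, skew

        # Sketch: basic time-domain statistical features from one vibration record.
        rng = np.random.default_rng(4)
        fs = 12_000                                   # sampling rate, Hz (assumed)
        t = np.arange(0, 1.0, 1 / fs)
        signal = np.sin(2 * np.pi * 30 * t) + 0.3 * rng.normal(size=t.size)

        rms = np.sqrt(np.mean(signal ** 2))
        features = {
            "rms": rms,
            "peak": np.max(np.abs(signal)),
            "crest_factor": np.max(np.abs(signal)) / rms,
            "kurtosis": kurtosis(signal),
            "skewness": skew(signal),
        }
        print(features)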

  2. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning

    PubMed Central

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-01-01

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults. PMID:27322273

  3. Extraction of process zones and low-dimensional attractive subspaces in stochastic fracture mechanics

    PubMed Central

    Kerfriden, P.; Schmidt, K.M.; Rabczuk, T.; Bordas, S.P.A.

    2013-01-01

    We propose to identify process zones in heterogeneous materials by tailored statistical tools. The process zone is redefined as the part of the structure where the random process cannot be correctly approximated in a low-dimensional deterministic space. Such a low-dimensional space is obtained by a spectral analysis performed on pre-computed solution samples. A greedy algorithm is proposed to identify both process zone and low-dimensional representative subspace for the solution in the complementary region. In addition to the novelty of the tools proposed in this paper for the analysis of localised phenomena, we show that the reduced space generated by the method is a valid basis for the construction of a reduced order model. PMID:27069423

  4. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  5. Mechanical parameters and flight phase characteristics in aquatic plyometric jumping.

    PubMed

    Louder, Talin J; Searle, Cade J; Bressel, Eadric

    2016-09-01

    Plyometric jumping is a commonly prescribed method of training focused on the development of reactive strength and high-velocity concentric power. Literature suggests that aquatic plyometric training may be a low-impact, effective supplement to land-based training. The purpose of the present study was to quantify acute, biomechanical characteristics of the take-off and flight phase for plyometric movements performed in the water. Kinetic force platform data from 12 young, male adults were collected for counter-movement jumps performed on land and in water at two different immersion depths. The specificity of jumps between environmental conditions was assessed using kinetic measures, temporal characteristics, and an assessment of the statistical relationship between take-off velocity and time in the air. Greater peak mechanical power was observed for jumps performed in the water, and was influenced by immersion depth. Additionally, the data suggest that, in the water, the statistical relationship between take-off velocity and time in air is quadratic. Results highlight the potential application of aquatic plyometric training as a cross-training tool for improving mechanical power and suggest that water immersion depth and fluid drag play key roles in the specificity of the take-off phase for jumping movements performed in the water.

  6. "Dear Fresher …"--How Online Questionnaires Can Improve Learning and Teaching Statistics

    ERIC Educational Resources Information Center

    Bebermeier, Sarah; Nussbeck, Fridtjof W.; Ontrup, Greta

    2015-01-01

    Lecturers teaching statistics are faced with several challenges supporting students' learning in appropriate ways. A variety of methods and tools exist to facilitate students' learning on statistics courses. The online questionnaires presented in this report are a new, slightly different computer-based tool: the central aim was to support students…

  7. A multibody knee model with discrete cartilage prediction of tibio-femoral contact mechanics.

    PubMed

    Guess, Trent M; Liu, Hongzeng; Bhashyam, Sampath; Thiagarajan, Ganesh

    2013-01-01

    Combining musculoskeletal simulations with anatomical joint models capable of predicting cartilage contact mechanics would provide a valuable tool for studying the relationships between muscle force and cartilage loading. As a step towards producing multibody musculoskeletal models that include representation of cartilage tissue mechanics, this research developed a subject-specific multibody knee model that represented the tibia plateau cartilage as discrete rigid bodies that interacted with the femur through deformable contacts. Parameters for the compliant contact law were derived using three methods: (1) simplified Hertzian contact theory, (2) simplified elastic foundation contact theory and (3) parameter optimisation from a finite element (FE) solution. The contact parameters and contact friction were evaluated during a simulated walk in a virtual dynamic knee simulator, and the resulting kinematics were compared with measured in vitro kinematics. The effects on predicted contact pressures and cartilage-bone interface shear forces during the simulated walk were also evaluated. The compliant contact stiffness parameters had a statistically significant effect on predicted contact pressures as well as all tibio-femoral motions except flexion-extension. The contact friction was not statistically significant to contact pressures, but was statistically significant to medial-lateral translation and all rotations except flexion-extension. The magnitude of kinematic differences between model formulations was relatively small, but contact pressure predictions were sensitive to model formulation. The developed multibody knee model was computationally efficient and had a computation time 283 times faster than a FE simulation using the same geometries and boundary conditions.

  8. Statistical Analysis of the Processes Controlling Choline and Ethanolamine Glycerophospholipid Molecular Species Composition

    PubMed Central

    Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey

    2012-01-01

    The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143

  9. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
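
    A minimal sketch of the transition-state-theory (Eyring) rate expression that microkinetic codes of this kind evaluate for each elementary step; the activation free energy and temperature below are arbitrary example values.

        import numpy as np

        # Sketch: Eyring/transition-state-theory rate constant for one elementary step.
        k_B = 1.380649e-23      # J/K
        h = 6.62607015e-34      # J*s
        R = 8.314462618         # J/(mol*K)

        def tst_rate(delta_G_act_kJ_per_mol, T):
            """k = (k_B*T/h) * exp(-dG_act / (R*T)), with dG_act in kJ/mol."""
            return (k_B * T / h) * np.exp(-delta_G_act_kJ_per_mol * 1e3 / (R * T))

        print(f"k(900 K, 120 kJ/mol) = {tst_rate(120.0, 900.0):.3e} s^-1")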

  10. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Yan; Notaro, Michael; Wang, Fuyao

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.

  11. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE PAGES

    Yu, Yan; Notaro, Michael; Wang, Fuyao; ...

    2018-02-05

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.

  12. Natural Selection as Coarsening

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo

    2017-11-01

    Analogies between evolutionary dynamics and statistical mechanics, such as Fisher's second-law-like "fundamental theorem of natural selection" and Wright's "fitness landscapes", have had a deep and fruitful influence on the development of evolutionary theory. Here I discuss a new conceptual link between evolution and statistical physics. I argue that natural selection can be viewed as a coarsening phenomenon, similar to the growth of domain size in quenched magnets or to Ostwald ripening in alloys and emulsions. In particular, I show that the most remarkable features of coarsening—scaling and self-similarity—have strict equivalents in evolutionary dynamics. This analogy has three main virtues: it brings a set of well-developed mathematical tools to bear on evolutionary dynamics; it suggests new problems in theoretical evolution; and it provides coarsening physics with a new exactly soluble model.

  13. Natural Selection as Coarsening

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo

    2018-07-01

    Analogies between evolutionary dynamics and statistical mechanics, such as Fisher's second-law-like "fundamental theorem of natural selection" and Wright's "fitness landscapes", have had a deep and fruitful influence on the development of evolutionary theory. Here I discuss a new conceptual link between evolution and statistical physics. I argue that natural selection can be viewed as a coarsening phenomenon, similar to the growth of domain size in quenched magnets or to Ostwald ripening in alloys and emulsions. In particular, I show that the most remarkable features of coarsening—scaling and self-similarity—have strict equivalents in evolutionary dynamics. This analogy has three main virtues: it brings a set of well-developed mathematical tools to bear on evolutionary dynamics; it suggests new problems in theoretical evolution; and it provides coarsening physics with a new exactly soluble model.

  14. The Precision-Power-Gradient Theory for Teaching Basic Research Statistical Tools to Graduate Students.

    ERIC Educational Resources Information Center

    Cassel, Russell N.

    This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…

  15. Syndromic surveillance of influenza activity in Sweden: an evaluation of three tools.

    PubMed

    Ma, T; Englund, H; Bjelkmar, P; Wallensten, A; Hulth, A

    2015-08-01

    An evaluation was conducted to determine which syndromic surveillance tools complement traditional surveillance by serving as earlier indicators of influenza activity in Sweden. Web queries, medical hotline statistics, and school absenteeism data were evaluated against two traditional surveillance tools. Cross-correlation calculations utilized aggregated weekly data for all-age, nationwide activity for four influenza seasons, from 2009/2010 to 2012/2013. The surveillance tool indicative of earlier influenza activity, by way of statistical and visual evidence, was identified. The web query algorithm and medical hotline statistics performed as well as each other and as the traditional surveillance tools. School absenteeism data were not reliable resources for influenza surveillance. Overall, the syndromic surveillance tools did not perform with enough consistency, either in season lead or in earlier timing of the peak week, to be considered early indicators. They do, however, capture incident cases before they have formally entered the primary healthcare system.
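
    A minimal sketch of the lagged cross-correlation comparison used to judge whether a candidate surveillance series leads a reference series; the synthetic weekly series and lag window below are illustrative, not the study's data.

        import numpy as np

        # Sketch: lagged cross-correlation between a weekly syndromic series (e.g.
        # web queries) and a weekly reference series (e.g. lab-confirmed influenza).
        rng = np.random.default_rng(5)
        weeks = np.arange(52)
        reference = np.exp(-0.5 * ((weeks - 30) / 4.0) ** 2) + 0.05 * rng.normal(size=52)
        candidate = np.exp(-0.5 * ((weeks - 28) / 4.0) ** 2) + 0.05 * rng.normal(size=52)

        best_lag, best_r = 0, -np.inf
        for lag in range(-6, 7):                   # negative lag = candidate leads
            if lag < 0:
                a, b = candidate[:lag], reference[-lag:]
            else:
                a, b = candidate[lag:], reference[:len(reference) - lag]
            r = np.corrcoef(a, b)[0, 1]
            if r > best_r:
                best_lag, best_r = lag, r
        print(f"best lag = {best_lag} weeks (r = {best_r:.2f})")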

  16. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  17. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  18. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  19. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  20. A Framework for Assessing High School Students' Statistical Reasoning.

    PubMed

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  1. A Framework for Assessing High School Students' Statistical Reasoning

    PubMed Central

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091

  2. (Finite) statistical size effects on compressive strength.

    PubMed

    Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien

    2014-04-29

    The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
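
    For reference, the classical weakest-link (Weibull) scaling that the paper shows is violated under compression can be sketched in a few lines; the strength, reference volume, and Weibull modulus below are arbitrary example values.

        import numpy as np

        # Sketch: weakest-link (Weibull) size effect on mean strength,
        # sigma(V) = sigma_0 * (V_0 / V)**(1/m).
        sigma_0, V_0, m = 50.0, 1.0, 8.0      # MPa, reference volume, Weibull modulus

        volumes = np.array([1.0, 10.0, 100.0, 1000.0])
        mean_strength = sigma_0 * (V_0 / volumes) ** (1.0 / m)
        for V, s in zip(volumes, mean_strength):
            print(f"V = {V:7.1f} V_0  ->  predicted strength {s:5.1f} MPa")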

  3. Biological and mechanical interplay at the Macro- and Microscales Modulates the Cell-Niche Fate.

    PubMed

    Sarig, Udi; Sarig, Hadar; Gora, Aleksander; Krishnamoorthi, Muthu Kumar; Au-Yeung, Gigi Chi Ting; de-Berardinis, Elio; Chaw, Su Yin; Mhaisalkar, Priyadarshini; Bogireddi, Hanumakumar; Ramakrishna, Seeram; Boey, Freddy Yin Chiang; Venkatraman, Subbu S; Machluf, Marcelle

    2018-03-02

    Tissue development, regeneration, and de-novo tissue engineering in vitro are based on reciprocal cell-niche interactions. Early tissue formation mechanisms, however, remain largely unknown given complex in-vivo multifactoriality and limited tools to effectively characterize and correlate specific micro-scaled bio-mechanical interplay. We developed a unique model system, based on decellularized porcine cardiac extracellular matrices (pcECMs), as a representative natural soft-tissue biomaterial, to study a spectrum of common cell-niche interactions. Model monocultures and 1:1 co-cultures on the pcECM of human umbilical vein endothelial cells (HUVECs) and human mesenchymal stem cells (hMSCs) were mechano-biologically characterized using macro- (Instron) and micro- (AFM) mechanical testing, histology, SEM, and molecular biology aspects using RT-PCR arrays. The obtained data were analyzed using purpose-developed statistical, principal component, and gene-set analysis tools. Our results indicated biomechanical cell-type dependency, bi-modal elasticity distributions at the micron cell-ECM interaction level, and corresponding differing gene expression profiles. We further show that hMSCs remodel the ECM, HUVECs enable ECM tissue-specific recognition, and their co-cultures synergistically contribute to tissue integration, mimicking conserved developmental pathways. We also suggest novel quantifiable measures as indicators of tissue assembly and integration. This work may benefit basic and translational research in materials science, developmental biology, tissue engineering, regenerative medicine and cancer biomechanics.

  4. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though a powerful statistical parametric mapping (SPM) tool exists for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the GLM and makes inference based on the excursion probability of the random field interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and the real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These powerful tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
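    As a rough illustration of the GLM step described above, the following Python sketch fits an ordinary least-squares GLM to a synthetic NIRS-like time series and computes a t-statistic for a task contrast. It is not the NIRS-SPM implementation (which adds pre-coloring/pre-whitening and random-field inference); the design matrix, noise level, and signal values are assumptions.

```python
import numpy as np

# Minimal GLM sketch on a synthetic NIRS channel (illustrative only).
rng = np.random.default_rng(0)
n = 300                                   # time points
task = np.zeros(n)
task[50:100] = task[150:200] = 1.0        # boxcar task regressor
X = np.column_stack([task, np.ones(n)])   # design matrix: [task, baseline]

beta_true = np.array([0.8, 2.0])
y = X @ beta_true + rng.normal(0, 0.5, n) # synthetic oxy-Hb signal

# Ordinary least-squares estimate of the GLM coefficients.
beta, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
dof = n - rank
sigma2 = np.sum((y - X @ beta) ** 2) / dof
cov = sigma2 * np.linalg.inv(X.T @ X)

c = np.array([1.0, 0.0])                  # contrast: task effect
t_stat = (c @ beta) / np.sqrt(c @ cov @ c)
print(f"task beta = {beta[0]:.3f}, t = {t_stat:.2f}, dof = {dof}")
```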

  5. Controlling species richness in spin-glass model ecosystems

    NASA Astrophysics Data System (ADS)

    Poderoso, Fábio C.; Fontanari, José F.

    2006-11-01

    Within the framework of the random replicator model of ecosystems, we use equilibrium statistical mechanics tools to study the effect of manipulating the ecosystem so as to guarantee that a fixed fraction of the surviving species at equilibrium display a predefined set of characters (e.g., characters of economic value). Provided that the intraspecies competition is not too weak, we find that the consequence of such intervention on the ecosystem composition is a significant increase in the number of species that become extinct, and hence an impoverishment of the ecosystem.

  6. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  7. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    PubMed

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool to understand brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling more simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially to explore hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Finally, applicative examples are provided for illustrative purposes, discussing issues in inverse analysis (i.e., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (i.e., strength prediction).
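    For readers unfamiliar with fiber bundle models, the brittle, equal-load-sharing FBM that this two-order-parameter model generalizes can be simulated in a few lines. The Python sketch below draws Weibull-distributed fiber thresholds (an assumed choice) and locates the peak of the constitutive curve, which gives the bundle strength per fiber; it is a baseline illustration, not the model of the paper.

```python
import numpy as np

# Minimal sketch of a classical equal-load-sharing (brittle) fiber bundle model.
rng = np.random.default_rng(1)
N = 100_000
thresholds = np.sort(rng.weibull(2.0, N))   # fiber failure thresholds (assumed)

# Under equal load sharing, once the k weakest fibers have failed the remaining
# N-k fibers each carry F/(N-k); the per-fiber external stress at which fiber k
# breaks is thresholds[k]*(N-k)/N, and the bundle strength is the curve's peak.
k = np.arange(N)
stress = thresholds * (N - k) / N
print(f"bundle strength (per fiber) = {stress.max():.4f}")
print(f"fraction of fibers broken at peak = {k[stress.argmax()] / N:.3f}")
```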

  8. Deformed Materials: Towards a Theory of Materials Morphology Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethna, James P

    This grant supported work on the response of crystals to external stress. Our primary work described how disordered structural materials break in two (statistical models of fracture in disordered materials), studied models of deformation bursts (avalanches) that mediate deformation on the microscale, and developed continuum dislocation dynamics models for plastic deformation (as when scooping ice cream bends a spoon, Fig. 9). Glass is brittle -- it breaks with almost atomically smooth fracture surfaces. Many metals are ductile -- when they break, the fracture surface is locally sheared and stretched, and it is this damage that makes them hard to break. Bone and seashells are made of brittle material, but they are strong because they are disordered -- lots of little cracks form as they are sheared and near the fracture surface, diluting the external force. We have studied materials like bone and seashells using simulations, mathematical tools, and statistical mechanics models from physics. In particular, we studied the extreme values of fracture strengths (how likely will a beam in a bridge break far below its design strength), and found that the traditional engineering tools could be improved greatly. We also studied fascinating crackling-noise precursors -- systems which formed microcracks of a broad range of sizes before they broke. Ductile metals under stress undergo irreversible plastic deformation -- the planes of atoms must slide across one another (through the motion of dislocations) to change the overall shape in response to the external force. Microscopically, the dislocations in crystals move in bursts of a broad range of sizes (termed 'avalanches' in the statistical mechanics community, whose motion is deemed 'crackling noise'). In this grant period, we resolved a longstanding mystery about the average shape of avalanches of fixed duration (using tools related to an emergent scale invariance), we developed the fundamental theory describing the shapes of avalanches and how they are affected by the edges of the microscope viewing window, we found that slow creep of dislocations can trigger an oscillating response explaining recent experiments, we explained avalanches under external voltage, and we have studied how avalanches in experiments on the microscale relate to deformation of large samples. Inside the crystals forming the metal, the dislocations arrange into mysterious cellular structures, usually ignored in theories of plasticity. Writing a natural continuum theory for dislocation dynamics, we found that it spontaneously formed walls -- much like models of traffic jams and sonic booms. These walls formed rather realistic cellular structures, which we examined in great detail -- our walls formed fractal structures with fascinating scaling properties, related to those found in turbulent fluids. We found, however, that the numerical and mathematical tools available to solve our equations were not flexible enough to incorporate materials-specific information, and our models did not show the dislocation avalanches seen experimentally. In the last year of this grant, we wrote an invited review article, explaining how plastic flow in metals shares features with other stressed materials, and how tools of statistical physics used in these other systems might be crucial for understanding plasticity.

  9. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  10. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  11. Optimization of preservation and processing of sea anemones for microbial community analysis using molecular tools.

    PubMed

    Rocha, Joana; Coelho, Francisco J R C; Peixe, Luísa; Gomes, Newton C M; Calado, Ricardo

    2014-11-11

    For several years, knowledge on the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture independent molecular tools it is possible to gain new insights on the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated whether different preservation and processing methodologies (prior to DNA extraction) can affect the bacterial diversity retrieved from snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as a proxy to determine the bacterial diversity retrieved (H'). Statistical analyses indicated that preservation significantly affects H'. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (process), as it consistently yielded higher H'. Alternatively, biomass samples can be processed fresh followed by cell lysis using a mechanical homogenizer or mortar & pestle. The suitability of employing these two alternative procedures was further reinforced by the quantification of the 16S rRNA gene; no significant differences were recorded when comparing these two approaches and the use of liquid nitrogen followed by processing with a mechanical homogenizer.
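    The diversity proxy H' used in this study is the Shannon index, commonly computed from the relative band intensities of a DGGE fingerprint. A minimal Python sketch, with assumed band intensities, is shown below.

```python
import numpy as np

# Shannon diversity index H' from DGGE band intensities (assumed toy values).
band_intensity = np.array([120.0, 80.0, 45.0, 30.0, 15.0, 5.0])
p = band_intensity / band_intensity.sum()   # relative band intensities
H = -np.sum(p * np.log(p))                  # Shannon index H'
print(f"H' = {H:.3f}")
```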

  12. Optimization of preservation and processing of sea anemones for microbial community analysis using molecular tools

    PubMed Central

    Rocha, Joana; Coelho, Francisco J. R. C.; Peixe, Luísa; Gomes, Newton C. M.; Calado, Ricardo

    2014-01-01

    For several years, knowledge on the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture independent molecular tools it is possible to gain new insights on the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated whether different preservation and processing methodologies (prior to DNA extraction) can affect the bacterial diversity retrieved from snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as a proxy to determine the bacterial diversity retrieved (H′). Statistical analyses indicated that preservation significantly affects H′. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (process), as it consistently yielded higher H′. Alternatively, biomass samples can be processed fresh followed by cell lysis using a mechanical homogenizer or mortar & pestle. The suitability of employing these two alternative procedures was further reinforced by the quantification of the 16S rRNA gene; no significant differences were recorded when comparing these two approaches and the use of liquid nitrogen followed by processing with a mechanical homogenizer. PMID:25384534

  13. Ego Defense Mechanisms and Types of Object Relations in Adults With ADHD.

    PubMed

    de Almeida Silva, Vanessa; Louzã, Mario Rodrigues; da Silva, Maria Aparecida; Nakano, Eduardo Yoshio

    2016-11-01

    This research evaluates the personality structure of adults with ADHD from a psychodynamic perspective. The hypothesis was that possible structural characteristics in personality could be correlated with this syndrome. Assessment tools for ego functions (Bell Object Relations and Reality Testing Inventory [BORRTI-Form O], Defense Style Questionnaire (DSQ-40)) were applied to a sample of 90 adults with ADHD, recruited in a specialized clinic. Among the ADHD sample, 84.4% of the participants were identified as having object relations pathologies. Pathological elevations were observed mainly in the Alienation, Egocentricity, and Insecure Attachment subscales. Statistically significant differences were found especially in the use of immature and neurotic defense mechanisms, compared with normative data. The findings indicate that adults with ADHD make greater use of immature and neurotic defense mechanisms and present pathological internalized object relations that are typical of an archaic and poorly structured ego. © The Author(s) 2012.

  14. Modeling Selection and Extinction Mechanisms of Biological Systems

    NASA Astrophysics Data System (ADS)

    Amirjanov, Adil

    In this paper, the behavior of a genetic algorithm is modeled to enhance its applicability as a modeling tool for biological systems. A new description model for the selection mechanism is introduced which operates on a portion of the individuals of the population. The extinction and recolonization mechanism is modeled, and solving the dynamics analytically shows that genetic drift in a population with extinction/recolonization is doubled. A mathematical analysis of the interaction between the selection and extinction/recolonization processes is carried out to assess the dynamics of the macroscopic statistical properties of the population. Computer simulations confirm that the theoretical predictions of the described models are good approximations. A mathematical model of GA dynamics describing anti-predator vigilance in an animal group was also examined against a known analytical solution of the problem, showing good agreement in identifying the evolutionarily stable strategies.

  15. ZERODUR strength modeling with Weibull statistical distributions

    NASA Astrophysics Data System (ADS)

    Hartmann, Peter

    2016-07-01

    The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential to know whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two or three parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them, a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a large data set. With only 20 specimens per sample such differentiation is not possible. This requires 100 specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences on the prognosis methods and results. In particular, the use of the two parameter Weibull distribution for high strength surfaces has led to non-realistic results. Extrapolation down to a low acceptable probability of failure covers a wide range without existing data points and is mainly influenced by the slope determined by the high strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads, which they could have endured easily.
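    The two- versus three-parameter Weibull comparison discussed above can be reproduced in outline with standard fitting routines. The Python sketch below uses scipy.stats.weibull_min on synthetic strength data (threshold, scale, and modulus values are assumptions) and shows how the two models diverge when extrapolated to a low failure probability.

```python
import numpy as np
from scipy import stats

# Synthetic breakage-stress data with an assumed threshold stress (MPa).
rng = np.random.default_rng(2)
threshold, scale, modulus = 30.0, 20.0, 3.0
strengths = threshold + scale * rng.weibull(modulus, 100)

# Two-parameter fit: location (threshold stress) fixed at zero.
c2, loc2, scale2 = stats.weibull_min.fit(strengths, floc=0)
# Three-parameter fit: location free, representing a threshold stress.
c3, loc3, scale3 = stats.weibull_min.fit(strengths)

print(f"2-parameter: modulus={c2:.2f}, scale={scale2:.1f} MPa")
print(f"3-parameter: modulus={c3:.2f}, threshold={loc3:.1f} MPa, scale={scale3:.1f} MPa")

# Strength at 0.1 % failure probability under each model: the extrapolation
# the abstract warns can be unrealistic for the two-parameter form.
for name, dist in [("2p", stats.weibull_min(c2, loc2, scale2)),
                   ("3p", stats.weibull_min(c3, loc3, scale3))]:
    print(f"{name}: stress at P_f = 0.001 -> {dist.ppf(0.001):.1f} MPa")
```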

  16. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II)

    DTIC Science & Technology

    2014-09-30

    Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II) Len Thomas, John Harwood, Catriona Harris, and Robert S... mammals changes over time. This project will develop statistical tools to allow mathematical models of the population consequences of acoustic...disturbance to be fitted to data from marine mammal populations. We will work closely with Phase II of the ONR PCAD Working Group, and will provide

  17. Introducing SONS, a tool for operational taxonomic unit-based comparisons of microbial community memberships and structures.

    PubMed

    Schloss, Patrick D; Handelsman, Jo

    2006-10-01

    The recent advent of tools enabling statistical inferences to be drawn from comparisons of microbial communities has enabled the focus of microbial ecology to move from characterizing biodiversity to describing the distribution of that biodiversity. Although statistical tools have been developed to compare community structures across a phylogenetic tree, we lack tools to compare the memberships and structures of two communities at a particular operational taxonomic unit (OTU) definition. Furthermore, current tests of community structure do not indicate the similarity of the communities but only report the probability of a statistical hypothesis. Here we present a computer program, SONS, which implements nonparametric estimators for the fraction and richness of OTUs shared between two communities.
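    As a simplified illustration of OTU-based membership comparison (not the nonparametric estimators implemented in SONS), the Python sketch below computes observed shared richness and shared fractions for two hypothetical communities at a fixed OTU definition.

```python
# Observed shared-membership comparison between two communities; the OTU
# counts are assumed toy values, and no richness estimation is performed.
community_A = {"OTU1": 12, "OTU2": 5, "OTU3": 1, "OTU4": 7}
community_B = {"OTU2": 9, "OTU3": 2, "OTU5": 4}

shared = set(community_A) & set(community_B)
total = set(community_A) | set(community_B)

print(f"observed richness A = {len(community_A)}, B = {len(community_B)}")
print(f"observed shared OTUs = {len(shared)}")
print(f"fraction of A's OTUs shared = {len(shared) / len(community_A):.2f}")
print(f"Jaccard membership similarity = {len(shared) / len(total):.2f}")
```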

  18. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Cancer.gov

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms, along with software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  19. Mechanical knowledge does matter to tool use even when assessed with a non-production task: Evidence from left brain-damaged patients.

    PubMed

    Lesourd, Mathieu; Budriesi, Carla; Osiurak, François; Nichelli, Paolo F; Bartolo, Angela

    2017-12-20

    In the literature on apraxia of tool use, it is now accepted that using familiar tools requires semantic and mechanical knowledge. However, mechanical knowledge is nearly always assessed with production tasks, so one may assume that mechanical knowledge and familiar tool use are associated only because of their common motor mechanisms. This notion may be challenged by demonstrating that familiar tool use depends on an alternative tool selection task assessing mechanical knowledge, where alternative uses of tools are assumed according to their physical properties but where actual use of tools is not needed. We tested 21 left brain-damaged patients and 21 matched controls with familiar tool use tasks (pantomime and single tool use), semantic tasks and an alternative tool selection task. The alternative tool selection task accounted for a large amount of variance in the single tool use task and was the best predictor among all the semantic tasks. Concerning the pantomime of tool use task, group and individual results suggested that the integrity of the semantic system and preserved mechanical knowledge are neither necessary nor sufficient to produce pantomimes. These results corroborate the idea that mechanical knowledge is essential when we use tools, even when tasks assessing mechanical knowledge do not require the production of any motor action. Our results also confirm the value of pantomime of tool use, which can be considered as a complex activity involving several cognitive abilities (e.g., communicative skills) rather than the activation of gesture engrams. © 2017 The British Psychological Society.

  20. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of the BMDS applications and models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  1. GAPIT version 2: an enhanced integrated tool for genomic association and prediction

    USDA-ARS?s Scientific Manuscript database

    Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...

  2. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  3. Retractable tool bit having latch type catch mechanism

    NASA Technical Reports Server (NTRS)

    Voellmer, George (Inventor)

    1993-01-01

    A retractable tool bit assembly for a tool such as an Allen key is presented. The assembly includes one or more spring-loaded nestable or telescoping tubular sections together with a catch mechanism for capturing and holding the tool in its retracted position. The catch mechanism consists of a latch mechanism located in a base section that engages a conically shaped tool head located at the inner end of the tool. The tool head adjoins an eccentric oval type neck portion which extends to a rear lip of the tool head. The latch mechanism releases when the ovular neck portion rotates about the catch members upon actuation of a rotary tool drive motor. When released, all the telescoping sections and the tool extend fully outward to a use position.

  4. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only provided with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers' and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software tools. PMID:21487489

  5. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only provided with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers' and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software tools.

  6. Statistical Analysis of CO 2 Exposed Wells to Predict Long Term Leakage through the Development of an Integrated Neural-Genetic Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Boyun; Duguid, Andrew; Nygaard, Ronar

    The objective of this project is to develop a computerized statistical model with the Integrated Neural-Genetic Algorithm (INGA) for predicting the probability of long-term leakage of wells in CO 2 sequestration operations. This objective has been accomplished by conducting research in three phases: 1) data mining of CO 2-exposed wells, 2) INGA computer model development, and 3) evaluation of the predictive performance of the computer model with data from field tests. Data mining was conducted for 510 wells in two CO 2 sequestration projects in the Texas Gulf Coast region: the Hasting West field and the Oyster Bayou field in southern Texas. Missing wellbore integrity data were estimated using an analytical and Finite Element Method (FEM) model. The INGA was first tested for convergence performance and computing efficiency with the obtained high-dimensional data set. It was concluded that the INGA can handle the gathered data set with good accuracy and reasonable computing time after a reduction of dimension with a grouping mechanism. A computerized statistical model with the INGA was then developed based on data pre-processing and grouping. Comprehensive training and testing of the model were carried out to ensure that the model is accurate and efficient enough for predicting the probability of long-term leakage of wells in CO 2 sequestration operations. The Cranfield site in southern Mississippi was selected as the test site. Observation wells CFU31F2 and CFU31F3 were used for pressure-testing, formation-logging, and cement-sampling. Tools run in the wells include Isolation Scanner, Slim Cement Mapping Tool (SCMT), Cased Hole Formation Dynamics Tester (CHDT), and Mechanical Sidewall Coring Tool (MSCT). Analyses of the obtained data indicate no leak of CO 2 across the cap zone, while it is evident that the well cement sheath was invaded by the CO 2 from the storage zone. This observation is consistent with the result predicted by the INGA model, which indicates the well has a CO 2 leak-safe probability of 72%. This comparison implies that the developed INGA model is valid for future use in predicting well leak probability.

  7. Power hand tool kinetics associated with upper limb injuries in an automobile assembly plant.

    PubMed

    Ku, Chia-Hua; Radwin, Robert G; Karsh, Ben-Tzion

    2007-06-01

    This study investigated the relationship between pneumatic nutrunner handle reactions, workstation characteristics, and prevalence of upper limb injuries in an automobile assembly plant. Tool properties (geometry, inertial properties, and motor characteristics), fastener properties, orientation relative to the fastener, and the position of the tool operator (horizontal and vertical distances) were measured for 69 workstations using 15 different pneumatic nutrunners. Handle reaction response was predicted using a deterministic mechanical model of the human operator and tool that was previously developed in our laboratory, specific to the measured tool, workstation, and job factors. Handle force was a function of target torque, tool geometry and inertial properties, motor speed, work orientation, and joint hardness. The study found that tool target torque was not well correlated with predicted handle reaction force (r=0.495) or displacement (r=0.285). The individual tool, tool shape, and threaded fastener joint hardness all affected predicted forces and displacements (p<0.05). The average peak handle force and displacement for right-angle tools were twice as great as pistol grip tools. Soft-threaded fastener joints had the greatest average handle forces and displacements. Upper limb injury cases were identified using plant OSHA 200 log and personnel records. Predicted handle forces for jobs where injuries were reported were significantly greater than those jobs free of injuries (p<0.05), whereas target torque and predicted handle displacement did not show statistically significant differences. The study concluded that quantification of handle reaction force, rather than target torque alone, is necessary for identifying stressful power hand tool operations and for controlling exposure to forces in manufacturing jobs involving power nutrunners. Therefore, a combination of tool, work station, and task requirements should be considered.

  8. STATWIZ - AN ELECTRONIC STATISTICAL TOOL (ABSTRACT)

    EPA Science Inventory

    StatWiz is a web-based, interactive, and dynamic statistical tool for researchers. It will allow researchers to input information and/or data and then receive experimental design options, or outputs from data analysis. StatWiz is envisioned as an expert system that will walk rese...

  9. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing.

    PubMed

    Zackay, Arie; Steinhoff, Christine

    2010-12-15

    Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously, there is a growing need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.
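    The kind of summary statistics described above can be sketched on a binarized methylation matrix. The Python example below uses toy data (not MethVisual output) to compute per-site and per-clone methylation levels and a simple co-occurrence count between CpG sites.

```python
import numpy as np

# Binarized methylation matrix: rows = bisulfite-sequenced clones,
# columns = CpG sites; 1 = methylated. Assumed toy values.
meth = np.array([[1, 1, 0, 0, 1],
                 [1, 0, 0, 0, 1],
                 [1, 1, 1, 0, 1],
                 [0, 1, 0, 0, 0]])

per_site = meth.mean(axis=0)        # methylation frequency per CpG site
per_clone = meth.mean(axis=1)       # methylation level per sequence
cooccur = meth.T @ meth             # co-methylation counts between CpG pairs

print("per-site methylation:", np.round(per_site, 2))
print("per-clone methylation:", np.round(per_clone, 2))
print("co-occurrence of sites 0 and 4:", cooccur[0, 4])
```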

  10. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing

    PubMed Central

    2010-01-01

    Background: Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously, there is a growing need for tools to process and analyse the data together with statistical investigation and visualisation. Findings: MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions: The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174

  11. Statistics of dislocation pinning at localized obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.

    2014-10-14

    Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.

  12. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
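    Two of the steps mentioned above, normalization and ANOVA, can be sketched as follows. The Python example applies median centering across samples to log-transformed abundances and then runs a one-way ANOVA per peptide; the synthetic data, group sizes, and the choice of median centering are assumptions for illustration, not DAnTE's exact procedures.

```python
import numpy as np
from scipy import stats

# Synthetic peptide abundance matrix: 5 peptides x 6 samples (assumed values).
rng = np.random.default_rng(3)
data = rng.lognormal(mean=10, sigma=0.4, size=(5, 6))
log_data = np.log2(data)

# Median centering: subtract each sample's median so samples are comparable.
normalized = log_data - np.median(log_data, axis=0, keepdims=True)

groups = np.array([0, 0, 0, 1, 1, 1])   # two experimental groups (assumed)
for i, row in enumerate(normalized):
    f, p = stats.f_oneway(row[groups == 0], row[groups == 1])
    print(f"peptide {i}: F = {f:.2f}, p = {p:.3f}")
```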

  13. Role-play as an educational tool in medication communication skills: Students’ perspectives

    PubMed Central

    Lavanya, S. H.; Kalpana, L.; Veena, R. M.; Bharath Kumar, V. D.

    2016-01-01

    Objectives: Medication communication skills are vital aspects of patient care that may influence treatment outcomes. However, traditional pharmacology curriculum deals with imparting factual information, with little emphasis on patient communication. The current study aims to explore students’ perceptions of role-play as an educational tool in acquiring communication skills and to ascertain the need of role-play for their future clinical practice. Materials and Methods: This questionnaire-based study was done in 2nd professional MBBS students. A consolidated concept of six training cases, focusing on major communication issues related to medication prescription in pharmacology, were developed for peer-role-play sessions for 2nd professional MBBS (n = 122) students. Structured scripts with specific emphasis on prescription medication communication and checklists for feedback were developed. Prevalidated questionnaires measured the quantitative aspects of role-plays in relation to their relevance as teaching–learning tool, perceived benefits of sessions, and their importance for future use. Statistical Analysis: Data analysis was performed using descriptive statistics. Results: The role-play concept was well appreciated and considered an effective means for acquiring medication communication skills. The structured feedback by peers and faculty was well received by many. Over 90% of the students reported immense confidence in communicating therapy details, namely, drug name, purpose, mechanism, dosing details, and precautions. Majority reported a better retention of pharmacology concepts and preferred more such sessions. Conclusions: Most students consider peer-role-play as an indispensable tool to acquire effective communication skills regarding drug therapy. By virtue of providing experiential learning opportunities and its feasibility of implementation, role-play sessions justify inclusion in undergraduate medical curricula. PMID:28031605

  14. Using Quality Management Tools to Enhance Feedback from Student Evaluations

    ERIC Educational Resources Information Center

    Jensen, John B.; Artz, Nancy

    2005-01-01

    Statistical tools found in the service quality assessment literature--the T² statistic combined with factor analysis--can enhance the feedback instructors receive from student ratings. T² examines variability across multiple sets of ratings to isolate individual respondents with aberrant response…
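    A minimal sketch of the T² idea: compute a Hotelling-style T² score for each respondent's rating vector against the class mean and flag the largest values as aberrant. The Python example below uses synthetic ratings and skips the factor-analysis step the abstract describes; all values are assumptions.

```python
import numpy as np

# Synthetic ratings: 40 students x 6 items, with one deliberately aberrant row.
rng = np.random.default_rng(4)
ratings = rng.normal(4.0, 0.5, size=(40, 6))
ratings[7] = [1, 5, 1, 5, 1, 5]

mean = ratings.mean(axis=0)
cov = np.cov(ratings, rowvar=False)
cov_inv = np.linalg.inv(cov)

# Hotelling-style T^2 score per respondent (quadratic form against the mean).
diff = ratings - mean
t2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
print("most aberrant respondent:", int(np.argmax(t2)), "T^2 =", round(float(t2.max()), 1))
```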

  15. Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces.

    PubMed

    Spezia, Riccardo; Martínez-Nuñez, Emilio; Vazquez, Saulo; Hase, William L

    2017-04-28

    In this Introduction, we show the basic problems of non-statistical and non-equilibrium phenomena related to the papers collected in this themed issue. Over the past few years, significant advances in both computing power and development of theories have allowed the study of larger systems, increasing the time length of simulations and improving the quality of potential energy surfaces. In particular, the possibility of using quantum chemistry to calculate energies and forces 'on the fly' has paved the way to directly study chemical reactions. This has provided a valuable tool to explore molecular mechanisms at given temperatures and energies and to see whether these reactive trajectories follow statistical laws and/or minimum energy pathways. This themed issue collects different aspects of the problem and gives an overview of recent works and developments in different contexts, from the gas phase to the condensed phase to excited states.This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).

  16. Sources and characteristics of acoustic emissions from mechanically stressed geologic granular media — A review

    NASA Astrophysics Data System (ADS)

    Michlmayr, Gernot; Cohen, Denis; Or, Dani

    2012-05-01

    The formation of cracks and emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain scale mechanical interactions often generating high frequency (kHz) elastic waves, referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ, assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review is aimed at providing a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We first review studies reporting the use of AE for monitoring failure in various geologic materials, and describe AE-generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-crackling, particle collisions, rupture of water bridges, etc.) including AE statistical features, such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles. The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of progressive collapse of systems with many interacting mechanical elements such as the fiber bundle model (FBM). We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early-warning using the AE method in combination with the FBM. The biggest challenge to field application of the AE method is strong signal attenuation. We provide an outlook for overcoming such limitations considering emergence of a class of fiber-optic based distributed AE sensors and deployment of acoustic waveguides as part of monitoring networks.

  17. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  18. Tool use disorders after left brain damage.

    PubMed

    Baumard, Josselin; Osiurak, François; Lesourd, Mathieu; Le Gall, Didier

    2014-01-01

    In this paper we review studies that investigated tool use disorders in left-brain damaged (LBD) patients over the last 30 years. Four tasks are classically used in the field of apraxia: Pantomime of tool use, single tool use, real tool use and mechanical problem solving. Our aim was to address two issues, namely, (1) the role of mechanical knowledge in real tool use and (2) the cognitive mechanisms underlying pantomime of tool use, a task widely employed by clinicians and researchers. To do so, we extracted data from 36 papers and computed the difference between healthy subjects and LBD patients. On the whole, pantomime of tool use is the most difficult task and real tool use is the easiest one. Moreover, associations seem to appear between pantomime of tool use, real tool use and mechanical problem solving. These results suggest that the loss of mechanical knowledge is critical in LBD patients, even if all of those tasks (and particularly pantomime of tool use) might put differential demands on semantic memory and working memory.

  19. Tool use disorders after left brain damage

    PubMed Central

    Baumard, Josselin; Osiurak, François; Lesourd, Mathieu; Le Gall, Didier

    2014-01-01

    In this paper we review studies that investigated tool use disorders in left-brain damaged (LBD) patients over the last 30 years. Four tasks are classically used in the field of apraxia: Pantomime of tool use, single tool use, real tool use and mechanical problem solving. Our aim was to address two issues, namely, (1) the role of mechanical knowledge in real tool use and (2) the cognitive mechanisms underlying pantomime of tool use, a task widely employed by clinicians and researchers. To do so, we extracted data from 36 papers and computed the difference between healthy subjects and LBD patients. On the whole, pantomime of tool use is the most difficult task and real tool use is the easiest one. Moreover, associations seem to appear between pantomime of tool use, real tool use and mechanical problem solving. These results suggest that the loss of mechanical knowledge is critical in LBD patients, even if all of those tasks (and particularly pantomime of tool use) might put differential demands on semantic memory and working memory. PMID:24904487

  20. Strange non-chaotic attractors in a state controlled-cellular neural network-based quasiperiodically forced MLC circuit

    NASA Astrophysics Data System (ADS)

    Ezhilarasu, P. Megavarna; Inbavalli, M.; Murali, K.; Thamilmaran, K.

    2018-07-01

    In this paper, we report the dynamical transitions to strange non-chaotic attractors in a quasiperiodically forced state controlled-cellular neural network (SC-CNN)-based MLC circuit via two different mechanisms, namely the Heagy-Hammel route and the gradual fractalisation route. These transitions were observed through numerical simulations and hardware experiments and confirmed using statistical tools, such as maximal Lyapunov exponent spectrum and its variance and singular continuous spectral analysis. We find that there is a remarkable agreement of the results from both numerical simulations as well as from hardware experiments.
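    The Lyapunov-exponent diagnostic mentioned above can be sketched on a textbook SNA example. The Python code below iterates the classic GOPY map (quasiperiodically forced with the golden-mean frequency) rather than the SC-CNN-based MLC circuit itself, and estimates the maximal Lyapunov exponent and the variance of the local exponents; parameter values are the usual textbook ones, used here purely for illustration.

```python
import numpy as np

# GOPY map, a standard strange non-chaotic attractor example:
# x_{n+1} = 2*sigma*tanh(x_n)*cos(2*pi*theta_n), theta_{n+1} = theta_n + omega mod 1.
sigma = 1.5
omega = (np.sqrt(5) - 1) / 2          # golden-mean irrational forcing frequency

x, theta = 0.5, 0.0
logs = []
for n in range(100_000):
    # Local stretching rate along x before updating the state.
    dfdx = 2 * sigma * np.cos(2 * np.pi * theta) / np.cosh(x) ** 2
    x = 2 * sigma * np.tanh(x) * np.cos(2 * np.pi * theta)
    theta = (theta + omega) % 1.0
    if n > 1000:                      # discard transient
        logs.append(np.log(abs(dfdx)))

logs = np.array(logs)
print(f"maximal Lyapunov exponent ~ {logs.mean():.3f} (negative on an SNA)")
print(f"variance of local exponents ~ {logs.var():.3f}")
```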

  1. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.

  2. STATISTICAL TECHNIQUES FOR DETERMINATION AND PREDICTION OF FUNDAMENTAL FISH ASSEMBLAGES OF THE MID-ATLANTIC HIGHLANDS

    EPA Science Inventory

    A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...

  3. Ultrasonic evaluation of the physical and mechanical properties of granites.

    PubMed

    Vasconcelos, G; Lourenço, P B; Alves, C A S; Pamplona, J

    2008-09-01

    Masonry is the oldest building material that has survived until today, being used all over the world and being present in the most impressive historical structures as evidence of the spirit of enterprise of ancient cultures. Conservation, rehabilitation and strengthening of the built heritage and protection of human lives are clear demands of modern societies. In this process, the use of nondestructive methods has become much more common in the diagnosis of structural integrity of masonry elements. With respect to the evaluation of the stone condition, the ultrasonic pulse velocity is a simple and economical tool. Thus, the central issue of the present paper concerns the evaluation of the suitability of the ultrasonic pulse velocity method for describing the mechanical and physical properties of granites (with size ranges between 0.1-4.0 mm and 0.3-16.5 mm) and for the assessment of their weathering state. The mechanical properties encompass the compressive and tensile strength and modulus of elasticity, and the physical properties include the density and porosity. For this purpose, measurements of the longitudinal ultrasonic pulse velocity with transducers of distinct natural frequencies were carried out on specimens with different size and shape. A discussion of the factors that induce variations in the ultrasonic velocity is also provided. Additionally, statistical correlations between ultrasonic pulse velocity and mechanical and physical properties of granites are presented and discussed. The major output of the work is the confirmation that ultrasonic pulse velocity can be effectively used as a simple and economical nondestructive method for a preliminary prediction of mechanical and physical properties, as well as a tool for the assessment of the weathering changes of granites that occur during their service life. This is of much interest due to the usual difficulties in removing specimens for mechanical characterization.
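    The statistical correlations referred to above are typically simple regressions of a mechanical property on ultrasonic pulse velocity. The Python sketch below fits such a regression to synthetic velocity-strength pairs; the values are assumptions, not the paper's granite data.

```python
import numpy as np
from scipy import stats

# Assumed ultrasonic pulse velocity (m/s) and compressive strength (MPa) pairs.
upv = np.array([3200, 3500, 3800, 4100, 4400, 4700, 5000.0])
strength = np.array([55, 68, 80, 95, 110, 128, 140.0])

# Least-squares linear regression of strength on velocity.
result = stats.linregress(upv, strength)
print(f"strength ~ {result.slope:.3f} * UPV + {result.intercept:.1f}")
print(f"r = {result.rvalue:.3f}, p = {result.pvalue:.2e}")
```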

  4. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    PubMed

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; JianWang; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool for evaluating intersectional collaboration on Non-communicable Chronic Disease (NCD) prevention and control, and further to understand the current status of intersectional collaboration in community health service institutions in China. We surveyed 444 main officials of community health service institutions in the Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 using a questionnaire. A model of collaboration measurement, comprising the four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of the evaluation scale across NCD management procedures in community healthcare institutions and other institutions. The reliability and validity of the evaluation tool for inter-organizational collaboration on NCD prevention and control were verified. The test of the tool evaluating inter-organizational collaboration in community NCD management revealed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution rate of the extracted principal component = 49.70%). The results for inter-organizational collaboration of different departments and management segments showed statistically significant differences in the formalization dimension for physical examination (p = 0.01). There were statistically significant differences in the governance dimension, the formalization dimension and the total score of the collaboration scale for the health record sector (p = 0.01, 0.00, 0.00). Statistical differences were found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). The multi-department collaboration mechanism for NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management, with better collaboration scores, whereas the CDC shows relatively poor collaboration in China. Xing-ming Li and Alon Rasooly contributed equally to this paper and are listed as joint first authors.

  5. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is on the rise; existing methods are limited, and more systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful projects and three failed projects, are reviewed, with success and failure being defined by the owner.

  6. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    NASA Astrophysics Data System (ADS)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict the milling tool wear in a regular cut as well as entry cut and exit cut of a milling tool. The model was based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, which is based on statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents experiments from runs on a milling machine under various operating conditions. In this way, data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on the milling tool flank wear, with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main perception of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a coefficient of determination of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
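
    As a rough illustration of the modeling idea (a sketch only, not the authors' implementation: the toy dataset, swarm settings and hyperparameter ranges below are assumptions), a particle swarm can be used to tune an RBF-kernel support vector regressor, here with scikit-learn:

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Toy stand-in for the milling data: columns = time, depth of cut, feed.
        X = rng.uniform(0, 1, size=(80, 3))
        y = 0.3 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=80)  # "flank wear"

        def fitness(params):
            """Cross-validated R^2 of an RBF SVR for a given (log C, log gamma, log epsilon)."""
            C, gamma, eps = np.exp(params)
            model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps)
            return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

        # Plain global-best PSO over the three log-scaled hyperparameters.
        n_particles, n_iter, dim = 12, 30, 3
        pos = rng.uniform(-3, 3, size=(n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_val.argmax()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, -3, 3)
            vals = np.array([fitness(p) for p in pos])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[pbest_val.argmax()].copy()

        print("best (C, gamma, epsilon):", np.exp(gbest), " CV R^2:", pbest_val.max())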

  7. Universal properties of mythological networks

    NASA Astrophysics Data System (ADS)

    Mac Carron, Pádraig; Kenna, Ralph

    2012-07-01

    As in statistical physics, the concept of universality plays an important, albeit qualitative, role in the field of comparative mythology. Here we apply statistical mechanical tools to analyse the networks underlying three iconic mythological narratives with a view to identifying common and distinguishing quantitative features. Of the three narratives, an Anglo-Saxon and a Greek text are mostly believed by antiquarians to be partly historically based, while the third, an Irish epic, is often considered to be fictional. Here we use network analysis in an attempt to discriminate real from imaginary social networks and place mythological narratives on the spectrum between them. This suggests that the perceived artificiality of the Irish narrative can be traced back to anomalous features associated with six characters. Speculating that these are amalgams of several entities or proxies renders the plausibility of the Irish text comparable to the others from a network-theoretic point of view.
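
    A minimal sketch of the kind of quantitative network features such a comparison relies on (the edge list below is an invented toy, not the mythological data), using networkx:

        import networkx as nx

        # Toy character co-occurrence edges; the real study builds these from the narratives.
        edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
                 ("D", "E"), ("E", "F"), ("C", "F"), ("F", "G")]
        G = nx.Graph(edges)

        # Quantitative features used to compare "real" and "imaginary" social networks.
        print("mean degree:         ", sum(d for _, d in G.degree()) / G.number_of_nodes())
        print("clustering coeff.:   ", nx.average_clustering(G))
        print("degree assortativity:", nx.degree_assortativity_coefficient(G))
        print("mean shortest path:  ", nx.average_shortest_path_length(G))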

  8. Experimental statistical signature of many-body quantum interference

    NASA Astrophysics Data System (ADS)

    Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio

    2018-03-01

    Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.

  9. Nonequilibrium statistical mechanics Brussels-Austin style

    NASA Astrophysics Data System (ADS)

    Bishop, Robert C.

    The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels-Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.

  10. Non-operative management (NOM) of blunt hepatic trauma: 80 cases.

    PubMed

    Özoğul, Bünyami; Kısaoğlu, Abdullah; Aydınlı, Bülent; Öztürk, Gürkan; Bayramoğlu, Atıf; Sarıtemur, Murat; Aköz, Ayhan; Bulut, Özgür Hakan; Atamanalp, Sabri Selçuk

    2014-03-01

    The liver is the most frequently injured organ in abdominal trauma. We present a group of patients with blunt hepatic trauma who were managed without any invasive diagnostic tools and/or surgical intervention. A total of 80 patients with blunt liver injury who were admitted to the general surgery clinic, or to other clinics because of concomitant injuries, were followed non-operatively. Normally distributed numeric variables were evaluated by Student's t-test or one-way analysis of variance, while non-normally distributed variables were analyzed by the Mann-Whitney U-test or Kruskal-Wallis variance analysis. The chi-square test was also employed for the comparison of categorical variables. Statistical significance was assumed for p<0.05. There was no significant relationship between patients' Hgb level and liver injury grade, outcome, or mechanism of injury. Likewise, there was no statistical relationship between liver injury grade, outcome, or mechanism of injury and ALT or AST levels. There was no mortality in any of the patients. During the last quarter of a century, changes in the diagnosis and treatment of liver injury have been associated with increased survival. NOM of liver injury in patients with stable hemodynamics and hepatic trauma appears to be the gold standard.

  11. A computational DFT study of structural transitions in textured solid-fluid interfaces

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Petr; Parry, Andrew O.; Kalliadasis, Serafim

    2015-11-01

    Fluids adsorbed at walls, in capillary pores and slits, and in more exotic, sculpted geometries such as grooves and wedges can exhibit many new phase transitions, including wetting, pre-wetting, capillary-condensation and filling, compared to their bulk counterparts. As well as being of fundamental interest to the modern statistical mechanical theory of inhomogeneous fluids, these are also relevant to nanofluidics, chemical- and bioengineering. In this talk we will show, using a microscopic Density Functional Theory (DFT) for fluids, how novel continuous interfacial transitions associated with the first-order prewetting line can occur on steps, in grooves and in wedges, and how these are sensitive to both the range of the intermolecular forces and interfacial fluctuation effects. These transitions compete with wetting, filling and condensation, producing very rich phase diagrams even for relatively simple geometries. We will also discuss practical aspects of DFT calculations, and demonstrate how this statistical-mechanical framework is capable of yielding complex fluid structure, interfacial tensions, and regions of thermodynamic stability of various fluid configurations. As a side note, this demonstrates that DFT is an excellent tool for the investigation of complex multiphase systems. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031.

  12. The Shock and Vibration Digest. Volume 15, Number 7

    DTIC Science & Technology

    1983-07-01

    ...an important analytical tool, the statistical energy analysis method, has been the subject of... "Experimental Determination of Vibration Parameters Required in the Statistical Energy Analysis Method"... Dubowsky, S. and Morris, T.L., "An..."... "Coupling Loss Factors for Statistical Energy Analysis of Sound Transmission"... Upton, R., "Sound Intensity - A Powerful New Measurement Tool," S/V, Sound...

  13. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
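
    DeConv_Tool itself is written in IDL; as a hedged illustration of one of the listed approaches, the following NumPy sketch implements Richardson-Lucy iteration, a maximum-likelihood deconvolution for Poisson-noise data, on a one-dimensional signal with an assumed known point-spread function:

        import numpy as np

        def richardson_lucy(blurred, psf, n_iter=50):
            """Maximum-likelihood (Richardson-Lucy) deconvolution for Poisson noise, 1-D."""
            psf = psf / psf.sum()
            psf_flipped = psf[::-1]
            estimate = np.full_like(blurred, blurred.mean(), dtype=float)
            for _ in range(n_iter):
                predicted = np.convolve(estimate, psf, mode="same")
                ratio = blurred / np.maximum(predicted, 1e-12)
                estimate *= np.convolve(ratio, psf_flipped, mode="same")
            return estimate

        # Toy example: two point sources blurred by a Gaussian PSF.
        truth = np.zeros(64); truth[20] = 100.0; truth[40] = 60.0
        psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
        blurred = np.convolve(truth, psf / psf.sum(), mode="same")
        restored = richardson_lucy(blurred, psf)
        print("indices of the two largest restored values:", restored.argsort()[-2:])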

  14. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  15. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    NASA Astrophysics Data System (ADS)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
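
    A hedged Python sketch of one of the tests mentioned, checking whether a codon set is comma-free (GCAT's actual implementation is Java-based; the example sets below are arbitrary):

        from itertools import product

        def is_comma_free(codons):
            """A codon set X is comma-free if, for every concatenation c1+c2 of codons in X,
            neither of the two frame-shifted trinucleotides belongs to X."""
            codons = set(codons)
            for c1, c2 in product(codons, repeat=2):
                pair = c1 + c2
                if pair[1:4] in codons or pair[2:5] in codons:
                    return False
            return True

        print(is_comma_free({"AAA"}))          # False: "AAA" reappears frame-shifted in "AAAAAA"
        print(is_comma_free({"ACG", "TAC"}))   # True for this small toy set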

  16. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.

  17. The Math Problem: Advertising Students' Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Kendrick, Alice

    2013-01-01

    This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…

  18. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and with matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression, density estimation and smoothing, general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
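
    As a small illustration of the "traditional" tools listed above (a sketch only; the measurements below are synthetic), SciPy provides the Kolmogorov-Smirnov test and a chi-square goodness-of-fit test directly:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        flux_a = rng.normal(10.0, 2.0, size=200)   # synthetic measurements, sample A
        flux_b = rng.normal(10.5, 2.0, size=150)   # synthetic measurements, sample B

        # Two-sample Kolmogorov-Smirnov test: are the samples drawn from the same distribution?
        ks_stat, ks_p = stats.ks_2samp(flux_a, flux_b)
        print(f"KS statistic = {ks_stat:.3f}, p = {ks_p:.3g}")

        # Chi-square goodness of fit of sample A against a Gaussian model, using binned counts.
        counts, edges = np.histogram(flux_a, bins=10)
        cdf = stats.norm(loc=flux_a.mean(), scale=flux_a.std(ddof=1)).cdf
        expected = len(flux_a) * np.diff(cdf(edges))
        expected *= counts.sum() / expected.sum()            # match totals so the test is valid
        chi2, p = stats.chisquare(counts, expected, ddof=2)  # two fitted parameters
        print(f"chi-square = {chi2:.2f}, p = {p:.3g}")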

  19. Balancing strength and toughness of calcium-silicate-hydrate via random nanovoids and particle inclusions: Atomistic modeling and statistical analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Shahsavari, Rouzbeh

    2016-11-01

    As the most widely used manufactured material on Earth, concrete poses serious societal and environmental concerns which call for innovative strategies to develop greener concrete with improved strength and toughness, properties that are mutually exclusive in man-made materials. Herein, we focus on calcium silicate hydrate (C-S-H), the major binding phase of all Portland cement concretes, and study how engineering its nanovoids and portlandite particle inclusions can impart a balance of strength, toughness and stiffness. By performing more than 600 molecular dynamics simulations coupled with statistical analysis tools, our results provide new evidence of ductile fracture mechanisms in C-S-H - reminiscent of crystalline alloys and ductile metals - decoding the interplay between the crack growth, nanovoid/particle inclusions, and stoichiometry, which dictates the crystalline versus amorphous nature of the underlying matrix. We found that the introduction of voids and portlandite particles can significantly increase toughness and ductility, especially in C-S-H with more amorphous matrices, mainly owing to competing mechanisms of crack deflection, void coalescence, internal necking, accommodation, and geometry alteration of individual voids/particles, which together regulate toughness versus strength. Furthermore, utilizing a comprehensive global sensitivity analysis on random configuration-property relations, we show that the mean diameter of voids/particles is the most critical statistical parameter influencing the mechanical properties of C-S-H, irrespective of the stoichiometry or the crystalline or amorphous nature of the matrix. This study provides new fundamental insights, design guidelines, and de novo strategies to turn the brittle C-S-H into a ductile material, impacting modern engineering of strong and tough concrete infrastructures and potentially other complex brittle materials.
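
    A hedged sketch of the flavor of statistical post-processing described (all numbers below are synthetic stand-ins; in the study the responses come from the MD fracture simulations), ranking microstructure descriptors by rank correlation with a toy toughness response:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(2)
        n = 600  # roughly the scale of the simulation campaign, but with fake data

        # Random microstructure descriptors (all synthetic).
        mean_diam = rng.uniform(0.5, 3.0, n)    # mean void/particle diameter, nm
        vol_frac  = rng.uniform(0.0, 0.2, n)    # void/particle volume fraction
        ca_si     = rng.uniform(1.1, 2.1, n)    # Ca/Si ratio (stoichiometry)

        # A toy "toughness" response with noise; the real response comes from MD fracture runs.
        toughness = 1.0 + 0.8 * mean_diam - 2.0 * vol_frac**2 + 0.1 * ca_si + 0.2 * rng.normal(size=n)

        for name, x in [("mean diameter", mean_diam), ("volume fraction", vol_frac), ("Ca/Si", ca_si)]:
            rho, p = spearmanr(x, toughness)
            print(f"{name:16s} Spearman rho = {rho:+.2f} (p = {p:.2g})")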

  20. Mesoscale Raised Rim Depressions (MRRDs) on Earth: A Review of the Characteristics, Processes, and Spatial Distributions of Analogs for Mars

    NASA Technical Reports Server (NTRS)

    Burr, Devon M.; Bruno, Barbara C.; Lanagan, Peter D.; Glaze, Lori; Jaeger, Windy L.; Soare, Richard J.; Tseung, Jean-Michel Wan Bun; Skinner, James A. Jr.; Baloga, Stephen M.

    2008-01-01

    Fields of mesoscale raised rim depressions (MRRDs) of various origins are found on Earth and Mars. Examples include rootless cones, mud volcanoes, collapsed pingos, rimmed kettle holes, and basaltic ring structures. Correct identification of MRRDs on Mars is valuable because different MRRD types have different geologic and/or climatic implications and are often associated with volcanism and/or water, which may provide locales for biotic or prebiotic activity. In order to facilitate correct identification of fields of MRRDs on Mars and their implications, this work provides a review of common terrestrial MRRD types that occur in fields. In this review, MRRDs are grouped by formation mechanism, including hydrovolcanic (phreatomagmatic cones, basaltic ring structures), sedimentological (mud volcanoes), and ice-related (pingos, volatile ice-block forms) mechanisms. For each broad mechanism, we present a comparative synopsis of (i) morphology and observations, (ii) physical formation processes, and (iii) published hypothesized locations on Mars. Because the morphology of MRRDs may be ambiguous, an additional tool is provided for distinguishing fields of MRRDs by origin on Mars, namely, spatial distribution analyses for MRRDs within fields on Earth. We find that MRRDs have both distinguishing and similar characteristics, an observation that applies both to their mesoscale morphology and to their spatial distribution statistics. Thus, this review provides tools for distinguishing between various MRRDs, while highlighting the utility of the multiple working hypotheses approach.

  1. [Application of finite element method in spinal biomechanics].

    PubMed

    Liu, Qiang; Zhang, Jun; Sun, Shu-Chun; Wang, Fei

    2017-02-25

    The finite element model is one of the most important methods in modern spinal biomechanics research. According to the need to simulate various states of the spine, it is used to calculate the stress and strain distributions of the different structures in each state and to explore the underlying mechanics, injury mechanisms, and treatment effectiveness. In addition, in the study of pathological states of the spine, the finite element method is mainly used to understand the mechanism at the lesion site, to evaluate the effects of different therapeutic tools, and to assist in the selection and improvement of therapeutic tools, in order to provide a theoretical basis for the rehabilitation of spinal lesions. The finite element method can further serve patients through spinal correction, surgical planning, and individualized implant design. In the design and performance evaluation of implants, attention must be paid to individual differences and to perfecting the evaluation system. At present, how to establish a model that is closer to the real situation remains the focus and the difficulty of finite element studies of the human body. Although the finite element method can simulate complex working conditions well, it is necessary to improve the authenticity of the model and its sharing among research groups by using many kinds of methods, such as imaging science, statistics, kinematics and so on. Copyright © 2017 by the China Journal of Orthopaedics and Traumatology Press.

  2. Examining the role of fluctuations in the early stages of homogenous polymer crystallization with simulation and statistical learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Jr., Paul Michael

    Here, we propose a relationship between the dynamics in the amorphous and crystalline domains during polymer crystallization: the fluctuations of ordering-rate about a material-specific value in the amorphous phase drive those fluctuations associated with the increase in percent crystallinity. This suggests a differential equation that satisfies the three experimentally observed time regimes for the rate of crystal growth. To test this postulated expression, we applied a suite of statistical learning tools to molecular dynamics simulations to extract the relevant phenomenology. This study shows that the proposed relationship holds in the early time regime. It illustrates the effectiveness of soft computing tools in the analysis of coarse-grained simulations in which patterns exist, but may not easily yield to strict quantitative evaluation. This ability assists us in characterizing the critical early time molecular arrangement during the primary nucleation phase of polymer melt crystallization. In addition to supporting the validity of the proposed kinetics expression, the simulations show that (i) the classical nucleation and growth mechanism is active in the early stages of ordering; (ii) the number of nuclei and their masses grow linearly during this early time regime; and (iii) a fixed inter-nuclei distance is established.

  3. Examining the role of fluctuations in the early stages of homogenous polymer crystallization with simulation and statistical learning

    DOE PAGES

    Welch, Jr., Paul Michael

    2017-01-23

    Here, we propose a relationship between the dynamics in the amorphous and crystalline domains during polymer crystallization: the fluctuations of ordering-rate about a material-specific value in the amorphous phase drive those fluctuations associated with the increase in percent crystallinity. This suggests a differential equation that satisfies the three experimentally observed time regimes for the rate of crystal growth. To test this postulated expression, we applied a suite of statistical learning tools to molecular dynamics simulations to extract the relevant phenomenology. This study shows that the proposed relationship holds in the early time regime. It illustrates the effectiveness of soft computing tools in the analysis of coarse-grained simulations in which patterns exist, but may not easily yield to strict quantitative evaluation. This ability assists us in characterizing the critical early time molecular arrangement during the primary nucleation phase of polymer melt crystallization. In addition to supporting the validity of the proposed kinetics expression, the simulations show that (i) the classical nucleation and growth mechanism is active in the early stages of ordering; (ii) the number of nuclei and their masses grow linearly during this early time regime; and (iii) a fixed inter-nuclei distance is established.

  4. Replica analysis of overfitting in regression models for time-to-event data

    NASA Astrophysics Data System (ADS)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.
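
    The replica calculation itself is analytical; as a hedged toy illustration of the phenomenon the paper addresses, the following simulation shows in-sample fit quality rising while out-of-sample performance collapses as the number of parameters p approaches the sample size N (ordinary linear regression stands in for the survival models):

        import numpy as np

        rng = np.random.default_rng(3)
        N = 100                                          # training samples
        beta_true = np.zeros(95); beta_true[:5] = 1.0    # only 5 covariates truly matter

        def r2(y, yhat):
            return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

        for p in (5, 20, 50, 90):
            X = rng.normal(size=(N, p))
            X_test = rng.normal(size=(1000, p))
            y = X @ beta_true[:p] + rng.normal(size=N)
            y_test = X_test @ beta_true[:p] + rng.normal(size=1000)
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # maximum-likelihood fit (OLS)
            print(f"p = {p:2d}: in-sample R^2 = {r2(y, X @ coef):.2f}, "
                  f"out-of-sample R^2 = {r2(y_test, X_test @ coef):.2f}")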

  5. Granularity refined by knowledge: contingency tables and rough sets as tools of discovery

    NASA Astrophysics Data System (ADS)

    Zytkow, Jan M.

    2000-04-01

    Contingency tables represent data in a granular way and are a well-established tool for inductive generalization of knowledge from data. We show that the basic concepts of rough sets, such as concept approximation, indiscernibility, and reduct, can be expressed in the language of contingency tables. We further demonstrate the relevance to rough sets theory of the additional probabilistic information available in contingency tables, in particular of statistical tests of significance and of predictive strength applied to contingency tables. Tests of both types can help the evaluation mechanisms used in inductive generalization based on rough sets. Granularity of attributes can be improved in feedback with knowledge discovered in data. We demonstrate how 49er's facilities for (1) contingency table refinement, (2) column and row grouping based on correspondence analysis, and (3) the search for equivalence relations between attributes improve both the granularization of attributes and the quality of knowledge. Finally we demonstrate the limitations of knowledge viewed as concept approximation, which is the focus of rough sets. Transcending that focus and reorienting towards predictive knowledge and towards the related distinction between possible and impossible (or statistically improbable) situations will be very useful in expanding the rough sets approach to more expressive forms of knowledge.
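
    A minimal SciPy sketch of the contingency-table significance test and a simple predictive-strength measure discussed above (the table values are invented):

        import numpy as np
        from scipy.stats import chi2_contingency

        # Invented 2x3 contingency table: rows = decision class, columns = attribute granule.
        table = np.array([[30, 14,  6],
                          [10, 22, 18]])

        chi2, p, dof, expected = chi2_contingency(table)

        # Cramer's V as a simple measure of the predictive strength of the association.
        n = table.sum()
        cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3g}, Cramer's V = {cramers_v:.2f}")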

  6. A Data Warehouse Architecture for DoD Healthcare Performance Measurements.

    DTIC Science & Technology

    1999-09-01

    ...design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics. Within the DoD healthcare framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse...

  7. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  8. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  9. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

    With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  10. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. Describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. Establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. Develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. Recognizes the gap in spatially-explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver-Operator Characteristic curve, the impurity entropy and Gini functions, and the Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially-explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. Emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with valuable directions for future research.
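
    A hedged sketch of two of the assessment tools named, the ROC curve area and Gini/entropy impurity measures, on synthetic agent-decision data (the labels and scores below are invented):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)
        # Synthetic: 1 = parcel acquired, 0 = not acquired, plus a model score for each decision.
        y_true = rng.integers(0, 2, size=500)
        score = 0.6 * y_true + 0.4 * rng.random(500)   # an imperfect but informative score

        print("ROC AUC:", roc_auc_score(y_true, score))

        def gini(labels):
            p = np.bincount(labels) / len(labels)
            return 1.0 - np.sum(p ** 2)

        def entropy(labels):
            p = np.bincount(labels) / len(labels)
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        print("Gini impurity:", gini(y_true), " entropy:", entropy(y_true))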

  11. Nonlinear dynamic mechanism of vocal tremor from voice analysis and model simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Jiang, Jack J.

    2008-09-01

    Nonlinear dynamic analysis and model simulations are used to study the nonlinear dynamic characteristics of vocal folds with vocal tremor, which can typically be characterized by low-frequency modulation and aperiodicity. Tremor voices from patients with disorders such as paresis, Parkinson's disease, hyperfunction, and adductor spasmodic dysphonia show low-dimensional characteristics, differing from random noise. Correlation dimension analysis statistically distinguishes tremor voices from normal voices. Furthermore, a nonlinear tremor model is proposed to study the vibrations of the vocal folds with vocal tremor. Fractal dimensions and positive Lyapunov exponents demonstrate the evidence of chaos in the tremor model, where amplitude and frequency play important roles in governing vocal fold dynamics. Nonlinear dynamic voice analysis and vocal fold modeling may provide a useful set of tools for understanding the dynamic mechanism of vocal tremor in patients with laryngeal diseases.
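
    A hedged NumPy sketch of the Grassberger-Procaccia-style correlation-sum estimate that underlies correlation dimension analysis (the signal and parameters below are toy stand-ins, not the voice recordings or the authors' code):

        import numpy as np

        def correlation_dimension(signal, emb_dim=4, delay=5, radii=None):
            """Estimate the correlation dimension from the slope of log C(r) versus log r."""
            n = len(signal) - (emb_dim - 1) * delay
            # Time-delay embedding of the scalar signal.
            emb = np.column_stack([signal[i * delay : i * delay + n] for i in range(emb_dim)])
            # Pairwise distances (fine for up to ~1000 points).
            d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))
            d = d[np.triu_indices(n, k=1)]
            if radii is None:
                radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), 12)[2:-2]
            corr_sum = np.array([(d <= r).mean() for r in radii])
            slope, _ = np.polyfit(np.log(radii), np.log(corr_sum), 1)
            return slope

        t = np.arange(4000) * 0.01
        toy = np.sin(t) + 0.3 * np.sin(2.17 * t)        # quasiperiodic stand-in for a voice sample
        print("estimated correlation dimension:", correlation_dimension(toy[::5]))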

  12. Current-voltage characteristics and transition voltage spectroscopy of individual redox proteins.

    PubMed

    Artés, Juan M; López-Martínez, Montserrat; Giraudet, Arnaud; Díez-Pérez, Ismael; Sanz, Fausto; Gorostiza, Pau

    2012-12-19

    Understanding how molecular conductance depends on voltage is essential for characterizing molecular electronics devices. We reproducibly measured current-voltage characteristics of individual redox-active proteins by scanning tunneling microscopy under potentiostatic control in both tunneling and wired configurations. From these results, transition voltage spectroscopy (TVS) data for individual redox molecules can be calculated and analyzed statistically, adding a new dimension to conductance measurements. The transition voltage (TV) is discussed in terms of the two-step electron transfer (ET) mechanism. Azurin displays the lowest TV measured to date (0.4 V), consistent with the previously reported distance decay factor. This low TV may be advantageous for fabricating and operating molecular electronic devices for different applications. Our measurements show that TVS is a helpful tool for single-molecule ET measurements and suggest a mechanism for gating of ET between partner redox proteins.
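
    A hedged sketch of how a transition voltage is typically extracted from an I-V curve (the current-voltage data below follow an invented cubic toy model, not the STM measurements): the TV is the bias at which ln(|I|/V^2) plotted against 1/V reaches its minimum, as in a Fowler-Nordheim-type plot.

        import numpy as np

        # Synthetic low-bias tunneling I-V curve (cubic toy model, arbitrary units).
        V = np.linspace(0.05, 1.5, 300)
        I = 1e-9 * (V + 0.8 * V**3)

        # Fowler-Nordheim-type coordinate: ln(|I|/V^2), to be read against 1/V.
        fn_y = np.log(np.abs(I) / V**2)

        # The transition voltage is the bias at which this curve has its minimum.
        tv = V[np.argmin(fn_y)]
        print(f"transition voltage ~ {tv:.2f} V")   # ~1.1 V for this toy curve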

  13. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
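
    A hedged sketch of one PANDA-view-style analysis step, per-protein t-tests followed by a volcano plot (simulated intensities; not the tool's own code):

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(5)
        n_proteins = 2000
        # Simulated log2 intensities: 3 control vs 3 treatment replicates; ~5% truly regulated.
        ctrl = rng.normal(20, 1, size=(n_proteins, 3))
        treat = rng.normal(20, 1, size=(n_proteins, 3))
        regulated = rng.random(n_proteins) < 0.05
        treat[regulated] += rng.choice([-2, 2], size=regulated.sum())[:, None]

        log2_fc = treat.mean(axis=1) - ctrl.mean(axis=1)
        pvals = ttest_ind(treat, ctrl, axis=1).pvalue

        plt.scatter(log2_fc, -np.log10(pvals), s=5,
                    c=np.where((np.abs(log2_fc) > 1) & (pvals < 0.01), "red", "grey"))
        plt.xlabel("log2 fold change"); plt.ylabel("-log10 p-value")
        plt.title("Volcano plot (simulated data)")
        plt.show()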

  14. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  15. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  16. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students’ overall understanding and suggests better long-term knowledge retention. PMID:19750185

  17. The physics of lipid droplet nucleation, growth and budding.

    PubMed

    Thiam, Abdou Rachid; Forêt, Lionel

    2016-08-01

    Lipid droplets (LDs) are intracellular oil-in-water emulsion droplets, covered by a phospholipid monolayer and mainly present in the cytosol. Despite their important role in cellular metabolism and a growing number of newly identified functions, the mechanism of LD formation from the endoplasmic reticulum remains poorly understood. To form an LD, the oil molecules synthesized in the ER accumulate between the monolayer leaflets and induce deformation of the membrane. This formation process proceeds through three steps: nucleation, growth and budding, exactly as in phase separation and dewetting phenomena. These steps involve sequential biophysical membrane remodeling mechanisms, and we present the basic tools of statistical physics, membrane biophysics, and soft matter science that underlie them. We aim to highlight, through this physical description, the relevant factors that could control LD formation size, site and number. Emphasis is given to the currently underestimated contribution of the molecular interactions between lipids, which can favor an energetically costless mechanism of LD formation. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Optimization of factors to obtain cassava starch films with improved mechanical properties

    NASA Astrophysics Data System (ADS)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In this study, we investigated the optimization of the factors that significantly influence the improvement of the mechanical properties of cassava starch films, using a complete 2³ factorial design. The factors analyzed were the cassava starch, glycerol and modified clay contents. A regression model was proposed from the factorial analysis, aiming to estimate the settings of the individual factors at the optimum of the mechanical properties of the biofilm, using the following statistical tools: the desirability function and the response surface. The response variable that defines the improvement of the mechanical properties of the biofilm is the tensile strength; the improvement is obtained by maximizing this response variable. The factorial analysis showed that the best combination of factor settings was 5 g of cassava starch, 10% glycerol and 5% modified clay, both percentages relative to the dry mass of starch used. In addition, the starch biofilm showing the lowest response contained 2 g of cassava starch, 0% modified clay and 30% glycerol, and was consequently considered the worst biofilm.
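
    A hedged sketch of fitting a saturated effects model to a 2^3 factorial design (the design structure mirrors the three factors described, but the tensile-strength responses below are invented):

        import numpy as np
        from itertools import product

        # Coded levels (-1, +1) for the three factors: starch, glycerol, modified clay.
        runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)
        tensile = np.array([2.1, 3.0, 1.5, 2.4, 2.8, 4.1, 2.0, 3.3])  # invented responses (MPa)

        A, B, C = runs.T
        # Design matrix: intercept, main effects, two-factor and three-factor interactions.
        X = np.column_stack([np.ones(8), A, B, C, A*B, A*C, B*C, A*B*C])
        coefs, *_ = np.linalg.lstsq(X, tensile, rcond=None)

        names = ["mean", "starch", "glycerol", "clay", "starch:glycerol",
                 "starch:clay", "glycerol:clay", "starch:glycerol:clay"]
        for name, c in zip(names, coefs):
            print(f"{name:22s} effect = {2*c:+.2f}")   # effect = 2 x coded coefficient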

  19. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    ERIC Educational Resources Information Center

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  20. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  1. On the blind use of statistical tools in the analysis of globular cluster stars

    NASA Astrophysics Data System (ADS)

    D'Antona, Francesca; Caloi, Vittoria; Tailo, Marco

    2018-04-01

    As with most data analysis methods, the Bayesian method must be handled with care. We show that its application to determine stellar evolution parameters within globular clusters can lead to paradoxical results if used without the necessary precautions. This is a cautionary tale on the use of statistical tools for big data analysis.

  2. Micro-mechanics of hydro-mechanical coupled processes during hydraulic fracturing in sandstone

    NASA Astrophysics Data System (ADS)

    Caulk, R.; Tomac, I.

    2017-12-01

    This contribution presents a micro-mechanical study of hydraulic fracture initiation and propagation in sandstone. The Discrete Element Method (DEM) Yade software is used as a tool to model the fully coupled hydro-mechanical behavior of saturated sandstone under pressures typical of deep geo-reservoirs. Heterogeneity of the sandstone tensile and shear strength parameters is introduced using a statistical representation of cathodoluminescence (CL) images of the sandstone rock. A Weibull distribution of the statistical parameter values was determined as the best match to the CL scans of sandstone grains and of the cement between grains. Results of hydraulic fracturing stimulation from the wellbore indicate a significant difference between models with bond strengths informed by the CL scans and models with a uniform, homogeneous representation of the sandstone parameters. Micro-mechanical insight reveals that the hydraulic fracture formed is typical of mode I, or tensile, cracking in both cases. However, shear micro-cracks are abundant in the CL-informed model while they are absent in the standard model with a uniform strength distribution. Most of the mode II cracks, or shear micro-cracks, are not part of the main hydraulic fracture and occur in the near-tip and near-fracture areas. The position and occurrence of the shear micro-cracks are characterized as a secondary effect which dissipates the hydraulic fracturing energy. Additionally, the shear micro-crack locations qualitatively resemble the acoustic emission clouds of shear cracks frequently observed in hydraulic fracturing, which are sometimes interpreted as re-activation of existing fractures. Our model, in contrast, contains no pre-existing cracks and is continuous prior to fracturing. This observation is novel and interesting and is quantified in the paper. The shear particle contact force field reveals significant relaxation compared to the model with a uniform strength distribution.
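
    A hedged NumPy sketch of the statistical ingredient described, assigning Weibull-distributed bond strengths instead of uniform ones (the shape and scale parameters below are invented; in the study they are informed by the CL images):

        import numpy as np

        rng = np.random.default_rng(6)
        n_bonds = 10_000

        # NumPy's weibull() draws with unit scale, so multiply by the scale parameter.
        k_tensile, scale_tensile = 2.5, 6.0e6    # invented shape and scale (Pa)
        k_shear,   scale_shear   = 2.0, 1.2e7

        tensile_strength = scale_tensile * rng.weibull(k_tensile, n_bonds)
        shear_strength   = scale_shear   * rng.weibull(k_shear, n_bonds)

        print("tensile strength: mean = %.2e Pa, CoV = %.2f"
              % (tensile_strength.mean(), tensile_strength.std() / tensile_strength.mean()))
        print("weakest 1%% of bonds below %.2e Pa" % np.percentile(tensile_strength, 1))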

  3. Personalizing oncology treatments by predicting drug efficacy, side-effects, and improved therapy: mathematics, statistics, and their integration.

    PubMed

    Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri

    2014-01-01

    Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the main stream of clinical oncology. © 2014 Wiley Periodicals, Inc.

  4. Applied Mathematical Methods in Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Masujima, Michio

    2005-04-01

    All there is to know about functional analysis, integral equations and calculus of variations in a single volume. This advanced textbook is divided into two parts: the first on integral equations and the second on the calculus of variations. It begins with a short introduction to functional analysis, including a brief review of complex analysis, before continuing with a systematic discussion of different types of equations, such as Volterra integral equations, singular integral equations of Cauchy type and integral equations of the Fredholm type, with a special emphasis on Wiener-Hopf integral equations and Wiener-Hopf sum equations. After a few remarks on the historical development, the second part starts with an introduction to the calculus of variations and the relationship between integral equations and applications of the calculus of variations. It further covers applications of the calculus of variations developed in the second half of the 20th century in the fields of quantum mechanics, quantum statistical mechanics and quantum field theory. Throughout the book, the author presents over 150 problems and exercises -- many from such branches of physics as quantum mechanics, quantum statistical mechanics, and quantum field theory -- together with outlines of the solutions in each case. Detailed solutions are given, supplementing the materials discussed in the main text and allowing problems to be solved making direct use of the methods illustrated. The original references are given for difficult problems. The result is complete coverage of the mathematical tools and techniques used by physicists and applied mathematicians. Intended for senior undergraduates and first-year graduates in science and engineering, this is equally useful as a reference and self-study guide.

  5. OASIS 2: online application for survival analysis 2 with features for the analysis of maximal lifespan and healthspan in aging research.

    PubMed

    Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk

    2016-08-30

    Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
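
    OASIS 2 itself is a web service; as a rough illustration of the survival estimates underlying such tools, here is a minimal Kaplan-Meier sketch on toy lifespan data (all values invented):

    ```python
    import numpy as np

    def kaplan_meier(time, event):
        """Kaplan-Meier survival estimate; event=1 for death, 0 for censored."""
        order = np.argsort(time)
        time, event = np.asarray(time)[order], np.asarray(event)[order]
        surv, s = [], 1.0
        for t in np.unique(time[event == 1]):
            at_risk = np.sum(time >= t)
            deaths = np.sum((time == t) & (event == 1))
            s *= 1.0 - deaths / at_risk
            surv.append((t, s))
        return surv

    # Toy lifespan data (days); 0 marks an animal censored before death.
    time = [18, 20, 22, 22, 25, 27, 30, 31, 33, 35]
    event = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]

    for t, s in kaplan_meier(time, event):
        print(f"day {t:>2}: S(t) = {s:.2f}")
    ```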

  6. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, diagnosing such an imbalance is essential so that the statistical analysis can be adjusted if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small as compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for pre-selection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
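
    A minimal sketch of the proposed diagnostic on simulated toy data: fit a propensity score model for arm membership from baseline covariates and read off its c-statistic (ROC AUC); values near 0.5 suggest good balance, values well above 0.5 flag a global imbalance.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)

    # Toy CRT data: arm indicator and two baseline covariates, one imbalanced.
    n = 1000
    arm = rng.integers(0, 2, n)
    x1 = rng.normal(0.3 * arm, 1.0)      # imbalanced covariate
    x2 = rng.normal(0.0, 1.0, n)         # balanced covariate
    X = np.column_stack([x1, x2])

    # Propensity score model: predict arm from baseline covariates.
    ps_model = LogisticRegression().fit(X, arm)
    ps = ps_model.predict_proba(X)[:, 1]

    # c-statistic of the PS model, used here as the global imbalance diagnostic.
    print("c-statistic:", round(roc_auc_score(arm, ps), 3))
    ```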

  7. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels, aimed at measuring and collaborating on, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the "dream" of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. One example is dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, "OECD eXplorer", a customized tool for interactively analyzing data and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  8. Use of Statistical Heuristics in Everyday Inductive Reasoning.

    ERIC Educational Resources Information Center

    Nisbett, Richard E.; And Others

    1983-01-01

    In everyday reasoning, people use statistical heuristics (judgmental tools that are rough intuitive equivalents of statistical principles). Use of statistical heuristics is more likely when (1) sampling is clear, (2) the role of chance is clear, (3) statistical reasoning is normative for the event, or (4) the subject has had training in…

  9. Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending

    NASA Astrophysics Data System (ADS)

    Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus

    2013-06-01

    Mechanical failure resulting from subcritical crack growth in the SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered as an electrical probe to monitor the subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate crack propagation in the barrier layer. As a consequence of using two loading modes, the characteristic failure strain and failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. In this study, results from the tests in dynamic and static loading modes were linked by a power-law description to determine critical failure over a range of conditions. The fatigue parameter n from the power law reduces greatly from 70 to 31 upon correcting for internal strain. The testing method and analysis tool described in the paper can be used to understand the limits of thin-film barriers in terms of their mechanical properties.
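
    As a hedged illustration of the Weibull failure statistics used here (synthetic strain values and assumed parameters, not the paper's data):

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical failure strains (%) from a two-point bending campaign.
    strains = weibull_min.rvs(c=8.0, scale=1.2, size=50, random_state=3)

    # Fit a two-parameter Weibull distribution (location fixed at zero).
    shape, loc, scale = weibull_min.fit(strains, floc=0.0)
    print(f"Weibull modulus m = {shape:.1f}, characteristic strain = {scale:.2f}%")

    # Estimated failure probability at a given strain level.
    print("P(failure at 1.0%):", round(weibull_min.cdf(1.0, shape, loc, scale), 3))
    ```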

  10. Education on invasive mechanical ventilation involving intensive care nurses: a systematic review.

    PubMed

    Guilhermino, Michelle C; Inder, Kerry J; Sundin, Deborah

    2018-03-26

    Intensive care unit nurses are critical for managing mechanical ventilation. Continuing education is essential in building and maintaining nurses' knowledge and skills, potentially improving patient outcomes. The aim of this study was to determine whether continuing education programmes on invasive mechanical ventilation involving intensive care unit nurses are effective in improving patient outcomes. Five electronic databases were searched from 2001 to 2016 using keywords such as mechanical ventilation, nursing and education. Inclusion criteria were invasive mechanical ventilation continuing education programmes that involved nurses and measured patient outcomes. Primary outcomes were intensive care unit mortality and in-hospital mortality. Secondary outcomes included hospital and intensive care unit length of stay, length of intubation, failed weaning trials, re-intubation incidence, ventilator-associated pneumonia rate and lung-protective ventilator strategies. Studies were excluded if they excluded nurses, patients were ventilated for less than 24 h, the education content focused on protocol implementation or oral care exclusively, or the outcomes were participant satisfaction. Quality was assessed by two reviewers using an education intervention critical appraisal worksheet and a risk of bias assessment tool. Data were extracted independently by two reviewers and analysed narratively due to heterogeneity. Twelve studies met the inclusion criteria for full review: 11 pre- and post-intervention observational and 1 quasi-experimental design. Studies reported statistically significant reductions in hospital length of stay, length of intubation, ventilator-associated pneumonia rates and failed weaning trials, and improvements in lung-protective ventilation compliance. Non-statistically significant results were reported for in-hospital and intensive care unit mortality, re-intubation and intensive care unit length of stay. There is limited evidence that continuing education programmes on mechanical ventilation involving nurses improve patient outcomes. Comprehensive continuing education is required. Well-designed trials are required to confirm that comprehensive continuing education involving intensive care nurses about mechanical ventilation improves patient outcomes. © 2018 British Association of Critical Care Nurses.

  11. Econometric Assessment of "One Minute" Paper as a Pedagogic Tool

    ERIC Educational Resources Information Center

    Das, Amaresh

    2010-01-01

    This paper presents an econometric test of the one-minute paper used as a tool to manage and assess instruction in my statistics class. One of our findings is that the one-minute paper, when tested using an OLS estimate in a controlled versus experimental design framework, is found to be statistically significant and effective in enhancing…

  12. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
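
    A short simulation of the sampling distribution of the mean, of the kind often used to illustrate the Central Limit Theorem in such courses (toy parent population):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Skewed parent population (exponential): the sampling distribution of the
    # mean should look approximately normal once n is moderately large.
    population = rng.exponential(scale=2.0, size=100_000)

    for n in (2, 10, 50):
        means = rng.choice(population, size=(5_000, n)).mean(axis=1)
        print(f"n={n:>2}: mean={means.mean():.2f}, sd={means.std():.3f}, "
              f"theory sd={population.std() / np.sqrt(n):.3f}")
    ```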

  13. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows). This bistability is predicted by statistical mechanics.

  14. ProUCL version 4.1.00 Documentation Downloads

    EPA Pesticide Factsheets

    ProUCL version 4.1.00 represents a comprehensive statistical software package equipped with the statistical methods and graphical tools needed to address many environmental sampling and statistical issues, as described in various guidance documents.

  15. Comparative study of coated and uncoated tool inserts with dry machining of EN47 steel using Taguchi L9 optimization technique

    NASA Astrophysics Data System (ADS)

    Vasu, M.; Shivananda, Nayaka H.

    2018-04-01

    EN47 steel samples are machined on a self-centered lathe using Chemical Vapor Deposition (CVD) coated TiCN/Al2O3/TiN and uncoated tungsten carbide tool inserts with a nose radius of 0.8 mm. Results are compared with each other and optimized using statistical tools. The input (cutting) parameters considered in this work are feed rate (f), cutting speed (Vc), and depth of cut (ap); the optimization criteria are based on the Taguchi L9 orthogonal array. The ANOVA method is adopted to evaluate the statistical significance and the percentage contribution for each model. Multiple response characteristics, namely cutting force (Fz), tool tip temperature (T) and surface roughness (Ra), are evaluated. The results show that the coated tool insert (TiCN/Al2O3/TiN) performs 1.27 and 1.29 times better than the uncoated tool insert with respect to tool tip temperature and surface roughness, respectively. A slight increase in cutting force was observed for coated tools.
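
    A minimal sketch of a Taguchi L9 analysis with a smaller-the-better signal-to-noise ratio; the response values below are invented placeholders, not the study's measurements:

    ```python
    import numpy as np

    # Standard L9 orthogonal array (9 runs, up to 4 factors at 3 levels);
    # here three columns code cutting speed, feed rate and depth of cut.
    L9 = np.array([
        [1, 1, 1], [1, 2, 2], [1, 3, 3],
        [2, 1, 2], [2, 2, 3], [2, 3, 1],
        [3, 1, 3], [3, 2, 1], [3, 3, 2],
    ])

    # Hypothetical surface roughness Ra (um) for each of the nine runs.
    Ra = np.array([1.42, 1.18, 1.05, 1.30, 1.11, 1.25, 1.20, 1.35, 1.08])

    # Smaller-the-better signal-to-noise ratio.
    sn = -10.0 * np.log10(Ra**2)

    for j, name in enumerate(["cutting speed", "feed rate", "depth of cut"]):
        level_means = [sn[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
        print(f"{name:>13}: mean S/N per level = "
              + ", ".join(f"{m:.2f}" for m in level_means))
    ```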

  16. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study the Bose-Einstein statistics with the related problem of condensation, and study the Fermi-Dirac statistics.

  17. Socio-economic inequity in demand for insecticide-treated nets, in-door residual house spraying, larviciding and fogging in Sudan.

    PubMed

    Onwujekwe, Obinna; Malik, El-Fatih Mohamed; Mustafa, Sara Hassan; Mnzava, Abraham

    2005-12-15

    In order to optimally prioritize and use public and private budgets for equitable malaria vector control, there is a need to determine the level and determinants of consumer demand for different vector control tools. To determine the demand from people of different socio-economic groups for indoor residual house-spraying (IRHS), insecticide-treated nets (ITNs), larviciding with chemicals (LWC), and space spraying/fogging (SS), and the disease control implications of the result. Ratings and levels of willingness-to-pay (WTP) for the vector control tools were determined using a random cross-sectional sample of 720 households drawn from two states. WTP was elicited using the bidding game. An asset-based socio-economic status (SES) index was used to explore whether WTP was related to the SES of the respondents. IRHS received the highest proportion of most-preferred ratings (41.0%), followed by ITNs (23.1%). However, ITNs had the highest mean WTP followed by IRHS, while LWC had the lowest. The regression analysis showed that SES was positively and statistically significantly related to WTP across the four vector control tools, and that the respondents' rating of IRHS and ITNs significantly explained their levels of WTP for the two tools. People were willing to pay for all the vector-control tools, but the demand for the vector control tools was related to the SES of the respondents. Hence, it is vital that there are public policies and financing mechanisms to ensure equitable provision and utilisation of vector control tools, as well as protecting the poor from cost-sharing arrangements.

  18. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  19. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis

  20. Investigation into the Effects of Textural Properties on Cuttability Performance of a Chisel Tool

    NASA Astrophysics Data System (ADS)

    Tumac, Deniz; Copur, Hanifi; Balci, Cemal; Er, Selman; Avunduk, Emre

    2018-04-01

    The main objective of this study is to investigate the effect of the textural properties of stones on the cutting performance of a standard chisel tool. In addition, the relationships between textural properties, cutting performance parameters, and physical and mechanical properties were statistically analyzed. For this purpose, physical and mechanical property tests and mineralogical and petrographic analyses were carried out on eighteen natural stone samples, which can be grouped into three fundamentally different geological origins, i.e., metamorphic, igneous, and sedimentary. Then, texture coefficient analyses were performed on the samples. To determine the cuttability of the stones, the samples were cut with a portable linear cutting machine using a standard chisel tool at different depths of cut in unrelieved (non-interactive) cutting mode. The average and maximum forces (normal and cutting) and specific energy were measured, and the obtained values were correlated with texture coefficient, packing weighting, and grain size. With reference to the relation between depth of cut and cutting performance of the chisel tool for the three types of natural stone groups, specific energy decreases with increasing depth of cut, and cutting forces increase in proportion to the depth of cut. The same trend is observed for the relationship between packing weighting and both specific energy and cutter forces. On the other hand, specific energy and the forces decrease as grain size increases. Based on the findings of the present study, texture coefficient has a strong correlation with specific energy. In general, lower depth-of-cut values in the cutting tests show higher and more reliable correlations with texture coefficient than larger depths of cut. The cutting test results also show that, at a lower depth of cut (less than 1.5 mm), even stronger correlations can be observed between texture coefficient and cutting performance. The experimental studies indicate that the cutting performance of chisel tools can be predicted based on the texture coefficients of the natural stones.
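
    As a rough illustration of the kind of correlation analysis reported (invented texture-coefficient and specific-energy values, not the study's data):

    ```python
    import numpy as np
    from scipy.stats import linregress, pearsonr

    # Hypothetical (texture coefficient, specific energy) pairs at a fixed,
    # shallow depth of cut; real values would come from linear cutting tests.
    tc = np.array([1.1, 1.3, 1.5, 1.7, 1.9, 2.1, 2.3, 2.5])
    se = np.array([7.2, 8.1, 9.0, 9.4, 10.6, 11.1, 12.3, 12.9])  # MJ/m^3

    r, p = pearsonr(tc, se)
    fit = linregress(tc, se)
    print(f"r = {r:.2f} (p = {p:.3f}); SE ~ {fit.slope:.1f}*TC + {fit.intercept:.1f}")
    ```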

  1. Dynamics and Emergent Structures in Active Fluids

    NASA Astrophysics Data System (ADS)

    Baskaran, Aparna

    2014-03-01

    In this talk, we consider an active fluid of colloidal-sized particles, with the primary manifestation of activity being a self-replenishing velocity along one body axis of the particle. This is a minimal model for varied systems such as bacterial colonies, cytoskeletal filament motility assays, vibrated granular particles and self-propelled diffusiophoretic colloids, depending on the nature of the interaction among the particles. Using microscopic Brownian dynamics simulations, coarse-graining using the tools of non-equilibrium statistical mechanics, and analysis of macroscopic hydrodynamic theories, we characterize emergent structures seen in these systems, which are determined by the symmetry of the interactions among the active units, such as propagating density waves, dense stationary bands, asters and phase-separated isotropic clusters. We identify a universal mechanism, termed "self-regulation", as the underlying physics that leads to these structures in diverse systems. Support from NSF through DMR-1149266 and DMR-0820492.

  2. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
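
    A simplified regression-on-order-statistics (ROS) sketch for a single detection limit; the published tools handle multiple detection limits and more careful plotting positions, so this is only an outline of the idea, with toy data:

    ```python
    import numpy as np
    from scipy.stats import norm, linregress

    rng = np.random.default_rng(5)

    # Toy lognormal concentrations with a single detection limit (DL).
    conc = rng.lognormal(mean=0.0, sigma=1.0, size=60)
    DL = 0.5
    censored = conc < DL          # these would be reported only as "<DL"

    # Plotting positions for all ranks; censored values occupy the lowest ranks
    # (valid here because every censored value lies below every detected value).
    n = conc.size
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
    z = norm.ppf(pp)

    detected = np.sort(conc[~censored])
    z_detected = z[censored.sum():]          # normal quantiles of the upper ranks
    fit = linregress(z_detected, np.log(detected))

    # Impute censored values from the fitted line and combine for summary stats.
    imputed = np.exp(fit.intercept + fit.slope * z[:censored.sum()])
    combined = np.concatenate([detected, imputed])
    print(f"ROS mean = {combined.mean():.2f}, ROS sd = {combined.std(ddof=1):.2f}")
    ```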

  3. Mechanics and energetics in tool manufacture and use: a synthetic approach.

    PubMed

    Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya

    2014-11-06

    Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs energetic cost of manufacture as compared to the case with a fixed tool. This paper studies mechanics and energetics aspects of tool manufacture and use in non-human beings. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experiment results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimension of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also be interesting for engineers to design adaptive machines. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  4. Mechanics and energetics in tool manufacture and use: a synthetic approach

    PubMed Central

    Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya

    2014-01-01

    Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs energetic cost of manufacture as compared to the case with a fixed tool. This paper studies mechanics and energetics aspects of tool manufacture and use in non-human beings. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experiment results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimension of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also be interesting for engineers to design adaptive machines. PMID:25209405

  5. Studies of elasticity, sound propagation and attenuation of acoustic modes in granular media: final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makse, Hernan A.; Johnson, David L.

    2014-09-03

    This is the final report describing the results of DOE Grant # DE-FG02-03ER15458 with original termination date of April 31, 2013, which has been extended to April 31, 2014. The goal of this project is to develop a theoretical and experimental understanding of sound propagation, elasticity and dissipation in granular materials. The topic is relevant for the efficient production of hydrocarbon and for identifying and characterizing underground formations for storage of either CO 2 or nuclear waste material. Furthermore, understanding the basic properties of acoustic propagation in granular media is of importance not only to the energy industry, but also to the pharmaceutical, chemical and agricultural industries. We employ a set of experimental, theoretical and computational tools to develop a study of acoustics and dissipation in granular media. These include the concept of effective mass of granular media, normal modes analysis, statistical mechanics frameworks and numerical simulations based on Discrete Element Methods. Effective mass measurements allow us to study the mechanisms of the elastic response and attenuation of acoustic modes in granular media. We perform experiments and simulations under varying conditions, including humidity and vacuum, and different interparticle force laws, to develop a fundamental understanding of the mechanisms of damping and acoustic propagation in granular media. A theoretical statistical approach studies the necessary phase space of configurations in pressure and volume fraction to classify granular materials.

  6. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in Bioinformatics for interpreting the results of a database search or for managing the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in Bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing and easy graphical representation, and for their general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics were shown to be very useful for trying many parameters in a rapid fashion without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-learning and continuing education.

  7. Universal biology and the statistical mechanics of early life.

    PubMed

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-12-28

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media.This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  8. Universal biology and the statistical mechanics of early life

    NASA Astrophysics Data System (ADS)

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-11-01

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  9. Mesoscale raised rim depressions (MRRDs) on Earth: A review of the characteristics, processes, and spatial distributions of analogs for Mars

    USGS Publications Warehouse

    Burr, D.M.; Bruno, B.C.; Lanagan, P.D.; Glaze, L.S.; Jaeger, W.L.; Soare, R.J.; Wan, Bun Tseung J.-M.; Skinner, J.A.; Baloga, S.M.

    2009-01-01

    Fields of mesoscale raised rim depressions (MRRDs) of various origins are found on Earth and Mars. Examples include rootless cones, mud volcanoes, collapsed pingos, rimmed kettle holes, and basaltic ring structures. Correct identification of MRRDs on Mars is valuable because different MRRD types have different geologic and/or climatic implications and are often associated with volcanism and/or water, which may provide locales for biotic or prebiotic activity. In order to facilitate correct identification of fields of MRRDs on Mars and their implications, this work provides a review of common terrestrial MRRD types that occur in fields. In this review, MRRDs are grouped by formation mechanism, including hydrovolcanic (phreatomagmatic cones, basaltic ring structures), sedimentological (mud volcanoes), and ice-related (pingos, volatile ice-block forms) mechanisms. For each broad mechanism, we present a comparative synopsis of (i) morphology and observations, (ii) physical formation processes, and (iii) published hypothesized locations on Mars. Because the morphology of MRRDs may be ambiguous, an additional tool is provided for distinguishing fields of MRRDs by origin on Mars, namely, spatial distribution analyses for MRRDs within fields on Earth. We find that MRRDs have both distinguishing and similar characteristics, an observation that applies both to their mesoscale morphology and to their spatial distribution statistics. Thus, this review provides tools for distinguishing between various MRRDs, while highlighting the utility of the multiple working hypotheses approach. © 2008 Elsevier Ltd.

  10. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
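
    The ERIC record describes the approach only in outline; the following is a minimal Monte Carlo sketch in the same spirit (random pairwise, energy-conserving exchanges in a one-dimensional gas at fixed total energy), not the article's own program:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # 1D ideal gas: N particles of unit mass, fixed total kinetic energy.
    N, steps = 1000, 50_000
    v = np.ones(N)                       # ordered initial state, E_tot = N/2

    for _ in range(steps):
        i, j = rng.choice(N, size=2, replace=False)
        e_pair = v[i]**2 + v[j]**2       # pair kinetic energy (x2/m) is conserved
        theta = rng.uniform(0.0, 2.0 * np.pi)
        v[i], v[j] = np.sqrt(e_pair) * np.cos(theta), np.sqrt(e_pair) * np.sin(theta)

    # Disordered motion emerges: velocities approach a Maxwell-Boltzmann
    # (Gaussian) distribution even though every move conserves total energy.
    print("total energy:", 0.5 * np.sum(v**2))
    print("<v> =", round(v.mean(), 3), " <v^2> =", round((v**2).mean(), 3))
    ```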

  11. Fetal Alcohol Spectrum Disorders (FASDs)

    MedlinePlus

    ... other research. DATA & STATISTICS Data and statistics highlights. Interventions CHOICES program and alcohol screening and brief intervention (SBI). EDUCATION & TRAINING Tools, training centers, & educational resources. ...

  12. Development of a Hands-On Survey Course in the Physics of Living Systems

    NASA Astrophysics Data System (ADS)

    Matthews, Megan; Goldman, Daniel I.

    Due to the widespread availability and technological capabilities of modern smartphones, many biophysical systems can be investigated using easily accessible, low-cost, and/or "homemade" equipment. Our survey course is structured to provide students with an overview of research in the physics of living systems, emphasizing the interplay between measurement, mechanism, and modeling required to understand principles at the intersection of physics and biology. The course proceeds through seven modules each consisting of one week of lectures and one week of hands-on experiments, called "microlabs". Using smartphones, Arduinos, and 3D printed materials students create their own laboratory equipment, including a 150X van Leeuwenhoek microscope, a shaking incubator, and an oscilloscope, and then use them to study biological systems ranging in length scales from nanometers to meters. These systems include population dynamics of rotifer/algae cultures, experimental evolution of multicellularity in budding yeast, and the bio- & neuromechanics involved in animal locomotion, among others. In each module, students are introduced to fundamental biological and physical concepts as well as theoretical and computational tools (nonlinear dynamics, molecular dynamics simulation, and statistical mechanics). At the end of the course, students apply these concepts and tools to the creation of their own microlab that integrates hands-on experimentation and modeling in the study of their chosen biophysical system.

  13. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
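
    Following the Laplacian-based density-matrix construction described above, a rough sketch of the spectral entropy and Jensen-Shannon comparison for two toy graphs (the value of beta and the graphs themselves are assumptions):

    ```python
    import numpy as np
    import networkx as nx
    from scipy.linalg import expm

    def density_matrix(G, beta=1.0):
        """Laplacian-based density matrix rho = exp(-beta L) / Tr exp(-beta L)."""
        L = nx.laplacian_matrix(G).toarray().astype(float)
        rho = expm(-beta * L)
        return rho / np.trace(rho)

    def von_neumann_entropy(rho):
        w = np.linalg.eigvalsh(rho)
        w = w[w > 1e-12]
        return float(-np.sum(w * np.log2(w)))

    def js_divergence(rho, sigma):
        mix = 0.5 * (rho + sigma)
        return von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho)
                                                 + von_neumann_entropy(sigma))

    G1 = nx.erdos_renyi_graph(50, 0.1, seed=1)
    G2 = nx.erdos_renyi_graph(50, 0.3, seed=2)
    rho1, rho2 = density_matrix(G1), density_matrix(G2)
    print("S(G1) =", round(von_neumann_entropy(rho1), 3))
    print("S(G2) =", round(von_neumann_entropy(rho2), 3))
    print("sqrt(JS) distance =", round(np.sqrt(js_divergence(rho1, rho2)), 3))
    ```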

  14. eSACP - a new Nordic initiative towards developing statistical climate services

    NASA Astrophysics Data System (ADS)

    Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine

    2015-04-01

    The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and include functionality to utilize the extensive and dynamically growing repositories of data, using state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case on the consequences of our changing climate more clearly to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.

  15. THE ATMOSPHERIC MODEL EVALUATION TOOL

    EPA Science Inventory

    This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...

  16. Mechanical problem-solving strategies in Alzheimer's disease and semantic dementia.

    PubMed

    Lesourd, Mathieu; Baumard, Josselin; Jarry, Christophe; Etcharry-Bouyx, Frédérique; Belliard, Serge; Moreaud, Olivier; Croisile, Bernard; Chauviré, Valérie; Granjon, Marine; Le Gall, Didier; Osiurak, François

    2016-07-01

    The goal of this study was to explore whether the tool-use disorders observed in Alzheimer's disease (AD) and semantic dementia (SD) are of the same nature as those observed in left brain-damaged (LBD) patients. Recent evidence indicates that LBD patients with apraxia of tool use encounter difficulties in solving mechanical problems, characterized by the absence of specific strategies. This pattern may show the presence of impaired mechanical knowledge, critical for both familiar and novel tool use. So, we explored the strategies followed by AD and SD patients in mechanical problem-solving tasks in order to determine whether mechanical knowledge is also impaired in these patients. We used a mechanical problem-solving task in both choice (i.e., several tools were proposed) and no-choice (i.e., only 1 tool was proposed) conditions. We analyzed quantitative data and strategy profiles. AD patients but not SD patients met difficulties in solving mechanical problem-solving tasks. However, the key finding is that AD patients, despite their difficulties, showed strategy profiles that are similar to that of SD patients or controls. Moreover, AD patients exhibited a strategy profile distinct from the one previously observed in LBD patients. Those observations lead us to consider that difficulties met by AD patients to solve mechanical problems or even to use familiar tools may not be caused by mechanical knowledge impairment per se. In broad terms, what we call apraxia of tool use in AD is certainly not the same as apraxia of tool use observed in LBD patients. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Dragon-Kings, Black-Swans and Prediction (Invited)

    NASA Astrophysics Data System (ADS)

    Sornette, D.

    2010-12-01

    Extreme fluctuations or events are often associated with power law statistics. Indeed, it is a popular belief that "wild randomness" is deeply associated with distributions with power law tails characterized by small exponents. In other words, power law tails are often seen as the epitome of extreme events (the "Black Swan" story). Here, we document in very different systems that there is life beyond power law tails: power laws can be superseded by "dragon-kings", monster events that occur beyond (or changing) the power law tail. Dragon-kings reveal hidden mechanisms that are only transiently active and that amplify the normal fluctuations (often described by the power laws of the normal regime). The goal of this lecture is to catalyze the interest of the community of geophysicists across all fields of geosciences so that the "invisible gorilla" fallacy may be avoided. Our own research illustrates that new statistics or representations of data are often necessary to identify dragon-kings, with strategies guided by the underlying mechanisms. Paradoxically, the monsters may be ignored or hidden by the use of inappropriate analysis or statistical tools that amount to cutting a mammoth into small pieces, so as to lead to the incorrect belief that only mice exist. In order to stimulate further research, we will document and discuss the dragon-king phenomenon in the statistics of financial losses, economic geography, hydrodynamic turbulence, mechanical ruptures, avalanches in complex heterogeneous media, earthquakes, and epileptic seizures. The special status of dragon-kings opens a new research program on their predictability, based on the fact that they belong to a different class of their own and express specific mechanisms amplifying the normal dynamics via positive feedbacks. We will present evidence of these claims for the prediction of material rupture, financial crashes and epileptic seizures. As a bonus, a few remarks will be offered at the end on how the dragon-king phenomenon allows us to understand the present World financial crisis as underpinned in two decades of successive financial and economic bubbles, inflating the mother of all bubbles with new monster dragon-kings at the horizon. The consequences in terms of a new "normal" are eye-opening. Ref: D. Sornette, Dragon-Kings, Black Swans and the Prediction of Crises, International Journal of Terraspace Science and Engineering 1(3), 1-17 (2009) (http://arXiv.org/abs/0907.4290) and (http://ssrn.com/abstract=1470006)

  18. Range of interaction in an opinion evolution model of ideological self-positioning: Contagion, hesitance and polarization

    NASA Astrophysics Data System (ADS)

    Gimenez, M. Cecilia; Paz García, Ana Pamela; Burgos Paci, Maxi A.; Reinaudi, Luis

    2016-04-01

    The evolution of public opinion using tools and concepts borrowed from Statistical Physics is an emerging area within the field of Sociophysics. In the present paper, a Statistical Physics model was developed to study the evolution of the ideological self-positioning of an ensemble of agents. The model consists of an array of L components, each one of which represents the ideology of an agent. The proposed mechanism is based on the "voter model", in which one agent can adopt the opinion of another one if the difference of their opinions lies within a certain range. The existence of "undecided" agents (i.e. agents with no definite opinion) was implemented in the model. The possibility of radicalization of an agent's opinion upon interaction with another one was also implemented. The results of our simulations are compared to statistical data taken from the Latinobarómetro databank for the cases of Argentina, Chile, Brazil and Uruguay in the last decade. Among other results, the effect of taking into account the undecided agents is the formation of a single peak at the middle of the ideological spectrum (which corresponds to a centrist ideological position), in agreement with the real cases studied.
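
    A rough bounded-confidence, voter-style sketch in the spirit of the model described (an opinion range parameter plus undecided agents); the radicalization mechanism is omitted and all parameter values are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    L_agents, steps = 500, 200_000
    eps = 1.0                      # interaction range on a 1-10 ideological axis
    p_undecided = 0.2              # fraction of agents starting without an opinion

    # Opinions on a 1-10 left-right scale; NaN marks an undecided agent.
    opinion = rng.uniform(1.0, 10.0, L_agents)
    opinion[rng.random(L_agents) < p_undecided] = np.nan

    for _ in range(steps):
        i, j = rng.integers(0, L_agents, size=2)
        if np.isnan(opinion[j]):
            continue                                  # nothing to copy from
        if np.isnan(opinion[i]):
            opinion[i] = opinion[j]                   # undecided agent is recruited
        elif abs(opinion[i] - opinion[j]) <= eps:
            opinion[i] = opinion[j]                   # voter-model-like adoption

    decided = opinion[~np.isnan(opinion)]
    hist, _ = np.histogram(decided, bins=np.arange(1, 11))
    print("final opinion histogram:", hist)
    ```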

  19. Plackett-Burman experimental design to facilitate syntactic foam development

    DOE PAGES

    Smith, Zachary D.; Keller, Jennie R.; Bello, Mollie; ...

    2015-09-14

    The use of an eight-experiment Plackett–Burman method can assess six experimental variables and eight responses in a polysiloxane-glass microsphere syntactic foam. The approach aims to decrease the time required to develop a tunable polymer composite by identifying a reduced set of variables and responses suitable for future predictive modeling. The statistical design assesses the main effects of mixing process parameters, polymer matrix composition, microsphere density and volume loading, and the blending of two grades of microspheres, using a dummy factor in statistical calculations. Responses cover rheological, physical, thermal, and mechanical properties. The cure accelerator content of the polymer matrix and the volume loading of the microspheres have the largest effects on foam properties. These factors are the most suitable for controlling the gel point of the curing foam, and the density of the cured foam. The mixing parameters introduce widespread variability and therefore should be fixed at effective levels during follow-up testing. Some responses may require greater contrast in microsphere-related factors. As a result, compared to other possible statistical approaches, the run economy of the Plackett–Burman method makes it a valuable tool for rapidly characterizing new foams.
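
    A minimal sketch of an eight-run Plackett–Burman main-effects analysis built from a Hadamard matrix, with one column kept as a dummy factor; the response values are placeholders, not the foam data:

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    # Eight-run two-level design: drop the all-ones column of a Hadamard matrix
    # of order 8; the remaining 7 columns code up to 7 factors (here 6 real
    # factors plus 1 dummy column used to gauge noise).
    H = hadamard(8)
    design = H[:, 1:]                      # 8 runs x 7 columns of +/-1

    # Hypothetical response for the 8 runs, e.g. cured-foam density (g/cm^3).
    y = np.array([0.62, 0.55, 0.58, 0.66, 0.60, 0.57, 0.64, 0.53])

    # Main effect of each column: mean response at +1 minus mean response at -1.
    effects = (design.T @ y) / 4.0
    for k, eff in enumerate(effects, start=1):
        label = "dummy" if k == 7 else f"factor {k}"
        print(f"{label:>8}: effect = {eff:+.3f}")
    ```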

  20. Some past and present challenges of econophysics

    NASA Astrophysics Data System (ADS)

    Mantegna, R. N.

    2016-12-01

    We discuss the cultural background that was shared by some of the first econophysicists when they started to work on economic and financial problems with methods and tools of statistical physics. In particular, we discuss the role of stylized facts and statistical physical laws in economics and statistical physics, respectively. As an example of the problems and potentials associated with the interaction of different communities of scholars dealing with problems observed in economic and financial systems, we briefly discuss the development and the perspectives of the use of tools and concepts of networks in econophysics, economics and finance.

  1. Cancer Statistics Animator

    Cancer.gov

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  2. Reports on Cancer - Cancer Statistics

    Cancer.gov

    Interactive tools for access to statistics for a cancer site by gender, race, ethnicity, calendar year, age, state, county, stage, and histology. Statistics include incidence, mortality, prevalence, cost, risk factors, behaviors, tobacco use, and policies and are presented as graphs, tables, or maps.

  3. Acoustic emissions diagnosis of rotor-stator rubs using the KS statistic

    NASA Astrophysics Data System (ADS)

    Hall, L. D.; Mba, D.

    2004-07-01

    Acoustic emission (AE) measurement at the bearings of rotating machinery has become a useful tool for diagnosing incipient fault conditions. In particular, AE can be used to detect unwanted intermittent or partial rubbing between a rotating central shaft and surrounding stationary components. This is a particular problem encountered in turbines used for power generation. For successful fault diagnosis, it is important to adopt AE signal analysis techniques capable of distinguishing between various types of rub mechanisms. It is also useful to develop techniques for inferring information such as the severity of rubbing or the type of seal material making contact on the shaft. It is proposed that modelling the cumulative distribution function of rub-induced AE signals with respect to appropriate theoretical distributions, and quantifying the goodness of fit with the Kolmogorov-Smirnov (KS) statistic, offers a suitable signal feature for diagnosis. This paper demonstrates the successful use of the KS feature for discriminating different classes of shaft-seal rubbing.
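
    As a hedged illustration of the KS-based signal feature (synthetic amplitudes and a gamma reference distribution chosen purely as placeholders):

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical burst amplitudes from two rub conditions (arbitrary AE units).
    ae_light_rub = stats.gamma.rvs(a=2.0, scale=1.0, size=400, random_state=8)
    ae_heavy_rub = stats.gamma.rvs(a=5.0, scale=1.5, size=400, random_state=9)

    # Fit a reference distribution to the light-rub data; the KS statistic D of
    # each sample against this reference serves as the diagnostic signal feature.
    a_hat, loc_hat, scale_hat = stats.gamma.fit(ae_light_rub, floc=0.0)

    for label, sample in (("light rub", ae_light_rub), ("heavy rub", ae_heavy_rub)):
        D, p = stats.kstest(sample, "gamma", args=(a_hat, loc_hat, scale_hat))
        print(f"{label}: KS D = {D:.3f} (p = {p:.3g})")
    ```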

  4. The shape of CMB temperature and polarization peaks on the sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcos-Caballero, A.; Fernández-Cobos, R.; Martínez-González, E.

    2016-04-01

    We present a theoretical study of CMB temperature peaks, including its effect over the polarization field, and allowing nonzero eccentricity. The formalism is developed in harmonic space and using the covariant derivative on the sphere, which guarantees that the expressions obtained are completely valid at large scales (i.e., no flat approximation). The expected patterns induced by the peak, either in temperature or polarization, are calculated, as well as their covariances. It is found that the eccentricity introduces a quadrupolar dependence in the peak shape, which is proportional to a complex bias parameter b_ε, characterizing the peak asymmetry and orientation. In addition, the one-point statistics of the variables defining the peak on the sphere is reviewed, finding some differences with respect to the flat case for large peaks. Finally, we present a mechanism to simulate constrained CMB maps with a particular peak on the field, which is an interesting tool for analysing the statistical properties of the peaks present in the data.

  5. The Impact of Fire on Active Layer Thickness

    NASA Astrophysics Data System (ADS)

    Schaefer, K. M.; Parsekian, A.; Natali, S.; Ludwig, S.; Michaelides, R. J.; Zebker, H. A.; Chen, J.

    2016-12-01

    Fire influences permafrost thermodynamics by darkening the surface to increase solar absorption and removing insulating moss and organic soil, resulting in an increase in Active Layer Thickness (ALT). The summer of 2015 was one of the worst fire years on record in Alaska with multiple fires in the Yukon-Kuskokwim (YK) Delta. To understand the impacts of fire on permafrost, we need large-scale, extensive measurements of ALT both within and outside the fire zones. In August 2016, we surveyed ALT across multiple fire zones in the YK Delta using Ground Penetrating Radar (GPR) and mechanical probing. GPR uses pulsed, radio-frequency electromagnetic waves to noninvasively image the subsurface and is an effective tool to quickly map ALT over large areas. We supplemented this ALT data with measurements of Volumetric Water Content (VWC), Organic Layer Thickness (OLT), and burn severity. We quantified the impacts of fire by statistically comparing the measurements inside and outside the fire zones and statistically regressing ALT against VWC, change in OLT, and burn severity.
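
    As a rough illustration of the two statistical steps mentioned above, the following sketch compares hypothetical ALT samples from burned and unburned sites with a Welch t-test and fits a simple linear regression of ALT on VWC, change in OLT, and burn severity; all values are synthetic stand-ins, not the survey data:

    ```python
    # Minimal sketch of the two analyses described above, on hypothetical data:
    # (1) compare ALT inside vs. outside a burn scar, (2) regress ALT on predictors.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    alt_burned = rng.normal(65, 8, size=120)     # hypothetical ALT (cm) inside fire zone
    alt_unburned = rng.normal(50, 8, size=120)   # hypothetical ALT (cm) outside fire zone

    t, p = stats.ttest_ind(alt_burned, alt_unburned, equal_var=False)
    print(f"Welch t-test: t = {t:.2f}, p = {p:.1e}")

    # Multiple linear regression: ALT ~ VWC + delta_OLT + burn severity (all hypothetical).
    n = 240
    vwc = rng.uniform(0.2, 0.8, n)
    d_olt = rng.uniform(-10, 0, n)
    severity = rng.integers(0, 4, n)
    alt = 40 + 20 * vwc - 1.5 * d_olt + 3 * severity + rng.normal(0, 5, n)

    X = np.column_stack([np.ones(n), vwc, d_olt, severity])
    coef, *_ = np.linalg.lstsq(X, alt, rcond=None)
    print("regression coefficients (intercept, VWC, dOLT, severity):", np.round(coef, 2))
    ```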

  6. Data Science in the Research Domain Criteria Era: Relevance of Machine Learning to the Study of Stress Pathology, Recovery, and Resilience

    PubMed Central

    Galatzer-Levy, Isaac R.; Ruggles, Kelly; Chen, Zhe

    2017-01-01

    Diverse environmental and biological systems interact to influence individual differences in response to environmental stress. Understanding the nature of these complex relationships can enhance the development of methods to: (1) identify risk, (2) classify individuals as healthy or ill, (3) understand mechanisms of change, and (4) develop effective treatments. The Research Domain Criteria (RDoC) initiative provides a theoretical framework to understand health and illness as the product of multiple inter-related systems but does not provide a framework to characterize or statistically evaluate such complex relationships. Characterizing and statistically evaluating models that integrate multiple levels (e.g. synapses, genes, environmental factors) as they relate to outcomes that are free from prior diagnostic benchmarks represents a challenge requiring new computational tools capable of capturing complex relationships and identifying clinically relevant populations. In the current review, we will summarize machine learning methods that can achieve these goals. PMID:29527592

  7. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    ERIC Educational Resources Information Center

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…

  8. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  9. Theory and computation of non-RRKM lifetime distributions and rates in chemical systems with three or more degrees of freedom

    NASA Astrophysics Data System (ADS)

    Gabern, Frederic; Koon, Wang S.; Marsden, Jerrold E.; Ross, Shane D.

    2005-11-01

    The computation, starting from basic principles, of chemical reaction rates in realistic systems (with three or more degrees of freedom) has been a longstanding goal of the chemistry community. Our current work, which merges tube dynamics with Monte Carlo methods, provides some key theoretical and computational tools for achieving this goal. We use basic tools of dynamical systems theory, merging the ideas of Koon et al. [W.S. Koon, M.W. Lo, J.E. Marsden, S.D. Ross, Heteroclinic connections between periodic orbits and resonance transitions in celestial mechanics, Chaos 10 (2000) 427-469.] and De Leon et al. [N. De Leon, M.A. Mehta, R.Q. Topper, Cylindrical manifolds in phase space as mediators of chemical reaction dynamics and kinetics. I. Theory, J. Chem. Phys. 94 (1991) 8310-8328.], particularly the use of invariant manifold tubes that mediate the reaction, into a tool for the computation of lifetime distributions and rates of chemical reactions and scattering phenomena, even in systems that exhibit non-statistical behavior. Previously, the main problem with the application of tube dynamics has been the computation of volumes in phase spaces of high dimension. The present work provides a starting point for overcoming this hurdle with some new ideas and implements them numerically. Specifically, an algorithm that uses tube dynamics to provide the initial bounding box for a Monte Carlo volume determination is used. The combination of a fine scale method for determining the phase space structure (invariant manifold theory) with statistical methods for volume computations (Monte Carlo) is the main contribution of this paper. The methodology is applied here to a three degree of freedom model problem and may be useful for higher degree of freedom systems as well.
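
    The Monte Carlo volume step can be illustrated with a toy stand-in: sample uniformly inside a bounding box and count the fraction of points that fall in the region of interest. The ellipsoidal region below is only a placeholder for the invariant-manifold tube geometry used in the paper:

    ```python
    # Minimal sketch of the Monte Carlo volume step: sample uniformly inside a
    # bounding box and count the fraction of points falling in the region of
    # interest. The ellipsoidal "tube cross-section" here is a stand-in; in the
    # paper the region is defined by invariant-manifold tubes in phase space.
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical bounding box in 3 coordinates, e.g. supplied by tube dynamics.
    lo = np.array([-2.0, -1.0, -1.5])
    hi = np.array([2.0, 1.0, 1.5])

    def inside_region(pts):
        """Stand-in membership test: an axis-aligned ellipsoid."""
        semi_axes = np.array([1.8, 0.9, 1.2])
        return np.sum((pts / semi_axes) ** 2, axis=1) <= 1.0

    n = 1_000_000
    pts = rng.uniform(lo, hi, size=(n, 3))
    frac = inside_region(pts).mean()
    box_volume = np.prod(hi - lo)
    estimate = frac * box_volume
    exact = 4.0 / 3.0 * np.pi * 1.8 * 0.9 * 1.2
    print(f"Monte Carlo volume ~ {estimate:.3f} (exact ellipsoid volume {exact:.3f})")
    ```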

  10. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
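
    The individual tests named above can be sketched on synthetic data as follows; this is not the authors' algorithm, only an illustration of an ROC-derived threshold (Youden index) followed by Fisher exact, Welch t, and Kolmogorov-Smirnov tests on a hypothetical dose-toxicity data set:

    ```python
    # Hedged sketch of the kind of test battery described above, on synthetic data:
    # find a dose threshold from an ROC curve (Youden index), then test the split
    # with a contingency table (Fisher exact), a Welch t-test and a KS test.
    # Variable names and the synthetic dose-response are illustrative assumptions.
    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(3)
    n = 400
    dose = rng.uniform(0, 60, n)                            # hypothetical dose metric (Gy)
    p_tox = 1 / (1 + np.exp(-(dose - 35) / 5))              # assumed dose-response
    toxicity = rng.uniform(size=n) < p_tox                  # binary outcome

    fpr, tpr, thresholds = roc_curve(toxicity, dose)
    threshold = thresholds[np.argmax(tpr - fpr)]            # Youden's J

    high = dose >= threshold
    table = [[np.sum(high & toxicity), np.sum(high & ~toxicity)],
             [np.sum(~high & toxicity), np.sum(~high & ~toxicity)]]

    _, p_fisher = stats.fisher_exact(table)
    _, p_welch = stats.ttest_ind(dose[toxicity], dose[~toxicity], equal_var=False)
    _, p_ks = stats.ks_2samp(dose[toxicity], dose[~toxicity])
    print(f"threshold ~ {threshold:.1f} Gy; Fisher p={p_fisher:.1e}, "
          f"Welch p={p_welch:.1e}, KS p={p_ks:.1e}")
    ```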

  11. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    PubMed

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.

  12. Challenges, alternatives, and paths to sustainability: better public health promotion using social networking pages as key tools.

    PubMed

    Zaidan, A A; Zaidan, B B; Kadhem, Z; Larbani, M; Lakulu, M B; Hashim, M

    2015-02-01

    This paper discusses the possibility of promoting public health and implementing educational health services using Facebook. We discuss the challenges and strengths of using such a platform as a tool for public health care systems from two different perspectives, namely, the view of IT developers and that of physicians. We present a new way of evaluating user interactivity in health care systems using tools provided by Facebook that measure Internet traffic statistics. Findings show that Facebook is a very promising tool for promoting e-health services in Web 2.0. Results from the traffic statistics show that a Facebook page is more efficient than other pages in promoting public health.

  13. Dynamic Monitoring Reveals Motor Task Characteristics in Prehistoric Technical Gestures

    PubMed Central

    Pfleging, Johannes; Stücheli, Marius; Iovita, Radu; Buchli, Jonas

    2015-01-01

    Reconstructing ancient technical gestures associated with simple tool actions is crucial for understanding the co-evolution of the human forelimb and its associated control-related cognitive functions on the one hand, and of the human technological arsenal on the other hand. Although the topic of gesture is an old one in Paleolithic archaeology and in anthropology in general, very few studies have taken advantage of the new technologies from the science of kinematics in order to improve replicative experimental protocols. Recent work in paleoanthropology has shown the potential of monitored replicative experiments to reconstruct tool-use-related motions through the study of fossil bones, but so far comparatively little has been done to examine the dynamics of the tool itself. In this paper, we demonstrate that we can statistically differentiate gestures used in a simple scraping task through dynamic monitoring. Dynamics combines kinematics (position, orientation, and speed) with contact mechanical parameters (force and torque). Taken together, these parameters are important because they play a role in the formation of a visible archaeological signature, use-wear. We present our new affordable, yet precise methodology for measuring the dynamics of a simple hide-scraping task, carried out using a pull-to (PT) and a push-away (PA) gesture. A strain gage force sensor combined with a visual tag tracking system records force, torque, as well as position and orientation of hafted flint stone tools. The set-up allows switching between two tool configurations, one with distal and the other one with perpendicular hafting of the scrapers, to allow for ethnographically plausible reconstructions. The data show statistically significant differences between the two gestures: scraping away from the body (PA) generates higher shearing forces, but requires greater hand torque. Moreover, most benchmarks associated with the PA gesture are more highly variable than in the PT gesture. These results demonstrate that different gestures used in ‘common’ prehistoric tasks can be distinguished quantitatively based on their dynamic parameters. Future research needs to assess our ability to reconstruct these parameters from observed use-wear patterns. PMID:26284785

  14. An Experimental Study of Cutting Performances of Worn Picks

    NASA Astrophysics Data System (ADS)

    Dogruoz, Cihan; Bolukbasi, Naci; Rostami, Jamal; Acar, Cemil

    2016-01-01

    The best means of assessing rock cuttability and the efficiency of the cutting process in mechanical excavation is specific energy (SE), measured in full-scale rock cutting tests. This is especially true for the application of roadheaders, often fitted with drag-type cutting tools. Radial picks or drag bits are changed during the operation as they reach a certain amount of wear and become blunt. In this study, full-scale cutting tests in different sedimentary rock types with bits having various degrees of wear were used to evaluate the influence of bit wear on cutting forces and specific energy. The relationship between the amount of wear, as represented by the size of the wear flats at the tip of the bit, and the cutting forces as well as the specific energy was examined. The influence of various rock properties such as mineral content, uniaxial compressive strength, tensile strength, indentation index, Shore hardness, Schmidt hammer hardness, and density on the SE required for cutting at different levels of tool wear was also studied. The preliminary analysis of the data shows that the mean cutting forces increase 2-3 times and SE by 4-5 times when cutting with a 4 mm wear flat as compared to cutting with new or sharp wedge-shaped bits. The grain size distribution of the muck for cutting different rock types at different levels of bit wear was analyzed and discussed. The best-fit prediction models for SE based on statistical analysis of laboratory test results are introduced. The models can be used for estimating the performance of mechanical excavators using radial tools, especially roadheaders, continuous miners and longwall drum shearers.

  15. Entrainment in the master equation.

    PubMed

    Margaliot, Michael; Grüne, Lars; Kriecherbauer, Thomas

    2018-04-01

    The master equation plays an important role in many scientific fields including physics, chemistry, systems biology, physical finance and sociodynamics. We consider the master equation with periodic transition rates. This may represent an external periodic excitation like the 24 h solar day in biological systems or periodic traffic lights in a model of vehicular traffic. Using tools from systems and control theory, we prove that under mild technical conditions every solution of the master equation converges to a periodic solution with the same period as the rates. In other words, the master equation entrains (or phase locks) to periodic excitations. We describe two applications of our theoretical results to important models from statistical mechanics and epidemiology.
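
    A minimal numerical illustration of entrainment, assuming an arbitrary two-state master equation with sinusoidally varying rates (not a model from the paper): two different initial conditions converge to the same periodic solution with the period of the rates.

    ```python
    # Minimal sketch of entrainment: a two-state master equation with T-periodic
    # transition rates, integrated from two different initial conditions. Both
    # trajectories converge to the same periodic solution (the rates and period
    # here are arbitrary illustrative choices).
    import numpy as np
    from scipy.integrate import solve_ivp

    T = 1.0  # period of the excitation

    def rates(t):
        k12 = 2.0 + 1.5 * np.sin(2 * np.pi * t / T)   # state 1 -> 2
        k21 = 1.0 + 0.5 * np.cos(2 * np.pi * t / T)   # state 2 -> 1
        return k12, k21

    def master(t, p):
        k12, k21 = rates(t)
        dp1 = -k12 * p[0] + k21 * p[1]
        return [dp1, -dp1]                            # probabilities sum to 1

    t_eval = np.linspace(0, 20 * T, 2001)
    sol_a = solve_ivp(master, (0, 20 * T), [1.0, 0.0], t_eval=t_eval, rtol=1e-8)
    sol_b = solve_ivp(master, (0, 20 * T), [0.0, 1.0], t_eval=t_eval, rtol=1e-8)

    # After many periods the two solutions agree and repeat with period T.
    print("max difference over last period:",
          np.max(np.abs(sol_a.y[0][-100:] - sol_b.y[0][-100:])))
    ```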

  16. Optical stretching as a tool to investigate the mechanical properties of lipid bilayers.

    PubMed

    Solmaz, Mehmet E; Sankhagowit, Shalene; Biswas, Roshni; Mejia, Camilo A; Povinelli, Michelle L; Malmstadt, Noah

    2013-10-07

    Measurements of lipid bilayer bending modulus by various techniques produce widely divergent results. We attempt to resolve some of this ambiguity by measuring bending modulus in a system that can rapidly process large numbers of samples, yielding population statistics. This system is based on optical stretching of giant unilamellar vesicles (GUVs) in a microfluidic dual-beam optical trap (DBOT). The microfluidic DBOT system is used here to measure three populations of GUVs with distinct lipid compositions. We find that gel-phase membranes are significantly stiffer than liquid-phase membranes, consistent with previous reports. We also find that the addition of cholesterol does not alter the bending modulus of membranes composed of a monounsaturated phospholipid.

  17. Stochastic growth of cloud droplets by collisions during settling

    NASA Astrophysics Data System (ADS)

    Madival, Deepak G.

    2018-04-01

    In the last stage of droplet growth in clouds which leads to drizzle formation, larger droplets begin to settle under gravity and collide and coalesce with smaller droplets in their path. In this article, we shall deal with the simplified problem of a large drop settling amidst a population of identical smaller droplets. We present an expression for the probability that a given large drop suffers a given number of collisions, for a general statistically homogeneous distribution of droplets. We hope that our approach will serve as a valuable tool in dealing with droplet distribution in real clouds, which has been found to deviate from the idealized Poisson distribution due to mechanisms such as inertial clustering.
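
    Under the idealized Poisson picture mentioned above, the collision-count probabilities follow directly from the mean number of droplets in the swept volume; the sketch below uses entirely hypothetical microphysical numbers:

    ```python
    # Illustrative sketch under the idealized Poisson assumption mentioned above:
    # the number of collisions is Poisson with mean = droplet concentration x
    # volume swept by the falling drop x collision efficiency. All numbers are
    # hypothetical.
    import numpy as np
    from scipy.stats import poisson

    n_droplets = 100e6          # small-droplet concentration per m^3 (hypothetical)
    collector_radius = 50e-6    # m
    droplet_radius = 10e-6      # m
    fall_distance = 10.0        # m
    collision_efficiency = 0.5  # hypothetical

    swept_volume = np.pi * (collector_radius + droplet_radius) ** 2 * fall_distance
    mean_collisions = collision_efficiency * n_droplets * swept_volume

    for k in range(5):
        print(f"P({k} collisions) = {poisson.pmf(k, mean_collisions):.3f}")
    ```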

  18. Entrainment in the master equation

    PubMed Central

    Grüne, Lars; Kriecherbauer, Thomas

    2018-01-01

    The master equation plays an important role in many scientific fields including physics, chemistry, systems biology, physical finance and sociodynamics. We consider the master equation with periodic transition rates. This may represent an external periodic excitation like the 24 h solar day in biological systems or periodic traffic lights in a model of vehicular traffic. Using tools from systems and control theory, we prove that under mild technical conditions every solution of the master equation converges to a periodic solution with the same period as the rates. In other words, the master equation entrains (or phase locks) to periodic excitations. We describe two applications of our theoretical results to important models from statistical mechanics and epidemiology. PMID:29765669

  19. Information retrieval for a document writing assistance program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corral, M.L.; Simon, A.; Julien, C.

    This paper presents an Information Retrieval mechanism to facilitate the writing of technical documents in the space domain. To address the need for document exchange between partners in a given project, documents are standardized. The writing of a new document requires the re-use of existing documents or parts thereof. These parts can be identified by "tagging" the logical structure of documents and restored by means of a purpose-built Information Retrieval System (I.R.S.). The I.R.S. implemented in our writing assistance tool uses natural language queries and is based on a statistical linguistic approach which is enhanced by the use of a document structure module.

  20. LETTER TO THE EDITOR: Exhaustive search for low-autocorrelation binary sequences

    NASA Astrophysics Data System (ADS)

    Mertens, S.

    1996-09-01

    Binary sequences with low autocorrelations are important in communication engineering and in statistical mechanics as ground states of the Bernasconi model. Computer searches are the main tool in the construction of such sequences. Owing to the exponential size (2^N) of the configuration space, exhaustive searches are limited to short sequences. We discuss an exhaustive search algorithm with a favourable run-time characteristic and apply it to compile a table of exact ground states of the Bernasconi model up to N = 48. The data suggest F > 9 for the optimal merit factor in the limit N → ∞.
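
    The quantities involved can be illustrated with a brute-force enumeration for small N (the letter's algorithm is far more efficient than this); the aperiodic autocorrelations C_k, the Bernasconi energy E = sum_k C_k^2, and the merit factor F = N^2/(2E) are standard definitions:

    ```python
    # Minimal sketch: aperiodic autocorrelations C_k, the Bernasconi "energy"
    # E = sum_k C_k^2, and the merit factor F = N^2 / (2E), with a brute-force
    # search over all 2^N binary sequences for a small N.
    import itertools
    import numpy as np

    def energy_and_merit(seq):
        s = np.array(seq)
        n = len(s)
        corr = [np.dot(s[:-k], s[k:]) for k in range(1, n)]   # aperiodic C_k
        energy = sum(c * c for c in corr)
        return energy, n * n / (2.0 * energy)

    n = 13  # small enough for full enumeration; the letter reaches N = 48
    best = max((energy_and_merit(seq) + (seq,)
                for seq in itertools.product((-1, 1), repeat=n)),
               key=lambda t: t[1])
    print(f"N={n}: best merit factor F = {best[1]:.3f}")
    ```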

  1. nRC: non-coding RNA Classifier based on structural features.

    PubMed

    Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso

    2017-01-01

    Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation of many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to develop methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms in regulative processes, together with the development of high-throughput technologies, has made bioinformatics tools necessary for providing biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on features extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach for the classification of 13 different ncRNA classes. We obtained classification scores using the most common statistical measures; in particular, we reach accuracy and sensitivity scores of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a Docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.

  2. A new methodology for earthquake damage assessment (MEDEA) and its application following the Molise Italy earthquake of 31.10.02

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Papa, F.; Spence, R.

    2003-04-01

    MEDEA is a multi-media tool designed to support earthquake damage assessment teams in Italy, by providing a means to train the technicians involved. In MEDEA, a range of alternative mechanisms of damage are defined and described, and the symptoms of each mechanism which can be recognised by the assessor are identified and linked to the related causative mechanisms. By using MEDEA, the assessor is guided by the experience of experts in the identification of the damage states and also of the separate mechanisms involved. This leads to a better safety assessment, a more homogeneous evaluation of damage across the affected area, and a great enhancement in the value of the damage statistics obtained in the assessment. The method is applied to both masonry and reinforced concrete buildings of the forms widespread in Italy and neighbouring countries. The paper will describe MEDEA and the context for which it was designed; and will present an example of its use in the M5.1 Molise earthquake of 31.10.02 in which 27 people died and which caused damage to hundreds of buildings.

  3. The search for mechanisms of change in behavioral treatments for alcohol use disorders: a commentary.

    PubMed

    Longabaugh, Richard

    2007-10-01

    Definitive results from efforts to identify mechanisms of change in behavioral treatments for alcohol use disorders have been elusive. The working hypothesis guiding this paper is that one of the reasons for this elusiveness is that the models we hypothesize to account for treatment effectiveness are unnecessarily restricted and too simple. This paper aims to accomplish 3 things. First, a typology for locating potential mediators of change will be presented. In the course of doing so, a nomenclature will be proposed with the hope that this will facilitate communications among alcohol treatment researchers studying mechanisms of change. Second, alternatives to the classic test of mediation of alcohol treatment effects will be considered and one such alternative described. Third, alternative ways of conceptualizing, constructing and analyzing variables to measure mediators will be suggested. It is hoped that this commentary will facilitate research on mechanisms of change in behavioral treatments for alcohol use disorders. Behavioral change is a complex process, and the models that we develop to account for this process need to reflect this complexity. Advances in statistical approaches for testing mediation, along with a better understanding as to how to use these tools, should help in moving toward this goal.

  4. [Statistics for statistics?--Thoughts about psychological tools].

    PubMed

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods take a prominent place in psychologists' educational programs. Known as difficult to understand and hard to learn, these contents are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because it does not seem applicable to the work with patients and other target groups at first glance, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only as a way of commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. We therefore analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presupposes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to expose one's own professional work to arbitrariness.

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
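
    The frequency-ratio step that such a tool automates can be sketched as follows on a hypothetical raster of factor classes and hazard occurrences (this illustrates the general technique, not the BSM code):

    ```python
    # Hedged sketch of the frequency-ratio calculation: for each class of a
    # conditioning factor, FR = (share of hazard pixels in the class) /
    # (share of all pixels in the class). Data below are hypothetical.
    import numpy as np

    rng = np.random.default_rng(4)
    n_pixels = 100_000
    slope_class = rng.integers(0, 5, n_pixels)            # factor classified into 5 bins
    landslide = rng.uniform(size=n_pixels) < 0.01 * (1 + slope_class)  # toy inventory

    fr = {}
    for c in range(5):
        in_class = slope_class == c
        share_events = landslide[in_class].sum() / landslide.sum()
        share_area = in_class.mean()
        fr[c] = share_events / share_area
    print("frequency ratios by slope class:", {k: round(v, 2) for k, v in fr.items()})
    ```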

  6. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  7. Socio-economic inequity in demand for insecticide-treated nets, in-door residual house spraying, larviciding and fogging in Sudan

    PubMed Central

    Onwujekwe, Obinna; Malik, El-Fatih Mohamed; Mustafa, Sara Hassan; Mnzava, Abraham

    2005-01-01

    Background In order to optimally prioritize and use public and private budgets for equitable malaria vector control, there is a need to determine the level and determinants of consumer demand for different vector control tools. Objectives To determine the demand from people of different socio-economic groups for indoor residual house-spraying (IRHS), insecticide-treated nets (ITNs), larviciding with chemicals (LWC), and space spraying/fogging (SS) and the disease control implications of the results. Methods Ratings and levels of willingness-to-pay (WTP) for the vector control tools were determined using a random cross-sectional sample of 720 households drawn from two states. WTP was elicited using the bidding game. An asset-based socio-economic status (SES) index was used to explore whether WTP was related to the SES of the respondents. Results IRHS received the highest proportion of the highest preferred rating (41.0%) followed by ITNs (23.1%). However, ITNs had the highest mean WTP followed by IRHS, while LWC had the least. The regression analysis showed that SES was positively and statistically significantly related to WTP across the four vector control tools and that the respondents' rating of IRHS and ITNs significantly explained their levels of WTP for the two tools. Conclusion People were willing to pay for all the vector-control tools, but the demand for the vector control tools was related to the SES of the respondents. Hence, it is vital that there are public policies and financing mechanisms to ensure equitable provision and utilisation of vector control tools, as well as protecting the poor from cost-sharing arrangements. PMID:16356177

  8. Statistics as Tools in Library Planning: On the State and Institutional Level.

    ERIC Educational Resources Information Center

    Trezza, Alphonse F.

    The principal uses of statistics in library planning may be illustrated by examples from the state of Illinois. State law specifies that the Illinois State Library compile and publish statistics on libraries. State agencies also play an important and expanding role in this effort. The state library now compiles statistics on all types of…

  9. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    ERIC Educational Resources Information Center

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  10. If at first you don't succeed… Studies of ontogeny shed light on the cognitive demands of habitual tool use

    PubMed Central

    Meulman, E. J. M.; Seed, A. M.; Mann, J.

    2013-01-01

    Many species use tools, but the mechanisms underpinning the behaviour differ between species and even among individuals within species, depending on the variants performed. When considering tool use ‘as adaptation’, an important first step is to understand the contribution made by fixed phenotypes as compared to flexible mechanisms, for instance learning. Social learning of tool use is sometimes inferred based on variation between populations of the same species but this approach is questionable. Specifically, alternative explanations cannot be ruled out because population differences are also driven by genetic and/or environmental factors. To better understand the mechanisms underlying routine but non-universal (i.e. habitual) tool use, we suggest focusing on the ontogeny of tool use and individual variation within populations. For example, if tool-using competence emerges late during ontogeny and improves with practice or varies with exposure to social cues, then a role for learning can be inferred. Experimental studies help identify the cognitive and developmental mechanisms used when tools are used to solve problems. The mechanisms underlying the route to tool-use acquisition have important consequences for our understanding of the accumulation in technological skill complexity over the life course of an individual, across generations and over evolutionary time. PMID:24101632

  11. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  12. Distance Learning for Teacher Professional Development in Statistics Education

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria; Mavrotheris, Efstathios; Paparistodemou, Efi

    2011-01-01

    We provide an overview of "EarlyStatistics," an online professional development course in statistics education targeting European elementary and middle school teachers. The course facilitates intercultural collaboration of teachers using contemporary technological and educational tools. An online information base offers access to all of…

  13. Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Willse, Alan R.

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize analysis of sets of microarray images. This tool provides several methods of identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.

  14. Application of spatial technology in malaria research & control: some new insights.

    PubMed

    Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P

    2009-08-01

    Geographical Information System (GIS) has emerged as the core of spatial technology, which integrates a wide range of datasets available from different sources, including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade (1998-2007) has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules like spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation model (DEM), buffer zone and geo-statistical analysis have been investigated in detail, illustrated with examples based on the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis currently is the most consistent and established set of tools to analyze spatial datasets. The desired future development of GIS is in line with the utilization of geo-statistical tools, which, combined with high quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.

  15. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
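
    A bare-bones version of the underlying statistic, assuming a unit-cube observation window and no edge correction (RipleyGUI's implementation is more complete), looks like this:

    ```python
    # Minimal sketch of an (edge-correction-free) Ripley's K estimate for a 3D
    # point pattern, compared with the complete-spatial-randomness expectation
    # K_csr(r) = 4/3 * pi * r^3. Real analyses (as in RipleyGUI) add edge correction.
    import numpy as np
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(5)
    n, volume = 500, 1.0
    points = rng.uniform(0.0, 1.0, size=(n, 3))     # hypothetical cell positions

    d = pdist(points)                               # all pairwise distances
    radii = np.linspace(0.02, 0.2, 10)
    for r in radii:
        k_hat = volume / (n * (n - 1)) * 2 * np.sum(d <= r)   # factor 2: ordered pairs
        k_csr = 4.0 / 3.0 * np.pi * r ** 3
        print(f"r={r:.2f}  K_hat={k_hat:.5f}  K_csr={k_csr:.5f}")
    ```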

  16. Cross-population validation of statistical distance as a measure of physiological dysregulation during aging.

    PubMed

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi

    2014-09-01

    Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
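
    One common way to compute such a statistical distance is the Mahalanobis distance of an individual's biomarker profile from a reference population; the sketch below uses simulated biomarkers and is only an illustration of the general idea, not the authors' pipeline:

    ```python
    # Hedged sketch: a Mahalanobis-type statistical distance of an individual's
    # biomarker profile from the population centroid (one common way to quantify
    # "how strange" a profile is). The 14 biomarkers here are simulated stand-ins.
    import numpy as np

    rng = np.random.default_rng(6)
    population = rng.normal(size=(2000, 14))          # hypothetical reference biomarkers
    mean = population.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(population, rowvar=False))

    def statistical_distance(profile):
        diff = profile - mean
        return float(np.sqrt(diff @ cov_inv @ diff))

    typical = rng.normal(size=14)
    dysregulated = rng.normal(loc=2.0, size=14)       # shifted profile
    print("typical profile distance:     ", round(statistical_distance(typical), 2))
    print("dysregulated profile distance:", round(statistical_distance(dysregulated), 2))
    ```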

  17. Microstructure, Mechanical and Corrosion Properties of Friction Stir-Processed AISI D2 Tool Steel

    NASA Astrophysics Data System (ADS)

    Yasavol, Noushin; Jafari, Hassan

    2015-05-01

    In this study, AISI D2 tool steel underwent friction stir processing (FSP). The microstructure, mechanical properties, and corrosion resistance of the FSPed materials were then evaluated. A flat WC-Co tool was used; the rotation rate of the tool varied from 400 to 800 rpm, and the travel speed was maintained constant at 385 mm/s during the process. FSP improved mechanical properties and produced ultrafine-grained surface layers in the tool steel. The improvement in mechanical properties is attributed to the homogeneous distribution of two types of fine (0.2-0.3 μm) and coarse (1.6 μm) carbides in a duplex ferrite-martensite matrix. In addition to the refinement of the carbides, the homogeneous dispersion of the particles was found to be more effective in enhancing mechanical properties at a tool rotation rate of 500 rpm. Improved corrosion resistance was also observed and is attributed to the volume fraction of low-angle grain boundaries produced by friction stir processing of the AISI D2 steel.

  18. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL provides a variety of software tools on the Peregrine system, including debuggers, performance-analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  19. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    NASA Astrophysics Data System (ADS)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems, in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship, give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys by simulation studies to replace time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior in dependence of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. Exemplarily, these steps are here conducted for a high-manganese steel.

  20. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    PubMed

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5'ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
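
    The core idea, enrichment ratios Box-Cox transformed toward normality and then thresholded, can be sketched on simulated counts as follows (pseudocounts, cutoffs, and count models are assumptions of this illustration, not the ToNER implementation):

    ```python
    # Hedged sketch: per-nucleotide enrichment ratios between an enriched and an
    # unenriched library, Box-Cox transformed toward normality, then scored as
    # z-values to flag significantly enriched positions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n_pos = 10_000
    unenriched = rng.poisson(20, n_pos) + 1                   # +1 pseudocount
    enriched = rng.poisson(20, n_pos) + 1
    true_starts = rng.choice(n_pos, 50, replace=False)
    enriched[true_starts] += rng.poisson(200, 50)             # spiked 5' ends

    ratio = enriched / unenriched
    transformed, lam = stats.boxcox(ratio)                    # Box-Cox, fitted lambda
    z = (transformed - transformed.mean()) / transformed.std(ddof=1)
    candidates = np.flatnonzero(z > stats.norm.ppf(1 - 1e-4)) # one-sided cutoff
    print(f"lambda = {lam:.2f}; {len(candidates)} candidate positions, "
          f"{np.isin(candidates, true_starts).sum()} of 50 spiked sites recovered")
    ```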

  1. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  2. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  3. RADSS: an integration of GIS, spatial statistics, and network service for regional data mining

    NASA Astrophysics Data System (ADS)

    Hu, Haitang; Bao, Shuming; Lin, Hui; Zhu, Qing

    2005-10-01

    Regional data mining, which aims at the discovery of knowledge about spatial patterns, clusters or associations between regions, has wide applications nowadays in social science, such as sociology, economics, epidemiology, crime, and so on. Many applications in the regional or other social sciences are more concerned with the spatial relationship, rather than the precise geographical location. Based on the spatial continuity rule derived from Tobler's first law of geography (observations at two sites tend to be more similar to each other if the sites are close together than if far apart), spatial statistics, as an important means of spatial data mining, allow users to extract interesting and useful information, such as spatial pattern, spatial structure, spatial association, spatial outliers and spatial interaction, from vast amounts of spatial and non-spatial data. Therefore, by integrating spatial statistical methods, geographical information systems become more powerful in gaining further insight into the nature of the spatial structure of regional systems and help researchers select appropriate models more carefully. However, the lack of such tools holds back the application of spatial data analysis techniques and the development of new methods and models (e.g., spatio-temporal models). Herein, we make an attempt to develop such an integrated software package and apply it to the complex system analysis of the Poyang Lake Basin. This paper presents a framework for integrating GIS, spatial statistics and network services in regional data mining, as well as their implementation. After discussing the spatial statistics methods involved in regional complex system analysis, we introduce RADSS (Regional Analysis and Decision Support System), our new regional data mining tool that integrates GIS, spatial statistics and network services. RADSS includes the functions of spatial data visualization, exploratory spatial data analysis, and spatial statistics. The tool also includes some fundamental spatial and non-spatial databases on regional population and environment, which can be updated from external databases via CD or network. Utilizing this data mining and exploratory analytical tool, users can easily and quickly analyse the huge amount of interrelated regional data and better understand the spatial patterns and trends of regional development, so as to make credible and scientific decisions. Moreover, it can be used as an educational tool for spatial data analysis and environmental studies. In this paper, we also present a case study on the Poyang Lake Basin as an application of the tool and spatial data mining in complex environmental studies. Finally, several concluding remarks are discussed.

  4. Precision tool holder with flexure-adjustable, three degrees of freedom for a four-axis lathe

    DOEpatents

    Bono, Matthew J [Pleasanton, CA; Hibbard, Robin L [Livermore, CA

    2008-03-04

    A precision tool holder for precisely positioning a single point cutting tool on a 4-axis lathe, such that the center of the radius of the tool nose is aligned with the B-axis of the machine tool, so as to facilitate the machining of precision meso-scale components with complex three-dimensional shapes with sub-μm accuracy on a four-axis lathe. The device is designed to fit on a commercial diamond turning machine and can adjust the cutting tool position in three orthogonal directions with sub-micrometer resolution. In particular, the tool holder adjusts the tool position using three flexure-based mechanisms, with two flexure mechanisms adjusting the lateral position of the tool to align the tool with the B-axis, and a third flexure mechanism adjusting the height of the tool. Preferably, the flexures are driven by manual micrometer adjusters. In this manner, this tool holder simplifies the process of setting a tool with sub-μm accuracy, to substantially reduce the time required to set the tool.

  5. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web-service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get data most recently available. Data submitted by the user are stored in the database, where it can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch search mode. ii) GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest. These user defined GO annotations can be used in further analysis or exported for public distribution. iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users, to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with a rapid extraction of highly relevant gene annotation data for e.g. thousands of genes or clones at once. It allows a user to define and archive new GO annotations and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no

  6. Validation of the World Health Organization tool for situational analysis to assess emergency and essential surgical care at district hospitals in Ghana.

    PubMed

    Osen, Hayley; Chang, David; Choo, Shelly; Perry, Henry; Hesse, Afua; Abantanga, Francis; McCord, Colin; Chrouser, Kristin; Abdullah, Fizan

    2011-03-01

    The World Health Organization (WHO) Tool for Situational Analysis to Assess Emergency and Essential Surgical Care (hereafter called the WHO Tool) has been used in more than 25 countries and is the largest effort to assess surgical care in the world. However, it has not yet been independently validated. Test-retest reliability is one way to validate the degree to which test instruments are free from random error. The aim of the present field study was to determine the test-retest reliability of the WHO Tool. The WHO Tool was mailed to 10 district hospitals in Ghana. Written instructions were provided along with a letter from the Ghana Health Services requesting the hospital administrator to complete the survey tool. After ensuring delivery and completion of the forms, the study team readministered the WHO Tool at the time of an on-site visit less than 1 month later. The results of the two tests were compared to calculate kappa statistics for each of the 152 questions in the WHO Tool. The kappa statistic is a statistical measure of the degree of agreement above what would be expected based on chance alone. Ten hospitals were surveyed twice over a short interval (i.e., less than 1 month). Weighted and unweighted kappa statistics were calculated for 152 questions. The median unweighted kappa for the entire survey was 0.43 (interquartile range 0-0.84). The infrastructure section (24 questions) had a median kappa of 0.81; the human resources section (13 questions) had a median kappa of 0.77; the surgical procedures section (67 questions) had a median kappa of 0.00; and the emergency surgical equipment section (48 questions) had a median kappa of 0.81. Hospital capacity survey questions related to infrastructure characteristics had high reliability. However, questions related to process of care had poor reliability and may benefit from supplemental data gathered by direct observation. Limitations to the study include the small sample size: 10 district hospitals in a single country. Consistent and high correlations calculated from the field testing within the present analysis suggest that the WHO Tool for Situational Analysis is a reliable tool where it measures structure and setting, but it should be revised for measuring process of care.
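
    For reference, the kappa statistic for a single yes/no item administered twice can be computed as below; the answer vectors are hypothetical:

    ```python
    # Minimal sketch of the kappa statistic used above: agreement between two
    # administrations of the same yes/no survey item, corrected for chance
    # agreement. The answer vectors are hypothetical.
    import numpy as np

    def cohen_kappa(a, b):
        a, b = np.asarray(a), np.asarray(b)
        categories = np.union1d(a, b)
        observed = np.mean(a == b)
        # Chance agreement: product of marginal proportions, summed over categories.
        expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
        return (observed - expected) / (1.0 - expected)

    first_visit = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # e.g. one item answered per hospital
    second_visit = [1, 1, 0, 1, 1, 1, 0, 0, 0, 1]
    print(f"kappa = {cohen_kappa(first_visit, second_visit):.2f}")
    ```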

  7. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes with similar weld quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800

  8. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    PubMed

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes with similar weld quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.

  9. Real time software tools and methodologies

    NASA Technical Reports Server (NTRS)

    Christofferson, M. J.

    1981-01-01

    Real-time systems are characterized by high-speed processing and throughput as well as asynchronous event processing requirements. These requirements give rise to particular implementations of parallel or pipeline multitasking structures, of intertask or interprocess communication mechanisms, and finally of message (buffer) routing or switching mechanisms. These mechanisms or structures, along with the data structure, describe the essential character of the system. These common structural elements and mechanisms are identified, and their implementation in the form of routines, tasks or macros - in other words, tools - is formalized. The tools developed support or make available the following: reentrant task creation, generalized message routing techniques, generalized task structures/task families, standardized intertask communication mechanisms, and pipeline and parallel processing architectures in a multitasking environment. Tools development raises some interesting prospects in the areas of software instrumentation and software portability. These issues are discussed following the description of the tools themselves.
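
    A loose, modern sketch (Python threads and queues, not the original macro toolkit) of the pipeline multitasking pattern with standardized intertask message queues described above:

    ```python
    import queue
    import threading

    # Two pipeline stages run as tasks; messages flow through standardized queues.
    def stage(name, inbox, outbox):
        while True:
            msg = inbox.get()
            if msg is None:                     # shutdown sentinel propagates downstream
                if outbox is not None:
                    outbox.put(None)
                break
            result = f"{msg}->{name}"           # stand-in for real processing
            if outbox is not None:
                outbox.put(result)
            else:
                print(result)

    q1, q2 = queue.Queue(), queue.Queue()
    t1 = threading.Thread(target=stage, args=("filter", q1, q2))
    t2 = threading.Thread(target=stage, args=("format", q2, None))
    t1.start(); t2.start()

    for event in ["evt1", "evt2", "evt3"]:      # asynchronous event injection
        q1.put(event)
    q1.put(None)
    t1.join(); t2.join()
    ```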

  10. Statistics? You Must Be Joking: The Application and Evaluation of Humor when Teaching Statistics

    ERIC Educational Resources Information Center

    Neumann, David L.; Hood, Michelle; Neumann, Michelle M.

    2009-01-01

    Humor has been promoted as a teaching tool that enhances student engagement and learning. The present report traces the pathway from research to practice by reflecting upon various ways to incorporate humor into the face-to-face teaching of statistics. The use of humor in an introductory university statistics course was evaluated via interviews…

  11. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  12. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
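
    A minimal sketch of the indirect-comparison step (Bucher adjustment) that underlies network meta-analysis, with hypothetical log odds ratios; this illustrates the statistical idea only and is not the API of gemtc, pcnetmeta or netmeta:

    ```python
    import math

    # Hypothetical direct estimates (log odds ratios and standard errors).
    d_AB, se_AB = -0.30, 0.12   # A vs B
    d_BC, se_BC = -0.20, 0.15   # B vs C

    # Indirect estimate of A vs C and its standard error.
    d_AC = d_AB + d_BC
    se_AC = math.sqrt(se_AB**2 + se_BC**2)

    ci_low, ci_high = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
    print(f"A vs C log-OR: {d_AC:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
    ```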

  13. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the large-scale flow that results from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures such as ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton and colleagues) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, such as the Kuroshio Extension and the Gulf Stream, and statistical equilibria, and we explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv ...., submitted to Physics Reports; P. Berloff, A. M. Hogg and W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) 15606. [Figure caption: (a) streamfunction predicted by statistical mechanics; (b), (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff and colleagues). Even in an out-of-equilibrium situation such as this one, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well; the westward drift of ocean eddies, and the slower northward drift of cyclones and southward drift of anticyclones observed by Chelton and colleagues, are explained from statistical mechanics.]

  14. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
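
    A minimal sketch of the vector auto-regression step TSPA builds on, using the statsmodels VAR class on synthetic session-to-session ratings for two hypothetical process variables:

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    # Synthetic session-to-session ratings for one (hypothetical) patient.
    rng = np.random.default_rng(0)
    n_sessions = 40
    alliance = np.zeros(n_sessions)
    self_efficacy = np.zeros(n_sessions)
    for t in range(1, n_sessions):
        # simple cross-lagged data-generating process, for illustration only
        alliance[t] = 0.5 * alliance[t - 1] + 0.3 * self_efficacy[t - 1] + rng.normal()
        self_efficacy[t] = 0.4 * self_efficacy[t - 1] + 0.2 * alliance[t - 1] + rng.normal()

    data = pd.DataFrame({"alliance": alliance, "self_efficacy": self_efficacy})
    result = VAR(data).fit(maxlags=1)   # lag-1 model: session t regressed on session t-1
    print(result.params)                # cross-lagged coefficients = temporal associations
    ```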

  15. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), the International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority (Singapore) Guidance for Product Quality Review (2008), and ISO 9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of the normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification production. Finally, the study points to areas where quality improvement and quality risk assessment principles can be applied to achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normal probability distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is the prime candidate for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
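
    A minimal sketch, with hypothetical data and specification limits, of the capability indices this kind of study reports; the sigma-level conversion shown is one common short-term convention, not necessarily the authors' exact definition:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    weights = rng.normal(loc=250.0, scale=2.5, size=200)   # tablet weights, mg (hypothetical)
    lsl, usl = 242.5, 257.5                                 # specification limits, mg

    mu, sigma = weights.mean(), weights.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                          # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)             # capability accounting for centering
    sigma_level = 3 * cpk                                   # common short-term convention
    print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  approx. sigma level={sigma_level:.1f}")
    ```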

  16. IFLA General Conference, 1986. Management and Technology Division. Section: Statistics. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations and Institutions, The Hague (Netherlands).

    Papers on statistics which were presented at the 1986 International Federation of Library Associations (IFLA) conference include: (1) "Library Data Collection in Brazil" (Nice Menezes de Figueiredo, Brazil); (2) "Fact-Finding on Statistics and Reference Tools in Japan" (Yuriko Sugimoto, Chihomi Oka, Ikuko Mayumi, and Keiko Kurata,…

  17. Children's Services Statistical Neighbour Benchmarking Tool. Practitioner User Guide

    ERIC Educational Resources Information Center

    National Foundation for Educational Research, 2007

    2007-01-01

    Statistical neighbour models provide one method for benchmarking progress. For each local authority (LA), these models designate a number of other LAs deemed to have similar characteristics. These designated LAs are known as statistical neighbours. Any LA may compare its performance (as measured by various indicators) against its statistical…

  18. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  19. Mechanism-Based FE Simulation of Tool Wear in Diamond Drilling of SiCp/Al Composites.

    PubMed

    Xiang, Junfeng; Pang, Siqin; Xie, Lijing; Gao, Feinong; Hu, Xin; Yi, Jie; Hu, Fang

    2018-02-07

    The aim of this work is to analyze the micro-mechanisms underlying the wear of macroscale tools during diamond machining of SiCp/Al6063 composites and to develop a mechanism-based diamond wear model in relation to the dominant wear behaviors. During drilling of high-volume-fraction SiCp/Al6063 composites containing Cu, the dominant wear mechanisms of the diamond tool involve thermodynamically activated physicochemical wear, due to diamond-graphite transformation catalyzed by Cu in an air atmosphere, and mechanically driven abrasive wear, due to high-frequency scraping of the hard SiC reinforcement on the tool surface. An analytical diamond wear model coupling the Usui abrasive wear model and an Arrhenius-type extended graphitization wear model was proposed and implemented through a user-defined subroutine for tool wear estimation. Tool wear estimation in diamond drilling of SiCp/Al6063 composites was achieved by incorporating the combined abrasive-chemical tool wear subroutine into a coupled thermomechanical FE model of 3D drilling. The developed drilling FE model for reproducing diamond tool wear was validated for feasibility and reliability by comparing numerically simulated tool wear morphology with experimentally observed results after drilling a hole using brazed polycrystalline diamond (PCD) and chemical vapor deposition (CVD) diamond-coated tools. A fairly good agreement between experimental and simulated results in cutting forces, chip morphology and tool wear morphology demonstrates that the developed 3D drilling FE model, combined with the diamond tool wear subroutine, can provide a more accurate analysis not only of cutting forces and chip shape but also of tool wear behavior during drilling of SiCp/Al6063 composites. Once validated and calibrated, the developed diamond tool wear model, in conjunction with other machining FE models, can easily be extended to the investigation of tool wear evolution for various diamond tool geometries and other machining processes in cutting different workpiece materials.

  20. Mechanism-Based FE Simulation of Tool Wear in Diamond Drilling of SiCp/Al Composites

    PubMed Central

    Xiang, Junfeng; Pang, Siqin; Xie, Lijing; Gao, Feinong; Hu, Xin; Yi, Jie; Hu, Fang

    2018-01-01

    The aim of this work is to analyze the micro-mechanisms underlying the wear of macroscale tools during diamond machining of SiCp/Al6063 composites and to develop a mechanism-based diamond wear model in relation to the dominant wear behaviors. During drilling of high-volume-fraction SiCp/Al6063 composites containing Cu, the dominant wear mechanisms of the diamond tool involve thermodynamically activated physicochemical wear, due to diamond-graphite transformation catalyzed by Cu in an air atmosphere, and mechanically driven abrasive wear, due to high-frequency scraping of the hard SiC reinforcement on the tool surface. An analytical diamond wear model coupling the Usui abrasive wear model and an Arrhenius-type extended graphitization wear model was proposed and implemented through a user-defined subroutine for tool wear estimation. Tool wear estimation in diamond drilling of SiCp/Al6063 composites was achieved by incorporating the combined abrasive-chemical tool wear subroutine into a coupled thermomechanical FE model of 3D drilling. The developed drilling FE model for reproducing diamond tool wear was validated for feasibility and reliability by comparing numerically simulated tool wear morphology with experimentally observed results after drilling a hole using brazed polycrystalline diamond (PCD) and chemical vapor deposition (CVD) diamond-coated tools. A fairly good agreement between experimental and simulated results in cutting forces, chip morphology and tool wear morphology demonstrates that the developed 3D drilling FE model, combined with the diamond tool wear subroutine, can provide a more accurate analysis not only of cutting forces and chip shape but also of tool wear behavior during drilling of SiCp/Al6063 composites. Once validated and calibrated, the developed diamond tool wear model, in conjunction with other machining FE models, can easily be extended to the investigation of tool wear evolution for various diamond tool geometries and other machining processes in cutting different workpiece materials. PMID:29414839
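
    A hedged sketch of how an Usui-type abrasive term and an Arrhenius-type graphitization term might be combined into a single wear-rate estimate; all coefficients and operating values below are illustrative assumptions, not the calibrated model of the paper:

    ```python
    import math

    def usui_wear_rate(sigma_n, v_s, T, A=1.0e-9, B=2.5e3):
        """Usui-type abrasive wear rate: dW/dt = A * sigma_n * v_s * exp(-B / T)."""
        return A * sigma_n * v_s * math.exp(-B / T)

    def graphitization_rate(T, k0=1.0e6, Ea=2.0e5, R=8.314):
        """Arrhenius-type diamond-graphite transformation rate: k = k0 * exp(-Ea / (R*T))."""
        return k0 * math.exp(-Ea / (R * T))

    sigma_n = 800e6   # normal stress on the tool face, Pa (assumed)
    v_s = 2.0         # sliding velocity, m/s (assumed)
    T = 900.0         # interface temperature, K (assumed)

    total_rate = usui_wear_rate(sigma_n, v_s, T) + graphitization_rate(T)
    print(f"combined wear-rate estimate: {total_rate:.3e} (arbitrary units)")
    ```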

  1. Tools for Assessing Readability of Statistics Teaching Materials

    ERIC Educational Resources Information Center

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  2. Signal Detection Theory as a Tool for Successful Student Selection

    ERIC Educational Resources Information Center

    van Ooijen-van der Linden, Linda; van der Smagt, Maarten J.; Woertman, Liesbeth; te Pas, Susan F.

    2017-01-01

    Prediction accuracy of academic achievement for admission purposes requires adequate "sensitivity" and "specificity" of admission tools, yet the available information on the validity and predictive power of admission tools is largely based on studies using correlational and regression statistics. The goal of this study was to…
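
    A minimal sketch, with hypothetical admission outcomes, of the signal detection quantities the abstract refers to (sensitivity, specificity and, as one common SDT index, d'):

    ```python
    from scipy.stats import norm

    # Hypothetical 2x2 table: admission decision vs later academic success.
    hits, misses = 60, 15                        # successful students: admitted vs rejected
    false_alarms, correct_rejections = 25, 50    # unsuccessful students: admitted vs rejected

    sensitivity = hits / (hits + misses)                              # true positive rate
    specificity = correct_rejections / (correct_rejections + false_alarms)
    d_prime = norm.ppf(sensitivity) - norm.ppf(1 - specificity)       # SDT discriminability
    print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} d'={d_prime:.2f}")
    ```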

  3. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) and the new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data within Google Maps and to plot various seismicity graphs. The tool box has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, for example for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as of deterministic precursors. We plan to show many new results based on our newly developed platform.
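
    A minimal sketch of the Gutenberg-Richter 'b'-value calculation mentioned above, using Aki's maximum-likelihood estimator on a synthetic catalogue; this is illustrative and not the platform's own code:

    ```python
    import numpy as np

    def b_value(magnitudes, mc):
        """Aki (1965) estimator: b = log10(e) / (mean(M) - Mc) for magnitudes M >= Mc."""
        m = np.asarray(magnitudes)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - mc)

    rng = np.random.default_rng(7)
    mc = 2.0                                     # magnitude of completeness (assumed)
    true_b = 1.0
    # Synthetic catalogue: magnitudes above Mc are exponential with rate b*ln(10).
    mags = mc + rng.exponential(scale=1.0 / (true_b * np.log(10)), size=1000)
    print(f"estimated b-value: {b_value(mags, mc):.2f}")   # close to 1.0
    ```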

  4. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics from it. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, often (incorrectly) called the thermodynamic consistency relation, we recover other formalisms for quasi-particle systems, such as that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), which is widely studied for the quark-gluon plasma.
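
    A minimal sketch of the two starting routes, written with the standard thermodynamic identities (notation assumed here; the quasi-particle dispersion relations and temperature-dependent parameters of the model are not reproduced):

    ```latex
    % Landau-type route: start from the pressure P(T,\mu) and generate the rest
    n = \left(\frac{\partial P}{\partial \mu}\right)_{T}, \qquad
    s = \left(\frac{\partial P}{\partial T}\right)_{\mu}, \qquad
    \varepsilon = T s + \mu n - P .

    % Pathria-type route: start instead from the energy density \varepsilon(T,\mu)
    % and recover P, s and n from the same relations read in reverse.
    ```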

  5. CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. Bandopadhyay; N. Nagabhushana

    2003-10-01

    Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES, using a combination of strength measurements, stress analysis, and statistics, are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast-fracture reliability of ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability of a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes for solid oxide fuel cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating condition of SOFCs (1000 °C) decreased to 95 MPa, as compared to a room-temperature strength of 230 MPa; however, the Weibull modulus remained relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air, was representative of well-studied brittle materials. Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in a more representative SOFC environment.
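
    A minimal sketch of the two-parameter Weibull description of ceramic strength that design codes such as CARES build on; the fitting procedure and numbers are illustrative, not the YSZ dataset of this study:

    ```python
    import numpy as np

    def weibull_failure_probability(sigma, sigma_0, m):
        """P_f = 1 - exp[-(sigma / sigma_0)^m] for an applied stress sigma."""
        return 1.0 - np.exp(-(np.asarray(sigma) / sigma_0) ** m)

    def fit_weibull_modulus(strengths):
        """Estimate m and sigma_0 from ln(-ln(1-F)) vs ln(sigma) by linear regression."""
        s = np.sort(np.asarray(strengths))
        n = len(s)
        F = (np.arange(1, n + 1) - 0.5) / n          # median-rank style probability estimator
        y = np.log(-np.log(1.0 - F))
        x = np.log(s)
        m, c = np.polyfit(x, y, 1)                   # slope = m, intercept = -m*ln(sigma_0)
        return m, np.exp(-c / m)

    rng = np.random.default_rng(3)
    strengths = 230.0 * rng.weibull(a=10.0, size=30)   # synthetic strengths, MPa
    m_hat, s0_hat = fit_weibull_modulus(strengths)
    print(f"Weibull modulus ~ {m_hat:.1f}, characteristic strength ~ {s0_hat:.0f} MPa")
    print(f"P_f at 150 MPa: {weibull_failure_probability(150.0, s0_hat, m_hat):.2f}")
    ```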

  6. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    PubMed

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written in text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and a better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to understand better the influence of contamination characteristics on actinide biokinetics. Our approach can aid the optimization of treatment protocols and therefore contribute to the improvement of the medical response after internal contamination with actinides.
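
    A hedged sketch of the curve-fitting module described above, fitting a bi-exponential retention function to hypothetical activity-versus-time data; the functional form is an assumption, not necessarily the biokinetic model used by the authors:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def retention(t, a, k1, b, k2):
        """Bi-exponential retention: fast and slow clearance components."""
        return a * np.exp(-k1 * t) + b * np.exp(-k2 * t)

    days = np.array([1, 3, 7, 14, 28, 56, 90], dtype=float)
    activity = np.array([85.0, 70.0, 52.0, 38.0, 24.0, 13.0, 8.0])   # % of initial activity

    params, _ = curve_fit(retention, days, activity, p0=[60, 0.2, 30, 0.01])
    a, k1, b, k2 = params
    print(f"fast component: {a:.1f}% (k={k1:.3f}/d), slow component: {b:.1f}% (k={k2:.4f}/d)")
    ```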

  7. Ultimate compression after impact load prediction in graphite/epoxy coupons using neural network and multivariate statistical analyses

    NASA Astrophysics Data System (ADS)

    Gregoire, Alexandre David

    2011-07-01

    The goal of this research was to accurately predict the ultimate compressive load of impact damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impacted damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The number of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process which was largely responsible for the accuracy of the results.
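
    A minimal sketch of the final regression step, assuming (hypothetically) that the SOM has already classified AE hits into three mechanism counts per coupon; data and mechanism labels are invented for illustration:

    ```python
    import numpy as np

    # Columns: counts of three AE-classified failure mechanisms per coupon (hypothetical).
    X = np.array([
        [120,  40,  5],
        [200,  65, 12],
        [ 90,  30,  3],
        [310, 110, 25],
        [150,  55,  8],
    ], dtype=float)
    loads = np.array([86.0, 74.0, 91.0, 60.0, 80.0])     # ultimate compressive load, kN

    # Least-squares fit of load = w0 + w1*x1 + w2*x2 + w3*x3
    A = np.column_stack([np.ones(len(X)), X])
    coeffs, *_ = np.linalg.lstsq(A, loads, rcond=None)
    predicted = A @ coeffs
    errors = 100 * (predicted - loads) / loads
    print("prediction errors (%):", np.round(errors, 1))
    ```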

  8. Ergodic theorem, ergodic theory, and statistical mechanics

    PubMed Central

    Moore, Calvin C.

    2015-01-01

    This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject: namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
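
    For reference, the standard statement at issue (here for an ergodic, measure-preserving flow $\Phi_t$ on a probability space $(\Omega,\mu)$; notation is ours, not the paper's):

    ```latex
    \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} f\bigl(\Phi_{t} x\bigr)\, dt
      \;=\; \int_{\Omega} f \, d\mu
      \qquad \text{for } \mu\text{-almost every } x \in \Omega ,
    ```

    so that time averages equal phase (ensemble) averages.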

  9. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
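
    A rough sketch, on synthetic numbers, of the kind of variance decomposition involved in separating die-to-die from within-die thickness variation; this is a crude one-way split, not the paper's statistical metrology methodology:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_dies, n_sites = 20, 25
    die_offset = rng.normal(0.0, 30.0, size=(n_dies, 1))        # die-to-die variation, nm
    within_die = rng.normal(0.0, 80.0, size=(n_dies, n_sites))  # within-die variation, nm
    thickness = 8000.0 + die_offset + within_die                # measured ILD thickness, nm

    within_var = thickness.var(axis=1, ddof=1).mean()           # average within-die variance
    die_means = thickness.mean(axis=1)
    # Variance of die means contains a within-die contribution of within_var / n_sites.
    between_var = max(die_means.var(ddof=1) - within_var / n_sites, 0.0)
    print(f"within-die sigma ~ {within_var**0.5:.0f} nm, die-to-die sigma ~ {between_var**0.5:.0f} nm")
    ```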

  10. An Evolving Ecosystem for Natural Language Processing in Department of Veterans Affairs.

    PubMed

    Garvin, Jennifer H; Kalsy, Megha; Brandt, Cynthia; Luther, Stephen L; Divita, Guy; Coronado, Gregory; Redd, Doug; Christensen, Carrie; Hill, Brent; Kelly, Natalie; Treitler, Qing Zeng

    2017-02-01

    In an ideal clinical Natural Language Processing (NLP) ecosystem, researchers and developers would be able to collaborate with others, undertake validation of NLP systems, components, and related resources, and disseminate them. We captured requirements and formative evaluation data from the Veterans Affairs (VA) Clinical NLP Ecosystem stakeholders using semi-structured interviews and meeting discussions. We developed a coding rubric to code interviews. We assessed inter-coder reliability using percent agreement and the kappa statistic. We undertook 15 interviews and held two workshop discussions. The main areas of requirements related to design and functionality, resources, and information. Stakeholders also confirmed the vision of the second generation of the Ecosystem; recommendations included adding mechanisms to better understand terms, measuring collaboration to demonstrate value, and providing datasets and tools to navigate spelling errors in consumer language, among others. Stakeholders also recommended the capability to communicate with developers working on the next version of the VA electronic health record (VistA Evolution), a mechanism to automatically monitor downloads of tools, and an automatic summary of the downloads for Ecosystem contributors and funders. After three rounds of coding and discussion, we determined the percent agreement of two coders to be 97.2% and the kappa to be 0.7851. The vision of the VA Clinical NLP Ecosystem met stakeholder needs. Interviews and discussion provided key requirements that inform the design of the VA Clinical NLP Ecosystem.

  11. Big Data and Neuroimaging.

    PubMed

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  12. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition functions corresponding to general statistics which is a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of restricted integer partition function is just the symmetric function which is a class of functions being invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
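
    For orientation, the unrestricted generating function and one simple Gentile-type restriction (each part used at most $q$ times) can be written as follows; this is standard background rather than the paper's general construction:

    ```latex
    \sum_{n \ge 0} p(n)\, x^{n} \;=\; \prod_{k \ge 1} \frac{1}{1 - x^{k}}
    \qquad \text{(unrestricted partitions)},

    \prod_{k \ge 1} \sum_{m=0}^{q} x^{k m}
      \;=\; \prod_{k \ge 1} \frac{1 - x^{k(q+1)}}{1 - x^{k}}
    \qquad \text{(each part used at most } q \text{ times)}.
    ```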

  13. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools 2)

    DTIC Science & Technology

    2013-09-30

    proceedings of a recent conference on The Effects of Noise on Aquatic Life (Schick et al. 2014). In addition to this work, Schick has been working...Lisbon, Portugal (April), the UK National Centre for Statistical Ecology annual workshop (June), and the Effects of Aquatic Noise conference (August). We...A. N. Popper and A. Hawkins, editors. Effects of Noise on Aquatic Life II. Springer. [in press] Fleishman, E., M. Burgman, M. C. Runge, R. S

  14. Basic Engineer Equipment Mechanic.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the skills needed by basic engineer equipment mechanics. Addressed in the four individual units of the course are the following topics: mechanics and their tools (mechanics, hand tools, and power…

  15. Bulk measurements of messy chemistries are needed for a theory of the origins of life

    NASA Astrophysics Data System (ADS)

    Guttenberg, Nicholas; Virgo, Nathaniel; Chandru, Kuhan; Scharf, Caleb; Mamajanov, Irena

    2017-11-01

    A feature of many of the chemical systems plausibly involved in the origins of terrestrial life is that they are complex and messy, producing a wide range of compounds via a wide range of mechanisms. However, the fundamental behaviour of such systems is currently not well understood; we do not have the tools to make statistical predictions about such complex chemical networks. This is, in part, due to a lack of quantitative data from which such a theory could be built; specifically, functional measurements of messy chemical systems. Here, we propose that the pantheon of experimental approaches to the origins of life should be expanded to include the study of `functional measurements': the direct study of bulk properties of chemical systems and their interactions with other compounds, the formation of structures and other behaviours, even in cases where the precise composition and mechanisms are unknown. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  16. A two-scale Weibull approach to the failure of porous ceramic structures made by robocasting: possibilities and limits

    PubMed Central

    Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.

    2012-01-01

    This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to describe the statistical failure of the scaffolds' constituent rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. While calibration based on rod-scale data leads to over-conservative predictions of the scaffold's properties (because successive rod failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936

  17. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    ERIC Educational Resources Information Center

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…

  18. An Integrated, Statistical Molecular Approach to the Physical Chemistry Curriculum

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2009-01-01

    As an alternative to the "thermodynamics first" or "quantum first" approaches to the physical chemistry curriculum, the statistical definition of entropy and the Boltzmann distribution are introduced in the first days of the course and the entire two-semester curriculum is then developed from these concepts. Once the tools of statistical mechanics…
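
    For reference, the standard definitions presumably meant by "the statistical definition of entropy and the Boltzmann distribution":

    ```latex
    S = k_{B} \ln W , \qquad
    p_{i} = \frac{e^{-\varepsilon_{i}/k_{B}T}}{Z}, \qquad
    Z = \sum_{j} e^{-\varepsilon_{j}/k_{B}T} .
    ```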

  19. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  20. The Importance of Statistical Modeling in Data Analysis and Inference

    ERIC Educational Resources Information Center

    Rollins, Derrick, Sr.

    2017-01-01

    Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…

  1. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  2. Ocean Drilling Program: Web Site Access Statistics

    Science.gov Websites

    [Web page navigation residue. Recoverable content: access statistics for the ODP/TAMU web site and the online Janus database, including separate statistics for JOIDES members, with monthly listings beginning October 1997; hosted at www-odp.tamu.edu; a note marks the end of ODP and the start of IODP.]

  3. Use of Data Visualisation in the Teaching of Statistics: A New Zealand Perspective

    ERIC Educational Resources Information Center

    Forbes, Sharleen; Chapman, Jeanette; Harraway, John; Stirling, Doug; Wild, Chris

    2014-01-01

    For many years, students have been taught to visualise data by drawing graphs. Recently, there has been a growing trend to teach statistics, particularly statistical concepts, using interactive and dynamic visualisation tools. Free down-loadable teaching and simulation software designed specifically for schools, and more general data visualisation…

  4. 78 FR 54485 - Apex Tool Group, LLC; Gastonia Operation Division; Including On-Site Leased Workers From Adecco...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... Carolina. The workers are engaged in activities related to the production of mechanic's hand tool sets. The... production of mechanic's hand tool sets to a foreign country. Based on these findings, the Department is...

  5. FonaDyn - A system for real-time analysis of the electroglottogram, over the voice range

    NASA Astrophysics Data System (ADS)

    Ternström, Sten; Johansson, Dennis; Selamtzis, Andreas

    2018-01-01

    From soft to loud and low to high, the mechanisms of human voice have many degrees of freedom, making it difficult to assess phonation from the acoustic signal alone. FonaDyn is a research tool that combines acoustics with electroglottography (EGG). It characterizes and visualizes in real time the dynamics of EGG waveforms, using statistical clustering of the cycle-synchronous EGG Fourier components, and their sample entropy. The prevalence and stability of different EGG waveshapes are mapped as colored regions into a so-called voice range profile, without needing pre-defined thresholds or categories. With appropriately 'trained' clusters, FonaDyn can classify and map voice regimes. This is of potential scientific, clinical and pedagogical interest.
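
    A hedged sketch of the clustering idea: grouping cycle-synchronous EGG cycles by a few Fourier-component features; the feature construction and synthetic data are assumptions, and this is not FonaDyn's real-time implementation:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(11)
    n_cycles = 2000
    # Features: magnitudes of the first 4 Fourier components per EGG cycle (assumed),
    # drawn from two synthetic "waveshape" populations.
    features = np.vstack([
        rng.normal([1.0, 0.5, 0.2, 0.1], 0.05, size=(n_cycles // 2, 4)),
        rng.normal([0.8, 0.1, 0.05, 0.02], 0.05, size=(n_cycles // 2, 4)),
    ])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
    print(np.bincount(kmeans.labels_))    # prevalence of each EGG waveshape cluster
    ```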

  6. Validity criteria for Fermi's golden rule scattering rates applied to metallic nanowires.

    PubMed

    Moors, Kristof; Sorée, Bart; Magnus, Wim

    2016-09-14

    Fermi's golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
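
    For reference, the rule in question in its standard form, with $H'$ the perturbation coupling initial and final states:

    ```latex
    \Gamma_{i \to f} \;=\; \frac{2\pi}{\hbar}\,
      \bigl|\langle f | H' | i \rangle\bigr|^{2}\,
      \delta\!\left(E_{f} - E_{i}\right),
    ```

    with the total scattering rate obtained by summing over final states; the validity criteria derived in the paper bound when this lowest-order estimate can be trusted.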

  7. Application of the Cluster Expansion to a Mathematical Model of the Long Memory Phenomenon in a Financial Market

    NASA Astrophysics Data System (ADS)

    Kuroda, Koji; Maskawa, Jun-ichi; Murai, Joshin

    2013-08-01

    Empirical studies of high-frequency data in stock markets show that the time series of trade signs or signed volumes has a long-memory property. In this paper, we present a discrete-time stochastic process for a polymer model that describes a trader's trading strategy, and show that a scaling limit of the process converges to a superposition of fractional Brownian motions (with various Hurst exponents) and Brownian motion, provided that the index γ of the time scale of the trader's investment strategy coincides with the index δ of the interaction range in the discrete-time process. The main tool for the investigation is the method of cluster expansion developed in the mathematical study of statistical mechanics.
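
    As background (a standard definition, not a result of the paper), fractional Brownian motion $B_H$ with Hurst exponent $H$ is the centred Gaussian process with covariance

    ```latex
    \mathbb{E}\bigl[ B_{H}(t)\, B_{H}(s) \bigr]
      \;=\; \tfrac{1}{2}\left( |t|^{2H} + |s|^{2H} - |t - s|^{2H} \right),
    ```

    whose increments exhibit long memory when $H > 1/2$, which is why it appears as a natural scaling limit for long-memory order flow.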

  8. Entanglement and Wigner Function Negativity of Multimode Non-Gaussian States

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Fabre, Claude; Parigi, Valentina; Treps, Nicolas

    2017-11-01

    Non-Gaussian operations are essential to exploit the quantum advantages in optical continuous variable quantum information protocols. We focus on mode-selective photon addition and subtraction as experimentally promising processes to create multimode non-Gaussian states. Our approach is based on correlation functions, as is common in quantum statistical mechanics and condensed matter physics, mixed with quantum optics tools. We formulate an analytical expression of the Wigner function after the subtraction or addition of a single photon, for arbitrarily many modes. It is used to demonstrate entanglement properties specific to non-Gaussian states and also leads to a practical and elegant condition for Wigner function negativity. Finally, we analyze the potential of photon addition and subtraction for an experimentally generated multimode Gaussian state.

  9. Entanglement and Wigner Function Negativity of Multimode Non-Gaussian States.

    PubMed

    Walschaers, Mattia; Fabre, Claude; Parigi, Valentina; Treps, Nicolas

    2017-11-03

    Non-Gaussian operations are essential to exploit the quantum advantages in optical continuous variable quantum information protocols. We focus on mode-selective photon addition and subtraction as experimentally promising processes to create multimode non-Gaussian states. Our approach is based on correlation functions, as is common in quantum statistical mechanics and condensed matter physics, mixed with quantum optics tools. We formulate an analytical expression of the Wigner function after the subtraction or addition of a single photon, for arbitrarily many modes. It is used to demonstrate entanglement properties specific to non-Gaussian states and also leads to a practical and elegant condition for Wigner function negativity. Finally, we analyze the potential of photon addition and subtraction for an experimentally generated multimode Gaussian state.

  10. Complexity transitions in global algorithms for sparse linear systems over finite fields

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Leone, M.; Ricci-Tersenghi, F.; Zecchina, R.

    2002-09-01

    We study the computational complexity of a very basic problem, namely that of finding solutions to a very large set of random linear equations in a finite Galois field modulo q. Using tools from statistical mechanics we are able to identify phase transitions in the structure of the solution space and to connect them to the changes in the performance of a global algorithm, namely Gaussian elimination. Crossing phase boundaries produces a dramatic increase in memory and CPU requirements necessary for the algorithms. In turn, this causes the saturation of the upper bounds for the running time. We illustrate the results on the specific problem of integer factorization, which is of central interest for deciphering messages encrypted with the RSA cryptosystem.
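
    A minimal sketch of the global algorithm in question, Gaussian elimination over a prime field GF(q), on a small dense example (the paper studies large sparse random systems):

    ```python
    import numpy as np

    def solve_mod_q(A, b, q):
        """Solve A x = b over GF(q), q prime, by Gauss-Jordan elimination."""
        A, b = A.copy() % q, b.copy() % q
        n = len(b)
        for col in range(n):
            pivot = next(r for r in range(col, n) if A[r, col] % q != 0)
            A[[col, pivot]], b[[col, pivot]] = A[[pivot, col]], b[[pivot, col]]
            inv = pow(int(A[col, col]), -1, q)          # modular inverse (q prime)
            A[col], b[col] = (A[col] * inv) % q, (b[col] * inv) % q
            for r in range(n):
                if r != col and A[r, col]:
                    factor = A[r, col]
                    A[r] = (A[r] - factor * A[col]) % q
                    b[r] = (b[r] - factor * b[col]) % q
        return b

    q = 7
    A = np.array([[1, 2, 0], [0, 3, 5], [4, 0, 1]])
    b = np.array([3, 6, 2])
    x = solve_mod_q(A, b, q)
    print(x, (A @ x) % q)    # x solves A x = b (mod 7)
    ```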

  11. Optical stretching as a tool to investigate the mechanical properties of lipid bilayers†

    PubMed Central

    Solmaz, Mehmet E.; Sankhagowit, Shalene; Biswas, Roshni; Mejia, Camilo A.; Povinelli, Michelle L.; Malmstadt, Noah

    2013-01-01

    Measurements of lipid bilayer bending modulus by various techniques produce widely divergent results. We attempt to resolve some of this ambiguity by measuring bending modulus in a system that can rapidly process large numbers of samples, yielding population statistics. This system is based on optical stretching of giant unilamellar vesicles (GUVs) in a microfluidic dual-beam optical trap (DBOT). The microfluidic DBOT system is used here to measure three populations of GUVs with distinct lipid compositions. We find that gel-phase membranes are significantly stiffer than liquid-phase membranes, consistent with previous reports. We also find that the addition of cholesterol does not alter the bending modulus of membranes composed of a monounsaturated phospholipid. PMID:24244843

  12. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provide a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
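
    A hedged sketch of the task-parallel (MPI) pattern described; the file names and per-file analysis routine are placeholders, not the CASCADE project's code:

    ```python
    from mpi4py import MPI

    def analyze_extremes(path):
        """Stand-in for a per-file statistical routine (e.g. counting threshold exceedances)."""
        return len(path)  # placeholder result

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    yearly_files = [f"tas_day_{year}.nc" for year in range(1980, 2020)]   # hypothetical inputs
    my_files = yearly_files[rank::size]                 # round-robin task assignment
    local_results = [analyze_extremes(p) for p in my_files]

    all_results = comm.gather(local_results, root=0)    # collect results on rank 0
    if rank == 0:
        flat = [r for chunk in all_results for r in chunk]
        print(f"processed {len(flat)} files across {size} ranks")
    ```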

  13. Teaching meta-analysis using MetaLight.

    PubMed

    Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark

    2012-10-18

    Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
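
    A minimal sketch of the core computation a simple (fixed-effect, inverse-variance) meta-analysis performs; the effect sizes are hypothetical and this is not MetaLight's implementation:

    ```python
    import numpy as np

    effects = np.array([0.30, 0.10, 0.45, 0.22])      # study effect sizes (e.g. SMD)
    se = np.array([0.12, 0.09, 0.20, 0.15])           # their standard errors

    w = 1.0 / se**2                                    # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    print(f"pooled effect: {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
    ```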

  14. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...
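
    As an illustration of the SPC tools referred to above, this sketch computes 3-sigma X-bar control-chart limits and flags out-of-control subgroups; the board-thickness data are hypothetical.

```python
import statistics

def xbar_control_limits(subgroup_means, sigma_within, subgroup_size):
    """Center line and 3-sigma control limits for an X-bar chart."""
    center = statistics.mean(subgroup_means)
    margin = 3.0 * sigma_within / (subgroup_size ** 0.5)
    return center - margin, center, center + margin

def out_of_control(subgroup_means, lcl, ucl):
    """Indices of subgroups falling outside the control limits."""
    return [i for i, m in enumerate(subgroup_means) if m < lcl or m > ucl]

# Hypothetical board-thickness subgroup means (mm), subgroups of 5 boards
means = [18.9, 19.1, 19.0, 19.6, 18.8]
lcl, cl, ucl = xbar_control_limits(means, sigma_within=0.2, subgroup_size=5)
print(lcl, cl, ucl, out_of_control(means, lcl, ucl))
```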

  15. Predictive Data Tools Find Uses in Schools

    ERIC Educational Resources Information Center

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  16. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    PubMed

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, which was found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides as positively or negatively stained. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared and obtained from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results demonstrate that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
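
    A rough sketch of the approach described (chrominance histograms with the luminance channel dropped, 128 bins, maximum-likelihood pixel labelling); it is not the ImageJ plugin's code, and the trained stain/background histograms are assumed to be supplied.

```python
import numpy as np

N_BINS = 128  # histogram bins per chrominance channel, as in the abstract

def cbcr_histogram(pixels_cbcr):
    """Normalized 2-D histogram over the Cb and Cr channels (Y channel dropped)."""
    hist, _, _ = np.histogram2d(pixels_cbcr[:, 0], pixels_cbcr[:, 1],
                                bins=N_BINS, range=[[0, 256], [0, 256]])
    return (hist + 1e-9) / hist.sum()  # small constant avoids zero likelihoods

def classify(pixels_cbcr, hist_stain, hist_background):
    """Maximum-likelihood labelling: True where the stain model is more likely."""
    idx = np.clip((pixels_cbcr / 256.0 * N_BINS).astype(int), 0, N_BINS - 1)
    like_stain = hist_stain[idx[:, 0], idx[:, 1]]
    like_bg = hist_background[idx[:, 0], idx[:, 1]]
    return like_stain > like_bg
```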

  17. Vibration reduction of pneumatic percussive rivet tools: mechanical and ergonomic re-design approaches.

    PubMed

    Cherng, John G; Eksioglu, Mahmut; Kizilaslan, Kemal

    2009-03-01

    This paper presents a systematic design approach, the result of years of research effort, to the ergonomic re-design of rivet tools, i.e. rivet hammers and bucking bars. The investigation was carried out using both an ergonomic approach and mechanical analysis of the rivet tools' dynamic behavior. The optimal mechanical design parameters of the re-designed rivet tools were determined by the Taguchi method. Two ergonomically re-designed rivet tools with vibration damping/isolation mechanisms were tested against two conventional rivet tools in both laboratory and field tests. Vibration characteristics of both types of tools were measured in laboratory tests using a custom-made test fixture. Subjective field evaluations of the tools were performed by six experienced riveters at an aircraft repair shop. Results indicate that the isolation spring and polymer damper are very effective in reducing the overall level of vibration under both unweighted and weighted acceleration conditions. The mass of the dolly head and the housing played a significant role in the vibration absorption of the bucking bars. Another important result was that ductile iron has better vibration-reducing capability than steel and aluminum for bucking bars. Mathematical simulation results were also consistent with the experimental results. The overall conclusion of the study was that by applying ergonomic design principles and adding vibration damping/isolation mechanisms to rivet tools, the vibration level can be significantly reduced and the tools become safer and more user-friendly. Details of the experience gained, design modifications, test methods, mathematical models and results are included in the paper.

  18. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in its decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecast Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts produce large quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. The tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper presents a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
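
    A minimal Bokeh sketch of the kind of ensemble summary page described; the streamflow ensemble is synthetic and the layout is far simpler than the operational tool.

```python
import numpy as np
from bokeh.plotting import figure, output_file, save

# Hypothetical 10-member streamflow ensemble over a 14-day horizon
days = np.arange(14)
members = [100 + np.cumsum(np.random.normal(0, 5, size=14)) for _ in range(10)]

p = figure(title="Ensemble streamflow forecast (hypothetical data)",
           x_axis_label="Lead time (days)", y_axis_label="Flow (cfs)")
p.multi_line(xs=[days] * len(members), ys=members,
             line_alpha=0.4, legend_label="Members")
p.line(days, np.mean(members, axis=0), line_width=3, color="navy",
       legend_label="Ensemble mean")

output_file("forecast_summary.html")  # a static HTML page, as in the tool described
save(p)
```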

  19. A Statistical Tool for Examining Heat Waves and Other Extreme Phenomena Arising from Multiple Factors

    NASA Astrophysics Data System (ADS)

    Cooley, D. S.; Castillo, F.; Thibaud, E.

    2017-12-01

    A 2015 heatwave in Pakistan is blamed for over a thousand deaths. This event consisted of several days of very high temperatures and unusually high humidity for this region. However, none of these days exceeded the threshold for "extreme danger" in terms of the heat index. The heat index is a univariate function of both temperature and humidity which is universally applied at all locations regardless of local climate. Understanding extremes which arise from multiple factors is challenging. In this paper we will present a tool for examining bivariate extreme behavior. The tool, developed in the statistical software R, draws isolines of equal exceedance probability. These isolines can be understood as bivariate "return levels". The tool is based on a dependence framework specific for extremes, is semiparametric, and is able to extrapolate isolines beyond the range of the data. We illustrate this tool using the Pakistan heat wave data and other bivariate data.
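
    The tool itself fits an extreme-value dependence model in R; as a rough illustration of bivariate "return level" isolines, the sketch below simply contours empirical joint-exceedance probabilities for synthetic temperature-humidity data.

```python
import numpy as np
import matplotlib.pyplot as plt

def joint_exceedance_grid(temp, humid, n=100):
    """Empirical P(T > t, H > h) on a grid; a crude stand-in for the
    extreme-value dependence framework used by the actual tool."""
    t_grid = np.linspace(temp.min(), temp.max(), n)
    h_grid = np.linspace(humid.min(), humid.max(), n)
    prob = np.array([[np.mean((temp > t) & (humid > h)) for h in h_grid]
                     for t in t_grid])
    return t_grid, h_grid, prob

# Hypothetical daily summer maxima of temperature and relative humidity
rng = np.random.default_rng(1)
temp = rng.normal(38, 3, 2000)
humid = 40 + 1.5 * (temp - 38) + rng.normal(0, 8, 2000)

t, h, p = joint_exceedance_grid(temp, humid)
cs = plt.contour(h, t, p, levels=[0.001, 0.005, 0.01])  # isolines of equal exceedance probability
plt.clabel(cs)
plt.xlabel("Relative humidity (%)")
plt.ylabel("Temperature (°C)")
plt.savefig("isolines.png")
```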

  20. Translating statistical species-habitat models to interactive decision support tools

    USGS Publications Warehouse

    Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  1. Modelling of peak temperature during friction stir processing of magnesium alloy AZ91

    NASA Astrophysics Data System (ADS)

    Vaira Vignesh, R.; Padmanaban, R.

    2018-02-01

    Friction stir processing (FSP) is a solid state processing technique with the potential to modify the properties of a material through microstructural modification. The study of heat transfer in FSP aids in the identification of defects such as flash, inadequate heat input, and poor material flow and mixing. In this paper, the transient temperature distribution during FSP of magnesium alloy AZ91 was simulated using finite element modelling. The numerical model results were validated against experimental results from the published literature. The model was used to predict the peak temperature obtained during FSP for various process parameter combinations. The simulated peak temperature results were used to develop a statistical model. The effect of the process parameters, namely tool rotation speed, tool traverse speed and tool shoulder diameter, on the peak temperature was investigated using the developed statistical model. It was found that peak temperature is directly proportional to tool rotation speed and shoulder diameter and inversely proportional to tool traverse speed.
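
    The paper's statistical model is built from finite-element results; the sketch below fits a simple linear response-surface in the three process parameters to hypothetical peak-temperature values, only to illustrate the form such a model might take.

```python
import numpy as np

# Hypothetical design points: (rotation speed rpm, traverse speed mm/min, shoulder diameter mm)
X = np.array([[1000, 20, 15], [1200, 20, 18], [1400, 30, 15],
              [1000, 40, 18], [1200, 30, 21], [1400, 40, 21]], dtype=float)
T_peak = np.array([410.0, 455.0, 440.0, 395.0, 460.0, 470.0])  # simulated peak temperatures (degC)

# Least-squares fit of a linear response-surface model: T = b0 + b1*N + b2*v + b3*D
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, T_peak, rcond=None)
print(coeffs)  # signs should show T rising with N and D, falling with v

def predict(rotation, traverse, shoulder):
    return coeffs @ np.array([1.0, rotation, traverse, shoulder])

print(predict(1300, 25, 18))
```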

  2. Translating statistical species-habitat models to interactive decision support tools.

    PubMed

    Wszola, Lyndsie S; Simonsen, Victoria L; Stuber, Erica F; Gillespie, Caitlyn R; Messinger, Lindsey N; Decker, Karie L; Lusk, Jeffrey J; Jorgensen, Christopher F; Bishop, Andrew A; Fontaine, Joseph J

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  3. Translating statistical species-habitat models to interactive decision support tools

    PubMed Central

    Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences. PMID:29236707

  4. PRANAS: A New Platform for Retinal Analysis and Simulation.

    PubMed

    Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry

    2017-01-01

    The retina encodes visual scenes as trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new free-access, end-user software that helps to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes both spatial and temporal correlations into account as constraints, making it possible to analyze the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a user-friendly graphical interface. The most CPU-costly of them have been implemented to run in parallel.

  5. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  6. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  7. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to build quality into products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT instruments. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is represented diagrammatically.
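
    As an illustration of a typical chemometric calibration used with PAT spectra, the sketch below fits a partial least squares (PLS) model to synthetic near-infrared spectra; it is generic and not tied to any specific application in the review.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical NIR spectra (rows = samples, cols = wavelengths) and API content (%)
rng = np.random.default_rng(7)
spectra = rng.normal(size=(40, 200))
api = 5.0 + 0.8 * spectra[:, 50] - 0.5 * spectra[:, 120] + rng.normal(0, 0.1, 40)

# Partial least squares: a common chemometric calibration model for spectral data
pls = PLSRegression(n_components=3)
pls.fit(spectra[:30], api[:30])                            # calibration set
print("Held-out R^2:", pls.score(spectra[30:], api[30:]))  # simple validation
```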

  8. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose volume outcome relationships

    NASA Astrophysics Data System (ADS)

    El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.

    2006-11-01

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
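
    DREES itself is Matlab-based; as an illustration of the analytical NTCP fitting it supports, the following Python sketch fits a two-parameter logistic dose-response model to hypothetical mean-dose/complication data by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Hypothetical patient data: mean organ dose (Gy) and complication indicator
dose = np.array([8.0, 12.0, 15.0, 18.0, 22.0, 25.0, 28.0, 31.0, 35.0, 40.0])
event = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

def neg_log_likelihood(params):
    b0, b1 = params
    p = np.clip(expit(b0 + b1 * dose), 1e-9, 1 - 1e-9)
    return -np.sum(event * np.log(p) + (1 - event) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[-3.0, 0.1], method="Nelder-Mead")
b0, b1 = fit.x
d50 = -b0 / b1  # dose giving a 50% predicted complication probability
print(fit.x, d50)
```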

  9. Online use statistics.

    PubMed

    Tannery, Nancy Hrinya; Silverman, Deborah L; Epstein, Barbara A

    2002-01-01

    Online use statistics can provide libraries with a tool to be used when developing an online collection of resources. Statistics can provide information on overall use of a collection, individual print and electronic journal use, and collection use by specific user populations. They can also be used to determine the number of user licenses to purchase. This paper focuses on the issue of use statistics made available for one collection of online resources.

  10. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The Javascript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
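
    As an example of the statistical properties mentioned above, the sketch below implements the standard Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value; it is not taken from the USGS or OpenSHA libraries, and the catalog is hypothetical.

```python
import math

def b_value_mle(magnitudes, m_completeness, bin_width=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value.

    magnitudes     -- catalog magnitudes
    m_completeness -- magnitude of completeness Mc; smaller events are discarded
    bin_width      -- magnitude binning, used in the standard bias correction
    """
    mags = [m for m in magnitudes if m >= m_completeness]
    mean_mag = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_mag - (m_completeness - bin_width / 2.0))

# Hypothetical aftershock catalog
catalog = [2.1, 2.3, 2.0, 2.8, 3.4, 2.2, 2.6, 2.0, 4.1, 2.4, 2.9, 3.0]
print(b_value_mle(catalog, m_completeness=2.0))
```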

  11. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  12. Upgrade Summer Severe Weather Tool

    NASA Technical Reports Server (NTRS)

    Watson, Leela

    2011-01-01

    The goal of this task was to upgrade the existing severe weather database by adding observations from the 2010 warm season, update the verification dataset with results from the 2010 warm season, apply statistical logistic regression analysis to the database, and develop a new forecast tool. The AMU analyzed 7 stability parameters that showed the possibility of providing guidance in forecasting severe weather, calculated verification statistics for the Total Threat Score (TTS), and calculated warm season verification statistics for the 2010 season. The AMU also performed statistical logistic regression analysis on the 22-year severe weather database. The results indicated that the logistic regression equation did not show an increase in skill over the previously developed TTS. The equation showed less accuracy than TTS at predicting severe weather, little ability to distinguish between severe and non-severe weather days, and worse standard categorical accuracy measures and skill scores than TTS.

  13. Bayesian models based on test statistics for multiple hypothesis testing problems.

    PubMed

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
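
    A simplified sketch of the general idea: model the test statistics with null and alternative components, compute posterior null probabilities, and reject the largest set whose average posterior null probability stays below a Bayesian FDR threshold. The component densities and mixing weight are assumed here rather than estimated from the data as in the paper.

```python
import numpy as np
from scipy.stats import norm

def posterior_null_prob(z, pi0=0.9, alt_mean=3.0):
    """Posterior probability of the null under an assumed two-component model for z-statistics."""
    f0 = norm.pdf(z, 0.0, 1.0)
    f1 = norm.pdf(z, alt_mean, 1.0)
    return pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)

def bayesian_fdr_reject(post_null, alpha=0.05):
    """Reject the largest set of hypotheses whose average posterior null
    probability (the Bayesian FDR) stays below alpha."""
    order = np.argsort(post_null)
    running_mean = np.cumsum(post_null[order]) / np.arange(1, len(post_null) + 1)
    k = int(np.sum(running_mean <= alpha))  # size of the largest acceptable rejection set
    rejected = np.zeros(len(post_null), dtype=bool)
    rejected[order[:k]] = True
    return rejected

# Synthetic example: 900 null and 100 differentially expressed genes
z = np.concatenate([np.random.normal(0, 1, 900), np.random.normal(3, 1, 100)])
print(bayesian_fdr_reject(posterior_null_prob(z)).sum(), "hypotheses rejected")
```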

  14. Interactive Web Graphs with Fewer Restrictions

    NASA Technical Reports Server (NTRS)

    Fiedler, James

    2012-01-01

    Interactive statistical web graphs, and the programs that generate them, are growing in popularity. However, these programs tend to be somewhat restricted in which web browsers and statistical software they support. For example, the software might use SVG (e.g., Protovis, gridSVG) or the HTML canvas, both of which exclude most versions of Internet Explorer, or the software might be made specifically for R (gridSVG, cranvas), thus excluding users of other statistical software. There are more general tools (d3, RaphaëlJS) which are compatible with most browsers, but using one of these to make statistical graphs requires more coding than is probably desired, and requires learning a new tool. This talk will present a method for making interactive web graphs which, by design, attempts to support as many browsers and as many statistical programs as possible, while also aiming to be relatively easy to use and relatively easy to extend.

  15. TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.

    PubMed

    Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han

    2017-03-01

    High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline and are difficult to integrate with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.

  16. Multivariate analysis for stormwater quality characteristics identification from different urban surface types in Macau.

    PubMed

    Huang, J; Du, P; Ao, C; Ho, M; Lei, M; Zhao, D; Wang, Z

    2007-12-01

    Statistical analysis of stormwater runoff data enables general identification of runoff characteristics. Six catchments in Macau with different urban surface types, including roofs, roadway, park, and residential/commercial areas, were selected for sampling and study from June 2005 to September 2006. Based on univariate statistical analysis of the sampled data, the major pollutants discharged from each urban surface type were identified. For iron roof runoff, Zn is the most significant pollutant. The major pollutants from urban roadway runoff are TSS and COD. Stormwater runoff from the commercial/residential and park catchments shows high levels of COD, TN, and TP. Principal component analysis (PCA) was then performed to identify linkages between stormwater quality and urban surface types. Two potential pollution sources were identified for the study catchments. The first is associated with nutrient losses, soil losses, and organic pollutant discharges; the second is related to heavy metal losses. PCA proved to be a viable tool for explaining the types of pollution sources and their mechanisms in catchments with different urban surface types.
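
    A minimal sketch of the multivariate step described (PCA on standardized water-quality variables); the concentration matrix is synthetic and the code is not from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical event-mean concentrations: columns = TSS, COD, TN, TP, Zn, Pb
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(60, 6))

# Standardize so each water-quality variable contributes equally, then extract components
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(Z)

print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Loadings (rows = components, cols = variables):")
print(pca.components_)
```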

  17. Theory of hydromagnetic turbulence

    NASA Technical Reports Server (NTRS)

    Montgomery, D.

    1983-01-01

    The present state of MHD turbulence theory as a possible solar wind research tool is surveyed. The theory is statistical, and does not make statements about individual events. The ensembles considered typically have individual realizations which differ qualitatively, unlike equilibrium statistical mechanics. Most of the theory deals with highly symmetric situations; most of these symmetries have yet to be tested in the solar wind. The applicability of MHD itself to solar wind parameters is highly questionable; yet it has no competitors as a potentially comprehensive dynamical description. The purposes of solar wind research require sharper articulation. If the goal is to understand radial turbulent plasma flows from spheres, laboratory experiments and numerical solution of the equations of motion may be cheap alternatives to spacecraft. If "real life" information is demanded, multiple spacecraft with variable separation may be necessary to go further. The principal emphasis in the theory so far has been on spectral behavior of spatial covariances in wave number space. There is no respectable theory of these for highly anisotropic situations. At this point, a rather slow development of theory acts as a brake on justifiable measurement.

  18. Big heart data: advancing health informatics through data sharing in cardiovascular imaging.

    PubMed

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A

    2015-07-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases.

  19. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
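
    A small sketch of the clustering step such a tool automates: k-means on log-transformed file size and transfer time, followed by per-cluster file statistics; the log records are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical mass-storage log records: (file size in MB, transfer time in s)
rng = np.random.default_rng(42)
small = np.column_stack([rng.normal(5, 2, 200), rng.normal(0.5, 0.2, 200)])
large = np.column_stack([rng.normal(900, 150, 50), rng.normal(60, 10, 50)])
records = np.vstack([small, large])

# Log-transform reduces the skew typical of file-size distributions
features = np.log10(np.clip(records, 0.01, None))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Per-cluster file statistics, as the tool reports
for k in range(2):
    sizes = records[labels == k, 0]
    print(f"cluster {k}: n={sizes.size}, mean={sizes.mean():.1f} MB, "
          f"min={sizes.min():.1f}, max={sizes.max():.1f}")
```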

  20. Tool use and mechanical problem solving in apraxia.

    PubMed

    Goldenberg, G; Hagmann, S

    1998-07-01

    Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose they have been made for but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: It can be based on retrieval of instructions of use from semantic memory or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instruction of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of selection and application of novel tools. Access to instruction of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD) and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use but not with the selection of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection but there were single cases who scored in the defective range on one of these tests and normally on the other. Analysis of LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on pantomime of object use and on novel tool selection committed errors in actual use of familiar tools. The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequel of loss of access to instruction of use.

  1. Detergent-enzymatic decellularization of swine blood vessels: insight on mechanical properties for vascular tissue engineering.

    PubMed

    Pellegata, Alessandro F; Asnaghi, M Adelaide; Stefani, Ilaria; Maestroni, Anna; Maestroni, Silvia; Dominioni, Tommaso; Zonta, Sandro; Zerbini, Gianpaolo; Mantero, Sara

    2013-01-01

    Small-caliber vessel substitutes remain an unmet clinical need; few autologous substitutes are available, while synthetic grafts show insufficient patency in the long term. Decellularization is the complete removal of all cellular and nuclear matter from a tissue while leaving a preserved extracellular matrix, and it represents a promising tool for the generation of acellular scaffolds for tissue engineering, already used for various tissues with positive outcomes. The aim of this work is to investigate the effect of a detergent-enzymatic decellularization protocol on swine arteries in terms of cell removal, extracellular matrix preservation, and mechanical properties. Furthermore, the effect of storage at -80°C on the mechanical properties of the tissue is evaluated. Swine arteries were harvested, frozen, and decellularized; histological analysis revealed complete cell removal and a preserved extracellular matrix. Furthermore, the residual DNA content in decellularized tissues was far lower than in native tissue. Mechanical testing was performed on native, defrosted, and decellularized tissues; no statistically significant differences were found for Young's modulus, ultimate stress, compliance, burst pressure, or suture retention strength, while ultimate strain and stress relaxation of decellularized vessels were significantly different from those of native ones. Considering the overall results, the process was confirmed to be suitable for the generation of acellular scaffolds for vascular tissue engineering.

  2. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, the definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from databases. Diagrams, statistics, and a bibliography are included.

  3. Flood- and drought-related natural hazards activities of the U.S. Geological Survey in New England

    USGS Publications Warehouse

    Lombard, Pamela J.

    2016-03-23

    Tools for natural hazard assessment and mitigation:
    • Light detection and ranging (lidar) remote sensing technology
    • StreamStats Web-based tool for streamflow statistics
    • Flood inundation mapper

  4. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data

    PubMed Central

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J.; Intarapanich, Apichart; Tongsima, Sissades

    2017-01-01

    Background: Biochemical methods are available for enriching 5′ ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5′ ends from these data by statistical analysis of the enrichment. Although statistically based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data: the more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect the data distribution and thus algorithm performance. Results: We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase the power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5′ ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5′ ends than TSSAR. In general, the transcript 5′ ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. Conclusion: ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5′ ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied to other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER). PMID:28542466
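
    A condensed sketch of the core idea (per-position enrichment ratios, a Box-Cox transformation toward normality, and selection of upper-tail sites); it omits ToNER's replicate meta-analysis and is not the published implementation.

```python
import numpy as np
from scipy import stats

def enriched_positions(counts_enriched, counts_control, z_cutoff=3.0):
    """Flag positions whose enrichment is extreme after a Box-Cox transform.

    counts_enriched, counts_control -- per-nucleotide read counts from the
    enriched and unenriched libraries (a pseudocount avoids division by zero).
    """
    ratio = (counts_enriched + 1.0) / (counts_control + 1.0)
    transformed, _lambda = stats.boxcox(ratio)  # fit the Box-Cox lambda
    z = (transformed - transformed.mean()) / transformed.std()
    return np.where(z > z_cutoff)[0]            # candidate 5' ends

# Hypothetical coverage along a 10 kb region with a few enriched sites
rng = np.random.default_rng(3)
control = rng.poisson(20, 10000).astype(float)
enriched = rng.poisson(20, 10000).astype(float)
enriched[[1500, 4200, 8800]] += 400.0
print(enriched_positions(enriched, control))
```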

  5. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  6. Reply: Comparison of slope instability screening tools following a large storm event and application to forest management and policy

    NASA Astrophysics Data System (ADS)

    Whittaker, Kara A.; McShane, Dan

    2013-02-01

    A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions address our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.

  7. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  8. The Manipulative Complexity of Lower Paleolithic Stone Toolmaking

    PubMed Central

    Faisal, Aldo; Stout, Dietrich; Apel, Jan; Bradley, Bruce

    2010-01-01

    Background: Early stone tools provide direct evidence of human cognitive and behavioral evolution that is otherwise unavailable. Proper interpretation of these data requires a robust interpretive framework linking archaeological evidence to specific behavioral and cognitive actions. Methodology/Principal Findings: Here we employ a data glove to record manual joint angles in a modern experimental toolmaker (the 4th author) replicating ancient tool forms in order to characterize and compare the manipulative complexity of two major Lower Paleolithic technologies (Oldowan and Acheulean). To this end we used a principled and general measure of behavioral complexity based on the statistics of joint movements. Conclusions/Significance: This allowed us to confirm that previously observed differences in brain activation associated with Oldowan versus Acheulean technologies reflect higher-level behavior organization rather than lower-level differences in manipulative complexity. This conclusion is consistent with a scenario in which the earliest stages of human technological evolution depended on novel perceptual-motor capacities (such as the control of joint stiffness) whereas later developments increasingly relied on enhanced mechanisms for cognitive control. This further suggests possible links between toolmaking and language evolution. PMID:21072164

  9. [Security at work in terms of risk factors (author's transl)].

    PubMed

    Faverge, J M

    1977-09-23

    Accident epidemiology aims at determining those factors which are associated with an increased risk. Here accidents at work are considered in which any given situation involves: one or many individuals (I); one or many tasks (T); one or many machines or production tools (M); an environment (E). Eight factors are proposed for each of which one of the above components is dominant. Each factor is defined and examples are given. In addition, where applicable, the following are given: subfactors; references to studies which demonstrate the association between risk and factor; one or several possible action mechanisms; proposals enabling a quantitative evaluation to be made for statistical purposes; suggestions for prevention. The factors are: Individual disposition (I) or liability. Worker's inexperience (I). Stress (T) imposed on the worker. Recovery (T) (an exceptional task must be performed in order to regain normal work conditions). Catachresis (M) (a tool is used for an unusual purpose or a machine is required to exceed normal work load). Material wear (M) or damage. Interference (E) between partially independent processes. Insufficient information (E) concerning the state of the system.

  10. An Informed Approach to Improving Quantitative Literacy and Mitigating Math Anxiety in Undergraduates Through Introductory Science Courses

    NASA Astrophysics Data System (ADS)

    Follette, K.; McCarthy, D.

    2012-08-01

    Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.

  11. An Overview of Promising Grades of Tool Materials Based on the Analysis of their Physical-Mechanical Characteristics

    NASA Astrophysics Data System (ADS)

    Kudryashov, E. A.; Smirnov, I. M.; Grishin, D. V.; Khizhnyak, N. A.

    2018-06-01

    The work is aimed at selecting a promising grade of tool material whose physical-mechanical characteristics would allow it to be used for processing the surfaces of discontinuous parts in the presence of shock loads. An analysis of the physical-mechanical characteristics of the most common tool materials is performed, and data on the possible provision of metal-working processes with promising composite grades are presented.

  12. Constructing and Modifying Sequence Statistics for relevent Using informR in 𝖱

    PubMed Central

    Marcum, Christopher Steven; Butts, Carter T.

    2015-01-01

    The informR package greatly simplifies the analysis of complex event histories in 𝖱 by providing user friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts’ (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model fitting for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488

  13. Enhancing interest in statistics among computer science students using computer tool entrepreneur role play

    NASA Astrophysics Data System (ADS)

    Judi, Hairulliza Mohamad; Sahari @ Ashari, Noraidah; Eksan, Zanaton Hj

    2017-04-01

    Previous research in Malaysia indicates that there is a problem regarding attitudes towards statistics among students. Students did not show positive attitudes in the affective, cognitive, capability, value, interest and effort aspects, although they did well in the difficulty aspect. This issue should be given substantial attention because students' attitudes towards statistics may affect the teaching and learning process of the subject. Teaching statistics using role play is an appropriate attempt to improve attitudes to statistics, to enhance the learning of statistical techniques and statistical thinking, and to increase generic skills. The objectives of the paper are to give an overview of role play in statistics learning and to assess the effect of these activities on students' attitude and learning within an action research framework. The computer tool entrepreneur role play was conducted in a two-hour tutorial class session of first year students in the Faculty of Information Sciences and Technology (FTSM), Universiti Kebangsaan Malaysia, enrolled in a Probability and Statistics course. The results show that most students felt that they had an enjoyable and rewarding time in the role play. Furthermore, benefits and disadvantages of the role play activities are highlighted to complete the review. Role play is expected to serve as an important activity that takes into account students' experience, emotions and responses to provide useful information on how to modify students' thinking or behavior to improve learning.

  14. The Validity and Reliability of an iPhone App for Measuring Running Mechanics.

    PubMed

    Balsalobre-Fernández, Carlos; Agopyan, Hovannes; Morin, Jean-Benoit

    2017-07-01

    The purpose of this investigation was to analyze the validity of an iPhone application (Runmatic) for measuring running mechanics. To do this, 96 steps from 12 different runs at speeds ranging from 2.77 to 5.55 m·s-1 were recorded simultaneously with Runmatic and with an opto-electronic device installed on a motorized treadmill to measure the contact and aerial time of each step. Additionally, several running mechanics variables were calculated using the measured contact and aerial times and previously validated equations. Several statistics were computed to test the validity and reliability of Runmatic in comparison with the opto-electronic device for the measurement of contact time, aerial time, vertical oscillation, leg stiffness, maximum relative force, and step frequency. The running mechanics values obtained with both the app and the opto-electronic device showed a high degree of correlation (r = .94-.99, p < .001). Moreover, there was very close agreement between instruments as revealed by the ICC(2,1) (ICC = 0.965-0.991). Finally, both Runmatic and the opto-electronic device showed almost identical reliability levels when measuring each set of 8 steps for every run recorded. In conclusion, Runmatic proved to be a highly reliable tool for measuring the running mechanics studied in this work.
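
    The abstract notes that the mechanics variables are derived from contact and aerial times via previously validated equations; the sketch below uses the common sine-wave spring-mass approximation to show how such quantities can be computed, and the exact equations used by the app may differ.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def running_mechanics(contact_time, aerial_time, body_mass):
    """Spring-mass estimates from contact and aerial times (sine-wave force model).

    contact_time, aerial_time -- step contact and aerial times in seconds
    body_mass                 -- runner mass in kg
    """
    step_frequency = 1.0 / (contact_time + aerial_time)  # Hz
    f_max = body_mass * G * (math.pi / 2.0) * (aerial_time / contact_time + 1.0)
    # Downward displacement of the centre of mass during contact
    delta_z = f_max * contact_time ** 2 / (body_mass * math.pi ** 2) - G * contact_time ** 2 / 8.0
    return {"step_frequency": step_frequency,
            "max_relative_force": f_max / (body_mass * G),  # in body weights
            "vertical_displacement": delta_z,                # m
            "vertical_stiffness": f_max / delta_z}           # N/m

print(running_mechanics(contact_time=0.25, aerial_time=0.12, body_mass=70.0))
```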

  15. HC StratoMineR: A Web-Based Tool for the Rapid Analysis of High-Content Datasets.

    PubMed

    Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A

    2016-10-01

    High-content screening (HCS) can generate large multidimensional datasets and, when aligned with the appropriate data mining tools, it can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that these datasets are frequently underutilized. Here, we present HC StratoMineR, a web-based tool for high-content data analysis. It is a decision-supportive platform that guides even non-expert users through a high-content data analysis workflow. HC StratoMineR is built using MySQL for storage and querying, PHP (Hypertext Preprocessor) as the main programming language, and jQuery for additional user interface functionality. R is used for statistical calculations, logic and data visualizations. Furthermore, C++ and graphics processing unit power are embedded throughout R by using the rcpp and rpud libraries for operations that are computationally highly intensive. We show that we can use HC StratoMineR for the analysis of multivariate data from a high-content siRNA knock-down screen and a small-molecule screen. It can be used to rapidly filter out undesirable data; to select relevant data; and to perform quality control, data reduction, data exploration, morphological hit picking, and data clustering. Our results demonstrate that HC StratoMineR can be used to functionally categorize HCS hits and, thus, provide valuable information for hit prioritization.

  16. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined with the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
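
    For reference, the classical (untransformed) Menzerath-Altmann relation is commonly written y(x) = a·x^b·e^(−cx), where x is the size of the construct and y the mean size of its constituents; the sketch below fits it to hypothetical word-length data and is not the paper's generalized statistical mechanical model.

```python
import numpy as np
from scipy.optimize import curve_fit

def menzerath_altmann(x, a, b, c):
    """Classical Menzerath-Altmann relation: y = a * x**b * exp(-c * x)."""
    return a * np.power(x, b) * np.exp(-c * x)

# Hypothetical data: mean syllable length (phonemes) vs. word length (syllables)
x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([3.10, 2.70, 2.50, 2.40, 2.35, 2.30])

params, _cov = curve_fit(menzerath_altmann, x, y, p0=[3.0, -0.1, 0.01])
print("a, b, c =", params)
```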

  17. Reinventing Biostatistics Education for Basic Scientists

    PubMed Central

    Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.

    2016-01-01

    Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055

  18. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
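    As a rough illustration in generic notation (not necessarily the paper's), the two shared axioms can be written for a phase-space distribution \rho(q, p, t) with Hamiltonian H and probability current \mathbf{J}:

```latex
% Conservation of average energy and conservation of probability (continuity equation)
\frac{\mathrm{d}}{\mathrm{d}t}\langle H \rangle
  = \frac{\mathrm{d}}{\mathrm{d}t}\int H(q,p)\,\rho(q,p,t)\,\mathrm{d}q\,\mathrm{d}p = 0,
\qquad
\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{J} = 0
```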

  19. Noise Reduction in High-Throughput Gene Perturbation Screens

    USDA-ARS?s Scientific Manuscript database

    Motivation: Accurate interpretation of perturbation screens is essential for a successful functional investigation. However, the screened phenotypes are often distorted by noise, and their analysis requires specialized statistical analysis tools. The number and scope of statistical methods available...

  20. The nature and evaluation of commercial expert system building tools, revision 1

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1987-01-01

    This memorandum reviews the factors that constitute an Expert System Building Tool (ESBT) and evaluates current tools in terms of these factors. Evaluation of these tools is based on their structure and their alternative forms of knowledge representation, inference mechanisms and developer end-user interfaces. Next, functional capabilities, such as diagnosis and design, are related to alternative forms of mechanization. The characteristics and capabilities of existing commercial tools are then reviewed in terms of these criteria.

  1. Analysis of the Auditory Feedback and Phonation in Normal Voices.

    PubMed

    Arbeiter, Mareike; Petermann, Simon; Hoppe, Ulrich; Bohr, Christopher; Doellinger, Michael; Ziethe, Anke

    2018-02-01

    The aim of this study was to investigate the auditory feedback mechanisms and voice quality during phonation in response to a spontaneous pitch change in the auditory feedback. Does the pitch shift reflex (PSR) change voice pitch and voice quality? Quantitative and qualitative voice characteristics were analyzed during the PSR. Twenty-eight healthy subjects underwent transnasal high-speed video endoscopy (HSV) at 8000 fps during sustained phonation of the vowel [a]. While phonating, the subjects heard their own voice pitched up by 700 cents (the interval of a fifth) for 300 milliseconds in their auditory feedback. The electroencephalography (EEG), acoustic voice signal, electroglottography (EGG), and HSV recordings were analyzed to statistically compare feedback mechanisms between the pitched and unpitched conditions of the phonation paradigm. Furthermore, quantitative and qualitative voice characteristics were analyzed. The PSR was successfully detected within all signals of the experimental tools (EEG, EGG, acoustic voice signal, HSV). A significant increase of the perturbation measures and an increase of the values of the acoustic parameters during the PSR were observed, especially for the audio signal. The auditory feedback mechanism seems to control not only voice pitch but also aspects of voice quality.
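    For context on the stimulus size: a shift of n cents corresponds to a frequency ratio of 2^{n/1200}, so the 700-cent shift used here raises the feedback pitch by a factor of

```latex
\frac{f_{\text{shifted}}}{f_{0}} = 2^{700/1200} \approx 1.498 \approx \tfrac{3}{2}
```

    i.e. very nearly a perfect fifth.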

  2. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) Need to define strategies combining both Theoretical Tools and Experimental Methods. The main reason for this lecture is to give a flavor of where UQ and SE could contribute and hope that the broader community will work with us to improve in these areas.

  3. Peer Review Documents Related to the Evaluation of ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  4. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings from label-free proteomics data and is straightforward to apply.

  5. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  6. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  7. The use and misuse of statistical methodologies in pharmacology research.

    PubMed

    Marino, Michael J

    2014-01-01

    Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology and the quest to reach the mystical α<0.05 criterion has hampered research via the publication of incorrect analyses driven by rudimentary statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing the investigator from their own data. The net result is a decrease in quality and confidence in research findings, fueling recent controversies over the reproducibility of high profile findings and effects that appear to diminish over time. The recent development of "omics" approaches leading to the production of massive higher dimensional data sets has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests that hopefully will increase the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.

  8. A robot end effector exchange mechanism for space applications

    NASA Technical Reports Server (NTRS)

    Gorin, Barney F.

    1990-01-01

    Efficient robot operation requires the use of specialized end effectors or tools for specific tasks. In spacecraft applications, the microgravity environment precludes the use of gravitational forces to retain the tools in a holding fixture. As a result of this, a retention mechanism which forms a part of the tool storage container is required. A unique approach to this problem has resulted in the development of an end effector exchange mechanism that meets the requirements for spaceflight applications while avoiding the complexity usually involved. This mechanism uses multiple latching cams both on the manipulator and in the tool storage container, combined with a system of catch rings to provide retention in both locations and the required failure tolerance. Because of the cam configuration, the mechanism operates passively, requiring no electrical commands except those needed to move the manipulator into position. Similarly, it inherently provides interlocks to prevent the release of one cam before its opposite number is engaged.

  9. The Use of Chest Computed Tomographic Angiography in Blunt Trauma Pediatric Population.

    PubMed

    Hasadia, Rabea; DuBose, Joseph; Peleg, Kobi; Stephenson, Jacob; Givon, Adi; Kessel, Boris

    2018-02-05

    Blunt chest trauma in children is common. Although rare, associated major thoracic vascular injuries (TVIs) are lethal potential sequelae of these mechanisms. The preferred study for definitive diagnosis of TVI in stable patients is computed tomographic angiography imaging of the chest. This imaging modality is, however, associated with high doses of ionizing radiation that represent significant carcinogenic risk for pediatric patients. The aim of the present investigation was to define the incidence of TVI among blunt pediatric trauma patients in an effort to better elucidate the usefulness of computed tomographic angiography in this population. A retrospective cohort study was conducted including all blunt pediatric (age < 14 y) trauma victims registered in the Israeli National Trauma Registry maintained by the Gertner Institute for Epidemiology and Health Policy Research between the years 1997 and 2015. Data collected included age, sex, mechanism of injury, Glasgow Coma Scale, Injury Severity Score, and incidence of named chest vessel injuries. Statistical analysis was performed using SAS statistical software version 9.2 (SAS Institute Inc, Cary, NC). Among 433,325 blunt trauma victims, 119,821 patients were younger than 14 years. Twelve (0.01%; 12/119,821) of these children were diagnosed with TVI. The most common mechanism in this group was a pedestrian struck by a car. Mortality was 41.7% (5/12). Thoracic vascular injury is exceptionally rare among pediatric blunt trauma victims but does contribute to the high morbidity and mortality seen with blunt chest trauma. Computed tomographic angiography, with its associated radiation exposure risk, should not be used as a standard tool after trauma in injured children. Clinical protocols are needed in this population to minimize radiation risk while allowing prompt identification of life-threatening injuries.

  10. Crowdsourcing awareness: exploration of the ovarian cancer knowledge gap through Amazon Mechanical Turk.

    PubMed

    Carter, Rebecca R; DiFeo, Analisa; Bogie, Kath; Zhang, Guo-Qiang; Sun, Jiayang

    2014-01-01

    Ovarian cancer is the most lethal gynecologic disease in the United States, with more women dying from this cancer than from all other gynecological cancers combined. Ovarian cancer has been termed the "silent killer" because some patients do not show clear symptoms at an early stage. Currently, there is a lack of approved and effective early diagnostic tools for ovarian cancer. There is also an apparent severe knowledge gap of ovarian cancer in general and of its indicative symptoms among both the public and many health professionals. These factors have significantly contributed to the late stage diagnosis of most ovarian cancer patients (63% are diagnosed at Stage III or above), where the 5-year survival rate is less than 30%. The extent of this knowledge gap in the United States is unknown. The present investigation examined current public awareness and knowledge about ovarian cancer. The study implemented design strategies to develop an unbiased survey with quality control measures, including the modern application of multiple statistical analyses. The survey assessed a reasonable proxy of the US population by crowdsourcing participants through the online task marketplace Amazon Mechanical Turk, at a fraction of the cost and time required by traditional recruitment methods. Knowledge of ovarian cancer was compared to that of breast cancer using repeated measures, bias control and other quality control measures in the survey design. Analyses included multinomial logistic regression and categorical data analysis procedures such as correspondence analysis, among other statistics. We confirmed the relatively poor public knowledge of ovarian cancer among the US population. The simple, yet novel design should set an example for designing surveys to obtain quality data via Amazon Mechanical Turk with the associated analyses.

  11. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of the hand-held tool serves to regulate the speed of a fastener interface mechanism and the torque applied to fasteners by that mechanism, and to monitor the operating parameters of the tool. The control is embodied in in-tool software running on a processor within the tool, which also communicates with remote software. An operator can run the tool directly or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.

  12. Electromechanical Coupling Factor of Breast Tissue as a Biomarker for Breast Cancer.

    PubMed

    Park, Kihan; Chen, Wenjin; Chekmareva, Marina A; Foran, David J; Desai, Jaydev P

    2018-01-01

    This research aims to validate a new biomarker of breast cancer by introducing the electromechanical coupling factor of breast tissue samples as a possible additional indicator of breast cancer. Since collagen fibrils exhibit a structural organization that gives rise to a piezoelectric effect, the difference in collagen density between normal and cancerous tissue can be captured by identifying the corresponding electromechanical coupling factor. The design of a portable diagnostic tool and a microelectromechanical systems (MEMS)-based biochip, which is integrated with a piezoresistive sensing layer for measuring the reaction force as well as a microheater for temperature control, is introduced. To verify that the electromechanical coupling factor can be used as a biomarker for breast cancer, the piezoelectric model for breast tissue is described with preliminary experimental results on five sets of normal and invasive ductal carcinoma (IDC) samples in the 25-45 °C temperature range. While the stiffness of breast tissues can be captured as a representative mechanical signature which allows one to discriminate among tissue types, especially in the higher strain region, the electromechanical coupling factor shows more distinct differences between the normal and IDC groups over the entire strain region than the mechanical signature. From the two-sample t-test, the electromechanical coupling factor under compression shows statistically significant differences (p = 0.0039) between the two groups. The increase in collagen density in breast tissue is an objective and reproducible characteristic of breast cancer. Although characterization of mechanical tissue properties has been shown to be useful for differentiating cancerous tissue from normal tissue, using a single parameter may not be sufficient for practical usage due to inherent variation among biological samples. The portable breast cancer diagnostic tool reported in this manuscript shows the feasibility of measuring multiple parameters of breast tissue allowing for practical application.
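    For reference, the electromechanical coupling factor k of a piezoelectric material is conventionally defined from the piezoelectric charge constant d, the elastic compliance s^E at constant electric field, and the permittivity \varepsilon^T at constant stress (a standard textbook definition, not the device-specific model used in the study):

```latex
k^{2} = \frac{d^{2}}{s^{E}\,\varepsilon^{T}}
```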

  13. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    USGS Publications Warehouse

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, then we interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the future use of these commonly available ecological and statistical methods in preparing assemblage data for use in ecological indicators.
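    A minimal sketch of the convergence check described above, assuming a hypothetical site-by-species matrix (assemblage.csv) and using PCA as a stand-in ordination for DCA/NMDS; the file name, group count, and method choices are illustrative only:

```python
# Sketch: derive site groupings from two methods and quantify their agreement
# before attempting any ecological (guild-based) interpretation.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

counts = pd.read_csv("assemblage.csv", index_col=0)   # rows = sites, columns = species
presence = (counts > 0).astype(int)                   # presence/absence matrix

# Grouping 1: hierarchical cluster analysis of presence/absence data.
cl_groups = fcluster(linkage(presence.to_numpy(), method="ward"),
                     t=3, criterion="maxclust")

# Grouping 2: k-means on ordination scores.
scores = PCA(n_components=2).fit_transform(presence)
km_groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# Convergence check: high agreement suggests a grouping worth interpreting ecologically.
print("Adjusted Rand index:", adjusted_rand_score(cl_groups, km_groups))
```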

  14. Probability and Statistics in Sensor Performance Modeling

    DTIC Science & Technology

    2010-12-01

    Describes the Environmental Awareness for Sensor and Emitter Employment (EASEE) software, a decision-support tool (DST) for modeling sensor performance. The report covers important numerical issues in the implementation and the statistical analysis used to measure sensor performance, including cumulative and complementary cumulative distribution functions.

  15. Concept Maps in Introductory Statistics

    ERIC Educational Resources Information Center

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.

  16. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
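    The two statistical cues contrasted in these simulations can be illustrated directly from a syllable stream: the raw frequency of a bigram and its forward transitional probability P(Y|X) = count(XY)/count(X) can dissociate. A toy sketch with made-up syllables:

```python
# Toy illustration: bigram frequency versus forward transitional probability (TP).
import random
from collections import Counter

random.seed(0)
# Hypothetical stream of three tri-syllabic "words" concatenated in random order.
words = ["go la bu", "pa do ti", "bi da ku"]
stream = " ".join(random.choices(words, k=90)).split()

unigrams = Counter(stream)
bigrams = Counter(zip(stream, stream[1:]))

def tp(x, y):
    """Forward transitional probability P(y | x)."""
    return bigrams[(x, y)] / unigrams[x]

# Within-word transitions have TP near 1; transitions across word boundaries have
# TP near 1/3, even though boundary bigrams can still be frequent in absolute terms.
print("TP(go -> la):", round(tp("go", "la"), 2))
print("TP(bu -> pa):", round(tp("bu", "pa"), 2))
print("count(bu, pa):", bigrams[("bu", "pa")])
```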

  17. The Effect on Prospective Teachers of the Learning Environment Supported by Dynamic Statistics Software

    ERIC Educational Resources Information Center

    Koparan, Timur

    2016-01-01

    In this study, the effect on the achievement and attitudes of prospective teachers is examined. With this aim ahead, achievement test, attitude scale for statistics and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study…

  18. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
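    NESM is built on the Tsallis entropy rather than the Boltzmann-Gibbs form; for a discrete probability distribution p_i and entropic index q it reads (a standard expression quoted here for orientation, recovering the Boltzmann-Gibbs entropy as q approaches 1):

```latex
S_q = k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k_B \sum_i p_i \ln p_i
```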

  19. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  20. LANDSCAPE ASSESSMENT TOOLS FOR WATERSHED CHARACTERIZATION

    EPA Science Inventory

    A combination of process-based, empirical and statistical models has been developed to assist states in their efforts to assess water quality, locate impairments over large areas, and calculate TMDL allocations. By synthesizing outputs from a number of these tools, LIPS demonstr...

  1. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  2. Statistical Analysis on the Mechanical Properties of Magnesium Alloys

    PubMed Central

    Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng

    2017-01-01

    Knowledge of statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of magnesium alloys for mechanical performance remain poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum and steels, confirming the very high reliability of magnesium alloys. The high predictability in the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
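    The Weibull analysis referred to above characterizes the scatter in strength \sigma through the standard two-parameter cumulative failure probability, where \sigma_0 is a scale parameter and m is the Weibull modulus (a higher m implies less scatter and hence higher reliability):

```latex
P_f(\sigma) = 1 - \exp\!\left[-\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
```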

  3. Tool wear mechanisms in turning titanium-aluminum-vanadium using tungsten carbide and polycrystalline diamond inserts

    NASA Astrophysics Data System (ADS)

    Schrock, David James

    The objective of this work is to identify some of the tool wear mechanisms at the material level for the machining of titanium and to provide some understanding of these mechanisms for use in physics based tool wear models. Turning experiments were conducted at cutting speeds of 61m/min, 91m/min, and 122m/min on Ti-6Al-4V, an alloy of titanium, using two different grades of tungsten carbide cutting inserts and one grade of polycrystalline diamond inserts. Three-dimensional wear data and two-dimensional wear profiles of the rake face were generated using Confocal Laser Scanning Microscopy to quantify the tool wear mechanisms. Additionally, the microstructure of the deformed work material (chip) and un-deformed parent material (work piece) were studied using Orientation Imaging Microscopy (OIM). Observations from tool wear studies on the PCD inserts revealed the presence of two fundamentally different wear mechanisms operating at the different cutting speeds. Microstructural analyses of the chip and the work material showed phase dependent tool wear mechanisms for machining titanium. There is a high likelihood of phase change occurring in the work material during machining, with a transformation from the alpha phase to the beta phase. The observed dramatic increase in wear is attributed to a combination of increased diffusivity in the beta phase of the titanium alloy in conjunction with a higher degree of recrystallization of the prior beta phase upon cooling. Results of other observations such as the influence of carbide grain size on tool wear are also discussed.

  4. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring quality.

  6. Knowing More than Words Can Say: Using Multimodal Assessment Tools to Excavate and Construct Knowledge about Wolves

    ERIC Educational Resources Information Center

    O'Byrne, Barbara

    2009-01-01

    The purpose of this study was to investigate how multimodal assessment tools enabled Grade 2 students to show knowledge and understanding of wolves. The research design was a case study across three years employing descriptive statistics to portray student knowledge and understanding associated with the use of each tool. The findings indicate that…

  7. Investigation of machinability characteristics on EN47 steel for cutting force and tool wear using optimization technique

    NASA Astrophysics Data System (ADS)

    M, Vasu; Shivananda Nayaka, H.

    2018-06-01

    In this experimental work, a dry turning process carried out on EN47 spring steel with a coated tungsten carbide tool insert of 0.8 mm nose radius is optimized using a statistical technique. Experiments were conducted at three different cutting speeds (625, 796 and 1250 rpm) with three different feed rates (0.046, 0.062 and 0.093 mm/rev) and depths of cut (0.2, 0.3 and 0.4 mm). The experiments follow a 3^3 full factorial design (FFD): three factors at three levels each. Analysis of variance is used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having the least significance. The optimum machining condition for cutting force was obtained from the statistical technique. Tool wear measurements were performed at the optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev. The minimum tool wear observed was 0.086 mm after 5 min of machining. Analysis of tool wear with a confocal microscope showed that tool wear increases with increasing cutting time.
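    A minimal sketch of the analysis-of-variance step for such a 3^3 full factorial turning experiment, assuming a hypothetical turning.csv with one row per run; the file and column names (speed, feed, doc for depth of cut, force) are placeholders, not taken from the study:

```python
# Sketch: ANOVA on a 3^3 full factorial experiment to rank factor significance.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.read_csv("turning.csv")   # columns: speed, feed, doc, force

# Treat each factor as categorical (three levels each) and fit a main-effects model.
model = ols("force ~ C(speed) + C(feed) + C(doc)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # sums of squares, F and p per factor
print(anova_table)

# The factor with the largest F statistic (smallest p value) dominates the response;
# interaction terms such as C(speed):C(feed) can be added to the formula if needed.
```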

  8. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

  9. GARNET--gene set analysis with exploration of annotation relations.

    PubMed

    Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu

    2011-02-15

    Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties towards biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation database, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules--gene set manager, gene set analysis and gene set retrieval, which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation network has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
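    The pairwise kappa statistic that GARNET uses to relate annotation gene sets can be sketched by treating each set as a binary membership vector over a common gene universe and computing Cohen's kappa on the resulting agreement table; the gene identifiers below are placeholders:

```python
# Sketch: kappa statistic between two annotation gene sets over a shared gene universe.
from sklearn.metrics import cohen_kappa_score

universe = [f"gene{i}" for i in range(1000)]   # placeholder gene universe
set_a = set(universe[0:120])                   # e.g. members of a GO term
set_b = set(universe[60:200])                  # e.g. members of a pathway

membership_a = [gene in set_a for gene in universe]
membership_b = [gene in set_b for gene in universe]

# Kappa corrects the raw overlap for the agreement expected by chance alone.
print("kappa:", round(cohen_kappa_score(membership_a, membership_b), 3))
```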

  10. Detecting differential DNA methylation from sequencing of bisulfite converted DNA of diverse species.

    PubMed

    Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V

    2017-07-21

    DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
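    At its simplest, a DML test compares methylated and unmethylated read counts at a single cytosine between two conditions; a Fisher's exact test on the 2x2 count table is a common baseline (sketched below with made-up counts, not as a substitute for the dedicated tools discussed above, which also model biological replication and coverage):

```python
# Sketch: per-site differential methylation test on bisulfite sequencing read counts.
from scipy.stats import fisher_exact

# Hypothetical read counts at one CpG site: (methylated, unmethylated).
condition_1 = (45, 5)
condition_2 = (20, 30)

odds_ratio, p_value = fisher_exact([list(condition_1), list(condition_2)])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```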

  11. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  12. Australian Apprentice & Trainee Statistics: Mechanical Engineering and Fabrication Trades, 1995-1999. Australian Vocational Education & Training.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    Statistics regarding Australians participating in apprenticeships and traineeships in the mechanical engineering and fabrication trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: mechanical engineering trades; fabrication engineering…

  13. Statistical-Mechanical Studies of the Collective Binding of Proteins to DNA

    NASA Astrophysics Data System (ADS)

    Zhang, Houyin

    My dissertation work focuses on the microscopic statistical-mechanical studies of DNA-protein interactions and mainly comprises three projects. In living cells, binding of proteins to DNA controls gene expression and packaging of the genome. Single-DNA stretching and twisting experiments provide a powerful tool to detect binding of proteins, via detection of their modification of DNA mechanical properties. However, it is often difficult or impossible to determine the numbers of proteins bound in such experiments, especially when the proteins interact nonspecifically with DNA. In the first project, we developed single-molecule versions of classical thermodynamic Maxwell relations and proposed that these relations could be used to measure DNA-bound protein numbers, changes in DNA double-helix torque with force, and many other quantities which are hard to directly measure. This approach does not need any theoretical assumptions beyond the existence of thermodynamic equilibrium and has been used in single-DNA experiments. Many single-molecule experiments associated with DNA-bending proteins suggest the existence of cooperative interactions between adjacent DNA-bound proteins. In the second project, we studied a statistical-mechanical worm-like chain model including binding cooperativity effects and found that the intrinsic cooperativity of binding sharpens force-extension curves and enhances fluctuations of extension and protein occupation. This model also allows us to estimate the intrinsic cooperativity in experiments. We also analyzed force-generated cooperativity and found that the related interaction between proteins is always attractive. This suggests that tension in DNA in vivo could alter the distribution of proteins bound along DNA, causing chromosome refolding, or changes in gene expression. In the third project, we investigated the correlations along DNA-protein complexes. We found there are two different correlation lengths connected to the geometry of DNA bending: the shorter “longitudinal” correlation length ξ∥(f, μ) and the longer “transverse” correlation length ξ⊥(f, μ). In the high-force limit, ξ∥(f, μ) = ξ⊥(f, μ)/2 = A/4bf. Surprisingly, the range of the interaction between DNA-bending proteins is controlled by the second-longest correlation length. The effect arises from the protein-bend contribution to the Hamiltonian having an axial rotational symmetry which eliminates its coupling to the transverse bending fluctuations.
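    For orientation, the bare worm-like chain that these protein-binding models extend is usually summarized by the Marko-Siggia force-extension relation, with A the persistence length and x/L the relative extension (a standard result quoted here for context; the dissertation's cooperative-binding model modifies this baseline):

```latex
\frac{f A}{k_B T} = \frac{1}{4\left(1 - x/L\right)^{2}} - \frac{1}{4} + \frac{x}{L}
```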

  14. Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online

    PubMed Central

    Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary

    2018-01-01

    Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574

  15. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.

    PubMed

    Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.
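    As a minimal illustration of the screening-design workflow the review describes (factor names and response values below are invented, not from any cited case study): build a two-factor, two-level full factorial with center points and fit a main-effects-plus-interaction model by least squares.

```python
# Sketch: two-level full factorial with center points, fit by ordinary least squares.
import itertools
import numpy as np

# Coded levels (-1, +1) for two hypothetical factors (e.g. spray voltage, flow rate),
# plus three center-point runs.
design = np.array(list(itertools.product([-1, 1], repeat=2)) + [(0, 0)] * 3, dtype=float)

# Hypothetical measured responses (e.g. signal intensity), one per design run.
y = np.array([52.0, 61.0, 48.0, 70.0, 58.0, 57.5, 58.5])

x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])   # intercept, mains, interaction

coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, value in zip(["intercept", "x1", "x2", "x1*x2"], coefficients):
    print(f"{name:>9}: {value:+.2f}")
```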

  16. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments

    PubMed Central

    Hecht, Elizabeth S.; Oberg, Ann L.; Muddiman, David

    2016-01-01

    SUMMARY Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as “design of experiments” (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes three years after the latest DOE review (Hibbert DB 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided. PMID:26951559

  17. From time-series to complex networks: Application to the cerebrovascular flow patterns in atrial fibrillation

    NASA Astrophysics Data System (ADS)

    Scarsoglio, Stefania; Cazzato, Fabio; Ridolfi, Luca

    2017-09-01

    A network-based approach is presented to investigate the cerebrovascular flow patterns during atrial fibrillation (AF) with respect to normal sinus rhythm (NSR). AF, the most common cardiac arrhythmia with faster and irregular beating, has been recently and independently associated with the increased risk of dementia. However, the underlying hemodynamic mechanisms relating the two pathologies remain mainly undetermined so far; thus, the contribution of modeling and refined statistical tools is valuable. Pressure and flow rate temporal series in NSR and AF are here evaluated along representative cerebral sites (from carotid arteries to capillary brain circulation), exploiting reliable artificially built signals recently obtained from an in silico approach. The complex network analysis evidences, in a synthetic and original way, a dramatic signal variation towards the distal/capillary cerebral regions during AF, which has no counterpart in NSR conditions. At the large artery level, networks obtained from both AF and NSR hemodynamic signals exhibit elongated and chained features, which are typical of pseudo-periodic series. These aspects are almost completely lost towards the microcirculation during AF, where the networks are topologically more circular and present random-like characteristics. As a consequence, all the physiological phenomena at the microcerebral level ruled by periodicity—such as regular perfusion, mean pressure per beat, and average nutrient supply at the cellular level—can be strongly compromised, since the AF hemodynamic signals assume irregular behaviour and random-like features. Through a powerful approach which is complementary to the classical statistical tools, the present findings further strengthen the potential link between AF hemodynamic and cognitive decline.
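    The paper's specific network construction is not reproduced here, but one widely used mapping from a time series to a complex network is the natural visibility graph: two samples are linked whenever the straight line between them passes above every intermediate sample. A brute-force sketch on a pseudo-periodic toy signal:

```python
# Sketch: natural visibility graph of a time series (brute-force O(n^2) construction).
import numpy as np

def visibility_edges(y):
    """Return edges (i, j) where samples i and j 'see' each other over all points between."""
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

# Hypothetical pseudo-periodic signal standing in for a pressure or flow-rate series.
t = np.linspace(0, 4 * np.pi, 60)
signal = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

edges = visibility_edges(signal)
degrees = np.bincount(np.array(edges).ravel(), minlength=len(signal))
print("edges:", len(edges), "max degree:", degrees.max())
```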

  18. Effect of multiple forming tools on geometrical and mechanical properties in incremental sheet forming

    NASA Astrophysics Data System (ADS)

    Wernicke, S.; Dang, T.; Gies, S.; Tekkaya, A. E.

    2018-05-01

    The tendency to a higher variety of products requires economical manufacturing processes suitable for the production of prototypes and small batches. In the case of complex hollow-shaped parts, single point incremental forming (SPIF) represents a highly flexible process. The flexibility of this process comes along with a very long process time. To decrease the process time, a new incremental forming approach with multiple forming tools is investigated. The influence of two incremental forming tools on the resulting mechanical and geometrical component properties compared to SPIF is presented. Sheets made of EN AW-1050A were formed to frustums of a pyramid using different tool-path strategies. Furthermore, several variations of the tool-path strategy are analyzed. A time saving between 40% and 60% was observed depending on the tool-path and the radii of the forming tools while the mechanical properties remained unchanged. This knowledge can increase the cost efficiency of incremental forming processes.

  19. `New insight into statistical hydrology' preface to the special issue

    NASA Astrophysics Data System (ADS)

    Kochanek, Krzysztof

    2018-04-01

    Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international Statistical Hydrology (StaHy) 2017 workshop took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations proposed to publish their research results in the Special Issue of Acta Geophysica, `New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.

  20. The Physiological Mechanisms of Effect of Vitamins and Amino Acids on Tendon and Muscle Healing: A Systematic Review.

    PubMed

    Tack, Christopher; Shorthouse, Faye; Kass, Lindsy

    2018-05-01

    To evaluate the current literature via systematic review to ascertain whether amino acids/vitamins provide any influence on musculotendinous healing and if so, by which physiological mechanisms. EBSCO, PubMed, ScienceDirect, Embase Classic/Embase, and MEDLINE were searched using terms including "vitamins," "amino acids," "healing," "muscle," and "tendon." The primary search had 479 citations, of which 466 were excluded predominantly due to nonrandomized design. Randomized human and animal studies investigating all supplement types/forms of administration were included. Critical appraisal of internal validity was assessed using the Cochrane Risk of Bias Tool or the Systematic Review Centre for Laboratory Animal Experimentation Risk of Bias Tool for human and animal studies, respectively. Two reviewers performed dual data extraction. Twelve studies met criteria for inclusion: eight examined tendon healing and four examined muscle healing. All studies used animal models, except two human trials using a combined integrator. Narrative synthesis was performed via content analysis of demonstrated statistically significant effects and thematic analysis of proposed physiological mechanisms of intervention. Vitamin C/taurine demonstrated indirect effects on tendon healing through antioxidant activity. Vitamin A/glycine showed direct effects on extracellular matrix tissue synthesis. Vitamin E shows an antiproliferative influence on collagen deposition. Leucine directly influences signaling pathways to promote muscle protein synthesis. Preliminary evidence exists, demonstrating that vitamins and amino acids may facilitate multilevel changes in musculotendinous healing; however, recommendations on clinical utility should be made with caution. All animal studies and one human study showed high risk of bias with moderate interobserver agreement (k = 0.46). Currently, there is limited evidence to support the use of vitamins and amino acids for musculotendinous injury. Both high-quality animal experimentation of the proposed mechanisms confirming the physiological influence of supplementation and human studies evaluating effects on tissue morphology and biochemistry are required before practical application.

  1. Platform of integrated tools to support environmental studies and management of dredging activities.

    PubMed

    Feola, Alessandra; Lisi, Iolanda; Salmeri, Andrea; Venti, Francesco; Pedroncini, Andrea; Gabellini, Massimo; Romano, Elena

    2016-01-15

    Dredging activities can cause environmental impacts due to, among other factors, the increase of the Suspended Solid Concentration (SSC) and their subsequent dispersion and deposition (DEP) far from the dredging point. The dynamics of the resulting dredging plume can strongly differ in spatial and temporal evolution. This evolution, for both conventional mechanical and hydraulic dredges, depends on the different mechanisms of sediment release in the water column and the site-specific environmental conditions. Several numerical models are currently in use to simulate the dredging plume dynamics. Model results can be analysed to study dispersion and advection processes at different depths and distances from the dredging source. Usually, scenarios with frequent and extreme meteomarine conditions are chosen and extreme values of parameters (i.e. maximum intensity or total duration) are evaluated for environmental assessment. This paper presents a flexible, consistent and integrated methodological approach. Statistical parameters and indexes are derived from the analysis of SSC and DEP simulated time-series to numerically estimate their spatial (vertical and horizontal) and seasonal variability, thereby allowing a comparison of the effects of hydraulic and mechanical dredges. Events that exceed defined thresholds are described in terms of magnitude, duration and frequency. A new integrated index combining these parameters, SSCnum, is proposed for environmental assessment. Maps representing the proposed parameters allow direct comparison of effects due to different (mechanical and hydraulic) dredges at progressive distances from the dredging zone. Results can contribute towards identification and assessment of the potential environmental effects of a proposed dredging project. A suitable evaluation of alternative technical choices, appropriate mitigation, management and monitoring measures is allowed in this framework. Environmental Risk Assessment and Decision Support Systems (DSS) may take advantage of the proposed tool. The approach is applied to a hypothetical dredging project in the Augusta Harbour (Eastern coast of Sicily Island-Italy). Copyright © 2015 Elsevier Ltd. All rights reserved.
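    A sketch of the threshold-exceedance statistics that underlie such an index, computed here on a synthetic SSC series at a single receptor point; the threshold value, units, and series length are placeholders rather than values from the paper:

```python
# Sketch: magnitude, duration and frequency of threshold exceedances in an SSC series.
import numpy as np

rng = np.random.default_rng(1)
ssc = np.clip(rng.normal(20.0, 10.0, size=24 * 30), 0.0, None)   # hourly SSC (mg/L), 30 days
threshold = 35.0                                                  # placeholder limit (mg/L)

events = []   # one (duration_hours, peak_excess) entry per contiguous exceedance event
duration, peak = 0, 0.0
for value in ssc:
    if value > threshold:
        duration += 1
        peak = max(peak, value - threshold)
    elif duration:
        events.append((duration, peak))
        duration, peak = 0, 0.0
if duration:
    events.append((duration, peak))

print("frequency (number of events):", len(events))
print("mean duration (h):", round(float(np.mean([d for d, _ in events])), 2))
print("max magnitude (mg/L above threshold):", round(max(p for _, p in events), 2))
```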

  2. A multiscale model for charge inversion in electric double layers

    NASA Astrophysics Data System (ADS)

    Mashayak, S. Y.; Aluru, N. R.

    2018-06-01

    Charge inversion is a widely observed phenomenon. It is a result of the rich statistical mechanics of the molecular interactions between ions, solvent, and charged surfaces near electric double layers (EDLs). Electrostatic correlations between ions and hydration interactions between ions and water molecules play a dominant role in determining the distribution of ions in EDLs. Due to highly polar nature of water, near a surface, an inhomogeneous and anisotropic arrangement of water molecules gives rise to pronounced variations in the electrostatic and hydration energies of ions. Classical continuum theories fail to accurately describe electrostatic correlations and molecular effects of water in EDLs. In this work, we present an empirical potential based quasi-continuum theory (EQT) to accurately predict the molecular-level properties of aqueous electrolytes. In EQT, we employ rigorous statistical mechanics tools to incorporate interatomic interactions, long-range electrostatics, correlations, and orientation polarization effects at a continuum-level. Explicit consideration of atomic interactions of water molecules is both theoretically and numerically challenging. We develop a systematic coarse-graining approach to coarse-grain interactions of water molecules and electrolyte ions from a high-resolution atomistic scale to the continuum scale. To demonstrate the ability of EQT to incorporate the water orientation polarization, ion hydration, and electrostatic correlations effects, we simulate confined KCl aqueous electrolyte and show that EQT can accurately predict the distribution of ions in a thin EDL and also predict the complex phenomenon of charge inversion.

  3. Fit reduced GUTS models online: From theory to practice.

    PubMed

    Baudrot, Virgile; Veber, Philippe; Gence, Guillaume; Charles, Sandrine

    2018-05-20

    Mechanistic modeling approaches, such as the toxicokinetic-toxicodynamic (TKTD) framework, are promoted by international institutions such as the European Food Safety Authority and the Organization for Economic Cooperation and Development to assess the environmental risk of chemical products generated by human activities. TKTD models can encompass a large set of mechanisms describing the kinetics of compounds inside organisms (e.g., uptake and elimination) and their effect at the level of individuals (e.g., damage accrual, recovery, and death mechanism). Compared to classical dose-response models, TKTD approaches have many advantages, including accounting for temporal aspects of exposure and toxicity, considering data points all along the experiment and not only at the end, and making predictions for untested situations such as realistic exposure scenarios. Among TKTD models, the general unified threshold model of survival (GUTS) is among the most recent and innovative frameworks but is still underused in practice, especially by risk assessors, because specialist programming and statistical skills are necessary to run it. Making GUTS models easier to use through a new module freely available from the web platform MOSAIC (standing for MOdeling and StAtistical tools for ecotoxICology) should promote GUTS operability in support of the daily work of environmental risk assessors. This paper presents the main features of MOSAIC_GUTS: uploading of the experimental data, GUTS fitting analysis, and LCx estimates with their uncertainty. These features will be exemplified from literature data. Integr Environ Assess Manag 2018;00:000-000. © 2018 SETAC.

  4. Realizing the Potential of Mobile Mental Health: New Methods for New Data in Psychiatry

    PubMed Central

    Staples, Patrick; Onnela, Jukka-Pekka

    2015-01-01

    Smartphones are now ubiquitous and can be harnessed to offer psychiatry a wealth of real-time data regarding patient behavior, self-reported symptoms, and even physiology. The data collected from smartphones meet the three criteria of big data: velocity, volume, and variety. Although these data have tremendous potential, transforming them into clinically valid and useful information requires using new tools and methods as a part of assessment in psychiatry. In this paper, we introduce and explore numerous analytical methods and tools from the computational and statistical sciences that appear readily applicable to psychiatric data collected using smartphones. By matching smartphone data with appropriate statistical methods, psychiatry can better realize the potential of mobile mental health and empower both patients and providers with novel clinical tools. PMID:26073363

  5. Bibliometric indexes, databases and impact factors in cardiology

    PubMed Central

    Bienert, Igor R C; de Oliveira, Rogério Carvalho; de Andrade, Pedro Beraldo; Caramori, Carlos Antonio

    2015-01-01

    Bibliometry is a quantitative statistical technique to measure levels of production and dissemination of knowledge, as well as a useful tool to track the development of a scientific area. The valuation of production required for recognition of researchers and journals is accomplished through tools called bibliometric indexes, divided into indicators of quality and of scientific impact. Initially developed as statistical measures for monographs, especially in libraries, bibliometrics is today mainly used to evaluate the productivity of authors and the repercussion of their citations. However, these tools have limitations and sometimes provoke controversy over indiscriminate application, leading to the development of newer indexes. It is important for researchers to know the most common indexes and to use them properly, while acknowledging their limitations, because these measures have a direct impact on daily practice, reputation and the securing of funding. PMID:26107458

  6. Automated subtyping of HIV-1 genetic sequences for clinical and surveillance purposes: performance evaluation of the new REGA version 3 and seven other tools.

    PubMed

    Pineda-Peña, Andrea-Clemencia; Faria, Nuno Rodrigues; Imbrechts, Stijn; Libin, Pieter; Abecasis, Ana Barroso; Deforche, Koen; Gómez-López, Arley; Camacho, Ricardo J; de Oliveira, Tulio; Vandamme, Anne-Mieke

    2013-10-01

    To investigate differences in pathogenesis, diagnosis and resistance pathways between HIV-1 subtypes, an accurate subtyping tool for large datasets is needed. We aimed to evaluate the performance of automated subtyping tools to classify the different subtypes and circulating recombinant forms using pol, the most sequenced region in clinical practice. We also present the upgraded version 3 of the Rega HIV subtyping tool (REGAv3). HIV-1 pol sequences (PR+RT) for 4674 patients retrieved from the Portuguese HIV Drug Resistance Database, and 1872 pol sequences trimmed from full-length genomes retrieved from the Los Alamos database, were classified with statistical-based tools such as COMET, jpHMM and STAR; similarity-based tools such as NCBI and Stanford; and phylogenetic-based tools such as REGA version 2 (REGAv2), REGAv3, and SCUEAL. The performance of these tools, for pol, and for PR and RT separately, was compared in terms of reproducibility, sensitivity and specificity with respect to the gold standard, which was manual phylogenetic analysis of the pol region. The sensitivity and specificity for subtypes B and C were more than 96% for seven tools, but were variable for other subtypes such as A, D, F and G. With regard to the most common circulating recombinant forms (CRFs), the sensitivity and specificity for CRF01_AE was ~99% with statistical-based tools, with phylogenetic-based tools and with Stanford, one of the similarity-based tools. CRF02_AG was correctly identified in more than 96% of cases by COMET, REGAv3, Stanford and STAR. All the tools reached a specificity of more than 97% for most of the subtypes and the two main CRFs (CRF01_AE and CRF02_AG). Other CRFs were identified only by COMET, REGAv2, REGAv3, and SCUEAL, and with variable sensitivity. When analyzing sequences for PR and RT separately, the performance for PR was generally lower and variable between the tools. Similarity and statistical-based tools were 100% reproducible, but this was lower for phylogenetic-based tools such as REGA (~99%) and SCUEAL (~96%). REGAv3 had an improved performance for subtype B and CRF02_AG compared to REGAv2 and is now able to also identify all epidemiologically relevant CRFs. In general, the best-performing tools, in alphabetical order, were COMET, jpHMM, REGAv3, and SCUEAL when analyzing pure subtypes in the pol region, and COMET and REGAv3 when analyzing most of the CRFs. Based on this study, we recommend confirming subtyping with two well-performing tools and being cautious with the interpretation of short sequences. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
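
    The performance figures above are per-subtype sensitivities and specificities against a manual-phylogeny gold standard. The following sketch (Python; the subtype labels are purely hypothetical) shows one way such per-class statistics can be computed from paired gold-standard and tool assignments.

      def per_class_sens_spec(truth, predicted):
          """Per-subtype sensitivity and specificity against a gold standard."""
          stats = {}
          for c in sorted(set(truth)):
              tp = sum(t == c and p == c for t, p in zip(truth, predicted))
              fn = sum(t == c and p != c for t, p in zip(truth, predicted))
              tn = sum(t != c and p != c for t, p in zip(truth, predicted))
              fp = sum(t != c and p == c for t, p in zip(truth, predicted))
              stats[c] = (tp / (tp + fn) if tp + fn else float("nan"),
                          tn / (tn + fp) if tn + fp else float("nan"))
          return stats

      # Hypothetical gold-standard (manual phylogeny) and tool assignments.
      truth     = ["B", "B", "C", "CRF02_AG", "A", "B", "C", "CRF01_AE"]
      predicted = ["B", "B", "C", "B",        "A", "B", "C", "CRF01_AE"]
      for subtype, (sens, spec) in per_class_sens_spec(truth, predicted).items():
          print(f"{subtype:10s} sensitivity={sens:.2f} specificity={spec:.2f}")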

  7. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  8. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
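
    The report relies on closed-form stress analysis and numerical integration of fitted distributions; as a simplified stand-in, the sketch below (Python, with illustrative, assumed scatter distributions for strength and applied load) estimates reliability by Monte Carlo sampling of the same load-exceeds-strength criterion.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000

      # Illustrative scatter models (all parameters are assumptions, not from the report):
      # composite strength ~ two-parameter Weibull, applied stress ~ lognormal.
      strength = rng.weibull(a=20.0, size=n) * 600.0                       # MPa
      applied  = rng.lognormal(mean=np.log(380.0), sigma=0.10, size=n)     # MPa

      failures = np.count_nonzero(applied >= strength)
      reliability = 1.0 - failures / n
      print(f"Estimated reliability: {reliability:.5f} "
            f"(probability of failure {failures / n:.2e})")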

  9. Human waves in stadiums

    NASA Astrophysics Data System (ADS)

    Farkas, I.; Helbing, D.; Vicsek, T.

    2003-12-01

    The Mexican wave, first widely broadcast during the 1986 World Cup held in Mexico, is a human wave moving along the stands of stadiums as one section of spectators stands up, arms lifted, then sits down as the next section does the same. Here we use variants of models originally developed for the description of excitable media to demonstrate that this collective human behaviour can be quantitatively interpreted by methods of statistical physics. Adequate modelling of reactions to triggering attempts provides a deeper insight into the mechanisms by which a crowd can be stimulated to execute a particular pattern of behaviour and represents a possible tool of control during events involving excited groups of people. Interactive simulations, video recordings and further images are available at the webpage dedicated to this work: http://angel.elte.hu/wave.

  10. Time series analysis for minority game simulations of financial markets

    NASA Astrophysics Data System (ADS)

    Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy

    2003-04-01

    The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbation. When the interval between two successive perturbations is sufficiently large, one can find low dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.

  11. Variability in Humoral Immunity to Measles Vaccine: New Developments

    PubMed Central

    Haralambieva, Iana H.; Kennedy, Richard B.; Ovsyannikova, Inna G.; Whitaker, Jennifer A.; Poland, Gregory A.

    2015-01-01

    Despite the existence of an effective measles vaccine, resurgence in measles cases in the United States and across Europe has occurred, including in individuals vaccinated with two doses of the vaccine. Host genetic factors result in inter-individual variation in measles vaccine-induced antibodies, and play a role in vaccine failure. Studies have identified HLA and non-HLA genetic influences that individually or jointly contribute to the observed variability in the humoral response to vaccination among healthy individuals. In this exciting era, new high-dimensional approaches and techniques including vaccinomics, systems biology, GWAS, epitope prediction and sophisticated bioinformatics/statistical algorithms, provide powerful tools to investigate immune response mechanisms to the measles vaccine. These might predict, on an individual basis, outcomes of acquired immunity post measles vaccination. PMID:26602762

  12. Untargeted metabolomics analysis reveals dynamic changes in azelaic acid- and salicylic acid derivatives in LPS-treated Nicotiana tabacum cells.

    PubMed

    Mhlongo, M I; Tugizimana, F; Piater, L A; Steenkamp, P A; Madala, N E; Dubery, I A

    2017-01-22

    To counteract biotic stress factors, plants employ multilayered defense mechanisms responsive to pathogen-derived elicitor molecules, and regulated by different phytohormones and signaling molecules. Here, lipopolysaccharide (LPS), a microbe-associated molecular pattern (MAMP) molecule, was used to induce defense responses in Nicotiana tabacum cell suspensions. Intracellular metabolites were extracted with methanol and analyzed using a liquid chromatography-mass spectrometry (UHPLC-qTOF-MS/MS) platform. The generated data were processed and examined with multivariate and univariate statistical tools. The results show time-dependent dynamic changes and accumulation of glycosylated signaling molecules, specifically those of azelaic acid, salicylic acid and methyl-salicylate as contributors to the altered metabolomic state in LPS-treated cells. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Validity criteria for Fermi’s golden rule scattering rates applied to metallic nanowires

    NASA Astrophysics Data System (ADS)

    Moors, Kristof; Sorée, Bart; Magnus, Wim

    2016-09-01

    Fermi’s golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.

  14. Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling.

    PubMed

    Strelioff, Christopher C; Crutchfield, James P; Hübler, Alfred W

    2007-07-01

    Markov chains are a natural and well understood tool for describing one-dimensional patterns in time or space. We show how to infer kth order Markov chains, for arbitrary k , from finite data by applying Bayesian methods to both parameter estimation and model-order selection. Extending existing results for multinomial models of discrete data, we connect inference to statistical mechanics through information-theoretic (type theory) techniques. We establish a direct relationship between Bayesian evidence and the partition function which allows for straightforward calculation of the expectation and variance of the conditional relative entropy and the source entropy rate. Finally, we introduce a method that uses finite data-size scaling with model-order comparison to infer the structure of out-of-class processes.
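
    A minimal sketch of the Bayesian ingredient described above, restricted to a first-order chain with a symmetric Dirichlet prior (Python; the generating transition matrix and prior strength are assumptions for illustration, and the paper's model-order selection and out-of-class analysis are not reproduced):

      import numpy as np

      def infer_first_order(chain, n_symbols, alpha=1.0):
          """Posterior-mean transition matrix under a symmetric Dirichlet(alpha)
          prior, plus a plug-in entropy-rate estimate (bits per step)."""
          counts = np.zeros((n_symbols, n_symbols))
          for a, b in zip(chain[:-1], chain[1:]):
              counts[a, b] += 1
          post_mean = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + alpha * n_symbols)
          # Stationary distribution from the leading left eigenvector.
          vals, vecs = np.linalg.eig(post_mean.T)
          pi = np.real(vecs[:, np.argmax(np.real(vals))])
          pi /= pi.sum()
          h = -np.sum(pi[:, None] * post_mean * np.log2(post_mean))
          return post_mean, h

      rng = np.random.default_rng(1)
      true_T = np.array([[0.9, 0.1], [0.3, 0.7]])      # assumed generating chain
      chain = [0]
      for _ in range(5000):
          chain.append(rng.choice(2, p=true_T[chain[-1]]))
      T_hat, entropy_rate = infer_first_order(chain, n_symbols=2)
      print(np.round(T_hat, 3), f"entropy rate ≈ {entropy_rate:.3f} bits/step")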

  15. Transition probability, dynamic regimes, and the critical point of financial crisis

    NASA Astrophysics Data System (ADS)

    Tang, Yinan; Chen, Ping

    2015-07-01

    An empirical and theoretical analysis of financial crises is conducted based on statistical mechanics in non-equilibrium physics. The transition probability provides a new tool for diagnosing a changing market. Both calm and turbulent markets can be described by the birth-death process for price movements driven by identical agents. The transition probability in a time window can be estimated from stock market indexes. Positive and negative feedback trading behaviors can be revealed by the upper and lower curves in transition probability. Three dynamic regimes are discovered from two time periods including linear, quasi-linear, and nonlinear patterns. There is a clear link between liberalization policy and market nonlinearity. Numerical estimation of a market turning point is close to the historical event of the US 2008 financial crisis.

  16. Animal models of cerebral ischemia

    NASA Astrophysics Data System (ADS)

    Khodanovich, M. Yu.; Kisel, A. A.

    2015-11-01

    Cerebral ischemia remains one of the most frequent causes of death and disability worldwide. Animal models are necessary to understand the complex molecular mechanisms of brain damage as well as for the development of new therapies for stroke. This review considers a range of animal models of cerebral ischemia, including several types of focal and global ischemia. Since animal models vary in their specificity for the human disease which they reproduce, in the complexity of surgery, in infarct size, and in the reliability of reproduction for statistical analysis, an adequate model needs to be chosen according to the aim of a study. The reproduction of a particular animal model needs to be evaluated using appropriate tools, including the behavioral assessment of injury and non-invasive and post-mortem control of brain damage. These issues are also summarized in the review.

  17. Auger recombination in sodium iodide

    NASA Astrophysics Data System (ADS)

    McAllister, Andrew; Kioupakis, Emmanouil; Åberg, Daniel; Schleife, André

    2014-03-01

    Scintillators are an important tool used to detect high energy radiation - both in the interest of national security and in medicine. However, scintillator detectors currently suffer from lower energy resolutions than expected from basic counting statistics. This has been attributed to non-proportional light yield compared to incoming radiation, but the specific mechanism for this non-proportionality has not been identified. Auger recombination is a non-radiative process that could be contributing to the non-proportionality of scintillating materials. Auger recombination comes in two types - direct and phonon-assisted. We have used first-principles calculations to study Auger recombination in sodium iodide, a well characterized scintillating material. Our findings indicate that phonon-assisted Auger recombination is stronger in sodium iodide than direct Auger recombination. Computational resources provided by LLNL and NERSC. Funding provided by NA-22.

  18. Rainfall variability in southern Spain on decadal to centennial time scales

    NASA Astrophysics Data System (ADS)

    Rodrigo, F. S.; Esteban-Parra, M. J.; Pozo-Vázquez, D.; Castro-Díez, Y.

    2000-06-01

    In this work a long rainfall series in Andalusia (southern Spain) is analysed. Methods of historical climatology were used to reconstruct a 500-year series from historical sources. Different statistical tools were used to detect and characterize significant changes in this series. Results indicate rainfall fluctuations, without abrupt changes, in the following alternating dry and wet phases: 1501-1589 dry, 1590-1649 wet, 1650-1775 dry, 1776-1937 wet and 1938-1997 dry. Possible causal mechanisms are discussed, emphasizing the important contribution of the North Atlantic Oscillation (NAO) to rainfall variability in the region. Solar activity is discussed in relation to the Maunder Minimum period, and finally the past and present are compared. Results indicate that the magnitude of fluctuations is similar in the past and present.

  19. Development and Application of Computational/In Vitro Toxicological Methods for Chemical Hazard Risk Reduction of New Materials for Advanced Weapon Systems

    NASA Technical Reports Server (NTRS)

    Frazier, John M.; Mattie, D. R.; Hussain, Saber; Pachter, Ruth; Boatz, Jerry; Hawkins, T. W.

    2000-01-01

    The development of quantitative structure-activity relationships (QSAR) is essential for reducing the chemical hazards of new weapon systems. The current collaboration between HEST (toxicology research and testing), MLPJ (computational chemistry) and PRS (computational chemistry, new propellant synthesis) is focusing R&D efforts on basic research goals that will rapidly transition to useful products for propellant development. Computational methods are being investigated that will assist in forecasting cellular toxicological endpoints. Models developed from these chemical structure-toxicity relationships are useful for the prediction of the toxicological endpoints of new related compounds. Research is focusing on the evaluation tools to be used for the discovery of such relationships and the development of models of the mechanisms of action. Combinations of computational chemistry techniques, in vitro toxicity methods, and statistical correlations will be employed to develop and explore potential predictive relationships; results for series of molecular systems that demonstrate the viability of this approach are reported. A number of hydrazine salts have been synthesized for evaluation. Computational chemistry methods are being used to elucidate the mechanism of action of these salts. Toxicity endpoints such as viability (LDH) and changes in enzyme activity (glutathione peroxidase and catalase) are being experimentally measured as indicators of cellular damage. Extrapolation from computational/in vitro studies to human toxicity is the ultimate goal. The product of this program will be a predictive tool to assist in the development of new, less toxic propellants.

  20. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  1. How freight moves: estimating mileage and routes using an innovative GIS tool

    DOT National Transportation Integrated Search

    2007-06-01

    The Bureau of Transportation Statistics (BTS) has developed an innovative software tool, called GeoMiler, that is helping researchers better estimate freight travel. GeoMiler is being used to compute mileages along likely routes for the nearly 6 mill...

  2. Chorioamnionitis and chronic lung disease of prematurity: a path analysis of causality.

    PubMed

    Dessardo, Nada Sindičić; Mustać, Elvira; Dessardo, Sandro; Banac, Srđan; Peter, Branimir; Finderle, Aleksandar; Marić, Marinko; Haller, Herman

    2012-02-01

    Current evidence suggests that additional pathogenetic factors could play a role in the development of chronic lung disease of prematurity, other than mechanical ventilation and free radical injury. The introduction of the concept of "fetal inflammatory response syndrome" offers a new perspective on the pathogenesis of chronic lung disease of prematurity. New statistical approaches could be useful tools in evaluating causal relationships in the development of chronic morbidity in preterm infants. The aim of this study was to test a new statistical framework incorporating path analysis to evaluate causality between exposure to chorioamnionitis and fetal inflammatory response syndrome and the development of chronic lung disease of prematurity. We designed a prospective cohort study that included consecutively born premature infants less than 32 weeks of gestation whose placentas were collected for histological analysis. Histological chorioamnionitis, clinical data, and neonatal outcomes were related to chronic lung disease. Along with standard statistical methods, a path analysis was performed to test the relationship between histological chorioamnionitis, gestational age, mechanical ventilation, and development of chronic lung disease of prematurity. Among the newborns enrolled in the study, 69/189 (36%) had histological chorioamnionitis. Of those with histological chorioamnionitis, 28/69 (37%) were classified as having fetal inflammatory response syndrome, according to the presence of severe chorioamnionitis and funisitis. Histological chorioamnionitis was associated with a lower birth weight, shorter gestation, higher frequency of patent ductus arteriosus, greater use of surfactant, and higher frequency of chronic lung disease of prematurity. Severe chorioamnionitis and funisitis were significantly associated with lower birth weight, lower gestational age, lower Apgar score at 5 minutes, more frequent use of mechanical ventilatory support and surfactant, as well as higher frequency of patent ductus arteriosus and chronic lung disease. The results of the path analysis showed that fetal inflammatory response syndrome has a significant direct (0.66), indirect (0.11), and overall (0.77) effect on chronic lung disease. This study demonstrated a strong positive correlation between exposure of the fetus to a severe inflammatory response and the development of chronic lung disease of prematurity. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  3. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  4. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

    Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
    Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who joined up to make this issue possible. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.

  5. Evaluation of the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing acute and chronic wounds.

    PubMed

    Choi, Edmond P H; Chin, Weng Yee; Wan, Eric Y F; Lam, Cindy L K

    2016-05-01

    To examine the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing the healing progress in acute and chronic wounds. It is important to establish the responsiveness of instruments used in conducting wound care assessments to ensure that they are able to capture changes in wound healing accurately over time. Prospective longitudinal observational study. The key study instrument was the PUSH tool. Internal responsiveness was assessed using paired t-testing and effect size statistics. External responsiveness was assessed using multiple linear regression. All new patients with at least one eligible acute or chronic wound, enrolled in the Nurse and Allied Health Clinic-Wound Care programme between 1 December 2012 - 31 March 2013 were included for analysis (N = 541). Overall, the PUSH tool was able to detect statistically significant changes in wound healing between baseline and discharge. The effect size statistics were large. The internal responsiveness of the PUSH tool was confirmed in patients with a variety of different wound types including venous ulcers, pressure ulcers, neuropathic ulcers, burns and scalds, skin tears, surgical wounds and traumatic wounds. After controlling for age, gender and wound type, subjects in the 'wound improved but not healed' group had a smaller change in PUSH scores than those in the 'wound healed' group. Subjects in the 'wound static or worsened' group had the smallest change in PUSH scores. The external responsiveness was confirmed. The internal and external responsiveness of the PUSH tool confirmed that it can be used to track the healing progress of both acute and chronic wounds. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
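
    Internal responsiveness as described above combines a paired t-test with an effect-size statistic. A minimal sketch (Python with SciPy, using hypothetical baseline and discharge PUSH scores rather than the study data):

      import numpy as np
      from scipy import stats

      # Hypothetical baseline and discharge PUSH scores for the same wounds.
      rng = np.random.default_rng(7)
      baseline  = rng.normal(12.0, 3.0, size=60)
      discharge = np.clip(baseline - rng.normal(5.0, 2.0, size=60), 0, None)  # scores fall as wounds heal

      t_stat, p_value = stats.ttest_rel(baseline, discharge)   # paired t-test
      diff = baseline - discharge
      effect_size = diff.mean() / diff.std(ddof=1)              # Cohen's d for paired data

      print(f"t = {t_stat:.2f}, p = {p_value:.3g}, effect size d = {effect_size:.2f}")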

  6. Learning Predictive Statistics: Strategies and Brain Mechanisms.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-08-30

    When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.

  7. Six new mechanics corresponding to further shape theories

    NASA Astrophysics Data System (ADS)

    Anderson, Edward

    2016-02-01

    In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well-known examples: (i) Kendall’s (metric) shape space with his shape statistics and Barbour’s mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space to which corresponds Barbour-Bertotti mechanics. This paper’s new theories include, using the invariant and group namings, (iii) Angle alias conformal shape mechanics. (iv) Area ratio alias e shape mechanics. (v) Area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.

  8. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    joined the project team in the statistical and research coordination role. Dr. Collin is an employee at the University of Pittsburgh. A successful...3. Submit to Ft. Detrick Completed Milestone: Statistical analysis planning 1. Review planned data metrics and data gathering tools...approach to performance assessment for continuous quality improvement.  Analyzing data with modern statistical techniques to determine the

  9. Effects of various tool pin profiles on mechanical and metallurgical properties of friction stir welded joints of cryorolled AA2219 aluminium alloy

    NASA Astrophysics Data System (ADS)

    Kamal Babu, Karupannan; Panneerselvam, Kavan; Sathiya, Paulraj; Noorul Haq, Abdul Haq; Sundarrajan, Srinivasan; Mastanaiah, Potta; Srinivasa Murthy, Chunduri Venkata

    2018-02-01

    The friction stir welding (FSW) process was conducted on cryorolled (CR) AA2219 plates using different tool pin profiles: cylindrical, threaded cylindrical, square and hexagonal pins. The FSW was carried out on pairs of 6 mm thick CR aluminium plates with each tool pin profile. The welds produced by each pin profile were analyzed for mechanical behavior (tensile strength, impact toughness and hardness) and metallurgical characteristics. The mechanical analysis revealed that the joint made by the hexagonal pin tool had higher strength than those made with the other pin profiles. This was attributed to the pulsating action and material flow of the tool, resulting in dynamic recrystallization in the weld zone, and was confirmed by the formation of an ultrafine grain structure in the weld nugget (WN) of the hexagonal pin tool joint, with a higher percentage of precipitate dissolution. The fractograph of the hexagonal tool pin weld confirmed a finer dimple morphology without any interior defects compared with the other tool pin profiles. The lowest weld joint strength was obtained with the cylindrical pin profile, owing to insufficient material flow during welding. Transmission electron microscopy and EDX analysis showed the dissolution of the metastable θ″ and θ' (Al2Cu) partial precipitates in the WN and demonstrated the influence of the metastable precipitates on the enhancement of the mechanical behavior of the weld. The XRD results also confirmed the dissolution of Al2Cu precipitates in the weld zone.

  10. Millisecond Microwave Spikes: Statistical Study and Application for Plasma Diagnostics

    NASA Astrophysics Data System (ADS)

    Rozhansky, I. V.; Fleishman, G. D.; Huang, G.-L.

    2008-07-01

    We analyze a dense cluster of solar radio spikes registered at 4.5-6 GHz by the Purple Mountain Observatory spectrometer (Nanjing, China), operating in the 4.5-7.5 GHz range with 5 ms temporal resolution. To handle the data from the spectrometer, we developed a new technique that uses a nonlinear multi-Gaussian spectral fit based on χ2 criteria to extract individual spikes from the originally recorded spectra. Applying this method to the experimental raw data, we eventually identified about 3000 spikes for this event, which allows us to make a detailed statistical analysis. Various statistical characteristics of the spikes have been evaluated, including the intensity distributions, the spectral bandwidth distributions, and the distribution of the spike mean frequencies. The most striking finding of this analysis is the distributions of the spike bandwidth, which are remarkably asymmetric. To reveal the underlying microphysics, we explore the local-trap model with the renormalized theory of spectral profiles of the electron cyclotron maser (ECM) emission peak in a source with random magnetic irregularities. The distribution of the solar spike relative bandwidths calculated within the local-trap model represents an excellent fit to the experimental data. Accordingly, the developed technique may offer a new tool with which to study very low levels of magnetic turbulence in the spike sources, when the ECM mechanism of the spike cluster is confirmed.

  11. Quantitative approaches in climate change ecology

    PubMed Central

    Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
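
    One of the weaknesses listed above, ignoring temporal autocorrelation, can be illustrated with a simple effective-sample-size correction for a climate-ecology correlation (Python; the series and the Bretherton-style adjustment shown here are assumptions for illustration, not a method prescribed by the authors):

      import numpy as np
      from scipy import stats

      def lag1_autocorr(x):
          x = np.asarray(x, dtype=float) - np.mean(x)
          return np.dot(x[:-1], x[1:]) / np.dot(x, x)

      # Hypothetical annual sea-surface temperature and species abundance series.
      rng = np.random.default_rng(5)
      sst = np.cumsum(rng.normal(0, 0.2, 40)) + 15
      abundance = 100 - 3 * sst + rng.normal(0, 2, 40)

      r, _ = stats.pearsonr(sst, abundance)
      r1, r2 = lag1_autocorr(sst), lag1_autocorr(abundance)
      # Effective sample size for autocorrelated series (Bretherton-style adjustment).
      n_eff = len(sst) * (1 - r1 * r2) / (1 + r1 * r2)
      t = r * np.sqrt((n_eff - 2) / (1 - r ** 2))
      p_adj = 2 * stats.t.sf(abs(t), df=n_eff - 2)
      print(f"r = {r:.2f}, n_eff ≈ {n_eff:.1f}, adjusted p = {p_adj:.3g}")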

  12. Improving information retrieval in functional analysis.

    PubMed

    Rodriguez, Juan C; González, Germán A; Fresno, Cristóbal; Llera, Andrea S; Fernández, Elmer A

    2016-12-01

    Transcriptome analysis is essential to understand the mechanisms regulating key biological processes and functions. The first step usually consists of identifying candidate genes; to find out which pathways are affected by those genes, however, functional analysis (FA) is mandatory. The most frequently used strategies for this purpose are Gene Set and Singular Enrichment Analysis (GSEA and SEA) over Gene Ontology. Several statistical methods have been developed and compared in terms of computational efficiency and/or statistical appropriateness. However, whether their results are similar or complementary, the sensitivity to parameter settings, or possible bias in the analyzed terms has not been addressed so far. Here, two GSEA and four SEA methods and their parameter combinations were evaluated in six datasets by comparing two breast cancer subtypes with well-known differences in genetic background and patient outcomes. We show that GSEA and SEA lead to different results depending on the chosen statistic, model and/or parameters. Both approaches provide complementary results from a biological perspective. Hence, an Integrative Functional Analysis (IFA) tool is proposed to improve information retrieval in FA. It provides a common gene expression analytic framework that grants a comprehensive and coherent analysis. Only a minimal user parameter setting is required, since the best SEA/GSEA alternatives are integrated. IFA utility was demonstrated by evaluating four prostate cancer and the TCGA breast cancer microarray datasets, which showed its biological generalization capabilities. Copyright © 2016 Elsevier Ltd. All rights reserved.
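
    As a reduced illustration of the SEA side of such analyses (not the IFA tool itself), the sketch below computes a hypergeometric over-representation p-value for a single term from hypothetical gene counts (Python with SciPy):

      from scipy.stats import hypergeom

      # Hypothetical counts for one Gene Ontology term:
      N = 20000   # genes on the array (background)
      K = 150     # background genes annotated to the term
      n = 400     # differentially expressed (candidate) genes
      k = 12      # candidate genes annotated to the term

      # Over-representation p-value: probability of observing >= k annotated
      # genes among n draws without replacement (singular enrichment style).
      p_value = hypergeom.sf(k - 1, N, K, n)
      print(f"P(X >= {k}) = {p_value:.3e}")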

  13. Simple Statistics: - Summarized!

    ERIC Educational Resources Information Center

    Blai, Boris, Jr.

    Statistics are an essential tool for making proper judgement decisions. It is concerned with probability distribution models, testing of hypotheses, significance tests and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…

  14. mvMapper: statistical and geographical data exploration and visualization of multivariate analysis of population structure

    USDA-ARS?s Scientific Manuscript database

    Characterizing population genetic structure across geographic space is a fundamental challenge in population genetics. Multivariate statistical analyses are powerful tools for summarizing genetic variability, but geographic information and accompanying metadata is not always easily integrated into t...

  15. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    USGS Publications Warehouse

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
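
    A rough sketch of two of the statistics mentioned above, a flow-duration curve and the annual 7-day minimum flow series behind a 7Q10 estimate (Python with pandas; the streamflow record is synthetic and the 10-year quantile is taken empirically, whereas SWToolbox applies a formal frequency analysis):

      import numpy as np
      import pandas as pd

      # Synthetic daily streamflow record (cfs) for 30 years.
      rng = np.random.default_rng(3)
      idx = pd.date_range("1990-01-01", "2019-12-31", freq="D")
      flow = pd.Series(np.exp(rng.normal(3.0, 0.8, len(idx))), index=idx)

      # Flow-duration curve: discharge equalled or exceeded a given percent of the time.
      exceed_pct = [5, 25, 50, 75, 95]
      fdc = {p: np.percentile(flow, 100 - p) for p in exceed_pct}

      # Annual series of 7-day minimum flows (rolling 7-day mean, minimum per year).
      seven_day_min = flow.rolling(7).mean().groupby(flow.index.year).min()

      # Crude empirical 10-year low-flow quantile; a production analysis would fit
      # a distribution (e.g., log-Pearson Type III) to the annual minima instead.
      q7_10 = np.percentile(seven_day_min.dropna(), 10)

      print({f"Q{p}": round(v, 1) for p, v in fdc.items()})
      print(f"Empirical 7Q10 estimate: {q7_10:.1f} cfs")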

  16. Enhancing efficiency and quality of statistical estimation of immunogenicity assay cut points through standardization and automation.

    PubMed

    Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don

    2015-10-01

    Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibody (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at a click of the button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.
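
    Screening cut points of this kind are commonly set to target a fixed false-positive rate; the sketch below (Python, with simulated drug-naive signals and a widely used 5% target, both assumptions and not the CAST workflow itself) shows parametric and nonparametric variants:

      import numpy as np

      # Hypothetical log-transformed signal-to-noise values from drug-naive samples.
      rng = np.random.default_rng(11)
      log_ratios = rng.normal(loc=0.02, scale=0.15, size=120)

      # Two common screening cut-point estimates targeting a ~5% false-positive rate.
      parametric_cp    = log_ratios.mean() + 1.645 * log_ratios.std(ddof=1)
      nonparametric_cp = np.percentile(log_ratios, 95)

      print(f"parametric cut point: {parametric_cp:.3f}, "
            f"nonparametric (95th percentile): {nonparametric_cp:.3f}")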

  17. Study Designs and Statistical Analyses for Biomarker Research

    PubMed Central

    Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori

    2012-01-01

    Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528

  18. Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions

    NASA Astrophysics Data System (ADS)

    Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás

    2016-03-01

    Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products in an attempt to test reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.

  19. Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis

    NASA Astrophysics Data System (ADS)

    Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.

    2016-08-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The Paleomagnetism.org application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector approach foldtest, two reversal tests including a Monte Carlo simulation on mean directions, and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in coordinates of major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be imported to the application by other researchers. The accessibility and simplicity through which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
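
    The standard Fisher statistics mentioned above have a compact closed form; a minimal sketch (Python, with hypothetical site directions) computing the Fisher mean direction, precision parameter k and α95:

      import numpy as np

      def fisher_mean(decs_deg, incs_deg):
          """Fisher mean direction, precision parameter k, and alpha95 (degrees)."""
          d = np.radians(decs_deg)
          i = np.radians(incs_deg)
          # Direction cosines (x north, y east, z down).
          x, y, z = np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)
          n = len(d)
          R = np.sqrt(x.sum() ** 2 + y.sum() ** 2 + z.sum() ** 2)
          mean_dec = np.degrees(np.arctan2(y.sum(), x.sum())) % 360
          mean_inc = np.degrees(np.arcsin(z.sum() / R))
          k = (n - 1) / (n - R)
          a95 = np.degrees(np.arccos(1 - (n - R) / R * ((1 / 0.05) ** (1 / (n - 1)) - 1)))
          return mean_dec, mean_inc, k, a95

      # Hypothetical site directions (declination, inclination in degrees).
      decs = [350.0, 5.0, 358.0, 2.0, 355.0, 8.0]
      incs = [48.0, 52.0, 50.0, 47.0, 53.0, 49.0]
      print("Dec=%.1f Inc=%.1f k=%.1f a95=%.1f" % fisher_mean(decs, incs))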

  20. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient decreases as tool wear increases. The result can then be used for real-time tool wear monitoring.
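
    A reduced sketch of the regression step described above (Python; the feature values stand in for the I-kaz 3D coefficient and the wear values are hypothetical, so the fitted numbers are illustrative only):

      import numpy as np

      # Hypothetical pairs of a monitoring-feature value (stand-in for the I-kaz 3D
      # coefficient, which decreases as wear grows) and measured flank wear VB (mm).
      feature = np.array([0.92, 0.85, 0.77, 0.70, 0.61, 0.55, 0.48])
      vb      = np.array([0.05, 0.09, 0.14, 0.18, 0.24, 0.28, 0.33])

      slope, intercept = np.polyfit(feature, vb, deg=1)
      predicted = np.polyval([slope, intercept], feature)
      r2 = 1 - np.sum((vb - predicted) ** 2) / np.sum((vb - vb.mean()) ** 2)

      print(f"VB ≈ {slope:.3f} * feature + {intercept:.3f}  (R² = {r2:.3f})")
      # A fitted model like this could flag wear in real time from new feature values.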

  1. Optical tweezers and multiphoton microscopies integrated photonic tool for mechanical and biochemical cell processes studies

    NASA Astrophysics Data System (ADS)

    de Thomaz, A. A.; Faustino, W. M.; Fontes, A.; Fernandes, H. P.; Barjas-Castro, M. d. L.; Metze, K.; Giorgio, S.; Barbosa, L. C.; Cesar, C. L.

    2007-09-01

    The research in biomedical photonics is clearly evolving in the direction of understanding biological processes at the cell level. The spatial resolution required to accomplish this task practically requires photonic tools. However, an integration of different photonic tools and a multimodal, functional approach will be necessary to access the mechanical and biochemical cell processes. This way we can observe mechanically triggered biochemical events or biochemically triggered mechanical events, or even observe simultaneously mechanical and biochemical events triggered by other means, e.g. electrically. One great advantage of photonic tools is their ease of integration. Therefore, we developed such an integrated tool by incorporating single and double Optical Tweezers with Confocal Single and Multiphoton Microscopies. This system can perform 2-photon excited fluorescence and Second Harmonic Generation microscopies together with optical manipulations. It can also acquire Fluorescence and SHG spectra of specific spots. Force, elasticity and viscosity measurements of stretched membranes can be followed by real-time confocal microscopies, as can optically trapped living protozoa such as Leishmania amazonensis. Integration with CARS microscopy is under way. We will show several examples of the use of such an integrated instrument and its potential to observe mechanical and biochemical processes at the cell level.

  2. A drilling tool design and in situ identification of planetary regolith mechanical parameters

    NASA Astrophysics Data System (ADS)

    Zhang, Weiwei; Jiang, Shengyuan; Ji, Jie; Tang, Dewei

    2018-05-01

    The physical and mechanical properties as well as the heat flux of regolith are critical evidence in the study of planetary origin and evolution. Moreover, the mechanical properties of planetary regolith have great value for guiding future human planetary activities. For planetary subsurface exploration, an inchworm boring robot (IBR) has been proposed to penetrate the regolith, and the mechanical properties of the regolith are expected to be simultaneously investigated during the penetration process using the drilling tool on the IBR. This paper provides a preliminary study of an in situ method for measuring planetary regolith mechanical parameters using a drilling tool on a test bed. A conical-screw drilling tool was designed, and its drilling load characteristics were experimentally analyzed. Based on the drilling tool-regolith interaction model, two identification methods for determining the planetary regolith bearing and shearing parameters are proposed. The bearing and shearing parameters of lunar regolith simulant were successfully determined according to the pressure-sinkage tests and shear tests conducted on the test bed. The effects of the operating parameters on the identification results were also analyzed. The results indicate a feasible scheme for future planetary subsurface exploration.

  3. Knowledge-based machine indexing from natural language text: Knowledge base design, development, and maintenance

    NASA Technical Reports Server (NTRS)

    Genuardi, Michael T.

    1993-01-01

    One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.

  4. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  5. Statistical mechanics in the context of special relativity.

    PubMed

    Kaniadakis, G

    2002-11-01

    In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(-βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the nonspecified parameter β, containing all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_{-κ} f^(1-κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another nonspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit the values of the above-mentioned parameters β and α to be determined unequivocally. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine in a self-consistent scheme within special relativity the value of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way the ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here does not contain free parameters, preserves unaltered the mathematical and epistemological structure of ordinary statistical mechanics, and is suitable for describing a very large class of experimentally observed phenomena in low- and high-energy physics and in the natural, economic, and social sciences. Finally, in order to test the correctness and predictability of the theory, as a working example we consider the cosmic ray spectrum, which spans 13 decades in energy and 33 decades in flux, finding high-quality agreement between our predictions and observed data.
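
    The deformed exponential at the core of this formalism is straightforward to evaluate numerically. The short sketch below (not from the paper) implements exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ) and checks two of its defining properties: it approaches the ordinary exponential as κ → 0 and develops power-law behavior for large arguments.

```python
import numpy as np

def kappa_exp(x, kappa):
    """kappa-deformed exponential; reduces to exp(x) as kappa -> 0."""
    x = np.asarray(x, dtype=float)
    if abs(kappa) < 1e-12:
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

x = np.linspace(-2, 2, 5)
for k in (0.5, 0.1, 0.01):
    # Deviation from the ordinary exponential shrinks as kappa -> 0
    print(k, np.max(np.abs(kappa_exp(x, k) - np.exp(x))))

# Power-law tail: for large x, kappa_exp(x, k) ~ (2*k*x)**(1/k)
print(kappa_exp(1e3, 0.5), (2 * 0.5 * 1e3) ** (1 / 0.5))
```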

  6. Infant Statistical Learning

    PubMed Central

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  7. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  8. System for exchanging tools and end effectors on a robot

    DOEpatents

    Burry, David B.; Williams, Paul M.

    1991-02-19

    A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits engagement and disengagement of the tool or end effector without the need for rotational orientation of the tool relative to the end effector/collet interface. Because the tool-changing system is not located on the robot arm, no umbilical cords are located on the robot.

  9. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  10. MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.

    PubMed

    Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk

    2018-05-29

    Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.
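
    The core of any knowledge-based potential of mean force is a Boltzmann inversion of observed versus reference pair statistics, E(r) = -kT ln[p_obs(r)/p_ref(r)]. The sketch below illustrates that inversion on synthetic distance counts; it is not the MyPMFs code, and the bin layout and pseudocount are arbitrary choices.

```python
import numpy as np

kT = 0.593  # kcal/mol at ~298 K

def pmf_from_counts(obs_counts, ref_counts, pseudo=1.0):
    """Boltzmann inversion of observed vs. reference distance histograms."""
    obs = np.asarray(obs_counts, float) + pseudo
    ref = np.asarray(ref_counts, float) + pseudo
    p_obs = obs / obs.sum()
    p_ref = ref / ref.sum()
    return -kT * np.log(p_obs / p_ref)   # pseudo-energy per distance bin

# Synthetic counts for one residue-pair type over 1 Å bins from 3 to 10 Å
observed  = [ 5, 40, 120, 180, 150, 100, 60]
reference = [20, 60, 110, 150, 160, 140, 120]
print(np.round(pmf_from_counts(observed, reference), 2))
```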

  11. Fiber reinforced hybrid phenolic foam

    NASA Astrophysics Data System (ADS)

    Desai, Amit

    In recent times, hybrid composites have been developed using more than one type of fiber reinforcement to combine the synergistic properties of the chosen fillers and matrix and to facilitate the design of materials with specific properties matched to the end use. However, studies of hybrid foams have been very limited because of problems related to fiber dispersion in the matrix, non-uniform mixing due to the presence of more than one filler, and partially cured foams. An effective approach to synthesize hybrid phenolic foam has been proposed and investigated here. Hybrid composite phenolic foams were reinforced with chopped glass and aramid fibers in varied proportions. Assessment of the mechanical properties in compression and shear revealed several interesting facts; overall, the hybrid phenolic foams exhibited a more graceful failure, greater resistance to cracking, and were significantly stiffer and stronger than foams with only glass or only aramid fibers. The optimum fiber ratio for the reinforced hybrid phenolic foam system was found to be a 1:1 ratio of glass to aramid fibers. Also, the properties of the hybrid foam were found to deviate from the rule of mixtures (ROM), and thus the existing theories of fiber reinforcement fell short in explaining their complex behavior. In an attempt to describe and predict the mechanical behavior of hybrid foams, a statistical design tool using the analysis of variance (ANOVA) technique was employed. The statistical model was found to be an appropriate tool for predicting foam properties, affording a global perspective of the influence of process variables such as fiber weight fraction and fiber length on foam properties (elastic modulus and strength). A similar approach could be extended to study other fiber composite foam systems, such as polyurethane and epoxy, and doing so would reduce the number of experimental iterations needed to optimize foam properties and identify critical process variables. Diffusivity, accelerated aging, and flammability of the hybrid foams were evaluated, and the results indicate that the hybrid foam surpassed several commercial foams and thus could fulfill current needs for an insulation material that is low cost, has excellent fire properties, and retains compressive stiffness even after aging.

  12. Design and evaluation of a slave manipulator with roll-pitch-roll wrist and automatic tool loading mechanism in telerobotic surgery.

    PubMed

    Kim, Ki-Young; Lee, Jung-Ju

    2012-12-01

    As there is a shortage of scrub nurses in many hospitals, automatic surgical tool exchange mechanisms that do not require human labour have been studied. Minimally invasive robotic surgeries (MIRS) also require scrub nurses. A surgical tool loading mechanism for MIRS that works without a scrub nurse's assistance is proposed. Many researchers have developed minimally invasive surgical instruments with a wrist joint that can be moved inside the abdomen. However, implementation of a distal rolling joint on a gripper is rare. To implement surgical tool exchange without a scrub nurse's assistance, a slave manipulator and a tool loader were developed to load and unload a surgical tool unit. A surgical tool unit with a roll-pitch-roll wrist was developed. Several experiments were performed to validate the effectiveness of the slave manipulator and the surgical tool unit. The slave manipulator and the tool loader were able to successfully unload and load the surgical tool unit without human assistance. The total duration of unloading and loading the surgical tool unit was 97 s. Motion tracking experiments of the distal rolling joint were performed. The maximum positioning error of the step input response was 2°. The advantage of the proposed slave manipulator and tool loader is that other robotic systems or human labour are not needed for surgical tool loading. The feasibility of the distal rolling joint in MIS is verified. Copyright © 2012 John Wiley & Sons, Ltd.

  13. Role-play as an educational tool in medication communication skills: Students' perspectives.

    PubMed

    Lavanya, S H; Kalpana, L; Veena, R M; Bharath Kumar, V D

    2016-10-01

    Medication communication skills are vital aspects of patient care that may influence treatment outcomes. However, the traditional pharmacology curriculum deals with imparting factual information, with little emphasis on patient communication. The current study aims to explore students' perceptions of role-play as an educational tool in acquiring communication skills and to ascertain the need for role-play in their future clinical practice. This questionnaire-based study was done in 2nd professional MBBS students. A consolidated set of six training cases, focusing on major communication issues related to medication prescription in pharmacology, was developed for peer-role-play sessions for 2nd professional MBBS (n = 122) students. Structured scripts with specific emphasis on prescription medication communication and checklists for feedback were developed. Prevalidated questionnaires measured the quantitative aspects of role-plays in relation to their relevance as a teaching-learning tool, the perceived benefits of the sessions, and their importance for future use. Data analysis was performed using descriptive statistics. The role-play concept was well appreciated and considered an effective means for acquiring medication communication skills. The structured feedback by peers and faculty was well received by many. Over 90% of the students reported immense confidence in communicating therapy details, namely, drug name, purpose, mechanism, dosing details, and precautions. The majority reported better retention of pharmacology concepts and preferred more such sessions. Most students consider peer role-play an indispensable tool for acquiring effective communication skills regarding drug therapy. By virtue of providing experiential learning opportunities and its feasibility of implementation, role-play sessions justify inclusion in undergraduate medical curricula.

  14. Validation of Ultrasound Elastography Imaging for Nondestructive Characterization of Stiffer Biomaterials.

    PubMed

    Zhou, Haoyan; Goss, Monika; Hernandez, Christopher; Mansour, Joseph M; Exner, Agata

    2016-05-01

    Ultrasound elastography (UE) has been widely used as a "digital palpation" tool to characterize tissue mechanical properties in the clinic. UE benefits from the capability of noninvasively generating 2-D elasticity-encoded maps. This spatial distribution of elasticity can be especially useful in the in vivo assessment of tissue engineering scaffolds and implantable drug delivery platforms. However, the detection limitations have not been fully characterized and thus its true potential has not been completely explored. Characterization studies have focused primarily on the range of moduli corresponding to soft tissues, 20-600 kPa. However, polymeric biomaterials used in biomedical applications such as tissue scaffolds, stents, and implantable drug delivery devices can be much stiffer. In order to explore UE's potential to assess mechanical properties of biomaterials in a broader range of applications, this work investigated the detection limit of UE strain imaging beyond the soft-tissue range. To determine the detection limit, measurements using standard mechanical testing and UE on the same polydimethylsiloxane samples were compared and statistically evaluated. The broadest detection range found with the current optimized setup is between 47 kPa and 4 MPa, which exceeds the modulus of normal soft tissue, suggesting the possibility of using this technique for mechanical characterization of stiffer materials.

  15. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.

  16. From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory

    ERIC Educational Resources Information Center

    Bringuier, E.

    2008-01-01

    The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…

  17. CAMERRA: An analysis tool for the computation of conformational dynamics by evaluating residue-residue associations.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye

    2018-02-21

    A computational method which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts is presented. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can bridge the results for further visualization using graphics software. The aim of this method is to provide an analysis of conformations of biopolymers from the contact viewpoint. It may assist in systematically uncovering conformational switching mechanisms in proteins and biopolymer systems in general through statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.
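
    The general computation described here, extracting collective modes from the correlated fluctuations of residue-residue contacts, can be illustrated with a plain principal component analysis over flattened contact matrices. The sketch below runs on random synthetic snapshots and mirrors only the broad idea, not the CAMERRA program itself; the cutoff and array sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_res, cutoff = 200, 30, 8.0

# Synthetic trajectory of C-alpha coordinates (replace with real snapshots)
coords = rng.normal(scale=5.0, size=(n_frames, n_res, 3))

# Contact matrices: 1 if pairwise distance < cutoff, flattened per frame
contacts = np.empty((n_frames, n_res * n_res))
for t, xyz in enumerate(coords):
    d = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    contacts[t] = (d < cutoff).astype(float).ravel()

# Keep only "dynamic" contacts (those that actually fluctuate)
dyn = contacts[:, contacts.std(axis=0) > 0]

# PCA of contact fluctuations -> collective modes of contact dynamics
c = dyn - dyn.mean(axis=0)
cov = c.T @ c / (n_frames - 1)
evals, evecs = np.linalg.eigh(cov)
print("Top 3 mode variances:", evals[::-1][:3])
```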

  18. Photodissociation of quantum state-selected diatomic molecules yields new insight into ultracold chemistry

    NASA Astrophysics Data System (ADS)

    McDonald, Mickey; McGuyer, Bart H.; Lee, Chih-Hsi; Apfelbeck, Florian; Zelevinsky, Tanya

    2016-05-01

    When a molecule is subjected to a sufficiently energetic photon it can break apart into fragments through a process called "photodissociation". For over 70 years this simple chemical reaction has served as a vital experimental tool for acquiring information about molecular structure, since the character of the photodissociative transition can be inferred by measuring the 3D photofragment angular distribution (PAD). While theoretical understanding of this process has gradually evolved from classical considerations to a fully quantum approach, experiments to date have not yet revealed the full quantum nature of this process. In my talk I will describe recent experiments involving the photodissociation of ultracold, optical lattice-trapped, and fully quantum state-resolved 88Sr2 molecules. Optical absorption images of the PADs produced in these experiments reveal features which are inherently quantum mechanical in nature, such as matter-wave interference between output channels, and are sensitive to the quantum statistics of the molecular wavefunctions. The results of these experiments cannot be predicted using quasiclassical methods. Instead, we describe our results with a fully quantum mechanical model yielding new intuition about ultracold chemistry.

  19. A Ring Polymer Molecular Dynamics Approach to Study the Transition between Statistical and Direct Mechanisms in the H2 + H3+ → H3+ + H2 Reaction.

    PubMed

    Suleimanov, Yury V; Aguado, Alfredo; Gómez-Carrasco, Susana; Roncero, Octavio

    2018-05-03

    Because of its fundamental importance in astrochemistry, the H2 + H3+ → H3+ + H2 reaction has been studied experimentally in a wide temperature range. Theoretical studies of the title reaction significantly lag primarily because of the challenges associated with the proper treatment of the zero-point energy (ZPE). As a result, all previous theoretical estimates for the ratio between the direct proton-hop and indirect exchange (via the H5+ complex) channels deviate from the experiment, in particular at lower temperatures where the quantum effects dominate. In this work, the ring polymer molecular dynamics (RPMD) method is applied to study this reaction, providing very good agreement with the experiment. RPMD is immune to the shortcomings associated with the ZPE leakage and is able to describe the transition from direct to indirect mechanisms below room temperature. We argue that RPMD represents a useful tool for further studies of numerous ZPE-sensitive chemical reactions that are of high interest in astrochemistry.

  20. Particle Swarm Optimization with Double Learning Patterns.

    PubMed

    Shen, Yuanxia; Wei, Linna; Zeng, Chuanhua; Chen, Jian

    2016-01-01

    Particle Swarm Optimization (PSO) is an effective tool for solving optimization problems. However, PSO usually suffers from premature convergence due to rapid loss of swarm diversity. In this paper, we first analyze the motion behavior of the swarm based on the probability characteristics of the learning parameters. Then a PSO with double learning patterns (PSO-DLP) is developed, which employs a master swarm and a slave swarm with different learning patterns to achieve a trade-off between convergence speed and swarm diversity. The particles in the master swarm and the slave swarm are encouraged, respectively, to explore the search space to maintain swarm diversity and to learn from the global best particle to refine a promising solution. When the evolutionary states of the two swarms interact, an interaction mechanism is enabled. This mechanism can help the slave swarm jump out of local optima and improve the convergence precision of the master swarm. The proposed PSO-DLP is evaluated on 20 benchmark functions, including rotated multimodal and complex shifted problems. The simulation results and statistical analysis show that PSO-DLP obtains promising performance and outperforms eight PSO variants.
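
    For readers unfamiliar with the baseline algorithm that PSO-DLP modifies, a minimal single-swarm PSO on the sphere function is sketched below. It implements only the standard velocity and position updates; the master/slave double-learning scheme and interaction mechanism of the paper are not reproduced, and all parameter values are generic defaults.

```python
import numpy as np

def pso(f, dim=10, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions
    v = np.zeros_like(x)                             # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()             # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val                    # update personal bests
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

sphere = lambda z: float(np.sum(z**2))
best_x, best_val = pso(sphere)
print("best value found:", best_val)
```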

  1. Teacher-Perceived Adequacy of Tools and Equipment Available to Teach Agricultural Mechanics

    ERIC Educational Resources Information Center

    McCubbins, O. P.; Anderson, Ryan G.; Paulsen, Thomas H.; Wells, Trent

    2016-01-01

    Agricultural mechanics is an important component of a well-rounded school-based agricultural education (SBAE) program. Within agricultural mechanics courses lies a plethora of topics and skills to be covered. Adequate tools and equipment are vital in preparing students to fill an expanding, 21st century workforce. The issue of inadequate teaching…

  2. Intelligent Monitoring? Assessing the ability of the Care Quality Commission's statistical surveillance tool to predict quality and prioritise NHS hospital inspections.

    PubMed

    Griffiths, Alex; Beaussier, Anne-Laure; Demeritt, David; Rothstein, Henry

    2017-02-01

    The Care Quality Commission (CQC) is responsible for ensuring the quality of the health and social care delivered by more than 30 000 registered providers in England. With only limited resources for conducting on-site inspections, the CQC has used statistical surveillance tools to help it identify which providers it should prioritise for inspection. In the face of planned funding cuts, the CQC plans to put more reliance on statistical surveillance tools to assess risks to quality and prioritise inspections accordingly. To evaluate the ability of the CQC's latest surveillance tool, Intelligent Monitoring (IM), to predict the quality of care provided by National Health Service (NHS) hospital trusts so that those at greatest risk of providing poor-quality care can be identified and targeted for inspection. The predictive ability of the IM tool is evaluated through regression analyses and χ² testing of the relationship between the quantitative risk score generated by the IM tool and the subsequent quality rating awarded following detailed on-site inspection by large expert teams of inspectors. First, the continuous risk scores generated by the CQC's IM statistical surveillance tool cannot predict inspection-based quality ratings of NHS hospital trusts (OR 0.38 (0.14 to 1.05) for Outstanding/Good, OR 0.94 (0.80 to 1.10) for Good/Requires improvement, and OR 0.90 (0.76 to 1.07) for Requires improvement/Inadequate). Second, the risk scores cannot be used more simply to distinguish the trusts performing poorly (those subsequently rated either 'Requires improvement' or 'Inadequate') from the trusts performing well (those subsequently rated either 'Good' or 'Outstanding') (OR 1.07 (0.91 to 1.26)). Classifying CQC's risk bandings 1-3 as high risk and 4-6 as low risk, 11 of the high risk trusts were performing well and 43 of the low risk trusts were performing poorly, resulting in an overall accuracy rate of 47.6%. Third, the risk scores cannot be used even more simply to distinguish the worst performing trusts (those subsequently rated 'Inadequate') from the remaining, better performing trusts (OR 1.11 (0.94 to 1.32)). Classifying CQC's risk banding 1 as high risk and 2-6 as low risk, the highest overall accuracy rate of 72.8% was achieved, but still only 6 of the 13 Inadequate trusts were correctly classified as being high risk. Since the IM statistical surveillance tool cannot predict the outcome of NHS hospital trust inspections, it cannot be used for prioritisation. A new approach to inspection planning is therefore required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  3. PSSMSearch: a server for modeling, visualization, proteome-wide discovery and annotation of protein motif specificity determinants.

    PubMed

    Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E

    2018-06-05

    There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
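
    A position-specific scoring matrix of the kind PSSMSearch builds can be derived from aligned motif-containing peptides with simple log-odds scoring. The sketch below is not the server's code: it assumes a uniform background and a fixed pseudocount, both of which are placeholder choices, and the peptides are hypothetical.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def build_pssm(peptides, pseudo=0.5):
    """Log-odds PSSM from equal-length aligned peptides (uniform background)."""
    length = len(peptides[0])
    counts = np.full((length, len(AA)), pseudo)
    for pep in peptides:
        for pos, aa in enumerate(pep):
            counts[pos, AA.index(aa)] += 1
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs / (1.0 / len(AA)))          # log-odds vs uniform background

def score(pssm, window):
    """Score one candidate window of the same length as the PSSM."""
    return sum(pssm[pos, AA.index(aa)] for pos, aa in enumerate(window))

# Hypothetical aligned motif instances
peptides = ["RRASVA", "RRGSLA", "KRASIA", "RKASVA"]
pssm = build_pssm(peptides)
print(round(score(pssm, "RRASVA"), 2), round(score(pssm, "DEPMNQ"), 2))
```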

  4. User’s guide for the Delaware River Basin Streamflow Estimator Tool (DRB-SET)

    USGS Publications Warehouse

    Stuckey, Marla H.; Ulrich, James E.

    2016-06-09

    Introduction: The Delaware River Basin Streamflow Estimator Tool (DRB-SET) is a tool for the simulation of streamflow at a daily time step for an ungaged stream location in the Delaware River Basin. DRB-SET was developed by the U.S. Geological Survey (USGS) and funded through WaterSMART as part of the National Water Census, a USGS research program on national water availability and use that develops new water accounting tools and assesses water availability at the regional and national scales. DRB-SET relates probability exceedances at a gaged location to those at an ungaged stream location. Once the ungaged stream location has been identified by the user, an appropriate streamgage is automatically selected in DRB-SET using streamflow correlation (map correlation method). Alternatively, the user can manually select a different streamgage or use the closest streamgage. A report file is generated documenting the reference streamgage and ungaged stream location information, basin characteristics, any warnings, baseline (minimally altered) and altered (affected by regulation, diversion, mining, or other anthropogenic activities) daily mean streamflow, and the mean and median streamflow. The estimated daily flows for the ungaged stream location can be easily exported as a text file that can be used as input into a statistical software package to determine additional streamflow statistics, such as flow duration exceedance or streamflow frequency statistics.
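
    The flow transfer that DRB-SET automates, matching exceedance (or, equivalently, nonexceedance) probabilities between a reference streamgage and the ungaged site, can be illustrated in a few lines. The sketch below uses synthetic flow records and an assumed flow-duration curve for the ungaged site; it shows the general QPPQ-style idea and is not the DRB-SET implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Ten years of daily mean flows at the reference (gaged) site, and a
# flow-duration curve assumed known for the ungaged site (e.g., from
# regional regression): flows at the 1st-99th nonexceedance percentiles.
gaged_flows = rng.lognormal(mean=3.0, sigma=1.0, size=3650)
probs = np.linspace(0.01, 0.99, 99)                       # nonexceedance probabilities
ungaged_fdc = np.quantile(rng.lognormal(2.5, 0.9, 3650), probs)

# For each day, find the nonexceedance probability of the gaged flow,
# then read the ungaged flow at the same probability (QPPQ-style transfer).
gaged_sorted = np.sort(gaged_flows)
daily_prob = np.searchsorted(gaged_sorted, gaged_flows) / len(gaged_flows)
ungaged_daily = np.interp(daily_prob, probs, ungaged_fdc)
print("estimated mean daily flow at the ungaged site:", round(float(ungaged_daily.mean()), 1))
```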

  5. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex GCM outputs. In this regard, it is necessary to develop a tool that can simplify the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data that can be used for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios” (SiBiaS). The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to perform their statistical bias correction relative to reference observational data. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
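
    The abstract does not spell out the correction algorithm, but a common statistical bias-correction approach for GCM output is empirical quantile mapping, sketched below on synthetic rainfall series; the specific method implemented in SiBiaS may differ.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: transfer the observed distribution
    onto model values via their historical quantiles."""
    q = np.linspace(0.01, 0.99, 99)
    m_q = np.quantile(model_hist, q)
    o_q = np.quantile(obs_hist, q)
    # For each future model value, find its quantile in the historical
    # model climate and replace it with the observed value at that quantile.
    ranks = np.interp(model_future, m_q, q)
    return np.interp(ranks, q, o_q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 6.0, 3000)            # "observed" daily rainfall
mod = rng.gamma(2.0, 4.0, 3000)            # biased model climatology
fut = rng.gamma(2.2, 4.0, 3000)            # future model projection
corrected = quantile_map(mod, obs, fut)
print(mod.mean(), obs.mean(), fut.mean(), corrected.mean())
```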

  6. Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary

    DTIC Science & Technology

    2003-02-01

    Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three... 2.2. Vibration Diagnostics (VD). Parallel to the NDT measurements, the Statistical Energy Analysis (SEA) was applied as a vibration diagnostic tool... noises were analysed with a dual-channel real-time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small

  7. Toward Determining ATPase Mechanism in ABC Transporters: Development of the Reaction Path–Force Matching QM/MM Method

    PubMed Central

    Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.

    2016-01-01

    Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
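
    The force-matching step at the heart of RP-FM amounts to a least-squares fit: adjust the cheap model's parameters so that its forces along the reaction path reproduce the ab initio reference forces. The toy sketch below fits two parameters of a Morse potential to "reference" forces on a one-dimensional path; it only illustrates the fitting idea, not the actual SE/MM reparameterization used in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# 1D "reaction path" geometries and reference (ab initio-like) forces,
# generated here from a Morse potential with known parameters.
r = np.linspace(0.8, 2.5, 40)
D_ref, a_ref, r0 = 100.0, 1.8, 1.1
force_ref = -2 * D_ref * a_ref * (1 - np.exp(-a_ref * (r - r0))) * np.exp(-a_ref * (r - r0))

def model_force(params):
    D, a = params
    e = np.exp(-a * (r - r0))
    return -2 * D * a * (1 - e) * e

def residuals(params):
    # Force-matching objective: difference between model and reference forces
    return model_force(params) - force_ref

fit = least_squares(residuals, x0=[50.0, 1.0])
print("fitted (D, a):", np.round(fit.x, 3), "true:", (D_ref, a_ref))
```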

  8. Mechanical problem-solving strategies in left-brain damaged patients and apraxia of tool use.

    PubMed

    Osiurak, François; Jarry, Christophe; Lesourd, Mathieu; Baumard, Josselin; Le Gall, Didier

    2013-08-01

    Left brain damage (LBD) can impair the ability to use familiar tools (apraxia of tool use) as well as novel tools to solve mechanical problems. Thus far, the emphasis has been placed on quantitative analyses of patients' performance. Nevertheless, the question still to be answered is, what are the strategies employed by those patients when confronted with tool use situations? To answer it, we asked 16 LBD patients and 43 healthy controls to solve mechanical problems by means of several potential tools. To specify the strategies, we recorded the time spent in performing four kinds of action (no manipulation, tool manipulation, box manipulation, and tool-box manipulation) as well as the number of relevant and irrelevant tools grasped. We compared LBD patients' performance with that of controls who encountered difficulties with the task (controls-) or not (controls+). Our results indicated that LBD patients grasped a higher number of irrelevant tools than controls+ and controls-. Concerning time allocation, controls+ and controls- spent significantly more time in performing tool-box manipulation than LBD patients. These results are inconsistent with the possibility that LBD patients could engage in trial-and-error strategies and, rather, suggest that they tend to be perplexed. These findings seem to indicate that the inability to reason about the objects' physical properties might prevent LBD patients from following any problem-solving strategy. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Effect of Correlated Rotational Noise

    NASA Astrophysics Data System (ADS)

    Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna

    The traditional model of a self-propelled particle (SPP) is one where the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is driven by colored noise instead of the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP whose reorientation process is driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics through their ability to accurately capture the particle's response to external aligning fields.
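
    A minimal numerical picture of the model discussed here is a two-dimensional self-propelled particle whose orientation is driven by an Ornstein-Uhlenbeck (exponentially correlated) process rather than white noise. The sketch below is an illustrative simulation, not taken from the talk, and its parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, steps = 1e-3, 20000
v0 = 1.0                 # self-propulsion speed
tau = 0.5                # correlation time of the orientational noise
D = 1.0                  # noise strength

x = np.zeros(2)
theta, eta = 0.0, 0.0    # orientation angle and its colored driving noise
traj = np.empty((steps, 2))
for t in range(steps):
    # Ornstein-Uhlenbeck process for the angular noise (colored noise)
    eta += (-eta / tau) * dt + np.sqrt(2 * D / tau**2 * dt) * rng.normal()
    theta += eta * dt
    x += v0 * np.array([np.cos(theta), np.sin(theta)]) * dt
    traj[t] = x

print("net displacement after the run:", np.linalg.norm(traj[-1]))
```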

  10. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2018-04-01

    Friction stir welding (FSW) has been the most attractive solid-state welding process, as it offers numerous advantages such as good mechanical and metallurgical properties. Non-weldable aluminium alloys such as the 5XXX and 7XXX series can be readily joined by this process. In this study, a mathematical model has been developed and experiments were successfully performed to evaluate the mechanical properties of FSW on similar aluminium alloys, i.e. AA1100, for different process parameters and mainly two kinds of tool geometry (straight cylindrical and conical, i.e. cylindrically tapered, pins with a flat shoulder). Tensile strength and microhardness for different process parameters are reported for the welded plate samples. It was noticed that in FSW of a similar alloy with a tool made of SS-310 tool steel, friction is the major contributor to the heat generation. It was seen that tool geometry, tool rotational speed, plunging force of the tool and traverse speed have significant effects on the tensile strength and hardness of friction stir welded joints.

  11. Bioinformatic tools for inferring functional information from plant microarray data: tools for the first steps.

    PubMed

    Page, Grier P; Coulibaly, Issa

    2008-01-01

    Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have tried for a balance between free and commercial systems. We have organized the tools by topics including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).

  12. The Statistical Handbook on Technology.

    ERIC Educational Resources Information Center

    Berinstein, Paula

    This volume tells stories about the tools we use, but these narratives are told in numbers rather than in words. Organized by various aspects of society, each chapter uses tables and statistics to examine everything from budgets, costs, sales, trade, employment, patents, prices, usage, access and consumption. In each chapter, each major topic is…

  13. Interactive Visualisations and Statistical Literacy

    ERIC Educational Resources Information Center

    Sutherland, Sinclair; Ridgway, Jim

    2017-01-01

    Statistical literacy involves engagement with the data one encounters. New forms of data and new ways to engage with data--notably via interactive data visualisations--are emerging. Some of the skills required to work effectively with these new visualisation tools are described. We argue that interactive data visualisations will have as profound…

  14. Application of Transformations in Parametric Inference

    ERIC Educational Resources Information Center

    Brownstein, Naomi; Pensky, Marianna

    2008-01-01

    The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…

  15. Bayesian Posterior Odds Ratios: Statistical Tools for Collaborative Evaluations

    ERIC Educational Resources Information Center

    Hicks, Tyler; Rodríguez-Campos, Liliana; Choi, Jeong Hoon

    2018-01-01

    To begin statistical analysis, Bayesians quantify their confidence in modeling hypotheses with priors. A prior describes the probability of a certain modeling hypothesis apart from the data. Bayesians should be able to defend their choice of prior to a skeptical audience. Collaboration between evaluators and stakeholders could make their choices…

  16. Low-Cost, Full-Field Surface Profiling Tool for Mechanical Damage Evaluation

    DOT National Transportation Integrated Search

    2010-03-03

    In this project, Intelligent Optical Systems (IOS) developed an inexpensive, full-field, surfaceprofiling tool for mechanical damage evaluation based on the processing of a single digital image. Little operator training is required for acquiring the ...

  17. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    NASA Astrophysics Data System (ADS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on the surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
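
    Wear models of this kind are typically integrated over cutting time to obtain a wear history. As a generic illustration only, using a Usui-type adhesive wear-rate law with made-up constants rather than the specific model proposed in this paper, the sketch below integrates dW/dt = A·σ_n·v_s·exp(-B/T) while the interface temperature rises toward a steady state.

```python
import numpy as np

# Usui-type adhesive wear-rate law: dW/dt = A * sigma_n * v_s * exp(-B / T)
# Illustrative constants only; sigma_n in MPa, v_s in m/s, T in K, W in mm.
A, B = 1.0e-5, 5300.0
sigma_n, v_s = 1200.0, 3.0

dt, t_end = 0.5, 600.0                    # time step and total cutting time, s
times = np.arange(0.0, t_end + dt, dt)
temperature = 900.0 + 300.0 * (1 - np.exp(-times / 120.0))   # tool heats up over time
rate = A * sigma_n * v_s * np.exp(-B / temperature)          # instantaneous wear rate, mm/s
wear = np.cumsum(rate) * dt                                  # integrated wear depth, mm

print(f"estimated wear depth after {t_end:.0f} s of cutting: {wear[-1]:.3f} mm")
```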

  18. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which can be categorized into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes convenient and straightforward calculation of statistics in research. Additionally, we provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis. PMID:27792763
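
    Two of the simplest statistics in the first and third categories can be computed directly from allele frequencies, which is exactly the kind of intermediate metadata PopSc accepts. The sketch below (not PopSc code) computes expected heterozygosity per population and a basic Wright's Fst for one biallelic locus, assuming equal population sizes; the frequencies are hypothetical.

```python
import numpy as np

def expected_heterozygosity(p):
    """Expected heterozygosity for a biallelic locus with allele frequency p."""
    return 2 * p * (1 - p)

def fst_biallelic(freqs):
    """Wright's Fst = (Ht - Hs) / Ht across populations of equal size."""
    freqs = np.asarray(freqs, float)
    hs = np.mean([expected_heterozygosity(p) for p in freqs])  # mean within-population
    ht = expected_heterozygosity(freqs.mean())                 # total (pooled) heterozygosity
    return (ht - hs) / ht

pops = [0.10, 0.45, 0.80]        # allele frequencies in three populations
print([round(expected_heterozygosity(p), 3) for p in pops])
print("Fst:", round(fst_biallelic(pops), 3))
```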

  19. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which can be categorized into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis.

  20. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

    A year-long research program (Aug 2016 - May 2017) on 'Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; and v) Statistics, computation, and modeling in cosmology. A brief description of the work under way by each of these groups will be given. Overlaps among the various working groups will also be highlighted. How the wider astronomy community can both participate in and benefit from the activities will be briefly mentioned.

  1. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
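
    The abstract targets Java-based teaching tools; as a language-agnostic illustration of the underlying resampling idea, the short sketch below computes a nonparametric percentile bootstrap confidence interval for a sample mean from simulated data.

```python
import numpy as np

rng = np.random.default_rng(3)
sample = rng.exponential(scale=2.0, size=50)      # simulated "laboratory" measurements

def bootstrap_ci(data, stat=np.mean, n_boot=5000, alpha=0.05, rng=rng):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    boots = np.array([stat(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_boot)])
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

low, high = bootstrap_ci(sample)
print(f"sample mean = {sample.mean():.2f}, 95% bootstrap CI = ({low:.2f}, {high:.2f})")
```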

  2. Technology-Supported Mathematics Environments: Telecollaboration in a Secondary Statistics Classroom

    ERIC Educational Resources Information Center

    Staley, John; Moyer-Packenham, Patricia; Lynch, Monique C.

    2005-01-01

    The Internet, an exciting and radically different medium infiltrating pop culture, business, and education, is also a powerful educational tool with teaching and learning potential for mathematics. Web-based instructional tools allow students and teachers to actively and interactively participate in the learning process (Lynch, Moyer, Frye & Suh,…

  3. Data Mining in Health and Medical Information.

    ERIC Educational Resources Information Center

    Bath, Peter A.

    2004-01-01

    Presents a literature review that covers the following topics related to data mining (DM) in health and medical information: the potential of DM in health and medicine; statistical methods; evaluation of methods; DM tools for health and medicine; inductive learning of symbolic rules; application of DM tools in diagnosis and prognosis; and…

  4. Harnessing the complexity of gene expression data from cancer: from single gene to structural pathway methods

    PubMed Central

    2012-01-01

    High-dimensional gene expression data provide a rich source of information because they capture the expression level of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are suitable for revealing systems-related properties inside a cell, e.g., in order to elucidate the molecular mechanisms of complex diseases like breast or prostate cancer. However, this depends not only on the sample size and the correlation structure of a data set, but also on the statistical hypotheses tested. Many different approaches have been developed over the years to analyze gene expression data to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure in pathways. In this paper, we review statistical methods for all three types of approaches, including subtypes, in the context of cancer data, provide links to software implementations and tools, and also address the general problem of multiple hypothesis testing. Further, we provide recommendations for the selection of such analysis methods. Reviewers: This article was reviewed by Arcady Mushegian, Byung-Soo Kim and Joel Bader. PMID:23227854
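
    Because all three classes of approach end up testing thousands of hypotheses, some form of multiple-testing correction is unavoidable. As a generic illustration, not tied to any specific method in the review, the sketch below applies the Benjamini-Hochberg false discovery rate procedure to a small vector of p-values.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of hypotheses rejected at FDR level alpha."""
    p = np.asarray(pvals, float)
    order = np.argsort(p)
    ranked = p[order]
    n = len(p)
    thresholds = alpha * np.arange(1, n + 1) / n     # BH step-up thresholds
    below = ranked <= thresholds
    reject = np.zeros(n, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()               # largest rank meeting its threshold
        reject[order[:k + 1]] = True
    return reject

pvals = [0.0002, 0.009, 0.013, 0.04, 0.08, 0.3, 0.6, 0.91]
print(benjamini_hochberg(pvals, alpha=0.05))
```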

  5. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises due to nature of the measurement mechanism and the inherent space-time ambiguity from the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.

  6. Estimating topological properties of weighted networks from limited information

    NASA Astrophysics Data System (ADS)

    Gabrielli, Andrea; Cimini, Giulio; Garlaschelli, Diego; Squartini, Angelo

    A typical problem met when studying complex systems is the limited information available on their topology, which hinders our understanding of their structural and dynamical properties. A paramount example is provided by financial networks, whose data are privacy protected. Yet, the estimation of systemic risk strongly depends on the detailed structure of the interbank network. The resulting challenge is that of using aggregate information to statistically reconstruct a network and correctly predict its higher-order properties. Standard approaches either generate unrealistically dense networks, or fail to reproduce the observed topology by assigning homogeneous link weights. Here we develop a reconstruction method, based on statistical mechanics concepts, that exploits the empirical link density in a highly non-trivial way. Technically, our approach consists in the preliminary estimation of node degrees from empirical node strengths and link density, followed by a maximum-entropy inference based on a combination of empirical strengths and estimated degrees. Our method is successfully tested on the international trade network and the interbank money market, and represents a valuable tool for gaining insights on privacy-protected or partially accessible systems. Acknowledgement to the "Growthcom" ICT-EC project (Grant No. 611272) and the "Crisislab" Italian project.
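
    As a hedged sketch of the degree-estimation step described above, the snippet below uses a fitness-model ansatz of the kind common in this literature, calibrating a single parameter z so that the expected number of links matches the observed link density. This is a schematic reading of the method, not the authors' code, and the node strengths are invented.

```python
import numpy as np
from scipy.optimize import brentq

def estimate_degrees(strengths, n_links):
    """Estimate expected node degrees from strengths and the total link count,
    using the ansatz p_ij = z*s_i*s_j / (1 + z*s_i*s_j)."""
    s = np.asarray(strengths, dtype=float)
    i, j = np.triu_indices(s.size, k=1)

    def expected_links(z):
        x = z * s[i] * s[j]
        return np.sum(x / (1.0 + x)) - n_links

    z = brentq(expected_links, 1e-12, 1e12)            # calibrate z to the link density
    p = np.zeros((s.size, s.size))
    p[i, j] = p[j, i] = z * s[i] * s[j] / (1.0 + z * s[i] * s[j])
    return p.sum(axis=1), z

strengths = [50.0, 20.0, 10.0, 5.0, 2.0, 1.0]          # hypothetical node strengths
degrees, z = estimate_degrees(strengths, n_links=7)
print(np.round(degrees, 2), z)
```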

  7. A statistical design for testing apomictic diversification through linkage analysis.

    PubMed

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between the two plant populations or species used as parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.
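
    The EM mechanism can be illustrated on a deliberately simplified model, not the paper's reciprocal-cross multinomial likelihood: assume each offspring is either a clonal copy of the mother, with probability equal to the apomixis rate, or sexual, in which case the maternal genotype reappears with an assumed Mendelian probability. The counts below are invented.

```python
import numpy as np

def em_apomixis_rate(n_maternal, n_other, p_mendel=0.5, n_iter=100):
    """EM estimate of the apomixis rate `a` in a two-class toy model:
    an offspring is a clone of the mother with probability a, or sexual with
    probability (1 - a), in which case it shows the maternal genotype with
    Mendelian probability p_mendel."""
    N = n_maternal + n_other
    a = 0.5                                        # initial guess
    for _ in range(n_iter):
        # E-step: posterior probability that a maternal-genotype offspring is clonal.
        w = a / (a + (1.0 - a) * p_mendel)
        # M-step: update the apomixis rate from the expected clonal count.
        a = (w * n_maternal) / N
    return a

# Invented offspring genotype counts for illustration.
print(round(em_apomixis_rate(n_maternal=80, n_other=20), 3))   # converges to 0.6 here
```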

  8. Analysis of alterations in white matter integrity of adult patients with comitant exotropia.

    PubMed

    Li, Dan; Li, Shenghong; Zeng, Xianjun

    2018-05-01

    Objective This study was performed to investigate structural abnormalities of the white matter in patients with comitant exotropia using the tract-based spatial statistics (TBSS) method. Methods Diffusion tensor imaging data from magnetic resonance images of the brain were collected from 20 patients with comitant exotropia and 20 age- and sex-matched healthy controls. The FMRIB Software Library was used to compute the diffusion measures, including fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD), and radial diffusivity (RD). These measures were obtained using voxel-wise statistics with threshold-free cluster enhancement. Results The FA values in the right inferior fronto-occipital fasciculus (IFO) and right inferior longitudinal fasciculus were significantly higher and the RD values in the bilateral IFO, forceps minor, left anterior corona radiata, and left anterior thalamic radiation were significantly lower in the comitant exotropia group than in the healthy controls. No significant differences in the MD or AD values were found between the two groups. Conclusions Alterations in FA and RD values may indicate the underlying neuropathologic mechanism of comitant exotropia. The TBSS method can be a useful tool to investigate neuronal tract participation in patients with this disease.
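
    The four diffusion measures compared in the study are standard functions of the diffusion tensor eigenvalues. The sketch below computes them for a single voxel; it is independent of the FMRIB Software Library pipeline, and the eigenvalues are illustrative.

```python
import numpy as np

def dti_scalars(eigvals):
    """Compute FA, MD, AD, RD from the three eigenvalues of a diffusion tensor."""
    l1, l2, l3 = sorted(eigvals, reverse=True)          # l1 >= l2 >= l3
    md = (l1 + l2 + l3) / 3.0                            # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = np.sqrt(1.5 * num / den) if den > 0 else 0.0    # fractional anisotropy
    ad = l1                                              # axial diffusivity
    rd = (l2 + l3) / 2.0                                 # radial diffusivity
    return fa, md, ad, rd

# Illustrative eigenvalues (units of 1e-3 mm^2/s), roughly typical of white matter.
print(tuple(round(v, 3) for v in dti_scalars([1.7, 0.4, 0.3])))
```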

  9. Predictability of catastrophic events: Material rupture, earthquakes, turbulence, financial crashes, and human birth

    PubMed Central

    Sornette, Didier

    2002-01-01

    We propose that catastrophic events are “outliers” with statistically different properties than the rest of the population and result from mechanisms involving amplifying critical cascades. We describe a unifying approach for modeling and predicting these catastrophic events or “ruptures,” that is, sudden transitions from a quiescent state to a crisis. Such ruptures involve interactions between structures at many different scales. Applications and the potential for prediction are discussed in relation to the rupture of composite materials, great earthquakes, turbulence, and abrupt changes of weather regimes, financial crashes, and human parturition (birth). Future improvements will involve combining ideas and tools from statistical physics and artificial/computational intelligence, to identify and classify possible universal structures that occur at different scales, and to develop application-specific methodologies to use these structures for prediction of the “crises” known to arise in each application of interest. We live on a planet and in a society with intermittent dynamics rather than a state of equilibrium, and so there is a growing and urgent need to sensitize students and citizens to the importance and impacts of ruptures in their multiple forms. PMID:11875205

  10. An Artificial Intelligence Classification Tool and Its Application to Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Haglin, David J.; Roiger, Richard J.; Giblin, Timothy; Paciesas, William S.; Pendleton, Geoffrey N.; Mallozzi, Robert S.

    2004-01-01

    Although gamma-ray bursts (GRBs) are the most energetic phenomena in the known universe, their astrophysics has still proven difficult to understand. It has only been within the past five years that the GRB distance scale has been firmly established, on the basis of a few dozen bursts with x-ray, optical, and radio afterglows. The afterglows indicate source redshifts of z=1 to z=5, total energy outputs of roughly 10(exp 52) ergs, and energy confined to the far x-ray to near gamma-ray regime of the electromagnetic spectrum. The multi-wavelength afterglow observations have thus far provided more insight into the nature of the GRB mechanism than the GRB observations themselves; far more papers have been written about the few observed gamma-ray burst afterglows in the past few years than about the thousands of detected gamma-ray bursts. One reason the GRB central engine is still so poorly understood is that GRBs have complex, overlapping characteristics that do not appear to be produced by one homogeneous process. At least two subclasses have been found on the basis of duration, spectral hardness, and fluence (time-integrated flux); Class 1 bursts are softer, longer, and brighter than Class 2 bursts (with two-second durations indicating a rough division). A third GRB subclass, overlapping the other two, has been identified using statistical clustering techniques; Class 3 bursts are intermediate between Class 1 and Class 2 bursts in brightness and duration, but are softer than Class 1 bursts. We are developing a tool to aid scientists in the study of GRB properties. In the process of developing this tool, we are building a large gamma-ray burst classification database. We are also scientifically analyzing some GRB data as we develop the tool. Tool development thus proceeds in tandem with the dataset for which it is being designed. The tool invokes a modified KDD (Knowledge Discovery in Databases) process, which is described as follows.

  11. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  12. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
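
    The contrast drawn above between the two-point calculation and regression over all observations can be illustrated with a toy competition assay. The counts below are invented and the snippet is not the vFitness implementation; it only shows the slope-of-log-ratio idea.

```python
import numpy as np

# Hypothetical competition-assay counts of mutant (m) and wild-type (w) virus.
days = np.array([0, 2, 4, 6, 8], dtype=float)
m = np.array([1.0e3, 2.6e3, 7.1e3, 1.8e4, 5.0e4])
w = np.array([1.0e3, 1.9e3, 3.5e3, 6.8e3, 1.3e4])

log_ratio = np.log(m / w)

# Regression over all time points: the slope is the net growth-rate
# difference, a relative-fitness measure that uses every observation.
slope_all, intercept = np.polyfit(days, log_ratio, 1)

# Two-point calculation uses only the first and last samples.
slope_two = (log_ratio[-1] - log_ratio[0]) / (days[-1] - days[0])

print(f"regression slope: {slope_all:.3f} per day, two-point: {slope_two:.3f} per day")
```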

  13. omiRas: a Web server for differential expression analysis of miRNAs derived from small RNA-Seq data.

    PubMed

    Müller, Sören; Rycak, Lukas; Winter, Peter; Kahl, Günter; Koch, Ina; Rotter, Björn

    2013-10-15

    Small RNA deep sequencing is widely used to characterize non-coding RNAs (ncRNAs) differentially expressed between two conditions, e.g. healthy and diseased individuals, and to reveal insights into molecular mechanisms underlying condition-specific phenotypic traits. The ncRNAome is composed of a multitude of RNAs, such as transfer RNA, small nucleolar RNA and microRNA (miRNA), to name a few. Here we present omiRas, a Web server for the annotation, comparison and visualization of interaction networks of ncRNAs derived from next-generation sequencing experiments of two different conditions. The Web tool allows the user to submit raw sequencing data and results are presented as: (i) static annotation results including length distribution, mapping statistics, alignments and quantification tables for each library as well as lists of differentially expressed ncRNAs between conditions and (ii) an interactive network visualization of user-selected miRNAs and their target genes based on the combination of several miRNA-mRNA interaction databases. The omiRas Web server is implemented in Python, PostgreSQL, R and can be accessed at: http://tools.genxpro.net/omiras/.

  14. Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization

    NASA Astrophysics Data System (ADS)

    Daglis, I. A.; Bourdarie, S.; Khotyaintsev, Y.; Santolik, O.; Horne, R.; Mann, I.; Turner, D.; Anastasiadis, A.; Angelopoulos, V.; Balasis, G.; Chatzichristou, E.; Cully, C.; Georgiou, M.; Glauert, S.; Grison, B.; Kolmasova, I.; Lazaro, D.; Macusova, E.; Maget, V.; Papadimitriou, C.; Ropokis, G.; Sandberg, I.; Usanova, M.

    2012-09-01

    We present the concept, objectives and expected impact of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project, which is being implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employs multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energization and loss. Particular attention is paid to the role of ULF/VLF waves. A database containing properties of the waves is being created and will be made available to the scientific community. Based on the wave database, a statistical model of the wave activity dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region will be developed. Furthermore, we will incorporate multi-spacecraft particle measurements into data assimilation tools, aiming at a new understanding of the causal relationships between ULF/VLF waves and radiation belt dynamics. Data assimilation techniques have been proven to be a valuable tool in the field of radiation belts, able to guide 'the best' estimate of the state of a complex system.

  15. Virtual tool mark generation for efficient striation analysis in forensic science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekstrand, Laura

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5 degrees and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles. Preliminary results are quite promising. In a study with both sides of 6 screwdriver tips and 34 corresponding marks, the method distinguished known matches from known non-matches with zero false positive matches and only two matches mistaken for non-matches. For matches, it could predict the correct marking angle within 5-10 degrees. Moreover, on a standard desktop computer, the virtual marking software is capable of cleaning 3D tip and plate scans in minutes and producing a virtual mark and comparing it to a real mark in seconds. These results support several of the professional conclusions of the tool mark analysis community, including the idea that marks produced by the same tool only match if they are made at similar angles. The method also displays the potential to automate part of the comparison process, freeing the examiner to focus on other tasks, which is important in busy, backlogged crime labs. Finally, the method offers the unique chance to directly link an evidence mark to the tool that produced it while reducing potential damage to the evidence.

  16. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  17. Exploring students’ perceived and actual ability in solving statistical problems based on Rasch measurement tools

    NASA Astrophysics Data System (ADS)

    Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati

    2017-09-01

    One of the important skills required from any student who is learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This will enable them to arrive at a conclusion and make significant contributions and decisions for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on testing of hypotheses which required them to solve the problems using the confidence interval, traditional and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and is listed as one of the concepts that students find difficult to grasp. The objectives of this study are to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured based on instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons which include their lack of understanding of confidence intervals and probability values.
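
    For readers unfamiliar with the Rasch model underlying Winsteps, the success probability is logistic in the difference between person ability and item difficulty. The sketch below is a generic illustration with invented measures, not the study's analysis.

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

abilities = np.array([-1.0, 0.0, 1.0, 2.0])   # hypothetical person measures (logits)
difficulty = 1.5                               # hypothetical difficulty of a hard item
print(np.round(rasch_prob(abilities, difficulty), 2))
# Persons whose ability sits well below the item difficulty have a low
# probability of success, which is how a Wright map separates persons and items.
```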

  18. How do energetic ions damage metallic surfaces?

    DOE PAGES

    Osetskiy, Yury N.; Calder, Andrew F.; Stoller, Roger E.

    2015-02-20

    Surface modification under bombardment by energetic ions is observed under different conditions in structural and functional materials, and can be either an unavoidable effect of those conditions or a targeted modification to enhance material properties. Understanding the basic mechanisms is necessary for predicting property changes. The mechanisms activated during ion irradiation operate at the atomic scale, and atomic-scale modeling is the most suitable tool to study these processes. In this paper we present results of an extensive simulation program aimed at developing an understanding of primary surface damage in iron by energetic particles. We simulated 25 keV self-ion bombardment of Fe thin films with (100) and (110) surfaces at room temperature. A large number of simulations, ~400, were carried out to allow a statistically significant treatment of the results. The particular mechanism of surface damage depends on how the destructive supersonic shock wave generated by the displacement cascade interacts with the free surface. Three basic scenarios were observed, with the limiting cases being damage created far below the surface with little or no impact on the surface itself, and extensive direct surface damage on the timescale of a few picoseconds. In some instances, formation of large <100> vacancy loops beneath the free surface was observed, which may explain some earlier experimental observations.

  19. Acoustic emission during quench training of superconducting accelerator magnets

    NASA Astrophysics Data System (ADS)

    Marchevsky, M.; Sabbi, G.; Bajas, H.; Gourlay, S.

    2015-07-01

    Acoustic emission (AE) sensing is a viable tool for superconducting magnet diagnostics. Using in-house developed cryogenic amplified piezoelectric sensors, we conducted AE studies during quench training of the US LARP's high-field quadrupole HQ02 and the LBNL's high-field dipole HD3. For both magnets, AE bursts were observed, with spike amplitude and frequency increasing toward the quench current during current up-ramps. In the HQ02, the AE onset upon current ramping is distinct and exhibits a clear memory of the previously-reached quench current (Kaiser effect). On the other hand, in the HD3 magnet the AE amplitude begins to increase well before the previously-reached quench current (Felicity effect), suggesting ongoing progressive mechanical motion in the coils. A clear difference in the AE signature exists between the untrained and trained mechanical states in HD3. Time intervals between the AE signals detected at the opposite ends of HD3 coils were processed using a combination of narrow-band-pass filtering, threshold crossing and correlation algorithms, and the spatial distributions of AE sources and the mechanical energy release were calculated. Both distributions appear to be consistent with the quench location distribution. Energy statistics of the AE spikes exhibit a power-law scaling typical of a self-organized critical state.

  20. On the influence of Ti-Al intermetallic coating architecture on mechanical properties and wear resistance of end mills

    NASA Astrophysics Data System (ADS)

    Vardanyan, E. L.; Budilov, V. V.; Ramazanov, K. N.; Ataullin, Z. R.

    2017-07-01

    Thin-film wear-resistant coatings are widely used to increase the life and efficiency of metal cutting tools. This paper shows the results of a study on the influence of architecture (number, sequence and thickness of layers) of wear-resistant coatings on the physical, mechanical and operational properties of end mills. Coatings consisting of alternating Ti-Al/Ti-Al-N layers of equal thickness demonstrated the best physical and mechanical properties. The durability of coated tools when machining chromium-vanadium steel increased twofold compared to uncoated tools.

  1. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
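
    As a minimal illustration of the control-chart idea described above (not the AISC prototype), the sketch below computes individuals-chart limits from an in-control baseline and flags new points outside them; the data are invented.

```python
import numpy as np

def xchart_limits(baseline):
    """Individuals (X) chart limits from an in-control baseline:
    sigma is estimated from the average moving range (d2 = 1.128)."""
    x = np.asarray(baseline, dtype=float)
    sigma_hat = np.abs(np.diff(x)).mean() / 1.128
    center = x.mean()
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

# Invented in-control baseline measurements.
baseline = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2]
cl, lcl, ucl = xchart_limits(baseline)

new_points = [10.1, 9.9, 11.6, 10.0]                   # one shifted observation
flags = [i for i, v in enumerate(new_points) if not lcl <= v <= ucl]
print(f"CL={cl:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, flagged new points: {flags}")
```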

  2. [Is there life beyond SPSS? Discover R].

    PubMed

    Elosua Oliden, Paula

    2009-11-01

    R is a GNU statistical and programming environment with very high graphical capabilities. It is very powerful for research purposes, but it is also an exceptional tool for teaching. R is composed of more than 1400 packages that allow it to be used for simple statistics as well as for applying the most complex and most recent formal models. Using graphical interfaces like the Rcommander package permits working in user-friendly environments similar to the graphical environment used by SPSS. This last characteristic allows non-statisticians to overcome the obstacle of accessibility, and it makes R the best tool for teaching. Is there anything better? Open, free, affordable, accessible and always on the cutting edge.

  3. Identifying and Investigating Unexpected Response to Treatment: A Diabetes Case Study.

    PubMed

    Ozery-Flato, Michal; Ein-Dor, Liat; Parush-Shear-Yashuv, Naama; Aharonov, Ranit; Neuvirth, Hani; Kohn, Martin S; Hu, Jianying

    2016-09-01

    The availability of electronic health records creates fertile ground for developing computational models of various medical conditions. We present a new approach for detecting and analyzing patients with unexpected responses to treatment, building on machine learning and statistical methodology. Given a specific patient, we compute a statistical score for the deviation of the patient's response from responses observed in other patients having similar characteristics and medication regimens. These scores are used to define cohorts of patients showing deviant responses. Statistical tests are then applied to identify clinical features that correlate with these cohorts. We implement this methodology in a tool that is designed to assist researchers in the pharmaceutical field to uncover new features associated with reduced response to a treatment. It can also aid physicians by flagging patients who are not responding to treatment as expected and hence deserve more attention. The tool provides comprehensive visualizations of the analysis results and the supporting data, both at the cohort level and at the level of individual patients. We demonstrate the utility of our methodology and tool in a population of type II diabetic patients, treated with antidiabetic drugs, and monitored by the HbA1C test.
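
    A schematic, simplified version of the deviation score described above is a z-score of the patient's response against responses of similar, similarly treated patients; the HbA1c changes below are invented, and the paper's actual scoring methodology is more elaborate.

```python
import numpy as np

def deviation_score(patient_response, similar_responses):
    """Z-score of a patient's treatment response relative to responses of
    patients with similar characteristics and medication regimens."""
    ref = np.asarray(similar_responses, dtype=float)
    return (patient_response - ref.mean()) / ref.std(ddof=1)

# Hypothetical HbA1c changes (%) after starting the same antidiabetic regimen.
cohort_changes = [-1.2, -0.9, -1.5, -1.1, -0.8, -1.3, -1.0, -1.4]
patient_change = +0.2                       # this patient's HbA1c barely moved

z = deviation_score(patient_change, cohort_changes)
print(f"deviation score = {z:.2f}")         # a large positive score flags a deviant (non-)response
```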

  4. Physical concepts in the development of constitutive equations

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1985-01-01

    Proposed viscoplastic material models include observed material response in their formulation but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.

  5. Using health statistics: a Nightingale legacy.

    PubMed

    Schloman, B F

    2001-01-01

    No more forceful example of the value of using health statistics to understand and improve health conditions exists than that displayed by Florence Nightingale. The recent book by Dossey (1999), Florence Nightingale: Mystic, Visionary, Healer, relates the dramatic tale of Nightingale's use of statistics to understand the causes of deaths in the Crimean War and of her advocacy to standardize the collection of medical data within the army and in civilian hospitals. For her, the use of health statistics was a major tool to improve health and influence public opinion.

  6. System for exchanging tools and end effectors on a robot

    DOEpatents

    Burry, D.B.; Williams, P.M.

    1991-02-19

    A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits the engagement and disengagement of the tool or end effector without the need for a rotational orientation of the tool to the end effector/collet interface. As the tool-changing system is not located on the robot arm, no umbilical cords are located on the robot. 12 figures.

  7. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

    Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
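
    A one-sided cumulative sum (CUSUM) chart of the kind named in the report can be sketched as follows; the reference value k, decision limit h, and machinery readings are illustrative assumptions, not values from the study.

```python
import numpy as np

def cusum_upper(x, target, k, h):
    """One-sided (upper) CUSUM: accumulate exceedances above target + k
    and signal when the cumulative sum crosses the decision limit h."""
    s, alarms = 0.0, []
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target - k))
        if s > h:
            alarms.append(i)
    return alarms

# Hypothetical bearing-temperature readings drifting upward after sample 10.
rng = np.random.default_rng(0)
readings = np.concatenate([rng.normal(70, 1, 10), rng.normal(72, 1, 10)])
print(cusum_upper(readings, target=70.0, k=0.5, h=4.0))
```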

  8. RooStatsCms: A tool for analysis modelling, combination and statistical studies

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Schott, G.; Quast, G.

    2010-04-01

    RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.

  9. Statistical Relational Learning (SRL) as an Enabling Technology for Data Acquisition and Data Fusion in Video

    DTIC Science & Technology

    2013-05-02

    Statistical Relational Learning (SRL) as an Enabling Technology for Data Acquisition and Data Fusion in Video...particular, it is important to reason about which portions of video require expensive analysis and storage. This project aims to make these...inferences using new and existing tools from Statistical Relational Learning (SRL). SRL is a recently emerging technology that enables the effective

  10. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups rather than as individuals, and are key tools for assessing interventions in health research when treatment contamination is likely or individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and the method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percentage of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling of missing data in practice remains suboptimal. Researchers and applied statisticians should use missing data methods that are valid under plausible assumptions, in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analyses should be performed with weakened assumptions regarding the missing data mechanism to explore the robustness of the results reported in the primary analysis.

  11. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes' broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
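
    The MaxEnt algorithm itself is easy to demonstrate: given discrete states and a single mean constraint, the maximum-entropy distribution is exponential in the state value, with a Lagrange multiplier fixed by the constraint. The sketch below is a generic illustration with invented abundance classes, not the savanna model discussed in the paper.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_distribution(states, target_mean):
    """Maximum-entropy distribution over discrete states subject to a fixed mean:
    p_i proportional to exp(-lam * x_i), with lam chosen to satisfy the constraint."""
    x = np.asarray(states, dtype=float)

    def mean_gap(lam):
        w = np.exp(-lam * (x - x.min()))        # shift exponent for numerical stability
        return np.sum(x * w) / np.sum(w) - target_mean

    lam = brentq(mean_gap, -5.0, 5.0)           # bracket chosen for this small example
    p = np.exp(-lam * (x - x.min()))
    return p / p.sum(), lam

# Invented example: abundance classes 1..10 with a constrained mean abundance of 3.
p, lam = maxent_distribution(np.arange(1, 11), target_mean=3.0)
print(np.round(p, 3), round(lam, 3))
```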

  12. Qualitative Discovery in Medical Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A.

    2000-01-01

    Implication rules have been used in uncertainty reasoning systems to confirm and draw hypotheses or conclusions. However a major bottleneck in developing such systems lies in the elicitation of these rules. This paper empirically examines the performance of evidential inferencing with implication networks generated using a rule induction tool called KAT. KAT utilizes an algorithm for the statistical analysis of empirical case data, and hence reduces the knowledge engineering efforts and biases in subjective implication certainty assignment. The paper describes several experiments in which real-world diagnostic problems were investigated; namely, medical diagnostics. In particular, it attempts to show that: (1) with a limited number of case samples, KAT is capable of inducing implication networks useful for making evidential inferences based on partial observations, and (2) observation driven by a network entropy optimization mechanism is effective in reducing the uncertainty of predicted events.

  13. Gearbox damage identification and quantification using stochastic resonance

    NASA Astrophysics Data System (ADS)

    Mba, Clement U.; Marchesiello, Stefano; Fasana, Alessandro; Garibaldi, Luigi

    2018-03-01

    Amongst the many new tools used for vibration-based mechanical fault diagnosis in rotating machinery, stochastic resonance (SR) has been shown to be able to identify as well as quantify gearbox damage via numerical simulations. To validate the numerical simulation results that were obtained in a previous work by the authors, SR is applied in the present study to data from an experimental gearbox that is representative of an industrial gearbox. Both spur and helical gears are used in the gearbox setup. While the results of the direct application of SR to experimental data do not exactly corroborate the numerical simulation results, applying SR to experimental data in pre-processed form is shown to be quite effective. In addition, it is demonstrated that traditional statistical techniques used for gearbox diagnosis can be used as a reference to check how well SR performs.

  14. Within-subject mediation analysis for experimental data in cognitive psychology and neuroscience.

    PubMed

    Vuorre, Matti; Bolger, Niall

    2017-12-15

    Statistical mediation allows researchers to investigate potential causal effects of experimental manipulations through intervening variables. It is a powerful tool for assessing the presence and strength of postulated causal mechanisms. Although mediation is used in certain areas of psychology, it is rarely applied in cognitive psychology and neuroscience. One reason for the scarcity of applications is that these areas of psychology commonly employ within-subjects designs, and mediation models for within-subjects data are considerably more complicated than for between-subjects data. Here, we draw attention to the importance and ubiquity of mediational hypotheses in within-subjects designs, and we present a general and flexible software package for conducting Bayesian within-subjects mediation analyses in the R programming environment. We use experimental data from cognitive psychology to illustrate the benefits of within-subject mediation for theory testing and comparison.

  15. Why Can't We Resolve Recruitment?

    NASA Astrophysics Data System (ADS)

    Ferreira, S. A.; Payne, M. R.; Hátún, H.; MacKenzie, B. R.; Butenschön, M.; Visser, A. W.

    2016-02-01

    During the last century, Johan Hjort's work has led to significant advances in explaining anomalous year-classes within fisheries science. However, distinguishing between the competing mechanisms of year-class regulation (e.g., food conditions, predation, transport) has proved challenging. We use blue whiting (Micromesistius poutassou) in the North-east Atlantic Ocean as a case study, which, during the late 1990s and early 2000s, generated year-classes up to nearly an order of magnitude higher than those seen before or after. There presently exist no models that can quantify past variations in recruitment for this stock. Using modern stock-statistical and observational tools, we catalog a range of environmentally-driven hypotheses relevant for recruitment of blue whiting, including physical and biogeographic conditions, phenology, parental effects and predation. We have run the analyses to test some of these hypotheses, and results will be presented at the session.

  16. Enhancing Important Fluctuations: Rare Events and Metadynamics from a Conceptual Viewpoint

    NASA Astrophysics Data System (ADS)

    Valsson, Omar; Tiwary, Pratyush; Parrinello, Michele

    2016-05-01

    Atomistic simulations play a central role in many fields of science. However, their usefulness is often limited by the fact that many systems are characterized by several metastable states separated by high barriers, leading to kinetic bottlenecks. Transitions between metastable states are thus rare events that occur on significantly longer timescales than one can simulate in practice. Numerous enhanced sampling methods have been introduced to alleviate this timescale problem, including methods based on identifying a few crucial order parameters or collective variables and enhancing the sampling of these variables. Metadynamics is one such method that has proven successful in a great variety of fields. Here we review the conceptual and theoretical foundations of metadynamics. As demonstrated, metadynamics is not just a practical tool but can also be considered an important development in the theory of statistical mechanics.
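
    The core mechanism of standard metadynamics, depositing repulsive Gaussians along a collective variable so that the accumulated bias gradually compensates the underlying free-energy wells, can be sketched in one dimension. The double-well surface and all parameters below are illustrative; well-tempered and variational variants discussed in the review are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def free_energy_force(s):
    """Force from an illustrative double-well free energy F(s) = s**4 - 2*s**2."""
    return -(4 * s**3 - 4 * s)

# Metadynamics bias: a sum of repulsive Gaussians deposited at visited s values.
centers, height, sigma = [], 0.05, 0.2

def bias_force(s):
    if not centers:
        return 0.0
    c = np.asarray(centers)
    return np.sum(height * (s - c) / sigma**2 * np.exp(-(s - c)**2 / (2 * sigma**2)))

# Overdamped Langevin dynamics with a Gaussian deposited every `stride` steps.
s, dt, kT, n_steps, stride = -1.0, 1e-3, 0.1, 20000, 200
for step in range(n_steps):
    noise = np.sqrt(2 * kT * dt) * rng.normal()
    s += (free_energy_force(s) + bias_force(s)) * dt + noise
    if step % stride == 0:
        centers.append(s)   # deposit a Gaussian at the current collective-variable value

# The accumulated bias fills the wells; -V_bias(s) approximates the free energy F(s).
print(f"deposited {len(centers)} Gaussians; visited s in [{min(centers):.2f}, {max(centers):.2f}]")
```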

  17. Recommendations and strategies for using reclaimed asphalt pavement in the Flemish Region based on a first life cycle assessment research

    NASA Astrophysics Data System (ADS)

    Van den bergh, Wim; Kara, Patricia; Anthonissen, Joke; Margaritis, Alexandros; Jacobs, Geert; Couscheir, Karolien

    2017-09-01

    In Flanders, using Reclaimed Asphalt Pavement (RAP) is allowed in asphalt mixes for base layers. Primary economic and secondary laboratory-measured mechanical properties are given as justification for higher RAP contents in specific mixes. However, one should evaluate the long-term performance of these mixes in terms of the environmental impact from production through end-of-life. In this paper, recommendations and strategies for using RAP, based on current research, are discussed in a broader perspective, such as using a carbon-footprint tool and warm-mix asphalt production in the Flemish Region. The paper aims to open a wide discussion by reporting several outcomes of laboratory research, statistics and practical application in order to set a general strategy for the road engineering sector in the Flemish Region.

  18. Swept Mechanism of Micro-Milling Tool Geometry Effect on Machined Oxygen Free High Conductivity Copper (OFHC) Surface Roughness

    PubMed Central

    Shi, Zhenyu; Liu, Zhanqiang; Li, Yuchao; Qiao, Yang

    2017-01-01

    Cutting tool geometry should be very much considered in micro-cutting because it has a significant effect on the topography and accuracy of the machined surface, particularly considering that the uncut chip thickness is comparable to the cutting edge radius. The objective of this paper was to clarify the mechanism by which cutting tool geometry influences surface topography in the micro-milling process. Four different cutting tools, including two two-fluted end milling tools with helix angles of 15° and 30° and two three-fluted end milling tools with helix angles of 15° and 30°, were investigated by combining theoretical modeling analysis with experimental research. The tool geometry was mathematically modeled through coordinate translation and transformation to bring all three cutting edges at the cutting tool tip into the same coordinate system. Swept mechanisms, minimum uncut chip thickness, and cutting tool run-out were considered in modeling the surface roughness parameters (the height of surface roughness Rz and average surface roughness Ra) based on the established mathematical model. A set of cutting experiments was carried out using the four different shaped cutting tools. It was found that the sweeping volume of the cutting tool increases with the decrease of both the cutting tool helix angle and the flute number. Coarser machined surface roughness and a more non-uniform surface topography are generated when the sweeping volume increases. The outcome of this research should bring about new methodologies for micro-end milling tool design and manufacturing. The machined surface roughness can be improved by appropriately selecting the tool geometrical parameters. PMID:28772479
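
    The roughness parameters discussed above can be computed from a sampled height profile using simplified definitions (Ra as the mean absolute deviation from the mean line, Rz here taken as the peak-to-valley height over the evaluation length). The profile below is synthetic and the snippet is not the authors' model.

```python
import numpy as np

def roughness_params(profile):
    """Simplified surface-roughness parameters from a sampled height profile:
    Ra = mean absolute deviation from the mean line,
    Rz = peak-to-valley height over the evaluation length."""
    z = np.asarray(profile, dtype=float)
    z = z - z.mean()                        # reference heights to the mean line
    ra = np.abs(z).mean()
    rz = z.max() - z.min()
    return ra, rz

# Synthetic micro-milled profile: periodic feed marks plus fine noise (micrometres).
x = np.linspace(0.0, 1.0, 500)
profile = 0.4 * np.sin(2 * np.pi * 10 * x) + 0.05 * np.random.default_rng(2).normal(size=x.size)
ra, rz = roughness_params(profile)
print(f"Ra = {ra:.3f} um, Rz = {rz:.3f} um")
```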

  19. Vortex dynamics and Lagrangian statistics in a model for active turbulence.

    PubMed

    James, Martin; Wilczek, Michael

    2018-02-14

    Cellular suspensions such as dense bacterial flows exhibit a turbulence-like phase under certain conditions. We study this phenomenon of "active turbulence" statistically by using numerical tools. Following Wensink et al. (Proc. Natl. Acad. Sci. U.S.A. 109, 14308 (2012)), we model active turbulence by means of a generalized Navier-Stokes equation. Two-point velocity statistics of active turbulence, both in the Eulerian and the Lagrangian frame, are explored. We characterize the scale-dependent features of two-point statistics in this system. Furthermore, we extend this statistical study with measurements of vortex dynamics in this system. Our observations suggest that the large-scale statistics of active turbulence are close to Gaussian with sub-Gaussian tails.

  20. Influence of intermetallic coatings of system Ti-Al on durability of slotting tool from high speed steel

    NASA Astrophysics Data System (ADS)

    Vardanyan, E. L.; Budilov, V. V.; Ramazanov, K. N.; Khusnimardanov, R. N.; Nagimov, R. Sh

    2017-05-01

    The operating conditions and wear mechanisms of slotting tools made from high-speed steel were investigated, and methods for increasing tool durability were analysed. The effect of intermetallic coatings deposited from vacuum-arc discharge plasma on the physical and mechanical properties of high-speed steel EP657MP was determined. A pilot batch of slotting tools was produced and production tests were carried out.
