Sample records for dynamic statistical graphics

  1. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    ERIC Educational Resources Information Center

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  2. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  3. AnthropMMD: An R package with a graphical user interface for the mean measure of divergence.

    PubMed

    Santos, Frédéric

    2018-01-01

    The mean measure of divergence is a dissimilarity measure between groups of individuals described by dichotomous variables. It is well suited to datasets with many missing values, and it is generally used to compute distance matrices and represent phenograms. Although often used in biological anthropology and archaeozoology, this method suffers from a lack of implementation in common statistical software. A package for the R statistical software, AnthropMMD, is presented here. Offering a dynamic graphical user interface, it is the first one dedicated to Smith's mean measure of divergence. The package also provides facilities for graphical representations and the crucial step of trait selection, so that the entire analysis can be performed through the graphical user interface. Its use is demonstrated using an artificial dataset, and the impact of trait selection is discussed. Finally, AnthropMMD is compared to three other free tools available for calculating the mean measure of divergence, and is proven to be consistent with them. © 2017 Wiley Periodicals, Inc.
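
    The record above describes Smith's mean measure of divergence (MMD). As a rough illustration of the statistic itself (not of AnthropMMD's R interface), a common formulation applies the Freeman-Tukey angular transformation to each trait frequency and averages the bias-corrected squared differences; the sketch below assumes that formulation, with made-up trait counts.

```python
import numpy as np

def theta(k, n):
    """Freeman-Tukey angular transformation of a trait frequency k/n."""
    return 0.5 * (np.arcsin(1 - 2 * k / (n + 1))
                  + np.arcsin(1 - 2 * (k + 1) / (n + 1)))

def mmd(k1, n1, k2, n2):
    """Smith's mean measure of divergence between two groups.
    k*: per-trait counts of trait presence, n*: per-trait sample sizes."""
    k1, n1, k2, n2 = map(np.asarray, (k1, n1, k2, n2))
    # Squared difference of transformed frequencies, minus a bias
    # correction that accounts for the two sample sizes.
    d = (theta(k1, n1) - theta(k2, n2)) ** 2 \
        - (1 / (n1 + 0.5) + 1 / (n2 + 0.5))
    return d.mean()

# Hypothetical data: two traits scored in two groups of 20 individuals.
divergence = mmd([5, 10], [20, 20], [15, 2], [20, 20])
```

The statistic is symmetric in the two groups, and comparing a group with itself yields a small negative value because of the bias correction.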

  4. Principal Component Analysis in the Spectral Analysis of the Dynamic Laser Speckle Patterns

    NASA Astrophysics Data System (ADS)

    Ribeiro, K. M.; Braga, R. A., Jr.; Horgan, G. W.; Ferreira, D. D.; Safadi, T.

    2014-02-01

    Dynamic laser speckle is an optical phenomenon in which interference patterns, formed by illuminating a changing surface with coherent light, evolve in time; when the changes are caused by biological material, the dynamic pattern is known as biospeckle. These evolving interference patterns are usually analyzed by graphical or numerical methods; analysis in the frequency domain is also an option, but it involves large computational requirements, which demands new approaches for filtering the images in time. Principal component analysis (PCA) statistically decorrelates data and can therefore be used as a filter. In this context, the present work evaluated the PCA technique for filtering biospeckle image data in time, aiming to reduce computation time and improve the robustness of the filtering. Sixty-four biospeckle images of a maize seed, observed over time, were used. The images were arranged in a data matrix and statistically decorrelated by the PCA technique, and the reconstructed signals were analyzed using the routine graphical and numerical biospeckle methods. Results showed the potential of the PCA tool for filtering dynamic laser speckle data, with the definition of principal-component markers related to the biological phenomena and with the advantage of fast computational processing.
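
    The filtering idea in the abstract above (arrange the image stack as a data matrix, decorrelate it with PCA, and reconstruct from a few leading components) can be sketched in a few lines; the data here are random stand-ins for real biospeckle frames, and the component count is an arbitrary choice.

```python
import numpy as np

# Hypothetical stack: 64 "speckle images" of 32x32 pixels, one flattened
# image per row, mimicking the 64-frame biospeckle series in the abstract.
rng = np.random.default_rng(0)
stack = rng.normal(size=(64, 32 * 32))

# Center and decorrelate via SVD (equivalent to PCA on the time dimension).
mean = stack.mean(axis=0)
U, s, Vt = np.linalg.svd(stack - mean, full_matrices=False)

# Keep the leading k components and reconstruct a filtered stack.
k = 5
filtered = mean + U[:, :k] * s[:k] @ Vt[:k]
```

Reconstructing with all components recovers the original stack exactly; truncating to k components discards the low-variance directions, which is the filtering effect exploited in the paper.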

  5. Recent advances in parametric neuroreceptor mapping with dynamic PET: basic concepts and graphical analyses.

    PubMed

    Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung

    2014-10-01

    Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any particular compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and the major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma-input and reference-tissue-input models. Their statistical properties are discussed in view of parametric imaging.
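
    One of the GA methods such reviews cover for reversible radioligands is the Logan plot, whose late-time slope estimates the total distribution volume V_T. A minimal sketch, assuming a synthetic one-tissue compartment model and a made-up plasma input curve (not data or code from the review):

```python
import numpy as np

# One-tissue compartment simulation with hypothetical rate constants.
K1, k2 = 0.1, 0.05            # true distribution volume V_T = K1/k2 = 2.0
t = np.linspace(0, 90, 2000)  # minutes
dt = t[1] - t[0]
Cp = t * np.exp(-t / 5)       # synthetic plasma input curve
Ct = np.zeros_like(t)
for i in range(1, len(t)):    # forward-Euler integration of dCt/dt
    Ct[i] = Ct[i - 1] + dt * (K1 * Cp[i - 1] - k2 * Ct[i - 1])

# Logan plot: cumulative integrals normalized by tissue activity;
# for reversible tracers the late-time slope estimates V_T.
intCp = np.cumsum(Cp) * dt
intCt = np.cumsum(Ct) * dt
late = t > 40
x = intCp[late] / Ct[late]
y = intCt[late] / Ct[late]
slope, intercept = np.polyfit(x, y, 1)
```

For the one-tissue model this relation is exact (integrate the ODE to get intCt/Ct = (K1/k2) intCp/Ct - 1/k2), so the fitted slope recovers V_T up to discretization error.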

  6. DROIDS 1.20: A GUI-Based Pipeline for GPU-Accelerated Comparative Protein Dynamics.

    PubMed

    Babbitt, Gregory A; Mortensen, Jamie S; Coppola, Erin E; Adams, Lily E; Liao, Justin K

    2018-03-13

    Traditional informatics in comparative genomics works only with static representations of biomolecules (i.e., sequence and structure), thereby ignoring the molecular dynamics (MD) of proteins that define function in the cell. A comparative approach applied to MD would connect this very short timescale process, defined in femtoseconds, to one of the longest in the universe: molecular evolution measured in millions of years. Here, we leverage advances in graphics-processing-unit-accelerated MD simulation software to develop a comparative method of MD analysis and visualization that can be applied to any two homologous Protein Data Bank structures. Our open-source pipeline, DROIDS (Detecting Relative Outlier Impacts in Dynamic Simulations), works in conjunction with existing molecular modeling software to convert any Linux gaming personal computer into a "comparative computational microscope" for observing the biophysical effects of mutations and other chemical changes in proteins. DROIDS implements structural alignment and Benjamini-Hochberg-corrected Kolmogorov-Smirnov statistics to compare nanosecond-scale atom bond fluctuations on the protein backbone, color mapping the significant differences identified in protein MD with single-amino-acid resolution. DROIDS is simple to use, incorporating graphical user interface control for Amber16 MD simulations, cpptraj analysis, and the final statistical and visual representations in R graphics and UCSF Chimera. We demonstrate that DROIDS can be utilized to visually investigate molecular evolution and disease-related functional changes in MD due to genetic mutation and epigenetic modification. DROIDS can also be used to investigate binding interactions of pharmaceuticals, toxins, or other biomolecules in a functional evolutionary context. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
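
    The statistical core named in the abstract (per-residue two-sample Kolmogorov-Smirnov tests with Benjamini-Hochberg correction) can be sketched independently of the DROIDS pipeline itself; the fluctuation data below are simulated stand-ins, not MD output.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
n_residues, n_frames = 20, 200

# Hypothetical per-residue atom-fluctuation samples from two simulations;
# residues 0-4 are given genuinely larger fluctuations in run B.
fluct_a = rng.normal(1.0, 0.2, size=(n_residues, n_frames))
fluct_b = rng.normal(1.0, 0.2, size=(n_residues, n_frames))
fluct_b[:5] += 0.5

# Per-residue two-sample Kolmogorov-Smirnov tests.
pvals = np.array([ks_2samp(a, b).pvalue for a, b in zip(fluct_a, fluct_b)])

# Benjamini-Hochberg step-up procedure at false discovery rate q = 0.05:
# find the largest rank whose sorted p-value falls under q * rank / m.
q = 0.05
order = np.argsort(pvals)
ranked = pvals[order]
below = ranked <= q * (np.arange(1, n_residues + 1) / n_residues)
significant = np.zeros(n_residues, dtype=bool)
if below.any():
    cutoff = np.max(np.where(below)[0])
    significant[order[:cutoff + 1]] = True
```

The five shifted residues come out with vanishingly small p-values and survive the correction, which is the per-residue significance map DROIDS then colors onto the structure.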

  7. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, yet there is substantially less research on the interplay between graph, eye, brain, and mind than is needed to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  8. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
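
    The message passing in graphical models mentioned in the review can be illustrated on the smallest useful case: sum-product messages on a three-variable chain, checked against brute-force marginalization (the potentials here are arbitrary random values).

```python
import numpy as np

# A three-variable chain MRF: p(x1, x2, x3) ∝ phi12(x1, x2) * phi23(x2, x3),
# each variable binary. Sum-product message passing computes exact
# marginals on tree-structured graphical models such as this chain.
rng = np.random.default_rng(2)
phi12 = rng.uniform(0.5, 2.0, size=(2, 2))
phi23 = rng.uniform(0.5, 2.0, size=(2, 2))

# Messages toward x2 from both ends, then the marginal of x2.
m1_to_2 = phi12.sum(axis=0)          # sum over x1
m3_to_2 = phi23.sum(axis=1)          # sum over x3
marg2 = m1_to_2 * m3_to_2
marg2 /= marg2.sum()

# Brute-force check over all 8 joint states.
joint = phi12[:, :, None] * phi23[None, :, :]   # indexed [x1, x2, x3]
joint /= joint.sum()
marg2_bf = joint.sum(axis=(0, 2))
```

On chains and trees the two computations agree exactly; the point of message passing is that its cost grows linearly with the number of variables, while brute force grows exponentially.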

  9. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  10. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    NASA Astrophysics Data System (ADS)

    Agus, M.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2015-02-01

    Numerous studies have examined students' difficulties in understanding notions related to statistical problems. Some authors have observed that presenting distinct visual representations can improve statistical reasoning, supporting the principle of graphical facilitation. Other researchers disagree with this viewpoint, emphasising that illustrations can overload the cognitive system with irrelevant information. In this work we aim at comparing probabilistic statistical reasoning across two formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in the verbal-numerical and graphical formats to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Each undergraduate solved every pair of problems in both formats, with the presentation orders and sequences varied. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. The effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of an interaction between individual and task characteristics.

  11. UAV Swarm Mission Planning Development Using Evolutionary Algorithms - Part I

    DTIC Science & Technology

    2008-05-01

    desired behaviors in autonomous vehicles is a difficult problem at best and in general probably impossible to completely resolve in complex dynamic...associated behaviors. Various techniques inspired by biological self-organized systems as found in foraging insects and flocking birds, revolve around...swarms of heterogeneous vehicles in a distributed simulation system with animated graphics. Statistical measurements and observations indicate that bio

  12. Statistical Inference in Graphical Models

    DTIC Science & Technology

    2008-06-17

    fuse probability theory and graph theory in such a way as to permit efficient representation and computation with probability distributions. They...message passing. In approaching real-world problems, we often need to deal with uncertainty. Probability and statistics provide a...dynamic programming methods. However, for many sensors of interest, the signal-to-noise ratio does not allow such a treatment. Another source of

  13. Student's Conceptions in Statistical Graph's Interpretation

    ERIC Educational Resources Information Center

    Kukliansky, Ida

    2016-01-01

    Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…

  14. GRAPHICAL USER INTERFACE WITH APPLICATIONS IN SUSCEPTIBLE-INFECTIOUS-SUSCEPTIBLE MODELS.

    PubMed

    Ilea, M; Turnea, M; Arotăriţei, D; Rotariu, Mariana; Popescu, Marilena

    2015-01-01

    The practical significance of understanding the dynamics and evolution of infectious diseases increases continuously in the contemporary world. The mathematical study of the dynamics of infectious diseases has a long history. By incorporating statistical methods and computer-based simulations in dynamic epidemiological models, modeling methods and theoretical analyses can be made more realistic and reliable, allowing a more detailed understanding of the rules governing epidemic spreading. To provide the basis for a disease transmission model, the population of a region is often divided into various compartments, and the model governing their relation is called a compartmental model. To present all of the information available, a graphical user interface provides icons and visual indicators. The graphical interface shown in this paper is built using MATLAB software ver. 7.6.0, which offers a wide range of techniques by which data can be displayed graphically. The process of data viewing involves a series of operations; achieving it required three separate files, one defining the mathematical model and two for the interface itself. Considering a fixed population, the number of susceptible individuals diminishes as the number of infectious individuals increases, so that in about ten days the numbers of infected and susceptible individuals are equal. If the epidemic is not controlled, it will continue for an indefinite period of time. By changing the global parameters specific to the SIS model, a more rapid increase of infectious individuals is noted. The graphical user interface shown in this paper makes interaction with the computer much easier, simplifying the structure of complex instructions through icons and menus and, in particular, making programs and files much easier to organize. Some numerical simulations are presented to illustrate the theoretical analysis.
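
    The SIS dynamics underlying such an interface can be sketched without MATLAB or a GUI; the sketch below uses hypothetical rate parameters and plain forward-Euler integration rather than the paper's code.

```python
# Minimal SIS (susceptible-infectious-susceptible) model:
#   dS/dt = -beta*S*I/N + gamma*I,   dI/dt = beta*S*I/N - gamma*I.
beta, gamma = 0.5, 0.1   # hypothetical transmission/recovery rates per day
N = 1000.0
S, I = 990.0, 10.0
dt, days = 0.1, 100
for _ in range(int(days / dt)):   # forward-Euler integration
    new_inf = beta * S * I / N * dt
    new_rec = gamma * I * dt
    S += -new_inf + new_rec
    I += new_inf - new_rec

# For beta > gamma the epidemic persists: I approaches the endemic
# equilibrium I* = N * (1 - gamma/beta), here 800 of 1000 individuals.
```

This matches the abstract's observation that an uncontrolled SIS epidemic continues indefinitely, settling at an endemic level rather than dying out.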

  15. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
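
    The core idea named above (break a complex model into conditionally independent distributions) can be shown with a toy three-node chain in the spirit of a phylogenetic model; the variables and probability tables are invented for illustration.

```python
# Toy directed graphical model: rate -> branch_length -> observed_diff.
# The joint distribution factorizes into the node-wise conditionals,
# which is exactly the decomposition graphical modeling formalizes.
p_rate = {"slow": 0.7, "fast": 0.3}
p_len_given_rate = {"slow": {"short": 0.8, "long": 0.2},
                    "fast": {"short": 0.3, "long": 0.7}}
p_diff_given_len = {"short": {"few": 0.9, "many": 0.1},
                    "long": {"few": 0.4, "many": 0.6}}

def joint(rate, length, diff):
    """p(rate, length, diff) = p(rate) p(length|rate) p(diff|length)."""
    return (p_rate[rate]
            * p_len_given_rate[rate][length]
            * p_diff_given_len[length][diff])

# The factorized joint sums to 1 over all eight configurations.
total = sum(joint(r, l, d)
            for r in p_rate
            for l in ("short", "long")
            for d in ("few", "many"))
```

The same factorization is what makes Gibbs and Metropolis-Hastings samplers practical: each conditional can be evaluated locally without touching the full joint.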

  16. Modeling and simulation of dust behaviors behind a moving vehicle

    NASA Astrophysics Data System (ADS)

    Wang, Jingfang

    Simulation of physically realistic, complex dust behavior is a difficult and attractive problem in computer graphics. A fast, interactive and visually convincing model of dust behavior behind moving vehicles is very useful in computer simulation, training, education, art, advertising, and entertainment. In my dissertation, an experimental interactive system has been implemented for the simulation of dust behavior behind moving vehicles. The system includes physically-based models, particle systems, rendering engines and a graphical user interface (GUI). I have employed several vehicle models, including tanks, cars, and jeeps, to simulate different scenarios and conditions: calm weather, windy conditions, the vehicle turning left or right, and vehicle motion controlled by users from the GUI. I have also tested, through the GUI or off-line scripts, the factors that govern the physical behavior and graphical appearance of the dust particles. The simulations are done on a Silicon Graphics Octane workstation. The animation of dust behavior is achieved by physically-based modeling and simulation. The flow around a moving vehicle is modeled using computational fluid dynamics (CFD) techniques. I implement a primitive-variable, pressure-correction approach to solve the three-dimensional incompressible Navier-Stokes equations in a volume covering the moving vehicle. An alternating-direction implicit (ADI) method is used for the solution of the momentum equations, with a successive-over-relaxation (SOR) method for the solution of the Poisson pressure equation. Boundary conditions are defined and simplified according to their dynamic properties. The dust particle dynamics is modeled using particle systems, statistics, and procedural modeling techniques. Graphics and real-time simulation techniques, such as dynamics synchronization, motion blur, blending, and clipping, have been employed in the rendering to achieve realistic-looking dust behavior. In addition, I introduce a temporal smoothing technique to eliminate the jagged effect caused by large simulation time steps. Several algorithms are used to speed up the simulation; for example, pre-calculated tables and display lists replace some of the most commonly used functions, scripts and processes. The performance study shows that both the time and space costs of the algorithms are linear in the number of particles in the system. On a Silicon Graphics Octane, three vehicles with 20,000 particles run at 6-8 frames per second on average. This speed does not include the convergence calculation of the numerical integration for the fluid dynamics, which usually takes about 4-5 minutes to reach steady state.
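
    The SOR step used for the Poisson pressure equation can be sketched on a small 2D analogue of the dissertation's 3D solve; the grid size, source term, and relaxation factor here are illustrative choices, not the dissertation's settings.

```python
import numpy as np

# Successive over-relaxation for the 2D Poisson equation  lap(p) = f
# on the unit square with p = 0 on the boundary.
n = 32
h = 1.0 / (n + 1)
f = np.ones((n + 2, n + 2))   # constant source term (illustrative)
p = np.zeros((n + 2, n + 2))  # boundary rows/columns stay at zero
omega = 1.8                   # over-relaxation factor, 0 < omega < 2

for _ in range(500):
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            # Gauss-Seidel value from the 5-point stencil...
            gs = 0.25 * (p[i-1, j] + p[i+1, j] + p[i, j-1] + p[i, j+1]
                         - h * h * f[i, j])
            # ...pushed further in the update direction by omega.
            p[i, j] += omega * (gs - p[i, j])
```

With omega near its optimum, SOR converges in far fewer sweeps than plain Gauss-Seidel, which is why it is a common choice for the pressure-correction step in incompressible flow solvers.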

  17. Dissociation of polycyclic aromatic hydrocarbons: molecular dynamics studies

    NASA Astrophysics Data System (ADS)

    Simon, A.; Rapacioli, M.; Rouaut, G.; Trinquier, G.; Gadéa, F. X.

    2017-03-01

    We present dynamical studies of the dissociation of polycyclic aromatic hydrocarbon (PAH) radical cations in their ground electronic states with significant internal energy. Molecular dynamics simulations are performed, with the electronic structure described on-the-fly at the self-consistent-charge density-functional-based tight-binding (SCC-DFTB) level of theory. The SCC-DFTB approach is first benchmarked against DFT results. Extensive simulations are carried out for naphthalene, pyrene and coronene at several energies. Such studies enable one to derive significant trends in branching ratios, kinetics, structures, and hints on the formation mechanism of the ejected neutral fragments. In particular, the dependence of branching ratios on PAH size and energy was retrieved. The losses of H and C2H2 (identified as the ethyne molecule) were the major dissociation channels. The H/C2H2 ratio was found to increase with PAH size and to decrease with energy. For coronene, which is the most interesting PAH from the astrophysical point of view, the loss of H was found to be almost the only channel at an internal energy of 30 eV. Overall, in line with experimental trends, decreasing the internal energy or increasing the PAH size favours the hydrogen-loss channels over carbonaceous fragments. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'.

  18. New dimensions from statistical graphics for GIS (geographic information system) analysis and interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, R.A.; Olson, R.J.

    1988-01-01

    Environmental research and assessment activities at Oak Ridge National Laboratory (ORNL) include the analysis of spatial and temporal patterns of ecosystem response at a landscape scale. Analysis through use of a geographic information system (GIS) involves an interaction between the user and thematic data sets frequently expressed as maps. A portion of GIS analysis has a mathematical or statistical aspect, especially for the analysis of temporal patterns. ARC/INFO is an excellent tool for manipulating GIS data and producing the appropriate map graphics; INFO also has some limited ability to produce statistical tabulation. At ORNL we have extended our capabilities by graphically interfacing ARC/INFO and SAS/GRAPH to provide a combined mapping and statistical graphics environment. With the data management, statistical, and graphics capabilities of SAS added to ARC/INFO, we have expanded the analytical and graphical dimensions of the GIS environment. Pie or bar charts, frequency curves, hydrographs, or scatter plots as produced by SAS can be added to maps from attribute data associated with ARC/INFO coverages. Numerous small, simplified graphs can also become a source of complex map ''symbols.'' These additions extend the dimensions of GIS graphics to include time, details of the thematic composition, distribution, and interrelationships. 7 refs., 3 figs.

  19. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school, and using examples that students find engaging allows them to understand the concepts at a deeper level.

  20. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  1. The effect of using graphic organizers in the teaching of standard biology

    NASA Astrophysics Data System (ADS)

    Pepper, Wade Louis, Jr.

    This study was conducted to determine if the use of graphic organizers in the teaching of standard biology would increase student achievement, involvement and quality of activities. The subjects were 10th grade standard biology students in a large southern inner-city high school. The study was conducted over a six-week period in an instructional setting using action research as the investigative format. After calculation of the homogeneity between classes, random selection was used to determine the graphic organizer class and the control class. The graphic organizer class was taught unit material through a variety of instructional methods along with the use of teacher-generated graphic organizers. The control class was taught the same unit material using the same instructional methods, but without the use of graphic organizers. Data for the study were gathered from in-class written assignments, teacher-generated tests and text-generated tests, and rubric scores of an out-of-class written assignment and project. Also, data were gathered from student reactions, comments, observations and a teacher's research journal. Results were analyzed using descriptive statistics and qualitative interpretation. By comparing statistical results, it was determined that the use of graphic organizers did not make a statistically significant difference in the understanding of biological concepts and retention of factual information. Furthermore, the use of graphic organizers did not make a significant difference in motivating students to fulfill all class assignments with quality efforts and products. However, based upon student reactions and comments along with observations by the researcher, graphic organizers were viewed by the students as a favorable and helpful instructional tool. Despite the statistical results, student gains from instructional activities using graphic organizers were positive and merit the continuation of their use as an instructional tool.

  2. Translating glucose variability metrics into the clinic via Continuous Glucose Monitoring: a Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©).

    PubMed

    Rawlings, Renata A; Shi, Hang; Yuan, Lo-Hua; Brehm, William; Pop-Busui, Rodica; Nelson, Patrick W

    2011-12-01

    Several metrics of glucose variability have been proposed to date, but an integrated approach that provides a complete and consistent assessment of glycemic variation is missing. As a consequence, and because of the tedious coding necessary during quantification, most investigators and clinicians have not yet adopted the use of multiple glucose variability metrics to evaluate glycemic variation. We compiled the most extensively used statistical techniques and glucose variability metrics, with adjustable hyper- and hypoglycemic limits and metric parameters, to create a user-friendly Continuous Glucose Monitoring Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©). In addition, we introduce and demonstrate a novel transition density profile that emphasizes the dynamics of transitions between defined glucose states. Our combined dashboard of numerical statistics and graphical plots supports the task of providing an integrated approach to describing glycemic variability. We integrated existing metrics, such as SD, area under the curve, and mean amplitude of glycemic excursion, with novel metrics such as the slopes across critical transitions and the transition density profile to assess the severity and frequency of glucose transitions per day as they move between critical glycemic zones. By presenting the above-mentioned metrics and graphics in a concise aggregate format, CGM-GUIDE provides an easy-to-use tool to compare quantitative measures of glucose variability. This tool can be used by researchers and clinicians to develop new algorithms of insulin delivery for patients with diabetes and to better explore the link between glucose variability and chronic diabetes complications.
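
    A few of the metrics mentioned (SD, area under the curve, and transitions between glycemic zones) can be sketched on simulated readings; the glucose trace, zone limits, and metric definitions below are illustrative, not CGM-GUIDE's exact implementations.

```python
import numpy as np

# Hypothetical 24 h of 5-minute CGM readings (mg/dL): a slow daily
# oscillation plus sensor noise, standing in for a real trace.
rng = np.random.default_rng(3)
t = np.arange(0, 24 * 60, 5)                       # minutes
glucose = (120 + 40 * np.sin(2 * np.pi * t / (6 * 60))
           + rng.normal(0, 10, t.size))

sd = glucose.std(ddof=1)                           # overall variability
# Trapezoidal area under the curve (mg/dL * min).
auc = np.sum((glucose[1:] + glucose[:-1]) / 2 * np.diff(t))

# Count transitions between hypo (<70), target, and hyper (>180) zones.
zone = np.digitize(glucose, [70, 180])             # 0=hypo, 1=target, 2=hyper
transitions = int(np.count_nonzero(np.diff(zone)))
```

A transition density profile in the paper's sense would further resolve these counts by time of day and by the pair of zones involved; the count above is only the simplest summary of zone crossings.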

  3. Translating Glucose Variability Metrics into the Clinic via Continuous Glucose Monitoring: A Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©)

    PubMed Central

    Rawlings, Renata A.; Shi, Hang; Yuan, Lo-Hua; Brehm, William; Pop-Busui, Rodica

    2011-01-01

    Abstract Background Several metrics of glucose variability have been proposed to date, but an integrated approach that provides a complete and consistent assessment of glycemic variation is missing. As a consequence, and because of the tedious coding necessary during quantification, most investigators and clinicians have not yet adopted the use of multiple glucose variability metrics to evaluate glycemic variation. Methods We compiled the most extensively used statistical techniques and glucose variability metrics, with adjustable hyper- and hypoglycemic limits and metric parameters, to create a user-friendly Continuous Glucose Monitoring Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©). In addition, we introduce and demonstrate a novel transition density profile that emphasizes the dynamics of transitions between defined glucose states. Results Our combined dashboard of numerical statistics and graphical plots supports the task of providing an integrated approach to describing glycemic variability. We integrated existing metrics, such as SD, area under the curve, and mean amplitude of glycemic excursion, with novel metrics such as the slopes across critical transitions and the transition density profile to assess the severity and frequency of glucose transitions per day as they move between critical glycemic zones. Conclusions By presenting the above-mentioned metrics and graphics in a concise aggregate format, CGM-GUIDE provides an easy-to-use tool to compare quantitative measures of glucose variability. This tool can be used by researchers and clinicians to develop new algorithms of insulin delivery for patients with diabetes and to better explore the link between glucose variability and chronic diabetes complications. PMID:21932986

  4. Visualizing water

    NASA Astrophysics Data System (ADS)

    Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.

    2016-12-01

A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, one focuses on making water look as realistic as possible. The focus on realistic perception (versus the focus on the physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the designs of emerging types of water visualizations. The examples that we analyze range from dynamically animated forecasts, interactive paintings, and infographics to modern cartography and web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from interdisciplinary collaborations.

  5. Understanding of Relation Structures of Graphical Models by Lower Secondary Students

    ERIC Educational Resources Information Center

    van Buuren, Onne; Heck, André; Ellermeijer, Ton

    2016-01-01

    A learning path has been developed on system dynamical graphical modelling, integrated into the Dutch lower secondary physics curriculum. As part of the developmental research for this learning path, students' understanding of the relation structures shown in the diagrams of graphical system dynamics based models has been investigated. One of our…

  6. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoints of statistics and texture, and a 31-dimensional feature vector is extracted for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme achieves an identification accuracy of 97.89% for computer-generated graphics and 97.75% for natural images. The analyses also demonstrate that the proposed method performs well compared with some existing methods based only on statistical features or other features. The method has great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
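
    The classification step can be sketched as follows; since the 31 features and the LIBSVM configuration are not given in the abstract, this uses a plain perceptron on two invented features as a hypothetical stand-in.

```python
# Hypothetical stand-in for the paper's LIBSVM step: a perceptron separating
# toy "computer-generated" (-1) vs "natural" (+1) feature vectors. The real
# method uses 31 statistical/textural dimensions; two invented ones here.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):   # y is +1 or -1
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

X = [[0.2, 0.1], [0.3, 0.2], [0.8, 0.9], [0.9, 0.7]]   # toy features
y = [-1, -1, 1, 1]                                     # toy labels
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])   # separable data: labels recovered
```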

  7. Making Conjectures in Dynamic Geometry: The Potential of a Particular Way of Dragging

    ERIC Educational Resources Information Center

    Mariotti, Maria Alessandra; Baccaglini-Frank, Anna

    2011-01-01

    When analyzing what has changed in the geometry scenario with the advent of dynamic geometry systems (DGS), one can notice a transition from the traditional graphic environment made of paper-and-pencil, and the classical construction tools like the ruler and compass, to a virtual graphic space, made of a computer screen, graphical tools that are…

  8. Statistical Analysis of CFD Solutions from the 6th AIAA CFD Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Derlaga, Joseph M.; Morrison, Joseph H.

    2017-01-01

A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using both common and custom grid sequences as well as multiple turbulence models for the June 2016 6th AIAA CFD Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for both the 4th and 5th Drag Prediction Workshops. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  9. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong including the overall approach and modeling capabilities, which encompasses force generation from contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.

  10. Interpreting Association from Graphical Displays

    ERIC Educational Resources Information Center

    Fitzallen, Noleine

    2016-01-01

    Research that has explored students' interpretations of graphical representations has not extended to include how students apply understanding of particular statistical concepts related to one graphical representation to interpret different representations. This paper reports on the way in which students' understanding of covariation, evidenced…

  11. GPU-computing in econophysics and statistical physics

    NASA Astrophysics Data System (ADS)

    Preis, T.

    2011-03-01

A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in a financial market context are coded on a graphics card architecture, which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
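
    A minimal CPU version of the Metropolis algorithm for the 2D Ising model, the statistical-physics benchmark the article ports to a graphics card, might look like the sketch below (illustrative only, not the article's code).

```python
# Minimal CPU Metropolis sketch of the 2D Ising model (illustrative only).
import math
import random

def metropolis_sweep(spins, beta, rng):
    """One sweep of single-spin-flip Metropolis updates, periodic boundaries."""
    n = len(spins)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2 * spins[i][j] * nb          # energy cost of flipping (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1

rng = random.Random(0)
n = 16
spins = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
for _ in range(50):
    metropolis_sweep(spins, beta=1.0, rng=rng)   # beta well above critical ~0.44
m = abs(sum(sum(row) for row in spins)) / n ** 2  # magnetization per spin
print(m)
```

    On a GPU, the checkerboard decomposition is typically used so that non-interacting sublattices can be updated in parallel; the sequential sweep above is the single-core baseline such ports are measured against.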

  12. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    PubMed

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
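
    One widely used disproportionality statistic for spontaneous-report databases is the proportional reporting ratio (PRR); a minimal sketch with invented counts follows.

```python
# Proportional reporting ratio (PRR), a common disproportionality statistic
# in postmarketing signal detection. Counts below are invented.

def prr(a, b, c, d):
    """a: reports with drug & event, b: drug & other events,
    c: other drugs & event, d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

# The event is reported about 18x more often with the drug than without.
print(prr(20, 80, 10, 890))
```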

  13. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  14. 2D-dynamic representation of DNA sequences as a graphical tool in bioinformatics

    NASA Astrophysics Data System (ADS)

Bielińska-Wąż, D.; Wąż, P.

    2016-10-01

2D-dynamic representation of DNA sequences is briefly reviewed. Some new examples of 2D-dynamic graphs, the graphical tool of the method, are shown. Using the complete genome sequences of the Zika virus as examples, it is shown that the present method can be applied to the study of the evolution of viral genomes.
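
    A generic 2-D walk representation of a DNA sequence can be sketched as below; the base-to-vector assignment is hypothetical and is not necessarily the authors' 2D-dynamic construction.

```python
# Generic 2-D walk over a DNA sequence: each base moves the walker by a
# fixed unit vector, producing a plottable curve. Hypothetical assignment,
# shown in the spirit of 2D-dynamic graphs rather than as the exact method.

STEP = {"A": (-1, 0), "T": (1, 0), "G": (0, 1), "C": (0, -1)}

def dna_walk(seq):
    x = y = 0
    path = [(0, 0)]
    for base in seq:
        dx, dy = STEP[base]
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

print(dna_walk("ATGC"))   # [(0, 0), (-1, 0), (0, 0), (0, 1), (0, 0)]
```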

  15. Dynamic Assessment of Graphic Symbol Combinations by Children with Autism.

    ERIC Educational Resources Information Center

    Nigam, Ravi

    2001-01-01

    This article offers teaching strategies in the dynamic assessment of the potential of students with autism to acquire and use multiple graphic symbol combinations for communicative purposes. Examples are given of the matrix strategy and milieu language teaching strategies. It also describes the Individualized Communication-Care Protocol, which…

  16. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous in chemical composition. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified in the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
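
    The principal components step can be sketched as follows, on invented ion concentrations standing in for the hydrochemical data.

```python
# Minimal PCA on standardized data, the kind of step combined here with
# hierarchical cluster analysis. The "ion" values are invented.
import numpy as np

X = np.array([[1.0, 2.1, 0.9],
              [1.2, 2.0, 1.1],
              [3.1, 5.9, 2.9],
              [2.9, 6.1, 3.1]])           # rows: samples, cols: ions
Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each variable
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
scores = Z @ eigvecs[:, order]            # principal component scores
print(explained[0])   # these correlated ions load mostly on one component
```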

  17. Graphical Man/Machine Communications

    DTIC Science & Technology

Progress is reported concerning the use of computer-controlled graphical displays in the areas of radiation diffusion and hydrodynamics, general...ventricular dynamics. Progress is continuing on the use of computer graphics in architecture. Some progress in halftone graphics is reported with no basic...developments presented. Colored halftone perspective pictures are being used to represent multivariable situations. Nonlinear waveform processing is

  18. Graphic Novels in Your School Library

    ERIC Educational Resources Information Center

    Karp, Jesse

    2011-01-01

    Many educators now agree that graphic novels inform as well as entertain, and to dismiss the educational potential of the graphic novel is to throw away a golden opportunity to reach out to young readers. This dynamic book takes a look at the term "graphic novel," how the format has become entwined in our culture, and the ways in which graphic…

  19. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  20. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to check whether any of the underlying assumptions are violated.
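
    The quantities behind one such plot, residuals versus fitted values for a least-squares line, can be computed as follows (invented data).

```python
# The numbers behind a residuals-versus-fitted validation plot for a
# simple linear model fitted by least squares. Data are invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = fit_line(xs, ys)
fitted = [slope * x + intercept for x in xs]
residuals = [y - f for y, f in zip(ys, fitted)]
print(round(slope, 2), round(sum(residuals), 10))  # residuals sum to ~0
```

    A structureless scatter of these residuals against the fitted values is the "healthy" pattern such validation plots teach students to recognize.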

  1. Moving beyond the Bar Plot and the Line Graph to Create Informative and Attractive Graphics

    ERIC Educational Resources Information Center

    Larson-Hall, Jenifer

    2017-01-01

    Graphics are often mistaken for a mere frill in the methodological arsenal of data analysis when in fact they can be one of the simplest and at the same time most powerful methods of communicating statistical information (Tufte, 2001). The first section of the article argues for the statistical necessity of graphs, echoing and amplifying similar…

  2. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  3. Learning strategy preferences, verbal-visual cognitive styles, and multimedia preferences for continuing engineering education instructional design

    NASA Astrophysics Data System (ADS)

    Baukal, Charles Edward, Jr.

A literature search revealed very little information on how to teach working engineers, which became the motivation for this research. Effective training is important for many reasons, such as preventing accidents, maximizing fuel efficiency, minimizing pollution emissions, and reducing equipment downtime. The conceptual framework for this study included the development of a new instructional design framework called the Multimedia Cone of Abstraction (MCoA). This was developed by combining Dale's Cone of Experience and Mayer's Cognitive Theory of Multimedia Learning. An anonymous survey of 118 engineers from a single Midwestern manufacturer was conducted to determine their demographics, learning strategy preferences, verbal-visual cognitive styles, and multimedia preferences. The learning strategy preference profile and verbal-visual cognitive styles of the sample were statistically significantly different from those of the general population. The working engineers included more Problem Solvers and were much more visually oriented than the general population. To study multimedia preferences, five of the seven levels in the MCoA were used. Eight types of multimedia were compared in four categories (types in parentheses): text (text and narration), static graphics (drawing and photograph), non-interactive dynamic graphics (animation and video), and interactive dynamic graphics (simulated virtual reality and real virtual reality). The first phase of the study examined multimedia preferences within a category. Participants compared multimedia types in pairs on dual screens using relative preference, rating, and ranking. Surprisingly, the more abstract multimedia (text, drawing, animation, and simulated virtual reality) were preferred in every category to the more concrete multimedia (narration, photograph, video, and real virtual reality), despite the fact that most participants had relatively little prior subject knowledge.
However, the more abstract graphics were only slightly preferred to the more concrete graphics. In the second phase, the more preferred multimedia types in each category from the first phase were compared against each other using relative preference, rating, and ranking and overall rating and ranking. Drawing was the most preferred multimedia type overall, although only slightly more than animation and simulated virtual reality. Text was a distant fourth. These results suggest that instructional content for continuing engineering education should include problem solving and should be highly visual.

  4. Quantitative graphical analysis of simultaneous dynamic PET/MRI for assessment of prostate cancer.

    PubMed

    Rosenkrantz, Andrew B; Koesters, Thomas; Vahle, Anne-Kristin; Friedman, Kent; Bartlett, Rachel M; Taneja, Samir S; Ding, Yu-Shin; Logan, Jean

    2015-04-01

Dynamic FDG imaging for prostate cancer characterization is limited by the generally small size and low uptake of prostate tumors. Our aim in this pilot study was to explore the feasibility of simultaneous PET/MRI to guide localization of prostate lesions for dynamic FDG analysis using a graphical approach. Three patients with biopsy-proven prostate cancer underwent simultaneous FDG PET/MRI, incorporating dynamic prostate imaging. Histology and multiparametric MRI findings were used to localize tumors, which in turn guided identification of tumors on FDG images. Regions of interest were manually placed on tumor and benign prostate tissue. Blood activity was extracted from a region of interest placed on the femoral artery on PET images. FDG data were analyzed by graphical analysis using the influx constant Ki (Patlak analysis) when FDG binding seemed irreversible and the distribution volume VT (reversible graphical analysis) when FDG binding seemed reversible given the presence of washout. Given the inherent coregistration, simultaneous acquisition facilitated use of MRI data to localize small lesions on PET and subsequent graphical analysis in all cases. In 2 cases with irreversible binding, tumor had a higher Ki than benign tissue using Patlak analysis (0.023 vs 0.006 and 0.019 vs 0.008 mL/cm3 per minute). In 1 case appearing reversible, tumor had a higher VT than benign tissue using reversible graphical analysis (0.68 vs 0.52 mL/cm3). Simultaneous PET/MRI allows localization of small prostate tumors for dynamic PET analysis. By taking advantage of the inclusion of the femoral arteries in the FOV, we applied advanced PET data analysis methods beyond conventional static measures and without blood sampling.
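
    Patlak analysis itself reduces to a straight-line fit: for irreversible uptake, plotting the tissue-to-plasma ratio against normalized plasma integral gives a line whose slope is Ki. A minimal sketch with synthetic, noiseless data and a constant plasma input (not the study's data):

```python
# Patlak-style graphical analysis on synthetic data: for irreversible
# uptake, y = Ct(t)/Cp(t) vs x = (integral of Cp)/Cp(t) is linear with
# slope Ki. Constant plasma input keeps the synthetic math transparent.

def patlak_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
           / sum((x - mx) ** 2 for x in xs)

Ki, V0, Cp = 0.02, 0.5, 1.0        # true influx constant, offset, plasma input
times = [10, 20, 30, 40, 50]       # minutes
x = [Cp * t / Cp for t in times]   # integral of constant Cp over t, over Cp
y = [(Ki * Cp * t + V0 * Cp) / Cp for t in times]   # tissue/plasma ratio
print(patlak_slope(x, y))   # recovers Ki ~ 0.02
```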

  5. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

Classical statistics is a well-established approach to the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
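
    The classical side of the comparison can be illustrated with a minimal Kaplan-Meier estimator on invented follow-up data.

```python
# Minimal Kaplan-Meier estimator (the classical baseline). Data invented.

def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = event occurred, 0 = censored.
    Returns (time, survival) pairs at each observed event time."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e:
            surv *= 1 - 1 / at_risk
            curve.append((t, surv))
        at_risk -= 1       # both events and censorings leave the risk set
    return curve

km = kaplan_meier([1, 2, 3], [1, 0, 1])   # event, censored, event
print(km)   # survival drops to 2/3 at t=1, then to 0 at t=3
```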

  6. A Statistical Graphical Model of the California Reservoir System

    NASA Astrophysics Data System (ADS)

    Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.

    2017-11-01

    The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
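
    In a Gaussian graphical model of this kind, absent edges correspond to zero partial correlations, which can be read off the inverse covariance (precision) matrix; a toy sketch with three invented "reservoir" series:

```python
# Toy Gaussian-graphical-model computation: partial correlations from the
# precision matrix. The three "reservoir" series are invented; two share a
# common statewide driver, the third is independent.
import numpy as np

rng = np.random.default_rng(0)
common = rng.normal(size=500)                 # shared statewide driver
v1 = common + 0.1 * rng.normal(size=500)
v2 = common + 0.1 * rng.normal(size=500)
v3 = rng.normal(size=500)                     # independent of the others
data = np.column_stack([v1, v2, v3])
prec = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)              # partial correlation matrix
np.fill_diagonal(partial, 1.0)
print(partial.round(2))   # strong 1-2 edge, near-zero edges to node 3
```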

  7. Total Quality Management: Statistics and Graphics III - Experimental Design and Taguchi Methods. AIR 1993 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Schwabe, Robert A.

    Interest in Total Quality Management (TQM) at institutions of higher education has been stressed in recent years as an important area of activity for institutional researchers. Two previous AIR Forum papers have presented some of the statistical and graphical methods used for TQM. This paper, the third in the series, first discusses some of the…

  8. Learning from Animation Enabled by Collaboration

    ERIC Educational Resources Information Center

    Rebetez, Cyril; Betrancourt, Mireille; Sangin, Mirweis; Dillenbourg, Pierre

    2010-01-01

    Animated graphics are extensively used in multimedia instructions explaining how natural or artificial dynamic systems work. As animation directly depicts spatial changes over time, it is legitimate to believe that animated graphics will improve comprehension over static graphics. However, the research failed to find clear evidence in favour of…

  9. Multibody dynamics model building using graphical interfaces

    NASA Technical Reports Server (NTRS)

    Macala, Glenn A.

    1989-01-01

In recent years, the extremely laborious task of manually deriving equations of motion for the simulation of multibody spacecraft dynamics has largely been eliminated. Instead, the dynamicist now works with commonly available general purpose dynamics simulation programs which generate the equations of motion either explicitly or implicitly via computer codes. The user interface to these programs has predominantly been via input data files, each with its own required format and peculiarities, causing errors and frustration during program setup. Recent progress in a more natural method of data input for dynamics programs, the graphical interface, is described.

  10. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
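
    The core computation, averaging the Fourier power of frame differences at a fixed lag, can be sketched as follows on random toy frames (a generic differential-dynamic-microscopy-style structure function, not the authors' optimized GPU code):

```python
# Generic DDM/NFS-style structure function: average |FFT(I(t+dt)-I(t))|^2
# over frame pairs at lag dt. Toy random frames; not the GPU implementation.
import numpy as np

def structure_function(frames, dt):
    acc = None
    pairs = 0
    for t in range(len(frames) - dt):
        diff = frames[t + dt] - frames[t]
        power = np.abs(np.fft.fft2(diff)) ** 2
        acc = power if acc is None else acc + power
        pairs += 1
    return acc / pairs

rng = np.random.default_rng(1)
frames = [rng.normal(size=(32, 32)) for _ in range(10)]
D = structure_function(frames, dt=2)
print(D.shape)   # one power map per wavevector grid point
```

    The per-pair FFTs are independent, which is why this step parallelizes so naturally onto a graphics processing unit.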

  11. [Is there life beyond SPSS? Discover R].

    PubMed

    Elosua Oliden, Paula

    2009-11-01

R is a GNU statistical and programming environment with very high graphical capabilities. It is very powerful for research purposes, but it is also an exceptional tool for teaching. R is composed of more than 1400 packages that allow using it both for simple statistics and for applying the most complex and most recent formal models. Using graphical interfaces like the Rcommander package permits working in user-friendly environments similar to the graphical environment used by SPSS. This last characteristic allows non-statisticians to overcome the obstacle of accessibility, and it makes R the best tool for teaching. Is there anything better? Open, free, affordable, accessible and always on the cutting edge.

  12. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  13. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  14. The Communicability of Graphical Alternatives to Tabular Displays of Statistical Simulation Studies

    PubMed Central

    Cook, Alex R.; Teo, Shanice W. L.

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form. PMID:22132184

  15. The communicability of graphical alternatives to tabular displays of statistical simulation studies.

    PubMed

    Cook, Alex R; Teo, Shanice W L

    2011-01-01

Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted more quickly and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form.
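The recommendation above can be illustrated with a minimal sketch (the helper name and the coverage values are hypothetical, not from the paper): rendering a small simulation-results table as a text dot chart puts the values on a common visual scale, so they can be compared at a glance rather than read digit by digit.

```python
# Hypothetical sketch: a text-based dot chart for simulation results,
# placing one marker per (method, scenario) row along a shared axis.

def ascii_dotplot(results, lo=0.85, hi=1.00, width=40):
    """Render (label, value) pairs as one marker per row on a shared axis."""
    lines = []
    for label, value in results:
        pos = round((value - lo) / (hi - lo) * (width - 1))
        pos = max(0, min(width - 1, pos))          # clip to the axis range
        axis = ["."] * width
        axis[pos] = "*"
        lines.append(f"{label:<22s}|{''.join(axis)}| {value:.3f}")
    return lines

# Hypothetical coverage probabilities for two methods in two scenarios
results = [
    ("method A, n = 50", 0.902),
    ("method A, n = 200", 0.938),
    ("method B, n = 50", 0.871),
    ("method B, n = 200", 0.946),
]
for line in ascii_dotplot(results):
    print(line)
```

A production version would of course use a plotting library, but even this sketch shows the paper's point: deviations from nominal coverage that are hard to spot in a column of digits become immediately visible as marker positions.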

  16. Probabilistic reasoning under time pressure: an assessment in Italian, Spanish and English psychology undergraduates

    NASA Astrophysics Data System (ADS)

    Agus, M.; Hitchcott, P. K.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2016-11-01

    Many studies have investigated the features of probabilistic reasoning developed in relation to different formats of problem presentation, showing that it is affected by various individual and contextual factors. Incomplete understanding of the identity and role of these factors may explain the inconsistent evidence concerning the effect of problem presentation format. Thus, superior performance has sometimes been observed for graphically, rather than verbally, presented problems. The present study was undertaken to address this issue. Psychology undergraduates without any statistical expertise (N = 173 in Italy; N = 118 in Spain; N = 55 in England) were administered statistical problems in two formats (verbal-numerical and graphical-pictorial) under a condition of time pressure. Students also completed additional measures indexing several potentially relevant individual dimensions (statistical ability, statistical anxiety, attitudes towards statistics and confidence). Interestingly, a facilitatory effect of graphical presentation was observed in the Italian and Spanish samples but not in the English one. Significantly, the individual dimensions predicting statistical performance also differed between the samples, highlighting a different role of confidence. Hence, these findings confirm previous observations concerning problem presentation format while simultaneously highlighting the importance of individual dimensions.

  17. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  18. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    PubMed

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of the omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore the valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge less likely exists between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is at https://github.com/Zhangxf-ccnu/pDNA. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
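The core move in a non-paranormal (Gaussian copula) graphical model is to replace each variable by its normal scores before applying any Gaussian estimator. A minimal sketch of that rank-based transform, assuming Python 3.8+ for `statistics.NormalDist` (the function name is hypothetical and the tie handling is simplified relative to a real implementation):

```python
from statistics import NormalDist

def npn_transform(x):
    """Map a sample to normal scores via its ranks: a rank-based sketch of
    the nonparanormal transform. Ties are broken arbitrarily here; a real
    implementation would average tied ranks and truncate extreme scores."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r                         # rank of observation i (1..n)
    nd = NormalDist()
    # rank/(n+1) keeps the argument strictly inside (0, 1)
    return [nd.inv_cdf(r / (n + 1)) for r in ranks]
```

After transforming each gene's expression vector this way, any Gaussian graphical model estimator can be applied to the transformed data, which is how the normality assumption is relaxed.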

  19. A Symbolic and Graphical Computer Representation of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Gould, Laurence I.

    2005-04-01

AUTONO is a Macsyma/Maxima program, designed at the University of Hartford, for solving autonomous systems of differential equations as well as for relating Lagrangians and Hamiltonians to their associated dynamical equations. AUTONO can be used in a number of fields to decipher a variety of complex dynamical systems with ease, producing their Lagrangian and Hamiltonian equations in seconds. These equations can then be incorporated into VisSim, a modeling and simulation program, which yields graphical representations of motion in a given system through easily chosen input parameters. The program, along with the VisSim differential-equations graphical package, allows for resolution and easy understanding of complex problems in a relatively short time, enabling quicker and more advanced computing of dynamical systems on any number of platforms: from a network of sensors on a space probe, to the behavior of neural networks, to the effects of an electromagnetic field on components in a dynamical system. A flowchart of AUTONO, along with some simple applications and VisSim output, will be shown.

  20. Switching among graphic patterns is governed by oscillatory coordination dynamics: implications for understanding handwriting

    PubMed Central

    Zanone, Pier-Giorgio; Athènes, Sylvie

    2013-01-01

Revisiting an original idea by Hollerbach (1981), previous work has established that the production of graphic shapes, assumed to be the blueprint for handwriting, is governed by the dynamics of orthogonal non-linear coupled oscillators. Such dynamics determines few stable coordination patterns, giving rise to a limited set of preferred graphic shapes, namely, four lines and four ellipsoids independent of orientation. The present study investigates the rules of switching among such graphic coordination patterns. Seven participants were required to voluntarily switch within twelve pairs of shapes presented on a graphic tablet. In line with previous theoretical and experimental work on bimanual coordination, results corroborated our hypothesis that the relative stability of the produced coordination patterns determines the time needed for switching: transitions to a more stable pattern took less time, and vice versa. Moreover, switching between patterns with the same orientation but different eccentricities was faster than with a change in orientation. Nonetheless, the switching time covaried strictly with the change in relative phase effected by the transition between two shapes, whether this implied a change in eccentricity or in orientation. These findings suggest a new operational definition of what the (motor) units or strokes of handwriting are and shed a novel light on how coarticulation and recruitment of degrees of freedom may occur in graphic skills. They also yield some leads for understanding the acquisition and the neural underpinnings of handwriting. PMID:24069014

  1. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  2. Descriptive statistics.

    PubMed

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
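The measures of location and spread described in the chapter can be sketched with Python's standard `statistics` module (the sample values below are hypothetical):

```python
import statistics as st
import math

data = [2.3, 3.1, 4.8, 5.0, 5.2, 6.9, 7.4, 9.0, 15.6]   # hypothetical sample

mean = st.mean(data)                    # location, sensitive to outliers
median = st.median(data)                # robust location
sd = st.stdev(data)                     # spread: sample standard deviation
q1, q2, q3 = st.quantiles(data, n=4)    # quartiles (q2 equals the median)
iqr = q3 - q1                           # robust spread: interquartile range

# A log transformation often symmetrises right-skewed data before analysis
logged = [math.log(x) for x in data]
```

Note that the mean exceeds the median here, the usual signature of a right-skewed sample with a high outlier, which is exactly the situation in which the median and IQR are the more informative summaries.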

  3. Comparing Psychology Undergraduates' Performance in Probabilistic Reasoning under Verbal-Numerical and Graphical-Pictorial Problem Presentation Format: What Is the Role of Individual and Contextual Dimensions?

    ERIC Educational Resources Information Center

    Agus, Mirian; Peró-Cebollero, Maribel; Penna, Maria Pietronilla; Guàrdia-Olmos, Joan

    2015-01-01

This study aims to investigate the existence of a graphical facilitation effect on probabilistic reasoning. Measures of undergraduates' performances on problems presented in both verbal-numerical and graphical-pictorial formats have been related to visuo-spatial and numerical prerequisites, to statistical anxiety, to attitudes towards…

  4. Investigating the Effectiveness of Graphic Organizer Instruction on the Comprehension and Recall of Science Content by Students with Learning Disabilities.

    ERIC Educational Resources Information Center

    Griffin, Cynthia C.; And Others

    1991-01-01

    The effect of graphic organizers on 14 learning-disabled students' (grades 5 and 6) recall of science content was compared with that of students taught identical content without the graphic organizer condition. Evaluation two weeks later found no statistically significant differences between the groups on production or choice response tasks. (DB)

  5. IMAT graphics manual

    NASA Technical Reports Server (NTRS)

    Stockwell, Alan E.; Cooper, Paul A.

    1991-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.

  6. A primer on the study of transitory dynamics in ecological series using the scale-dependent correlation analysis.

    PubMed

    Rodríguez-Arias, Miquel Angel; Rodó, Xavier

    2004-03-01

Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
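The core of SDC analysis is a two-way scan: Pearson correlations are computed between short windows of the two series at a chosen scale s, over a range of starting positions and lags, so that strong local correlations flag transitory couplings. A minimal sketch under those assumptions (function names hypothetical; the published method adds significance testing of each window, which is omitted here):

```python
import math

def pearson(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    if va == 0 or vb == 0:
        return 0.0                      # correlation undefined in a flat window
    return cov / math.sqrt(va * vb)

def sdc(x, y, s, max_lag):
    """Correlate every length-s window of x with lagged windows of y.
    Returns {(start, lag): r}; a strong |r| at some (start, lag) flags a
    transitory coupling at that position, scale, and lag."""
    out = {}
    for start in range(len(x) - s + 1):
        for lag in range(-max_lag, max_lag + 1):
            j = start + lag
            if 0 <= j and j + s <= len(y):
                out[(start, lag)] = pearson(x[start:start + s], y[j:j + s])
    return out
```

Varying the window size s is what makes the method scale-dependent: small s picks up brief episodes of coupling that a whole-series correlation would average away.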

  7. Empirical comparison study of approximate methods for structure selection in binary graphical models.

    PubMed

    Viallon, Vivian; Banerjee, Onureena; Jougla, Eric; Rey, Grégoire; Coste, Joel

    2014-03-01

Looking for associations among multiple variables is a topical issue in statistics due to the increasing amount of data encountered in biology, medicine, and many other domains involving statistical applications. Graphical models have recently gained popularity for this purpose in the statistical literature. In the binary case, however, exact inference is generally very slow or even intractable because of the form of the so-called log-partition function. In this paper, we review various approximate methods for structure selection in binary graphical models that have recently been proposed in the literature and compare them through an extensive simulation study. We also propose a modification of one existing method, which is shown to achieve good performance and to be generally very fast. We conclude with an application in which we search for associations among causes of death recorded on French death certificates. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
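One standard family of approximate methods for this problem is neighbourhood selection: regress each binary variable on all the others with an L1-penalised logistic regression, and connect pairs whose coefficients are nonzero. A minimal sketch under those assumptions (function names, the proximal-gradient solver, and the tuning constants are illustrative choices, not the paper's specific algorithms):

```python
import numpy as np

def l1_logistic(X, y, lam, lr=0.5, iters=1000):
    """L1-penalised logistic regression by proximal gradient (ISTA);
    the intercept is left unpenalised."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(iters):
        r = 1.0 / (1.0 + np.exp(-(X @ w + b))) - y   # residual of sigmoid fit
        b -= lr * r.mean()
        w -= lr * (X.T @ r) / n
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft-threshold
    return w

def neighbourhood_select(X, lam=0.1):
    """Approximate structure selection for a binary (Ising-type) graphical
    model: regress each variable on all others, connect nonzero pairs."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        w = l1_logistic(X[:, others], X[:, j], lam)
        for k, wk in zip(others, w):
            adj[j, k] = abs(wk) > 1e-8
    return adj | adj.T          # symmetrise with the OR rule
```

This pseudo-likelihood approach sidesteps the intractable log-partition function entirely, which is why variants of it dominate the approximate methods compared in studies like this one; the AND rule (`adj & adj.T`) is the common, more conservative alternative to the OR rule used above.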

  8. Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, A.; Walkowicz, K.

In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.

  9. A Guide to the Literature on Learning Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Friedland, Peter (Technical Monitor)

    1994-01-01

    This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and more generally, learning probabilistic graphical models. Because many problems in artificial intelligence, statistics and neural networks can be represented as a probabilistic graphical model, this area provides a unifying perspective on learning. This paper organizes the research in this area along methodological lines of increasing complexity.

  10. Introductory comments on the USGS geographic applications program

    NASA Technical Reports Server (NTRS)

    Gerlach, A. C.

    1970-01-01

The third phase of remote sensing technologies and potentials applied to the operations of the U.S. Geological Survey is introduced. Remote sensing data are combined with multidisciplinary spatial data from traditional sources, with geographic theory, and with techniques of environmental modeling. These combined inputs are subjected to four sequential activities that involve: (1) thematic mapping of land use and environmental factors; (2) the dynamics of change detection; (3) environmental surveillance to identify sudden changes and general trends; and (4) preparation of statistical models and analytical reports. Geography program functions, products, clients, and goals are presented in graphical form, along with aircraft photo missions, geography test sites, and FY-70.

  11. Evaluation of accelerated iterative x-ray CT image reconstruction using floating point graphics hardware.

    PubMed

    Kole, J S; Beekman, F J

    2006-02-21

    Statistical reconstruction methods offer possibilities to improve image quality as compared with analytical methods, but current reconstruction times prohibit routine application in clinical and micro-CT. In particular, for cone-beam x-ray CT, the use of graphics hardware has been proposed to accelerate the forward and back-projection operations, in order to reduce reconstruction times. In the past, wide application of this texture hardware mapping approach was hampered owing to limited intrinsic accuracy. Recently, however, floating point precision has become available in the latest generation commodity graphics cards. In this paper, we utilize this feature to construct a graphics hardware accelerated version of the ordered subset convex reconstruction algorithm. The aims of this paper are (i) to study the impact of using graphics hardware acceleration for statistical reconstruction on the reconstructed image accuracy and (ii) to measure the speed increase one can obtain by using graphics hardware acceleration. We compare the unaccelerated algorithm with the graphics hardware accelerated version, and for the latter we consider two different interpolation techniques. A simulation study of a micro-CT scanner with a mathematical phantom shows that at almost preserved reconstructed image accuracy, speed-ups of a factor 40 to 222 can be achieved, compared with the unaccelerated algorithm, and depending on the phantom and detector sizes. Reconstruction from physical phantom data reconfirms the usability of the accelerated algorithm for practical cases.

  12. On the significance of δ13C correlations in ancient sediments

    NASA Astrophysics Data System (ADS)

    Derry, Louis A.

    2010-08-01

A graphical analysis of the correlations between δc and εTOC was introduced by Rothman et al. (2003) to obtain estimates of the carbon isotopic composition of inputs to the oceans and the organic carbon burial fraction. Applied to Cenozoic data, the method agrees with independent estimates, but with Neoproterozoic data the method yields results that cannot be accommodated with standard models of sedimentary carbon isotope mass balance. We explore the sensitivity of the graphical correlation method and find that the variance ratio between δc and δo is an important control on the correlation of δc and ε. If the variance ratio σc/σo ≥ 1, highly correlated arrays very similar to those obtained from the data are produced from independent random variables. The Neoproterozoic data show such variance patterns, and the regression parameters for the Neoproterozoic data are statistically indistinguishable from the randomized model at the 95% confidence interval. The projection of the data into δc-ε space cannot distinguish between signal and noise, such as post-depositional alteration, under these circumstances. There appears to be no need to invoke unusual carbon cycle dynamics to explain the Neoproterozoic δc-ε array. The Cenozoic data have σc/σo < 1 and the δc vs. ε correlation is probably geologically significant, but the analyzed sample size is too small to yield statistically significant results.
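The variance-ratio argument is easy to reproduce numerically. Taking ε as the difference δc − δo (an assumption consistent with the fractionation definition used above), fully independent noise in δc and δo still yields corr(δc, ε) = σc/√(σc² + σo²), so a large σc/σo manufactures a strong correlation from pure noise. A minimal Monte Carlo sketch (names hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

def spurious_corr(sigma_c, sigma_o, n=5000):
    """Correlation of delta_c with eps = delta_c - delta_o when the two
    delta series are fully independent Gaussian noise."""
    dc = rng.normal(0.0, sigma_c, n)
    do = rng.normal(0.0, sigma_o, n)
    return np.corrcoef(dc, dc - do)[0, 1]

# Theory: corr = sigma_c / sqrt(sigma_c**2 + sigma_o**2)
print(spurious_corr(2.0, 1.0))   # ~0.89: strong "correlation" from pure noise
print(spurious_corr(0.5, 1.0))   # ~0.45: much weaker when sigma_c < sigma_o
```

This is the crux of the paper's caution: a tight δc-ε array is only evidence of carbon-cycle signal when σc/σo is small enough that the noise-only expectation is weak.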

  13. Taming Crowded Visual Scenes

    DTIC Science & Technology

    2014-08-12

Nolan Warner, Mubarak Shah. Tracking in Dense Crowds Using Prominence and Neighborhood Motion Concurrence, IEEE Transactions on Pattern Analysis... of computer vision, computer graphics and evacuation dynamics by providing a common platform, and provides... areas that include Computer Vision, Computer Graphics, and Pedestrian Evacuation Dynamics. Despite the

  14. ProUCL version 4.1.00 Documentation Downloads

    EPA Pesticide Factsheets

ProUCL version 4.1.00 represents a comprehensive statistical software package equipped with statistical methods and graphical tools needed to address many environmental sampling and statistical issues as described in various guidance documents.

  15. SVG-Based Web Publishing

    NASA Astrophysics Data System (ADS)

    Gao, Jerry Z.; Zhu, Eugene; Shim, Simon

    2003-01-01

With the increasing applications of the Web in e-commerce, advertising, and publication, new technologies are needed to overcome the limitations of current Web graphics. The SVG (Scalable Vector Graphics) technology is a revolutionary solution to the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphic formats, allowing complicated graphics to be presented with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages. It also reports a comparison study between SVG and other web graphics technologies.
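The "plain-text format commands" point is concrete: an SVG document is just markup that any program can emit as a string. A minimal sketch generating a scatterplot as SVG text (the helper name is hypothetical, not part of any SVG library):

```python
def scatter_svg(points, width=200, height=100):
    """Emit a minimal SVG scatterplot document as plain text."""
    parts = [
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{width}" height="{height}">'
    ]
    for x, y in points:
        # one vector circle per data point; scales without pixelation
        parts.append(f'  <circle cx="{x}" cy="{y}" r="3" fill="steelblue"/>')
    parts.append('</svg>')
    return '\n'.join(parts)

doc = scatter_svg([(20, 80), (60, 40), (120, 25), (170, 10)])
print(doc)
```

Because the output is vector markup rather than a raster image, the same document renders sharply at any zoom level or print resolution, which is the advantage the paper emphasises.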

  16. The effect of a graphical interpretation of a statistic trend indicator (Trigg's Tracking Variable) on the detection of simulated changes.

    PubMed

    Kennedy, R R; Merry, A F

    2011-09-01

Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has been previously demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model, and updated every second; after a period of stability a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles, 95% confidence interval 0.4 to 2.5 cycles, P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%], without trend detection 100% [98 to 100%]; P = 0.8). We conclude that simple statistical trend detection may speed detection of changes during routine anaesthesia, even when a graphical trend display is present.
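Trigg's Tracking Variable has a simple closed form: the exponentially smoothed forecast error divided by the exponentially smoothed absolute error, which lies in [-1, 1] and approaches ±1 during a sustained trend. A minimal sketch under those assumptions (function name, smoothing constant, and the simple exponential-smoothing forecast are illustrative choices, not the paper's exact implementation):

```python
def triggs_tracking(xs, alpha=0.2):
    """Trigg's tracking signal for a series xs: smoothed error over
    smoothed absolute error, in [-1, 1]; magnitudes near 1 flag a trend."""
    level = xs[0]          # simple exponential-smoothing forecast
    E, M = 0.0, 1e-9       # smoothed error and smoothed |error| (M > 0)
    signal = []
    for x in xs[1:]:
        e = x - level                       # one-step forecast error
        E = alpha * e + (1 - alpha) * E
        M = alpha * abs(e) + (1 - alpha) * M
        level += alpha * e                  # update the forecast
        signal.append(E / M)
    return signal

# A stable series keeps the signal near 0; a step change drives it toward 1
stable = triggs_tracking([100.0] * 30)
step = triggs_tracking([100.0] * 20 + [120.0] * 10)
```

Mapping |signal| onto a few discrete levels, as in the five-level display studied here, turns this into the kind of at-a-glance trend indicator the experiment evaluated.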

  17. Graphics supercomputer for computational fluid dynamics research

    NASA Astrophysics Data System (ADS)

    Liaw, Goang S.

    1994-11-01

The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment including a desktop personal computer, PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith were also purchased. A reading room has been converted to a research computer lab by adding some furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  18. phiGENOME: an integrative navigation throughout bacteriophage genomes.

    PubMed

    Stano, Matej; Klucar, Lubos

    2011-11-01

phiGENOME is a web-based genome browser generating dynamic and interactive graphical representation of phage genomes stored in the phiSITE, database of gene regulation in bacteriophages. phiGENOME is an integral part of the phiSITE web portal (http://www.phisite.org/phigenome) and it was optimised for visualisation of phage genomes with the emphasis on the gene regulatory elements. phiGENOME consists of three components: (i) genome map viewer built using Adobe Flash technology, providing dynamic and interactive graphical display of phage genomes; (ii) sequence browser based on precisely formatted HTML tags, providing detailed exploration of genome features on the sequence level and (iii) regulation illustrator, based on Scalable Vector Graphics (SVG) and designed for graphical representation of gene regulations. Bringing 542 complete genome sequences accompanied by their rich annotations and references makes phiGENOME a unique information resource in the field of phage genomics. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. 3D Modeling as Method for Construction and Analysis of Graphic Objects

    NASA Astrophysics Data System (ADS)

    Kheyfets, A. L.; Vasilieva, V. N.

    2017-11-01

The use of 3D modeling when constructing and analyzing perspective projections and shadows is considered. The creation of a photorealistic image is shown. The perspective of the construction project and characterization of its image are given as an example. The authors consider the construction of a dynamic block as a means of graphical information storage and automation of geometric constructions. The construction of a dynamic block is demonstrated using the example of a truss node. The constructions are considered as applied to the AutoCAD software. The paper is aimed at improving the graphic methods of architectural design and improving the educational process when training the Bachelor's degree students majoring in construction.

  20. Stochastic inference with spiking neurons in the high-conductance state

    NASA Astrophysics Data System (ADS)

    Petrovici, Mihai A.; Bill, Johannes; Bytschok, Ilja; Schemmel, Johannes; Meier, Karlheinz

    2016-10-01

    The highly variable dynamics of neocortical circuits observed in vivo have been hypothesized to represent a signature of ongoing stochastic inference but stand in apparent contrast to the deterministic response of neurons measured in vitro. Based on a propagation of the membrane autocorrelation across spike bursts, we provide an analytical derivation of the neural activation function that holds for a large parameter space, including the high-conductance state. On this basis, we show how an ensemble of leaky integrate-and-fire neurons with conductance-based synapses embedded in a spiking environment can attain the correct firing statistics for sampling from a well-defined target distribution. For recurrent networks, we examine convergence toward stationarity in computer simulations and demonstrate sample-based Bayesian inference in a mixed graphical model. This points to a new computational role of high-conductance states and establishes a rigorous link between deterministic neuron models and functional stochastic dynamics on the network level.

  1. GATA: A graphic alignment tool for comparative sequence analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nix, David A.; Eisen, Michael B.

    2005-01-01

    Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dotplot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments.
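    The dotplot idea the abstract critiques can be sketched in a few lines (the sequences and word size are illustrative assumptions):

```python
# A toy dotplot: mark every (i, j) where a word-length substring of seq_a
# matches one in seq_b. Plotting these dots visualizes relatedness, but no
# actual alignment with gaps is produced, which is the limitation the
# abstract points out.
def dotplot(seq_a, seq_b, word=3):
    """Return (i, j) hit positions for exact word matches between two sequences."""
    hits = []
    for i in range(len(seq_a) - word + 1):
        for j in range(len(seq_b) - word + 1):
            if seq_a[i:i + word] == seq_b[j:j + word]:
                hits.append((i, j))
    return hits

# Self-comparison: the main diagonal appears, plus off-diagonal repeats.
hits = dotplot("ACGTACGT", "ACGTACGT")
```

    An inversion would appear as an anti-diagonal run of dots, which a dynamic programming aligner assuming collinearity would miss.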

  2. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    PubMed Central

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337

  3. Computer Graphics Simulations of Sampling Distributions.

    ERIC Educational Resources Information Center

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
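    The kind of simulation described can be sketched in a few lines (parameters and sample sizes are illustrative assumptions, not the authors' classroom graphics):

```python
# Build the sampling distribution of the sample proportion by simulation and
# compare it with the theoretical mean p and standard error sqrt(p*(1-p)/n).
# All parameters are illustrative assumptions.
import math
import random

def sample_proportions(p=0.3, n=50, trials=20_000, seed=1):
    """Simulate `trials` samples of size n; return each sample's proportion."""
    rng = random.Random(seed)
    return [sum(rng.random() < p for _ in range(n)) / n for _ in range(trials)]

props = sample_proportions()
mean = sum(props) / len(props)
se = math.sqrt(sum((x - mean) ** 2 for x in props) / len(props))
# mean is close to p = 0.3; se is close to sqrt(0.3 * 0.7 / 50)
```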

  4. Photo-Realistic Statistical Skull Morphotypes: New Exemplars for Ancestry and Sex Estimation in Forensic Anthropology.

    PubMed

    Caple, Jodi; Stephan, Carl N

    2017-05-01

    Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars-and their statistical exaggerations or extremes-retain the high-resolution detail of the original photographic dataset, making them the ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.

  5. Recent developments of NASTRAN pre- and post-processors: Response spectrum analysis (RESPAN) and interactive graphics (GIFTS)

    NASA Technical Reports Server (NTRS)

    Hirt, E. F.; Fox, G. L.

    1982-01-01

    Two specific NASTRAN preprocessors and postprocessors are examined: a postprocessor for dynamic analysis and a graphical interactive package for model generation and review of results. A computer program that provides response spectrum analysis capability based on data from a NASTRAN finite element model is described, and the GIFTS system, a graphic processor that augments NASTRAN, is introduced.

  6. A graphical vector autoregressive modelling approach to the analysis of electronic diary data

    PubMed Central

    2010-01-01

    Background In recent years, electronic diaries are increasingly used in medical research and practice to investigate patients' processes and fluctuations in symptoms over time. To model dynamic dependence structures and feedback mechanisms between symptom-relevant variables, a multivariate time series method has to be applied. Methods We propose to analyse the temporal interrelationships among the variables by a structural modelling approach based on graphical vector autoregressive (VAR) models. We give a comprehensive description of the underlying concepts and explain how the dependence structure can be recovered from electronic diary data by a search over suitable constrained (graphical) VAR models. Results The graphical VAR approach is applied to the electronic diary data of 35 obese patients with and without binge eating disorder (BED). The dynamic relationships for the two subgroups between eating behaviour, depression, anxiety and eating control are visualized in two path diagrams. Results show that the two subgroups of obese patients with and without BED are distinguishable by the temporal patterns which influence their respective eating behaviours. Conclusion The use of the graphical VAR approach for the analysis of electronic diary data leads to a deeper insight into patient's dynamics and dependence structures. An increasing use of this modelling approach could lead to a better understanding of complex psychological and physiological mechanisms in different areas of medical care and research. PMID:20359333
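    The estimation step underlying a VAR model can be sketched with plain least squares (the dimensions, coefficients, and threshold are illustrative assumptions; the paper's actual procedure searches over constrained graphical VAR models):

```python
# Simulate a VAR(1) process y_t = A y_{t-1} + noise, estimate A by least
# squares, and read a crude dependence structure off the larger coefficients.
# The true matrix A and the 0.15 threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.5, 0.3, 0.0],
              [0.0, 0.4, 0.0],
              [0.0, 0.0, 0.6]])   # true temporal dependence structure
T = 2000
y = np.zeros((T, 3))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=3)

# Least-squares regression of y_t on y_{t-1} recovers A (transposed solution).
A_hat, *_ = np.linalg.lstsq(y[:-1], y[1:], rcond=None)
A_hat = A_hat.T
edges = np.abs(A_hat) > 0.15   # crude threshold standing in for model search
```

    In the paper's approach the thresholding step is replaced by a principled search over graphical VAR models, but the recovered edge pattern plays the same role as the path diagrams described above.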

  7. 17 CFR 229.1111 - (Item 1111) Pool assets.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... information for the asset pool, including statistical information regarding delinquencies and losses. (d.... Present statistical information in tabular or graphical format, if such presentation will aid understanding. Present statistical information in appropriate distributional groups or incremental ranges in...

  8. Use of graphics and symbols on dynamic message signs : technical report.

    DOT National Transportation Integrated Search

    2009-03-01

    This project has taken a step toward defining how graphic and symbol displays can improve or assist communication with drivers. Through three human factors evaluations of alternative designs, researchers identified specific design elements that shoul...

  9. Using Computer Graphics in Statistics.

    ERIC Educational Resources Information Center

    Kerley, Lyndell M.

    1990-01-01

    Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and tests of the normality of the distribution of the sample means. (KR)

  10. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  11. Easy GROMACS: A Graphical User Interface for GROMACS Molecular Dynamics Simulation Package

    NASA Astrophysics Data System (ADS)

    Dizkirici, Ayten; Tekpinar, Mustafa

    2015-03-01

    GROMACS is a widely used molecular dynamics simulation package. Since it is a command-driven program, it is difficult to use for molecular biologists, biochemists, new graduate students, and undergraduate researchers who are interested in molecular dynamics simulations. To alleviate this problem, we wrote a graphical user interface that simplifies protein preparation for a classical molecular dynamics simulation. Our program works with various GROMACS versions and can perform essential analyses of GROMACS trajectories as well as protein preparation. We named our open source program `Easy GROMACS'. Easy GROMACS can give researchers more time for scientific research instead of dealing with technical intricacies.

  12. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  13. Quick Overview Scout 2008 Version 1.0

    EPA Science Inventory

    The Scout 2008 version 1.0 statistical software package has been updated from past DOS and Windows versions to provide classical and robust univariate and multivariate graphical and statistical methods that are not typically available in commercial or freeware statistical softwar...

  14. Item Screening in Graphical Loglinear Rasch Models

    ERIC Educational Resources Information Center

    Kreiner, Svend; Christensen, Karl Bang

    2011-01-01

    In behavioural sciences, local dependence and DIF are common, and purification procedures that eliminate items with these weaknesses often result in short scales with poor reliability. Graphical loglinear Rasch models (Kreiner & Christensen, in "Statistical Methods for Quality of Life Studies," ed. by M. Mesbah, F.C. Cole & M.T.…

  15. Study on application of dynamic monitoring of land use based on mobile GIS technology

    NASA Astrophysics Data System (ADS)

    Tian, Jingyi; Chu, Jian; Guo, Jianxing; Wang, Lixin

    2006-10-01

    Land use dynamic monitoring is an important means of keeping land use data up to date in real time. Mobile GIS technology integrates GIS, GPS and the Internet; it can update historical data in real time with site-collected data and realize large-scale, high-precision data updates. Monitoring methods for land use change data with mobile GIS technology are discussed. A mobile GIS terminal was self-developed for this study using a GPS-25 OEM receiver and a notebook computer, and the RTD (real-time difference) operation mode was selected. A mobile GIS system for dynamic monitoring of land use was developed with Visual C++ as the operating platform, the MapObjects control as the graphic platform and the MSComm control as the communication platform, realizing an organic integration of GPS, GPRS and GIS. The system provides such basic functions as data processing, graphic display, graphic editing, attribute query and navigation. Qinhuangdao city was selected as the experimental area. The study results show that the mobile GIS integration system for dynamic monitoring of land use developed in this study has practical application value.

  16. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  17. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  18. General purpose molecular dynamics simulations fully implemented on graphics processing units

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Lorenz, Chris D.; Travesset, A.

    2008-05-01

    Graphics processing units (GPUs), originally developed for rendering real-time effects in computer games, now provide unprecedented computational power for scientific applications. In this paper, we develop a general purpose molecular dynamics code that runs entirely on a single GPU. It is shown that our GPU implementation provides a performance equivalent to that of a fast 30-processor-core distributed memory cluster. Our results show that GPUs already provide an inexpensive alternative to such clusters, and we discuss implications for the future.
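    The pairwise force-field arithmetic that such GPU codes parallelize can be sketched on the CPU (a bare 12-6 Lennard-Jones term with illustrative parameters, not the paper's full force field):

```python
# Total 12-6 Lennard-Jones energy over all particle pairs. In a GPU
# implementation each pair (or each particle's pair list) is evaluated in
# parallel; this serial loop shows only the arithmetic being parallelized.
def lj_energy(positions, epsilon=1.0, sigma=1.0):
    """Sum 4*eps*((sigma/r)^12 - (sigma/r)^6) over all pairs of 3D points."""
    total = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r2 = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j]))
            s6 = (sigma * sigma / r2) ** 3
            total += 4.0 * epsilon * (s6 * s6 - s6)
    return total

# Two particles at the potential minimum r = 2**(1/6)*sigma give energy -epsilon.
e_min = lj_energy([(0.0, 0.0, 0.0), (2 ** (1 / 6), 0.0, 0.0)])
```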

  19. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.
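    The statistical quantities the programs generate (mean, variance, standard deviation) can be sketched for one data channel (a generic computation, not the original HP FORTRAN code):

```python
# Mean, population variance, and standard deviation of one data channel,
# the per-channel statistics reported before plotting.
def channel_stats(samples):
    """Return (mean, variance, standard deviation) of a list of readings."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, var, var ** 0.5

mean, var, std = channel_stats([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
# mean = 5.0, var = 4.0, std = 2.0
```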

  20. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.

    PubMed

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-09-01

    We introduce a nonparametric method for estimating non-gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.

  1. Writing a Scientific Paper II. Communication by Graphics

    NASA Astrophysics Data System (ADS)

    Sterken, C.

    2011-07-01

    This paper discusses facets of visual communication by way of images, graphs, diagrams and tabular material. Design types and elements of graphical images are presented, along with advice on how to create graphs, and on how to read graphical illustrations. This is done in astronomical context, using case studies and historical examples of good and bad graphics. Design types of graphs (scatter and vector plots, histograms, pie charts, ternary diagrams and three-dimensional surface graphs) are explicated, as well as the major components of graphical images (axes, legends, textual parts, etc.). The basic features of computer graphics (image resolution, vector images, bitmaps, graphical file formats and file conversions) are explained, as well as concepts of color models and of color spaces (with emphasis on aspects of readability of color graphics by viewers suffering from color-vision deficiencies). Special attention is given to the verity of graphical content, and to misrepresentations and errors in graphics and associated basic statistics. Dangers of dot joining and curve fitting are discussed, with emphasis on the perception of linearity, the issue of nonsense correlations, and the handling of outliers. Finally, the distinction between data, fits and models is illustrated.

  2. Gromita: a fully integrated graphical user interface to gromacs 4.

    PubMed

    Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia

    2009-09-07

    Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.

  3. Study of journal bearing dynamics using 3-dimensional motion picture graphics

    NASA Technical Reports Server (NTRS)

    Brewe, D. E.; Sosoka, D. J.

    1985-01-01

    Computer generated motion pictures of three dimensional graphics are being used to analyze journal bearings under dynamically loaded conditions. The motion pictures simultaneously present the motion of the journal and the pressures predicted within the fluid film of the bearing as they evolve in time. The correct prediction of these fluid film pressures can be complicated by the development of cavitation within the fluid. The numerical model that is used predicts the formation of the cavitation bubble and its growth, downstream movement, and subsequent collapse. A complete physical picture is created in the motion picture as the journal traverses through the entire dynamic cycle.

  4. Novice Interpretations of Progress Monitoring Graphs: Extreme Values and Graphical Aids

    ERIC Educational Resources Information Center

    Newell, Kirsten W.; Christ, Theodore J.

    2017-01-01

    Curriculum-Based Measurement of Reading (CBM-R) is frequently used to monitor instructional effects and evaluate response to instruction. Educators often view the data graphically on a time-series graph that might include a variety of statistical and visual aids, which are intended to facilitate the interpretation. This study evaluated the effects…

  5. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    PubMed

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible to the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to execute RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.

  6. Methodology to assess clinical liver safety data.

    PubMed

    Merz, Michael; Lee, Kwan R; Kullak-Ublick, Gerd A; Brueckner, Andreas; Watkins, Paul B

    2014-11-01

    Analysis of liver safety data has to be multivariate by nature and needs to take into account time dependency of observations. Current standard tools for liver safety assessment such as summary tables, individual data listings, and narratives address these requirements to a limited extent only. Using graphics in the context of a systematic workflow including predefined graph templates is a valuable addition to standard instruments, helping to ensure completeness of evaluation, and supporting both hypothesis generation and testing. Employing graphical workflows interactively allows analysis in a team-based setting and facilitates identification of the most suitable graphics for publishing and regulatory reporting. Another important tool is statistical outlier detection, accounting for the fact that for assessment of Drug-Induced Liver Injury, identification and thorough evaluation of extreme values has much more relevance than measures of central tendency in the data. Taken together, systematical graphical data exploration and statistical outlier detection may have the potential to significantly improve assessment and interpretation of clinical liver safety data. A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials.
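    One common form of the statistical outlier detection the authors advocate is Tukey's fences on a laboratory parameter (the rule, multiplier, and example values are generic assumptions, not the workshop's specific method):

```python
# Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences), a simple
# outlier rule that prioritizes extreme values over central tendency.
# The quartile approximation and example ALT values are illustrative.
def tukey_outliers(values, k=1.5):
    """Return the values falling outside the Tukey fences."""
    s = sorted(values)
    n = len(s)
    q1 = s[n // 4]            # crude quartile positions for a small sample
    q3 = s[(3 * n) // 4]
    iqr = q3 - q1
    return [x for x in values if x < q1 - k * iqr or x > q3 + k * iqr]

# Hypothetical ALT values (U/L) with one extreme elevation:
alt = [22, 25, 19, 31, 28, 24, 27, 30, 26, 23, 29, 410]
flagged = tukey_outliers(alt)
```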

  7. CAMERRA: An analysis tool for the computation of conformational dynamics by evaluating residue-residue associations.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye

    2018-02-21

    A computational method which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts is presented. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can bridge the results for further visualization using graphics software. The aim of this method is to provide an analysis of conformations of biopolymers from the contact viewpoint. It may assist in systematically uncovering conformational switching mechanisms in proteins and biopolymer systems in general by statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.

  8. The End of the Rainbow? Color Schemes for Improved Data Graphics

    NASA Astrophysics Data System (ADS)

    Light, Adam; Bartlein, Patrick J.

    2004-10-01

    Modern computer displays and printers enable the widespread use of color in scientific communication, but the expertise for designing effective graphics has not kept pace with the technology for producing them. Historically, even the most prestigious publications have tolerated high defect rates in figures and illustrations, and technological advances that make creating and reproducing graphics easier do not appear to have decreased the frequency of errors. Flawed graphics consequently beget more flawed graphics as authors emulate published examples. Color has the potential to enhance communication, but design mistakes can result in color figures that are less effective than gray scale displays of the same data. Empirical research on human subjects can build a fundamental understanding of visual perception and scientific methods can be used to evaluate existing designs, but creating effective data graphics is a design task and not fundamentally a scientific pursuit. Like writing well, creating good data graphics requires a combination of formal knowledge and artistic sensibility tempered by experience: a combination of "substance, statistics, and design".

  9. High-performance dynamic quantum clustering on graphics processors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittek, Peter, E-mail: peterwittek@acm.org

    2013-01-15

    Clustering methods in machine learning may benefit from borrowing metaphors from physics. Dynamic quantum clustering associates a Gaussian wave packet with the multidimensional data points and regards them as eigenfunctions of the Schroedinger equation. The clustering structure emerges by letting the system evolve, and the visual nature of the algorithm has been shown to be useful in a range of applications. Furthermore, the method only uses matrix operations, which readily lend themselves to parallelization. In this paper, we develop an implementation on graphics hardware and investigate how this approach can accelerate the computations. We achieve a speedup of up to two orders of magnitude over a multicore CPU implementation, which shows that quantum-like methods and acceleration by graphics processing units have great relevance to machine learning.
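    The matrix-operation core the abstract highlights can be sketched via the Parzen-window wavefunction from which quantum clustering starts (the width sigma and the data points are illustrative assumptions):

```python
# Parzen-window wavefunction psi(x) = sum_i exp(-|x - x_i|^2 / (2 sigma^2)),
# a sum of Gaussian wave packets over the data, evaluated as one dense
# matrix operation, the kind of computation a GPU accelerates.
import numpy as np

def parzen_psi(points, data, sigma=1.0):
    """Evaluate the Gaussian-sum wavefunction at each row of `points`."""
    d2 = ((points[:, None, :] - data[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)).sum(axis=1)

data = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])  # two close points, one far
psi = parzen_psi(np.array([[0.05, 0.0], [5.0, 5.0]]), data, sigma=0.5)
# psi is larger near the two clustered points than near the lone point
```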

  10. Multimedia category preferences of working engineers

    NASA Astrophysics Data System (ADS)

    Baukal, Charles E.; Ausburn, Lynna J.

    2016-09-01

    Many have argued for the importance of continuing engineering education (CEE), but relatively few recommendations were found in the literature for how to use multimedia technologies to deliver it most effectively. The study reported here addressed this gap by investigating the multimedia category preferences of working engineers. Four categories of multimedia, with two types in each category, were studied: verbal (text and narration), static graphics (drawing and photograph), dynamic non-interactive graphics (animation and video), and dynamic interactive graphics (simulated virtual reality (VR) and photo-real VR). The results showed that working engineers strongly preferred text over narration and somewhat preferred drawing over photograph, animation over video, and simulated VR over photo-real VR. These results suggest that a variety of multimedia types should be used in the instructional design of CEE content.

  11. The timber industries of Kentucky, 1986

    Treesearch

    Eric H. Wharton; Stephen C. Kayse; Robert L. Nevel, Jr.

    1992-01-01

    A statistical report based on a survey of primary wood manufacturers using wood from Kentucky. Contains statistics on production and consumption of industrial forest products by species, geographic units, and state; and production and disposition of manufacturing residues. Includes graphics and statistical tables showing current and historical data.

  12. The effects of a dynamic graphical model during simulation-based training of console operation skill

    NASA Technical Reports Server (NTRS)

    Farquhar, John D.; Regian, J. Wesley

    1993-01-01

    LOADER is a Windows-based simulation of a complex procedural task. The task requires subjects to execute long sequences of console-operation actions (e.g., button presses, switch actuations, dial rotations) to accomplish specific goals. The LOADER interface is a graphical computer-simulated console which controls railroad cars, tracks, and cranes in a fictitious railroad yard. We hypothesized that acquisition of LOADER performance skill would be supported by the representation of a dynamic graphical model linking console actions to goals and goal states in the 'railroad yard'. Twenty-nine subjects were randomly assigned to one of two treatments (i.e., dynamic model or no model). During training, both groups received identical text-based instruction in an instructional window above the LOADER interface. One group, however, additionally saw a dynamic version of the bird's-eye view of the railroad yard. After training, both groups were tested under identical conditions. They were asked to perform the complete procedure without guidance and without access to either type of railroad yard representation. Results indicate that rather than becoming dependent on the animated rail yard model, subjects in the dynamic model condition apparently internalized the model, as evidenced by their performance after the model was removed.

  13. Discrete Dynamical Modeling.

    ERIC Educational Resources Information Center

    Sandefur, James T.

    1991-01-01

    Discussed is the process of translating situations involving changing quantities into mathematical relationships. This process, called dynamical modeling, allows students to learn new mathematics while sharpening their algebraic skills. A description of dynamical systems, problem-solving methods, a graphical analysis, and available classroom…

  14. A visual basic program to generate sediment grain-size statistics and to extrapolate particle distributions

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.; Hastings, M.E.

    2004-01-01

    Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next to last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000)1.Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm microns are adequate for most freshwater and near shore marine sediments, samples from many deeper water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation.The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. 
It is written in Microsoft Visual Basic 6.0 and provides a window to facilitate program execution. The input for the sediment fractions is weight percentages in whole-phi notation (Krumbein, 1934; Inman, 1952), and the program permits the user to select output in either method of moments or inclusive graphics statistics (Fig. 1). Users select options primarily with mouse-click events, or through interactive dialogue boxes.
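    The method-of-moments statistics GSSTAT reports can be illustrated with a short sketch (this is not GSSTAT's Visual Basic code; the function below simply applies the standard Krumbein-and-Pettijohn moment formulas to weight percentages binned at phi midpoints, and assumes a nonzero spread):

    ```python
    # Illustrative sketch only (not GSSTAT itself): method-of-moments
    # grain-size statistics from weight percentages binned at phi midpoints.
    def moment_statistics(phi_midpoints, weight_pct):
        """Return (mean, sorting, skewness, kurtosis) in phi units."""
        total = sum(weight_pct)  # should be ~100 for percentage data
        mean = sum(f * m for f, m in zip(weight_pct, phi_midpoints)) / total
        var = sum(f * (m - mean) ** 2
                  for f, m in zip(weight_pct, phi_midpoints)) / total
        sorting = var ** 0.5  # standard deviation, a measure of sorting
        skew = (sum(f * (m - mean) ** 3
                    for f, m in zip(weight_pct, phi_midpoints))
                / (total * sorting ** 3))
        kurt = (sum(f * (m - mean) ** 4
                    for f, m in zip(weight_pct, phi_midpoints))
                / (total * sorting ** 4))
        return mean, sorting, skew, kurt
    ```

    For a symmetric distribution centered on 2 phi, the skewness comes out to zero, as expected.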

  15. Human sense utilization method on real-time computer graphics

    NASA Astrophysics Data System (ADS)

    Maehara, Hideaki; Ohgashi, Hitoshi; Hirata, Takao

    1997-06-01

We are developing an adjustment method for real-time computer graphics that exploits human sensibility technologically, so that the resulting graphics convey to the audience the various senses intended by the producer. In general, producing real-time computer graphics requires extensive adjustment of parameters, such as 3D object models, their motions, attributes, view angle, and parallax, so that the graphics give the audience strong effects such as material realism and a sense of experience. It is also known that adjusting these parameters by trial and error is costly. A graphics producer often evaluates his graphics in order to improve them; for example, a scene may lack a 'sense of speed' or need more 'sense of settling down.' On the other hand, we can learn how the parameters of computer graphics affect such senses by statistically analyzing several samples of computer graphics that evoke different senses. Drawing on these two observations, we designed an adjustment method in which the desired degrees of sense are input into a computer. Using this method, real-time computer graphics can be adjusted more effectively than by the conventional trial-and-error approach.

  16. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  17. Installation of dynamic travel time signs and efforts to obtain and test a graphical route information panel (GRIP) sign in Austin.

    DOT National Transportation Integrated Search

    2016-08-01

    Graphic Route Information Panel (GRIP) signs use a combination of text, colors, and representative maps of : the roadway system to convey real-time roadway congestion location and severity information. The intent of : this project was to facilitate t...

  18. On an additive partial correlation operator and nonparametric estimation of graphical models.

    PubMed

    Lee, Kuang-Yao; Li, Bing; Zhao, Hongyu

    2016-09-01

    We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance.

  19. On an additive partial correlation operator and nonparametric estimation of graphical models

    PubMed Central

    Li, Bing; Zhao, Hongyu

    2016-01-01

    Abstract We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance. PMID:29422689

  20. LFSTAT - Low-Flow Analysis in R

    NASA Astrophysics Data System (ADS)

    Koffler, Daniel; Laaha, Gregor

    2013-04-01

The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation have been well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software enabling a fast and standardized calculation of low-flow statistics was missing. We present the new software package lfstat to fill this obvious gap. Our software package is based on the statistical open-source software R, and expands it to analyse daily stream flow data records with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R-Commander, an easy-to-use graphical user interface (GUI) for R based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow objects). 
These objects are ordinary R data frames containing date, flow, hydrological year and, optionally, baseflow information. Once these objects are created, analyses can be performed by mouse-click and a script can be saved to make the analysis easily reproducible. At present we offer implementations of all major methods proposed in the WMO Manual on Low-flow Estimation and Prediction [1]. Future plans include a dynamic low-flow report in odt-file format using odfWeave, which allows automatic updates if data or analyses change. We hope to offer a tool that eases and structures the analysis of stream flow data with a focus on low flows, and that makes analyses transparent and communicable. The package can also be used to teach students the first steps in low-flow hydrology. The software package can be installed from CRAN (latest stable version) and from R-Forge: http://r-forge.r-project.org (development version). References: [1] Gustard, Alan; Demuth, Siegfried, (eds.) Manual on Low-flow Estimation and Prediction. Geneva, Switzerland, World Meteorological Organization, (Operational Hydrology Report No. 50, WMO-No. 1029).
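    Two of the basic low-flow indices mentioned above are easy to sketch. The following is a hedged illustration, not lfstat's R implementation: the Q95 exceedance flow (here with a nearest-rank percentile convention) and the minimum 7-day mean flow, the building block of MAM7-type indices.

    ```python
    # Hedged illustration (not lfstat's code): two common low-flow indices
    # computed from a daily flow series.
    import math

    def q95(daily_flows):
        """Flow exceeded on 95% of days (nearest-rank 5th percentile)."""
        s = sorted(daily_flows)
        rank = max(1, math.ceil(0.05 * len(s)))  # nearest-rank convention
        return s[rank - 1]

    def min_7day_mean(daily_flows):
        """Minimum of all 7-day moving averages of the series."""
        return min(sum(daily_flows[i:i + 7]) / 7.0
                   for i in range(len(daily_flows) - 6))
    ```

    Percentile conventions differ between software packages, so the exact Q95 value may deviate slightly from lfstat's output on the same data.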

  1. Seeing is believing: good graphic design principles for medical research.

    PubMed

    Duke, Susan P; Bancken, Fabrice; Crowe, Brenda; Soukup, Mat; Botsis, Taxiarchis; Forshee, Richard

    2015-09-30

    Have you noticed when you browse a book, journal, study report, or product label how your eye is drawn to figures more than to words and tables? Statistical graphs are powerful ways to transparently and succinctly communicate the key points of medical research. Furthermore, the graphic design itself adds to the clarity of the messages in the data. The goal of this paper is to provide a mechanism for selecting the appropriate graph to thoughtfully construct quality deliverables using good graphic design principles. Examples are motivated by the efforts of a Safety Graphics Working Group that consisted of scientists from the pharmaceutical industry, Food and Drug Administration, and academic institutions. Copyright © 2015 John Wiley & Sons, Ltd.

  2. ResidPlots-2: Computer Software for IRT Graphical Residual Analyses

    ERIC Educational Resources Information Center

    Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.

    2009-01-01

This article discusses ResidPlots-2, computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…

  3. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  4. Robust PRNG based on homogeneously distributed chaotic dynamics

    NASA Astrophysics Data System (ADS)

    Garasym, Oleg; Lozi, René; Taralova, Ina

    2016-02-01

This paper is devoted to the design of new chaotic Pseudo Random Number Generators (CPRNGs). Exploring several network topologies of 1-D coupled chaotic maps, we focus first on two-dimensional networks. Two topologically coupled maps are studied: TTL rc non-alternate, and TTL SC alternate. The primary idea of the novel maps is an original coupling of the tent and logistic maps that achieves excellent random properties and a homogeneous (uniform) density in the phase plane, thus guaranteeing maximum security when used for chaos-based cryptography. To this aim, two new nonlinear CPRNGs, MTTL 2 sc and NTTL 2, are proposed. The maps successfully passed numerous statistical, graphical and numerical tests, owing to the proposed ring-coupling and injection mechanisms.
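    The core idea of coupling the tent and logistic maps can be sketched with a deliberately simplified two-dimensional toy system. This is emphatically not the paper's MTTL/NTTL construction or its injection mechanism; the coupling strength k = 0.3 is an arbitrary illustrative choice. Both component maps send [0, 1] to itself, so a convex combination keeps iterates in the unit interval.

    ```python
    # Toy 2-D coupling of the tent and logistic maps on [0, 1].
    # NOT the paper's MTTL/NTTL maps; k = 0.3 is an arbitrary choice.
    def tent(x):
        return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

    def logistic(x):
        return 4.0 * x * (1.0 - x)

    def step(x, y, k=0.3):
        """One iteration: each coordinate mixes its own map with the other's."""
        return ((1.0 - k) * tent(x) + k * logistic(y),
                (1.0 - k) * logistic(y) + k * tent(x))
    ```

    A toy like this illustrates the mechanics of coupling only; real cryptographic use requires the statistical testing the abstract describes.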

  5. Application of interactive computer graphics in wind-tunnel dynamic model testing

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Hammond, C. E.

    1975-01-01

    The computer-controlled data-acquisition system recently installed for use with a transonic dynamics tunnel was described. This includes a discussion of the hardware/software features of the system. A subcritical response damping technique, called the combined randomdec/moving-block method, for use in windtunnel-model flutter testing, that has been implemented on the data-acquisition system, is described in some detail. Some results using the method are presented and the importance of using interactive graphics in applying the technique in near real time during wind-tunnel test operations is discussed.

  6. 3D Graphics Through the Internet: A "Shoot-Out"

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    3D graphics through the Internet needs to move beyond the current lowest common denominator of pre-computed movies, which consume bandwidth and are non-interactive. Panelists will demonstrate and compare 3D graphical tools for accessing, analyzing, and collaborating on information through the Internet and World-wide web. The "shoot-out" will illustrate which tools are likely to be the best for the various types of information, including dynamic scientific data, 3-D objects, and virtual environments. The goal of the panel is to encourage more effective use of the Internet by encouraging suppliers and users of information to adopt the next generation of graphical tools.

  7. Interactive computer graphics and its role in control system design of large space structures

    NASA Technical Reports Server (NTRS)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems to maintain attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model such as modeling the dynamics, modal analysis, and control system design methodology are reviewed and the need of the interactive computer graphics is demonstrated. Typical constituent parts of large space structures such as free-free beams and free-free plates are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.

  8. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows more realistic computational models of cognitive functions to be built, models that more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  9. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows more realistic computational models of cognitive functions to be built, models that more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  10. Deriving Vegetation Dynamics of Natural Terrestrial Ecosystems from MODIS NDVI/EVI Data over Turkey.

    PubMed

    Evrendilek, Fatih; Gulbeyaz, Onder

    2008-09-01

The 16-day composite MODIS vegetation indices (VIs) at 500-m resolution for the period from 2000 to 2007 were seasonally averaged on the basis of the estimated distribution of 16 potential natural terrestrial ecosystems (NTEs) across Turkey. Graphical and statistical analyses of the time-series VIs for the NTEs, spatially disaggregated in terms of biogeoclimate zones and land cover types, included descriptive statistics, correlations, discrete Fourier transform (DFT), time-series decomposition, and simple linear regression (SLR) models. Our spatio-temporal analyses revealed that both MODIS VIs, on average, depicted similar seasonal variations for the NTEs, with the NDVI values having higher mean and SD values. The seasonal VIs were most correlated in decreasing order for: barren/sparsely vegetated land > grassland > shrubland/woodland > forest; (sub)nival > warm temperate > alpine > cool temperate > boreal = Mediterranean; and summer > spring > autumn > winter. The most pronounced differences between the MODIS VI responses over Turkey occurred in boreal and Mediterranean climate zones and forests, and in winter (the senescence phase of the growing season). Our results showed the potential of the time-series MODIS VI datasets for the estimation and monitoring of seasonal and interannual ecosystem dynamics over Turkey, a potential that needs to be further improved and refined through systematic and extensive field measurements and validation across various biomes.
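    The DFT step in such a time-series analysis can be sketched on synthetic data. The series below is invented for illustration (a 16-day-composite index gives roughly 23 values per year, so four years of a purely annual cycle should put the spectral peak in frequency bin 4):

    ```python
    # Sketch of the DFT step: find the dominant seasonal harmonic of a
    # synthetic 16-day-composite VI series. Values are invented.
    import numpy as np

    def dominant_cycle(series):
        """Index of the strongest nonzero frequency in the series' DFT."""
        spectrum = np.abs(np.fft.rfft(series - np.mean(series)))
        return int(np.argmax(spectrum[1:])) + 1  # skip the zero frequency

    t = np.arange(4 * 23)                            # four years of composites
    ndvi = 0.35 + 0.20 * np.cos(2 * np.pi * t / 23)  # one cycle per year
    ```

    Real VI series carry noise, gaps, and interannual trend on top of the seasonal harmonic, which is why the paper combines the DFT with decomposition and regression.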

  11. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  12. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    PubMed

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a large number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy-to-use graphical interface. Survival time data can be supplied as an SPSS (sav), SAS export (xpt) or text file (dat), the last of which is also a common export format of other applications such as Excel. Figures can be exported directly in any graphical file format supported by R. On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These include, for example, the number of cases and the number of cases at risk within the figure, and a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily in a single window. We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides frequently used functions as well as functions that are not supplied by standard software packages. The software is routinely applied in several clinical study groups.
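    The estimator underlying such tools is the product-limit (Kaplan-Meier) estimator, which a short sketch makes concrete. This is for illustration only; KMWin itself delegates the computation to R.

    ```python
    # Minimal product-limit (Kaplan-Meier) estimator, illustration only.
    def kaplan_meier(times, events):
        """Return [(t, S(t))] at each distinct event time.

        events[i] is True for an observed event, False for censoring.
        """
        data = sorted(zip(times, events))
        survival, at_risk, curve = 1.0, len(data), []
        i = 0
        while i < len(data):
            t = data[i][0]
            tied = [e for tt, e in data if tt == t]  # all cases at time t
            deaths = sum(tied)
            if deaths:                               # censoring adds no step
                survival *= 1.0 - deaths / at_risk
                curve.append((t, survival))
            at_risk -= len(tied)
            i += len(tied)
        return curve
    ```

    With times [1, 2, 3] and a censored observation at t = 2, the curve steps to 2/3 at t = 1 and to 0 at t = 3; the censored case only shrinks the risk set.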

  13. KMWin – A Convenient Tool for Graphical Presentation of Results from Kaplan-Meier Survival Time Analysis

    PubMed Central

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a large number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy-to-use graphical interface. Survival time data can be supplied as an SPSS (sav), SAS export (xpt) or text file (dat), the last of which is also a common export format of other applications such as Excel. Figures can be exported directly in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These include, for example, the number of cases and the number of cases at risk within the figure, and a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily in a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides frequently used functions as well as functions that are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912

  14. Homogeneous nucleation and microstructure evolution in million-atom molecular dynamics simulation

    PubMed Central

    Shibuta, Yasushi; Oguchi, Kanae; Takaki, Tomohiro; Ohno, Munekazu

    2015-01-01

    Homogeneous nucleation from an undercooled iron melt is investigated by the statistical sampling of million-atom molecular dynamics (MD) simulations performed on a graphics processing unit (GPU). Fifty independent instances of isothermal MD calculations with one million atoms in a quasi-two-dimensional cell over a nanosecond reveal that the nucleation rate and the incubation time of nucleation as functions of temperature have characteristic shapes with a nose at the critical temperature. This indicates that thermally activated homogeneous nucleation occurs spontaneously in MD simulations without any inducing factor, whereas most previous studies have employed factors such as pressure, surface effect, and continuous cooling to induce nucleation. Moreover, further calculations over ten nanoseconds capture the microstructure evolution on the order of tens of nanometers from the atomistic viewpoint and the grain growth exponent is directly estimated. Our novel approach based on the concept of “melting pots in a supercomputer” is opening a new phase in computational metallurgy with the aid of rapid advances in computational environments. PMID:26311304

  15. What Can Causal Networks Tell Us about Metabolic Pathways?

    PubMed Central

    Blair, Rachael Hageman; Kliebenstein, Daniel J.; Churchill, Gary A.

    2012-01-01

Graphical models describe the linear correlation structure of data and have been used to establish causal relationships among phenotypes in genetic mapping populations. Data are typically collected at a single point in time. Biological processes, on the other hand, are often non-linear and display time-varying dynamics. The extent to which graphical models can recapitulate the architecture of an underlying biological process is not well understood. We consider metabolic networks with known stoichiometry to address the fundamental question: “What can causal networks tell us about metabolic pathways?”. Using data from an Arabidopsis BaySha population and simulated data from dynamic models of pathway motifs, we assess our ability to reconstruct metabolic pathways using graphical models. Our results highlight the necessity of non-genetic residual biological variation for reliable inference. Recovery of the ordering within a pathway is possible, but should not be expected. Causal inference is sensitive to subtle patterns in the correlation structure that may be driven by a variety of factors, which may not emphasize the substrate-product relationship. We illustrate the effects of metabolic pathway architecture, epistasis and stochastic variation on correlation structure and graphical model-derived networks. We conclude that graphical models should be interpreted cautiously, especially if the implied causal relationships are to be used in the design of intervention strategies. PMID:22496633
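    The "linear correlation structure" such graphical models capture can be illustrated with partial correlations read off the inverse covariance matrix. The chain below is a toy analogue of a substrate → intermediate → product pathway; the variables, noise levels, and sample size are all invented for the sketch.

    ```python
    # Toy sketch: partial correlations (from the inverse covariance matrix)
    # recover the conditional-independence structure of a simulated chain
    # A -> B -> C. Variables and noise levels are invented.
    import numpy as np

    def partial_correlations(samples):
        """Pairwise partial correlations given all remaining variables."""
        precision = np.linalg.inv(np.cov(samples, rowvar=False))
        d = np.sqrt(np.diag(precision))
        return -precision / np.outer(d, d)  # off-diagonals are partial corrs

    rng = np.random.default_rng(0)
    a = rng.normal(size=20000)
    b = a + 0.5 * rng.normal(size=20000)
    c = b + 0.5 * rng.normal(size=20000)
    pcor = partial_correlations(np.column_stack([a, b, c]))
    ```

    Here the A–C partial correlation given B is near zero while the A–B and B–C entries are strong, which is exactly the edge pattern a Gaussian graphical model would draw for this chain; the paper's point is that real metabolic data often break this clean picture.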

  16. Gist: A scientific graphics package for Python

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L.E.

    1996-05-08

"Gist" is a scientific graphics library written by David H. Munro of Lawrence Livermore National Laboratory (LLNL). It features support for three common graphics output devices: X Windows, (Color) PostScript, and ANSI/ISO Standard Computer Graphics Metafiles (CGM). The library is small (written directly to Xlib), portable, efficient, and full-featured. It produces X versus Y plots with "good" tick marks and tick labels, 2-dimensional quadrilateral mesh plots with contours, vector fields, or pseudo color maps on such meshes, with 3-dimensional plots on the way. The Python Gist module utilizes the new "Numeric" module due to J. Hugunin and others. It is therefore fast and able to handle large datasets. The Gist module includes an X Windows event dispatcher which can be dynamically added (e.g., via importing a dynamically loaded module) to the Python interpreter after a simple two-line modification to the Python core. This makes fast mouse-controlled zoom, pan, and other graphic operations available to the researcher while maintaining the usual Python command-line interface. Munro's Gist library is already freely available. The Python Gist module is currently under review and is also expected to qualify for unlimited release.

  17. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), MR (moving…
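    An individuals (X) chart with moving-range (MR) based limits, two of the chart types listed, can be sketched as follows. The 2.66 factor is the standard constant (3/d2 for subgroups of size 2); the observations in the usage note are invented.

    ```python
    # Sketch of individuals (X) chart limits from the moving range (MR).
    # 2.66 = 3 / d2 for subgroups of size 2; a standard SPC constant.
    def individuals_chart_limits(observations):
        """Return (LCL, center line, UCL) for an individuals control chart."""
        moving_ranges = [abs(b - a)
                         for a, b in zip(observations, observations[1:])]
        mr_bar = sum(moving_ranges) / len(moving_ranges)
        center = sum(observations) / len(observations)
        return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar
    ```

    For the invented series [10, 12, 10, 12, 10, 12], every moving range is 2, so the limits sit at 11 ± 5.32. Points outside the limits signal a process change worth investigating.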

  18. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  19. Using Spreadsheets to Teach Statistics in Geography.

    ERIC Educational Resources Information Center

    Lee, M. P.; Soper, J. B.

    1987-01-01

    Maintains that teaching methods of statistical calculation in geography may be enhanced by using a computer spreadsheet. The spreadsheet format of rows and columns allows the data to be inspected and altered to demonstrate various statistical properties. The inclusion of graphics and database facilities further adds to the value of a spreadsheet.…

  20. Metaplot: a novel stata graph for assessing heterogeneity at a glance.

    PubMed

    Poorolajal, J; Mahmoodi, M; Majdzadeh, R; Fotouhi, A

    2010-01-01

Heterogeneity is usually a major concern in meta-analysis. Although there are some statistical approaches for assessing variability across studies, here we present a new approach to heterogeneity using "MetaPlot", which investigates the influence of a single study on the overall heterogeneity. MetaPlot is a two-way (x, y) graph that can be considered a complementary graphical approach for testing heterogeneity. This method shows graphically as well as numerically the results of an influence analysis, in which Higgins' I² statistic with its 95% confidence interval (CI) is computed omitting one study in turn and is then plotted against the reciprocal of the standard error (1/SE), or "precision". In this graph, 1/SE lies on the x-axis and the I² results lie on the y-axis. At a first glance at a MetaPlot, one can predict to what extent omission of a single study may influence the overall heterogeneity. The precision on the x-axis enables us to distinguish the size of each trial. The graph describes the I² statistic with 95% CI graphically as well as numerically in one view for prompt comparison. It is possible to implement MetaPlot for meta-analysis of different types of outcome data and summary measures. This method presents a simple graphical approach to identifying an outlier and its effect on overall heterogeneity at a glance. We wish to suggest MetaPlot to Stata experts so that they can prepare a module for the software.
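    The leave-one-out I² values that such a plot displays can be sketched directly from Higgins' formula, I² = max(0, (Q − df)/Q) × 100, with fixed-effect inverse-variance weights. This is an illustration of the computation, not the Stata module itself, and it omits the confidence intervals the plot also shows.

    ```python
    # Sketch of the leave-one-out I² computation behind such a plot
    # (fixed-effect inverse-variance weights; not the Stata module).
    def i_squared(effects, std_errors):
        """Higgins' I² (%) from study effects and their standard errors."""
        w = [1.0 / se ** 2 for se in std_errors]
        pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
        df = len(effects) - 1
        return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

    def leave_one_out_i2(effects, std_errors):
        """I² recomputed with each study omitted in turn."""
        return [i_squared(effects[:i] + effects[i + 1:],
                          std_errors[:i] + std_errors[i + 1:])
                for i in range(len(effects))]
    ```

    With three identical effects and one outlier, the leave-one-out list drops to zero only when the outlier is omitted, which is precisely the pattern the plot makes visible at a glance.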

  2. Graphical Methods: A Review of Current Methods and Computer Hardware and Software. Technical Report No. 27.

    ERIC Educational Resources Information Center

    Bessey, Barbara L.; And Others

    Graphical methods for displaying data, as well as available computer software and hardware, are reviewed. The authors have emphasized the types of graphs which are most relevant to the needs of the National Center for Education Statistics (NCES) and its readers. The following types of graphs are described: tabulations, stem-and-leaf displays,…

  3. The Graphical Display of Simulation Results, with Applications to the Comparison of Robust IRT Estimators of Ability.

    ERIC Educational Resources Information Center

    Thissen, David; Wainer, Howard

    Simulation studies of the performance of (potentially) robust statistical estimation produce large quantities of numbers in the form of performance indices of the various estimators under various conditions. This report presents a multivariate graphical display used to aid in the digestion of the plentiful results in a current study of Item…

  4. Graphical tests for Hardy-Weinberg equilibrium based on the ternary plot.

    PubMed

    Graffelman, Jan; Camarena, Jair Morales

    2008-01-01

    We design a graphical test for Hardy-Weinberg equilibrium. It circumvents the calculation of p-values: the statistical (non)significance of a large number of bi-allelic markers can be inferred from their position in a graph. By rewriting expressions for the chi-square statistic (with and without continuity correction) in terms of the heterozygote frequency, an acceptance region for Hardy-Weinberg equilibrium is obtained that can be depicted in a ternary plot. We obtain equations for curves in the ternary plot that separate markers that are out of Hardy-Weinberg equilibrium from those that are in equilibrium. The curves depend on the chosen significance level, the sample size and a continuity-correction parameter. Some examples of graphical tests using a set of 106 SNPs on the long arm of human chromosome 22 are described. Significant markers and poor markers with many missing values are easily identified in the proposed plots. R software for making the diagrams is provided. The proposed graphs can be used as control charts for spotting problematic markers in large-scale genotyping studies, and constitute an excellent tool for the graphical exploration of bi-allelic marker data. (c) 2007 S. Karger AG, Basel.
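The chi-square test underlying the acceptance region can be sketched as a direct computation from genotype counts. This is a generic illustration, not the paper's code: the reformulation in terms of the heterozygote frequency and the ternary-plot boundary curves are not reproduced here.

```python
def hwe_chi2(n_aa, n_ab, n_bb):
    """Pearson chi-square statistic (1 df) for Hardy-Weinberg
    equilibrium from bi-allelic genotype counts.  Expected counts
    come from the estimated allele frequencies p and q = 1 - p."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2.0 * n)   # frequency of allele A
    q = 1.0 - p
    observed = (n_aa, n_ab, n_bb)
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

In the ternary-plot view, each marker's coordinates are simply its three genotype frequencies (which sum to one); markers whose statistic exceeds the chi-square critical value (3.84 at the 5% level) fall outside the acceptance region.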

  5. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
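One of the tools the abstract lists, the calculation of exact Bayes factors, is available in closed form for conjugate exponential-family models. A minimal sketch for the beta-binomial case, assuming a point null against a Beta prior (function names are ours, not from the paper):

```python
from math import lgamma, log

def log_beta(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_bayes_factor(k, n, a=1.0, b=1.0, theta0=0.5):
    """Exact log Bayes factor for k successes in n Bernoulli trials:
    Beta(a, b) prior on the success probability (H1) versus a point
    null theta0 (H0).  The binomial coefficient cancels in the ratio,
    so only the marginal-likelihood kernels are needed."""
    log_m1 = log_beta(k + a, n - k + b) - log_beta(a, b)
    log_m0 = k * log(theta0) + (n - k) * log(1.0 - theta0)
    return log_m1 - log_m0
```

Data sitting exactly at the null (50 successes in 100 trials against theta0 = 0.5) yield a negative log Bayes factor, i.e. evidence for H0, while data far from the null favor the Beta-prior alternative.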

  6. Flow visualization of CFD using graphics workstations

    NASA Technical Reports Server (NTRS)

    Lasinski, Thomas; Buning, Pieter; Choi, Diana; Rogers, Stuart; Bancroft, Gordon

    1987-01-01

    High performance graphics workstations are used to visualize the fluid flow dynamics obtained from supercomputer solutions of computational fluid dynamic programs. The visualizations can be done independently on the workstation or while the workstation is connected to the supercomputer in a distributed computing mode. In the distributed mode, the supercomputer interactively performs the computationally intensive graphics rendering tasks while the workstation performs the viewing tasks. A major advantage of the workstations is that the viewers can interactively change their viewing position while watching the dynamics of the flow fields. An overview of the computer hardware and software required to create these displays is presented. For complex scenes the workstation cannot create the displays fast enough for good motion analysis. For these cases, the animation sequences are recorded on video tape or 16 mm film a frame at a time and played back at the desired speed. The additional software and hardware required to create these video tapes or 16 mm movies are also described. Photographs illustrating current visualization techniques are discussed. Examples of the use of the workstations for flow visualization through animation are available on video tape.

  7. Make Movies out of Your Dynamical Simulations with OGRE!

    NASA Astrophysics Data System (ADS)

    Tamayo, Daniel; Douglas, R. W.; Ge, H. W.; Burns, J. A.

    2013-10-01

    We have developed OGRE, the Orbital GRaphics Environment, an open-source project comprising a graphical user interface that allows the user to view the output from several dynamical integrators (e.g., SWIFT) that are commonly used for academic work. One can interactively vary the display speed, rotate the view and zoom the camera. This makes OGRE a great tool for students or the general public to explore accurate orbital histories that may display interesting dynamical features, e.g. the destabilization of Solar System orbits under the Nice model, or interacting pairs of exoplanets. Furthermore, OGRE allows the user to choreograph sequences of transformations as the simulation is played to generate movies for use in public talks or professional presentations. The graphical user interface is coded using Qt to ensure portability across different operating systems. OGRE will run on Linux and Mac OS X. The program is available as a self-contained executable, or as source code that the user can compile. We are continually updating the code, and hope that people who find it useful will contribute to the development of new features.

  9. Tropical geometry of statistical models.

    PubMed

    Pachter, Lior; Sturmfels, Bernd

    2004-11-16

    This article presents a unified mathematical framework for inference in graphical models, building on the observation that graphical models are algebraic varieties. From this geometric viewpoint, observations generated from a model are coordinates of a point in the variety, and the sum-product algorithm is an efficient tool for evaluating specific coordinates. Here, we address the question of how the solutions to various inference problems depend on the model parameters. The proposed answer is expressed in terms of tropical algebraic geometry. The Newton polytope of a statistical model plays a key role. Our results are applied to the hidden Markov model and the general Markov model on a binary tree.
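The connection the abstract draws can be made concrete for a small hidden Markov model: the sum-product (forward) recursion evaluates the probability of an observation sequence, and replacing sum with max — the tropical counterpart, up to logarithms — scores the single best hidden path (Viterbi). A minimal stdlib sketch (illustrative, not the authors' code):

```python
def forward(obs, init, trans, emit):
    """Sum-product: total probability of an observation sequence
    under a small HMM with states 0..K-1."""
    k = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(k)]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[r] * trans[r][s] for r in range(k))
                 for s in range(k)]
    return sum(alpha)

def viterbi_score(obs, init, trans, emit):
    """Max-product (tropicalized) version: probability of the
    single most likely hidden path for the same sequence."""
    k = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(k)]
    for o in obs[1:]:
        alpha = [emit[s][o] * max(alpha[r] * trans[r][s] for r in range(k))
                 for s in range(k)]
    return max(alpha)
```

The best-path score is always bounded above by the total probability, and the forward probabilities over all possible observation sequences of a fixed length sum to one — two quick sanity checks on the recursions.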

  10. Pilot climate data system: A state-of-the-art capability in scientific data management

    NASA Technical Reports Server (NTRS)

    Smith, P. H.; Treinish, L. A.; Novak, L. V.

    1983-01-01

    The Pilot Climate Data System (PCDS) was developed by the Information Management Branch of NASA's Goddard Space Flight Center to manage a large collection of climate-related data of interest to the research community. The PCDS now provides uniform data catalogs, inventories, access methods, graphical displays and statistical calculations for selected NASA and non-NASA data sets. Data manipulation capabilities were developed to permit researchers to easily combine or compare data. The current capabilities of the PCDS include many tools for the statistical survey of climate data. A climate researcher can examine any data set of interest via flexible utilities to create a variety of two- and three-dimensional displays, including vector plots, scatter diagrams, histograms, contour plots, surface diagrams and pseudo-color images. The graphics and statistics subsystems employ an intermediate data storage format which is data-set independent. Outside of the graphics system there exist other utilities to select, filter, list, compress, and calculate time-averages and variances for any data of interest. The PCDS now fully supports approximately twenty different data sets and is being used on a trial basis by several different in-house research groups.

  11. Process and representation in graphical displays

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne

    1993-01-01

    Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph?; and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use?; and what are the specific processing skills required?

  12. Parametric inference for biological sequence analysis.

    PubMed

    Pachter, Lior; Sturmfels, Bernd

    2004-11-16

    One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.

  13. R and Spatial Data

    EPA Science Inventory

    R is an open source language and environment for statistical computing and graphics that can also be used for both spatial analysis (i.e. geoprocessing and mapping of different types of spatial data) and spatial data analysis (i.e. the application of statistical descriptions and ...

  14. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  15. A conceptual model for quantifying connectivity using graph theory and cellular (per-pixel) approach

    NASA Astrophysics Data System (ADS)

    Singh, Manudeo; Sinha, Rajiv; Tandon, Sampat K.

    2016-04-01

    The concept of connectivity is being increasingly used for understanding hydro-geomorphic processes at all spatio-temporal scales. Connectivity is defined as the potential for energy and material flux (water, sediments, nutrients, heat, etc.) to navigate within or between landscape systems, and it has two components: structural connectivity and dynamic connectivity. Structural connectivity is defined by the spatially connected features (physical linkages) through which energy and materials flow. Dynamic connectivity is a process-defined connectivity component. These two components also interact with each other, forming a feedback system. This study attempts to explore a method to quantify structural and dynamic connectivity. In fluvial transport systems, sediment and water can flow in either a diffused manner or a channelized way. At all scales, hydrological and sediment fluxes can be tracked using a cellular (per-pixel) approach and quantified using a graphical approach. The material flux, slope and LULC (Land Use Land Cover) weightage factors of a pixel together determine whether it will contribute towards the connectivity of the landscape/system. In the graphical approach, each contributing pixel forms a node at its centroid, and this node is connected to the next 'down-node' via a directed edge along the 'least cost path'. The length of the edge depends on the desired spatial scale, and its path direction depends on the traversed pixel's slope and LULC (weightage) factors. The weightage factors lie between 0 and 1, approaching 1 for LULC factors that promote connectivity. For example, in terms of sediment connectivity, the weightage could be RUSLE (Revised Universal Soil Loss Equation) C-factors, with bare unconsolidated surfaces having values close to 1. This method is best suited for areas with low slopes, where LULC can be a deciding as well as dominating factor. The degree of connectivity and its pathways will show changes under different LULC conditions even if the slope remains the same. The graphical approach provides the statistics of connected and disconnected graph elements (edges, nodes) and graph components, thereby allowing the quantification of structural connectivity. This approach also quantifies dynamic connectivity by allowing the measurement of the fluxes (e.g. via hydrographs or sedimentographs) at any node as well as at any system outlet. The contribution of any sub-system can be understood by removing the remaining sub-systems, which can be conveniently achieved by masking the associated graph elements.
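The 'least cost path' routing over a weighted pixel grid can be sketched with a stdlib Dijkstra. The grid values below are hypothetical per-pixel costs (imagine slope blended with an LULC weightage), with None marking non-contributing pixels; this is an illustration of the general technique, not the authors' implementation.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a per-pixel cost grid (4-connected).  Each cell's
    value is the cost of entering that pixel; cells set to None are
    barriers (structurally disconnected pixels).  Returns the total
    cost of the cheapest route, or None if no route exists."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    frontier = [(0.0, start)]
    while frontier:
        d, (r, c) = heapq.heappop(frontier)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and cost[nr][nc] is not None:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(frontier, (nd, (nr, nc)))
    return None  # goal unreachable: structural disconnection
```

A returned None corresponds to structural disconnection between the two pixels, while the finite cost of the cheapest route reflects how strongly slope and land cover resist the flux along the way.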

  16. Analog-to-digital clinical data collection on networked workstations with graphic user interface.

    PubMed

    Lunt, D

    1991-02-01

    An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.

  17. The use of computer graphic simulation in the development of on-orbit tele-robotic systems

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Hinman, Elaine

    1987-01-01

    This paper describes the use of computer graphic simulation techniques to resolve critical design and operational issues for robotic systems used for on-orbit operations. These issues are robot motion control, robot path-planning/verification, and robot dynamics. The major design issues in developing effective telerobotic systems are discussed, and the use of ROBOSIM, a NASA-developed computer graphic simulation tool, to address these issues is presented. Simulation plans for the Space Station and the Orbital Maneuvering Vehicle are presented and discussed.

  18. Advanced graphical user interface for multi-physics simulations using AMST

    NASA Astrophysics Data System (ADS)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three dimensional numerical modelling technique combining the eX-tended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI) meaning that all pre-processing has to be made directly in a HDF5-file. This contribution presents the first graphical pre-processor developed for AMST.

  19. Linked Micromaps: Statistical Summaries in a Spatial Context

    EPA Science Inventory

    Communicating summaries of spatial data to decision makers and the public is challenging. We present a graphical method that provides both a geographic context and a statistical summary for such spatial data. Monitoring programs have a need for such geographical summaries. For ...

  20. A First Assignment to Create Student Buy-In in an Introductory Business Statistics Course

    ERIC Educational Resources Information Center

    Newfeld, Daria

    2016-01-01

    This paper presents a sample assignment to be administered after the first two weeks of an introductory business focused statistics course in order to promote student buy-in. This assignment integrates graphical displays of data, descriptive statistics and cross-tabulation analysis through the lens of a marketing analysis study. A marketing sample…

  1. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review existing models for various aspects of gene regulation and discuss the pros and cons of each model. In addition, network inference algorithms are surveyed under the graphical modeling framework, categorized as point solutions and probabilistic solutions, and the connections and differences among the algorithms are described. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  2. Graphical Neuroimaging Informatics: Application to Alzheimer’s Disease

    PubMed Central

    Bowman, Ian; Joshi, Shantanu H.; Greer, Vaughan

    2013-01-01

    The Informatics Visualization for Neuroimaging (INVIZIAN) framework allows one to graphically display image and meta-data information from sizeable collections of neuroimaging data as a whole using a dynamic and compelling user interface. Users can fluidly interact with an entire collection of cortical surfaces using only their mouse. In addition, users can cluster and group brains in multiple ways for subsequent comparison using graphical data-mining tools. In this article, we illustrate the utility of INVIZIAN for simultaneously exploring and mining a large collection of extracted cortical surface data arising in clinical neuroimaging studies of patients with Alzheimer's Disease and mild cognitive impairment, as well as healthy control subjects. Alzheimer's Disease is particularly interesting due to its widespread effects on cortical architecture and alterations of volume in specific brain areas associated with memory. We demonstrate INVIZIAN's ability to render multiple brain surfaces from multiple diagnostic groups of subjects, showcase the interactivity of the system, and show how INVIZIAN can be employed to generate hypotheses about the collection of data that would be suitable for direct access to the underlying raw data and subsequent formal statistical analysis. Specifically, we use INVIZIAN to show how cortical thickness and hippocampal volume differences between groups are evident even in the absence of more formal hypothesis testing. In the context of neurological diseases linked to brain aging such as AD, INVIZIAN provides a unique means of considering the entirety of whole-brain datasets, looking for interesting relationships among them, and thereby deriving new ideas for further research and study. PMID:24203652

  3. Learning planar Ising models

    DOE PAGES

    Johnson, Jason K.; Oyen, Diane Adele; Chertkov, Michael; ...

    2016-12-01

    Inference and learning of graphical models are both well-studied problems in statistics and machine learning that have found many applications in science and engineering. However, exact inference is intractable in general graphical models, which suggests the problem of seeking the best approximation to a collection of random variables within some tractable family of graphical models. In this paper, we focus on the class of planar Ising models, for which exact inference is tractable using techniques of statistical physics. Based on these techniques and recent methods for planarity testing and planar embedding, we propose a greedy algorithm for learning the best planar Ising model to approximate an arbitrary collection of binary random variables (possibly from sample data). Given the set of all pairwise correlations among variables, we select a planar graph and optimal planar Ising model defined on this graph to best approximate that set of correlations. Finally, we demonstrate our method in simulations and for two applications: modeling senate voting records and identifying geo-chemical depth trends from Mars rover data.
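The paper's planarity-constrained greedy search is not reproduced here, but its flavor can be illustrated with a much cruder stdlib sketch: a Kruskal-style maximum-weight spanning tree over absolute pairwise correlations. A tree is trivially planar (and exactly solvable), so this stands in for the planar family; the function name and the correlation matrix below are ours.

```python
def max_weight_spanning_tree(corr):
    """Greedy (Kruskal-style) maximum-|correlation| spanning tree over
    variables 0..n-1, given a symmetric correlation matrix.  Uses a
    union-find structure to reject edges that would close a cycle."""
    n = len(corr)
    edges = sorted(((abs(corr[i][j]), i, j)
                    for i in range(n) for j in range(i + 1, n)),
                   reverse=True)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # edge joins two components: keep it
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

On a correlation matrix with a chain structure, the greedy selection recovers the chain; the paper's algorithm generalizes this idea from trees to the richer, still-tractable planar family.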

  5. CSTminer: a web tool for the identification of coding and noncoding conserved sequence tags through cross-species genome comparison

    PubMed Central

    Castrignanò, Tiziana; Canali, Alessandro; Grillo, Giorgio; Liuni, Sabino; Mignone, Flavio; Pesole, Graziano

    2004-01-01

    The identification and characterization of genome tracts that are highly conserved across species during evolution may contribute significantly to the functional annotation of whole-genome sequences. Indeed, such sequences are likely to correspond to known or unknown coding exons or regulatory motifs. Here, we present a web server implementing a previously developed algorithm that, by comparing user-submitted genome sequences, is able to identify statistically significant conserved blocks and assess their coding or noncoding nature through the measure of a coding potential score. The web tool, available at http://www.caspur.it/CSTminer/, is dynamically interconnected with the Ensembl genome resources and produces a graphical output showing a map of detected conserved sequences and annotated gene features. PMID:15215464

  6. Robotics On-Board Trainer (ROBoT)

    NASA Technical Reports Server (NTRS)

    Johnson, Genevieve; Alexander, Greg

    2013-01-01

    ROBoT is an on-orbit version of the ground-based Dynamics Skills Trainer (DST) that astronauts use for training on a frequent basis. This software consists of two primary software groups. The first series of components is responsible for displaying the graphical scenes. The remaining components are responsible for simulating the Mobile Servicing System (MSS), the Japanese Experiment Module Remote Manipulator System (JEMRMS), and the H-II Transfer Vehicle (HTV) Free Flyer Robotics Operations. The MSS simulation software includes: Robotic Workstation (RWS) simulation, a simulation of the Space Station Remote Manipulator System (SSRMS), a simulation of the ISS Command and Control System (CCS), and a portion of the Portable Computer System (PCS) software necessary for MSS operations. These components all run under the CentOS4.5 Linux operating system. The JEMRMS simulation software includes real-time, HIL, dynamics, manipulator multi-body dynamics, and a moving object contact model with Trick's discrete-time scheduling. The JEMRMS DST will be used as a functional proficiency and skills trainer for flight crews. The HTV Free Flyer Robotics Operations simulation software adds a functional simulation of HTV vehicle controllers, sensors, and data to the MSS simulation software. These components are intended to support HTV ISS visiting vehicle analysis and training. The scene generation software will use DOUG (Dynamic On-orbit Ubiquitous Graphics) to render the graphical scenes. DOUG runs on a laptop running the CentOS4.5 Linux operating system. DOUG is an OpenGL-based 3D computer graphics rendering package. It uses pre-built three-dimensional models of on-orbit ISS and space shuttle systems elements, and provides realtime views of various station and shuttle configurations.

  7. Some Tests of Randomness with Applications

    DTIC Science & Technology

    1981-02-01

    freedom. For further details, the reader is referred to Gnanadesikan (1977, p. 169) wherein other relevant tests are also given, Graphical tests, as...sample from a gamma distribution. J. Am. Statist. Assoc. 71, 480-7. Gnanadesikan, R. (1977). Methods for Statistical Data Analysis of Multivariate

  8. Displaying Geographically-Based Domestic Statistics

    NASA Technical Reports Server (NTRS)

    Quann, J.; Dalton, J.; Banks, M.; Helfer, D.; Szczur, M.; Winkert, G.; Billingsley, J.; Borgstede, R.; Chen, J.; Chen, L.; hide

    1982-01-01

    Decision Information Display System (DIDS) is rapid-response information-retrieval and color-graphics display system. DIDS transforms tables of geographically-based domestic statistics (such as population or unemployment by county, energy usage by county, or air-quality figures) into high-resolution, color-coded maps on television display screen.

  9. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
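The level-of-measurement rule the abstract describes — metric (continuous) variables summarized by mean/SD or the robust median/IQR, categorical variables by frequency tables — can be sketched with the standard library. The function names are ours, for illustration only.

```python
from statistics import mean, median, stdev, quantiles

def describe_metric(values):
    """Numeric summary for a metric (continuous) variable: mean/SD for
    roughly symmetric data, median/IQR as the robust alternative."""
    q1, _, q3 = quantiles(values, n=4)   # exclusive quartiles
    return {"mean": mean(values), "sd": stdev(values),
            "median": median(values), "iqr": q3 - q1}

def describe_categorical(values):
    """Absolute and relative frequencies for a categorical
    (nominal or ordinal) variable."""
    counts = {}
    for v in values:
        counts[v] = counts.get(v, 0) + 1
    n = len(values)
    return {k: (c, c / n) for k, c in counts.items()}
```

The appropriate diagram follows the same split: histograms or box plots for the metric summary, bar charts for the frequency table.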

  10. Quantitative analysis of the text and graphic content in ophthalmic slide presentations.

    PubMed

    Ing, Edsel; Celo, Erdit; Ing, Royce; Weisbrod, Lawrence; Ing, Mercedes

    2017-04-01

    To determine the characteristics of ophthalmic digital slide presentations. Retrospective quantitative analysis. Slide presentations from a 2015 Canadian primary eye care conference were analyzed for their duration, character and word count, font size, words per minute (wpm), lines per slide, words per slide, slides per minute (spm), text density product (wpm × spm), proportion of graphic content, and Flesch Reading Ease (FRE) score using Microsoft PowerPoint and Word. The median audience evaluation score for the lectures was used to dichotomize the higher scoring lectures (HSL) from the lower scoring lectures (LSL). A priori we hypothesized that there would be a difference in the wpm, spm, text density product, and FRE score between HSL and LSL. Wilcoxon rank-sum tests with Bonferroni correction were utilized. The 17 lectures had medians of 2.5 spm, 20.3 words per slide, 5.0 lines per slide, 28-point sans serif font, 36% graphic content, and text density product of 136.4 words × slides/minute². Although not statistically significant, the HSL had more wpm, fewer words per slide, more graphics per slide, greater text density, and higher FRE score than LSL. There was a statistically significant difference in the spm of the HSL (3.1 ± 1.0) versus the LSL (2.2 ± 1.0) at p = 0.0124. All presenters showed more than 1 slide per minute. The HSL showed more spm than the LSL. The descriptive statistics from this study may aid in the preparation of slides used for teaching and conferences. Copyright © 2017 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
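    The metrics in this record (wpm, spm, and their product) are simple ratios of totals to lecture duration. A minimal sketch with hypothetical numbers, not the study's data:

```python
# Illustrative sketch: computing the slide metrics reported in the abstract --
# words per minute (wpm), slides per minute (spm), words per slide, and the
# text density product (wpm x spm). Inputs are hypothetical.
def slide_metrics(total_words, total_slides, duration_min):
    wpm = total_words / duration_min
    spm = total_slides / duration_min
    return {"wpm": round(wpm, 1),
            "spm": round(spm, 1),
            "words_per_slide": round(total_words / total_slides, 1),
            "text_density": round(wpm * spm, 1)}  # words x slides / minute^2

m = slide_metrics(total_words=600, total_slides=30, duration_min=12)
print(m)
```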

  11. Dynamic whole body PET parametric imaging: II. Task-oriented statistical estimation

    PubMed Central

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-01-01

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has previously been employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15–20 cm) of a single bed position and have not been translated to the whole-body clinical imaging domain. By contrast, standardized uptake value (SUV) PET imaging, considered the routine approach in clinical oncology, commonly involves multi-bed acquisitions but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole body PET parametric imaging by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve a better trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than that provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value.
The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical FDG patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30 min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection. PMID:24080994
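    The Patlak OLS baseline that this study improves on can be sketched as a linear fit on transformed time-activity data: plot Ct(t)/Cp(t) against the normalized integral of the plasma input, and the slope and intercept estimate Ki and V. The curves below are synthetic and noise-free, purely for illustration:

```python
# Minimal sketch of Patlak graphical analysis with ordinary least squares,
# the baseline estimator the study compares against. Synthetic data only.
import numpy as np

def patlak_ols(t, cp, ct):
    """Estimate uptake rate Ki (slope) and distribution volume V (intercept)
    from plasma input cp(t) and tissue curve ct(t) via the Patlak transform."""
    # Trapezoidal running integral of the plasma input
    integral_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
    x = integral_cp / cp          # "stretched time"
    y = ct / cp                   # normalized tissue activity
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

t = np.linspace(1, 60, 30)                 # minutes
cp = 10.0 * np.exp(-0.05 * t) + 1.0        # synthetic plasma input
true_ki, true_v = 0.05, 0.6
integral = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
ct = true_ki * integral + true_v * cp      # noise-free Patlak model
ki, v = patlak_ols(t, cp, ct)
print(round(ki, 3), round(v, 3))           # recovers 0.05 and 0.6
```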

  12. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation.

    PubMed

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has previously been employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15-20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. By contrast, standardized uptake value (SUV) PET imaging, considered the routine approach in clinical oncology, commonly involves multi-bed acquisitions but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve a better trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than that provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value.
The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical (18)F-deoxyglucose patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30 min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole-body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection.

  13. Engineering visualization utilizing advanced animation

    NASA Technical Reports Server (NTRS)

    Sabionski, Gunter R.; Robinson, Thomas L., Jr.

    1989-01-01

    Engineering visualization is the use of computer graphics to depict engineering analysis and simulation in visual form from project planning through documentation. Graphics displays let engineers see data represented dynamically, which permits quick evaluation of results. The current state of graphics hardware and software generally allows the creation of two types of 3D graphics. The use of animated video as an engineering visualization tool is presented. The engineering, animation, and videography aspects of animated video production are each discussed. Specific issues include the integration of staffing expertise, hardware, software, and the various production processes. A detailed explanation of the animation process reveals the capabilities of this unique engineering visualization method. Automation of animation and video production processes is covered and future directions are proposed.

  14. Normal Approximations to the Distributions of the Wilcoxon Statistics: Accurate to What "N"? Graphical Insights

    ERIC Educational Resources Information Center

    Bellera, Carine A.; Julien, Marilyse; Hanley, James A.

    2010-01-01

    The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
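    The normal approximation the title refers to has a standard closed form. For the Wilcoxon rank-sum statistic W (sum of the ranks of sample 1, with sample sizes n1 and n2 and no ties), W is approximately normal with the moments below; this is a textbook sketch, not the article's own code or data:

```python
# Normal approximation to the Wilcoxon rank-sum statistic W:
# mu = n1*(n1+n2+1)/2, sigma^2 = n1*n2*(n1+n2+1)/12 (no ties assumed).
from math import sqrt, erf

def wilcoxon_rank_sum_normal(w, n1, n2):
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    # Two-sided p-value from the standard normal CDF
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return mu, sigma, z, p_two_sided

mu, sigma, z, p = wilcoxon_rank_sum_normal(w=30, n1=5, n2=5)
print(mu, round(sigma, 3), round(z, 3))
```

    The article's point is that this approximation can be inaccurate for small n; exact tail areas are preferred there.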

  15. Data-Based Detection of Potential Terrorist Attacks: Statistical and Graphical Methods

    DTIC Science & Technology

    2010-06-01


  16. Class Evolution Tree: A Graphical Tool to Support Decisions on the Number of Classes in Exploratory Categorical Latent Variable Modeling for Rehabilitation Research

    ERIC Educational Resources Information Center

    Kriston, Levente; Melchior, Hanne; Hergert, Anika; Bergelt, Corinna; Watzke, Birgit; Schulz, Holger; von Wolff, Alessa

    2011-01-01

    The aim of our study was to develop a graphical tool that can be used in addition to standard statistical criteria to support decisions on the number of classes in explorative categorical latent variable modeling for rehabilitation research. Data from two rehabilitation research projects were used. In the first study, a latent profile analysis was…

  17. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    PubMed

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
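    The core of the augmentation idea can be sketched by recomputing a fixed-effect inverse-variance pooled estimate after adding one hypothetical new study, over a grid of its possible effect sizes. The numbers below are illustrative, not the statin/kidney-injury data, and the simple loop stands in for the paper's contour construction:

```python
# Hedged sketch of graphical augmentation: how a pooled fixed-effect
# estimate would move if one more (hypothetical) study were added.
import numpy as np

def pooled_fixed_effect(effects, ses):
    """Inverse-variance weighted pooled estimate and its standard error."""
    w = 1.0 / np.asarray(ses) ** 2
    est = np.sum(w * effects) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return est, se

# Existing meta-analysis (log odds ratios and standard errors, hypothetical)
effects = np.array([0.25, 0.10, 0.30])
ses = np.array([0.10, 0.15, 0.12])
base_est, base_se = pooled_fixed_effect(effects, ses)

# Grid of possible new-study effects: one slice of the contour plot
for new_effect in (-0.2, 0.0, 0.2):
    est, se = pooled_fixed_effect(np.append(effects, new_effect),
                                  np.append(ses, 0.10))
    print(new_effect, round(est, 3), round(se, 3))
```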

  18. SDAR 1.0 a New Quantitative Toolkit for Analyze Stratigraphic Data

    NASA Astrophysics Data System (ADS)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences and about ancient sedimentary-environmental and paleobiological dynamics. Despite the tremendous advances in the way geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot be used readily for any quantitative analysis, and any attempt to examine the stratigraphic data in an analytical fashion necessarily takes further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses helping users to quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale).
Finally, given that the SDAR analysis module has been written in the open-source, high-level R graphics/statistics language [R Development Core Team, 2014], it already provides many of the crucial features required to accomplish basic and complex statistical analyses (the R language offers more than a hundred spatial libraries that allow users to explore geostatistics and spatial analysis). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field and will allow the geoscientific community in the near future to develop complex analyses of the distribution in space and time of rock sequences, such as lithofacies correlations, through multivariate comparison of empirical SCs with quantitative lithofacies models established from modern sedimentary environments.
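    One of the quantitative summaries mentioned (proportion of sand/shale) reduces to a thickness-weighted tally over the column's intervals. The sketch below is hypothetical and uses invented names; it is not SDAR's actual API:

```python
# Hypothetical sketch: thickness-weighted lithology proportions for a
# stratigraphic column stored as (lithology, thickness_m) intervals.
def lithology_proportions(intervals):
    total = sum(th for _, th in intervals)
    props = {}
    for lith, th in intervals:
        props[lith] = props.get(lith, 0.0) + th
    return {lith: round(th / total, 3) for lith, th in props.items()}

column = [("sand", 2.0), ("shale", 1.5), ("sand", 0.5), ("shale", 1.0)]
props = lithology_proportions(column)
print(props)  # fraction of total column thickness per lithology
```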

  19. Evaluation of risk communication in a mammography patient decision aid.

    PubMed

    Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B

    2016-07-01

    We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
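    The quantity participants estimated, the positive predictive value (PPV) of screening mammography, follows from sensitivity, specificity, and prevalence via Bayes' rule. The numbers below are illustrative only, not Mammopad's content:

```python
# Sketch of the PPV calculation underlying the risk estimation problem:
# P(disease | positive test) from test characteristics and prevalence.
def positive_predictive_value(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Even a fairly accurate test yields a modest PPV at low prevalence:
ppv = positive_predictive_value(sensitivity=0.87, specificity=0.90, prevalence=0.01)
print(round(ppv, 3))
```

    This counterintuitive gap between test accuracy and PPV is exactly the kind of statistic the study found hard to convey without pictographs.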

  20. Regression Models for Identifying Noise Sources in Magnetic Resonance Images

    PubMed Central

    Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.

    2009-01-01

    Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
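    The Rician model at the center of this paper has a standard log-likelihood for magnitude data given an underlying signal ν and noise scale σ. The grid search below is a hedged stand-in for the paper's estimation algorithms, with synthetic data:

```python
# Hedged sketch of the Rician noise model: log-likelihood of magnitude MRI
# data x under Rice(nu, sigma), maximized over nu by a simple grid search.
import numpy as np

def rician_loglik(x, nu, sigma):
    """Log-density of Rice(nu, sigma), using np.i0 (modified Bessel I0)."""
    x = np.asarray(x, dtype=float)
    s2 = sigma ** 2
    return np.sum(np.log(x / s2) - (x ** 2 + nu ** 2) / (2 * s2)
                  + np.log(np.i0(x * nu / s2)))

rng = np.random.default_rng(0)
nu_true, sigma_true = 3.0, 1.0
# Rician samples: magnitude of a complex Gaussian with mean (nu, 0)
x = np.hypot(nu_true + sigma_true * rng.standard_normal(2000),
             sigma_true * rng.standard_normal(2000))

nus = np.linspace(1.0, 5.0, 81)
nu_hat = nus[np.argmax([rician_loglik(x, n, sigma_true) for n in nus])]
print(round(nu_hat, 2))  # close to 3.0
```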

  1. Evaluation of risk communication in a mammography patient decision aid

    PubMed Central

    Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.

    2016-01-01

    Objectives We characterized patients’ comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Methods Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results Participants’ positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Conclusions Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics PMID:26965020

  2. Application of dynamic uncertain causality graph in spacecraft fault diagnosis: Logic cycle

    NASA Astrophysics Data System (ADS)

    Yao, Quanying; Zhang, Qin; Liu, Peng; Yang, Ping; Zhu, Ma; Wang, Xiaochen

    2017-04-01

    Intelligent diagnosis systems are applied to fault diagnosis in spacecraft. The Dynamic Uncertain Causality Graph (DUCG) is a new probabilistic graphical model with many advantages. In the knowledge representation of spacecraft fault diagnosis, feedback among variables is frequently encountered, which may produce directed cyclic graphs (DCGs). Probabilistic graphical models (PGMs) such as the Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning, but a BN does not allow DCGs. In this paper, DUCG is applied to fault diagnosis in spacecraft, and an inference algorithm for DUCG that deals with feedback is introduced. DUCG has now been tested on 16 typical faults with 100% diagnosis accuracy.
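    Why feedback is a problem for a Bayesian network can be made concrete: a BN requires an acyclic directed graph, so a causal structure with a feedback loop fails a simple cycle check. The DFS sketch below uses invented node names and only illustrates the acyclicity constraint, not DUCG's inference algorithm:

```python
# Illustrative sketch: detecting directed cycles (DCGs) in a causal graph,
# the structures a Bayesian network cannot represent but DUCG can handle.
def has_directed_cycle(edges):
    graph = {}
    for u, v in edges:
        graph.setdefault(u, []).append(v)
        graph.setdefault(v, [])
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}

    def dfs(n):
        color[n] = GRAY            # on the current DFS path
        for m in graph[n]:
            if color[m] == GRAY or (color[m] == WHITE and dfs(m)):
                return True        # back edge found: directed cycle
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in graph)

acyclic = [("sensor", "controller"), ("controller", "actuator")]
feedback = acyclic + [("actuator", "sensor")]   # feedback loop -> DCG
print(has_directed_cycle(acyclic), has_directed_cycle(feedback))  # False True
```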

  3. System dynamic modeling: an alternative method for budgeting.

    PubMed

    Srijariya, Witsanuchai; Riewpaiboon, Arthorn; Chaikledkaew, Usa

    2008-03-01

    To construct, validate, and simulate a system dynamic financial model and compare it against the conventional method. The study was a cross-sectional analysis of secondary data retrieved from the National Health Security Office (NHSO) in the fiscal year 2004. The sample consisted of all emergency patients who received emergency services outside their registered hospital-catchments area. The dependent variable used was the amount of reimbursed money. Two types of model were constructed, namely, the system dynamic model using the STELLA software and the multiple linear regression model. The outputs of both methods were compared. The study covered 284,716 patients from various levels of providers. The system dynamic model had the capability of producing various types of outputs, for example, financial and graphical analyses. For the regression analysis, statistically significant predictors were composed of service types (outpatient or inpatient), operating procedures, length of stay, illness types (accident or not), hospital characteristics, age, and hospital location (adjusted R² = 0.74). The total budget arrived at from using the system dynamic model and regression model was US$12,159,614.38 and US$7,301,217.18, respectively, whereas the actual NHSO reimbursement cost was US$12,840,805.69. The study illustrated that the system dynamic model is a useful financial management tool, although it is not easy to construct. The model is not only more accurate in prediction but is also more capable of analyzing large and complex real-world situations than the conventional method.

  4. Facts & Figures, 1999: A Compendium of Statistics on Ontario Universities.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    This is the sixth edition of statistical and graphical information on the Ontario (Canada) university system. The report contains six sections: (1) Ontario population data, which includes population projections to 2021, income and employment rates by educational attainment, and university participation rates; (2) applicant/registrant data, which…

  5. Evaluating Independent Proportions for Statistical Difference, Equivalence, Indeterminacy, and Trivial Difference Using Inferential Confidence Intervals

    ERIC Educational Resources Information Center

    Tryon, Warren W.; Lewis, Charles

    2009-01-01

    Tryon presented a graphic inferential confidence interval (ICI) approach to analyzing two independent and dependent means for statistical difference, equivalence, replication, indeterminacy, and trivial difference. Tryon and Lewis corrected the reduction factor used to adjust descriptive confidence intervals (DCIs) to create ICIs and introduced…

  6. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
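    The "summary statistics" group of validation techniques the survey lists typically includes bias, root-mean-square error, and correlation between predicted and observed values. A minimal sketch with hypothetical pollutant data:

```python
# Sketch of summary validation statistics for an air pollution model:
# mean bias, RMSE, and Pearson correlation of predicted vs. observed values.
from math import sqrt

def validation_summary(observed, predicted):
    n = len(observed)
    bias = sum(p - o for o, p in zip(observed, predicted)) / n
    rmse = sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    mean_o = sum(observed) / n
    mean_p = sum(predicted) / n
    cov = sum((o - mean_o) * (p - mean_p) for o, p in zip(observed, predicted))
    var_o = sum((o - mean_o) ** 2 for o in observed)
    var_p = sum((p - mean_p) ** 2 for p in predicted)
    r = cov / sqrt(var_o * var_p)
    return {"bias": round(bias, 2), "rmse": round(rmse, 2), "r": round(r, 3)}

obs = [12.0, 30.0, 25.0, 40.0, 18.0]   # e.g., measured pollutant levels
pred = [15.0, 28.0, 30.0, 35.0, 20.0]  # model predictions
s = validation_summary(obs, pred)
print(s)
```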

  7. A review of contemporary methods for the presentation of scientific uncertainty.

    PubMed

    Makinson, K A; Hamby, D M; Edwards, J A

    2012-12-01

    Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.

  8. Proceedings of the Annual Meeting of the Association for Education in Journalism and Mass Communication (75th, Montreal, Quebec, Canada, August 5-8, 1992). Part IX: Media and Technology.

    ERIC Educational Resources Information Center

    Association for Education in Journalism and Mass Communication.

    The Media and Technology section of these proceedings contains the following six papers: "The Effects of Tabular and Graphical Display Formats on Time Spent Processing Statistics" (James D. Kelly); "Program Choice in a Broadband Environment" (Steven S. Wildman and Nancy Y. Lee); "Visual Crosstabs: A Technique for Enriching Information Graphics"…

  9. A daily living activity remote monitoring system for solitary elderly people.

    PubMed

    Maki, Hiromichi; Ogawa, Hidekuni; Matsuoka, Shingo; Yonezawa, Yoshiharu; Caldwell, W Morton

    2011-01-01

    A daily living activity remote monitoring system has been developed for supporting solitary elderly people. The monitoring system consists of a tri-axis accelerometer, six low-power active filters, a low-power 8-bit microcontroller (MC), a 1 GB SD memory card (SDMC) and a 2.4 GHz low-transmitting-power mobile phone (PHS). The tri-axis accelerometer attached to the subject's chest can simultaneously measure dynamic and static acceleration forces produced by heart sound, respiration, posture and behavior. The heart rate, respiration rate, activity, posture and behavior are detected from the dynamic and static acceleration forces. These data are stored in the SDMC. The MC sends the data to the server computer every hour. The server computer stores the data and makes a graphic chart from the data. When the caregiver calls from his/her mobile phone to the server computer, the server computer sends the graphical chart via the PHS. The caregiver's mobile phone displays the chart graphically.

  10. Dynamic graph system for a semantic database

    DOEpatents

    Mizell, David

    2016-04-12

    A method and system in a computer system for dynamically providing a graphical representation of a data store of entries via a matrix interface is disclosed. A dynamic graph system provides a matrix interface that exposes to an application program a graphical representation of data stored in a data store such as a semantic database storing triples. To the application program, the matrix interface represents the graph as a sparse adjacency matrix that is stored in compressed form. Each entry of the data store is considered to represent a link between nodes of the graph. Each entry has a first field and a second field identifying the nodes connected by the link and a third field with a value for the link that connects the identified nodes. The first, second, and third fields represent the rows, columns, and elements of the adjacency matrix.
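    The mapping the patent describes, semantic triples exposed as a sparse adjacency matrix, can be sketched in a few lines. The dictionary-of-keys form below is one common sparse representation (the patent specifies a compressed form); node and edge names are illustrative:

```python
# Sketch: (subject, predicate, object) triples as a sparse adjacency matrix.
# Rows index subjects, columns index objects, elements hold the link value.
def triples_to_sparse_matrix(triples):
    nodes = sorted({t[0] for t in triples} | {t[2] for t in triples})
    index = {n: i for i, n in enumerate(nodes)}
    matrix = {(index[s], index[o]): p for s, p, o in triples}
    return nodes, matrix

triples = [("alice", "knows", "bob"),
           ("bob", "worksAt", "acme"),
           ("alice", "worksAt", "acme")]
nodes, matrix = triples_to_sparse_matrix(triples)
print(nodes)
print(matrix)
```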

  11. Dynamic graph system for a semantic database

    DOEpatents

    Mizell, David

    2015-01-27

    A method and system in a computer system for dynamically providing a graphical representation of a data store of entries via a matrix interface is disclosed. A dynamic graph system provides a matrix interface that exposes to an application program a graphical representation of data stored in a data store such as a semantic database storing triples. To the application program, the matrix interface represents the graph as a sparse adjacency matrix that is stored in compressed form. Each entry of the data store is considered to represent a link between nodes of the graph. Each entry has a first field and a second field identifying the nodes connected by the link and a third field with a value for the link that connects the identified nodes. The first, second, and third fields represent the rows, columns, and elements of the adjacency matrix.

  12. Human-display interactions: Context-specific biases

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary Kister; Proffitt, Dennis R.

    1987-01-01

    Recent developments in computer engineering have greatly enhanced the capabilities of display technology. As displays are no longer limited to simple alphanumeric output, they can present a wide variety of graphic information, using either static or dynamic presentation modes. At the same time that interface designers exploit the increased capabilities of these displays, they must be aware of their inherent limitations. Generally, these limitations can be divided into those that reflect limitations of the medium (e.g., reducing three-dimensional representations onto a two-dimensional projection) and those reflecting the perceptual and conceptual biases of the operator. The advantages and limitations of static and dynamic graphic displays are considered. Rather than debate whether dynamic or static displays are superior, the discussion explores general advantages and limitations that are contextually specific to each type of display.

  13. The Cognitive Visualization System with the Dynamic Projection of Multidimensional Data

    NASA Astrophysics Data System (ADS)

    Gorohov, V.; Vitkovskiy, V.

    2008-08-01

    Cognitive machine drawing generates special graphic representations on the screen that engage the human operator: they appear aesthetically attractive and thereby stimulate the operator's visual imagination, which is closely related to the intuitive mechanisms of thinking. The essence of the cognitive effect is that the operator perceives a moving projection as a pseudo-three-dimensional object characterizing multidimensional data in multidimensional space. After a thorough qualitative study of the visual aspects of the multidimensional data with the algorithms listed, the user can apply standard computer graphics to color individual objects or groups of objects of interest. It is then possible to return to the dynamic rotation of the data in order to check the user's intuitive ideas about clusters and connections in the multidimensional data. The methods of cognitive machine drawing can be developed further in combination with other information technologies, above all with packages for digital image processing and multidimensional statistical analysis.
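    The dynamic-projection idea can be sketched as projecting multidimensional points onto a 2-D screen plane that rotates over time, producing the pseudo-three-dimensional motion the abstract describes. This is a generic illustration with synthetic data, not the authors' system:

```python
# Hedged sketch: one frame of a rotating 2-D projection of d-dimensional data.
import numpy as np

def rotating_projection(points, angle, dims=(0, 1, 2)):
    """Rotate d-dim points by `angle` in the plane of dims[0] and dims[2],
    then project onto the screen plane (dims[0], dims[1])."""
    pts = np.asarray(points, dtype=float).copy()
    i, _, k = dims
    c, s = np.cos(angle), np.sin(angle)
    xi, xk = pts[:, i].copy(), pts[:, k].copy()
    pts[:, i] = c * xi - s * xk
    pts[:, k] = s * xi + c * xk
    return pts[:, [dims[0], dims[1]]]   # screen coordinates

rng = np.random.default_rng(1)
cloud = rng.standard_normal((100, 5))          # 5-dimensional data
frames = [rotating_projection(cloud, a) for a in np.linspace(0, np.pi, 8)]
print(frames[0].shape)                          # each frame is 100 x 2
```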

  14. Application of control theory to dynamic systems simulation

    NASA Technical Reports Server (NTRS)

    Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Control theory is applied to dynamic systems simulation. Theory and methodology applicable to controlled ecological life support systems are considered. Spatial effects on system stability, design of control systems with uncertain parameters, and an interactive computing language (PARASOL-II) designed for dynamic system simulation, report-quality graphics, data acquisition, and simple real-time control are discussed.

  15. Motor Coordination Dynamics Underlying Graphic Motion in 7- to 11-Year-Old Children

    ERIC Educational Resources Information Center

    Danna, Jeremy; Enderli, Fabienne; Athenes, Sylvie; Zanone, Pier-Giorgio

    2012-01-01

    Using concepts and tools of a dynamical system approach in order to understand motor coordination underlying graphomotor skills, the aim of the current study was to establish whether the basic coordination dynamics found in adults is already established in children at elementary school, when handwriting is trained and eventually acquired. In the…

  16. MCNP4A: Features and philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.

    This paper describes MCNP, states its philosophy, introduces a number of new features becoming available with version MCNP4A, and answers a number of questions asked by participants in the workshop. MCNP is a general-purpose three-dimensional neutron, photon and electron transport code. Its philosophy is "Quality, Value and New Features." Quality is exemplified by new software quality assurance practices and a program of benchmarking against experiments. Value includes a strong emphasis on documentation and code portability. New features are the third priority. MCNP4A is now available at Los Alamos. New features in MCNP4A include enhanced statistical analysis, distributed processor multitasking, new photon libraries, ENDF/B-VI capabilities, X-Windows graphics, dynamic memory allocation, expanded criticality output, periodic boundaries, plotting of particle tracks via SABRINA, and many other improvements. 23 refs.

  17. Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gazis, P. R.; Levit, C.; Way, M. J.

    2010-12-01

    Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.
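
    Two of the basic features named above, normalization and outlier detection/removal, can be sketched as follows. This is a minimal Python illustration of the statistical operations, not the Viewpoints source (which is GPU-accelerated C++); the threshold and data are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    table = rng.normal(10.0, 2.0, size=(1000, 3))  # toy multivariate data
    table[0, 0] = 100.0  # plant one gross outlier

    def normalize(x):
        """Scale each column to zero mean and unit variance."""
        return (x - x.mean(axis=0)) / x.std(axis=0)

    def remove_outliers(x, z_max=5.0):
        """Drop rows where any column exceeds z_max standard deviations."""
        z = np.abs(normalize(x))
        return x[(z < z_max).all(axis=1)]

    clean = remove_outliers(table)
    print(table.shape, clean.shape)  # the planted outlier row is dropped
    ```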

  18. Facts and Figures. A Compendium of Statistics on Ontario Universities. Volume 4.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    The purpose of this compendium is to provide consistent and accurate statistical and graphical information on the Ontario (Canada) university system. The compendium consists of seven sections: (1) Ontario population data with population projections 1986-2021, median income by educational attainment 1985-1994, and unemployment rates by educational…

  19. The Effects of Data and Graph Type on Concepts and Visualizations of Variability

    ERIC Educational Resources Information Center

    Cooper, Linda L.; Shore, Felice S.

    2010-01-01

    Recognizing and interpreting variability in data lies at the heart of statistical reasoning. Since graphical displays should facilitate communication about data, statistical literacy should include an understanding of how variability in data can be gleaned from a graph. This paper identifies several types of graphs that students typically…

  20. U.S. Virgin Islands 1983-84 School Statistical Summary.

    ERIC Educational Resources Information Center

    Romain, Louise

    The 1984 edition of the United States Virgin Islands School Statistical Summary presents narratives, 48 tables, and 7 graphic illustrations of education in public and private elementary and secondary schools during the 1983-84 school year. Tables provide data on the U.S. Virgin Islands demography and economy, enrollment and average daily…

  1. Statistical basis and outputs of stable isotope mixing models: Comment on Fry (2013)

    EPA Science Inventory

    A recent article by Fry (2013; Mar Ecol Prog Ser 472:1−13) reviewed approaches to solving underdetermined stable isotope mixing systems, and presented a new graphical approach and set of summary statistics for the analysis of such systems. In his review, Fry (2013) mis-characteri...

  2. Computer-Automated Approach for Scoring Short Essays in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Zimmerman, Whitney Alicia; Kang, Hyun Bin; Kim, Kyung; Gao, Mengzhao; Johnson, Glenn; Clariana, Roy; Zhang, Fan

    2018-01-01

    Over two semesters short essay prompts were developed for use with the Graphical Interface for Knowledge Structure (GIKS), an automated essay scoring system. Participants were students in an undergraduate-level online introductory statistics course. The GIKS compares students' writing samples with an expert's to produce keyword occurrence and…

  3. Causal tapestries for psychology and physics.

    PubMed

    Sulis, William H

    2012-04-01

    Archetypal dynamics is a formal approach to the modeling of information flow in complex systems used to study emergence. It is grounded in the Fundamental Triad of realisation (system), interpretation (archetype) and representation (formal model). Tapestries play a fundamental role in the framework of archetypal dynamics as a formal representational system. They represent information flow by means of multi layered, recursive, interlinked graphical structures that express both geometry (form or sign) and logic (semantics). This paper presents a detailed mathematical description of a specific tapestry model, the causal tapestry, selected for use in describing behaving systems such as appear in psychology and physics from the standpoint of Process Theory. Causal tapestries express an explicit Lorentz invariant transient now generated by means of a reality game. Observables are represented by tapestry informons while subjective or hidden components (for example intellectual and emotional processes) are incorporated into the reality game that determines the tapestry dynamics. As a specific example, we formulate a random graphical dynamical system using causal tapestries.

  4. Graphics performance in rich Internet applications.

    PubMed

    Hoetzlein, Rama C

    2012-01-01

    Rendering performance for rich Internet applications (RIAs) has recently focused on the debate between using Flash and HTML5 for streaming video and gaming on mobile devices. A key area not widely explored, however, is the scalability of raw bitmap graphics performance for RIAs. Does Flash render animated sprites faster than HTML5? How much faster is WebGL than Flash? Answers to these questions are essential for developing large-scale data visualizations, online games, and truly dynamic websites. A new test methodology analyzes graphics performance across RIA frameworks and browsers, revealing specific performance outliers in existing frameworks. The results point toward a future in which all online experiences might be GPU accelerated.

  5. Graphic Server: A real time system for displaying and monitoring telemetry data of several satellites

    NASA Technical Reports Server (NTRS)

    Douard, Stephane

    1994-01-01

    Known as a Graphic Server, the system presented was designed for the control ground segment of the Telecom 2 satellites. It is a tool used to dynamically display telemetry data within graphic pages, also known as views. The views are created off-line through various utilities and then, on the operator's request, displayed and animated in real time as data is received. The system was designed as an independent component, and is installed in different Telecom 2 operational control centers. It enables operators to monitor changes in the platform and satellite payloads in real time. It has been in operation since December 1991.

  6. LinkWinds: An Approach to Visual Data Analysis

    NASA Technical Reports Server (NTRS)

    Jacobson, Allan S.

    1992-01-01

    The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.

  7. Experiments in cooperative manipulation: A system perspective

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Cannon, Robert H., Jr.

    1989-01-01

    In addition to cooperative dynamic control, the system incorporates real-time vision feedback, a novel programming technique, and a graphical high-level user interface. By focusing on the vertical integration problem, not only are these subsystems examined, but also their interfaces and interactions. The control system implements a multi-level hierarchical structure; the techniques developed for operator input, strategic command, and cooperative dynamic control are presented. At the highest level, a mouse-based graphical user interface allows an operator to direct the activities of the system. Strategic command is provided by a table-driven finite state machine; this methodology provides a powerful yet flexible technique for managing the concurrent system interactions. The dynamic controller implements object impedance control, an extension of Neville Hogan's impedance control concept to cooperative arm manipulation of a single object. Experimental results are presented, showing the system locating and identifying a moving object, catching it, and performing a simple cooperative assembly. Results from dynamic control experiments are also presented, showing the controller's excellent dynamic trajectory tracking performance while also permitting control of environmental contact force.

  8. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information produced by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool, as simple as possible, to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework under traditional statistical teaching methods, the general consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without tedious calculation. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications of self-directed learning and continuing education.
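
    A computer simulation method of the kind described above can be written in a few lines; here is a sketch in Python rather than Excel (our own example, not one of the site's utilities): estimate by simulation the probability of at least 8 heads in 10 fair coin flips, then compare with the exact binomial value.

    ```python
    import random
    from math import comb

    random.seed(42)

    def simulate(trials=100_000):
        """Monte Carlo estimate of P(>= 8 heads in 10 fair flips)."""
        hits = 0
        for _ in range(trials):
            heads = sum(random.random() < 0.5 for _ in range(10))
            if heads >= 8:
                hits += 1
        return hits / trials

    # exact binomial probability for comparison
    exact = sum(comb(10, k) for k in (8, 9, 10)) / 2**10  # 0.0546875
    estimate = simulate()
    print(estimate, exact)
    ```

    Rerunning with different seeds shows the sampling variability that such simulations are meant to make tangible for students.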

  9. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    PubMed

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs, if only for its easy-to-use curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  10. Application of 2D graphic representation of protein sequence based on Huffman tree method.

    PubMed

    Qi, Zhao-Hui; Feng, Jun; Qi, Xiao-Qin; Li, Ling

    2012-05-01

    Based on the Huffman tree method, we propose a new 2D graphic representation of protein sequence. This representation completely avoids loss of information in the transfer of data from a protein sequence to its graphic representation. The method consists of two parts. The first derives 0-1 codes for the 20 amino acids from a Huffman tree built on amino acid frequencies, where the frequency of an amino acid is defined as its statistical count in the analyzed protein sequences. The second constructs the 2D graphic representation of a protein sequence from these 0-1 codes. Applications of the method to ten ND5 genes and seven Escherichia coli strains are then presented in detail. The results show that the proposed model may provide some new insights into the evolution patterns determined from protein sequences and complete genomes. Copyright © 2012 Elsevier Ltd. All rights reserved.
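
    The two steps can be sketched in Python as follows. The amino acid frequencies are made up for the example (the paper derives them from the analyzed sequences), and the 2-D mapping is one plausible reading of a bit-to-step walk, not the paper's exact construction: build Huffman codes from the frequencies, then advance x by one per bit and move y up or down for each 1/0.

    ```python
    import heapq

    freqs = {"A": 30, "L": 25, "G": 20, "S": 15, "V": 10}  # illustrative counts

    def huffman_codes(freqs):
        """Prefix-free 0-1 codes from symbol frequencies via a Huffman tree."""
        # heap entries: (weight, tiebreak, {symbol: code-so-far})
        heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freqs.items()))]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (w1 + w2, count, merged))
            count += 1
        return heap[0][2]

    def walk(sequence, codes):
        """Map a sequence to 2-D points: x advances per bit, y steps +1/-1."""
        x = y = 0
        points = [(0, 0)]
        for aa in sequence:
            for bit in codes[aa]:
                x += 1
                y += 1 if bit == "1" else -1
                points.append((x, y))
        return points

    codes = huffman_codes(freqs)
    path = walk("ALGA", codes)
    print(codes, path[-1])
    ```

    Because Huffman codes are prefix-free, the bit string, and hence the walk, is uniquely decodable back to the sequence, which is the "no loss of information" property the abstract claims.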

  11. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting the valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help users understand the program better by displaying the concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  12. A Novel Technique for Running the NASA Legacy Code LAPIN Synchronously With Simulations Developed Using Simulink

    NASA Technical Reports Server (NTRS)

    Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.

    2012-01-01

    This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded using the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.

  13. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  14. METAGUI 3: A graphical user interface for choosing the collective variables in molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Laio, Alessandro; Rodriguez, Alex

    2017-08-01

    Molecular dynamics (MD) simulations allow the exploration of the phase space of biopolymers through the integration of the equations of motion of their constituent atoms. The analysis of MD trajectories often relies on the choice of collective variables (CVs) along which the dynamics of the system is projected. We developed a graphical user interface (GUI) to facilitate the interactive choice of appropriate CVs. The GUI allows: interactively defining new CVs; partitioning the configurations into microstates characterized by similar values of the CVs; calculating the free energies of the microstates for both unbiased and biased (metadynamics) simulations; clustering the microstates into kinetic basins; and visualizing the free energy landscape as a function of a subset of the CVs used for the analysis. A simple mouse click allows one to quickly inspect structures corresponding to specific points in the landscape.
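
    The free-energy step of such an analysis can be sketched with a generic histogram estimator (our own illustration, not METAGUI's code): given samples of one collective variable from an unbiased trajectory, estimate F(cv) = -kT ln p(cv). The trajectory here is a synthetic toy.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    kT = 2.494  # kJ/mol at ~300 K
    cv_samples = rng.normal(loc=1.0, scale=0.3, size=50_000)  # toy CV trajectory

    # estimate the probability density of the CV, then invert to free energy
    hist, edges = np.histogram(cv_samples, bins=50, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0                      # avoid log(0) in empty bins
    free_energy = -kT * np.log(hist[mask])
    free_energy -= free_energy.min()     # shift so the global minimum is zero

    cv_at_minimum = centers[mask][np.argmin(free_energy)]
    print(round(cv_at_minimum, 2))       # the free-energy minimum sits at the mode
    ```

    For biased (metadynamics) runs, the histogram would first be reweighted by the applied bias; projecting onto two CVs instead of one gives the landscape the GUI visualizes.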

  15. AWE-WQ: fast-forwarding molecular dynamics using the accelerated weighted ensemble.

    PubMed

    Abdul-Wahid, Badi'; Feng, Haoyun; Rajan, Dinesh; Costaouec, Ronan; Darve, Eric; Thain, Douglas; Izaguirre, Jesús A

    2014-10-27

    A limitation of traditional molecular dynamics (MD) is that reaction rates are difficult to compute. This is due to the rarity of observing transitions between metastable states, since high energy barriers trap the system in these states. Recently the weighted ensemble (WE) family of methods has emerged, which can flexibly and efficiently sample conformational space without being trapped and allows calculation of unbiased rates. However, while WE can sample correctly and efficiently, a scalable implementation applicable to interesting biomolecular systems is not available. We provide here a GPLv2 implementation, called AWE-WQ, of a WE algorithm using the master/worker distributed computing WorkQueue (WQ) framework. AWE-WQ is scalable to thousands of nodes and supports dynamic allocation of computer resources, heterogeneous resource usage (such as central processing units (CPUs) and graphical processing units (GPUs) concurrently), seamless heterogeneous cluster usage (i.e., campus grids and cloud providers), and arbitrary MD codes such as GROMACS, while ensuring that all statistics are unbiased. We applied AWE-WQ to a 34-residue protein, which simulated 1.5 ms over 8 months with peak aggregate performance of 1000 ns/h. Comparison was done with a 200 μs simulation collected on a GPU over a similar timespan. The folding and unfolding rates were of comparable accuracy.
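
    The weighted-ensemble resampling idea behind AWE-WQ can be sketched in miniature (this is a generic WE toy, not the AWE-WQ implementation): walkers carry statistical weights, space is divided into bins, and after each propagation step every occupied bin is resampled back to a target walker count, splitting heavy walkers and merging light ones, while conserving total weight. The bin definition and target count are invented for the example.

    ```python
    import random

    random.seed(7)
    TARGET = 4  # walkers kept per occupied bin after resampling

    def resample_bin(walkers):
        """walkers: list of (position, weight). Return TARGET walkers with
        the same total weight, drawn with probability proportional to weight."""
        total = sum(w for _, w in walkers)
        chosen = random.choices([p for p, _ in walkers],
                                weights=[w for _, w in walkers], k=TARGET)
        return [(p, total / TARGET) for p in chosen]

    def resample(walkers, bin_of):
        bins = {}
        for p, w in walkers:
            bins.setdefault(bin_of(p), []).append((p, w))
        out = []
        for members in bins.values():
            out.extend(resample_bin(members))
        return out

    # ten walkers of equal weight scattered over [0, 2), binned into [0,1) and [1,2)
    walkers = [(random.uniform(0, 2), 1 / 10) for _ in range(10)]
    new = resample(walkers, bin_of=lambda p: int(p))
    print(len(new), sum(w for _, w in new))
    ```

    Because every bin keeps walkers regardless of how improbable it is, rare transition regions stay populated, which is what lets WE methods recover unbiased rates that brute-force MD cannot.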

  16. AWE-WQ: Fast-Forwarding Molecular Dynamics Using the Accelerated Weighted Ensemble

    PubMed Central

    2015-01-01

    A limitation of traditional molecular dynamics (MD) is that reaction rates are difficult to compute. This is due to the rarity of observing transitions between metastable states, since high energy barriers trap the system in these states. Recently the weighted ensemble (WE) family of methods has emerged, which can flexibly and efficiently sample conformational space without being trapped and allows calculation of unbiased rates. However, while WE can sample correctly and efficiently, a scalable implementation applicable to interesting biomolecular systems is not available. We provide here a GPLv2 implementation, called AWE-WQ, of a WE algorithm using the master/worker distributed computing WorkQueue (WQ) framework. AWE-WQ is scalable to thousands of nodes and supports dynamic allocation of computer resources, heterogeneous resource usage (such as central processing units (CPUs) and graphical processing units (GPUs) concurrently), seamless heterogeneous cluster usage (i.e., campus grids and cloud providers), and arbitrary MD codes such as GROMACS, while ensuring that all statistics are unbiased. We applied AWE-WQ to a 34-residue protein, which simulated 1.5 ms over 8 months with peak aggregate performance of 1000 ns/h. Comparison was done with a 200 μs simulation collected on a GPU over a similar timespan. The folding and unfolding rates were of comparable accuracy. PMID:25207854

  17. Effects of game-like interactive graphics on risk perceptions and decisions.

    PubMed

    Ancker, Jessica S; Weber, Elke U; Kukafka, Rita

    2011-01-01

    Many patients have difficulty interpreting risks described in statistical terms as percentages. Computer game technology offers the opportunity to experience how often an event occurs, rather than simply read about its frequency. Objective. To assess effects of interactive graphics on risk perceptions and decisions. Design. Electronic questionnaire. Participants and setting. Respondents (n = 165) recruited online or at an urban hospital. Intervention. Health risks were illustrated by either static graphics or interactive game-like graphics. The interactive search graphic was a grid of squares, which, when clicked, revealed stick figures underneath. Respondents had to click until they found a figure affected by the disease. Measurements. Risk feelings, risk estimates, intention to take preventive action. Results. Different graphics did not affect mean risk estimates, risk feelings, or intention. Low-numeracy participants reported significantly higher risk feelings than high-numeracy ones except with the interactive search graphic. Unexpectedly, respondents reported stronger intentions to take preventive action when the intention question followed questions about efficacy and disease severity than when it followed perceived risk questions (65% v. 34%; P < 0.001). When respondents reported risk feelings immediately after using the search graphic, the interaction affected perceived risk (the longer the search to find affected stick figures, the higher the risk feeling: ρ = 0.57; P = 0.009). Limitations. The authors used hypothetical decisions. Conclusions. A game-like graphic that allowed consumers to search for stick figures affected by disease had no main effect on risk perception but reduced differences based on numeracy. In one condition, the game-like graphic increased concern about rare risks. Intentions for preventive action were stronger with a question order that focused first on efficacy and disease severity than with one that focused first on perceived risk.
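
    The search graphic described above amounts to sampling grid squares without replacement until an affected figure is found. This sketch (our own, not the study's software) simulates how many clicks that takes for a 4-in-100 risk on a 100-square grid; the expected click count is (n+1)/(k+1) = 101/5 = 20.2.

    ```python
    import random

    random.seed(3)

    def clicks_to_find(n_squares=100, n_affected=4):
        """Shuffle the grid, click squares in order, count clicks to a hit."""
        squares = [True] * n_affected + [False] * (n_squares - n_affected)
        random.shuffle(squares)
        for clicks, affected in enumerate(squares, start=1):
            if affected:
                return clicks

    trials = [clicks_to_find() for _ in range(20_000)]
    mean_clicks = sum(trials) / len(trials)
    print(round(mean_clicks, 1))
    ```

    The long right tail of this distribution (occasionally dozens of clicks before a hit) is one plausible mechanism for the correlation between search length and risk feeling reported in the abstract.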

  18. Faster Mass Spectrometry-based Protein Inference: Junction Trees are More Efficient than Sampling and Marginalization by Enumeration

    PubMed Central

    Serang, Oliver; Noble, William Stafford

    2012-01-01

    The problem of identifying the proteins in a complex mixture using tandem mass spectrometry can be framed as an inference problem on a graph that connects peptides to proteins. Several existing protein identification methods make use of statistical inference methods for graphical models, including expectation maximization, Markov chain Monte Carlo, and full marginalization coupled with approximation heuristics. We show that, for this problem, the majority of the cost of inference usually comes from a few highly connected subgraphs. Furthermore, we evaluate three different statistical inference methods using a common graphical model, and we demonstrate that junction tree inference substantially improves rates of convergence compared to existing methods. The Python code used for this paper is available at http://noble.gs.washington.edu/proj/fido. PMID:22331862
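
    A miniature version of this inference problem can be solved by brute-force marginalization, the baseline approach the junction tree method improves on. The sketch below is our own toy, not the paper's model: two candidate proteins, three observed peptides, and a simple emission model in which a peptide is likely to be observed only if some parent protein is present. All probabilities are illustrative.

    ```python
    from itertools import product

    prior = 0.5   # prior probability that a protein is present
    emit = 0.9    # P(peptide observed | some parent protein present)
    noise = 0.05  # P(peptide observed | no parent present)
    parents = {"pep1": {"A"}, "pep2": {"A", "B"}, "pep3": {"B"}}
    observed = {"pep1": True, "pep2": True, "pep3": False}

    def likelihood(present):
        """P(observations | set of present proteins)."""
        L = 1.0
        for pep, obs in observed.items():
            p_obs = emit if parents[pep] & present else noise
            L *= p_obs if obs else (1.0 - p_obs)
        return L

    # enumerate all 2^2 protein configurations and marginalize
    posterior, Z = {}, 0.0
    for bits in product([0, 1], repeat=2):
        present = {p for p, b in zip("AB", bits) if b}
        weight = (likelihood(present)
                  * prior ** sum(bits) * (1 - prior) ** (2 - sum(bits)))
        Z += weight
        for p in present:
            posterior[p] = posterior.get(p, 0.0) + weight
    posterior = {p: w / Z for p, w in posterior.items()}
    print({p: round(v, 3) for p, v in posterior.items()})
    ```

    Enumeration costs 2^n for n proteins, which is exactly why the highly connected subgraphs mentioned in the abstract dominate the cost and why junction tree inference pays off there.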

  19. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    PubMed

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation, and allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.

  1. Proceedings of the 1977 Image Conference Held at Williams Air Force Base, Arizona on 17-18 May 1977

    DTIC Science & Technology

    1977-05-01

    both stimulating and informative. May your experiences at this Conference be most rewarding and enjoyable. KEYNOTE ADDRESS Senator Barry M. Goldwater...capabilities, human motion perceiving capabilities and the dynamics of the flight system being simulated. Subliminal washout schemes and recent develop...facilities and the military user by the graphic presentation of information with auditory overlay. Background Pieces of the electronic graphic mail concept

  2. Monitoring and analysis of data in cyberspace

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M. (Inventor); Angelino, Robert (Inventor)

    2001-01-01

    Information from monitored systems is displayed in three dimensional cyberspace representations defining a virtual universe having three dimensions. Fixed and dynamic data parameter outputs from the monitored systems are visually represented as graphic objects that are positioned in the virtual universe based on relationships to the system and to the data parameter categories. Attributes and values of the data parameters are indicated by manipulating properties of the graphic object such as position, color, shape, and motion.

  3. SEDIDAT: A BASIC program for the collection and statistical analysis of particle settling velocity data

    NASA Astrophysics Data System (ADS)

    Wright, Robyn; Thornberg, Steven M.

    SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
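
    The unit conversion and moment statistics such a program reports can be sketched as follows. This assumes the usual sedimentological conventions rather than the SEDIDAT source: Chi = -log2(settling velocity in cm/s), by analogy with Phi = -log2(diameter in mm), and moment statistics computed from class midpoints weighted by weight percent. The data are invented.

    ```python
    import math

    velocities_cm_s = [4.0, 2.0, 1.0, 0.5]   # settling-velocity class midpoints
    weight_percent = [10.0, 40.0, 35.0, 15.0]

    # Chi transform (assumed convention): Chi = -log2(velocity in cm/s)
    chi = [-math.log2(v) for v in velocities_cm_s]

    def moments(x, w):
        """Weighted moment mean and standard deviation ("sorting")."""
        total = sum(w)
        mean = sum(xi * wi for xi, wi in zip(x, w)) / total
        var = sum(wi * (xi - mean) ** 2 for xi, wi in zip(x, w)) / total
        return mean, math.sqrt(var)

    mean_chi, sorting = moments(chi, weight_percent)
    print(round(mean_chi, 2), round(sorting, 2))
    ```

    Higher moments (skewness, kurtosis) follow the same weighted-sum pattern with third and fourth powers of the deviation.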

  4. ConvAn: a convergence analyzing tool for optimization of biochemical networks.

    PubMed

    Kostromins, Andrejs; Mozga, Ivars; Stalidzans, Egils

    2012-01-01

    Dynamic models of biochemical networks are usually described as systems of nonlinear differential equations. When models are optimized for parameter estimation or for the design of new properties, numerical methods are mainly used. This makes the predictability of optimization a problem: most numerical optimization methods have stochastic properties, and convergence of the objective function to the global optimum is hard to predict. Determining a suitable optimization method and the necessary duration of optimization becomes critical when evaluating a large number of combinations of adjustable parameters or when working with large dynamic models. This task is complex due to the variety of optimization methods and software tools and the nonlinearity of models in different parameter spaces. The software tool ConvAn is developed to analyze the statistical properties of convergence dynamics for optimization runs with a particular optimization method, model, software tool, set of optimization method parameters, and number of adjustable parameters of the model. Convergence curves can be normalized automatically to enable comparison of different methods and models on the same scale. With the help of the biochemistry-adapted graphical user interface of ConvAn, it is possible to compare optimization methods in terms of their ability to find the global optimum, or values close to it, and the computational time needed to reach them, and to estimate optimization performance for different numbers of adjustable parameters. The functionality of ConvAn enables statistical assessment of the necessary optimization time depending on the required optimization accuracy. Optimization methods that are unsuitable for a particular task can be rejected if they show poor repeatability or convergence properties. The software ConvAn is freely available at www.biosystems.lv/convan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
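
    The kind of analysis described can be illustrated with a small sketch (our own, not ConvAn's code): run a stochastic optimizer several times, record the best-so-far objective value at each iteration, normalize each curve to [0, 1] so different methods are comparable, and average the normalized curves per method. The toy objective and method parameters are invented.

    ```python
    import random

    random.seed(11)

    def run_optimizer(step_scale, iters=200):
        """Toy stochastic search on f(x) = x^2; returns best-so-far curve."""
        x, best = 10.0, 10.0 ** 2
        curve = []
        for _ in range(iters):
            cand = x + random.gauss(0, step_scale)
            if cand ** 2 < best:
                x, best = cand, cand ** 2
            curve.append(best)
        return curve

    def normalize(curve):
        """Rescale a (nonincreasing) convergence curve to the range [0, 1]."""
        lo, hi = min(curve), max(curve)
        return [(v - lo) / (hi - lo) for v in curve]

    def mean_curve(curves):
        return [sum(vals) / len(vals) for vals in zip(*curves)]

    methods = {"small steps": 0.5, "large steps": 5.0}
    summary = {name: mean_curve([normalize(run_optimizer(s)) for _ in range(20)])
               for name, s in methods.items()}
    print({name: round(c[100], 3) for name, c in summary.items()})
    ```

    Comparing the mean normalized curves (and their spread across repeats) shows which method converges faster and more repeatably, which is the basis for rejecting unsuitable methods.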

  5. The effects of 3D interactive animated graphics on student learning and attitudes in computer-based instruction

    NASA Astrophysics Data System (ADS)

    Moon, Hye Sun

    Visuals are extensively used as instructional tools in education to present spatially based information. Recent computer technology allows the generation of 3D animated visuals to extend presentation in computer-based instruction. Animated visuals in 3D representation not only possess motivational value that promotes positive attitudes toward instruction but also facilitate learning when the subject matter requires dynamic motion and 3D visual cues. In this study, three questions are explored: (1) how 3D graphics affect student learning and attitudes, in comparison with 2D graphics; (2) how animated graphics affect student learning and attitudes, in comparison with static graphics; and (3) whether 3D graphics, when supported by interactive animation, provide the most effective visual cues for improving learning and developing positive attitudes. A total of 145 eighth-grade students participated in a 2 x 2 factorial design study. The subjects were randomly assigned to one of four computer-based instruction conditions: 2D static; 2D animated; 3D static; and 3D animated. The results indicated that: (1) Students in the 3D graphic condition exhibited more positive attitudes toward instruction than those in the 2D graphic condition. No group differences were found between the posttest scores of the 3D and 2D graphic conditions. However, students in the 3D graphic condition took less time for information retrieval on the posttest than those in the 2D graphic condition. (2) Students in the animated graphic condition exhibited slightly more positive attitudes toward instruction than those in the static graphic condition. No group differences were found between the posttest scores of the animated and static graphic conditions. However, students in the animated graphic condition took less time for information retrieval on the posttest than those in the static graphic condition. 
(3) Students in the 3D animated graphic condition exhibited more positive attitudes toward instruction than those in the other treatment conditions (2D static, 2D animated, and 3D static). No group differences were found in the posttest scores among the four treatment conditions. However, students in the 3D animated condition took less time for information retrieval on the posttest than those in the other treatment conditions.

  6. Linear mixed-effects models for within-participant psychology experiments: an introductory tutorial and free, graphical user interface (LMMgui).

    PubMed

    Magezi, David A

    2015-01-01

    Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).

  7. Weighted analysis methods for mapped plot forest inventory data: Tables, regressions, maps and graphs

    Treesearch

    Paul C. Van Deusen; Linda S. Heath

    2010-01-01

    Weighted estimation methods for analysis of mapped plot forest inventory data are discussed. The appropriate weighting scheme can vary depending on the type of analysis and graphical display. Both statistical issues and user expectations need to be considered in these methods. A weighting scheme is proposed that balances statistical considerations and the logical...
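    The excerpt does not give the authors' weighting scheme, but the core idea of weighted estimation for mapped plots can be sketched as follows; the condition names and numbers are hypothetical:

```python
def weighted_mean(values, weights):
    """Area-weighted estimate: each mapped condition on a plot contributes
    in proportion to the plot area it occupies."""
    total_w = sum(weights)
    if total_w == 0:
        raise ValueError("weights must not all be zero")
    return sum(v * w for v, w in zip(values, weights)) / total_w

# Hypothetical plot mapped into two forest conditions:
# 70% of the plot area in condition A, 30% in condition B.
volumes = [120.0, 80.0]   # volume per hectare in each condition
areas   = [0.7, 0.3]      # proportion of plot area
estimate = weighted_mean(volumes, areas)  # about 108.0
```

Whether area proportions, tree counts, or some compromise should serve as weights is exactly the kind of trade-off between statistical considerations and user expectations the abstract describes.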

  8. Teaching Students to Use Summary Statistics and Graphics to Clean and Analyze Data

    ERIC Educational Resources Information Center

    Holcomb, John; Spalsbury, Angela

    2005-01-01

    Textbooks and websites today abound with real data. One neglected issue is that statistical investigations often require a good deal of "cleaning" to ready data for analysis. The purpose of this dataset and exercise is to teach students to use exploratory tools to identify erroneous observations. This article discusses the merits of such…
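    As a concrete illustration of using summary statistics to screen for erroneous observations (the example data are ours, not from the article), even a simple standard-deviation rule catches gross entry errors:

```python
import statistics

def flag_outliers(data, k=3.0):
    """Return observations more than k standard deviations from the mean:
    a crude first-pass screen for data-entry errors."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    return [x for x in data if abs(x - mean) > k * sd]

# A hypothetical column of ages with one likely data-entry error:
ages = [21, 22, 23, 20, 24, 22, 21, 220]
suspect = flag_outliers(ages, k=2.0)  # [220]
```

Flagged values should be inspected, not deleted automatically; the teaching point is that the flag prompts a question about the record, not an answer.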

  9. Analysis of Multiple Contingency Tables by Exact Conditional Tests for Zero Partial Association.

    ERIC Educational Resources Information Center

    Kreiner, Svend

    The tests for zero partial association in a multiple contingency table have gained new importance with the introduction of graphical models. It is shown how these may be performed as exact conditional tests, using as test criteria either the ordinary likelihood ratio, the standard chi-squared statistic, or any other appropriate statistic. A…
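    In the simplest case, a 2 x 2 table, the exact conditional test is Fisher's exact test: conditioning on the margins makes the first cell hypergeometric, and the p-value sums the probabilities of all tables with the same margins that are no more probable than the observed one. A pure-Python sketch (ours, not Kreiner's implementation):

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same margins
    whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):  # P(first cell = x) given fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)
```

For higher-dimensional tables the same conditioning idea applies, but the reference set of tables grows quickly and is usually sampled rather than enumerated.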

  10. Effects of Concept Mapping Strategy on Learning Performance in Business and Economics Statistics

    ERIC Educational Resources Information Center

    Chiou, Chei-Chang

    2009-01-01

    A concept map (CM) is a hierarchically arranged, graphic representation of the relationships among concepts. Concept mapping (CMING) is the process of constructing a CM. This paper examines whether a CMING strategy can be useful in helping students to improve their learning performance in a business and economics statistics course. A single…

  11. Consistent Tolerance Bounds for Statistical Distributions

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1983-01-01

    The assumption that a sample comes from a population with a particular distribution is made with confidence C if the data lie between certain bounds. These "confidence bounds" depend on C and on assumptions about the distribution of sampling errors around the regression line. Graphical test criteria using tolerance bounds are applied in industries where statistical analysis influences product development and use, for example in evaluating equipment life.

  12. Quantum Dynamics and a Semiclassical Description of the Photon.

    ERIC Educational Resources Information Center

    Henderson, Giles

    1980-01-01

    Uses computer graphics and nonstationary, superposition wave functions to reveal the dynamic quantum trajectories of several molecular and electronic transitions. These methods are then coupled with classical electromagnetic theory to provide a conceptually clear picture of the emission process and emitted radiation localized in time and space.…

  13. Learning about Locomotion Patterns from Visualizations: Effects of Presentation Format and Realism

    ERIC Educational Resources Information Center

    Imhof, Birgit; Scheiter, Katharina; Gerjets, Peter

    2011-01-01

    The rapid development of computer graphics technology has made possible an easy integration of dynamic visualizations into computer-based learning environments. This study examines the relative effectiveness of dynamic visualizations, compared either to sequentially or simultaneously presented static visualizations. Moreover, the degree of realism…

  14. The Sport Students’ Ability of Literacy and Statistical Reasoning

    NASA Astrophysics Data System (ADS)

    Hidayah, N.

    2017-03-01

    Literacy and statistical reasoning are very important abilities for students in sport education colleges, because material for statistical learning can be drawn from many of their activities, such as sport competitions, test and measurement results, predicting achievement from training, and finding relationships among variables. This research describes sport education college students' literacy and statistical reasoning with respect to identifying data types, probability, table interpretation, description and explanation using bar or pie graphics, explanation of variability, and the calculation and interpretation of the mean, median, and mode, using a purpose-built instrument. The instrument was administered to 50 college students majoring in sport; only 26% of the students scored above 30%, while the rest scored below 30%. Across all subjects, 56% of students could identify data classifications; 49% could read, display, and interpret tables through graphics; 27% showed ability in probability; 33% could describe variability; and 16.32% could read, compute, and describe the mean, median, and mode. These results show that the sport students' literacy and statistical reasoning are not yet adequate: their statistical study has not reached conceptual understanding, literacy training, or statistical reasoning, so it is critical to improve these abilities.
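    The descriptive measures the instrument targets (mean, median, mode) map directly onto Python's standard library; the sprint times below are invented for illustration:

```python
from statistics import mean, median, mode

# Hypothetical 100 m sprint times (seconds) from a measurement exercise:
times = [12.1, 11.8, 12.4, 12.1, 13.0, 12.1, 11.9]

avg = mean(times)    # arithmetic mean, about 12.2
mid = median(times)  # middle value of the sorted times: 12.1
typ = mode(times)    # most frequent time: 12.1
```

Seeing the three measures disagree on the same small dataset (here the mean is pulled up by the slow 13.0 s run) is itself a useful statistical-reasoning exercise.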

  15. mcaGUI: microbial community analysis R-Graphical User Interface (GUI).

    PubMed

    Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid

    2012-08-15

    Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R-programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html

  16. Representing Learning With Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence, for instance, in diagnosis and expert systems, as a unified qualitative and quantitative framework for representing and reasoning with probabilities and independencies. Their development and use spans several fields including artificial intelligence, decision theory and statistics, and provides an important bridge between these communities. This paper shows by way of example that these models can be extended to machine learning, neural networks and knowledge discovery by representing the notion of a sample on the graphical model. Not only does this allow a flexible variety of learning problems to be represented, it also provides the means for representing the goal of learning and opens the way for the automatic development of learning algorithms from specifications.

  17. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, Jon D.

    1990-01-01

    Uncertainty of frequency response using the fuzzy set method and on-orbit response prediction using laboratory test data to refine an analytical model are emphasized with respect to large space structures. Two aspects of the fuzzy set approach were investigated relative to its application to large structural dynamics problems: (1) minimizing the number of parameters involved in computing possible intervals; and (2) the treatment of extrema which may occur in the parameter space enclosed by all possible combinations of the important parameters of the model. Extensive printer graphics were added to the SSID code to help facilitate model verification, and an application of this code to the LaRC Ten Bay Truss is included in the appendix to illustrate this graphics capability.

  18. Data modeling of network dynamics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.; Faucheux, Jeffery P.; Harris, Brad

    2004-01-01

    This paper highlights Data Modeling theory and its use for text data mining as a graphical network search engine. Data Modeling is then used to create a real-time filter capable of monitoring network traffic down to the port level for unusual dynamics and changes in business as usual. This is accomplished in an unsupervised fashion without a priori knowledge of abnormal characteristics. Two novel methods for converting streaming binary data into a form amenable to graphics based search and change detection are introduced. These techniques are then successfully applied to 1999 KDD Cup network attack data log-on sessions to demonstrate that Data Modeling can detect attacks without prior training on any form of attack behavior. Finally, two new methods for data encryption using these ideas are proposed.

  19. Routine Microsecond Molecular Dynamics Simulations with AMBER on GPUs. 2. Explicit Solvent Particle Mesh Ewald.

    PubMed

    Salomon-Ferrer, Romelia; Götz, Andreas W; Poole, Duncan; Le Grand, Scott; Walker, Ross C

    2013-09-10

    We present an implementation of explicit solvent all atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA-enabled GPUs. First released publicly in April 2010 as part of version 11 of the AMBER MD package and further improved and optimized over the last two years, this implementation supports the three most widely used statistical mechanical ensembles (NVE, NVT, and NPT), uses particle mesh Ewald (PME) for the long-range electrostatics, and runs entirely on CUDA-enabled NVIDIA graphics processing units (GPUs), providing results that are statistically indistinguishable from the traditional CPU version of the software and with performance that exceeds that achievable by the CPU version of AMBER software running on all conventional CPU-based clusters and supercomputers. We briefly discuss three different precision models developed specifically for this work (SPDP, SPFP, and DPDP) and highlight the technical details of the approach as it extends beyond previously reported work [Götz et al., J. Chem. Theory Comput. 2012, DOI: 10.1021/ct200909j; Le Grand et al., Comp. Phys. Comm. 2013, DOI: 10.1016/j.cpc.2012.09.022]. We highlight the substantial improvements in performance that are seen over traditional CPU-only machines and provide validation of our implementation and precision models. We also provide evidence supporting our decision to deprecate the previously described fully single precision (SPSP) model from the latest release of the AMBER software package.
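    The core NVE update in such MD codes is the velocity-Verlet integrator. A toy single-particle version in Python (ours; obviously not AMBER's CUDA implementation) shows the scheme and its energy conservation:

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity-Verlet integration: the basic NVE update of classical MD."""
    f = force(x)
    traj = [x]
    for _ in range(steps):
        x = x + v * dt + 0.5 * (f / mass) * dt * dt   # position update
        f_new = force(x)                              # force at new position
        v = v + 0.5 * (f + f_new) / mass * dt         # velocity update
        f = f_new
        traj.append(x)
    return traj, v

# Harmonic oscillator with k = m = 1; its period is 2*pi, so after
# 628 steps of dt = 0.01 the trajectory should be back near x = 1.
traj, v_end = velocity_verlet(1.0, 0.0, lambda x: -x, 1.0, 0.01, 628)
```

In a real engine the scalar force call becomes the full nonbonded plus bonded force evaluation (with PME for electrostatics), which is where the GPU precision models (SPDP, SPFP, DPDP) matter.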

  20. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    In the present paper the novel software GTest is introduced, designed for testing the normality of a user-specified empirical distribution. It has two unusual characteristics: first, the user can choose among four versions of the normality test, each suited to a specific dataset or goal; second, the inferential paradigm that informs the output is essentially graphical and intrinsically self-explanatory. Inference-by-eye is an emerging inferential approach that is likely to find successful application in the near future, given the growing need to widen the audience of statistical methods to people with informal statistical skills. For instance, the latest European regulations on environmental issues introduce strict protocols for data handling (data quality assurance, outlier detection, etc.) and information exchange (areal statistics, trend detection, etc.) between regional and central environmental agencies. More and more frequently, therefore, laboratory and field technicians will be asked to use complex software applications to subject data from monitoring, surveying, or laboratory activities to specific statistical analyses. Unfortunately, the inferential statistics that actually influence decisions about the correct management of environmental resources are often implemented so that their outcomes appear in numerical form with brief comments in strict statistical jargon (degrees of freedom, level of significance, accepted/rejected H0, etc.), which people with little statistical background find difficult to interpret. In this framework, visual inference can help fill the gap by providing outcomes in self-explanatory graphical form with a brief comment in plain language. Indeed, the difficulties experienced by colleagues, and their requests for an effective tool to address them, motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick, and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application was developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved sufficiently robust and reliable. GTest provides true graphical normality tests that are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots are augmented with an acceptance region drawn around the representation of the theoretical distribution, defined according to the alpha level of significance and the sample size. The decision rule is: if the empirical scatterplot falls entirely within the acceptance region, the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study with simulated and real-world data was carried out to check the robustness and reliability of the software.
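    The decision rule can be imitated in a few lines of standard-library Python. The fixed-width acceptance band below is a crude stand-in for GTest's region, which depends on the alpha level and sample size:

```python
import math
from statistics import NormalDist

def qq_points(sample):
    """Pair sorted, standardized data with theoretical normal quantiles,
    as in a Q-Q plot."""
    n = len(sample)
    mu = sum(sample) / n
    sd = (sum((x - mu) ** 2 for x in sample) / (n - 1)) ** 0.5
    ordered = sorted((x - mu) / sd for x in sample)
    theory = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    return list(zip(theory, ordered))

def within_band(sample, half_width=0.5):
    """Accept normality if every Q-Q point lies within a fixed band around
    the identity line (a stand-in for GTest's alpha- and size-dependent
    acceptance region)."""
    return all(abs(emp - th) <= half_width for th, emp in qq_points(sample))

# Deterministic check samples built from distribution quantiles:
n = 50
normal_like = [NormalDist(10, 2).inv_cdf((i + 0.5) / n) for i in range(n)]
skewed = [-math.log(1 - (i + 0.5) / n) for i in range(n)]  # exponential-shaped
```

A normal-shaped sample hugs the identity line, while a skewed sample bows away from it in the tails, which is exactly what the plotted acceptance region makes visible to a non-statistician.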

  1. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for interactively displaying and editing velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are graphical editing of nodes and fast adjustment of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X, and provides a version-controlled source code repository for community development (https://sourceforge.net/projects/pray-plot-rayinvr/).

  2. Computer aided analysis and optimization of mechanical system dynamics

    NASA Technical Reports Server (NTRS)

    Haug, E. J.

    1984-01-01

    The purpose is to outline a computational approach to spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization is the ultimate goal. The approach to computer generation and solution of the system dynamic equations and graphical methods for creating animations as output is outlined.

  3. Tuberculosis Data and Statistics

    MedlinePlus

    ... TB programs can use to design and prioritize effective public health interventions. Tuberculosis — United States, 2017 (Provisional Data) Take on Tuberculosis Infographic and Social Media Graphics Customizable Take on TB Infographic with Instructions ...

  4. Gender differences in learning physical science concepts: Does computer animation help equalize them?

    NASA Astrophysics Data System (ADS)

    Jacek, Laura Lee

    This dissertation details an experiment designed to identify gender differences in learning using three experimental treatments: animation, static graphics, and verbal instruction alone. Three learning presentations were used in testing of 332 university students. Statistical analysis was performed using ANOVA, binomial tests for differences of proportion, and descriptive statistics. Results showed that animation significantly improved women's long-term learning over static graphics (p = 0.067), but didn't significantly improve men's long-term learning over static graphics. In all cases, women's scores improved with animation over both other forms of instruction for long-term testing, indicating that future research should not abandon the study of animation as a tool that may promote gender equity in science. Short-term test differences were smaller, and not statistically significant. Variation present in short-term scores was related more to presentation topic than treatment. This research also details characteristics of each of the three presentations, to identify variables (e.g. level of abstraction in presentation) affecting score differences within treatments. Differences between men's and women's scores were non-standard between presentations, but these differences were not statistically significant (long-term p = 0.2961, short-term p = 0.2893). In future research, experiments might be better designed to test these presentational variables in isolation, possibly yielding more distinctive differences between presentational scores. Differences in confidence interval overlaps between presentations suggested that treatment superiority may be somewhat dependent on the design or topic of the learning presentation. Confidence intervals greatly overlap in all situations. This undercut, to some degree, the surety of conclusions indicating superiority of one treatment type over the others. 
However, confidence intervals for animation were smaller, overlapped nearly completely for men and women (there was less overlap between the genders for the other two treatments), and centered on slightly higher means, lending further support to the conclusion that animation helped equalize men's and women's learning. The most important conclusion of this research is that gender is an important variable in experimental populations testing animation as a learning device. Averages indicated that both men and women prefer to work with animation over either static graphics or verbal instruction alone.

  5. OMV mission simulator

    NASA Technical Reports Server (NTRS)

    Cok, Keith E.

    1989-01-01

    The Orbital Maneuvering Vehicle (OMV) will be remotely piloted during rendezvous, docking, or proximity operations with target spacecraft from a ground control console (GCC). The real-time mission simulator and graphics being used to design a console pilot-machine interface are discussed. A real-time orbital dynamics simulator drives the visual displays. The dynamics simulator includes a J2 oblate earth gravity model and a generalized 1962 rotating atmospheric and drag model. The simulator also provides a variable-length communication delay to represent use of the Tracking and Data Relay Satellite System (TDRSS) and NASA Communications (NASCOM). Input parameter files determine the graphics display. This feature allows rapid prototyping since displays can be easily modified from pilot recommendations. A series of pilot reviews are being held to determine an effective pilot-machine interface. Pilots fly missions with nominal to 3-sigma dispersions in translational or rotational axes. Console dimensions, switch type and layout, hand controllers, and graphic interfaces are evaluated by the pilots and the GCC simulator is modified for subsequent runs. Initial results indicate a pilot preference for analog versus digital displays and for two 3-degree-of-freedom hand controllers.

  6. The design and implementation of CRT displays in the TCV real-time simulation

    NASA Technical Reports Server (NTRS)

    Leavitt, J. B.; Tariq, S. I.; Steinmetz, G. G.

    1975-01-01

    The design and application of computer graphics to the Terminal Configured Vehicle (TCV) program were described. A Boeing 737-100 series aircraft was modified with a second flight deck and several computers installed in the passenger cabin. One of the elements in support of the TCV program is a sophisticated simulation system developed to duplicate the operation of the aft flight deck. This facility consists of an aft flight deck simulator, equipped with realistic flight instrumentation, a CDC 6600 computer, and an Adage graphics terminal; this terminal presents to the simulator pilot displays similar to those used on the aircraft, with equivalent man-machine interactions. These two displays form the primary flight instrumentation for the pilot and are dynamic images depicting critical flight information. The graphics terminal is a high speed interactive refresh-type graphics system. To support the cockpit display, two remote CRTs were wired in parallel with two of the Adage scopes.

  7. Local introduction and heterogeneous spatial spread of dengue-suppressing Wolbachia through an urban population of Aedes aegypti

    PubMed Central

    Schmidt, Tom L.; Barton, Nicholas H.; Rašić, Gordana; Turley, Andrew P.; Montgomery, Brian L.; Iturbe-Ormaetxe, Inaki; Cook, Peter E.; Ryan, Peter A.; Ritchie, Scott A.; Hoffmann, Ary A.; O’Neill, Scott L.

    2017-01-01

    Dengue-suppressing Wolbachia strains are promising tools for arbovirus control, particularly as they have the potential to self-spread following local introductions. To test this, we followed the frequency of the transinfected Wolbachia strain wMel through Ae. aegypti in Cairns, Australia, following releases at 3 nonisolated locations within the city in early 2013. Spatial spread was analysed graphically using interpolation and by fitting a statistical model describing the position and width of the wave. For the larger 2 of the 3 releases (covering 0.97 km2 and 0.52 km2), we observed slow but steady spatial spread, at about 100–200 m per year, roughly consistent with theoretical predictions. In contrast, the smallest release (0.11 km2) produced erratic temporal and spatial dynamics, with little evidence of spread after 2 years. This is consistent with the prediction concerning fitness-decreasing Wolbachia transinfections that a minimum release area is needed to achieve stable local establishment and spread in continuous habitats. Our graphical and likelihood analyses produced broadly consistent estimates of wave speed and wave width. Spread at all sites was spatially heterogeneous, suggesting that environmental heterogeneity will affect large-scale Wolbachia transformations of urban mosquito populations. The persistence and spread of Wolbachia in release areas meeting minimum area requirements indicates the promise of successful large-scale population transformation. PMID:28557993

  8. Local introduction and heterogeneous spatial spread of dengue-suppressing Wolbachia through an urban population of Aedes aegypti.

    PubMed

    Schmidt, Tom L; Barton, Nicholas H; Rašić, Gordana; Turley, Andrew P; Montgomery, Brian L; Iturbe-Ormaetxe, Inaki; Cook, Peter E; Ryan, Peter A; Ritchie, Scott A; Hoffmann, Ary A; O'Neill, Scott L; Turelli, Michael

    2017-05-01

    Dengue-suppressing Wolbachia strains are promising tools for arbovirus control, particularly as they have the potential to self-spread following local introductions. To test this, we followed the frequency of the transinfected Wolbachia strain wMel through Ae. aegypti in Cairns, Australia, following releases at 3 nonisolated locations within the city in early 2013. Spatial spread was analysed graphically using interpolation and by fitting a statistical model describing the position and width of the wave. For the larger 2 of the 3 releases (covering 0.97 km2 and 0.52 km2), we observed slow but steady spatial spread, at about 100-200 m per year, roughly consistent with theoretical predictions. In contrast, the smallest release (0.11 km2) produced erratic temporal and spatial dynamics, with little evidence of spread after 2 years. This is consistent with the prediction concerning fitness-decreasing Wolbachia transinfections that a minimum release area is needed to achieve stable local establishment and spread in continuous habitats. Our graphical and likelihood analyses produced broadly consistent estimates of wave speed and wave width. Spread at all sites was spatially heterogeneous, suggesting that environmental heterogeneity will affect large-scale Wolbachia transformations of urban mosquito populations. The persistence and spread of Wolbachia in release areas meeting minimum area requirements indicates the promise of successful large-scale population transformation.
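    A rough version of the wave-speed estimate is simple: track the wavefront position over time and take the least-squares slope. The positions below are invented, chosen only to sit in the 100-200 m per year range the paper reports; the authors' actual model jointly fits wave position and width by likelihood:

```python
def wave_speed(times, positions):
    """Least-squares slope of wavefront position against time:
    a simple estimate of the spatial spread rate."""
    n = len(times)
    t_bar = sum(times) / n
    x_bar = sum(positions) / n
    num = sum((t - t_bar) * (x - x_bar) for t, x in zip(times, positions))
    den = sum((t - t_bar) ** 2 for t in times)
    return num / den

# Hypothetical quarterly wavefront positions (metres from release centre):
times = [0.0, 0.25, 0.5, 0.75, 1.0]      # years since release
front = [0.0, 40.0, 75.0, 110.0, 150.0]  # metres
speed = wave_speed(times, front)          # 148.0 m per year
```

Fitting per site and comparing slopes is one way the spatial heterogeneity reported in the abstract would show up numerically.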

  9. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    PubMed

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
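    The bootstrap methods EEGLAB relies on rest on resampling with replacement. A minimal percentile-bootstrap confidence interval in standard-library Python (the amplitude values are invented):

```python
import random
from statistics import mean

def bootstrap_ci(data, stat=mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic:
    resample with replacement, recompute the statistic, take quantiles."""
    rng = random.Random(seed)
    n = len(data)
    boots = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(n_boot))
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical single-trial ERP peak amplitudes (microvolts):
amplitudes = [2.1, 1.8, 2.4, 2.0, 2.2, 1.9, 2.3, 2.5, 1.7, 2.0]
lo, hi = bootstrap_ci(amplitudes)
```

The same resampling scheme, applied to time/frequency measures instead of a mean, is what underlies the significance masks in EEGLAB's coherence and ERP-image displays.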

  10. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.

  11. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel.

    PubMed

    Grapov, Dmitry; Newman, John W

    2012-09-01

    Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large datasets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010).

  12. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  13. Research Techniques Made Simple: Bioinformatics for Genome-Scale Biology.

    PubMed

    Foulkes, Amy C; Watson, David S; Griffiths, Christopher E M; Warren, Richard B; Huber, Wolfgang; Barnes, Michael R

    2017-09-01

    High-throughput biology presents unique opportunities and challenges for dermatological research. Drawing on a small handful of exemplary studies, we review some of the major lessons of these new technologies. We caution against several common errors and introduce helpful statistical concepts that may be unfamiliar to researchers without experience in bioinformatics. We recommend specific software tools that can aid dermatologists at varying levels of computational literacy, including platforms with command-line and graphical user interfaces. The future of dermatology lies in integrative research, in which clinicians, laboratory scientists, and data analysts come together to plan, execute, and publish their work in open forums that promote critical discussion and reproducibility. In this article, we offer guidelines that we hope will steer researchers toward best practices for this new and dynamic era of data-intensive dermatology. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Comparison of Response Surface and Kriging Models in the Multidisciplinary Design of an Aerospike Nozzle

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.

    1998-01-01

    The use of response surface models and kriging models is compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial models for approximation, kriging is presented as an alternative statistical-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models using only a constant for the underlying global model and a Gaussian correlation function perform as well as the second-order polynomial response surface models.
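    The contrast between the two approximation types can be shown in a few lines: a least-squares polynomial smooths the sample data, while a kriging predictor with a constant global model and Gaussian correlation interpolates it exactly. A hedged 1-D toy sketch (the test function and the correlation parameter theta are assumptions, not the paper's aerospike model):

```python
import numpy as np

# Six samples of a deterministic 1-D "computer experiment".
x = np.linspace(0.0, 1.0, 6)
y = np.sin(2 * np.pi * x) + x**2            # stand-in for an expensive analysis

# Second-order polynomial response surface, fitted by least squares.
coeffs = np.polyfit(x, y, deg=2)
def rs_predict(xn):
    return np.polyval(coeffs, xn)

# Kriging with a constant global model and a Gaussian correlation function.
theta = 30.0                                # assumed correlation parameter
def corr(a, b):
    return np.exp(-theta * (a[:, None] - b[None, :])**2)

R = corr(x, x) + 1e-10 * np.eye(len(x))     # tiny nugget for numerical stability
Rinv = np.linalg.inv(R)
one = np.ones(len(x))
beta = (one @ Rinv @ y) / (one @ Rinv @ one)    # generalized least squares trend

def krig_predict(xn):
    r = corr(np.atleast_1d(xn), x)          # correlations to the sample sites
    return beta + r @ Rinv @ (y - beta * one)

# The response surface smooths; kriging reproduces the samples exactly.
print(np.allclose(krig_predict(x), y, atol=1e-6),
      np.allclose(rs_predict(x), y, atol=1e-3))
```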

  15. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox , making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  16. Easy handling of tectonic data: the programs TectonicVB for Mac and TectonicsFP for Windows™

    NASA Astrophysics Data System (ADS)

    Ortner, Hugo; Reiter, Franz; Acs, Peter

    2002-12-01

    TectonicVB for Macintosh and TectonicsFP for Windows™ operating systems are two menu-driven computer programs which allow the shared use of data across these environments. The programs can produce stereographic plots of orientation data (great circles, poles, lineations). Frequently used statistical procedures such as calculation of eigenvalues and eigenvectors, and calculation of the mean vector with concentration parameters and confidence cone, can be easily performed. Fault data can be plotted in stereographic projection (Angelier and Hoeppener plots). Sorting of datasets into homogeneous subsets and rotation of tectonic data can be performed in interactive two-diagram windows. The paleostress tensor can be calculated from fault data sets using graphical (calculation of kinematic axes and right dihedra method) or mathematical methods (direct inversion or numerical dynamical analysis). The calculations can be checked in dimensionless Mohr diagrams and fluctuation histograms.
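    The mean-vector computation such programs automate is just vector summation of unit direction vectors. A small Python sketch with made-up azimuths:

```python
import math

# Mean direction and resultant length of directional data (toy azimuths
# in degrees, invented for illustration), via vector summation.
azimuths = [350.0, 10.0, 5.0, 0.0]

s = sum(math.sin(math.radians(a)) for a in azimuths)   # summed sine components
c = sum(math.cos(math.radians(a)) for a in azimuths)   # summed cosine components
mean_dir = math.degrees(math.atan2(s, c)) % 360.0      # mean direction, degrees
R = math.hypot(s, c) / len(azimuths)                   # concentration in [0, 1]

print(round(mean_dir, 1), round(R, 3))
```

    Note that a naive arithmetic mean of these azimuths (91.25°) would be badly wrong; the vector mean correctly lands near north.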

  17. Basic statistics with Microsoft Excel: a review.

    PubMed

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underpin the statistical concepts, particularly the mean, median and mode, along with frequency and frequency distribution as displayed in histograms and other graphical representations, and they drive the computational processes built on spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
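    The spreadsheet functions the review builds on (AVERAGE, MEDIAN, MODE) have direct analogues in most languages; for instance, in Python's standard library, with toy data:

```python
import statistics

# Toy data; the comments name the corresponding Excel worksheet function.
data = [4, 8, 6, 5, 3, 8, 7]

print(statistics.mean(data))     # Excel AVERAGE
print(statistics.median(data))   # Excel MEDIAN
print(statistics.mode(data))     # Excel MODE
```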

  18. Basic statistics with Microsoft Excel: a review

    PubMed Central

    Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-01-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underpin the statistical concepts, particularly the mean, median and mode, along with frequency and frequency distribution as displayed in histograms and other graphical representations, and they drive the computational processes built on spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel. PMID:28740690

  19. Report to the Nation on Crime and Justice and Technical Appendix. Second Edition.

    ERIC Educational Resources Information Center

    Department of Justice, Washington, DC. Bureau of Justice Statistics.

    This report on crime and justice aims to present statistical information in a format that can be easily understood by a nontechnical audience. It uses graphics and a nontechnical format to bring together data from the Bureau of Justice Statistics, the Federal Bureau of Investigation Uniform Crime Reports, the Bureau of the Census, the National…

  20. Including the Tukey Mean-Difference (Bland-Altman) Plot in a Statistics Course

    ERIC Educational Resources Information Center

    Kozak, Marcin; Wnuk, Agnieszka

    2014-01-01

    The Tukey mean-difference plot, also called the Bland-Altman plot, is a recognized graphical tool in the exploration of biometrical data. We show that this technique deserves a place on an introductory statistics course by encouraging students to think about the kind of graph they wish to create, rather than just creating the default graph for the…
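    The plot's ingredients are easy to compute by hand, which is part of its pedagogical appeal: plot the pairwise differences against the pairwise means, with horizontal lines at the mean difference and at the 95% limits of agreement. A short Python sketch with hypothetical paired measurements:

```python
import numpy as np

# Hypothetical paired measurements of the same quantity by two methods.
m1 = np.array([10.1, 12.3, 9.8, 11.5, 10.9, 13.0])
m2 = np.array([10.4, 12.0, 10.1, 11.9, 10.7, 13.4])

mean = (m1 + m2) / 2                  # x-axis of the plot
diff = m1 - m2                        # y-axis of the plot
bias = diff.mean()                    # average disagreement
loa = 1.96 * diff.std(ddof=1)         # 95% limits of agreement half-width

# A Bland-Altman plot draws diff vs. mean with horizontal lines at
# bias and bias ± loa; here we just report the summary numbers.
print(round(bias, 3), round(bias - loa, 3), round(bias + loa, 3))
```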

  1. Key Characteristics of Successful Science Learning: The Promise of Learning by Modelling

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton

    2015-01-01

    The basic premise underlying this research is that scientific phenomena are best learned by creating an external representation that complies with the complex and dynamic nature of such phenomena. Effective representations are assumed to incorporate three key characteristics: they are graphical, dynamic, and provide a pre-specified outline of the…

  2. NPV Sensitivity Analysis: A Dynamic Excel Approach

    ERIC Educational Resources Information Center

    Mangiero, George A.; Kraten, Michael

    2017-01-01

    Financial analysts generally create static formulas for the computation of NPV. When they do so, however, it is not readily apparent how sensitive the value of NPV is to changes in multiple interdependent and interrelated variables. It is the aim of this paper to analyze this variability by employing a dynamic, visually graphic presentation using…

  3. Exploring Classroom Interaction with Dynamic Social Network Analysis

    ERIC Educational Resources Information Center

    Bokhove, Christian

    2018-01-01

    This article reports on an exploratory project in which technology and dynamic social network analysis (SNA) are used for modelling classroom interaction. SNA focuses on the links between social actors, draws on graphic imagery to reveal and display the patterning of those links, and develops mathematical and computational models to describe and…

  4. Suggested Courseware for the Non-Calculus Physics Student: Projectile Motion, Circular Motion, Rotational Dynamics, and Statics.

    ERIC Educational Resources Information Center

    Mahoney, Joyce; And Others

    1988-01-01

    Evaluates 10 courseware packages covering topics for introductory physics. Discusses the price; sub-topics; program type; interaction; possible hardware; time; calculus required; graphics; and comments on each program. Recommends two packages in projectile and circular motion, and three packages in statics and rotational dynamics. (YP)

  5. 17 CFR 229.1121 - (Item 1121) Distribution and pool performance information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... may be found). Present statistical information in tabular or graphical format, if such presentation... realization rates. (9) Delinquency and loss information for the period. In addition, describe any material...

  6. Heterogeneity in homogeneous nucleation from billion-atom molecular dynamics simulation of solidification of pure metal.

    PubMed

    Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu

    2017-04-05

    Can completely homogeneous nucleation occur? Large scale molecular dynamics simulations performed on a graphics-processing-unit-rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process, which are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to the multi-graphics processing unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it is a long-standing issue whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.

  7. Non-convex Statistical Optimization for Sparse Tensor Graphical Model

    PubMed Central

    Sun, Wei; Wang, Zhaoran; Liu, Han; Cheng, Guang

    2016-01-01

    We consider the estimation of sparse graphical models that characterize the dependency structure of high-dimensional tensor-valued data. To facilitate the estimation of the precision matrix corresponding to each way of the tensor, we assume the data follow a tensor normal distribution whose covariance has a Kronecker product structure. The penalized maximum likelihood estimation of this model involves minimizing a non-convex objective function. In spite of the non-convexity of this estimation problem, we prove that an alternating minimization algorithm, which iteratively estimates each sparse precision matrix while fixing the others, attains an estimator with the optimal statistical rate of convergence as well as consistent graph recovery. Notably, such an estimator achieves estimation consistency with only one tensor sample, which is unobserved in previous work. Our theoretical results are backed by thorough numerical studies. PMID:28316459

  8. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.

  9. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings are present in the oral cavity, the appearance of metal-induced streak artefacts is unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, reconstruction of images with weak artefacts was attempted using projection data from an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were processed continuously in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest setting was designated. Finally, a general-purpose graphics processing unit machine was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. The ordered subset-expectation maximization and small region of interest settings reduced the processing duration without apparent detriment. A general-purpose graphics processing unit realized high performance. A statistical reconstruction method was applied for streak artefact reduction. The alternative algorithms applied were effective. Both software and hardware tools, such as ordered subset-expectation maximization, small region of interest and general-purpose graphics processing unit, achieved fast artefact correction.
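    The maximum likelihood-expectation maximization (ML-EM) update at the core of such reconstructions is compact: each pixel estimate is rescaled by back-projected ratios of measured to predicted projections. A toy Python sketch, where a tiny random nonnegative matrix stands in for the real CT system geometry (all sizes and values are assumptions for illustration, nothing like CT scale):

```python
import numpy as np

# Toy ML-EM iteration on a consistent, noise-free system.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(4, 3))   # 4 projection rays, 3 image pixels
x_true = np.array([1.0, 2.0, 0.5])
b = A @ x_true                           # simulated projection measurements

x = np.ones(3)                           # strictly positive initial image
sens = A.T @ np.ones(4)                  # sensitivity image (column sums of A)
for _ in range(2000):
    # back-project the measured/predicted ratio, then rescale the estimate;
    # the update is multiplicative, so the image stays nonnegative
    x = x / sens * (A.T @ (b / (A @ x)))

print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))  # residual shrinks toward 0
```

    Ordered subset-expectation maximization (OSEM), mentioned in the abstract, accelerates this by applying the same update to subsets of the rays in turn.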

  10. The statistical analysis of global climate change studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, J.W.

    1992-01-01

    The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data is obtained from the model to show graphically the relationship has not been undertaken. It is under this motivation that the author presents this problem. A second problem concerning the standard errors in estimating global temperatures is purely statistical in nature, although very little material exists for sampling on such a frame. This problem not only has climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.

  11. Gender differences in joint biomechanics during walking: normative study in young adults.

    PubMed

    Kerrigan, D C; Todd, M K; Della Croce, U

    1998-01-01

    The effect of gender on specific joint biomechanics during gait has been largely unexplored. Given the perceived, subjective, and temporal differences in walking between genders, we hypothesized that quantitative analysis would reveal specific gender differences in joint biomechanics as well. Sagittal kinematic (joint motion) and kinetic (joint torque and power) data from the lower limbs during walking were collected and analyzed in 99 young adult subjects (49 females), aged 20 to 40 years, using an optoelectronic motion analysis and force platform system. Kinetic data were normalized for both height and weight. Female and male data were compared graphically and statistically to assess differences in all major peak joint kinematic and kinetic values. Females had significantly greater hip flexion and less knee extension before initial contact, greater knee flexion moment in pre-swing, and greater peak mechanical joint power absorption at the knee in pre-swing (P < 0.0019 for each parameter). Other differences were noted (P < 0.05) that were not statistically significant when accounting for multiple comparisons. These gender differences may provide new insights into walking dynamics and may be important for both clinical and research studies in motivating the development of separate biomechanical reference databases for males and females.
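    The quoted per-parameter threshold of P < 0.0019 is consistent with a Bonferroni-style adjustment of alpha = 0.05 over roughly 26 comparisons, though that count is an assumption here, not stated in the abstract. The arithmetic is one line:

```python
# Bonferroni-style threshold sketch: with m comparisons, each individual
# test is judged at alpha/m. The value m = 26 and the p-values below are
# assumptions for illustration, not the study's actual figures.
alpha, m = 0.05, 26
threshold = alpha / m                      # ~0.0019

p_values = [0.0005, 0.004, 0.03, 0.2]      # hypothetical per-parameter p-values
significant = [p for p in p_values if p < threshold]
print(threshold, significant)
```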

  12. Data Images and Other Graphical Displays for Directional Data

    NASA Technical Reports Server (NTRS)

    Morphet, Bill; Symanzik, Juergen

    2005-01-01

    Vectors, axes, and periodic phenomena have direction. Directional variation can be expressed as points on a unit circle and is the subject of circular statistics, a relatively new application of statistics. An overview of existing methods for the display of directional data is given. The data image for linear variables is reviewed, then extended to directional variables by displaying direction using a color scale composed of a sequence of four or more color gradients with continuity between sequences, ordered intuitively in a color wheel such that the color of the 0° angle is the same as the color of the 360° angle. Cross-over, which arose in automating the summarization of historical wind data, and color discontinuity, resulting from the use of a single color gradient in computational fluid dynamics visualization, are eliminated. The new method provides for simultaneous resolution of detail on a small scale and overall structure on a large scale. Example circular data images are given of a global view of average wind direction of El Niño periods, computed rocket motor internal combustion flow, a global view of the direction of the horizontal component of Earth's main magnetic field on 9/15/2004, and Space Shuttle solid rocket motor nozzle vectoring.
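    The key property of the color-wheel encoding is continuity at the wrap-around point. A minimal single-wheel sketch in Python using HSV hue (the article's scheme uses four or more color gradients; this reduced version only shows the wrap-around idea):

```python
import colorsys

# Encode direction as hue on a color wheel, so 0° and 360° map to the
# same color and there is no discontinuity at the wrap-around.
def direction_to_rgb(angle_deg):
    hue = (angle_deg % 360.0) / 360.0        # wraps: 0° and 360° coincide
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(direction_to_rgb(0.0) == direction_to_rgb(360.0))
```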

  13. Applications of Remote Sensing and GIS(Geographic Information System) in Crime Analysis of Gujranwala City.

    NASA Astrophysics Data System (ADS)

    Munawar, Iqra

    2016-07-01

    Crime mapping is a dynamic process. It can be used to assist all stages of the problem-solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of a map, the audience or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have been previously undetected. The main objective of this research was to study the spatial distribution patterns of the four common crimes, i.e. narcotics, arms, burglary and robbery, in Gujranwala City using spatial statistical techniques to identify the hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns and trends, thus creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps in the reduction of crime rate.
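    The Getis-Ord Gi* statistic used for the hot-spot detection is a z-score comparing the weighted local sum at a location to its expectation under spatial randomness. A toy Python sketch for a single focal location (the counts and the binary neighbourhood, which for Gi* includes the focal cell itself, are invented, not the Gujranwala data):

```python
import numpy as np

# Toy crime counts over 8 cells; the focal cell (index 3) and its two
# neighbours (indices 2 and 4) get binary weight 1, everything else 0.
x = np.array([2.0, 3.0, 10.0, 12.0, 11.0, 1.0, 2.0, 3.0])
w = np.array([0, 0, 1, 1, 1, 0, 0, 0], dtype=float)

n = len(x)
xbar = x.mean()
s = np.sqrt((x**2).mean() - xbar**2)                 # population std. dev.
num = w @ x - xbar * w.sum()                          # local sum minus expectation
den = s * np.sqrt((n * (w**2).sum() - w.sum()**2) / (n - 1))
gi_star = num / den

print(gi_star > 1.96)      # large positive z-score → statistical hot spot
```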

  14. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and available as open-source from http://www.narrator-tool.org.

  15. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and available as open-source from . PMID:17389034

  16. Phase Shadows: An Enhanced Representation of Nonlinear Dynamic Systems

    NASA Astrophysics Data System (ADS)

    Luque, Amalia; Barbancho, Julio; Cañete, Javier Fernández; Córdoba, Antonio

    2017-12-01

    Many nonlinear dynamic systems have a rotating behavior where an angle defining its state may extend to more than 360°. In these cases the use of the phase portrait does not properly depict the system’s evolution. Normalized phase portraits or cylindrical phase portraits have been extensively used to overcome the original phase portrait’s disadvantages. In this research a new graphic representation is introduced: the phase shadow. Its use clearly reveals the system behavior while overcoming the drawbacks of the existing plots. Through the paper the method to obtain the graphic is stated. Additionally, to show the phase shadow’s expressiveness, a rotating pendulum is considered. The work shows that the new graph is an enhanced representational tool for systems having equilibrium points, limit cycles, chaotic attractors and/or bifurcations.

  17. Method and System for Air Traffic Rerouting for Airspace Constraint Resolution

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz (Inventor); Morando, Alexander R. (Inventor); Sheth, Kapil S. (Inventor); McNally, B. David (Inventor); Clymer, Alexis A. (Inventor); Shih, Fu-tai (Inventor)

    2017-01-01

    A dynamic constraint avoidance route system automatically analyzes routes of aircraft flying, or to be flown, in or near constraint regions and attempts to find more time and fuel efficient reroutes around current and predicted constraints. The dynamic constraint avoidance route system continuously analyzes all flight routes and provides reroute advisories that are dynamically updated in real time. The dynamic constraint avoidance route system includes a graphical user interface that allows users to visualize, evaluate, modify if necessary, and implement proposed reroutes.

  18. Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams

    NASA Astrophysics Data System (ADS)

    Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.

    2017-07-01

    This study presents an extensive statistical investigation of variations in the flexural fatigue life of self-compacting fibrous concrete (FC) beams. For this purpose, the experimental data of earlier researchers were examined using the two-parameter Weibull distribution. Two methods, namely the graphical method and the method of moments, were used to analyse the variations in the experimental data, and the results have been presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are precise. At a stress level of 0.7, the fatigue life is 59,861 cycles for a reliability of 90%.
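    Once the two Weibull parameters are in hand, the life at any survival probability follows from inverting the survivor function. A Python sketch with hypothetical shape and scale values (not the study's fitted parameters):

```python
import math

# Fatigue life at a given reliability from a two-parameter Weibull model.
# Survival R(N) = exp(-(N/u)**alpha)  =>  N = u * (-ln R)**(1/alpha).
alpha, u = 2.0, 150000.0          # assumed shape and characteristic life

def life_at_reliability(R):
    return u * (-math.log(R)) ** (1.0 / alpha)

n90 = life_at_reliability(0.90)   # cycles survivable with 90% reliability
print(n90 < life_at_reliability(0.50))   # higher reliability → fewer cycles
```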

  19. Graphical aids for visualizing and interpreting patterns in departures from agreement in ordinal categorical observer agreement data.

    PubMed

    Bangdiwala, Shrikant I

    2017-01-01

    When studying the agreement between two observers rating the same n units into the same k discrete ordinal categories, Bangdiwala (1985) proposed using the "agreement chart" to visually assess agreement. This article proposes that often it is more interesting to focus on the patterns of disagreement and visually understanding the departures from perfect agreement. The article reviews the use of graphical techniques for descriptively assessing agreement and disagreements, and also reviews some of the available summary statistics that quantify such relationships.
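    One of the summary statistics reviewed alongside the agreement chart is Cohen's kappa, which contrasts observed with chance agreement on the k x k cross-classification. A toy Python computation (the 3-category table is invented):

```python
import numpy as np

# Toy 3 x 3 cross-classification of n units rated by two observers;
# rows are observer 1's categories, columns are observer 2's.
table = np.array([[20,  5,  0],
                  [ 4, 15,  6],
                  [ 1,  3, 16]], dtype=float)

n = table.sum()
po = np.trace(table) / n                     # observed agreement (diagonal)
pe = (table.sum(0) @ table.sum(1)) / n**2    # agreement expected by chance
kappa = (po - pe) / (1 - pe)                 # chance-corrected agreement

print(round(kappa, 3))
```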

  20. Computational tools for multi-linked flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.

    1990-01-01

    A software module which designs and tests controllers and filters in Kalman Estimator form, based on a polynomial state-space model is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided which address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.

  1. Method and System for Dynamic Automated Corrections to Weather Avoidance Routes for Aircraft in En Route Airspace

    NASA Technical Reports Server (NTRS)

    McNally, B. David (Inventor); Erzberger, Heinz (Inventor); Sheth, Kapil (Inventor)

    2015-01-01

    A dynamic weather route system automatically analyzes routes for in-flight aircraft flying in convective weather regions and attempts to find more time and fuel efficient reroutes around current and predicted weather cells. The dynamic weather route system continuously analyzes all flights and provides reroute advisories that are dynamically updated in real time while the aircraft are in flight. The dynamic weather route system includes a graphical user interface that allows users to visualize, evaluate, modify if necessary, and implement proposed reroutes.

  2. On the Statistical Analysis of the Radar Signature of the MQM-34D

    DTIC Science & Technology

    1975-01-31

    target drone for aspect angles near normal to the roll axis for a vertically polarized measurement system. The radar cross section and glint are... drone. The raw data from RATSCAT are reported in graphical form in an AFSWC three-volume report. The results reported here are a statistical analysis of... Target Drones, AFSWC-TR-74-01, January 1974. 2James W. Wright, On the Statistical Analysis of the Radar Signature of the MQM-34D, Interim Report

  3. Dynamic assessment of microbial ecology (DAME): a web app for interactive analysis and visualization of microbial sequencing data.

    PubMed

    Piccolo, Brian D; Wankhade, Umesh D; Chintapalli, Sree V; Bhattacharyya, Sudeepa; Chunqiao, Luo; Shankar, Kartik

    2018-03-15

    Dynamic assessment of microbial ecology (DAME) is a Shiny-based web application for interactive analysis and visualization of microbial sequencing data. DAME provides researchers not familiar with R programming the ability to access the most current R functions utilized for ecology and gene sequencing data analyses. Currently, DAME supports group comparisons of several ecological estimates of α-diversity and β-diversity, along with differential abundance analysis of individual taxa. Using the Shiny framework, the user has complete control of all aspects of the data analysis, including sample/experimental group selection and filtering, estimate selection, statistical methods and visualization parameters. Furthermore, graphical and tabular outputs are supported by R packages using D3.js and are fully interactive. DAME was implemented in R but can be modified by Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), and JavaScript. It is freely available on the web at https://acnc-shinyapps.shinyapps.io/DAME/. Local installation and source code are available through Github (https://github.com/bdpiccolo/ACNC-DAME). Any system with R can launch DAME locally provided the shiny package is installed. bdpiccolo@uams.edu.

  4. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single-particle level in bulk due to a lack of proper analytic tools. Here we present a global spectral phasor analysis method which allows investigation of NR spectra at the single-particle level, with their statistical behavior and spatial information, during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in the corresponding spatial images with high spectral resolution. Furthermore, the spectral behavior of NRs under different imaging conditions, e.g. different excitation powers and wavelengths, was carefully examined by our laser-scanning multiphoton microscope with spectral imaging capability. Our results prove that the spectral phasor method is an easy and efficient tool in hyperspectral imaging analysis to unravel subtle changes in the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
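The spectral phasor transform maps each pixel's emission spectrum to a single point (g, s) given by the first Fourier coefficients of the spectrum. A minimal sketch, assuming equally spaced spectral channels and the first harmonic (our implementation, not the authors' code):

```python
import math

def spectral_phasor(intensities, harmonic=1):
    """Map an emission spectrum (one intensity per spectral channel)
    to its (g, s) coordinates on the spectral phasor plot."""
    n = len(intensities)
    total = sum(intensities)
    g = sum(I * math.cos(2 * math.pi * harmonic * i / n)
            for i, I in enumerate(intensities)) / total
    s = sum(I * math.sin(2 * math.pi * harmonic * i / n)
            for i, I in enumerate(intensities)) / total
    return g, s
```

A narrow spectrum maps near the unit circle at a phase set by its peak wavelength, so particle-to-particle spectral shifts appear directly as a spread of phasor points.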

  5. Cartographic symbol library considering symbol relations based on anti-aliasing graphic library

    NASA Astrophysics Data System (ADS)

    Mei, Yang; Li, Lin

    2007-06-01

    Cartographic visualization represents geographic information in map form, which enables us to retrieve useful geospatial information. In a digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of a Geographic Information System as well. Existing cartographic symbol libraries have two flaws: one is display quality and the other is relation adjusting. Statistical data presented in this paper indicate that the aliasing problem is a major factor in symbol display quality on graphic display devices. So, effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, in addition to displaying individual features. But current cartographic symbol libraries don't have this capability. This paper creates a cartographic symbol design model to implement symbol-relation adjusting. Consequently, a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. The anti-aliasing graphic library and the cartographic symbol library are sampled, and the results prove that both libraries offer better efficiency and display quality.

  6. Struggling readers learning with graphic-rich digital science text: Effects of a Highlight & Animate Feature and Manipulable Graphics

    NASA Astrophysics Data System (ADS)

    Defrance, Nancy L.

    Technology offers promise of 'leveling the playing field' for struggling readers. That is, instructional support features within digital texts may enable all readers to learn. This quasi-experimental study examined the effects on learning of two support features, which offered unique opportunities to interact with text. The Highlight & Animate Feature highlighted an important idea in prose, while simultaneously animating its representation in an adjacent graphic. It invited readers to integrate ideas depicted in graphics and prose, using each one to interpret the other. The Manipulable Graphics had parts that the reader could operate to discover relationships among phenomena. It invited readers to test or refine the ideas that they brought to, or gleaned from, the text. Use of these support features was compulsory. Twenty fifth-grade struggling readers read a graphic-rich digital science text in a clinical interview setting, under one of two conditions: using either the Highlight & Animate Feature or the Manipulable Graphics. Participants in both conditions made statistically significant gains on a multiple-choice measure of knowledge of the topic of the text. While there were no significant differences by condition in the amount of knowledge gained, there were significant differences in the quality of knowledge expressed. Transcripts revealed that understandings about light and vision, expressed by those who used the Highlight & Animate Feature, were more often conceptually and linguistically 'complete.' That is, their understandings included both a description of phenomena as well as an explanation of underlying scientific principles, which participants articulated using the vocabulary of the text. This finding may be attributed to the multiple opportunities to integrate graphics (depicting the behavior of phenomena) and prose (providing the scientific explanation of those phenomena), which characterized the Highlight & Animate Condition. 
Those who used the Manipulable Graphics were more likely to express complete understandings when they were able to structure a systematic investigation of the graphic and when the graphic was designed to confront their own naive conceptions about light and vision. The Manipulable Graphics also provided a foothold for those who entered the study with very little prior knowledge of the topic.

  7. Combining Graphical and Analytical Methods with Molecular Simulations To Analyze Time-Resolved FRET Measurements of Labeled Macromolecules Accurately

    PubMed Central

    2017-01-01

    Förster resonance energy transfer (FRET) measurements from a donor, D, to an acceptor, A, fluorophore are frequently used in vitro and in live cells to reveal information on the structure and dynamics of DA labeled macromolecules. Accurate descriptions of FRET measurements by molecular models are complicated because the fluorophores are usually coupled to the macromolecule via flexible long linkers allowing for diffusional exchange between multiple states with different fluorescence properties caused by distinct environmental quenching, dye mobilities, and variable DA distances. It is often assumed for the analysis of fluorescence intensity decays that DA distances and D quenching are uncorrelated (homogeneous quenching by FRET) and that the exchange between distinct fluorophore states is slow (quasistatic). This allows us to introduce the FRET-induced donor decay, εD(t), a function solely depending on the species fraction distribution of the rate constants of energy transfer by FRET, for a convenient joint analysis of fluorescence decays of FRET and reference samples by integrated graphical and analytical procedures. Additionally, we developed a simulation toolkit to model dye diffusion, fluorescence quenching by the protein surface, and FRET. A benchmark study with simulated fluorescence decays of 500 protein structures demonstrates that the quasistatic homogeneous model works very well and recovers for single conformations the average DA distances with an accuracy of < 2%. For more complex cases, where proteins adopt multiple conformations with significantly different dye environments (heterogeneous case), we introduce a general analysis framework and evaluate its power in resolving heterogeneities in DA distances. 
The developed fast simulation methods, relying on Brownian dynamics of a coarse-grained dye in its sterically accessible volume, allow us to incorporate structural information in the decay analysis for heterogeneous cases by relating dye states with protein conformations to pave the way for fluorescence and FRET-based dynamic structural biology. Finally, we present theories and simulations to assess the accuracy and precision of steady-state and time-resolved FRET measurements in resolving DA distances on the single-molecule and ensemble level and provide a rigorous framework for estimating approximation, systematic, and statistical errors. PMID:28709377

  8. Generalizing Experimental Findings

    DTIC Science & Technology

    2015-06-01

    ity.” In graphical terms, these assumptions may require several d-separation tests on several sub-graphs. It is utterly unimaginable therefore that... Figure 1: (a) A transportability model in which a post-treatment variable Z is S-admissible...observational studies to estimate population treatment effects. Journal Royal Statistical Society: Series A (Statistics in Society) Forthcoming, doi

  9. Statistical, graphical, and trend summaries of selected water-quality and streamflow data from the Trinity River near Crockett, Texas, 1964-85

    USGS Publications Warehouse

    Goss, Richard L.

    1987-01-01

    As part of the statistical summaries, trend tests were conducted. Several small uptrends were detected for total nitrogen, total organic nitrogen, total ammonia nitrogen, total nitrite nitrogen, total nitrate nitrogen, total organic plus ammonia nitrogen, total nitrite plus nitrate nitrogen, and total phosphorus. Small downtrends were detected for biochemical oxygen demand and dissolved magnesium.

  10. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
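permGPU itself is a CUDA implementation, but the underlying permutation-resampling logic can be illustrated with a plain-Python sketch of a two-sample permutation test (hypothetical function name; difference-of-means statistic stands in for the test statistics the package offers):

```python
import random

def permutation_pvalue(values, labels, n_perm=2000, seed=0):
    """Two-sample permutation test: p-value for the absolute difference of
    group means, under random relabeling of the samples."""
    rng = random.Random(seed)

    def stat(lab):
        g0 = [v for v, l in zip(values, lab) if l == 0]
        g1 = [v for v, l in zip(values, lab) if l == 1]
        return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

    observed = stat(labels)
    lab = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(lab)               # relabel, preserving group sizes
        if stat(lab) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction avoids p = 0
```

Each permutation is independent of the others, which is exactly what makes the problem "embarrassingly parallel" and a natural fit for GPUs.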

  11. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    PubMed

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
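The fixed-effect and random-effects computations behind such a spreadsheet reduce to inverse-variance weighting. A minimal sketch (our implementation, using the DerSimonian-Laird estimator for the between-study variance; not the authors' spreadsheet formulas):

```python
def fixed_effect(effects, variances):
    """Inverse-variance pooled estimate and its variance (fixed-effect model)."""
    w = [1.0 / v for v in variances]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return est, 1.0 / sum(w)

def dersimonian_laird_tau2(effects, variances):
    """Between-study variance tau^2 via the DerSimonian-Laird moment estimator.
    Adding tau^2 to each study variance yields random-effects weights."""
    w = [1.0 / v for v in variances]
    est, _ = fixed_effect(effects, variances)
    q = sum(wi * (e - est) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - df) / c)                              # truncated at 0
```

A forest plot then simply draws each study effect with its confidence interval plus a diamond for the pooled estimate, which is why a charting tool like Excel suffices.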

  12. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    PubMed Central

    2012-01-01

    Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277

  13. Solving Differential Equations in R

    EPA Science Inventory

    Although R is still predominantly applied for statistical analysis and graphical representation, it is rapidly becoming more suitable for mathematical computing. One of the fields where considerable progress has been made recently is the solution of differential equations. Here w...

  14. SEDPAK—A comprehensive operational system and data-processing package in APPLESOFT BASIC for a settling tube, sediment analyzer

    NASA Astrophysics Data System (ADS)

    Goldbery, R.; Tehori, O.

    SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with Apple DOS 3.3. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features of SEDPAK include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions, and cumulative log/probability curves. The program also has a module for processing grain-size frequency data from sieved samples. An additional feature of SEDPAK is the option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.
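The "statistical grain-size parameters" such packages compute are typically the Folk & Ward graphical measures. A sketch in Python (not the original APPLESOFT BASIC), assuming the phi-scale percentiles have already been read off the cumulative log/probability curve:

```python
def folk_ward(phi):
    """Folk & Ward graphical grain-size statistics from phi percentiles.
    `phi` maps percentile -> phi value, e.g. {5: -2.0, 16: -1.0, ...}."""
    mean = (phi[16] + phi[50] + phi[84]) / 3.0
    sorting = (phi[84] - phi[16]) / 4.0 + (phi[95] - phi[5]) / 6.6
    skewness = ((phi[16] + phi[84] - 2.0 * phi[50]) / (2.0 * (phi[84] - phi[16]))
                + (phi[5] + phi[95] - 2.0 * phi[50]) / (2.0 * (phi[95] - phi[5])))
    return mean, sorting, skewness
```

For a perfectly symmetric distribution the skewness is zero; coarser or finer tails pull it negative or positive on the phi scale.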

  15. Summary of water-surface-elevation data for 116 U.S. Geological Survey lake and reservoir stations in Texas and comparison to data for water year 2006

    USGS Publications Warehouse

    Asquith, William H.; Vrabel, Joseph; Roussel, Meghan C.

    2007-01-01

    The U.S. Geological Survey (USGS), in cooperation with numerous Federal, State, municipal, and local agencies, currently (2007) collects data for more than 120 lakes and reservoirs in Texas through a realtime, data-collection network. The National Water Information System that processes and archives water-resources data for the Nation provides a central source for retrieval of real-time as well as historical data. This report provides a brief description of the real-time, data-collection network and graphically summarizes the period-of-record daily mean water-surface elevations for 116 active and discontinued USGS lake and reservoir stations in Texas. The report also graphically depicts selected statistics (minimum, maximum, and mean) of daily mean water-surface-elevation data. The data for water year 2006 are compared to the selected statistics.

  16. Interactive application of quadratic expansion of chi-square statistic to nonlinear curve fitting

    NASA Technical Reports Server (NTRS)

    Badavi, F. F.; Everhart, Joel L.

    1987-01-01

    This report contains a detailed theoretical description of an all-purpose, interactive curve-fitting routine that is based on P. R. Bevington's description of the quadratic expansion of the Chi-Square statistic. The method is implemented in the associated interactive, graphics-based computer program. Taylor's expansion of Chi-Square is first introduced, and justifications for retaining only the first term are presented. From the expansion, a set of n simultaneous linear equations is derived, then solved by matrix algebra. A brief description of the code is presented along with a limited number of changes that are required to customize the program for a particular task. To evaluate the performance of the method and the goodness of nonlinear curve fitting, two typical engineering problems are examined and the graphical and tabular output of each is discussed. A complete listing of the entire package is included as an appendix.
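The quadratic expansion of chi-square leads, after dropping higher-order terms, to exactly the set of simultaneous linear (normal) equations the report describes, solved by matrix algebra at each iteration. A compact Python sketch of that scheme (our naming; a Gauss-Newton-style realization, not the report's code):

```python
def solve(a, b):
    """Solve the linear system a x = b by Gaussian elimination with pivoting."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[piv] = m[piv], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n + 1):
                m[r][c] -= f * m[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (m[i][n] - sum(m[i][c] * x[c] for c in range(i + 1, n))) / m[i][i]
    return x

def fit(f, jac, p0, xs, ys, sigmas, iters=10):
    """Minimize chi-square by repeatedly solving the normal equations obtained
    from its quadratic (first-order Taylor) expansion about the current p."""
    p = list(p0)
    n = len(p)
    for _ in range(iters):
        alpha = [[0.0] * n for _ in range(n)]   # curvature matrix
        beta = [0.0] * n                        # -(1/2) gradient of chi-square
        for x, y, s in zip(xs, ys, sigmas):
            resid = (y - f(x, p)) / s ** 2
            d = jac(x, p)                       # model derivatives w.r.t. p
            for j in range(n):
                beta[j] += resid * d[j]
                for k in range(n):
                    alpha[j][k] += d[j] * d[k] / s ** 2
        delta = solve(alpha, beta)
        p = [pi + di for pi, di in zip(p, delta)]
    return p
```

For a model linear in its parameters, one iteration already lands on the least-squares solution; nonlinear models converge over several iterations from a reasonable starting guess.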

  17. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities, and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model, and interpret its results.

  18. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities, and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model, and interpret its results. PMID:24098547

  19. Thoth: Software for data visualization & statistics

    NASA Astrophysics Data System (ADS)

    Laher, R. R.

    2016-10-01

    Thoth is a standalone software application with a graphical user interface for making it easy to query, display, visualize, and analyze tabular data stored in relational databases and data files. From imported data tables, it can create pie charts, bar charts, scatter plots, and many other kinds of data graphs with simple menus and mouse clicks (no programming required), by leveraging the open-source JFreeChart library. It also computes useful table-column data statistics. A mature tool that has undergone development and testing over several years, it is written in the Java computer language, and hence can be run on any computing platform that has a Java Virtual Machine and graphical-display capability. It can be downloaded and used by anyone free of charge, and has general applicability in science, engineering, medical, business, and other fields. Special tools and features for common tasks in astronomy and astrophysical research are included in the software.

  20. LiGRO: a graphical user interface for protein-ligand molecular dynamics.

    PubMed

    Kagami, Luciano Porto; das Neves, Gustavo Machado; da Silva, Alan Wilter Sousa; Caceres, Rafael Andrade; Kawano, Daniel Fábio; Eifler-Lima, Vera Lucia

    2017-10-04

    To speed up the drug-discovery process, molecular dynamics (MD) calculations performed in GROMACS can be coupled to docking simulations for the post-screening analyses of large compound libraries. This requires generating the topology of the ligands in different software, some basic knowledge of Linux command lines, and a certain familiarity in handling the output files. LiGRO, the Python-based graphical interface introduced here, was designed to overcome these protein-ligand parameterization challenges by allowing the graphical (non-command-line) control of GROMACS (MD and analysis), ACPYPE (ligand topology builder) and PLIP (protein-binder interactions monitor), programs that can be used together to fully perform and analyze the outputs of complex MD simulations (including energy minimization and NVT/NPT equilibration). By allowing the calculation of linear interaction energies in a simple and quick fashion, LiGRO can be used in the drug-discovery pipeline to select compounds with a better protein-binding interaction profile. The design of LiGRO allows researchers to freely download and modify the software, with the source code being available under the terms of a GPLv3 license from http://www.ufrgs.br/lasomfarmacia/ligro/ .

  1. ToxPi Graphical User Interface 2.0: Dynamic exploration, visualization, and sharing of integrated data models.

    PubMed

    Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M

    2018-03-05

    Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework that was developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide functionality, while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org .

  2. Operator-assisted planning and execution of proximity operations subject to operational constraints

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Ellis, Stephen R.

    1991-01-01

    Future multi-vehicle operations will involve multiple scenarios that will require a planning tool for the rapid, interactive creation of fuel-efficient trajectories. The planning process must deal with higher-order, non-linear processes involving dynamics that are often counter-intuitive. The optimization of resulting trajectories can be difficult to envision. An interactive proximity operations planning system is being developed to provide the operator with easily interpreted visual feedback of trajectories and constraints. This system is hosted on an IRIS 4D graphics platform and utilizes the Clohessy-Wiltshire equations. An inverse dynamics algorithm is used to remove non-linearities while the trajectory maneuvers are decoupled and separated in a geometric spreadsheet. The operator has direct control of the position and time of trajectory waypoints to achieve the desired end conditions. Graphics provide the operator with visualization of satisfying operational constraints such as structural clearance, plume impingement, approach velocity limits, and arrival or departure corridors. Primer vector theory is combined with graphical presentation to improve operator understanding of suggested automated system solutions and to allow the operator to review, edit, or provide corrective action to the trajectory plan.
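The Clohessy-Wiltshire equations mentioned above have a closed-form solution for the relative state of a deputy spacecraft about a chief in a circular orbit. A sketch of the standard textbook propagation (our naming; x radial, y along-track, z cross-track, n the chief's mean motion):

```python
import math

def cw_state(t, n, r0, v0):
    """Propagate a relative state (r0, v0) for time t under the
    Clohessy-Wiltshire (Hill's) equations. n is in rad/s."""
    x0, y0, z0 = r0
    vx0, vy0, vz0 = v0
    s, c = math.sin(n * t), math.cos(n * t)
    # position
    x = (4 - 3 * c) * x0 + s / n * vx0 + 2 / n * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + 2 / n * (c - 1) * vx0 \
        + (4 * s - 3 * n * t) / n * vy0
    z = c * z0 + s / n * vz0
    # velocity
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    vz = -n * s * z0 + c * vz0
    return (x, y, z), (vx, vy, vz)
```

A deputy placed purely along-track with zero relative velocity stays fixed relative to the chief (the leader-follower configuration), while any radial offset induces the characteristic along-track drift, which is why waypoint planning in this frame is counter-intuitive.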

  3. Using graph-based assessments within socratic tutorials to reveal and refine students' analytical thinking about molecular networks.

    PubMed

    Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W

    2012-01-01

    Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks we used graphical responses, since they are relatively easy to evaluate for implied, but unarticulated, assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations, and their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial built around leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.

  4. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  5. Distributed 3D Information Visualization - Towards Integration of the Dynamic 3D Graphics and Web Services

    NASA Astrophysics Data System (ADS)

    Vucinic, Dean; Deen, Danny; Oanta, Emil; Batarilo, Zvonimir; Lacor, Chris

    This paper focuses on the visualization and manipulation of graphical content in distributed network environments. The graphical middleware and 3D desktop prototypes we developed were specialized for situational awareness. This research was done in the LArge Scale COllaborative decision support Technology (LASCOT) project, which explored and combined software technologies to support a human-centred decision support system for crisis management (earthquake, tsunami, flooding, airplane or oil-tanker incidents, chemical, radioactive or other pollutant spreading, etc.). The state-of-the-art review we performed did not identify any publicly available large-scale distributed application of this kind; existing proprietary solutions rely on conventional technologies and 2D representations. Our challenge was to apply the "latest" available technologies, such as Java3D, X3D and SOAP, compatible with average computer graphics hardware. The selected technologies are integrated, and we demonstrate the flow of data originating from heterogeneous data sources, interoperability across different operating systems, and 3D visual representations that enhance end-user interactions.

  6. Graphical explanation in an expert system for Space Station Freedom rack integration

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Purves, B.

    1990-01-01

    The rationale and methodology used to incorporate graphics into explanations provided by an expert system for Space Station Freedom rack integration is examined. The rack integration task is typical of a class of constraint satisfaction problems for large programs where expertise from several areas is required. Graphically oriented approaches are used to explain the conclusions made by the system, the knowledge base content, and, at more abstract levels, the control strategies employed by the system. The implemented architecture combines hypermedia and inference engine capabilities. The advantages of this architecture include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. The graphical techniques employed range from simple static presentation of schematics to dynamic creation of a series of pictures presented motion-picture style. User models control the type, amount, and order of information presented.

  7. Data visualization, bar naked: A free tool for creating interactive graphics.

    PubMed

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
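    The core problem this abstract describes, many distributions collapsing onto one bar, is easy to demonstrate. The snippet below is a minimal sketch with invented numbers, unrelated to the tool's own code: two samples produce identical bars of height 5.5 even though their distributions differ radically, which is exactly what a univariate scatterplot of the individual points would reveal.

```python
import statistics

# Hypothetical samples: same mean and sample size, very different shapes.
group_a = [4, 5, 5, 6, 6, 7]      # clustered near the mean
group_b = [1, 1, 2, 9, 10, 10]    # bimodal; no value is near the mean

# A bar graph of the means would look identical for both groups.
print(statistics.mean(group_a), statistics.mean(group_b))  # → 5.5 5.5
```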

  8. Web-GIS-based SARS epidemic situation visualization

    NASA Astrophysics Data System (ADS)

    Lu, Xiaolin

    2004-03-01

    In order to research, statistically analyze, and broadcast information on the SARS epidemic situation according to its spatial position, this paper proposes a unified global visualization information platform for the SARS epidemic situation based on Web-GIS and scientific visualization technology. The platform adopts the architecture of a Web-GIS-based interoperable information system, enabling the public to report SARS virus information to health care centers visually using web visualization technology. A GIS Java applet is used to visualize the relationship between spatial graphical data and virus distribution, and other web-based graphics, such as curves, bars, maps and multi-dimensional figures, are used to visualize how the SARS virus trend varies with time, patient numbers or locations. The platform is designed to display SARS information in real time, visually simulate the real epidemic situation, and offer analysis tools for health departments and policy-making government departments to support decision-making in preventing the SARS epidemic. It could be used to analyze the virus situation through a visual graphics interface, isolate the areas of virus sources, and control the virus situation within the shortest time. It could be applied to SARS-prevention systems for information broadcasting, data management, statistical analysis, and decision support.

  9. A Correlational Study of Graphic Organizers and Science Achievement of English Language Learners

    NASA Astrophysics Data System (ADS)

    Clarke, William Gordon

    English language learners (ELLs) demonstrate lower academic performance and have lower graduation and higher dropout rates than their non-ELL peers. The primary purpose of this correlational quantitative study was to investigate the relationship between the use of graphic organizer-infused science instruction and science learning of high school ELLs. Another objective was to determine if the method of instruction, socioeconomic status (SES), gender, and English language proficiency (ELP) were predictors of academic achievement of high school ELLs. Data were gathered from a New York City (NYC) high school fall 2012-2013 archival records of 145 ninth-grade ELLs who had received biology instruction in freestanding English as a second language (ESL) classes, followed by a test of their learning of the material. Fifty-four (37.2%) of these records were of students who had learned science by the conventional textbook method, and 91 (62.8%) by using graphic organizers. Data analysis employed the Statistical Package for the Social Sciences (SPSS) software for multiple regression analysis, which found graphic organizer use to be a significant predictor of New York State Regents Living Environment (NYSRLE) test scores (p < .01). One significant regression model was returned whereby, when combined, the four predictor variables (method of instruction, SES, gender, and ELP) explained 36% of the variance of the NYSRLE score. Implications of the study findings noted graphic organizer use as advantageous for ELL science achievement. Recommendations made for practice were for (a) the adoption of graphic organizer infused-instruction, (b) establishment of a protocol for the implementation of graphic organizer-infused instruction, and (c) increased length of graphic organizer instructional time. 
Recommendations made for future research were (a) a replication quantitative correlational study in two or more high schools, (b) a quasi-experimental quantitative study to determine the influence of graphic organizer instructional intervention on ELL science achievement, (c) a quasi-experimental study to determine the effect of teacher-based factors on graphic organizer-infused instruction, and (d) a causal-comparative study to determine the efficacy of graphic organizer use in testing modifications for high school ELL science.

  10. Transportable Applications Environment (TAE) Plus: A NASA tool used to develop and manage graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1992-01-01

    The Transportable Applications Environment (TAE) Plus was built to support the construction of graphical user interfaces (GUIs) for highly interactive applications, such as real-time processing systems and scientific analysis systems. It is a general purpose portable tool that includes a 'What You See Is What You Get' WorkBench that allows user interface designers to lay out and manipulate windows and interaction objects. The WorkBench includes both user entry objects (e.g., radio buttons, menus) and data-driven objects (e.g., dials, gauges, stripcharts), which dynamically change based on values of real-time data. Discussed here is what TAE Plus provides, how the implementation has utilized state-of-the-art technologies within graphic workstations, and how it has been used both within and outside NASA.

  11. Preparing Colorful Astronomical Images and Illustrations

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2001-12-01

    We present techniques for using mainstream graphics software, specifically Adobe Photoshop and Illustrator, for producing composite color images and illustrations from astronomical data. These techniques have been used with numerous images from the Hubble Space Telescope to produce printed and web-based news, education and public presentation products as well as illustrations for technical publication. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels. These features, along with its user-oriented, visual interface, provide convenient tools to produce high-quality, full-color images and graphics for printed and on-line publication and presentation.

  12. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film, in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  13. USING LINKED MICROMAP PLOTS TO CHARACTERIZE OMERNIK ECOREGIONS

    EPA Science Inventory

    The paper introduces linked micromap (LM) plots for presenting environmental summaries. The LM template includes parallel sequences of micromap, table, and statistical summary graphics panels, with attention paid to perceptual grouping, sorting and linking of the summary components...

  14. Graphical augmentations to the funnel plot assess the impact of additional evidence on a meta-analysis.

    PubMed

    Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J

    2012-05-01

    We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used of graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the updating prioritization of a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
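    The statistical significance contour has a simple fixed-effect reading: a new study changes the pooled z-score, and the contour marks where that score crosses 1.96. The sketch below uses hypothetical effect sizes and a fixed-effect model only (the paper's contours also handle heterogeneity), and checks whether one additional null study flips a significant meta-analysis.

```python
import math

def fe_zscore(effects, ses):
    """Fixed-effect meta-analysis z-score: pooled estimate / pooled SE,
    with inverse-variance weights w_i = 1/SE_i^2."""
    w = [1.0 / se ** 2 for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    return pooled * math.sqrt(sum(w))

effects, ses = [0.5, 0.6, 0.4], [0.2, 0.25, 0.2]
z_before = fe_zscore(effects, ses)
# Hypothetical new null study (effect 0, SE 0.2): does adding it cross
# the 1.96 significance contour?
z_after = fe_zscore(effects + [0.0], ses + [0.2])
print(z_before, z_after)  # both remain above 1.96 in this example
```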

  15. The effect of graphic cigarette warning labels on smoking behavior: evidence from the Canadian experience.

    PubMed

    Azagba, Sunday; Sharaf, Mesbah F

    2013-03-01

    There is a substantial literature that graphic tobacco warnings are effective; however, there is limited evidence based on actual smoking behavior. The objective of this paper is to assess the effect of graphic cigarette warning labels on smoking prevalence and quit attempts. A nationally representative sample of individuals aged 15 years and older from the Canadian National Population Health Survey 1998-2008 is used. The sample consists of 4,853 individuals for the smoking prevalence regression and 1,549 smokers for quit attempts. The generalized estimating equation (GEE) model was used to examine the population-averaged (marginal) effects of tobacco graphic warnings on smoking prevalence and quit attempts. To assess the effect of graphic tobacco health warnings on smoking behavior, we used a scaled variable that takes the value of 0 for the first 6 months in 2001, then increases gradually to 1 from December 2001. We found that graphic warnings had a statistically significant effect on smoking prevalence and quit attempts. In particular, the warnings decreased the odds of being a smoker (odds ratio [OR] = 0.875; 95% CI = 0.821-0.932) and increased the odds of making a quit attempt (OR = 1.330, CI = 1.187-1.490). Similar results were obtained when we allowed for more time for the warnings to appear in retail outlets. This study adds to the growing body of evidence on the effectiveness of graphic warnings. Our findings suggest that warnings had a significant effect on smoking prevalence and quit attempts in Canada.
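    The effect measure reported here is an odds ratio with a Wald confidence interval. Setting the GEE machinery itself aside, the helper below is a minimal sketch of how an odds ratio and its 95% CI are derived from a 2×2 table; all counts are invented for illustration and are not the survey's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: smokers/non-smokers after vs. before the warnings
print(odds_ratio_ci(200, 800, 220, 780))
```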

  16. Design and evaluation of a computer tutorial on electric fields

    NASA Astrophysics Data System (ADS)

    Morse, Jeanne Jackson

    Research has shown that students do not fully understand electric fields and their interactions with charged particles after completing traditional classroom instruction. The purpose of this project was to develop a computer tutorial to remediate some of these difficulties. Research on the effectiveness of computer-delivered instructional materials showed that students would learn better from media incorporating user-controlled interactive graphics. Two versions of the tutorial were tested. One version used interactive graphics and the other used static graphics. The two versions of the tutorial were otherwise identical. This project was done in four phases. Phases I and II were used to refine the topics covered in the tutorial and to test the usability of the tutorial. The final version of the tutorial was tested in Phases III and IV. The tutorial was tested using a pretest-posttest design with a control group. Both tests were administered in an interview setting. The tutorial using interactive graphics was more effective at remediating students' difficulties than the tutorial using static graphics for students in Phase III (p = 0.001). In Phase IV students who viewed the tutorial with static graphics did better than those viewing interactive graphics. The sample size in Phase IV was too small for this to be a statistically meaningful result. Some student reasoning errors were noted during the interviews. These include difficulty with the vector representation of electric fields, treating electric charge as if it were mass, using faulty algebraic reasoning to answer questions involving ratios and proportions, and using Coulomb's law in situations in which it is not appropriate.

  17. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The work of the group is in Computational Fluid Dynamic tool development. This development is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  18. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses for moving bodies are essential to system engineers and designers in the process of design and validations. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete bodies motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, explorations of other alternatives with more powerful dynamic analysis and FEA capabilities are necessary. Kinematics analysis will only examine the displacement, velocity, and acceleration of the mechanism without considering effects from masses of components. With dynamic analysis and FEA, effects such as the forces or torques at the joint due to mass and inertia of components can be identified. With keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for explorations. In this study, comparisons between software tools were presented in terms of following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.

  19. Metaphors in Mathematics Classrooms: Analyzing the Dynamic Process of Teaching and Learning of Graph Functions

    ERIC Educational Resources Information Center

    Font, Vicenc; Bolite, Janete; Acevedo, Jorge

    2010-01-01

    This article presents an analysis of a phenomenon that was observed within the dynamic processes of teaching and learning to read and elaborate Cartesian graphs for functions at high-school level. Two questions were considered during this investigation: What types of metaphors does the teacher use to explain the graphic representation of functions…

  20. Quantum Monte Carlo: Faster, More Reliable, And More Accurate

    NASA Astrophysics Data System (ADS)

    Anderson, Amos Gerald

    2010-06-01

    The Schrodinger Equation has been available for about 83 years, but today we still strain to apply it accurately to molecules of interest. The difficulty is not theoretical in nature but practical: we are held back by a lack of sufficient computing power. Consequently, effort is applied to finding acceptable approximations that facilitate real-time solutions. In the meantime, computer technology has been advancing rapidly, changing the way we think about efficient algorithms. For those who can reorganize their formulas to take advantage of these changes, and thereby lift some approximations, incredible new opportunities await. Over the last decade, we have seen the emergence of a new kind of computer processor, the graphics card. Designed to accelerate computer games by favoring processor quantity over processor quality, graphics cards have become of sufficient quality to be useful to some scientists. In this thesis, we explore the first known use of a graphics card in computational chemistry, rewriting our Quantum Monte Carlo software into the requisite "data parallel" formalism. We find that, notwithstanding precision considerations, we are able to speed up our software by about a factor of 6. The success of a Quantum Monte Carlo calculation depends on more than just processing power. It also requires the scientist to carefully design the trial wavefunction used to guide simulated electrons. We have studied the use of Generalized Valence Bond wavefunctions to capture, simply and yet effectively, the essential static correlation in atoms and molecules. Furthermore, we have developed significantly improved two-particle correlation functions, designed with both flexibility and simplicity in mind, representing an effective and reliable way to add the necessary dynamic correlation. Lastly, we present our method for stabilizing the statistical nature of the calculation by manipulating configuration weights, thus facilitating efficient and robust calculations. Our combination of Generalized Valence Bond wavefunctions, improved correlation functions, and stabilized weighting techniques, for calculations run on graphics cards, represents a new way to use Quantum Monte Carlo to study arbitrarily sized molecules.
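    The statistical backbone of such a calculation is the Monte Carlo estimate of an expectation together with its roughly 1/√N standard error, the quantity that weight-stabilization schemes try to keep small. A toy sketch, unrelated to the thesis's actual QMC code:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def mc_mean(f, sampler, n):
    """Plain Monte Carlo estimate of E[f(x)] and its statistical error."""
    vals = [f(sampler()) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, (var / n) ** 0.5

# Toy observable: estimate E[x^2] for x uniform on [0, 1] (exact: 1/3)
est, err = mc_mean(lambda x: x * x, random.random, 20_000)
print(est, err)
```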

  1. People detection method using graphics processing units for a mobile robot with an omnidirectional camera

    NASA Astrophysics Data System (ADS)

    Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki

    2011-12-01

    This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
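    The ego-motion check described here can be caricatured on the CPU: treat the median flow vector as a crude estimate of the camera's global motion and flag cells whose residual flow is large. This is only a hedged stand-in for the paper's GPU dense-flow pipeline, with a synthetic flow field in place of real imagery.

```python
import math
import statistics

def moving_cells(flow, thresh=1.0):
    """Flag cells whose flow deviates from the global (ego-motion) median.

    `flow` is a list of (dx, dy) vectors, one per image cell; the median
    vector stands in for the robot's ego-motion, and cells whose residual
    magnitude exceeds `thresh` become regions of interest."""
    mx = statistics.median(dx for dx, _ in flow)
    my = statistics.median(dy for _, dy in flow)
    return [i for i, (dx, dy) in enumerate(flow)
            if math.hypot(dx - mx, dy - my) > thresh]

# Mostly uniform background flow (camera motion) plus one moving person
flow = [(2.0, 0.0)] * 8 + [(5.0, 2.0)] + [(2.0, 0.0)] * 7
print(moving_cells(flow))  # → [8]
```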

  2. Discrete Dynamics Lab

    NASA Astrophysics Data System (ADS)

    Wuensche, Andrew

    DDLab is interactive graphics software for creating, visualizing, and analyzing many aspects of Cellular Automata, Random Boolean Networks, and Discrete Dynamical Networks in general and studying their behavior, both from the time-series perspective — space-time patterns, and from the state-space perspective — attractor basins. DDLab is relevant to research, applications, and education in the fields of complexity, self-organization, emergent phenomena, chaos, collision-based computing, neural networks, content addressable memory, genetic regulatory networks, dynamical encryption, generative art and music, and the study of the abstract mathematical/physical/dynamical phenomena in their own right.

  3. Simulation of Earth textures by conditional image quilting

    NASA Astrophysics Data System (ADS)

    Mahmud, K.; Mariethoz, G.; Caers, J.; Tahmasebi, P.; Baker, A.

    2014-04-01

    Training image-based approaches for stochastic simulations have recently gained attention in surface and subsurface hydrology. This family of methods allows the creation of multiple realizations of a study domain, with a spatial continuity based on a training image (TI) that contains the variability, connectivity, and structural properties deemed realistic. A major drawback of these methods is their computational and/or memory cost, making certain applications challenging. It was found that similar methods, also based on training images or exemplars, have been proposed in computer graphics. One such method, image quilting (IQ), is introduced in this paper and adapted for hydrogeological applications. The main difficulty is that image quilting was originally not designed to produce conditional simulations and was restricted to 2-D images. In this paper, the original method developed in computer graphics has been modified to accommodate conditioning data and 3-D problems. This new conditional image quilting method (CIQ) is patch based, does not require constructing a pattern database, and can be used with both categorical and continuous training images. The main concept is to optimally cut the patches such that they overlap with minimum discontinuity. The optimal cut is determined using a dynamic programming algorithm. Conditioning is accomplished by prior selection of patches that are compatible with the conditioning data. The performance of CIQ is tested for a variety of hydrogeological test cases. The results, when compared with previous multiple-point statistics (MPS) methods, indicate an improvement in CPU time by a factor of at least 50.
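    The "optimal cut" step is a classic dynamic program: the boundary moves through the overlap region one row at a time, shifting at most one column per step, and accumulates per-pixel mismatch. A minimal sketch on a toy error matrix (an illustration of the technique, not the authors' implementation):

```python
def min_error_cut(err):
    """Minimum-error vertical boundary cut through an overlap region, as
    used in image quilting: err[i][j] is the per-pixel mismatch, and the
    cut may shift one column left or right per row."""
    rows, cols = len(err), len(err[0])
    cost = [list(err[0])]
    for i in range(1, rows):
        prev = cost[-1]
        cost.append([err[i][j] + min(prev[max(j - 1, 0):j + 2])
                     for j in range(cols)])
    # Trace back the cheapest path from the bottom row
    j = min(range(cols), key=cost[-1].__getitem__)
    path = [j]
    for i in range(rows - 2, -1, -1):
        j = min(range(max(j - 1, 0), min(j + 2, cols)),
                key=cost[i].__getitem__)
        path.append(j)
    return list(reversed(path))

err = [[9, 1, 9],
       [9, 1, 9],
       [9, 9, 1]]
print(min_error_cut(err))  # → [1, 1, 2]
```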

  4. How static media is understood and used by high school science teachers

    NASA Astrophysics Data System (ADS)

    Hirata, Miguel

    The purpose of the present study is to explore the role of static media in textbooks, as defined by Mayer (2001) in the form of printed images and text, and how these media are viewed and used by high school science teachers. Textbooks appeared in the United States in the late 1800s, and since then pictorial aids have been used extensively in them to support the teacher's work in the classroom (Giordano, 2003). According to Woodward, Elliott, and Nagel (1988/2013), the research on textbooks prior to the 1970s does not present relevant work related to the curricular role and the quality and instructional design of textbooks. Since then there has been abundant research, especially on the use of visual images in textbooks, which has been approached from: (a) the text/image ratio (Evans, Watson, & Willows, 1987; Levin & Mayer, 1993; Mayer, 1993; Woodward, 1993), and (b) the instructional effectiveness of images (Woodward, 1993). The theoretical framework for this study comes from multimedia learning (Mayer, 2001), information design (Pettersson, 2002), and visual literacy (Moore & Dwyer, 1994). Data were collected through in-depth interviews of three high school science teachers and graphic analyses of three textbooks used by the interviewed teachers. The interview data were compared through an analytic model developed from the literature, and the graphic analyses were performed using Mayer's multimedia learning principles (Mayer, 2001) and the Graphic Analysis Protocol (GAP) (Slough & McTigue, 2013).
The conclusions of this study are: (1) pictures are especially useful for teaching science because science is a difficult subject to teach; (2) due to this difficulty, pictures are very important for making the class dynamic and avoiding students' distraction; (3) static and dynamic media can be more effective when used together; (4) some specific types of graphics were found in the science textbooks used by the participants, in this case naturalistic drawings, stylized drawings, scale diagrams, flow charts (cycle), flow charts (sequence), and hybrids; no photographs were found; (5) graphics can be related not only to the general text but specifically to the captions; (6) the textbooks analyzed had a balanced proportion of text and graphics; and (7) to facilitate the text-graphics relationship, the spatial contiguity of both elements is key to their semantic integration.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liscom, W.L.

    This book presents a complete graphic and statistical portrait of the dramatic shifts in global energy flows during the 1970s and the resultant transfer of economic and political power from the industrial nations to the oil-producing states. The information was extracted from government-source documents and compiled in a computer data base. Computer graphics were combined with the data base to produce over 400 full-color graphs. The energy commodities covered are oil, natural gas, coal, nuclear, and conventional electric-power generation. Also included are data on hydroelectric and geothermal power, oil shale, tar sands, and other alternative energy sources. 72 references.

  6. Potential Impact of Graphic Health Warnings on Cigarette Packages in Reducing Cigarette Demand and Smoking-Related Deaths in Vietnam.

    PubMed

    Minh, Hoang Van; Chung, Le Hong; Giang, Kim Bao; Duc, Duong Minh; Hinh, Nguyen Duc; Mai, Vu Quynh; Cuong, Nguyen Manh; Manh, Pham Duc; Duc, Ha Anh; Yang, Jui-Chen

    2016-01-01

    Two years after implementation of the graphic health warning (GHW) intervention in Vietnam, it is important to evaluate the intervention's potential impact. The objective of this paper was to predict the effects of graphic health warnings on cigarette packages, particularly in reducing cigarette demand and smoking-associated deaths in Vietnam. In this study, a discrete choice experiment (DCE) method was used to evaluate the potential impact of graphic tobacco health warnings on smoking demand. To predict the impact of GHWs on reducing premature deaths associated with smoking, we constructed different static models. We adapted the method developed by the University of Toronto, Canada, and found that GHWs had a statistically significant impact on reducing cigarette demand (up to 10.1% through images of lung damage), resulting in an overall decrease in smoking prevalence in Vietnam. We also found that between 428,417 and 646,098 premature deaths would be prevented as a result of the GHW intervention. The potential impact of the GHW labels on reducing premature smoking-associated deaths in Vietnam was shown to be stronger among lower socio-economic groups.
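    A static model of this kind reduces to simple arithmetic: deaths averted ≈ smokers deterred × their excess premature-mortality risk. The numbers below are hypothetical placeholders, not the paper's inputs; only the structure of the calculation is illustrated.

```python
# Hedged back-of-envelope static impact model; every number is a
# hypothetical placeholder chosen for illustration.
smokers = 15_000_000        # assumed current smokers
demand_drop = 0.101         # "up to 10.1%" reduction in cigarette demand
premature_death_risk = 0.5  # assumed fraction of smokers dying prematurely

quitters = smokers * demand_drop
deaths_averted = quitters * premature_death_risk
print(int(deaths_averted))  # → 757500
```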

  7. Hybrid Electron Microscopy Normal Mode Analysis graphical interface and protocol.

    PubMed

    Sorzano, Carlos Oscar S; de la Rosa-Trevín, José Miguel; Tama, Florence; Jonić, Slavica

    2014-11-01

    This article presents an integral graphical interface to the Hybrid Electron Microscopy Normal Mode Analysis (HEMNMA) approach, which was developed for capturing continuous motions of large macromolecular complexes from single-particle EM images. HEMNMA was shown to be a good approach for analyzing multiple conformations of a macromolecular complex, but it could not be widely used in the EM field for lack of an integral interface: its use required switching among different software sources, and selecting modes for image analysis was difficult without a graphical interface. The graphical interface was thus developed to simplify the practical use of HEMNMA. It is implemented in the open-source software package Xmipp 3.1 (http://xmipp.cnb.csic.es), and only a small part of it relies on MATLAB, which is accessible through the main interface. Such integration provides the user with an easy way to perform the analysis of macromolecular dynamics and forms a direct connection to the single-particle reconstruction process. A step-by-step HEMNMA protocol with the graphical interface is given in full detail in the Supplementary material. The graphical interface will be useful to experimentalists who are interested in studies of continuous conformational changes of macromolecular complexes beyond the modeling of continuous heterogeneity in single-particle reconstruction. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. TEMPORAL VARIABILITY OF TOXIC CONTAMINANTS IN ANIMAL DIETS

    EPA Science Inventory

    Uncertified commercial research animal feed (Purina Chow TM) was analyzed over forty-one months to determine essential and trace elements and toxic contaminants. Parametric statistics and graphic chronologic progressions of the results are presented for cat, monkey, rodent (rat/m...

  9. A synthesis of research on color, typography and graphics as they relate to readability

    NASA Astrophysics Data System (ADS)

    Lamoreaux, M. E.

    1985-09-01

    A foundation for future research on the use of color, typography, and graphics to improve readability is provided. Articles from the broad fields of education and psychology, as well as from the fields of journalism and printing, have been reviewed for research relating color, typography, and graphics to reading ease, speed, or comprehension. The most relevant articles reviewed are presented in an annotated bibliography; the remaining articles are also presented in bibliographic format. This literature review indicates that recognition and recall of printed material may be improved through the use of headings, underlining, color, and, especially, illustrations. Current research suggests that individuals can remember pictures far longer than past research indicated. However, researchers are divided on the usefulness of illustrations for improving reading comprehension. On the other hand, reading comprehension can be improved through the use of statistical graphs and tables if the reader is properly trained in the use of these devices.

  10. Potential Application of a Graphical Processing Unit to Parallel Computations in the NUBEAM Code

    NASA Astrophysics Data System (ADS)

    Payne, J.; McCune, D.; Prater, R.

    2010-11-01

    NUBEAM is a comprehensive computational Monte Carlo based model for neutral beam injection (NBI) in tokamaks. NUBEAM computes NBI-relevant profiles in tokamak plasmas by tracking the deposition and the slowing of fast ions. At the core of NUBEAM are vector calculations used to track fast ions. These calculations have recently been parallelized to run on MPI clusters. However, cost and interlink bandwidth limit the ability to fully parallelize NUBEAM on an MPI cluster. The recent implementation of double-precision capabilities for Graphical Processing Units (GPUs) presents a cost-effective and high-performance alternative or complement to MPI computation. Commercially available graphics cards can achieve up to 672 GFLOPS double precision and can handle hundreds of thousands of threads. The ability to execute at least one thread per particle simultaneously could significantly reduce the execution time and the statistical noise of NUBEAM. Progress on implementation on a GPU will be presented.

  11. Replicates in high dimensions, with applications to latent variable graphical models.

    PubMed

    Tan, Kean Ming; Ning, Yang; Witten, Daniela M; Liu, Han

    2016-12-01

    In classical statistics, much thought has been put into experimental design and data collection. In the high-dimensional setting, however, experimental design has been less of a focus. In this paper, we stress the importance of collecting multiple replicates for each subject in this setting. We consider learning the structure of a graphical model with latent variables, under the assumption that these variables take a constant value across replicates within each subject. By collecting multiple replicates for each subject, we are able to estimate the conditional dependence relationships among the observed variables given the latent variables. To test the null hypothesis of conditional independence between two observed variables, we propose a pairwise decorrelated score test. Theoretical guarantees are established for parameter estimation and for this test. We show that our proposal is able to estimate latent variable graphical models more accurately than some existing proposals, and apply the proposed method to a brain imaging dataset.

  12. Realistic natural atmospheric phenomena and weather effects for interactive virtual environments

    NASA Astrophysics Data System (ADS)

    McLoughlin, Leigh

    Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics offer only the simplest of cloud representations. The problem that this work addresses is how to provide a means of simulating clouds and weather features, such as precipitation, that is suitable for virtual environments. Techniques for cloud simulation are available within the area of meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than we require for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between the subjects of computer graphics and physical science. Pure physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge, then, is for the solution to be computationally lightweight and able to be directed in some measure, while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters, and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this. The result of this work is that clouds can now be simulated in near real-time, complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.

  13. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination is examined of necessary resources, facilities, and special personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology. Involved is the coordination of CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established to use in the analysis of CFD design methodologies.

  14. Chaotic dynamics of Heisenberg ferromagnetic spin chain with bilinear and biquadratic interactions

    NASA Astrophysics Data System (ADS)

    Blessy, B. S. Gnana; Latha, M. M.

    2017-10-01

    We investigate the chaotic dynamics of a one-dimensional Heisenberg ferromagnetic spin chain by constructing the Hamiltonian equations of motion. We present the trajectory and phase plots of the system with bilinear and also biquadratic interactions. The stability of the system is analysed in both cases by constructing the Jacobian matrix and by measuring the Lyapunov exponents. The results are illustrated graphically.
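    Measuring a Lyapunov exponent numerically, as done above for the spin chain, can be illustrated on a much simpler swapped-in system: the logistic map, whose largest Lyapunov exponent at r = 4 is known analytically to be ln 2. A minimal sketch:

```python
import math

def logistic_lyapunov(r, x0=0.1, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):          # discard transient behaviour
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter

lam = logistic_lyapunov(4.0)
print(lam)  # theoretical value is ln 2 ≈ 0.693
```

    A positive exponent, as here, is the usual numerical signature of chaos; for a Hamiltonian system like the spin chain the same orbit-averaging idea is applied to the tangent dynamics of the full phase space.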

  15. Network Inference via the Time-Varying Graphical Lasso

    PubMed Central

    Hallac, David; Park, Youngsuk; Boyd, Stephen; Leskovec, Jure

    2018-01-01

    Many important problems can be modeled as a system of interconnected entities, where each entity is recording time-dependent observations or measurements. In order to spot trends, detect anomalies, and interpret the temporal dynamics of such data, it is essential to understand the relationships between the different entities and how these relationships evolve over time. In this paper, we introduce the time-varying graphical lasso (TVGL), a method of inferring time-varying networks from raw time series data. We cast the problem in terms of estimating a sparse time-varying inverse covariance matrix, which reveals a dynamic network of interdependencies between the entities. Since dynamic network inference is a computationally expensive task, we derive a scalable message-passing algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve this problem in an efficient way. We also discuss several extensions, including a streaming algorithm to update the model and incorporate new observations in real time. Finally, we evaluate our TVGL algorithm on both real and synthetic datasets, obtaining interpretable results and outperforming state-of-the-art baselines in terms of both accuracy and scalability. PMID:29770256

  16. Performance of an inverted pendulum model directly applied to normal human gait.

    PubMed

    Buczek, Frank L; Cooney, Kevin M; Walker, Matthew R; Rainbow, Michael J; Concha, M Cecilia; Sanders, James O

    2006-03-01

    In clinical gait analysis, we strive to understand contributions to body support and propulsion as this forms a basis for treatment selection, yet the relative importance of gravitational forces and joint powers can be controversial even for normal gait. We hypothesized that an inverted pendulum model, propelled only by gravity, would be inadequate to predict velocities and ground reaction forces during gait. Unlike previous ballistic and passive dynamic walking studies, we directly compared model predictions to gait data for 24 normal children. We defined an inverted pendulum from the average center-of-pressure to the instantaneous center-of-mass, and derived equations of motion during single support that allowed a telescoping action. Forward and inverse dynamics predicted pendulum velocities and ground reaction forces, and these were statistically and graphically compared to actual gait data for identical strides. Results of forward dynamics replicated those in the literature, with reasonable predictions for velocities and anterior ground reaction forces, but poor predictions for vertical ground reaction forces. Deviations from actual values were explained by joint powers calculated for these subjects. With a telescoping action during inverse dynamics, predicted vertical forces improved dramatically and gained a dual-peak pattern previously missing in the literature, yet expected for normal gait. These improvements vanished when telescoping terms were set to zero. Because this telescoping action is difficult to explain without muscle activity, we believe these results support the need for both gravitational forces and joint powers in normal gait. Our approach also begins to quantify the relative contributions of each.
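    The core of the model above, an inverted pendulum propelled only by gravity, can be sketched with a simple integration of theta'' = (g/L) sin(theta). This is a minimal illustration, not the authors' telescoping center-of-pressure-to-center-of-mass model; the leg length and step sizes are arbitrary illustrative values.

```python
import math

G, L = 9.81, 0.9   # gravity (m/s^2) and a hypothetical leg length (m)

def simulate_inverted_pendulum(theta0, omega0=0.0, dt=1e-4, t_end=0.5):
    """Rigid inverted pendulum driven only by gravity:
    theta'' = (G / L) * sin(theta), with theta measured from vertical.
    Semi-implicit Euler integration; returns the final (theta, omega)."""
    theta, omega = theta0, omega0
    for _ in range(int(t_end / dt)):
        omega += (G / L) * math.sin(theta) * dt
        theta += omega * dt
    return theta, omega

# Starting 5 degrees from vertical with no push, gravity alone accelerates
# the fall: the model supplies forward progression but nothing that could
# replicate the dual-peak vertical ground reaction force of normal gait.
theta, omega = simulate_inverted_pendulum(math.radians(5))
```

    Adding a telescoping (leg-length) degree of freedom, as the authors do, introduces exactly the radial force term that this rigid version lacks.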

  17. RevBayes: Bayesian Phylogenetic Inference Using Graphical Models and an Interactive Model-Specification Language

    PubMed Central

    Höhna, Sebastian; Landis, Michael J.

    2016-01-01

    Programs for Bayesian inference of phylogeny currently implement a unique and fixed suite of models. Consequently, users of these software packages are simultaneously forced to use a number of programs for a given study, while also lacking the freedom to explore models that have not been implemented by the developers of those programs. We developed a new open-source software package, RevBayes, to address these problems. RevBayes is entirely based on probabilistic graphical models, a powerful generic framework for specifying and analyzing statistical models. Phylogenetic-graphical models can be specified interactively in RevBayes, piece by piece, using a new succinct and intuitive language called Rev. Rev is similar to the R language and the BUGS model-specification language, and should be easy to learn for most users. The strength of RevBayes is the simplicity with which one can design, specify, and implement new and complex models. Fortunately, this tremendous flexibility does not come at the cost of slower computation; as we demonstrate, RevBayes outperforms competing software for several standard analyses. Compared with other programs, RevBayes has fewer black-box elements. Users need to explicitly specify each part of the model and analysis. Although this explicitness may initially be unfamiliar, we are convinced that this transparency will improve understanding of phylogenetic models in our field. Moreover, it will motivate the search for improvements to existing methods by brazenly exposing the model choices that we make to critical scrutiny. RevBayes is freely available at http://www.RevBayes.com. [Bayesian inference; Graphical models; MCMC; statistical phylogenetics.] PMID:27235697

  18. RevBayes: Bayesian Phylogenetic Inference Using Graphical Models and an Interactive Model-Specification Language.

    PubMed

    Höhna, Sebastian; Landis, Michael J; Heath, Tracy A; Boussau, Bastien; Lartillot, Nicolas; Moore, Brian R; Huelsenbeck, John P; Ronquist, Fredrik

    2016-07-01

    Programs for Bayesian inference of phylogeny currently implement a unique and fixed suite of models. Consequently, users of these software packages are simultaneously forced to use a number of programs for a given study, while also lacking the freedom to explore models that have not been implemented by the developers of those programs. We developed a new open-source software package, RevBayes, to address these problems. RevBayes is entirely based on probabilistic graphical models, a powerful generic framework for specifying and analyzing statistical models. Phylogenetic-graphical models can be specified interactively in RevBayes, piece by piece, using a new succinct and intuitive language called Rev. Rev is similar to the R language and the BUGS model-specification language, and should be easy to learn for most users. The strength of RevBayes is the simplicity with which one can design, specify, and implement new and complex models. Fortunately, this tremendous flexibility does not come at the cost of slower computation; as we demonstrate, RevBayes outperforms competing software for several standard analyses. Compared with other programs, RevBayes has fewer black-box elements. Users need to explicitly specify each part of the model and analysis. Although this explicitness may initially be unfamiliar, we are convinced that this transparency will improve understanding of phylogenetic models in our field. Moreover, it will motivate the search for improvements to existing methods by brazenly exposing the model choices that we make to critical scrutiny. RevBayes is freely available at http://www.RevBayes.com [Bayesian inference; Graphical models; MCMC; statistical phylogenetics.]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  19. High Dynamic Imaging for Photometry and Graphic Arts Evaluation

    NASA Astrophysics Data System (ADS)

    T. S., Sudheer Kumar; Kurian, Ciji Pearl; Shama, Kumara; K. R., Shailesh

    2018-05-01

    High Dynamic Range Imaging (HDRI) techniques for luminance measurement is gaining importance in recent years. This paper presents the application of the HDRI system for obtaining the photometric characteristics of lighting fixtures as well to assess the quality of lighting in colour viewing booth of a printing press. The process of quality control of prints in a printing press is known as graphic arts evaluation. This light booth plays a major role in the quality control of prints. In this work, Nikon D5100 camera was used to obtain the photometric characteristics of narrow beam spotlight. The results of this experiment are in agreement with photometric characteristics obtained from a standard industry grade Gonio-photometer. Similarly, Canon 60D camera was used to assess the quality of spatial luminance distribution of light in the colour viewing booth. This work demonstrates the usefulness of HDRI technology for photometric measurements and luminance distributions of illuminated interiors.

  20. Law of Large Numbers: the Theory, Applications and Technology-based Education

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas; Gould, Robert

    2011-01-01

    Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information retention. In this paper, we describe one such innovative effort of using technological tools to expose students in probability and statistics courses to the theory, practice and usability of the Law of Large Numbers (LLN). We base our approach on integrating pedagogical instruments with the computational libraries developed by the Statistics Online Computational Resource (www.SOCR.ucla.edu). To achieve this merger we designed a new interactive Java applet and a corresponding demonstration activity that illustrate the concept and the applications of the LLN. The LLN applet and activity have common goals – to provide graphical representation of the LLN principle, build lasting student intuition and present the common misconceptions about the law of large numbers. Both the SOCR LLN applet and activity are freely available online to the community to test, validate and extend (Applet: http://socr.ucla.edu/htmls/exp/Coin_Toss_LLN_Experiment.html, and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_LLN). PMID:21603584
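    The behaviour the LLN applet visualizes can be re-created in a few lines; this is a generic coin-flipping sketch, not the SOCR applet's actual code.

```python
import random

def running_proportion_heads(n_flips, seed=42):
    """Simulate fair coin flips and record the running proportion of heads.
    The Law of Large Numbers says this sequence converges to 0.5."""
    rng = random.Random(seed)
    heads = 0
    proportions = []
    for i in range(1, n_flips + 1):
        heads += rng.random() < 0.5
        proportions.append(heads / i)
    return proportions

props = running_proportion_heads(100_000)
print(props[9], props[999], props[-1])  # early estimates fluctuate widely
```

    Plotting this sequence also exposes the common misconception the activity targets: the LLN concerns the convergence of the proportion, not any tendency of the raw count of heads minus tails to return to zero.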

  1. medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.

    PubMed

    Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara

    2015-01-01

    In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first-time users with a biomedical background often find its use difficult. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers who are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source; it has an intuitive graphical user interface (GUI), is accessible via the Internet and can be used within a web browser, without the need to install and maintain programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.
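    Correction for multiple testing, one of the features mentioned above, is easy to illustrate. The sketch below implements the standard Benjamini-Hochberg step-up procedure (a widely used FDR method, not necessarily the exact correction medplot applies), with made-up p-values.

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns the (sorted) indices
    of hypotheses rejected at false discovery rate alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0  # largest rank whose p-value clears its step-up threshold
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])  # reject the k smallest p-values

# Toy p-values, e.g. from repeated symptom comparisons across visits:
pvals = [0.001, 0.008, 0.012, 0.04, 0.3]
print(benjamini_hochberg(pvals))  # → [0, 1, 2, 3]
```

    Unlike a Bonferroni correction, which divides alpha by the full number of tests for every hypothesis, the step-up thresholds grow with rank, so more true effects survive when many tests are performed.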

  2. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    PubMed Central

    Grapov, Dmitry; Newman, John W.

    2012-01-01

    Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large datasets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Availability and implementation: Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010). Contact: John.Newman@ars.usda.gov Supplementary Information: Installation instructions, tutorials and a users' manual are available at http://sourceforge.net/projects/imdev/. PMID:22815358

  3. High-performance parallel computing in the classroom using the public goods game as an example

    NASA Astrophysics Data System (ADS)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
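    A serial (CPU) Monte Carlo version of the spatial public goods game described above can be sketched as follows. This is not the authors' GPU code: the lattice size, synergy factor, noise level and step count are small illustrative values, and strategies are updated with the standard Fermi imitation rule.

```python
import math, random

def public_goods_mc(n=20, r=3.5, K=0.5, steps=2000, seed=1):
    """Monte Carlo simulation of the public goods game on an n x n square
    lattice with periodic boundaries. Cooperators (1) pay a unit cost into
    each 5-member group they belong to; group payoffs are multiplied by the
    synergy factor r and shared equally. A randomly chosen player imitates
    a random neighbour via the Fermi rule with noise K.
    Returns the final fraction of cooperators."""
    rng = random.Random(seed)
    s = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]

    def group(i, j):  # von Neumann neighbourhood, focal player first
        return [(i, j), ((i + 1) % n, j), ((i - 1) % n, j),
                (i, (j + 1) % n), (i, (j - 1) % n)]

    def payoff(i, j):
        total = 0.0
        for gi, gj in group(i, j):        # the five groups (i, j) belongs to
            members = group(gi, gj)
            n_c = sum(s[x][y] for x, y in members)
            total += r * n_c / len(members) - s[i][j]
        return total

    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        ni, nj = rng.choice(group(i, j)[1:])
        if s[i][j] != s[ni][nj]:
            # Fermi rule: adoption probability grows with payoff difference.
            if rng.random() < 1 / (1 + math.exp((payoff(i, j) - payoff(ni, nj)) / K)):
                s[i][j] = s[ni][nj]
    return sum(map(sum, s)) / (n * n)

frac = public_goods_mc()  # fraction of cooperators after 2000 elementary steps
```

    The GPU gain described in the paper comes from running many such elementary updates (or many independent lattices) in parallel; the update rule itself is unchanged.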

  4. Perspective: chemical dynamics simulations of non-statistical reaction dynamics

    PubMed Central

    Ma, Xinyou; Hase, William L.

    2017-01-01

    Non-statistical chemical dynamics are exemplified by disagreements with the transition state (TS), RRKM and phase space theories of chemical kinetics and dynamics. The intrinsic reaction coordinate (IRC) is often used for the former two theories, and non-statistical dynamics arising from non-IRC dynamics are often important. In this perspective, non-statistical dynamics are discussed for chemical reactions, with results primarily obtained from chemical dynamics simulations and to a lesser extent from experiment. The non-statistical dynamical properties discussed are: post-TS dynamics, including potential energy surface bifurcations, product energy partitioning in unimolecular dissociation and avoiding exit-channel potential energy minima; non-RRKM unimolecular decomposition; non-IRC dynamics; direct mechanisms for bimolecular reactions with pre- and/or post-reaction potential energy minima; non-TS theory barrier recrossings; and roaming dynamics. This article is part of the themed issue ‘Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces’. PMID:28320906

  5. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data. PMID:20525257

  6. Modeling of adsorption dynamics at air-liquid interfaces using statistical rate theory (SRT).

    PubMed

    Biswas, M E; Chatzis, I; Ioannidis, M A; Chen, P

    2005-06-01

    A large number of natural and technological processes involve mass transfer at interfaces. Interfacial properties, e.g., adsorption, play a key role in such applications as wetting, foaming, coating, and stabilizing of liquid films. The mechanistic understanding of surface adsorption often assumes molecular diffusion in the bulk liquid and subsequent adsorption at the interface. Diffusion is well described by Fick's law, while adsorption kinetics is less understood and is commonly described using Langmuir-type empirical equations. In this study, a general theoretical model for adsorption kinetics/dynamics at the air-liquid interface is developed; in particular, a new kinetic equation based on the statistical rate theory (SRT) is derived. Similar to many reported kinetic equations, the new kinetic equation also involves a number of parameters, but all these parameters are theoretically obtainable. In the present model, the adsorption dynamics is governed by three dimensionless numbers: psi (ratio of adsorption thickness to diffusion length), lambda (ratio of square of the adsorption thickness to the ratio of adsorption to desorption rate constant), and Nk (ratio of the adsorption rate constant to the product of diffusion coefficient and bulk concentration). Numerical simulations for surface adsorption using the proposed model are carried out and verified. The difference in surface adsorption between the general and the diffusion controlled model is estimated and presented graphically as contours of deviation. Three different regions of adsorption dynamics are identified: diffusion controlled (deviation less than 10%), mixed diffusion and transfer controlled (deviation in the range of 10-90%), and transfer controlled (deviation more than 90%). These three different modes predominantly depend on the value of Nk. The corresponding ranges of Nk for the studied values of psi (10(-2)

  7. Graphical Tests for Power Comparison of Competing Designs.

    PubMed

    Hofmann, H; Follett, L; Majumder, M; Cook, D

    2012-12-01

    Lineups have been established as tools for visual testing similar to standard statistical inference tests, allowing us to evaluate the validity of graphical findings in an objective manner. In simulation studies lineups have been shown to be efficient: the power of visual tests is comparable to classical tests while being much less stringent in terms of the distributional assumptions made. This makes lineups versatile, yet powerful, tools in situations where conditions for regular statistical tests are not or cannot be met. In this paper we introduce lineups as a tool for evaluating the power of competing graphical designs. We highlight some of the theoretical properties and then show results from two studies evaluating competing designs: both studies are designed to go to the limits of our perceptual abilities to highlight differences between designs. We use both accuracy and speed of evaluation as measures of a successful design. The first study compares the choice of coordinate system: polar versus Cartesian coordinates. The results show strong support in favor of Cartesian coordinates in finding fast and accurate answers to spotting patterns. The second study is aimed at finding shift differences between distributions. Both studies are motivated by data problems that we have recently encountered, and explore using simulated data to evaluate the plot designs under controlled conditions. Amazon Mechanical Turk (MTurk) is used to conduct the studies.
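    The inferential side of a lineup has a simple binomial form: under the null hypothesis, each of the m panels is equally likely to be picked, so the p-value with several independent observers is an upper binomial tail. The sketch below is a standard visual-inference calculation; the panel and observer counts are made up for illustration.

```python
from math import comb

def lineup_p_value(m, n_observers, n_correct):
    """P-value of a lineup test: the probability that at least n_correct of
    n_observers pick the data panel by chance, when each of the m panels is
    equally likely under the null, i.e. the Binomial(n_observers, 1/m) tail."""
    p = 1.0 / m
    return sum(comb(n_observers, k) * p**k * (1 - p)**(n_observers - k)
               for k in range(n_correct, n_observers + 1))

# A 20-panel lineup shown to 10 observers, 4 of whom spot the data panel:
print(lineup_p_value(20, 10, 4))
```

    Comparing two plot designs then amounts to comparing how often (and how quickly) observers identify the data panel under each design, which is exactly the accuracy and speed measure the studies above use.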

  8. Computer Series, 29: Bits and Pieces, 10.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Describes computer programs (available from authors) including molecular input to computer, programs for quantum chemistry, library orientation to technical literature, plotting potentiometric titration data, simulating oscilloscope curves, organic qualitative analysis with dynamic graphics, extended Huckel calculations, and calculator programs…

  9. SystemSketch Poster

    EPA Science Inventory

    SystemSketch is a dynamic, graphic visualization tool to help stakeholders better understand system context and access information resources.  It is constructed using the Driver-Pressure-State-Impact-Response framework, and functions both as a stand-alone tool and as a component ...

  10. Graphical user interface for wireless sensor networks simulator

    NASA Astrophysics Data System (ADS)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They are suited to many applications, from military through environment monitoring, healthcare, home automation and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator which is dedicated to WSN studies, especially the evaluation of routing and data link protocols.

  11. Arctic Ice Dynamics Joint Experiment (AIDJEX) 1975-1976, Physical Oceanography Data Report Profiling Current Meter Data -- Camp Caribou. Volume 1.

    DTIC Science & Technology

    1980-02-01

    base of the ice. Hourly averages pertaining to the fixed-mast current meters can be obtained through the National Oceanographic Data Center. The... [Figure 11. Speed and direction plotted for the manned AIDJEX...] EDDIES Swift mesoscale undercurrents are one of the most notable oceanographic features observed in the AIDJEX area of the Arctic Ocean. The eddy form

  12. A standard format and a graphical user interface for spin system specification.

    PubMed

    Biternas, A G; Charnock, G T P; Kuprov, Ilya

    2014-03-01

    We introduce a simple and general XML format for spin system description that is the result of extensive consultations within Magnetic Resonance community and unifies under one roof all major existing spin interaction specification conventions. The format is human-readable, easy to edit and easy to parse using standard XML libraries. We also describe a graphical user interface that was designed to facilitate construction and visualization of complicated spin systems. The interface is capable of generating input files for several popular spin dynamics simulation packages. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Waterborne Commerce of the United States, calendar year 1998. Part 5 : national summaries

    DOT National Transportation Integrated Search

    2000-03-01

    Waterborne Commerce of the United States, WCUS, Part 5 is one of a series of publications that provide statistics on the foreign and domestic waterborne commerce moved on United States waters. WCUS, Part 5 provides extensive graphic and tabular...

  14. Trend Analysis Using Microcomputers.

    ERIC Educational Resources Information Center

    Berger, Carl F.

    A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…

  15. Product plots.

    PubMed

    Wickham, Hadley; Hofmann, Heike

    2011-12-01

    We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in the fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
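    The area-as-product idea can be made concrete with a small sketch (not the authors' code): column widths encode the marginal P(x), cell heights the conditional P(y|x), so each cell's area equals the joint proportion P(x, y).

```python
def product_plot_rects(counts):
    """Map a two-way count table {x: {y: n}} to rectangles whose width
    is the marginal P(x) and whose height is the conditional P(y|x),
    so every area equals the joint proportion P(x, y)."""
    total = sum(sum(row.values()) for row in counts.values())
    rects = {}
    for x, row in counts.items():
        col = sum(row.values())
        width = col / total                    # marginal P(x)
        for y, n in row.items():
            rects[(x, y)] = (width, n / col)   # height = conditional P(y | x)
    return rects

counts = {"treated": {"yes": 30, "no": 10},
          "control": {"yes": 20, "no": 40}}
rects = product_plot_rects(counts)
```

    Mosaic plots, treemaps, and fluctuation diagrams then differ mainly in how such rectangles are arranged and nested.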

  16. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382
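    One standard way to index dyadic interdependence for exchangeable dyads (a common approach in this literature, not necessarily the specific models of this paper) is the double-entry, or pairwise, intraclass correlation: each dyad is entered twice with its members swapped, and an ordinary Pearson r is computed on the double-entered columns.

```python
from statistics import fmean

def pairwise_intraclass_r(dyads):
    """Double-entry (pairwise) intraclass correlation for exchangeable
    dyads: every (a, b) pair is entered twice, once in each order, and
    Pearson's r is computed on the double-entered columns."""
    xs, ys = [], []
    for a, b in dyads:
        xs += [a, b]
        ys += [b, a]
    mx, my = fmean(xs), fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r_similar = pairwise_intraclass_r([(1, 1), (2, 2), (5, 5)])  # identical members
r_opposite = pairwise_intraclass_r([(1, 2), (2, 1)])         # reversed members
```

    Identical dyad members give r = 1; systematically opposite members drive r negative.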

  17. What's in a Name?

    NASA Astrophysics Data System (ADS)

    Bonneau, Joseph; Just, Mike; Matthews, Greg

    We study the efficiency of statistical attacks on human authentication systems relying on personal knowledge questions. We adapt techniques from guessing theory to measure security against a trawling attacker attempting to compromise a large number of strangers' accounts. We then examine a diverse corpus of real-world statistical distributions for likely answer categories such as the names of people, pets, and places and find that personal knowledge questions are significantly less secure than graphical or textual passwords. We also demonstrate that statistics can be used to increase security by proactively shaping the answer distribution to lower the prevalence of common responses.
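    The trawling attacker's advantage is easy to quantify: if answers follow a known population distribution, guessing the β most common answers against every account succeeds, in expectation, with probability equal to the top-β probability mass. A minimal sketch with invented distributions:

```python
def trawl_success(answer_probs, guesses):
    """Expected fraction of accounts a trawling attacker compromises by
    trying the `guesses` most common answers against every account,
    given the population distribution over answers."""
    top = sorted(answer_probs.values(), reverse=True)[:guesses]
    return sum(top)

# Invented distributions for illustration: a skewed pet-name distribution
# versus a uniform one over the same number of answers.
skewed = {"max": 0.25, "buddy": 0.20, "rex": 0.15, "daisy": 0.10,
          "coco": 0.10, "rocky": 0.10, "luna": 0.10}
uniform = {name: 1 / 7 for name in skewed}
```

    The gap between the two calls below is exactly the security lost to skew, which is what proactively flattening the answer distribution recovers.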

  18. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
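    The abstract does not fix a functional form, so as an illustration only, POD is often modelled as a logistic curve in log concentration; the sketch below evaluates such a curve and its midpoint (the concentration with 50% detection probability), with parameters invented for the example.

```python
import math

def pod(conc, b0, b1):
    """Probability of detection at concentration `conc` under an assumed
    logistic curve in log10 concentration -- one common parametric
    choice; the POD framework itself does not mandate this form."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log10(conc))))

def lod50(b0, b1):
    """Concentration at which POD = 0.5 (the curve's midpoint)."""
    return 10.0 ** (-b0 / b1)

b0, b1 = 2.0, 4.0            # illustrative parameters, not fitted data
midpoint = lod50(b0, b1)
```

    Plotting pod() over a concentration range for a candidate and a reference method gives exactly the kind of graphical comparison the model is designed to support.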

  19. Wood Products Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.

  20. A web-based relational database for monitoring and analyzing mosquito population dynamics.

    PubMed

    Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric

    2008-07-01

    Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
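    The normalization step described above (counts divided by the number of traps running, keyed by week of year) can be sketched as follows; the column names are hypothetical, since the abstract does not document the export schema.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export rows; the Iowa database's actual schema is not
# documented in this abstract.
RAW = """week,species,traps_running,count
23,Culex pipiens,10,340
23,Aedes vexans,10,120
24,Culex pipiens,8,400
"""

def per_trap_index(csv_text):
    """Counts normalized by the number of traps running, keyed by week
    of year -- the comparability step the abstract describes."""
    index = defaultdict(dict)
    for row in csv.DictReader(io.StringIO(csv_text)):
        index[int(row["week"])][row["species"]] = (
            int(row["count"]) / int(row["traps_running"]))
    return dict(index)

idx = per_trap_index(RAW)
```

    Normalizing by effort is what makes week 24 (8 traps) comparable with week 23 (10 traps) across years.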

  1. Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft

    NASA Technical Reports Server (NTRS)

    Matusow, Carla

    1999-01-01

    As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphics' Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. (5) Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products.
Additionally, AutoProducts has been designed as a mission-independent tool and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool addresses many of the concerns associated with flight dynamics product generation. Although AutoProducts required a significant development effort because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and improved product reliability. In addition, user satisfaction is significantly improved, and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool, including its graphical interface and operational capabilities.
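    The sequencing behaviour described above can be illustrated with a generic pipeline runner; the step names, artifacts, and interfaces below are invented for illustration, since AutoProducts' internals are not described in detail here.

```python
def run_pipeline(steps):
    """Run product-generation steps in order, handing each step the
    artifacts produced by the steps it declares as prerequisites.
    Step names and signatures are invented for illustration."""
    artifacts = {}
    for name, func, needs in steps:
        inputs = {k: artifacts[k] for k in needs}  # KeyError = bad sequencing
        artifacts[name] = func(inputs)
    return artifacts

steps = [
    ("ephemeris", lambda _: "ephemeris.dat", []),
    ("ground_track", lambda a: f"track({a['ephemeris']})", ["ephemeris"]),
    ("report", lambda a: f"report({a['ground_track']})", ["ground_track"]),
]
artifacts = run_pipeline(steps)
```

    Encoding the prerequisites explicitly is what removes the manual filename entry and run-order errors the paper lists as operational concerns.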

  2. Visualization of fluid dynamics at NASA Ames

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1989-01-01

    The hardware and software currently used for visualization of fluid dynamics at NASA Ames is described. The software includes programs to create scenes (for example particle traces representing the flow over an aircraft), programs to interactively view the scenes, and programs to control the creation of video tapes and 16mm movies. The hardware includes high performance graphics workstations, a high speed network, digital video equipment, and film recorders.

  3. Graphical Presentation of Patient-Treatment Interaction Elucidated by Continuous Biomarkers. Current Practice and Scope for Improvement.

    PubMed

    Shen, Yu-Ming; Le, Lien D; Wilson, Rory; Mansmann, Ulrich

    2017-01-09

    Biomarkers providing evidence for patient-treatment interaction are key in the development and practice of personalized medicine. Knowledge that a patient with a specific feature - as demonstrated through a biomarker - would have an advantage under a given treatment vs. a competing treatment can aid immensely in medical decision-making. Statistical strategies to establish such evidence from continuous biomarkers are complex, and their formal results are thus not easy to communicate. Good graphical representations would help to translate such findings for use in the clinical community. Although general guidelines on how to present figures in clinical reports are available, there remains little guidance for figures elucidating the role of continuous biomarkers in patient-treatment interaction (CBPTI). To address the current lack of comprehensive reviews or adequate guides on graphical presentation within this topic, our study proposes presentation principles for CBPTI plots. In order to understand current practice, we review the development of CBPTI methodology and how CBPTI plots are currently used in clinical research. The quality of a CBPTI plot is determined by how well the presentation provides key information for clinical decision-making. Several criteria for a good CBPTI plot are proposed, including general principles of visual display, use of units presenting absolute outcome measures, appropriate quantification of statistical uncertainty, correct display of benchmarks, and informative content for answering clinical questions, especially on the quantitative advantage for an individual patient with regard to a specific treatment. We examined the development of CBPTI methodology from 2000 to 2014, and reviewed how CBPTI plots were currently used in clinical research in six major clinical journals from 2013 to 2014 using the principle of theoretical saturation. Each CBPTI plot found was assessed for the appropriateness of its presentation and its clinical utility.
In our review, a total of seven methodological papers and five clinical reports used CBPTI plots, which we categorized into four types: those that distinguish the outcome effect for each treatment group; those that show the outcome differences between treatment groups (by either partitioning all individuals into subpopulations or modelling the functional form of the interaction); those that evaluate the proportion of population impact of the biomarker; and those that show the classification accuracy of the biomarker. The current practice of utilizing CBPTI plots in clinical reports suffers from methodological shortcomings: lack of presentation of statistical uncertainty, outcome measures scaled in relative rather than absolute units, incorrect use of benchmarks, and content that is uninformative for answering clinical questions. There is considerable scope for improvement in the graphical representation of CBPTI in clinical reports. The current challenge is to develop instruments for high-quality graphical plots which not only convey quantitative concepts to readers with limited statistical knowledge, but also facilitate medical decision-making.
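    The criterion of absolute outcome measures can be illustrated with a toy logistic model containing a treatment-by-biomarker interaction (coefficients invented for the example, not from any real trial): the quantity worth plotting is the treated-minus-control absolute risk difference as a function of the biomarker.

```python
import math

def risk(biomarker, treated, coef):
    """Event probability from an assumed logistic model with a
    treatment-by-biomarker interaction term (illustrative only)."""
    b0, b_t, b_x, b_tx = coef
    z = b0 + b_t * treated + b_x * biomarker + b_tx * treated * biomarker
    return 1.0 / (1.0 + math.exp(-z))

def absolute_risk_difference(biomarker, coef):
    """Treated-minus-control risk at a given biomarker value -- the
    absolute scale recommended for CBPTI displays."""
    return risk(biomarker, 1, coef) - risk(biomarker, 0, coef)

coef = (-1.0, -0.5, 0.3, 0.8)   # invented coefficients
curve = [(x, absolute_risk_difference(x, coef)) for x in range(5)]
```

    With these coefficients the curve crosses zero, which is the clinically decisive feature: below the crossover the control arm is preferable, above it the treatment is.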

  4. Children's identification of graphic symbols representing four basic emotions: comparison of Afrikaans-speaking and Sepedi-speaking children.

    PubMed

    DeKlerk, Hester M; Dada, Shakila; Alant, Erna

    2014-01-01

    Speech-language pathologists recommend graphic symbols for AAC (augmentative and alternative communication) users to facilitate communication, including labelling and expressing emotions. The purpose of the current study was to describe and compare how 5- to 6-year-old Afrikaans- and Sepedi-speaking children identify and choose graphic symbols to depict four basic emotions, specifically happy, sad, afraid, and angry. Ninety participants were asked to select the graphic symbol from a 16-matrix communication overlay that would represent the emotion in response to 24 vignettes. The results of the t-tests indicated that the differences between the two groups' selection of target symbols to represent the four emotions are statistically significant. The results of the study indicate that children from different language groups may not perceive graphic symbols in the same way. The Afrikaans-speaking participants chose target symbols to represent the target basic emotions more often than did the Sepedi-speaking participants. The most preferred symbols per emotion were identified, and these symbols were analysed in terms of the facial features that distinguish them. Readers of this article will (1) recognise the importance of expressing basic emotions for children, particularly those that use AAC, (2) identify the possible limitations of line drawings for expressing and labelling basic emotions in typically developing children, and (3) recognise the importance of cultural influences on recognition of basic emotions. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Visualising associations between paired ‘omics’ data sets

    PubMed Central

    2012-01-01

    Background Each omics platform is now able to generate a large amount of data. Genomics, proteomics, metabolomics, interactomics are compiled at an ever increasing pace and now form a core part of the fundamental systems biology framework. Recently, several integrative approaches have been proposed to extract meaningful information. However, these approaches lack visualisation outputs to fully unravel the complex associations between different biological entities. Results The multivariate statistical approaches ‘regularized Canonical Correlation Analysis’ and ‘sparse Partial Least Squares regression’ were recently developed to integrate two types of highly dimensional ‘omics’ data and to select relevant information. Using the results of these methods, we propose to revisit a few graphical outputs to better understand the relationships between two ‘omics’ data sets and to better visualise the correlation structure between the different biological entities. These graphical outputs include Correlation Circle plots, Relevance Networks and Clustered Image Maps. We demonstrate the usefulness of such graphical outputs on several biological data sets and further assess their biological relevance using gene ontology analysis. Conclusions Such graphical outputs are undoubtedly useful to aid the interpretation of these promising integrative analysis tools and will certainly help in addressing fundamental biological questions and understanding systems as a whole. Availability The graphical tools described in this paper are implemented in the freely available R package mixOmics and in its associated web application. PMID:23148523

  6. Rapid processing of PET list-mode data for efficient uncertainty estimation and data analysis

    NASA Astrophysics Data System (ADS)

    Markiewicz, P. J.; Thielemans, K.; Schott, J. M.; Atkinson, D.; Arridge, S. R.; Hutton, B. F.; Ourselin, S.

    2016-07-01

    In this technical note we propose a rapid and scalable software solution for the processing of PET list-mode data, which allows the efficient integration of list mode data processing into the workflow of image reconstruction and analysis. All processing is performed on the graphics processing unit (GPU), making use of streamed and concurrent kernel execution together with data transfers between disk and CPU memory as well as CPU and GPU memory. This approach leads to fast generation of multiple bootstrap realisations, and when combined with fast image reconstruction and analysis, it enables assessment of uncertainties of any image statistic and of any component of the image generation process (e.g. random correction, image processing) within reasonable time frames (e.g. within five minutes per realisation). This is of particular value when handling complex chains of image generation and processing. The software outputs the following: (1) estimate of expected random event data for noise reduction; (2) dynamic prompt and random sinograms of span-1 and span-11 and (3) variance estimates based on multiple bootstrap realisations of (1) and (2) assuming reasonable count levels for acceptable accuracy. In addition, the software produces statistics and visualisations for immediate quality control and crude motion detection, such as: (1) count rate curves; (2) centre of mass plots of the radiodistribution for motion detection; (3) video of dynamic projection views for fast visual list-mode skimming and inspection; (4) full normalisation factor sinograms. To demonstrate the software, we present an example of the above processing for fast uncertainty estimation of regional SUVR (standard uptake value ratio) calculation for a single PET scan of 18F-florbetapir using the Siemens Biograph mMR scanner.
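    The bootstrap idea, stripped of the GPU machinery, amounts to resampling list-mode events with replacement and recomputing the statistic of interest on each realisation; this CPU sketch with toy 'events' is illustrative only and far simpler than the streamed implementation in the note.

```python
import random

def bootstrap_ci(events, stat, n_boot=200, seed=0):
    """Nonparametric bootstrap over list-mode events: each realisation
    resamples the event list with replacement and recomputes the
    statistic, yielding an empirical 95% percentile interval."""
    rng = random.Random(seed)
    n = len(events)
    reps = sorted(
        stat([events[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot))
    return reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot)]

# Toy event stream: counts in a target vs a reference region, with a
# ratio statistic loosely analogous to an SUVR.
events = ["target"] * 600 + ["reference"] * 400
ratio = lambda ev: ev.count("target") / ev.count("reference")
lo, hi = bootstrap_ci(events, ratio)
```

    Because each realisation is independent, this loop is exactly the kind of work the note's GPU pipeline parallelises across bootstrap replicates.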

  7. Knowledge representation in space flight operations

    NASA Technical Reports Server (NTRS)

    Busse, Carl

    1989-01-01

    In space flight operations rapid understanding of the state of the space vehicle is essential. Representation of knowledge depicting space vehicle status in a dynamic environment presents a difficult challenge. The NASA Jet Propulsion Laboratory has pursued areas of technology associated with the advancement of spacecraft operations environment. This has led to the development of several advanced mission systems which incorporate enhanced graphics capabilities. These systems include: (1) Spacecraft Health Automated Reasoning Prototype (SHARP); (2) Spacecraft Monitoring Environment (SME); (3) Electrical Power Data Monitor (EPDM); (4) Generic Payload Operations Control Center (GPOCC); and (5) Telemetry System Monitor Prototype (TSM). Knowledge representation in these systems provides a direct representation of the intrinsic images associated with the instrument and satellite telemetry and telecommunications systems. The man-machine interface includes easily interpreted contextual graphic displays. These interactive video displays contain multiple display screens with pop-up windows and intelligent, high resolution graphics linked through context and mouse-sensitive icons and text.

  8. A flexible flight display research system using a ground-based interactive graphics terminal

    NASA Technical Reports Server (NTRS)

    Hatfield, J. J.; Elkins, H. C.; Batson, V. M.; Poole, W. L.

    1975-01-01

    Requirements and research areas for the air transportation system of the 1980s and 1990s were reviewed briefly to establish the need for a flexible flight display generation research tool. Specific display capabilities required by aeronautical researchers are listed and a conceptual system for providing these capabilities is described. The conceptual system uses a ground-based interactive graphics terminal driven by real-time radar and telemetry data to generate dynamic, experimental flight displays. These displays are scan converted to television format, processed, and transmitted to the cockpits of evaluation aircraft. The attendant advantages of a Flight Display Research System (FDRS) designed to employ this concept are presented. The detailed implementation of an FDRS is described. The basic characteristics of the interactive graphics terminal and supporting display electronic subsystems are presented and the resulting system capability is summarized. Finally, the system status and utilization are reviewed.

  9. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy-to-maintain, easy-to-code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups of 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedups we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
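    The structure that makes these methods GPU-friendly is per-thread independence: each thread owns its own RNG stream and accumulator, with a single reduction at the end. The sketch below emulates that layout sequentially in plain Python (the layout, not the speed, is the point).

```python
import random

def parallel_mc_pi(n_streams=64, draws_per_stream=10_000, seed=1):
    """Embarrassingly parallel Monte Carlo in the GPU style: every
    'thread' owns an independent RNG stream and its own accumulator,
    and a single reduction combines the per-stream estimates. The
    streams run sequentially here on the CPU."""
    estimates = []
    for s in range(n_streams):
        rng = random.Random(seed + s)          # one RNG stream per thread
        hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                   for _ in range(draws_per_stream))
        estimates.append(4.0 * hits / draws_per_stream)
    return sum(estimates) / n_streams          # final reduction

pi_hat = parallel_mc_pi()
```

    On a GPU, each iteration of the outer loop becomes one thread, which is where the reported 35- to 500-fold speedups come from.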

  10. Decision Aiding in Europe: Assessment Report,

    DTIC Science & Technology

    1983-05-26

    ...does not need extreme realism; rather, he needs a dynamic scene represen[tation]... combined to yield an attractive index of mental workload. In the same... graphic functions but are not specifically European. Cinematic... multicriteria aspirations are often contradictory and cannot be achieved simulta[neously]

  11. Designing for the Next Web.

    ERIC Educational Resources Information Center

    Bremser, Wayne

    1998-01-01

    Discusses how to choose from the available interactive graphic-design possibilities for the World Wide Web. Compatibility and appropriateness are discussed; and DHTML (Dynamic Hypertext Markup Language), Java, CSS (Cascading Style Sheets), plug-ins, ActiveX, and Push and channel technologies are described. (LRW)

  12. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…
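    The case-resampling idea behind the multivariate bootstrap is to resample whole rows, so the dependence among variables is preserved in every replicate. A minimal sketch for a percentile confidence interval on Pearson's r, using a synthetic sample:

```python
import random

def bootstrap_corr_ci(pairs, n_boot=500, seed=0):
    """Percentile bootstrap CI for Pearson's r, resampling whole
    (x, y) cases so the bivariate dependence structure is kept."""
    def pearson(ps):
        n = len(ps)
        mx = sum(x for x, _ in ps) / n
        my = sum(y for _, y in ps) / n
        cov = sum((x - mx) * (y - my) for x, y in ps)
        vx = sum((x - mx) ** 2 for x, _ in ps)
        vy = sum((y - my) ** 2 for _, y in ps)
        return cov / (vx * vy) ** 0.5
    rng = random.Random(seed)
    n = len(pairs)
    reps = sorted(
        pearson([pairs[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot))
    return reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot)]

# Synthetic bivariate sample: y is a deterministic noisy function of x.
pairs = [(x, 2 * x + (x * 7) % 5 - 2) for x in range(30)]
lo, hi = bootstrap_corr_ci(pairs)
```

    Because the interval comes from the empirical resampling distribution, it does not lean on the multivariate normality assumption the paper discusses.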

  13. Parametric Method Performance for Dynamic 3'-Deoxy-3'-18F-Fluorothymidine PET/CT in Epidermal Growth Factor Receptor-Mutated Non-Small Cell Lung Carcinoma Patients Before and During Therapy.

    PubMed

    Kramer, Gerbrand Maria; Frings, Virginie; Heijtel, Dennis; Smit, E F; Hoekstra, Otto S; Boellaard, Ronald

    2017-06-01

    The objective of this study was to validate several parametric methods for quantification of 3'-deoxy-3'-18F-fluorothymidine (18F-FLT) PET in advanced-stage non-small cell lung carcinoma (NSCLC) patients with an activating epidermal growth factor receptor mutation who were treated with gefitinib or erlotinib. Furthermore, we evaluated the impact of noise on accuracy and precision of the parametric analyses of dynamic 18F-FLT PET/CT to assess the robustness of these methods. Methods: Ten NSCLC patients underwent dynamic 18F-FLT PET/CT at baseline and 7 and 28 d after the start of treatment. Parametric images were generated using plasma-input Logan graphic analysis and 2 basis function-based methods: a 2-tissue-compartment basis function model (BFM) and spectral analysis (SA). Whole-tumor-averaged parametric pharmacokinetic parameters were compared with those obtained by nonlinear regression of the tumor time-activity curve using a reversible 2-tissue-compartment model with blood volume fraction. In addition, 2 statistically equivalent datasets were generated by countwise splitting the original list-mode data, each containing 50% of the total counts. Both new datasets were reconstructed, and parametric pharmacokinetic parameters were compared between the 2 replicates and the original data. Results: After the settings of each parametric method were optimized, distribution volumes (VT) obtained with Logan graphic analysis, BFM, and SA all correlated well with those derived using nonlinear regression at baseline and during therapy (R2 ≥ 0.94; intraclass correlation coefficient > 0.97). SA-based VT images were most robust to increased noise on a voxel level (repeatability coefficient, 16% vs. >26%). Yet BFM generated the most accurate K1 values (R2 = 0.94; intraclass correlation coefficient, 0.96). 
Parametric K1 data showed a larger variability in general; however, no differences were found in robustness between methods (repeatability coefficient, 80%-84%). Conclusion: Both BFM and SA can generate quantitatively accurate parametric 18F-FLT VT images in NSCLC patients before and during therapy. SA was more robust to noise, yet BFM provided more accurate parametric K1 data. We therefore recommend BFM as the preferred parametric method for analysis of dynamic 18F-FLT PET/CT studies; however, SA can also be used. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
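    Plasma-input Logan graphic analysis itself is simple enough to sketch (this is the generic textbook form, not the paper's BFM or SA implementations): regressing ∫Ct/Ct(t) on ∫Cp/Ct(t) gives VT as the slope. For a noise-free one-tissue system the relation is exactly linear, so the recovered slope matches the true K1/k2.

```python
import math

def logan_vt(t_grid, cp, ct, t_start):
    """Plasma-input Logan analysis: regress int_0^t Ct / Ct(t) on
    int_0^t Cp / Ct(t) over frames with t >= t_start; the slope
    estimates the distribution volume V_T."""
    def cumtrapz(y):
        out, acc = [0.0], 0.0
        for i in range(1, len(y)):
            acc += 0.5 * (y[i] + y[i - 1]) * (t_grid[i] - t_grid[i - 1])
            out.append(acc)
        return out
    icp, ict = cumtrapz(cp), cumtrapz(ct)
    xs = [icp[i] / ct[i] for i, t in enumerate(t_grid)
          if t >= t_start and ct[i] > 0]
    ys = [ict[i] / ct[i] for i, t in enumerate(t_grid)
          if t >= t_start and ct[i] > 0]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic noise-free one-tissue data with K1 = 0.2, k2 = 0.1 per min
# (true V_T = K1/k2 = 2.0) and plasma input Cp(t) = exp(-0.01 t).
K1, k2, lam = 0.2, 0.1, 0.01
t = [0.5 * i for i in range(241)]                      # 0..120 min
cp = [math.exp(-lam * ti) for ti in t]
ct = [K1 / (k2 - lam) * (math.exp(-lam * ti) - math.exp(-k2 * ti))
      for ti in t]
vt = logan_vt(t, cp, ct, t_start=30.0)
```

    The one-tissue model makes the line exact (intercept −1/k2); for two-tissue kinetics the plot is only asymptotically linear, which is why late frames are used.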

  14. Inhibition of Metalloprotease Botulinum Serotype A from a Pseudo-Peptide Binding Mode to a Small Molecule that is Active in Primary Neurons

    DTIC Science & Technology

    2007-02-16

    a modeled binding mode for inhibitor 2-mercapto-3-phenylpropionyl-RATKML (Ki = 330 nM) was generated, and required the use of a molecular dynamics conformer of the enzyme displaying the... Silicon Graphics Octane 2. Insight II (Accelrys, San Diego, CA) was used to build and inspect models. Energy refinement and molecular dynamics were performed using the

  15. Study of effect of magnetohydrodynamics and couple stress on steady and dynamic characteristics of porous exponential slider bearings

    NASA Astrophysics Data System (ADS)

    Hanumagowda, B. N.; Gonchigara, Thippeswamy; Santhosh Kumar, J.; MShiva Kumar, H.

    2018-04-01

    Exponential slider bearings with porous facing are analysed in this article. The modified Reynolds equation is derived for an exponential porous slider bearing with MHD effects and a couple stress fluid. Computed values of the steady film pressure, steady load capacity, dynamic stiffness, and damping coefficient are presented in graphical form. These quantities decrease with increasing values of the permeability parameter and increase with increasing values of the couple stress parameter and the Hartmann number.

  16. Mechanical, Thermal and Dynamic Mechanical Properties of PP/GF/xGnP Nanocomposites

    NASA Astrophysics Data System (ADS)

    Ashenai Ghasemi, F.; Ghorbani, A.; Ghasemi, I.

    2017-03-01

    The mechanical, thermal, and dynamic mechanical properties of ternary nanocomposites based on polypropylene, short glass fibers, and exfoliated graphene nanoplatelets were studied. To investigate the mechanical properties, uniaxial tensile and Charpy impact tests were carried out. To study the crystallinity of the compositions, a DSC test was performed. A dynamic mechanical analysis was used to characterize the storage modulus and loss factor (tan δ). The morphology of the composites was studied by a scanning electron microscope (SEM). The results obtained are presented in tables and graphics.

  17. IAC-1.5 - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and a database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automating data transfer among analysis programs. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation modules are supplied for building and viewing models. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. 
The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. 3) System dynamics - A DISCOS interface allows full use of this simulation program for either nonlinear time domain analysis or linear frequency domain analysis. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 5) Graphics - The graphics packages PLOT and MOSAIC are included in IAC. PLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc., while MOSAIC generates color raster displays of either tabular or array type data. Either DI3000 or PLOT-10 graphics software is required for full graphics capability. IAC is available by license for a period of 10 years to approved licensees. The licensed program product includes one complete set of supporting documentation. Additional copies of the documentation may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. 
The basic central memory requirement is approximately 750KB. IAC includes the executive system, graphics modules, a database, general utilities, and the interfaces to all analysis and controls programs described above. Source code is provided for the control programs ORACLS, SAMSAN, NBOD2, and DISCOS. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA.

  18. IAC-1.5 - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and a database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automating data transfer among analysis programs. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation modules are supplied for building and viewing models. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. 
The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. 3) System dynamics - A DISCOS interface allows full use of this simulation program for either nonlinear time domain analysis or linear frequency domain analysis. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 5) Graphics - The graphics packages PLOT and MOSAIC are included in IAC. PLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc., while MOSAIC generates color raster displays of either tabular or array type data. Either DI3000 or PLOT-10 graphics software is required for full graphics capability. IAC is available by license for a period of 10 years to approved licensees. The licensed program product includes one complete set of supporting documentation. Additional copies of the documentation may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. 
The basic central memory requirement is approximately 750KB. IAC includes the executive system, graphics modules, a database, general utilities, and the interfaces to all analysis and controls programs described above. Source code is provided for the control programs ORACLS, SAMSAN, NBOD2, and DISCOS. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. IAC was developed in 1985.

  19. PyCOOL — A Cosmological Object-Oriented Lattice code written in Python

    NASA Astrophysics Data System (ADS)

    Sainio, J.

    2012-04-01

    There are a number of different phenomena in the early universe that have to be studied numerically with lattice simulations. This paper presents a graphics processing unit (GPU) accelerated Python program called PyCOOL that solves the evolution of scalar fields in a lattice with very precise symplectic integrators. The program has been written with the intention to hit a sweet spot of speed, accuracy and user friendliness. This has been achieved by using the Python language with the PyCUDA interface to make a program that is easy to adapt to different scalar field models. In this paper we derive the symplectic dynamics that govern the evolution of the system and then present the implementation of the program in Python and PyCUDA. The functionality of the program is tested in a chaotic inflation preheating model, a single field oscillon case and in a supersymmetric curvaton model which leads to Q-ball production. We have also compared the performance of a consumer graphics card to a professional Tesla compute card in these simulations. We find that the program is not only accurate but also very fast. To further increase the usefulness of the program we have equipped it with numerous post-processing functions that provide useful information about the cosmological model. These include various spectra and statistics of the fields. The program can be additionally used to calculate the generated curvature perturbation. The program is publicly available under GNU General Public License at https://github.com/jtksai/PyCOOL. Some additional information can be found at http://www.physics.utu.fi/tiedostot/theory/particlecosmology/pycool/.
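
    The record above emphasizes evolving lattice scalar fields with symplectic integrators. As a rough illustration of why such schemes are used (this is not PyCOOL's actual implementation, and all names and parameters are illustrative), here is a minimal kick-drift-kick leapfrog, a second-order symplectic integrator, for a 1D periodic lattice with a free-field potential:

```python
import math

def leapfrog_scalar_field(phi, pi, dt, steps, m2=1.0, dx=1.0):
    """Evolve a 1D periodic lattice scalar field with the symplectic
    kick-drift-kick leapfrog for H = pi^2/2 + (grad phi)^2/2 + m2*phi^2/2."""
    n = len(phi)

    def force(phi):
        # discrete Laplacian minus the potential gradient m2 * phi
        return [(phi[(i + 1) % n] - 2 * phi[i] + phi[(i - 1) % n]) / dx**2
                - m2 * phi[i] for i in range(n)]

    for _ in range(steps):
        f = force(phi)
        pi = [p + 0.5 * dt * fi for p, fi in zip(pi, f)]   # half kick
        phi = [x + dt * p for x, p in zip(phi, pi)]        # full drift
        f = force(phi)
        pi = [p + 0.5 * dt * fi for p, fi in zip(pi, f)]   # half kick
    return phi, pi

def energy(phi, pi, m2=1.0, dx=1.0):
    """Total lattice energy; a symplectic scheme keeps this bounded."""
    n = len(phi)
    e = 0.0
    for i in range(n):
        grad = (phi[(i + 1) % n] - phi[i]) / dx
        e += 0.5 * pi[i]**2 + 0.5 * grad**2 + 0.5 * m2 * phi[i]**2
    return e
```

    The practical appeal, which the abstract's "very precise" refers to, is that the energy error of such an integrator oscillates instead of drifting, even over long runs.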

  1. Computing Science and Statistics: Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-20

    When the growth parameter, r, is set to 3.569, the population eventually oscillates about 16 fixed values. ... Examples include: kneading ingredients into a bread dough ... "fun statistics" ... To my colleagues I said in jest, "After all, regression analysis is clearly ..." My goal is to offer you the equivalent of a fortune cookie ... The cookie of the night reads: "You have good friends who will come to your aid ..." One problem that statisticians traditionally seem to have is that they ...
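
    The excerpt above quotes the logistic map near the onset of chaos. A few lines of Python reproduce the behavior it describes: after a long transient at a growth parameter near r = 3.569, the orbit settles onto a small fixed set of values (16 or 32 points, depending on the exact r in the period-doubling cascade). The starting point and iteration counts below are arbitrary choices, not taken from the record:

```python
def logistic_orbit(r, x0=0.2, transient=5000, keep=512):
    """Iterate the logistic map x -> r*x*(1-x), discard a transient,
    and return the distinct values the orbit settles onto."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 6))   # round so repeated cycle points collapse
    return sorted(set(orbit))
```

    For r just below the chaotic threshold (about 3.5699), the returned set is a short list of cycle points; for r above it, the same function returns hundreds of distinct values.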

  2. "These books give me life": Considering what happens when comics and graphic novels are welcomed into a middle school space

    NASA Astrophysics Data System (ADS)

    Dallacqua, Ashley Kaye

    This year-long, ethnographic study documents the use of comics and graphic novels as academic literature across the curriculum in a suburban middle school. Because the use of this medium in classrooms is relatively new, it is a process that has not been extensively documented. While comics and graphic novels can provide a complex and valuable experience for readers, they can also be challenging to both students and teachers. In particular, this dissertation documents the tensions that surfaced as comics and graphic novels were integrated into a curriculum. This study is situated in a middle school entrenched in neoliberal ideologies, with focuses on high-stakes testing, a standardized curriculum, and individual, rather than collaborative work. Yet, the faculty in this middle school was also inviting nontraditional texts into classrooms, and operating in tension with a neoliberal agenda. By focusing on teaching and learning literacy practices with comics and graphic novels and talk about those practices, this study also addresses negative assumptions and hesitancies around such texts being used for academic purposes. Participants included seventh grade teachers and students engaged in working with and talking about comics. This research considers how comics and graphic novels were welcomed into this school, as well as impacts around time and space, and positioning. All of these themes point back to how comics and graphic novels were working within and against normative structures in this school. This study is positioned to consider conventional literacy practices and how teaching and learning with comics and graphic novels supports and disrupts those practices. Serving as an example and a starting point for bringing this dynamic medium into classrooms, this study fills a significant gap, supporting and challenging traditional literacies practices and analyzing potential for new ways of operating in a school.

  3. Thermal Energy Storage Flight Experiment in Microgravity

    NASA Technical Reports Server (NTRS)

    Namkoong, David

    1992-01-01

    The Thermal Energy Storage Flight Experiment was designed to characterize void shape and location in LiF-based phase change materials in different energy storage configurations representative of advanced solar dynamic systems. Experiment goals and payload design are described in outline and graphic form.

  4. EVA prep

    NASA Image and Video Library

    2014-08-04

    ISS040-E-088730 (4 Aug. 2014) --- In the International Space Station's Harmony node, NASA astronauts Steve Swanson (foreground), Expedition 40 commander; and Reid Wiseman, flight engineer, perform a portable onboard computer Dynamic Onboard Ubiquitous Graphics (DOUG) software review in preparation for two upcoming U.S. spacewalks.

  5. National Agricultural Statistics Service (NASS): Agricultural Chemical Use

    Science.gov Websites

    Agricultural Chemical Use Database: a cooperative effort among USDA, the USDA Regional Pest Management Centers and the NSF Center for Integrated Pest Management (CIPM). All data available have been previously published by NASS and have been consolidated at ...

  6. Effectiveness of Simulation in a Hybrid and Online Networking Course.

    ERIC Educational Resources Information Center

    Cameron, Brian H.

    2003-01-01

    Reports on a study that compares the performance of students enrolled in two sections of a Web-based computer networking course: one utilizing a simulation package and the second utilizing a static, graphical software package. Analysis shows statistically significant improvements in performance in the simulation group compared to the…

  7. Assessment Atlas, 1983-84.

    ERIC Educational Resources Information Center

    Yosemite Community Coll. District, Modesto, CA.

    Designed to provide information of value in establishing a base for decision making in the Yosemite Community College District (YCCD), this assessment atlas graphically presents statistical data for the District as a whole, its two campuses, and YCCD Central Services for 1983-84. After an introduction to the use of the assessment atlas and…

  8. Assessment Atlas, 1982-83.

    ERIC Educational Resources Information Center

    Yosemite Community Coll. District, Modesto, CA.

    Designed to provide information of value in establishing a base for decisionmaking in the Yosemite Community College District (YCCD), this assessment atlas graphically presents statistical data on the District as a whole, its two campuses, and YCCD Central Services for 1982-83. After an introduction to the use of the assessment atlas and…

  9. The Middle East Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This book contains blank outline maps of the continent/region, tables and graphics depicting the size, population, resources and water, commodities , trade, cities, languages, religions, industry, energy, food and agriculture, demographic statistics, aspects of the national economies, and aspects of the national governments of the Middle East.…

  10. Child Health USA '96-'97.

    ERIC Educational Resources Information Center

    Health Resources and Services Administration (DHHS/PHS), Washington, DC. Maternal and Child Health Bureau.

    Intended to inform policymaking in the public and private sector, this booklet compiles secondary data for 53 health status indicators and service needs of America's children. The book provides both a graphic and textual summary for the data and addresses long-term trends, where applicable. Some statistics indicate the extent of progress toward…

  11. Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis

    ERIC Educational Resources Information Center

    Wind, Stefanie A.; Engelhard, George, Jr.

    2016-01-01

    Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement…
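
    Mokken scaling rests on Loevinger's scalability coefficient H, which compares observed Guttman errors against those expected if items were independent. The sketch below is a simplified reading of that coefficient for dichotomous items, not the software used in the article:

```python
def scalability_H(data):
    """Loevinger/Mokken scalability coefficient H for dichotomous items.
    data: list of response rows, each a list of 0/1 item scores."""
    n = len(data)
    k = len(data[0])
    # item popularities (proportion passing each item)
    p = [sum(row[j] for row in data) / n for j in range(k)]
    # order items from easiest (highest p) to hardest
    order = sorted(range(k), key=lambda j: -p[j])
    observed = expected = 0.0
    for a in range(k):
        for b in range(a + 1, k):
            i, j = order[a], order[b]   # item i is easier than item j
            # Guttman error: fail the easier item but pass the harder one
            observed += sum(1 for row in data if row[i] == 0 and row[j] == 1)
            expected += n * (1 - p[i]) * p[j]   # errors under independence
    return 1.0 - observed / expected
```

    H equals 1 for a perfect Guttman scalogram and shrinks toward 0 as response patterns depart from the cumulative model, which is the property Mokken analysis uses to judge scale quality.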

  12. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  13. Atlas of the African Child.

    ERIC Educational Resources Information Center

    Patel, Mahesh, Ed.

    Using data primarily from United Nations Statistical Yearbooks, but from other sources as well, this Atlas provides an overview, in graphical form, of issues affecting children in Africa. Some of the issues covered, such as immunization, affect children directly. Others, such as economic progress, are included because they form part of the…

  14. Use of microcomputers for planning and managing silviculture habitat relationships.

    Treesearch

    B.G. Marcot; R.S. McNay; R.E. Page

    1988-01-01

    Microcomputers aid in monitoring, modeling, and decision support for integrating objectives of silviculture and wildlife habitat management. Spreadsheets, data bases, statistics, and graphics programs are described for use in monitoring. Stand growth models, modeling languages, area and geobased information systems, and optimization models are discussed for use in...

  15. Appraisal of within- and between-laboratory reproducibility of non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: comparison of OECD TG429 performance standard and statistical evaluation.

    PubMed

    Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin

    2015-05-05

    Mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization testing, but the use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt(threshold) values obtained must fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size of ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with stimulation index (SI), the raw data for ECt calculation, produced from 3 laboratories. Descriptive statistics along with graphical representation of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of SI of a concurrent positive control, and the robustness of results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed within- and between-laboratory reproducibility criteria given by OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For those two labs that satisfied the within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
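
    The descriptive-statistics step the authors describe can be illustrated with a toy summary of positive-control SI replicates. The coefficient-of-variation cutoff below is purely illustrative; it is not an OECD TG429 criterion or a value from the paper:

```python
import statistics

def si_summary(si_values):
    """Descriptive statistics for stimulation-index (SI) replicates."""
    mean = statistics.fmean(si_values)
    sd = statistics.stdev(si_values)
    return {"n": len(si_values), "mean": mean, "sd": sd, "cv": sd / mean}

def within_lab_reproducible(si_values, cv_limit=0.3):
    """Toy screen: call a lab's positive control reproducible if the
    coefficient of variation of its SI replicates stays under cv_limit.
    (cv_limit is a hypothetical threshold for illustration only.)"""
    return si_summary(si_values)["cv"] < cv_limit
```

    A real evaluation, as in the paper, would pair such summaries with inferential tests (parametric and nonparametric) rather than a single cutoff.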

  16. Efficient molecular dynamics simulations with many-body potentials on graphics processing units

    NASA Astrophysics Data System (ADS)

    Fan, Zheyong; Chen, Wei; Vierimaa, Ville; Harju, Ari

    2017-09-01

    Graphics processing units have been extensively used to accelerate classical molecular dynamics simulations. However, there is much less progress on the acceleration of force evaluations for many-body potentials compared to pairwise ones. In the conventional force evaluation algorithm for many-body potentials, the force, virial stress, and heat current for a given atom are accumulated within different loops, which could result in write conflicts between different threads in a CUDA kernel. In this work, we provide a new force evaluation algorithm, which is based on an explicit pairwise force expression for many-body potentials derived recently (Fan et al., 2015). In our algorithm, the force, virial stress, and heat current for a given atom can be accumulated within a single thread, free of write conflicts. We discuss the formulations and algorithms and evaluate their performance. A new open-source code, GPUMD, is developed based on the proposed formulations. For the Tersoff many-body potential, the double precision performance of GPUMD using a Tesla K40 card is equivalent to that of the LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) molecular dynamics code running with about 100 CPU cores (Intel Xeon CPU X5670 @ 2.93 GHz).
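
    The contrast the abstract draws, between scatter-style accumulation (which needs atomic writes on a GPU, since two threads may update the same atom) and a gather-style scheme in which each atom's thread accumulates only its own force, can be sketched in plain Python with a toy 1D spring force; the many-body and CUDA details are elided:

```python
def forces_scatter(pos, pairs, k=1.0):
    """Reference scheme: loop over pairs and write to both atoms via
    Newton's third law. On a GPU this pattern needs atomic operations,
    because two threads may write to the same force entry."""
    f = [0.0] * len(pos)
    for i, j in pairs:
        fij = -k * (pos[i] - pos[j])   # 1D spring force on atom i from j
        f[i] += fij
        f[j] -= fij                    # action-reaction write to atom j
    return f

def forces_gather(pos, neighbors, k=1.0):
    """Gather scheme: each atom (one 'thread') accumulates only its own
    force from its neighbor list, so every entry has a single writer --
    the write-conflict-free layout the abstract describes."""
    f = []
    for i, nbrs in enumerate(neighbors):
        fi = 0.0
        for j in nbrs:
            fi += -k * (pos[i] - pos[j])
        f.append(fi)
    return f
```

    The gather version does roughly twice the arithmetic (each pair is evaluated from both sides) but removes all shared writes, which is exactly the trade GPUMD's single-thread accumulation makes on a GPU.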

  17. Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool

    NASA Astrophysics Data System (ADS)

    Otero, Andrés; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa

    2013-05-01

    Benefits of dynamic and partial reconfigurable systems are increasingly accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or even included for the first time, the support they offer for the design of this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency. This is witnessed by the overhead introduced by the communication primitives, as well as by the inability to relocate reconfigurable modules, among others. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with those of existing commercial and academic tools. Moreover, a graphical user interface (GUI) is described for the first time in this work, with the aim of simplifying the design process and hiding low-level, device-dependent details from the system designer. The overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples, including both conventional slot-based architectures and mesh-type designs.

  18. Simulation of cooperating robot manipulators on a mobile platform

    NASA Technical Reports Server (NTRS)

    Murphy, Steve H.; Wen, John T.; Saridis, George N.

    1990-01-01

    The dynamic equations of motion for two manipulators holding a common object on a freely moving mobile platform are developed. The full dynamic interactions from arms to platform and arm-tip to arm-tip are included in the formulation. The development of the closed chain dynamics allows for the use of any solution for the open topological tree of base and manipulator links. In particular, because the system has 18 degrees of freedom, recursive solutions for the dynamic simulation become more promising for efficient calculations of the motion. Simulation of the system is accomplished through a MATLAB program, and the response is visualized graphically using the SILMA Cimstation.

  19. Pre- and postprocessing for reservoir simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, W.L.; Ingalls, L.J.; Prasad, S.J.

    1991-05-01

    This paper describes the functionality and underlying programming paradigms of Shell's simulator-related reservoir-engineering graphics system. This system includes the simulation postprocessing programs Reservoir Display System (RDS) and Fast Reservoir Engineering Displays (FRED), a hypertext-like on-line documentation system (DOC), and a simulator input preprocessor (SIMPLSIM). RDS creates displays of reservoir simulation results. These displays represent the areal or cross-section distribution of computed reservoir parameters, such as pressure, phase saturation, or temperature. Generation of these images at real-time animation rates is discussed. FRED facilitates the creation of plot files from reservoir simulation output. The use of dynamic memory allocation, asynchronous I/O, a table-driven screen manager, and mixed-language (FORTRAN and C) programming is detailed. DOC is used to create and access on-line documentation for the pre- and postprocessing programs and the reservoir simulators. DOC can be run by itself or can be accessed from within any other graphics or nongraphics application program. DOC includes a text editor, which is the basis for a reservoir simulation tutorial and greatly simplifies the preparation of simulator input. The use of sharable images, graphics, and the documentation file network is described. Finally, SIMPLSIM is a suite of programs that uses interactive graphics in the preparation of reservoir description data for input into reservoir simulators. The SIMPLSIM user-interface manager (UIM) and its graphic interface for reservoir description are discussed.

  20. EuroPhenome: a repository for high-throughput mouse phenotyping data

    PubMed Central

    Morgan, Hugh; Beck, Tim; Blake, Andrew; Gates, Hilary; Adams, Niels; Debouzy, Guillaume; Leblanc, Sophie; Lengger, Christoph; Maier, Holger; Melvin, David; Meziane, Hamid; Richardson, Dave; Wells, Sara; White, Jacqui; Wood, Joe; de Angelis, Martin Hrabé; Brown, Steve D. M.; Hancock, John M.; Mallon, Ann-Marie

    2010-01-01

    The broad aim of biomedical science in the postgenomic era is to link genomic and phenotype information to allow deeper understanding of the processes leading from genomic changes to altered phenotype and disease. The EuroPhenome project (http://www.EuroPhenome.org) is a comprehensive resource for raw and annotated high-throughput phenotyping data arising from projects such as EUMODIC. EUMODIC is gathering data from the EMPReSSslim pipeline (http://www.empress.har.mrc.ac.uk/) which is performed on inbred mouse strains and knock-out lines arising from the EUCOMM project. The EuroPhenome interface allows the user to access the data via the phenotype or genotype. It also allows the user to access the data in a variety of ways, including graphical display, statistical analysis and access to the raw data via web services. The raw phenotyping data captured in EuroPhenome is annotated by an annotation pipeline which automatically identifies statistically different mutants from the appropriate baseline and assigns ontology terms for that specific test. Mutant phenotypes can be quickly identified using two EuroPhenome tools: PhenoMap, a graphical representation of statistically relevant phenotypes, and mining for a mutant using ontology terms. To assist with data definition and cross-database comparisons, phenotype data is annotated using combinations of terms from biological ontologies. PMID:19933761
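
    The annotation pipeline described above automatically flags mutants that differ statistically from a matched baseline. A toy stand-in using Welch's t statistic is sketched below; the test choice and threshold are illustrative assumptions, not EuroPhenome's actual rule:

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples of measurements."""
    ma, mb = statistics.fmean(sample_a), statistics.fmean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se = math.sqrt(va / len(sample_a) + vb / len(sample_b))
    return (ma - mb) / se

def flag_mutant(mutant_values, baseline_values, t_threshold=3.0):
    """Toy annotation step: flag a mutant line whose phenotype
    measurements deviate from the matched baseline strain.
    (t_threshold is a hypothetical cutoff for illustration.)"""
    return abs(welch_t(mutant_values, baseline_values)) > t_threshold
```

    In a real pipeline the flagged result would then be mapped to an ontology term for the specific phenotyping test, as the record describes.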

  1. Graphical Modeling Meets Systems Pharmacology.

    PubMed

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failure in systems projects (including systems pharmacology) is poor communication and differing expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the involved players is a boost to success. We present bStyle, a modeling tool that adopts a graphical language close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology.

  3. A computer graphics based model for scattering from objects of arbitrary shapes in the optical region

    NASA Technical Reports Server (NTRS)

    Goel, Narendra S.; Rozehnal, Ivan; Thompson, Richard L.

    1991-01-01

    A computer-graphics-based model, named DIANA, is presented for generation of objects of arbitrary shape and for calculating bidirectional reflectances and scattering from them, in the visible and infrared region. The computer generation is based on a modified Lindenmayer system approach which makes it possible to generate objects of arbitrary shapes and to simulate their growth, dynamics, and movement. Rendering techniques are used to display an object on a computer screen with appropriate shading and shadowing and to calculate the scattering and reflectance from the object. The technique is illustrated with scattering from canopies of simulated corn plants.
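
    DIANA's object generation builds on a modified Lindenmayer system. The core of any L-system is parallel string rewriting, sketched here with a hypothetical bracketed branching rule (the actual DIANA rules and turtle interpretation are not given in the record):

```python
def lsystem(axiom, rules, iterations):
    """Apply Lindenmayer-system production rules to every symbol in
    parallel, once per iteration; symbols without a rule are copied."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Illustrative branching grammar: F draws a segment, [ and ] push/pop
# the turtle state, + turns. Each pass replaces every F simultaneously.
BRANCH_RULES = {"F": "F[+F]F"}
```

    Repeated rewriting like this is what lets such models "simulate growth" of plant-like objects: each iteration elaborates every branch simultaneously, and a renderer then interprets the string geometrically.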

  4. Combined effects of heat and mass transfer to magneto hydrodynamics oscillatory dusty fluid flow in a porous channel

    NASA Astrophysics Data System (ADS)

    Govindarajan, A.; Vijayalakshmi, R.; Ramamurthy, V.

    2018-04-01

    The main aim of this article is to study the combined effects of heat and mass transfer on radiative Magneto Hydro Dynamics (MHD) oscillatory flow of an optically thin dusty fluid in a saturated porous medium channel. Based on certain assumptions, the momentum, energy, and concentration equations are obtained. The governing equations are non-dimensionalised, simplified and solved analytically. Closed-form analytical solutions for the velocity, temperature, and concentration profiles are obtained. Numerical computations are presented graphically to show the salient features of various physical parameters. The shear stress, the rate of heat transfer and the rate of mass transfer are also presented graphically.

  5. Letter from the Editor in Chief.

    PubMed

    Levinson Md, Mark M

    2016-10-21

    Twenty years ago, I presented a new vision for medical publishing. The Heart Surgery Forum was inaugurated in August of 1995 as a multimedia scientific publication communicating over the new "Information Highway" known as the graphical Web.  In the early days of HTML and HTTP, new ideas could evolve and disseminate quickly to a community that spanned the globe. The HSF began as a "Labor of Love" for my profession and my colleagues as a dynamic tool for the betterment of themselves and their patients. Included in the original HSF Web site was a novel means to present interesting cases using color photos, movies, text and graphics, which was groundbreaking in 1995.

  6. Real-time dynamic display of registered 4D cardiac MR and ultrasound images using a GPU

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Huang, X.; Eagleson, R.; Guiraudon, G.; Peters, T. M.

    2007-03-01

    In minimally invasive image-guided surgical interventions, different imaging modalities, such as magnetic resonance imaging (MRI), computed tomography (CT), and real-time three-dimensional (3D) ultrasound (US), can provide complementary, multi-spectral image information. Multimodality dynamic image registration is a well-established approach that permits real-time diagnostic information to be enhanced by placing lower-quality real-time images within a high quality anatomical context. For the guidance of cardiac procedures, it would be valuable to register dynamic MRI or CT with intraoperative US. However, in practice, either the high computational cost prohibits such real-time visualization of volumetric multimodal images in a real-world medical environment, or else the resulting image quality is not satisfactory for accurate guidance during the intervention. Modern graphics processing units (GPUs) provide the programmability, parallelism and increased computational precision to begin to address this problem. In this work, we first outline our research on dynamic 3D cardiac MR and US image acquisition, real-time dual-modality registration and US tracking. Then we describe image processing and optimization techniques for 4D (3D + time) cardiac image real-time rendering. We also present our multimodality 4D medical image visualization engine, which directly runs on a GPU in real-time by exploiting the advantages of the graphics hardware. In addition, techniques such as multiple transfer functions for different imaging modalities, dynamic texture binding, advanced texture sampling and multimodality image compositing are employed to facilitate the real-time display and manipulation of the registered dual-modality dynamic 3D MR and US cardiac datasets.

  7. A Set of Handwriting Features for Use in Automated Writer Identification.

    PubMed

    Miller, John J; Patterson, Robert Bradley; Gantz, Donald T; Saunders, Christopher P; Walch, Mark A; Buscaglia, JoAnn

    2017-05-01

    A writer's biometric identity can be characterized through the distribution of physical feature measurements ("writer's profile"); a graph-based system that facilitates the quantification of these features is described. To accomplish this quantification, handwriting is segmented into basic graphical forms ("graphemes"), which are "skeletonized" to yield the graphical topology of the handwritten segment. The graph-based matching algorithm compares the graphemes first by their graphical topology and then by their geometric features. Graphs derived from known writers can be compared against graphs extracted from unknown writings. The process is computationally intensive and relies heavily upon statistical pattern recognition algorithms. This article focuses on the quantification of these physical features and the construction of the associated pattern recognition methods for using the features to discriminate among writers. The graph-based system described in this article has been implemented in a highly accurate and approximately language-independent biometric recognition system of writers of cursive documents. © 2017 American Academy of Forensic Sciences.
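
    The two-stage comparison described above (topology first, then geometry) can be sketched in a toy form. The representation below — a sorted degree sequence as the topological summary, sorted edge lengths as the geometric features, and a fixed tolerance — is an illustrative assumption, not the published system:

```python
# Toy two-stage grapheme comparison: screen by graphical topology, then
# score by geometric features. All data and thresholds are hypothetical.

def degree_sequence(edges):
    """Sorted vertex degrees of an undirected skeleton graph."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sorted(deg.values())

def graphemes_match(a, b, tol=0.2):
    """Stage 1: topology must agree exactly; stage 2: geometry within tol."""
    edges_a, lengths_a = a
    edges_b, lengths_b = b
    if degree_sequence(edges_a) != degree_sequence(edges_b):
        return False
    if len(lengths_a) != len(lengths_b):
        return False
    return all(abs(x - y) <= tol
               for x, y in zip(sorted(lengths_a), sorted(lengths_b)))

# A looped form with a tail (an 'e'-like grapheme) from two writers, plus a
# simple two-stroke grapheme with a different topology:
writer1 = ([(0, 1), (1, 2), (2, 0), (2, 3)], [1.0, 1.1, 0.9, 0.5])
writer2 = ([(5, 6), (6, 7), (7, 5), (5, 8)], [1.05, 1.0, 0.95, 0.6])
stroke = ([(0, 1), (1, 2)], [1.0, 1.0])

print(graphemes_match(writer1, writer2), graphemes_match(writer1, stroke))
```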

  8. oneChannelGUI: a graphical interface to Bioconductor tools, designed for life scientists who are not familiar with R language.

    PubMed

    Sanges, Remo; Cordero, Francesca; Calogero, Raffaele A

    2007-12-15

    OneChannelGUI is an add-on Bioconductor package providing a new set of functions that extend the capability of the affylmGUI package. This library provides a graphical user interface (GUI) to Bioconductor libraries for quality control, normalization, filtering, statistical validation and data mining of single-channel microarrays. Affymetrix 3' expression (IVT) arrays as well as the new whole-transcript expression arrays, i.e. gene/exon 1.0 ST, are currently supported. oneChannelGUI is available for most platforms on which R runs, i.e. Windows and Unix-like machines. http://www.bioconductor.org/packages/2.0/bioc/html/oneChannelGUI.html

  9. A Simple Graphical Method for Quantification of Disaster Management Surge Capacity Using Computer Simulation and Process-control Tools.

    PubMed

    Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco

    2015-02-01

    Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision.
This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
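
    The general approach — derive a benchmark and control limits for a metric from many simulation runs, then flag a group that falls outside them — can be sketched as follows. All numbers below are invented for illustration and are not the study's data:

```python
# Hedged sketch of benchmark derivation with simple process-control limits
# (hypothetical LOS values, not the published derivation-phase results).
import statistics

def control_limits(benchmark_runs, k=3.0):
    """Center line and lower/upper control limits from derivation-phase runs."""
    center = statistics.mean(benchmark_runs)
    sigma = statistics.stdev(benchmark_runs)
    return center - k * sigma, center, center + k * sigma

# Median LOS (minutes) from triage to disposition in 20 hypothetical
# derivation-phase simulations:
derivation = [62, 58, 65, 60, 63, 59, 61, 64, 57, 66,
              60, 62, 58, 63, 61, 59, 64, 60, 62, 65]
lcl, center, ucl = control_limits(derivation)

application_group_los = 84          # hypothetical application-phase result
meets_standard = lcl <= application_group_los <= ucl
print(lcl, ucl, meets_standard)
```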

  10. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    PubMed

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform.
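
    The Wilcoxon signed-rank statistic at the heart of the tool is straightforward to compute. The sketch below is a minimal pure-Python version with made-up paired expression values (ArraySolver itself is Excel-based); it assumes no zero differences and no tied absolute differences, and uses the large-sample normal approximation:

```python
# Minimal Wilcoxon signed-rank test for a paired two-group comparison.
# Data are invented for illustration; assumes no zero or tied differences.
import math

def wilcoxon_signed_rank(x, y):
    diffs = [a - b for a, b in zip(x, y)]
    ranked = sorted((abs(d), d) for d in diffs)          # rank |d| ascending
    w_plus = sum(r for r, (_, d) in enumerate(ranked, start=1) if d > 0)
    w_minus = sum(r for r, (_, d) in enumerate(ranked, start=1) if d < 0)
    w = min(w_plus, w_minus)
    n = len(diffs)
    mean = n * (n + 1) / 4.0
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w - mean) / sd                                  # compare |z| to 1.96
    return w, z

group_a = [5.0, 4.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0]
group_b = [4.0, 6.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0]
w, z = wilcoxon_signed_rank(group_a, group_b)
print(w, z)
```

    For small n (the n ≤ 25 regime discussed above), exact critical values for W should be used instead of the normal approximation.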

  11. ArraySolver: An Algorithm for Colour-Coded Graphical Display and Wilcoxon Signed-Rank Statistics for Comparing Microarray Gene Expression Data

    PubMed Central

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann–Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform. PMID:18629036

  12. Probing the Statistical Validity of the Ductile-to-Brittle Transition in Metallic Nanowires Using GPU Computing.

    PubMed

    French, William R; Pervaje, Amulya K; Santos, Andrew P; Iacovella, Christopher R; Cummings, Peter T

    2013-12-10

    We perform a large-scale statistical analysis (>2000 independent simulations) of the elongation and rupture of gold nanowires, probing the validity and scope of the recently proposed ductile-to-brittle transition that occurs with increasing nanowire length [Wu et al. Nano Lett. 2012, 12, 910-914]. To facilitate a high-throughput simulation approach, we implement the second-moment approximation to the tight-binding (TB-SMA) potential within HOOMD-Blue, a molecular dynamics package which runs on massively parallel graphics processing units (GPUs). In a statistical sense, we find that the nanowires obey the ductile-to-brittle model quite well; however, we observe several unexpected features from the simulations that build on our understanding of the ductile-to-brittle transition. First, occasional failure behavior is observed that qualitatively differs from the model prediction; this is attributed to stochastic thermal motion of the Au atoms and occurs at temperatures as low as 10 K. In addition, we find that the ductile-to-brittle model, which was developed using classical dislocation theory, holds for nanowires as small as 3 nm in diameter. Finally, we demonstrate that the nanowire critical length is higher at 298 K relative to 10 K, a result that is not predicted by the ductile-to-brittle model. These results offer practical design strategies for adjusting nanowire failure and structure and also demonstrate that GPU computing is an excellent tool for studies requiring a large number of independent trajectories in order to fully characterize a system's behavior.

  13. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to changes in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
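
    The methodological point — that a distribution can look like a power law on log axes yet fail a Kolmogorov-Smirnov check — is easy to reproduce numerically. The sketch below (synthetic data, not the article's LFP recordings) fits a continuous power law by maximum likelihood and compares the KS distance for a genuine power-law sample against a lognormal "impostor":

```python
# Illustrative sketch: MLE power-law fit plus Kolmogorov-Smirnov distance.
# Synthetic data only; lognormal is a classic spurious power law.
import math
import random

def fit_alpha(data, xmin):
    """Continuous power-law MLE: alpha = 1 + n / sum(ln(x / xmin))."""
    return 1.0 + len(data) / sum(math.log(x / xmin) for x in data)

def ks_distance(data, xmin, alpha):
    """Largest gap between the empirical CDF and the fitted power-law CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        model = 1.0 - (x / xmin) ** (1.0 - alpha)
        d = max(d, abs((i + 1) / n - model), abs(i / n - model))
    return d

random.seed(0)
n, xmin, alpha_true = 2000, 1.0, 2.5

# Genuine power-law sample via inverse-transform sampling ...
power_law = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
             for _ in range(n)]
# ... and a lognormal impostor, rescaled so its minimum equals xmin:
raw = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
lognormal = [x / min(raw) for x in raw]

d_pl = ks_distance(power_law, xmin, fit_alpha(power_law, xmin))
d_ln = ks_distance(lognormal, xmin, fit_alpha(lognormal, xmin))
print(d_pl, d_ln)   # the impostor's KS distance is far larger
```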

  14. Visualizing and understanding l'Hopital's rule

    NASA Astrophysics Data System (ADS)

    Gordon, Sheldon P.

    2017-11-01

    This article uses dynamic software in Excel to demonstrate several ways in which graphical and numerical approaches can be introduced both to enhance student understanding of l'Hopital's Rule and to explain why the Rule actually works to give the 'right' answers. One of the approaches used is to visualize what is happening by examining the limits with both l'Hopital's Rule and the associated Taylor approximation to the function. The dynamic software allows students to experiment with the ideas.
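
    An illustrative limit (not necessarily one of the article's own examples) showing the two parallel views side by side — l'Hopital's Rule and the associated Taylor approximation:

```latex
% The same 0/0 limit resolved two ways:
\[
\lim_{x \to 0} \frac{\sin x}{x}
\;\overset{\text{l'Hopital}}{=}\;
\lim_{x \to 0} \frac{\cos x}{1} = 1,
\qquad
\frac{\sin x}{x} = \frac{x - x^{3}/6 + \cdots}{x}
= 1 - \frac{x^{2}}{6} + \cdots \;\longrightarrow\; 1 .
\]
```

    The Taylor view makes visible *why* the Rule works: near 0 the numerator behaves like its leading term, so the ratio of derivatives captures the limiting behavior.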

  15. The impact of numeric and graphic displays of ST-segment deviation levels on cardiologists' decisions of reperfusion therapy for patients with acute coronary occlusion.

    PubMed

    Nimmermark, Magnus O; Wang, John J; Maynard, Charles; Cohen, Mauricio; Gilcrist, Ian; Heitner, John; Hudson, Michael; Palmeri, Sebastian; Wagner, Galen S; Pahlm, Olle

    2011-01-01

    The study purpose is to determine whether numeric and/or graphic ST measurements added to the display of the 12-lead electrocardiogram (ECG) would influence cardiologists' decision to provide myocardial reperfusion therapy. Twenty ECGs with borderline ST-segment deviation during elective percutaneous coronary intervention and 10 controls before balloon inflation were included. Only 5 of the 20 ECGs during coronary balloon occlusion met the 2007 American Heart Association guidelines for ST-elevation myocardial infarction (STEMI). Fifteen cardiologists read 4 sets of these ECGs as the basis for a "yes/no" reperfusion therapy decision. Sets 1 and 4 were the same 12-lead ECGs alone. Set 2 also included numeric ST-segment measurements, and set 3 included both numeric and graphically displayed ST measurements ("ST Maps"). The mean (range) positive reperfusion decisions were 10.6 (2-15), 11.4 (1-19), 9.7 (2-14), and 10.7 (1-15) for sets 1 to 4, respectively. The accuracies of the observers for the 5 STEMI ECGs were 67%, 69%, and 77% for the standard format, the ST numeric format, and the ST graphic format, respectively. The improved detection rate (77% vs 67%) with addition of both numeric and graphic displays did achieve statistical significance (P < .025). The corresponding specificities for the 10 control ECGs were 85%, 79%, and 89%, respectively. In conclusion, a wide variation of reperfusion decisions was observed among clinical cardiologists, and their decisions were not altered by adding ST deviation measurements in numeric and/or graphic displays. Acute coronary occlusion detection rate was low for ECGs meeting STEMI criteria, and this was improved by adding ST-segment measurements in numeric and graphic forms. These results merit further study of the clinical value of this technique for improved acute coronary occlusion treatment decision support. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Central Limit Theorem: New SOCR Applet and Demonstration Activity

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas; Sanchez, Juana

    2011-01-01

    Modern approaches for information technology-based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and ties these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem). PMID:21833159
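
    The core demonstration the applet delivers interactively can be sketched in a few lines of plain Python (this is in the spirit of the SOCR activity, not its Java code): sample means of a skewed parent distribution have mean close to the population mean and standard deviation close to sigma divided by the square root of n:

```python
# Minimal CLT demonstration: sampling distribution of the mean for an
# exponential parent (mean 1, sd 1). Illustrative, not the SOCR applet.
import math
import random
import statistics

random.seed(1)
n, reps = 30, 20000
sample_means = [statistics.fmean(random.expovariate(1.0) for _ in range(n))
                for _ in range(reps)]

m = statistics.fmean(sample_means)
s = statistics.stdev(sample_means)
print(m, s, 1.0 / math.sqrt(n))   # s should be close to sigma / sqrt(n)
```

    Increasing n tightens the sampling distribution and makes its histogram look increasingly normal even though the parent is strongly skewed.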

  17. Central Limit Theorem: New SOCR Applet and Demonstration Activity.

    PubMed

    Dinov, Ivo D; Christou, Nicolas; Sanchez, Juana

    2008-07-01

    Modern approaches for information technology-based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and ties these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem).

  18. Dynamic simulation of road vehicle door window regulator mechanism of cross arm type

    NASA Astrophysics Data System (ADS)

    Miklos, I. Zs; Miklos, C.; Alic, C.

    2017-01-01

    The paper presents issues related to the dynamic simulation of a motor-driven operating mechanism of cross-arm type for the manipulation of road vehicle door windows, using Autodesk Inventor Professional software. The dynamic simulation of the mechanism involves 3D modelling, kinematic coupling, drive motion parameters and external loads, as well as the graphical display of the kinematic and kinetostatic results for the various elements and kinematic couplings of the mechanism under real operating conditions. Also, based on the results, the analysis of the mechanism components has been carried out using the finite element method.

  19. Techniques for animation of CFD results. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  20. Integration of Dynamic Models in Range Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    This work addresses the various model interactions in real time to make an efficient internet-based decision-making tool for Shuttle launch. The decision-making tool depends on the launch commit criteria coupled with physical models. Dynamic interaction between a wide variety of simulation applications and techniques, embedded algorithms, and data visualizations is needed to exploit the full potential of modeling and simulation. This paper also discusses in-depth details of web-based 3-D graphics and applications to range safety. The advantages of this dynamic model integration are secure accessibility and distribution of real-time information to other NASA centers.

  1. An outline of graphical Markov models in dentistry.

    PubMed

    Helfenstein, U; Steiner, M; Menghini, G

    1999-12-01

    In the usual multiple regression model there is one response variable and one block of several explanatory variables. In contrast, in reality there may be a block of several possibly interacting response variables one would like to explain. In addition, the explanatory variables may split into a sequence of several blocks, each block containing several interacting variables. The variables in the second block are explained by those in the first block; the variables in the third block by those in the first and the second block etc. During recent years methods have been developed allowing analysis of problems where the data set has the above complex structure. The models involved are called graphical models or graphical Markov models. The main result of an analysis is a picture, a conditional independence graph with precise statistical meaning, consisting of circles representing variables and lines or arrows representing significant conditional associations. The absence of a line between two circles signifies that the corresponding two variables are independent conditional on the presence of other variables in the model. An example from epidemiology is presented in order to demonstrate application and use of the models. The data set in the example has a complex structure consisting of successive blocks: the variable in the first block is year of investigation; the variables in the second block are age and gender; the variables in the third block are indices of calculus, gingivitis and mutans streptococci and the final response variables in the fourth block are different indices of caries. Since the statistical methods may not be easily accessible to dentists, this article presents them in an introductory form. 
Graphical models may be of great value to dentists in allowing analysis and visualisation of complex structured multivariate data sets consisting of a sequence of blocks of interacting variables and, in particular, several possibly interacting responses in the final block.
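
    The central idea — that the absence of a line between two variables means they are independent conditional on the rest — can be illustrated numerically with partial correlation. The example below is invented (it is not the dental data): in a chain X → Z → Y, X and Y are marginally correlated, but their partial correlation given Z is near zero, so no line would join X and Y in the graph:

```python
# Invented numerical illustration of conditional independence via partial
# correlation; not the study's data or software.
import math
import random

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def partial_corr(a, b, c):
    """Correlation of a and b after conditioning on c."""
    r_ab, r_ac, r_bc = corr(a, b), corr(a, c), corr(b, c)
    return (r_ab - r_ac * r_bc) / math.sqrt((1 - r_ac ** 2) * (1 - r_bc ** 2))

random.seed(2)
x = [random.gauss(0, 1) for _ in range(5000)]
z = [xi + random.gauss(0, 1) for xi in x]       # Z depends on X
y = [zi + random.gauss(0, 1) for zi in z]       # Y depends only on Z

print(corr(x, y), partial_corr(x, y, z))        # marginal large, partial ~ 0
```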

  2. Cool but counterproductive: interactive, Web-based risk communications can backfire.

    PubMed

    Zikmund-Fisher, Brian J; Dickson, Mark; Witteman, Holly O

    2011-08-25

    Paper-based patient decision aids generally present risk information using numbers and/or static images. However, limited psychological research has suggested that when people interactively graph risk information, they process the statistics more actively, making the information more available for decision making. Such interactive tools could potentially be incorporated in a new generation of Web-based decision aids. The objective of our study was to investigate whether interactive graphics detailing the risk of side effects of two treatments improve knowledge and decision making over standard risk graphics. A total of 3371 members of a demographically diverse Internet panel viewed a hypothetical scenario about two hypothetical treatments for thyroid cancer. Each treatment had a chance of causing 1 of 2 side effects, but we randomly varied whether one treatment was better on both dimensions (strong dominance condition), slightly better on only one dimension (mild dominance condition), or better on one dimension but worse on the other (trade-off condition) than the other treatment. We also varied whether respondents passively viewed the risk information in static pictograph (icon array) images or actively manipulated the information by using interactive Flash-based animations of "fill-in-the-blank" pictographs. Our primary hypothesis was that active manipulation would increase respondents' ability to recognize dominance (when available) and choose the better treatment. The interactive risk graphic conditions had significantly worse survey completion rates (1110/1695, 65.5% vs 1316/1659, 79.3%, P < .001) than the static image conditions. In addition, respondents using interactive graphs were less likely to recognize and select the dominant treatment option (234/380, 61.6% vs 343/465, 73.8%, P < .001 in the strong dominance condition). 
Interactivity, however visually appealing, can both add to respondent burden and distract people from understanding relevant statistical information. Decision-aid developers need to be aware that interactive risk presentations may create worse outcomes than presentations of static risk graphic formats.
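
    The completion-rate comparison reported above (1110/1695 vs 1316/1659) can be checked with a standard two-proportion z-test; the pooled z statistic falls far beyond the |z| = 3.29 cutoff corresponding to a two-sided P < .001:

```python
# Two-proportion z-test on the survey completion rates reported in the
# abstract (a standard check, not the authors' stated analysis method).
import math

def two_proportion_z(success1, n1, success2, n2):
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(1110, 1695, 1316, 1659)
print(z)   # large negative: the interactive condition completed less often
```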

  3. Cool but Counterproductive: Interactive, Web-Based Risk Communications Can Backfire

    PubMed Central

    Zikmund-Fisher, Brian J; Dickson, Mark; Witteman, Holly O

    2011-01-01

    Background: Paper-based patient decision aids generally present risk information using numbers and/or static images. However, limited psychological research has suggested that when people interactively graph risk information, they process the statistics more actively, making the information more available for decision making. Such interactive tools could potentially be incorporated in a new generation of Web-based decision aids. Objective: The objective of our study was to investigate whether interactive graphics detailing the risk of side effects of two treatments improve knowledge and decision making over standard risk graphics. Methods: A total of 3371 members of a demographically diverse Internet panel viewed a hypothetical scenario about two hypothetical treatments for thyroid cancer. Each treatment had a chance of causing 1 of 2 side effects, but we randomly varied whether one treatment was better on both dimensions (strong dominance condition), slightly better on only one dimension (mild dominance condition), or better on one dimension but worse on the other (trade-off condition) than the other treatment. We also varied whether respondents passively viewed the risk information in static pictograph (icon array) images or actively manipulated the information by using interactive Flash-based animations of “fill-in-the-blank” pictographs. Our primary hypothesis was that active manipulation would increase respondents’ ability to recognize dominance (when available) and choose the better treatment. Results: The interactive risk graphic conditions had significantly worse survey completion rates (1110/1695, 65.5% vs 1316/1659, 79.3%, P < .001) than the static image conditions. In addition, respondents using interactive graphs were less likely to recognize and select the dominant treatment option (234/380, 61.6% vs 343/465, 73.8%, P < .001 in the strong dominance condition).
    Conclusions: Interactivity, however visually appealing, can both add to respondent burden and distract people from understanding relevant statistical information. Decision-aid developers need to be aware that interactive risk presentations may create worse outcomes than presentations of static risk graphic formats. PMID:21868349

  4. Cassidy and Parmitano in U.S. Laboratory

    NASA Image and Video Library

    2013-06-25

    ISS036-E-012131 (25 June 2013) --- NASA astronaut Chris Cassidy (left) and European Space Agency astronaut Luca Parmitano, both Expedition 36 flight engineers, perform a Portable Onboard Computers (POC) Dynamic Onboard Ubiquitous Graphics (DOUG) software review in preparation for spacewalks scheduled for July 9 and July 16.

  5. Cassidy and Parmitano in U.S. Laboratory

    NASA Image and Video Library

    2013-06-25

    ISS036-E-012130 (25 June 2013) --- NASA astronaut Chris Cassidy (left) and European Space Agency astronaut Luca Parmitano, both Expedition 36 flight engineers, perform a Portable Onboard Computers (POC) Dynamic Onboard Ubiquitous Graphics (DOUG) software review in preparation for spacewalks scheduled for July 9 and July 16.

  6. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  7. Guam's forest resources, 2002.

    Treesearch

    Joseph A. Donnegan; Sarah L. Butler; Walter Grabowiecki; Bruce A. Hiserote; David. Limtiaco

    2004-01-01

    The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 46 forested plots on the island of Guam. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide a summary of Guam...

  8. Learning to Use an Alphabetic Writing System

    ERIC Educational Resources Information Center

    Treiman, Rebecca; Kessler, Brett

    2013-01-01

    Gaining facility with spelling is an important part of becoming a good writer. Here we review recent work on how children learn to spell in alphabetic writing systems. Statistical learning plays an important role in this process. Young children learn about some of the salient graphic characteristics of written texts and attempt to reproduce these…

  9. Statistical Abstracts, Fall 1990: Instructional Workload, Faculty, and I&DR Costs.

    ERIC Educational Resources Information Center

    State Univ. of New York, Albany. Central Staff Office of Institutional Research.

    This publication provides summary analytical reports and graphic displays from the official Course and Section Analysis (CASA) system concerning the instructional workload and the financial resources of academic departments offering courses during the fall 1990 semester within the State University of New York system. Included are six reports. The…

  10. Quantifying soil profile change caused by land use in central Missouri loess hillslopes

    Treesearch

    Samuel J. Indorante; John M. Kabrick; Brad D. Lee; Jon M. Maatta

    2014-01-01

    Three major challenges are present when studying anthropogenic impacts on soil profile properties: (i) site selection; (ii) sampling and modeling native and cultivated soil-landscape relationships; and (iii) graphically and statistically comparing native and cultivated sites to model soil profile changes. This study addressed those challenges by measuring and modeling...

  11. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study examined 30 first-year graphic design students' artworks, which were critically analyzed using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students' performances in their critical ability.…

  12. Palau's forest resources, 2003.

    Treesearch

    Joseph A. Donnegan; Sarah L. Butler; Olaf Kuegler; Brent J. Stroud; Bruce A. Hiserote; Kashgar. Rengulbai

    2007-01-01

    The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 54 forested plots on the islands in the Republic of Palau. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide...

  13. Asia & Oceania Today: A Reproducible Atlas. 1995 Revised Edition. World Eagle's Today Series.

    ERIC Educational Resources Information Center

    Independent Broadcasting Associates, Inc., Littleton, MA.

    This book contains blank outline maps of the continent or region, tables, and graphics depicting various aspects of Asia and Oceania. Sections of the book include: (1) "The Land and Population Figures"; (2) "Cities and Countries"; (3) "People: Languages, Literacy, Ethnic groups, Demographic Statistics and Projections,…

  14. Cholera, Rocket Ships and Tom's Veggies: Contemporary and Historical Ideas towards the Effective Communication of School Performance.

    ERIC Educational Resources Information Center

    Wainer, Howard

    2000-01-01

    Discusses three interlocking areas associated with effectively and accurately conveying information about school performance to the public: (1) graphical display; (2) nonrandomly gathered data; and (3) statistical adjustment. Illustrates these points with historical data, including test results from the National Assessment of Educational Progress…

  15. Infographics And Public Policy: Using Data Visualization To Convey Complex Information.

    PubMed

    Otten, Jennifer J; Cheng, Karen; Drewnowski, Adam

    2015-11-01

    Data visualization combines principles from psychology, usability, graphic design, and statistics to highlight important data in accessible and appealing formats. Doing so helps bridge knowledge producers with knowledge users, who are often inundated with information and increasingly pressed for time. Project HOPE—The People-to-People Health Foundation, Inc.

  16. Educational Opportunities through Federal Assistance Programs, Fiscal 1974.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus. Div. of Federal Assistance.

    This publication, the ninth annual report of the Division of Federal Assistance of the Ohio Department of Education, summarizes the work of the division during fiscal 1974 (July 1, 1973-June 30, 1974). In addition to presenting statistical, fiscal, and graphic data, the report is designed to help educators and other interested persons to:…

  17. Educational Opportunities through Federal Assistance Programs. Fiscal 1973.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus. Div. of Federal Assistance.

    This eighth annual report of the Division of Federal Assistance of the Ohio Department of Education summarizes the work of the division during fiscal 1973. In addition to presenting statistical, fiscal, and graphic data, the report is designed to help educators and other interested persons understand the various Federal programs administered by…

  18. Graphical correlation of gaging-station records

    USGS Publications Warehouse

    Searcy, James K.

    1960-01-01

    A gaging-station record is a sample of the rate of flow of a stream at a given site. This sample can be used to estimate the magnitude and distribution of future flows if the record is long enough to be representative of the long-term flow of the stream. The reliability of a short-term record for estimating future flow characteristics can be improved through correlation with a long-term record. Correlation can be either numerical or graphical, but graphical correlation of gaging-station records has several advantages. The graphical correlation method is described in a step-by-step procedure with an illustrative problem of simple correlation, three examples of multiple correlation (including removal of seasonal effect), and two examples of correlating one record with two other records. Except in the problem on removal of seasonal effect, the same group of stations is used in the illustrative problems. The purpose of the problems is to illustrate the method--not to show the improvement that can result from multiple correlation as compared with simple correlation. Hydrologic factors determine whether a usable relation exists between gaging-station records. Statistics is only a tool for evaluating and using an existing relation, and the investigator must be guided by a knowledge of hydrology.
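
    The record-extension idea above (relate a short record to a concurrent long record, then apply the relation to the long record's full period) can be sketched numerically; Searcy recommends a graphical fit, but a least-squares line in log space illustrates the same step. All flow values below are hypothetical:

```python
import numpy as np

def extend_record(short_concurrent, long_concurrent, long_full):
    """Fit log(short) = a + b*log(long) over the concurrent period, then
    apply the relation to the long station's full record. A numerical
    stand-in for the graphical correlation the report describes; a real
    analysis must first confirm a hydrologic basis for the relation."""
    # np.polyfit returns coefficients highest degree first: [slope, intercept]
    b, a = np.polyfit(np.log(long_concurrent), np.log(short_concurrent), 1)
    return np.exp(a + b * np.log(long_full))

# Hypothetical concurrent annual mean flows (cfs) at the two stations
short_c = np.array([120.0, 95.0, 150.0, 80.0, 110.0])
long_c = np.array([240.0, 190.0, 310.0, 160.0, 220.0])
# The long station also has two extra years of record
long_full = np.concatenate([long_c, [400.0, 130.0]])

estimated = extend_record(short_c, long_c, long_full)
```

    The estimated series for the two extra years lets flow statistics at the short-record station be computed over the longer base period.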

  19. Using a color-coded ambigraphic nucleic acid notation to visualize conserved palindromic motifs within and across genomes

    PubMed Central

    2014-01-01

    Background: Ambiscript is a graphically-designed nucleic acid notation that uses symbol symmetries to support sequence complementation, highlight biologically-relevant palindromes, and facilitate the analysis of consensus sequences. Although the original Ambiscript notation was designed to easily represent consensus sequences for multiple sequence alignments, the notation's black-on-white ambiguity characters are unable to reflect the statistical distribution of nucleotides found at each position. We now propose a color-augmented ambigraphic notation to encode the frequency of positional polymorphisms in these consensus sequences. Results: We have implemented this color-coding approach by creating an Adobe Flash® application (http://www.ambiscript.org) that shades and colors modified Ambiscript characters according to the prevalence of the encoded nucleotide at each position in the alignment. The resulting graphic helps viewers perceive biologically-relevant patterns in multiple sequence alignments by uniquely combining color, shading, and character symmetries to highlight palindromes and inverted repeats in conserved DNA motifs. Conclusion: Juxtaposing an intuitive color scheme over the deliberate character symmetries of an ambigraphic nucleic acid notation yields a highly-functional nucleic acid notation that maximizes information content and successfully embodies key principles of graphic excellence put forth by the statistician and graphic design theorist, Edward Tufte. PMID:24447494

  20. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data, obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
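
    The closing metric (the fraction of concentration ratios inside a specified range such as 0.8-1.2) is easy to state in code; the observed and predicted concentrations below are invented for illustration:

```python
import numpy as np

def fraction_within(observed, predicted, lo=0.8, hi=1.2):
    """Fraction of observed/predicted concentration ratios falling inside
    an arbitrarily specified acceptance range (0.8-1.2 as in the abstract)."""
    ratios = np.asarray(observed) / np.asarray(predicted)
    return np.mean((ratios >= lo) & (ratios <= hi))

# Hypothetical observed and model-predicted drug concentrations
obs = np.array([10.0, 8.0, 5.0, 2.5, 1.2])
pred = np.array([9.5, 8.5, 6.5, 2.4, 1.0])
f = fraction_within(obs, pred)  # 4 of the 5 ratios fall inside 0.8-1.2
```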

  1. Novel presentational approaches were developed for reporting network meta-analysis.

    PubMed

    Tan, Sze Huey; Cooper, Nicola J; Bujkiewicz, Sylwia; Welton, Nicky J; Caldwell, Deborah M; Sutton, Alexander J

    2014-06-01

    To present graphical tools for reporting network meta-analysis (NMA) results aiming to increase the accessibility, transparency, interpretability, and acceptability of NMA analyses. The key components of NMA results were identified based on recommendations by agencies such as the National Institute for Health and Care Excellence (United Kingdom). Three novel graphs were designed to amalgamate the identified components using familiar graphical tools such as the bar, line, or pie charts and adhering to good graphical design principles. Three key components for presentation of NMA results were identified, namely relative effects and their uncertainty, probability of an intervention being best, and between-study heterogeneity. Two of the three graphs developed present results (for each pairwise comparison of interventions in the network) obtained from both NMA and standard pairwise meta-analysis for easy comparison. They also include options to display the probability of being best, ranking statistics, heterogeneity, and prediction intervals. The third graph presents rankings of interventions in terms of their effectiveness to enable clinicians to easily identify "top-ranking" interventions. The graphical tools presented can display results tailored to the research question of interest, and targeted at a whole spectrum of users from the technical analyst to the nontechnical clinician. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Generalizing the extensibility of a dynamic geometry software

    NASA Astrophysics Data System (ADS)

    Herceg, Đorđe; Radaković, Davorka; Herceg, Dejana

    2012-09-01

    Plug-and-play visual components in a Dynamic Geometry Software (DGS) enable development of visually attractive, rich and highly interactive dynamic drawings. We are developing SLGeometry, a DGS that contains a custom programming language, a computer algebra system (CAS engine) and a graphics subsystem. The basic extensibility framework of SLGeometry supports dynamic addition of new functions from attribute-annotated classes that implement runtime metadata registration in code. We present a general plug-in framework for dynamic importing of arbitrary Silverlight user interface (UI) controls into SLGeometry at runtime. The CAS engine maintains a metadata storage that describes each imported visual component and enables two-way communication between the expressions stored in the engine and the UI controls on the screen.

  3. Fitting of dynamic recurrent neural network models to sensory stimulus-response data.

    PubMed

    Doruk, R Ozgur; Zhang, Kechen

    2018-06-02

    We present a theoretical study aiming at model fitting for sensory neurons. Conventional neural network training approaches are not applicable to this problem due to the lack of continuous data. Although the stimulus can be considered as a smooth time-dependent variable, the associated response will be a set of neural spike timings (roughly the instants of successive action potential peaks) that have no amplitude information. A recurrent neural network model can be fitted to such a stimulus-response data pair by using the maximum likelihood estimation method, where the likelihood function is derived from Poisson statistics of neural spiking. The universal approximation feature of the recurrent dynamical neuron network models allows us to describe excitatory-inhibitory characteristics of an actual sensory neural network with any desired number of neurons. The stimulus data are generated by a phased cosine Fourier series having a fixed amplitude and frequency but a randomly chosen phase. Various values of amplitude, stimulus component size, and sample size are applied in order to examine the effect of the stimulus on the identification process. Results are presented in tabular and graphical forms at the end of this text. In addition, to demonstrate the success of this research, the results are compared both to a study involving the same model, nominal parameters, and stimulus structure, and to another study based on different models.
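
    The likelihood derived from Poisson spiking statistics has a standard form: the sum of log firing rates at the observed spike times minus the integrated rate over the observation window. A minimal sketch, with an arbitrary cosine-modulated rate standing in for the recurrent network model:

```python
import numpy as np

def poisson_log_likelihood(rate_fn, spike_times, t_end, dt=1e-3):
    """Log-likelihood of spike timings under an inhomogeneous Poisson model:
    sum(log rate(t_i)) - integral of rate over [0, t_end]. rate_fn is any
    positive callable; in the paper it would come from the fitted recurrent
    network."""
    t = np.arange(0.0, t_end, dt)
    integral = np.sum(rate_fn(t)) * dt  # simple numerical integral of the rate
    return np.sum(np.log(rate_fn(np.asarray(spike_times)))) - integral

# Hypothetical rate: 20 Hz, cosine-modulated, strictly positive
rate = lambda t: 20.0 * (1.1 + np.cos(2 * np.pi * 2.0 * t))
ll = poisson_log_likelihood(rate, [0.1, 0.3, 0.55], 1.0)
```

    Maximizing this quantity over the network parameters (which shape the rate function) is the fitting step the abstract describes.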

  4. HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.

    PubMed

    Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J

    2016-06-03

    Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets is overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps, such as assay normalization and grouping, and experimental replicate quality control and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and also the direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome mapping data set and demonstrate a 200-fold improvement in the execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. Download HiQuant, sample data sets, and supporting documentation at http://hiquant.primesdb.eu.

  5. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, economic, social and environmental developments and to engage both statisticians and the public in such activities. Given this global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  6. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.

  7. Considerations for the design, analysis and presentation of in vivo studies.

    PubMed

    Ranstam, J; Cook, J A

    2017-03-01

    To describe, explain and give practical suggestions regarding important principles and key methodological challenges in the study design, statistical analysis, and reporting of results from in vivo studies. Pre-specifying endpoints and analysis, recognizing the common underlying assumption of statistically independent observations, performing sample size calculations, and addressing multiplicity issues are important parts of an in vivo study. A clear reporting of results and informative graphical presentations of data are other important parts. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  8. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
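
    The basic observed statistic, POI as the proportion of replicates identified, can be sketched with a confidence interval attached (the Wilson interval below is my choice for illustration, not necessarily the interval used in the report):

```python
import math

def poi_estimate(identified, replicates):
    """Probability of identification (POI): the proportion of replicates
    returning 'Identified' (binary result 1), with a Wilson 95% confidence
    interval on the underlying binomial proportion."""
    p = identified / replicates
    z = 1.96  # normal quantile for a 95% interval
    denom = 1 + z**2 / replicates
    center = (p + z**2 / (2 * replicates)) / denom
    half = z * math.sqrt(p * (1 - p) / replicates
                         + z**2 / (4 * replicates**2)) / denom
    return p, max(0.0, center - half), min(1.0, center + half)

# Hypothetical collaborator result: 11 of 12 replicates identified the target
p, lo, hi = poi_estimate(11, 12)
```

    Plotting such estimates against analyte concentration or nontarget fraction gives the response curves the abstract mentions.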

  9. Computational fluid dynamics uses in fluid dynamics/aerodynamics education

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1994-01-01

    The field of computational fluid dynamics (CFD) has advanced to the point where it can now be used for the purpose of fluid dynamics physics education. Because of the tremendous wealth of information available from numerical simulation, certain fundamental concepts can be efficiently communicated using an interactive graphical interrogation of the appropriate numerical simulation data base. In other situations, a large amount of aerodynamic information can be communicated to the student by interactive use of simple CFD tools on a workstation or even in a personal computer environment. The emphasis in this presentation is to discuss ideas for how this process might be implemented. Specific examples, taken from previous publications, will be used to highlight the presentation.

  10. An interactive graphics program to retrieve, display, compare, manipulate, curve fit, difference and cross plot wind tunnel data

    NASA Technical Reports Server (NTRS)

    Elliott, R. D.; Werner, N. M.; Baker, W. M.

    1975-01-01

    The Aerodynamic Data Analysis and Integration System (ADAIS) is described: a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that limited usage has already occurred with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity five times that of conventional manual methods of wind tunnel data analysis is routinely achieved using ADAIS. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.

  11. Impact of memory bottleneck on the performance of graphics processing units

    NASA Astrophysics Data System (ADS)

    Son, Dong Oh; Choi, Hong Jun; Kim, Jong Myon; Kim, Cheol Hong

    2015-12-01

    Recent graphics processing units (GPUs) can process general-purpose applications as well as graphics applications with the help of various user-friendly application programming interfaces (APIs) supported by GPU vendors. Unfortunately, utilizing the hardware resources in the GPU efficiently is a challenging problem, since the GPU architecture is totally different from the traditional CPU architecture. To solve this problem, many studies have focused on techniques for improving system performance using GPUs. In this work, we analyze GPU performance while varying GPU parameters such as the number of cores and clock frequency. According to our simulations, GPU performance can be improved by 125.8% and 16.2% on average as the number of cores and clock frequency increase, respectively. However, the performance saturates when memory bottleneck problems occur due to huge data requests to the memory. The performance of GPUs can be improved as the memory bottleneck is reduced by changing GPU parameters dynamically.

  12. AirShow 1.0 CFD Software Users' Guide

    NASA Technical Reports Server (NTRS)

    Mohler, Stanley R., Jr.

    2005-01-01

    AirShow is visualization post-processing software for Computational Fluid Dynamics (CFD). Upon reading binary PLOT3D grid and solution files into AirShow, the engineer can quickly see how hundreds of complex 3-D structured blocks are arranged and numbered. Additionally, chosen grid planes can be displayed and colored according to various aerodynamic flow quantities such as Mach number and pressure. The user may interactively rotate and translate the graphical objects using the mouse. The software source code was written in cross-platform Java, C++, and OpenGL, and runs on Unix, Linux, and Windows. The graphical user interface (GUI) was written using Java Swing. Java also provides multiple synchronized threads. The Java Native Interface (JNI) provides a bridge between the Java code and the C++ code where the PLOT3D files are read, the OpenGL graphics are rendered, and numerical calculations are performed. AirShow is easy to learn and simple to use. The source code is available for free from the NASA Technology Transfer and Partnership Office.

  13. X-Antenna: A graphical interface for antenna analysis codes

    NASA Technical Reports Server (NTRS)

    Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.

    1995-01-01

    This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.

  14. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    NASA Astrophysics Data System (ADS)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general purpose computing with inexpensive GPU hardware. These devices are more suited for emulating hardware autocorrelators than traditional CPU-based software applications by emphasizing parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
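
    The multi-tau scheme mentioned above (rebinning by 2 at each level while the lag spacing doubles) can be sketched on the CPU; a GPU version would map the same bins onto parallel threads. The photon-count trace here is synthetic Poisson data, so the normalized correlation should be flat near 1:

```python
import numpy as np

def multi_tau_autocorr(counts, points_per_level=8, levels=4):
    """Software multi-tau autocorrelator sketch: at each level the photon-
    count trace is rebinned by 2 and the lag spacing doubles, giving
    quasi-logarithmic lag coverage with few correlation points."""
    lags, g2 = [], []
    data, dt = np.asarray(counts, dtype=float), 1
    for _ in range(levels):
        mean = data.mean()
        for k in range(1, points_per_level + 1):
            if k >= len(data):
                break
            lags.append(k * dt)
            # Normalized correlation g2(k*dt) = <n(t) n(t+k*dt)> / <n>^2
            g2.append(np.mean(data[:-k] * data[k:]) / mean**2)
        # Rebin by 2 for the next, coarser level
        n = len(data) // 2
        data = data[:2 * n].reshape(n, 2).sum(axis=1) / 2.0
        dt *= 2
    return np.array(lags), np.array(g2)

rng = np.random.default_rng(0)
lags, g2 = multi_tau_autocorr(rng.poisson(5.0, size=4096))
```

    A real DLS or FCS signal would instead show g2 decaying from a plateau toward 1 on the diffusion timescale.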

  15. Project Physics Text 1, Concepts of Motion.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Fundamental concepts of motion are presented in this first unit of the Project Physics textbook. Descriptions of motion are made in connection with speeds, accelerations, and their graphical representation. Free-fall bodies are analyzed by using Aristotle's theory and Galileo's work. Dynamics aspects are discussed with a background of mass, force,…

  16. Group Mirrors to Support Interaction Regulation in Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Jermann, Patrick; Dillenbourg, Pierre

    2008-01-01

    Two experimental studies test the effect of group mirrors upon quantitative and qualitative aspects of participation in collaborative problem solving. Mirroring tools consist of a graphical representation of the group's actions which is dynamically updated and displayed to the collaborators. In addition, metacognitive tools display a standard for…

  17. 40 CFR 205.54-2 - Sound data acquisition system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    …meets the “fast” dynamic requirement of a precision sound level meter indicating meter system for the… (a) Systems employing tape recorders and graphic level recorders may be…

  18. Enhancing Computer-Based Lessons for Effective Speech Education.

    ERIC Educational Resources Information Center

    Hemphill, Michael R.; Standerfer, Christina C.

    1987-01-01

    Assesses the advantages of computer-based instruction on speech education. Concludes that, while it offers tremendous flexibility to the instructor--especially in dynamic lesson design, feedback, graphics, and artificial intelligence--there is no inherent advantage to the use of computer technology in the classroom, unless the student interacts…

  19. Technology-Based Content through Virtual and Physical Modeling: A National Research Study

    ERIC Educational Resources Information Center

    Ernst, Jeremy V.; Clark, Aaron C.

    2009-01-01

    Visualization is becoming more prevalent as an application in science, engineering, and technology related professions. The analysis of static and dynamic graphical visualization provides data solutions and understandings that go beyond traditional forms of communication. The study of technology-based content and the application of conceptual…

  20. On Endogenous Competitive Business Cycles

    DTIC Science & Technology

    1984-01-01

    …or equivalently by the map W. It may be worthwhile to end this section with a simple graphical illustration of the backward dynamics associated to…

  1. Impact of Assimilating Ocean Velocity Observations Inferred from Lagrangian Drifter Data Using the NCOM-4DVAR

    DTIC Science & Technology

    2014-04-01

    absolute dynamic height (ADH; in meters) from the Archiving, Validation, and Interpretation of Satellite Oceanographic data (AVISO) product [this...altimeter product was produced by the Segment Sol multimissions d’Altimetrie, d’Orbitographie et de localisation precise (Ssalto)/Data Unification and

  2. An Unexpected Influence on a Quadratic

    ERIC Educational Resources Information Center

    Davis, Jon D.

    2013-01-01

    Using technology to explore the coefficients of a quadratic equation can lead to an unexpected result. This article describes an investigation that involves sliders and dynamically linked representations. It guides students to notice the effect that the parameter "a" has on the graphical representation of a quadratic function in the form…

  3. Creativity--A Dynamic Approach to Industrial Education

    ERIC Educational Resources Information Center

    Markowitz, John, Jr.

    1974-01-01

    The author presents a number of unique programs and projects which have proved successful in one high school's woodworking and graphic arts classes in terms of motivating high student interest, growth in skills, good community relations--and a financial profit. The chief objective is self-discovery, the model is Outward Bound. (AJ)

  4. Man-in-the-control-loop simulation of manipulators

    NASA Technical Reports Server (NTRS)

    Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests a potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine requirements for man-in-the-control-loop simulation. High speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.

  5. Dynamical analysis of cigarette smoking model with a saturated incidence rate

    NASA Astrophysics Data System (ADS)

    Zeb, Anwar; Bano, Ayesha; Alzahrani, Ebraheem; Zaman, Gul

    2018-04-01

    In this paper, we consider a delayed smoking model in which the potential smokers are assumed to satisfy the logistic equation. We discuss the dynamical behavior of our proposed model in the form of delay differential equations (DDEs) and show conditions for asymptotic stability of the model in steady state. We also discuss the Hopf bifurcation analysis of the considered model. Finally, we use a nonstandard finite difference (NSFD) scheme to show the results graphically with the help of MATLAB.
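
    The logistic-growth assumption and the NSFD discretization mentioned above can be illustrated with a minimal sketch. The scheme below is a standard positivity-preserving NSFD discretization of the plain logistic equation, not the authors' full delayed smoking model, and all parameter values are illustrative:

```python
def nsfd_logistic(x0, r, K, h, steps):
    """Nonstandard finite difference (NSFD) scheme for x' = r*x*(1 - x/K).

    The nonlinear term is discretized non-locally (x_{n+1} * x_n), giving
    the update x_{n+1} = x_n*(1 + h*r) / (1 + h*r*x_n/K), which keeps
    iterates positive and convergent to the carrying capacity K for any
    step size h > 0 -- the key selling point of NSFD schemes.
    """
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x * (1.0 + h * r) / (1.0 + h * r * x / K))
    return xs

# Illustrative run: a small initial population grows toward K = 1.
traj = nsfd_logistic(x0=0.1, r=0.5, K=1.0, h=0.5, steps=200)
```

    Unlike a naive Euler step, this update cannot overshoot into negative populations even with a large h, which is why NSFD schemes are popular for epidemiological-style compartment models.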

  6. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    NASA Technical Reports Server (NTRS)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  7. Algorithmic commonalities in the parallel environment

    NASA Technical Reports Server (NTRS)

    Mcanulty, Michael A.; Wainer, Michael S.

    1987-01-01

    The ultimate aim of this project was to analyze procedures from substantially different application areas to discover what is either common or peculiar in the process of conversion to the Massively Parallel Processor (MPP). Three areas were identified: molecular dynamic simulation, production systems (rule systems), and various graphics and vision algorithms. To date, only selected graphics procedures have been investigated. They are the most readily available, and produce the most visible results. These include simple polygon patch rendering, raycasting against a constructive solid geometric model, and stochastic or fractal based textured surface algorithms. Only the simplest of conversion strategies, mapping a major loop to the array, has been investigated so far. It is not entirely satisfactory.

  8. Using the stereokinetic effect to convey depth - Computationally efficient depth-from-motion displays

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.

    1992-01-01

    Recent developments in microelectronics have encouraged the use of 3D data bases to create compelling volumetric renderings of graphical objects. However, even with the computational capabilities of current-generation graphical systems, real-time displays of such objects are difficult, particularly when dynamic spatial transformations are involved. In this paper we discuss a type of visual stimulus (the stereokinetic effect display) that is computationally far less complex than a true three-dimensional transformation but yields an equally compelling depth impression, often perceptually indiscriminable from the true spatial transformation. Several possible applications for this technique are discussed (e.g., animating contour maps and air traffic control displays so as to evoke accurate depth percepts).

  9. A SCILAB Program for Computing Rotating Magnetic Compact Objects

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We apply the so-called ``complex-plane iterative technique'' (CIT) to the computation of classical differentially rotating magnetic white dwarf and neutron star models. The program has been written in SCILAB (© INRIA-ENPC), a matrix-oriented high-level programming language, which can be downloaded free of charge from the site http://www-rocq.inria.fr/scilab. Due to the advanced capabilities of this language, the code is short and understandable. Highlights of the program are: (a) its time-saving character, (b) ease of use due to the built-in graphical user interface, and (c) easy interfacing with Fortran via online dynamic link. We interpret our numerical results in various ways by extensively using the graphics environment of SCILAB.

  10. A simplified application of the method of operators to the calculation of disturbed motions of an airplane

    NASA Technical Reports Server (NTRS)

    Jones, Robert T

    1937-01-01

    A simplified treatment of the application of Heaviside's operational methods to problems of airplane dynamics is given. Certain graphical methods and logarithmic formulas that lessen the amount of computation involved are explained. The problem representing a gust disturbance or control manipulation is taken up and it is pointed out that in certain cases arbitrary control manipulations may be dealt with as though they imposed specific constraints on the airplane, thus avoiding the necessity of any integration. The application of the calculations described in the text is illustrated by several examples chosen to show the use of the methods and the practicability of the graphical and logarithmic computations described.
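
    The logarithmic formulas Jones refers to build on Heaviside's expansion theorem. As a reminder (a standard statement of the theorem, not a formula quoted from the report): if the airplane's operational equation for a unit disturbance is $Z(p)\,x = 1$, with the characteristic polynomial $Z(p)$ having distinct nonzero roots $p_k$, the response is

```latex
x(t) \;=\; \frac{1}{Z(0)} \;+\; \sum_{k} \frac{e^{p_k t}}{p_k\,Z'(p_k)},
\qquad Z(p_k) = 0,\quad p_k \neq 0 \text{ distinct},
```

    so the whole motion reduces to evaluating $Z$ and $Z'$ at the roots, which is where graphical and logarithmic computation saves labor.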

  11. Novel graphical environment for virtual and real-world operations of tracked mobile manipulators

    NASA Astrophysics Data System (ADS)

    Chen, ChuXin; Trivedi, Mohan M.; Azam, Mir; Lassiter, Nils T.

    1993-08-01

    A simulation, animation, visualization and interactive control (SAVIC) environment has been developed for the design and operation of an integrated mobile manipulator system. This unique system possesses the abilities for (1) multi-sensor simulation, (2) kinematics and locomotion animation, (3) dynamic motion and manipulation animation, (4) transformation between real and virtual modes within the same graphics system, (5) ease in exchanging software modules and hardware devices between real and virtual world operations, and (6) interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.

  12. Applications of computer-graphics animation for motion-perception research

    NASA Technical Reports Server (NTRS)

    Proffitt, D. R.; Kaiser, M. K.

    1986-01-01

    The advantages and limitations of using computer-animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer-generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer-generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, it is assumed that the differences are irrelevant to the questions under study and that findings with computer-generated stimuli will generalize to natural events.

  13. Software Aids In Graphical Depiction Of Flow Data

    NASA Technical Reports Server (NTRS)

    Stegeman, J. D.

    1995-01-01

    Interactive Data Display System (IDDS) computer program is graphical-display program designed to assist in visualization of three-dimensional flow in turbomachinery. Grid and simulation data files in PLOT3D format required for input. Able to unwrap volumetric data cone associated with centrifugal compressor and display results in easy-to-understand two- or three-dimensional plots. IDDS provides majority of visualization and analysis capability for Integrated Computational Fluid Dynamics and Experiment (ICE) system. IDDS invoked from any subsystem, or used as stand-alone package of display software. Generates contour, vector, shaded, x-y, and carpet plots. Written in C language. Input file format used by IDDS is that of PLOT3D (COSMIC item ARC-12782).

  14. GPU acceleration for digitally reconstructed radiographs using bindless texture objects and CUDA/OpenGL interoperability.

    PubMed

    Abdellah, Marwan; Eldeib, Ayman; Owis, Mohamed I

    2015-01-01

    This paper features an advanced implementation of the X-ray rendering algorithm that harnesses the computing power of current commodity graphics processors to accelerate the generation of high-resolution digitally reconstructed radiographs (DRRs). The presented pipeline exploits the latest features of NVIDIA Graphics Processing Unit (GPU) architectures, mainly bindless texture objects and dynamic parallelism. The rendering throughput is substantially improved by exploiting the interoperability mechanisms between CUDA and OpenGL. The benchmarks of our optimized rendering pipeline reflect its capability of generating DRRs with resolutions of 2048² and 4096² at interactive and semi-interactive frame rates using an NVIDIA GeForce GTX 970 device.

  15. The Zombie Plot: A Simple Graphic Method for Visualizing the Efficacy of a Diagnostic Test.

    PubMed

    Richardson, Michael L

    2016-08-09

    One of the most important jobs of a radiologist is to pick the most appropriate imaging test for a particular clinical situation. Making a proper selection sometimes requires statistical analysis. The objective of this article is to introduce a simple graphic technique, an ROC plot that has been divided into zones of mostly bad imaging efficacy (ZOMBIE, hereafter referred to as the "zombie plot"), that transforms information about imaging efficacy from the numeric domain into the visual domain. The numeric rationale for the use of zombie plots is given, as are several examples of the clinical use of these plots. Two online calculators are described that simplify the process of producing a zombie plot.
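
    The idea of partitioning ROC space by diagnostic usefulness can be sketched as follows. This is a hypothetical illustration using the conventional likelihood-ratio cutoffs of 10 and 0.1 for "strong" evidence; the specific zone boundaries of Richardson's published plot may differ:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def in_weak_zone(sensitivity, specificity, strong_pos=10.0, strong_neg=0.1):
    """True if the test's ROC point falls in a 'mostly bad efficacy' region:
    too weak to rule a diagnosis in (LR+ below strong_pos) AND too weak to
    rule it out (LR- above strong_neg). Cutoffs here are illustrative, not
    those of the published zombie plot."""
    lr_pos, lr_neg = likelihood_ratios(sensitivity, specificity)
    return lr_pos < strong_pos and lr_neg > strong_neg
```

    For example, a test with 70% sensitivity and 70% specificity (LR+ ≈ 2.3, LR− ≈ 0.43) lands in the weak zone, while 95%/95% (LR+ = 19) does not.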

  16. Effect of balance training in older adults using Wii fit plus.

    PubMed

    Afridi, Ayesha; Malik, Arshad Nawaz; Ali, Shaukat; Amjad, Imran

    2018-03-01

    The Nintendo Wii Fit Plus is a type of virtual-reality exergaming with a graphical and auditory response system. A case series was conducted at Shifa Tameer-e-Millat University, Islamabad, from January to July 2016. Sixteen adults more than 60 years of age (7 males and 9 females) were recruited through convenience sampling. The specified Wii Fit Plus training was provided to all participants; the games included soccer heading, ski slalom, table tilt, and yoga. The Berg Balance Scale, Timed Up and Go test, and Functional Reach Test were administered before and after 6 weeks of treatment (4 days/week). Data were analysed with SPSS v20. The mean age of the sample was 67.56±7.29 years; 56% of the sample were female and 44% male. There were statistically significant differences between pre- and post-treatment Berg Balance, Timed Up and Go, and Functional Reach scores. In this case series, Wii Fit Plus training was effective in improving dynamic balance and mobility in older adults. This should be explored further in large trials.

  17. Biomathematical Description of Synthetic Peptide Libraries

    PubMed Central

    Trepel, Martin

    2015-01-01

    Libraries of randomised peptides displayed on phages or viral particles are essential tools in a wide spectrum of applications. However, there is only limited understanding of a library's fundamental dynamics and the influences of encoding schemes and sizes on their quality. Numeric properties of libraries, such as the expected number of different peptides and the library's coverage, have long been in use as measures of a library's quality. Here, we present a graphical framework of these measures together with a library's relative efficiency to help describe libraries in enough detail for researchers to plan new experiments in a more informed manner. In particular, these values allow us to answer, in a probabilistic fashion, the question of whether a specific library does indeed contain one of the "best" possible peptides. The framework is implemented in a web interface based on two packages, discreteRV and peptider, for the statistical software environment R. We further provide a user-friendly web interface called PeLiCa (Peptide Library Calculator, http://www.pelica.org), allowing scientists to plan and analyse their peptide libraries. PMID:26042419

  18. Orbital Debris Engineering Model (ORDEM) v.3

    NASA Technical Reports Server (NTRS)

    Matney, Mark; Krisko, Paula; Xu, Yu-Lin; Horstman, Matthew

    2013-01-01

    A model of the manmade orbital debris environment is required by spacecraft designers, mission planners, and others in order to understand and mitigate the effects of the environment on their spacecraft or systems. A manmade environment is dynamic, and can be altered significantly by intent (e.g., the Chinese anti-satellite weapon test of January 2007) or accident (e.g., the collision of the Iridium 33 and Cosmos 2251 spacecraft in February 2009). Engineering models are used to portray the manmade debris environment in Earth orbit. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical and statistical techniques have enabled the construction of this more comprehensive and sophisticated model. The primary output of this model is the flux [#debris/area/time] as a function of debris size and year. ORDEM may be operated in spacecraft mode or telescope mode. In the former case, an analyst defines an orbit for a spacecraft and "flies" the spacecraft through the orbital debris environment. In the latter case, an analyst defines a ground-based sensor (telescope or radar) in terms of latitude, azimuth, and elevation, and the model provides the number of orbital debris traversing the sensor's field of view. An upgraded graphical user interface (GUI) is integrated with the software. This upgraded GUI uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional flux as a function of debris size for chosen analysis orbits (or views), for example, to the more complex color-contoured two-dimensional (2D) directional flux diagrams in local spacecraft elevation and azimuth.

  19. Accelerated Molecular Dynamics Simulations with the AMOEBA Polarizable Force Field on Graphics Processing Units

    PubMed Central

    2013-01-01

    The accelerated molecular dynamics (aMD) method has recently been shown to enhance the sampling of biomolecules in molecular dynamics (MD) simulations, often by several orders of magnitude. Here, we describe an implementation of the aMD method for the OpenMM application layer that takes full advantage of graphics processing unit (GPU) computing. The aMD method is shown to work in combination with the AMOEBA polarizable force field (AMOEBA-aMD), allowing the simulation of long time-scale events with a polarizable force field. Benchmarks are provided to show that the AMOEBA-aMD method is efficiently implemented and produces accurate results in its standard parametrization. For the BPTI protein, we demonstrate that the protein structure described with AMOEBA remains stable even on the extended time scales accessed at high levels of acceleration. For the DNA repair metalloenzyme endonuclease IV, we show that the use of the AMOEBA force field is a significant improvement over fixed-charge models for describing the enzyme active site. The new AMOEBA-aMD method is publicly available (http://wiki.simtk.org/openmm/VirtualRepository) and promises to be interesting for studying complex systems that can benefit from both the use of a polarizable force field and enhanced sampling. PMID:24634618
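
    The aMD boost referred to above has a simple closed form (the widely used Hamelberg-Mongan-McCammon expression; the threshold E and tuning parameter alpha values below are illustrative, not the paper's parametrization):

```python
import math

def amd_boost(V, E, alpha):
    """aMD boost energy added to a potential V below threshold E:
    dV = (E - V)^2 / (alpha + E - V) for V < E, and 0 otherwise.
    The boosted surface V + dV stays below E, flattening barriers
    without altering regions above the threshold."""
    if V >= E:
        return 0.0
    return (E - V) ** 2 / (alpha + (E - V))

def reweight_factor(dV, kT=0.596):
    """Boltzmann factor exp(dV / kT) used to reweight boosted samples
    back to the canonical ensemble (kT in kcal/mol near 300 K)."""
    return math.exp(dV / kT)
```

    Reweighting with exp(dV/kT) is what lets canonical averages be recovered from the accelerated trajectory, at the cost of increased statistical noise when boosts are large.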

  20. Spins Dynamics in a Dissipative Environment: Hierarchal Equations of Motion Approach Using a Graphics Processing Unit (GPU).

    PubMed

    Tsuchimoto, Masashi; Tanimura, Yoshitaka

    2015-08-11

    A system with many energy states coupled to a harmonic oscillator bath is considered. To study quantum non-Markovian system-bath dynamics numerically rigorously and nonperturbatively, we developed a computer code for the reduced hierarchy equations of motion (HEOM) for a graphics processing unit (GPU) that can treat systems as large as 4096 energy states. The code employs a Padé spectrum decomposition (PSD) for the construction of the HEOM and exponential integrators. Dynamics of a quantum spin glass system are studied by calculating the free induction decay signal for the cases of 3 × 2 to 3 × 4 triangular lattices with antiferromagnetic interactions. We found that spins relax faster at lower temperature due to transitions through a quantum coherent state, as represented by the off-diagonal elements of the reduced density matrix, whereas it is known that spins relax more slowly in the classical case due to suppression of thermal activation. The decay of the spins is qualitatively similar regardless of lattice size. The pathway of spin relaxation is analyzed under a sudden temperature drop condition. The Compute Unified Device Architecture (CUDA) based source code used in the present calculations is provided as Supporting Information.

  1. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates), by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.
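
    The survival analyses that EZR exposes through menus rest on estimators such as Kaplan-Meier. A minimal pure-Python sketch of that estimator (not EZR's or R's implementation; the data below are invented):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g. death) occurred, 0 if censored
    Returns a list of (time, survival probability) steps at event times:
    S(t) is multiplied by (1 - deaths/at_risk) at each distinct event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]  # all subjects at time t
        deaths = sum(tied)
        if deaths > 0:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= len(tied)                     # events and censorings leave
        i += len(tied)
    return curve

# Five subjects; the one at t=3 is censored and only shrinks the risk set.
km = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1])
```

    Note how the censored subject contributes no step of its own but reduces the denominator for later event times, which is the essence of the estimator.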

  2. Graphical statistical approach to soil organic matter resilience using analytical pyrolysis data.

    PubMed

    Almendros, Gonzalo; Hernández, Zulimar; Sanz, Jesús; Rodríguez-Sánchez, Sonia; Jiménez-González, Marco A; González-Pérez, José A

    2018-01-19

    Pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) of humic acids (HAs) from 30 agricultural soils from a volcanic island (Tenerife, Spain) was used to discern the molecular characteristics of soil organic matter (SOM) associated with resilience. For faster perceptual identification of the results, the yields of the pyrolysis products in the form of surface density plots were compared in an update of the Van Krevelen graphical statistical method. This approach, with respect to data reduction and visualization, was also used to collectively represent statistical indices obtained after simple and partial least squares (PLS) regression. The resulting plots illustrate different SOM structural domains (for example, carbohydrate- and lignin-derived and condensed lipid). The content of SOM and total mineralization coefficient (TMC) values can be well estimated from the relative abundance of 57 major pyrolysis compounds: SOM content and composition parallel the accumulation of lignin- and carbohydrate-derived structures (lignocellulosic material) and the depletion of condensed polyalkyl structures. In other words, in the volcanic ash soils that were studied, we found that the higher the amount of SOM, the lower its quality in terms of resilience. Although no cause-and-effect relationship is inferred from this fact, it is evident that the resistance to biodegradation of the SOM is related to its molecular composition. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
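
    The Bayesian advantage the authors describe, direct probability statements about quantities of interest, can be sketched with a textbook conjugate-normal update. This is an illustrative toy for a single analyte mean, not the models used in the paper, and all numbers are invented:

```python
import math

def posterior_mean_sd(prior_mu, prior_sd, data, sigma):
    """Conjugate normal update for a mean with known measurement sd sigma.
    Precisions add; the posterior mean is a precision-weighted average of
    the prior mean and the data. Returns (posterior mean, posterior sd)."""
    n = len(data)
    prec = 1.0 / prior_sd ** 2 + n / sigma ** 2
    mu = (prior_mu / prior_sd ** 2 + sum(data) / sigma ** 2) / prec
    return mu, math.sqrt(1.0 / prec)

def prob_above(mu, sd, threshold):
    """Posterior probability that the quantity exceeds a threshold,
    i.e. P(X > threshold) for X ~ Normal(mu, sd)."""
    z = (threshold - mu) / sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))
```

    A statement like "the probability that the GM mean exceeds the conventional mean by more than delta is 0.93" comes straight out of `prob_above`, with no multiple-comparison corrections to argue about.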

  4. The AFDD International Dynamic Stall Workshop on Correlation of Dynamic Stall Models with 3-D Dynamic Stall Data

    NASA Technical Reports Server (NTRS)

    Tan, C. M.; Carr, L. W.

    1996-01-01

    A variety of empirical and computational fluid dynamics two-dimensional (2-D) dynamic stall models were compared to recently obtained three-dimensional (3-D) dynamic stall data in a workshop on modeling of 3-D dynamic stall of an unswept, rectangular wing, of aspect ratio 10. Dynamic stall test data both below and above the static stall angle-of-attack were supplied to the participants, along with a 'blind' case where only the test conditions were supplied in advance, with results being compared to experimental data at the workshop itself. Detailed graphical comparisons are presented in the report, which also includes discussion of the methods and the results. The primary conclusion of the workshop was that the 3-D effects of dynamic stall on the oscillating wing studied in the workshop can be reasonably reproduced by existing semi-empirical models once 2-D dynamic stall data have been obtained. The participants also emphasized the need for improved quantification of 2-D dynamic stall.

  5. An Initial Survey of Fractional Graph and Table Area in Behavioral Journals

    ERIC Educational Resources Information Center

    Kubina, Richard M., Jr.; Kostewicz, Douglas E.; Datchuck, Shawn M.

    2008-01-01

    This study examined the fractional graph area (FGA), the proportion of page space used to display statistical graphics, in 11 behavioral journals and places behavior analysis on a continuum with other natural, mathematical, and social science disciplines. The composite FGA of all 11 journals puts behavior analysis within the range of the social…

  6. Using Recursive Regression to Explore Nonlinear Relationships and Interactions: A Tutorial Applied to a Multicultural Education Study

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2009-01-01

    This paper discusses how a seldom-used statistical procedure, recursive regression (RR), can numerically and graphically illustrate data-driven nonlinear relationships and interaction of variables. This routine falls into the family of exploratory techniques, yet a few interesting features make it a valuable complement to factor analysis and…

  7. Estimating Janka hardness from specific gravity for tropical and temperate species

    Treesearch

    Michael C. Wiemann; David W. Green

    2007-01-01

    Using mean values for basic (green) specific gravity and Janka side hardness for individual species obtained from the world literature, regression equations were developed to predict side hardness from specific gravity. Statistical and graphical methods showed that the hardness–specific gravity relationship is the same for tropical and temperate hardwoods, but that the...
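
    A hardness-from-specific-gravity regression of the kind described is commonly fit as a power law in log space. The sketch below shows the generic least-squares machinery on synthetic data; the coefficients are NOT those of Wiemann and Green's published equations:

```python
import math

def fit_power_law(g_values, h_values):
    """Fit H = a * G^b by ordinary least squares on log-transformed data,
    i.e. ln H = ln a + b * ln G. g_values would be species-mean green
    specific gravities and h_values Janka side hardnesses."""
    xs = [math.log(g) for g in g_values]
    ys = [math.log(h) for h in h_values]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic check: data generated from a known power law is recovered.
g = [0.30, 0.45, 0.60, 0.75]          # hypothetical specific gravities
h = [8000.0 * x ** 2.3 for x in g]    # hypothetical hardness values
a, b = fit_power_law(g, h)
```

    Fitting in log space is what makes a single equation plausible across both tropical and temperate species: it models hardness as scaling multiplicatively with density rather than additively.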

  8. Global, Local, and Graphical Person-Fit Analysis Using Person-Response Functions

    ERIC Educational Resources Information Center

    Emons, Wilco H. M.; Sijtsma, Klaas; Meijer, Rob R.

    2005-01-01

    Person-fit statistics test whether the likelihood of a respondent's complete vector of item scores on a test is low given the hypothesized item response theory model. This binary information may be insufficient for diagnosing the cause of a misfitting item-score vector. The authors propose a comprehensive methodology for person-fit analysis in the…

  9. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL provides a variety of software tools on the Peregrine system, including debuggers and performance-analysis tools such as Intel VTune for understanding the behavior of MPI applications, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  10. Sole: Online Analysis of Southern FIA Data

    Treesearch

    Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch

    2006-01-01

    The Southern On Line Estimator (SOLE) is a flexible modular software program for analyzing U.S. Department of Agriculture Forest Service Forest Inventory and Analysis data. SOLE produces statistical tables, figures, maps, and portable document format reports based on user-selected area and variables. SOLE's Java-based graphical user interface is easy to use, and its R-...

  11. Evaluation of Biological and Male Reproductive Function Responses to Potential Lead Exposures in 155 mm Howitzer Crewmen

    DTIC Science & Technology

    1992-01-01

    [Garbled report documentation page. Recoverable subject terms: lead, weapons systems, microwave radiation, male reproductive effects. The table of contents lists an introduction, background, exposure characterization, and appendices; the list of figures opens with a "graphic representation for trend with respect to statistically…"]

  12. An Intuitive Graphical Approach to Understanding the Split-Plot Experiment

    ERIC Educational Resources Information Center

    Robinson, Timothy J.; Brenneman, William A.; Myers, William R.

    2009-01-01

    While split-plot designs have received considerable attention in the literature over the past decade, there seems to be a general lack of intuitive understanding of the error structure of these designs and the resulting statistical analysis. Typically, students learn the proper error terms for testing factors of a split-plot design via "expected…

  13. The Nature and Extent of Instructors' Use of Learning Analytics in Higher Education to Inform Teaching and Learning

    ERIC Educational Resources Information Center

    King, Janet L.

    2017-01-01

    The utilization of learning analytics to support teaching and learning has emerged as a newer phenomenon combining instructor-oriented action research, the mining of educational data, and the analyses of statistics and patterns. Learning analytics have documented, quantified and graphically displayed students' interactions, engagement, and…

  14. Compendium of Statistical and Financial Information: Ontario Universities, 1998-99.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    This compendium presents data on various aspects of the Ontario University system. Compiled by the Council of Finance Officers - Universities of Ontario (COFO-UO), it is intended as a companion to the Financial Report of Ontario Universities and as an aid to financial planning and policy. Data are presented in graphical and tabular formats.…

  15. GRAPHIC REANALYSIS OF THE TWO NINDS-TPA TRIALS CONFIRMS SUBSTANTIAL TREATMENT BENEFIT

    PubMed Central

    Saver, Jeffrey L.; Gornbein, Jeffrey; Starkman, Sidney

    2010-01-01

    Background of Comment/Review Multiple statistical analyses of the two NINDS-TPA Trials have confirmed the studies' findings of benefit of fibrinolytic therapy. A recent graphic analysis departed from best practices in the visual display of quantitative information by failing to take into account the skewed functional importance of NIH Stroke Scale raw scores and by scaling change axes at up to twenty times the range achievable by individual patients. Methods Using the publicly available datasets of the 2 NINDS-TPA Trials, we generated a variety of figures appropriate to the characteristics of acute stroke trial data. Results A diverse array of figures all visually delineated substantial benefits of fibrinolytic therapy, including: bar charts of normalized gain and loss; stacked bar, bar, and matrix plots of clinically relevant ordinal ranks; a time series stacked line plot of continuous scale disability weights; and line plot, bubble chart, and person icon array graphs of joint outcome table analysis. The achievable change figure showed substantially greater improvement among TPA than placebo patients, median 66.7% (IQR 0–92.0) vs 50.0% (IQR −7.1 – 80.0), p=0.003. Conclusions On average, under-3-hour patients treated with TPA recovered two-thirds of the way towards fully normal, while placebo patients improved only half of the way. Graphical analysis of the two NINDS-TPA trials, when performed according to best practices, is a useful means of conveying details about patient response to therapy not fully delineated by summary statistics, and confirms a valuable treatment benefit of under-3-hour fibrinolytic therapy in acute stroke. PMID:20829518
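
    The "achievable change" idea, improvement expressed as the fraction of the way from a patient's baseline to fully normal, can be sketched in a few lines. This is a sketch of the normalized-gain concept, not the trials' actual analysis code; `best=0` assumes a scale like the NIH Stroke Scale where 0 is fully normal:

```python
def achievable_change(baseline, followup, best=0):
    """Percent of achievable improvement realized, where `best` is the
    fully-normal score (0 on the NIH Stroke Scale). 100 means complete
    recovery, 0 means no change, and negative values mean worsening."""
    span = baseline - best
    if span == 0:
        return 0.0  # already at the best score; no improvement achievable
    return 100.0 * (baseline - followup) / span
```

    Scaling each patient by their own achievable range is what avoids the distortion the authors criticize: a 4-point gain means something very different for a baseline score of 5 than for a baseline of 25.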

  16. 3D graphics, virtual reality, and motion-onset visual evoked potentials in neurogaming.

    PubMed

    Beveridge, R; Wilson, S; Coyle, D

    2016-01-01

    A brain-computer interface (BCI) offers movement-free control of a computer application and is achieved by reading and translating the cortical activity of the brain into semantic control signals. Motion-onset visual evoked potentials (mVEP) are neural potentials employed in BCIs and occur when motion-related stimuli are attended visually. mVEP dynamics are correlated with the position and timing of the moving stimuli. To investigate the feasibility of utilizing the mVEP paradigm with video games of various graphical complexities, including those of commercial quality, we conducted three studies over four separate sessions comparing the performance of classifying five mVEP responses with variations in graphical complexity and style, in-game distractions, and display parameters surrounding mVEP stimuli. To investigate the feasibility of utilizing contemporary presentation modalities in neurogaming, one of the studies compared mVEP classification performance when stimuli were presented using the Oculus Rift virtual reality headset. Results from 31 independent subjects were analyzed offline. The results show classification performances ranging up to 90%, with variations in graphical complexity having limited effect on mVEP performance, thus demonstrating the feasibility of using the mVEP paradigm within BCI-based neurogaming. © 2016 Elsevier B.V. All rights reserved.

  17. Real-time colouring and filtering with graphics shaders

    NASA Astrophysics Data System (ADS)

    Vohl, D.; Fluke, C. J.; Barnes, D. G.; Hassan, A. H.

    2017-11-01

    Despite the popularity of the Graphics Processing Unit (GPU) for general purpose computing, one should not forget about the practicality of the GPU for fast scientific visualization. As astronomers have increasing access to three-dimensional (3D) data from instruments and facilities like integral field units and radio interferometers, visualization techniques such as volume rendering offer means to quickly explore spectral cubes as a whole. As most 3D visualization techniques have been developed in fields of research like medical imaging and fluid dynamics, many transfer functions are not optimal for astronomical data. We demonstrate how transfer functions and graphics shaders can be exploited to provide new astronomy-specific explorative colouring methods. We present 12 shaders, including four novel transfer functions specifically designed to produce intuitive and informative 3D visualizations of spectral cube data. We compare their utility to classic colour mapping. The remaining shaders highlight how common computation like filtering, smoothing and line ratio algorithms can be integrated as part of the graphics pipeline. We discuss how this can be achieved by utilizing the parallelism of modern GPUs along with a shading language, letting astronomers apply these new techniques at interactive frame rates. All shaders investigated in this work are included in the open source software shwirl (Vohl 2017).
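    A transfer function of the kind the abstract describes simply maps each (normalized) data sample to colour and opacity; in shwirl-style shaders this runs per sample on the GPU. The following CPU-side sketch shows the idea only; the linear ramp and all names are illustrative assumptions, not one of the paper's 12 shaders.

    ```python
    def transfer_function(value, vmin, vmax):
        """Map a spectral-cube sample to RGBA.

        The sample is normalized to [0, 1], then assigned a colour and an
        opacity. Faint voxels stay transparent so bright emission dominates
        the volume rendering. The ramp here is a toy example.
        """
        t = max(0.0, min(1.0, (value - vmin) / (vmax - vmin)))
        r, g, b = t, t * t, 1.0 - t   # simple blue-to-warm ramp
        alpha = t                     # opacity tracks intensity
        return (r, g, b, alpha)

    print(transfer_function(0.75, 0.0, 1.0))
    ```

    In a real fragment shader the same mapping would be written in GLSL and evaluated in parallel for every sample along every viewing ray, which is what makes interactive frame rates possible.
    
    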

  18. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy.
    Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient parallel computation of particle and fluid dynamics simulations. These problems occur throughout DOE, military and commercial sectors: the potential payoff is high. We plan to license or sell the solution to contractors for military and domestic applications such as disaster simulation (aerodynamic and hydrodynamic), Government agencies (hydrological and environmental simulations), and medical applications (e.g., in tomographic image reconstruction). Keywords - High-performance Computing, Graphic Processing Unit, Fluid/Particle Simulation. Summary for Members of Congress - Department of Energy has many simulation codes that must compute faster, to be effective. The Phase I research parallelized particle/fluid simulations for rocket combustion, for high-performance computing systems.
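    The overall 20-32X speedups reported above depend on how much of the runtime lives in the parallelized kernels, a relationship captured by the standard Amdahl's-law estimate (a general formula, not part of the report's methodology; the numbers below are illustrative).

    ```python
    def amdahl_speedup(parallel_fraction, kernel_speedup):
        """Overall speedup when only part of the runtime is accelerated.

        parallel_fraction: share of sequential runtime spent in the
        parallelized (high-cost) kernels; kernel_speedup: speedup achieved
        on those kernels alone. The remaining serial fraction bounds the
        achievable overall speedup.
        """
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / kernel_speedup)

    # If 97% of the runtime is in GPU kernels that run 50x faster,
    # the whole application speeds up by roughly 20x overall:
    print(round(amdahl_speedup(0.97, 50.0), 1))
    ```

    This is why the report's profiling step (separating mainly-sequential from mainly-parallel regions) matters: the serial residue, not the GPU, ultimately limits the measured speedup.
    
    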

  19. Dynamics of entanglement and uncertainty relation in coupled harmonic oscillator system: exact results

    NASA Astrophysics Data System (ADS)

    Park, DaeKil

    2018-06-01

    The dynamics of entanglement and the uncertainty relation are explored by solving the time-dependent Schrödinger equation for a coupled harmonic oscillator system analytically when the angular frequencies and coupling constant are arbitrarily time dependent. We derive the spectral and Schmidt decompositions for the vacuum solution. Using the decompositions, we derive the analytical expressions for the von Neumann and Rényi entropies. Making use of the Wigner distribution function defined in phase space, we derive the time dependence of the position-momentum uncertainty relations. To show the dynamics of entanglement and the uncertainty relation graphically, we introduce two toy models and one realistic quenched model. While the dynamics can be conjectured by simple consideration in the toy models, the dynamics in the realistic quenched model is somewhat different from that in the toy models. In particular, the dynamics of entanglement exhibits a similar pattern to the dynamics of the uncertainty parameter in the realistic quenched model.
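    For reference, the entropies mentioned above follow from the Schmidt decomposition in the standard way (generic notation, not necessarily the paper's):

    ```latex
    % Schmidt decomposition of the two-oscillator pure state
    |\Psi\rangle = \sum_n \sqrt{\lambda_n}\, |\phi_n\rangle_A \otimes |\chi_n\rangle_B,
    \qquad \sum_n \lambda_n = 1 .

    % von Neumann and R\'enyi entropies of either reduced state
    S_{\mathrm{vN}} = -\sum_n \lambda_n \ln \lambda_n,
    \qquad
    S_\alpha = \frac{1}{1-\alpha}\,\ln \sum_n \lambda_n^{\alpha},
    \quad \alpha > 0,\ \alpha \neq 1 .
    ```

    Both entropies vanish for a product state (a single nonzero \(\lambda_n = 1\)) and grow with entanglement, which is what the paper's time-dependent \(\lambda_n(t)\) track through the quench.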

  20. Chemical reactivity and spectroscopy explored from QM/MM molecular dynamics simulations using the LIO code

    NASA Astrophysics Data System (ADS)

    Marcolongo, Juan P.; Zeida, Ari; Semelak, Jonathan A.; Foglia, Nicolás O.; Morzan, Uriel N.; Estrin, Dario A.; González Lebrero, Mariano C.; Scherlis, Damián A.

    2018-03-01

    In this work we present the current advances in the development and the applications of LIO, a lab-made code designed for density functional theory calculations on graphics processing units (GPUs) that can be coupled with different classical molecular dynamics engines. This code has been thoroughly optimized to perform efficient molecular dynamics simulations at the QM/MM DFT level, allowing for an exhaustive sampling of the configurational space. Selected examples are presented for the description of chemical reactivity in terms of free energy profiles, and also for the computation of optical properties, such as vibrational and electronic spectra in solvent and protein environments.

Top