Sample records for comprehensive computational analysis

  1. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
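
    As a minimal illustration of the integration described above — letting cosine similarity in an LSA-style vector space drive knowledge-based activation within a landscape-style reading cycle — here is a hedged Python sketch. The concept vectors, decay constant, and update rule are invented for illustration and do not reproduce the authors' algorithm.

```python
import numpy as np

def cosine(u, v):
    """Semantic relatedness as the cosine between LSA-style vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def landscape_cycle(prev_act, cued, vectors, decay=0.5):
    """One reading cycle: carry over decayed activation, fully activate
    the concepts mentioned in the current sentence, and spread
    activation to related concepts in proportion to cosine similarity
    (the inference mechanism of the integrated model, schematically)."""
    act = decay * prev_act                  # carry-over from the last cycle
    act[cued] = 1.0                         # concepts explicit in the text
    for j in range(len(act)):
        if j not in cued:                   # knowledge-based (inferred) activation
            sim = max(cosine(vectors[j], vectors[c]) for c in cued)
            act[j] = max(act[j], sim)
    return act

# toy 3-concept space (hypothetical stand-ins for LSA vectors)
vectors = np.array([[1.0, 0.1], [0.9, 0.2], [0.0, 1.0]])
act = np.zeros(3)
act = landscape_cycle(act, cued=[0], vectors=vectors)
print(act)  # concept 1 is strongly co-activated; concept 2 barely
```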

  2. Requirements for Next Generation Comprehensive Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Datta, Anubhav

    2008-01-01

    The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements for what must be included are described and substantiated, and justification is provided for what should be excluded. With this guide, a path to the next-generation code can be found.

  3. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
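
    MetaboTools itself is Matlab code operating on COBRA-style constraint-based models; as a language-neutral sketch of the core idea, the following toy flux balance analysis poses steady-state mass balance as a linear program. The network, bounds, and objective are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A -> B -> biomass, with uptake of A from the medium.
# Flux columns: v1 (A uptake), v2 (A -> B), v3 (B -> biomass).
S = np.array([
    [1, -1,  0],   # mass balance on metabolite A
    [0,  1, -1],   # mass balance on metabolite B
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

# Maximize biomass flux v3 subject to steady state S v = 0
# (linprog minimizes, hence the -1 on v3).
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)   # optimal flux distribution: [10, 10, 10]
```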

  4. Comprehensive silicon solar-cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

    A comprehensive silicon solar cell computer modeling scheme was developed to perform the following tasks: (1) modeling and analysis of the net charge distribution in quasineutral regions; (2) modeling of the experimentally determined temperature behavior of Spire Corp. n+pp+ solar cells in which the n+ emitter is formed by ion implantation of 75As or 31P; and (3) initial validation of the computer simulation program using Spire Corp. n+pp+ cells.

  5. The Effect of a Computer Program Based on Analysis, Design, Development, Implementation and Evaluation (ADDIE) in Improving Ninth Graders' Listening and Reading Comprehension Skills in English in Jordan

    ERIC Educational Resources Information Center

    Alodwan, Talal; Almosa, Mosaab

    2018-01-01

    The study aimed to assess the effectiveness of a computer program based on the Analysis, Design, Development, Implementation and Evaluation (ADDIE) model on ninth graders' achievement in listening and reading comprehension skills in English. The study sample comprised 70 ninth graders during the second semester of the academic year 2016/2017. The…

  6. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, and their application to structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputer with that of a large mainframe computer on the solution of a wide range of finite element structural analysis problems.

  7. Computer-Assisted Training in the Comprehension of Authentic French Speech: A Closer View

    ERIC Educational Resources Information Center

    Hoeflaak, Arie

    2004-01-01

    In this article, the development of a computer-assisted listening comprehension project is described. First, we comment briefly on the points of departure, the need for autonomous learning against the background of recent changes in Dutch education, and the role of learning strategies. Then, an error analysis, the programs used for this project,…

  8. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    PubMed

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based foreign language listening comprehension tests designed in MOODLE, a dynamic e-learning environment, were administered with or without immediate feedback, together with the State-Trait Anxiety Inventory (STAI), and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners increase attention in foreign language listening comprehension.

  9. Eleventh NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1983-01-01

    NASTRAN (NASA STRUCTURAL ANALYSIS) is a large, comprehensive, nonproprietary, general purpose finite element computer code for structural analysis which was developed under NASA sponsorship. The Eleventh Colloquium provides some comprehensive general papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  10. Computer-Mediated Glosses in Second Language Reading Comprehension and Vocabulary Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Abraham, Lee B.

    2008-01-01

    Language learners have unprecedented opportunities for developing second language literacy skills and intercultural understanding by reading authentic texts on the Internet and in multimedia computer-assisted language learning environments. This article presents findings from a meta-analysis of 11 studies of computer-mediated glosses in second…

  11. Twelfth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1984-01-01

    NASTRAN is a large, comprehensive, nonproprietary, general purpose finite element computer code for structural analysis. The Twelfth Users' Colloquium provides some comprehensive papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  12. Computer Analysis in HSC German

    ERIC Educational Resources Information Center

    Clutterbuck, Michael; Mowchanuk, Timothy

    1977-01-01

    In October, 1976, a new type of question was introduced into the Victorian HSC German test: A listening comprehension question with multiple-choice answers has replaced the written reproduction question. Results of a computer analysis of the answers given in the exam are reported. (SW)

  13. Evaluative methodology for comprehensive water quality management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, H. L.

    Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.

  14. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  15. A computational model for simulating text comprehension.

    PubMed

    Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra

    2006-11-01

    In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.

  16. Computed microtomography and X-ray fluorescence analysis for comprehensive analysis of structural changes in bone.

    PubMed

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor

    2013-01-01

    This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computed microtomography and X-ray fluorescence analysis. We present algorithms used for the reconstruction of tomographic images that can work with the high-noise projections typical of the conditions dictated by the nature of the samples. Reptiles, owing to their ruggedness, small size, amniote status, and a number of other valuable features, are an attractive model organism for long-term orbital experiments on unmanned spacecraft. Possible changes in their bone tissue under the influence of spaceflight are the subject of discussion among biologists from different laboratories around the world.

  17. Synthesizing Results from Empirical Research on Computer-Based Scaffolding in STEM Education: A Meta-Analysis

    ERIC Educational Resources Information Center

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2017-01-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has…

  18. Program For Analysis Of Metal-Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Mital, S. K.

    1994-01-01

    METCAN (METal matrix Composite ANalyzer) is a computer program used to computationally simulate the nonlinear behavior of high-temperature metal-matrix composite structural components in specific applications, providing comprehensive analyses of thermal and mechanical performance. Written in FORTRAN 77.

  19. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large-scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications, including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years, and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems, are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  20. [Text Comprehensibility of Hospital Report Cards].

    PubMed

    Sander, U; Kolb, B; Christoph, C; Emmert, M

    2016-12-01

    Objectives: Recently, the number of hospital report cards that compare the quality of hospitals and present information from German quality reports has greatly increased. The objectives of this study were a) to identify suitable methods for measuring the readability and comprehensibility of hospital report cards, b) to obtain reliable information on the comprehensibility of the texts for laymen, c) to give recommendations for improvements and d) to recommend public health actions. Methods: The readability and comprehensibility of the texts were tested with a) a computer-aided evaluation of formal text characteristics (the readability indices Flesch, in its German adaptation, and the first Wiener Sachtextformel), b) an expert-based heuristic analysis of the readability and comprehensibility of the texts (counting technical terms and analyzing text simplicity as well as brevity and conciseness using the Hamburg intelligibility model) and c) a survey of subjects about the comprehensibility of individual technical terms, the assessment of the comprehensibility of the presentations and the subjects' decisions in favour of one of the 5 presented clinics on the basis of better quality data. In addition, the correlation between the results of the text analysis and the results from the survey of subjects was tested. Results: The assessment of texts with the computer-aided evaluations showed poor comprehensibility values. The assessment of text simplicity using the Hamburg intelligibility model showed poor comprehensibility values (-0.3). On average, 6.8% of the words used were technical terms. A review of 10 technical terms revealed that in all cases only a minority of respondents (from 4.4% to 39.1%) knew exactly what was meant by each of them. Most subjects (62.4%) also believed that unclear terms worsened their understanding of the information offered. The correlation analysis showed that presentations with a lower frequency of technical terms and better values for text simplicity were better understood. Conclusion: Determining the frequency of technical terms and assessing text simplicity using the Hamburg intelligibility model were suitable methods to determine the readability and comprehensibility of presentations of quality indicators. The analysis showed predominantly poor comprehensibility values and indicated the need to improve the texts of report cards.
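
    The two indices named in the Methods have published closed forms; the sketch below assumes the commonly cited coefficients (Amstad's German adaptation of the Flesch Reading Ease and the first Wiener Sachtextformel) and uses hypothetical counts.

```python
def flesch_german(words, sentences, syllables):
    """Flesch Reading Ease, German adaptation (Amstad); higher scores
    mean easier text (coefficients as commonly cited)."""
    asl = words / sentences        # average sentence length in words
    asw = syllables / words        # average syllables per word
    return 180.0 - asl - 58.5 * asw

def wiener_sachtextformel_1(ms, sl, iw, es):
    """First Wiener Sachtextformel; the result approximates a German
    school grade (4 = very easy ... 15 = very hard).
    ms: % words with 3+ syllables, sl: mean sentence length in words,
    iw: % words longer than 6 letters, es: % one-syllable words."""
    return 0.1935 * ms + 0.1672 * sl + 0.1297 * iw - 0.0327 * es - 0.875

# hypothetical text: 120 words in 8 sentences with 210 syllables
print(flesch_german(120, 8, 210))               # ~62.6: medium difficulty
print(wiener_sachtextformel_1(25, 15, 30, 40))  # ~9.1: about grade 9
```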

  1. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
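
    ADIFOR performs source-to-source transformation of Fortran, which is not reproduced here, but the forward-mode principle it rests on can be sketched with dual numbers. The example also demonstrates the step-size sensitivity of finite differences that the paper contrasts with exact AD derivatives; all names and values are illustrative.

```python
import math

class Dual:
    """Forward-mode automatic differentiation via dual numbers: each
    value carries its exact derivative through arithmetic (only the
    operations needed for this demo are implemented)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __mul__(self, other):                       # product rule
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    def sin(self):                                  # chain rule
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def f(x):
    """f(x) = x^2 sin(x), written to accept floats or Duals."""
    s = x.sin() if isinstance(x, Dual) else math.sin(x)
    return x * x * s

x0 = 1.3
exact = f(Dual(x0, 1.0)).der              # AD: exact, no step size to tune
for h in (1e-2, 1e-8, 1e-13):             # FD: accuracy depends on h
    fd = (f(x0 + h) - f(x0)) / h
    print(f"h={h:.0e}  fd={fd:.10f}  error={abs(fd - exact):.2e}")
```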

  2. Lévy-like diffusion in eye movements during spoken-language comprehension.

    PubMed

    Stephen, Damian G; Mirman, Daniel; Magnuson, James S; Dixon, James A

    2009-05-01

    This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.
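
    The standard deviation analysis mentioned in the abstract fits a scaling exponent to the growth of ensemble spread over time; a hedged sketch on synthetic random walks (not the eye-movement data) shows the contrast it is designed to detect.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaling_exponent(steps):
    """Standard deviation analysis: fit SD(t) ~ t**delta across an
    ensemble of walks. delta = 0.5 for ordinary Brownian diffusion;
    larger values indicate Levy-like hyperdiffusion."""
    paths = np.cumsum(steps, axis=1)
    t = np.arange(1, steps.shape[1] + 1)
    sd = paths.std(axis=0)
    delta, _ = np.polyfit(np.log(t), np.log(sd), 1)
    return delta

gauss = rng.normal(size=(2000, 1000))            # finite-variance steps
levy = rng.standard_cauchy(size=(2000, 1000))    # heavy-tailed steps
print(scaling_exponent(gauss))   # ~0.5 (Gaussian diffusion)
print(scaling_exponent(levy))    # well above 0.5 (Levy-like spreading)
```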

  3. Lévy-like diffusion in eye movements during spoken-language comprehension

    NASA Astrophysics Data System (ADS)

    Stephen, Damian G.; Mirman, Daniel; Magnuson, James S.; Dixon, James A.

    2009-05-01

    This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.

  4. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  5. Proceedings: Conference on Computers in Chemical Education and Research, Dekalb, Illinois, 19-23 July 1971.

    ERIC Educational Resources Information Center

    1971

    Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…

  6. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.

  7. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
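
    The directional variogram at the heart of VARS, gamma(h) = ½ E[(y(x + h·e_i) − y(x))²], has a straightforward Monte Carlo estimator. The sketch below uses an invented periodic test function and a wrap-around perturbation; it illustrates the quantity VARS is built on, not the authors' implementation.

```python
import numpy as np

def directional_variogram(f, dim, axis, hs, n=10000, seed=0):
    """Estimate gamma(h) = 0.5 * E[(f(x + h*e_axis) - f(x))**2] by
    Monte Carlo over the unit hypercube; large gamma at small h means
    the model responds strongly to that factor at that scale."""
    rng = np.random.default_rng(seed)
    x = rng.random((n, dim))
    gammas = []
    for h in hs:
        xh = x.copy()
        xh[:, axis] = (xh[:, axis] + h) % 1.0   # wrap to stay in [0, 1)
        gammas.append(0.5 * np.mean((f(xh) - f(x)) ** 2))
    return np.array(gammas)

# invented test function: factor 0 matters far more than factor 2
f = lambda x: np.sin(2 * np.pi * x[:, 0]) + 0.1 * np.sin(2 * np.pi * x[:, 2])
hs = [0.01, 0.05, 0.1, 0.3]
print(directional_variogram(f, dim=3, axis=0, hs=hs))  # large: sensitive
print(directional_variogram(f, dim=3, axis=2, hs=hs))  # ~100x smaller
```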

  8. Hiding in Plain Sight: Identifying Computational Thinking in the Ontario Elementary School Curriculum

    ERIC Educational Resources Information Center

    Hennessey, Eden J. V.; Mueller, Julie; Beckett, Danielle; Fisher, Peter A.

    2017-01-01

    Given a growing digital economy with complex problems, demands are being made for education to address computational thinking (CT)--an approach to problem solving that draws on the tenets of computer science. We conducted a comprehensive content analysis of the Ontario elementary school curriculum documents for 44 CT-related terms to examine the…

  9. The Development of Reading for Comprehension: An Information Processing Analysis. Final Report.

    ERIC Educational Resources Information Center

    Schadler, Margaret; Juola, James F.

    This report summarizes research performed at the University of Kansas that involved several topics related to reading and learning to read, including the development of automatic word recognition processes, reading for comprehension, and the development of new computer technologies designed to facilitate the reading process. The first section…

  10. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    ERIC Educational Resources Information Center

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  11. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.

  12. Assessing Computer Literacy: A Validated Instrument and Empirical Results.

    ERIC Educational Resources Information Center

    Gabriel, Roy M.

    1985-01-01

    Describes development of a comprehensive computer literacy assessment battery for K-12 curriculum based on objectives of a curriculum implemented in the Worldwide Department of Defense Dependents Schools system. Test development and field test data are discussed and a correlational analysis which assists in interpretation of test results is…

  13. SPAN: Ocean science

    NASA Technical Reports Server (NTRS)

    Thomas, Valerie L.; Koblinsky, Chester J.; Webster, Ferris; Zlotnicki, Victor; Green, James L.

    1987-01-01

    The Space Physics Analysis Network (SPAN) is a multi-mission, correlative data comparison network which links space and Earth science research and data analysis computers. It provides a common working environment for sharing computer resources, sharing computer peripherals, solving proprietary problems, and providing the potential for significant time and cost savings for correlative data analysis. This is one of a series of discipline-specific SPAN documents which are intended to complement the SPAN primer and SPAN Management documents. Their purpose is to provide the discipline scientists with a comprehensive set of documents to assist in the use of SPAN for discipline specific scientific research.

  14. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  15. Comprehensive numerical methodology for direct numerical simulations of compressible Rayleigh-Taylor instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.

    A comprehensive numerical methodology has been developed that handles the challenges introduced by the compressible nature of Rayleigh-Taylor instability (RTI) systems, including sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late times for a wide range of flow compressibility and variable-density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
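
    For orientation, the baseline from linear stability analysis against which such compressibility effects are measured is the classical incompressible growth rate (a textbook result, not taken from this paper):

```latex
% Interfacial perturbations of wavenumber k grow as e^{\sigma t}, with
\sigma = \sqrt{A\,g\,k},
\qquad A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1}\ \text{(Atwood number)}.
% Flow compressibility and background stratification modify this rate.
```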

  16. Quest: The Interactive Test Analysis System.

    ERIC Educational Resources Information Center

    Adams, Raymond J.; Khoo, Siek-Toon

    The Quest program offers a comprehensive test and questionnaire analysis environment, giving the data analyst computer-based access to the most recent developments in Rasch measurement theory as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…

  17. A Randomized Controlled Trial Study of the ABRACADABRA Reading Intervention Program in Grade 1

    ERIC Educational Resources Information Center

    Savage, Robert S.; Abrami, Philip; Hipps, Geoffrey; Deault, Louise

    2009-01-01

    This study reports a randomized controlled trial evaluation of a computer-based balanced literacy intervention, ABRACADABRA (http://grover.concordia.ca/abra/version1/abracadabra.html). Children (N = 144) in Grade 1 were exposed either to computer activities for word analysis, text comprehension, and fluency, alongside shared stories (experimental…

  18. A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 3: Program manual

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1980-01-01

    The computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. This analysis is designed to calculate rotor performance, loads, and noise; the helicopter vibration and gust response; the flight dynamics and handling qualities; and the system aeroelastic stability. The analysis is a combination of structural, inertial, and aerodynamic models that is applicable to a wide range of problems and a wide class of vehicles. The analysis is intended for use in the design, testing, and evaluation of rotors and rotorcraft and to be a basis for further development of rotary wing theories.

  19. Methods for Functional Connectivity Analyses

    DTIC Science & Technology

    2012-12-13

    [Abstract garbled in extraction; recoverable fragments indicate the first comprehensive analysis of ECoG signals associated with motor and hand motor function, by authors affiliated with the Department of Electrical and Computer Engineering at the University of Texas at El Paso, the Department of Neurology at Albany Medical College, and the New York State Department of Health.]

  20. Bird impact analysis package for turbine engine fan blades

    NASA Technical Reports Server (NTRS)

    Hirschbein, M. S.

    1982-01-01

    A computer program has been developed to analyze the gross structural response of turbine engine fan blades subjected to bird strikes. The program couples a NASTRAN finite element model and modal analysis of a fan blade with a multi-mode bird impact analysis computer program. The impact analysis uses the NASTRAN blade model and a fluid jet model of the bird to interactively calculate blade loading during a bird strike event. The analysis package is computationally efficient, easy to use, and provides a comprehensive history of the gross structural blade response. Example cases are presented for a representative fan blade.

  1. Development of self-compressing BLSOM for comprehensive analysis of big sequence data.

    PubMed

    Kikuchi, Akihito; Ikemura, Toshimichi; Abe, Takashi

    2015-01-01

    With the remarkable increase in genomic sequence data from various organisms, novel tools are needed for comprehensive analyses of available big sequence data. We previously developed a Batch-Learning Self-Organizing Map (BLSOM), which can cluster genomic fragment sequences according to phylotype solely on the basis of oligonucleotide composition, and applied it to genome and metagenomic studies. BLSOM is suitable for high-performance parallel computing and can analyze big data simultaneously, but a large-scale BLSOM needs a large computational resource. We have developed Self-Compressing BLSOM (SC-BLSOM) to reduce computation time, which allows us to carry out comprehensive analysis of big sequence data without the use of high-performance supercomputers. The strategy of SC-BLSOM is to construct BLSOMs hierarchically according to data class, such as phylotype. The first-layer BLSOMs were constructed with each of the divided input data pieces representing a data subclass, such as a phylotype division, compressing the number of data pieces. The second-layer BLSOM was then constructed from the weight vectors obtained in the first-layer BLSOMs. We compared SC-BLSOM with the conventional BLSOM by analyzing bacterial genome sequences. SC-BLSOM could be constructed faster than BLSOM and clustered the sequences according to phylotype with high accuracy, showing the method's suitability for efficient knowledge discovery from big sequence data.
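
    BLSOM proper is a batch-learning SOM over oligonucleotide frequencies; the two-layer compression strategy can be sketched with an ordinary sequential SOM. Everything below (toy data, node counts, learning schedule) is illustrative rather than the published method.

```python
import numpy as np

def train_som(data, n_nodes, iters=2000, seed=0):
    """Minimal 1-D self-organizing map; returns the node weight vectors."""
    rng = np.random.default_rng(seed)
    w = data[rng.integers(len(data), size=n_nodes)].astype(float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        lr = 0.5 * (1 - t / iters)                     # decaying learning rate
        radius = max(1.0, (n_nodes / 2) * (1 - t / iters))
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))    # best-matching unit
        d = np.abs(np.arange(n_nodes) - bmu)
        h = np.exp(-(d ** 2) / (2 * radius ** 2))      # neighborhood kernel
        w += lr * h[:, None] * (x - w)
    return w

# SC-BLSOM-style compression: one small SOM per data subclass, then a
# second layer trained on the pooled first-layer weight vectors.
rng = np.random.default_rng(1)
subclasses = [rng.normal(loc=m, size=(500, 16)) for m in (0.0, 1.0, 2.0)]
first = [train_som(d, n_nodes=20) for d in subclasses]   # 500 -> 20 each
second = train_som(np.vstack(first), n_nodes=30)         # only 60 inputs
print(second.shape)  # (30, 16)
```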

  2. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    NASA Astrophysics Data System (ADS)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments that engage computational thinking via simulations, visualizations and data analysis. These activities have the potential to empower students to learn, create and invent with technology. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  3. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads, removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. In conclusion, COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  4. Course Modularization Applied: The Interface System and Its Implications For Sequence Control and Data Analysis.

    ERIC Educational Resources Information Center

    Schneider, E. W.

    The Interface System is a comprehensive method for developing and managing computer-assisted instructional courses or computer-managed instructional courses composed of sets of instructional modules. Each module is defined by one or more behavioral objectives and by a list of prerequisite modules that must be completed successfully before the…

  5. LXtoo: an integrated live Linux distribution for the bioinformatics community

    PubMed Central

    2012-01-01

    Background: Recent advances in high-throughput technologies dramatically increase biological data generation. However, many research groups lack computing facilities and specialists. This is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Findings: Unlike most of the existing live Linux distributions for bioinformatics limiting their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high performance computing. Conclusions: LXtoo aims to provide well-supported computing environment tailored for bioinformatics research, reducing duplication of efforts in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo. PMID:22813356

  6. LXtoo: an integrated live Linux distribution for the bioinformatics community.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Meng, Xiao-Hua; He, Qing-Yu

    2012-07-19

    Recent advances in high-throughput technologies dramatically increase biological data generation. However, many research groups lack computing facilities and specialists. This is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Unlike most of the existing live Linux distributions for bioinformatics limiting their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high performance computing. LXtoo aims to provide well-supported computing environment tailored for bioinformatics research, reducing duplication of efforts in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo.

  7. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
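
    The comprehensive methodology itself lives in specialized NASA codes; the textbook relation any such crack-growth prediction builds on is the Paris law, sketched here with invented material constants and geometry.

```python
import math

def cycles_to_grow(a0, af, dsigma, C=1e-11, m=3.0, Y=1.12, da=1e-5):
    """Integrate the Paris law da/dN = C * (dK)**m with the stress
    intensity range dK = Y * dsigma * sqrt(pi * a). Units: metres and
    MPa; C, m, Y are illustrative, not values from the paper."""
    a, n = a0, 0.0
    while a < af:
        dK = Y * dsigma * math.sqrt(math.pi * a)
        n += da / (C * dK ** m)        # cycles spent growing by da
        a += da
    return n

# grow a 1 mm rivet-hole crack to 10 mm under a 100 MPa stress range
print(f"{cycles_to_grow(1e-3, 10e-3, 100.0):.3g} cycles")
```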

  8. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  9. Interfacing comprehensive rotorcraft analysis with advanced aeromechanics and vortex wake models

    NASA Astrophysics Data System (ADS)

    Liu, Haiying

    This dissertation describes three aspects of the comprehensive rotorcraft analysis. First, a physics-based methodology for the modeling of hydraulic devices within multibody-based comprehensive models of rotorcraft systems is developed. This newly proposed approach can predict the fully nonlinear behavior of hydraulic devices, and pressure levels in the hydraulic chambers are coupled with the dynamic response of the system. The proposed hydraulic device models are implemented in a multibody code and calibrated by comparing their predictions with test bench measurements for the UH-60 helicopter lead-lag damper. Predicted peak damping forces were found to be in good agreement with measurements, while the model did not predict the entire time history of damper force to the same level of accuracy. The proposed model evaluates relevant hydraulic quantities such as chamber pressures, orifice flow rates, and pressure relief valve displacements. This model could be used to design lead-lag dampers with desirable force and damping characteristics. The second part of this research is in the area of computational aeroelasticity, in which an interface between computational fluid dynamics (CFD) and computational structural dynamics (CSD) is established. This interface enables data exchange between CFD and CSD with the goal of achieving accurate airloads predictions. In this work, a loose coupling approach based on the delta-airloads method is developed in a finite-element method based multibody dynamics formulation, DYMORE. To validate this aerodynamic interface, a CFD code, OVERFLOW-2, is loosely coupled with a CSD program, DYMORE, to compute the airloads of different flight conditions for Sikorsky UH-60 aircraft. This loose coupling approach has good convergence characteristics. The predicted airloads are found to be in good agreement with the experimental data, although not for all flight conditions. In addition, the tight coupling interface between the CFD program, OVERFLOW-2, and the CSD program, DYMORE, is also established. The ability to accurately capture the wake structure around a helicopter rotor is crucial for rotorcraft performance analysis. In the third part of this thesis, a new representation of the wake vortex structure based on Non-Uniform Rational B-Spline (NURBS) curves and surfaces is proposed to develop an efficient model for prescribed and free wakes. NURBS curves and surfaces are able to represent complex shapes with remarkably little data. The proposed formulation has the potential to reduce the computational cost associated with the use of Helmholtz's law and the Biot-Savart law when calculating the induced flow field around the rotor. An efficient free-wake analysis will considerably decrease the computational cost of comprehensive rotorcraft analysis, making the approach more attractive to routine use in industrial settings.
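
    The Biot-Savart evaluation whose cost motivates the NURBS wake representation reduces, for a straight vortex filament segment, to a closed-form expression (the standard finite-segment result); the test point and circulation below are invented.

```python
import numpy as np

def segment_induced_velocity(p, a, b, gamma):
    """Velocity induced at point p by a straight vortex filament from
    a to b carrying circulation gamma (finite-segment Biot-Savart)."""
    r1, r2 = p - a, p - b
    n1, n2 = np.linalg.norm(r1), np.linalg.norm(r2)
    denom = n1 * n2 * (n1 * n2 + r1 @ r2)
    if denom < 1e-12:                 # point on the filament axis
        return np.zeros(3)
    return gamma / (4 * np.pi) * (n1 + n2) / denom * np.cross(r1, r2)

# velocity one unit below the midpoint of a unit-length vortex segment
v = segment_induced_velocity(np.array([0.5, 0.0, -1.0]),
                             np.array([0.0, 0.0, 0.0]),
                             np.array([1.0, 0.0, 0.0]), gamma=1.0)
print(v)   # ~[0, 0.071, 0]: induction normal to the segment-point plane
```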

  10. Improved Foundry Castings Utilizing CAD/CAM (Computer Aided Design/ Computer Aided Manufacture). Volume 1. Overview

    DTIC Science & Technology

    1988-06-30

    [Record text garbled in extraction; recoverable fragments include figure titles (line printer representation of roll solidification; test casting model; division of test casting) and conclusions stating that UPCAST, a comprehensive software package, has been developed, and that new casting analysis and design routines would take advantage of advanced criteria for predicting casting soundness and cast properties as well as technical advances in computer hardware and software.]

  11. Transient thermal, hydraulic, and mechanical analysis of a counter flow offset strip fin intermediate heat exchanger using an effective porous media approach

    NASA Astrophysics Data System (ADS)

    Urquiza, Eugenio

    This work presents a comprehensive thermal hydraulic analysis of a compact heat exchanger using offset strip fins. The thermal hydraulics analysis in this work is followed by a finite element analysis (FEA) to predict the mechanical stresses experienced by an intermediate heat exchanger (IHX) during steady-state operation and selected flow transients. In particular, the scenario analyzed involves a gas-to-liquid IHX operating between high pressure helium and liquid or molten salt. In order to estimate the stresses in compact heat exchangers a comprehensive thermal and hydraulic analysis is needed. Compact heat exchangers require very small flow channels and fins to achieve high heat transfer rates and thermal effectiveness. However, studying such small features computationally contributes little to the understanding of component level phenomena and requires prohibitive computational effort using computational fluid dynamics (CFD). To address this issue, the analysis developed here uses an effective porous media (EPM) approach; this greatly reduces the computation time and produces results with the appropriate resolution [1]. This EPM fluid dynamics and heat transfer computational code has been named the Compact Heat Exchanger Explicit Thermal and Hydraulics (CHEETAH) code. CHEETAH solves for the two-dimensional steady-state and transient temperature and flow distributions in the IHX including the complicating effects of temperature-dependent fluid thermo-physical properties. Temperature- and pressure-dependent fluid properties are evaluated by CHEETAH and the thermal effectiveness of the IHX is also calculated. Furthermore, the temperature distribution can then be imported into a finite element analysis (FEA) code for mechanical stress analysis using the EPM methods developed earlier by the University of California, Berkeley, for global and local stress analysis [2]. These simulation tools will also allow the heat exchanger design to be improved through an iterative design process which will lead to a design with a reduced pressure drop, increased thermal effectiveness, and improved mechanical performance as it relates to creep deformation and transient thermal stresses.
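
    CHEETAH itself solves the two-dimensional EPM temperature and flow fields; as a component-level orientation, the thermal effectiveness it reports for a counterflow exchanger obeys the standard effectiveness-NTU relation, sketched here with invented operating numbers.

```python
import math

def counterflow_effectiveness(ntu, cr):
    """Standard effectiveness-NTU correlation for a counterflow heat
    exchanger; cr is the capacity-rate ratio Cmin/Cmax."""
    if abs(cr - 1.0) < 1e-9:
        return ntu / (1.0 + ntu)          # balanced-stream limit
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# hypothetical helium/molten-salt IHX operating point
print(f"effectiveness = {counterflow_effectiveness(4.0, 0.9):.3f}")  # 0.831
```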

  12. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review.

    PubMed

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.
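
    As a concrete baseline of the kind this review surveys (not the review's own method), a classic threshold-plus-watershed nucleus splitter can be sketched with scikit-image; the synthetic two-blob image is invented.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, filters, measure, segmentation

def segment_nuclei(image):
    """Classic pipeline: Otsu threshold -> distance transform ->
    marker-controlled watershed to split touching nuclei."""
    mask = image > filters.threshold_otsu(image)
    distance = ndi.distance_transform_edt(mask)
    coords = feature.peak_local_max(distance, min_distance=5,
                                    labels=measure.label(mask))
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return segmentation.watershed(-distance, markers, mask=mask)

# synthetic image: two overlapping Gaussian "nuclei"
yy, xx = np.mgrid[0:64, 0:64]
img = (np.exp(-((yy - 30) ** 2 + (xx - 24) ** 2) / 60.0)
       + np.exp(-((yy - 34) ** 2 + (xx - 40) ** 2) / 60.0))
mask = img > filters.threshold_otsu(img)
print(measure.label(mask).max())   # typically 1: the blobs merge
print(segment_nuclei(img).max())   # typically 2: watershed splits them
```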

  13. Environmental Testing and Thermal Analysis of the NPS Solar Cell Array Tester (NPS-SCAT) CubeSat

    DTIC Science & Technology

    2011-06-01

    [Record text garbled in extraction; recoverable fragments are an acronym list (BCR: Battery Charge Regulator; C&DH: Command and Data Handling; CAD: Computer Aided Design; CDR: Critical Design Review; CFT: Comprehensive Functional Test; CPT: Comprehensive Performance Test; CoM: Center of Mass; COTS: Commercial Off-the-Shelf; CTB: Cargo Transfer Bag; EDU: Engineering Design Unit; EPS) and a section opening stating that environmental testing is an important element of the design and testing of a satellite.]

  14. Modelling and simulation of complex sociotechnical systems: envisioning and analysing work environments

    PubMed Central

    Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter

    2015-01-01

    Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227

  15. CODAP: Programmer Notes for the Subroutine Library on the Univac 1108.

    ERIC Educational Resources Information Center

    Weissmuller, Johnny J.; And Others

    The Comprehensive Occupational Data Analysis Programs (CODAP) package is a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational information. Since its inception in 1960, CODAP has grown in tandem with advances in job analysis methodology and is now capable of answering most of the wide…

  16. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  17. Analysis of the Effects of the Computer Enhanced Classroom on the Achievement of Remedial High School Math Students.

    ERIC Educational Resources Information Center

    Lang, William Steve; And Others

    The effects of the use of computer-enhanced instruction with remedial students were assessed, using 4,293 ninth through twelfth graders--3,308 Black, 957 White, and 28 Other--involved in the Governor's Remediation Initiative (GRI) in Georgia. Data sources included the Comprehensive Tests of Basic Skills (CTBS), a data collection form developed for…

  18. Identification of metabolic pathways using pathfinding approaches: a systematic review.

    PubMed

    Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang

    2017-03-01

    Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular, mathematical pathfinding approaches. This article facilitates understanding of the computational analysis of metabolic pathways in genomics. Stoichiometric and pathfinding approaches to metabolic pathway analysis are discussed, and three major types of study are elaborated: stoichiometric identification models, pathway-based graph analysis and pathfinding approaches in cellular metabolism. Furthermore, evaluation of pathway outcomes with mathematical benchmarking metrics is provided. This review should lead to a better comprehension of metabolic behavior in living cells in terms of computational pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
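
    To make the pathfinding theme concrete, here is a minimal sketch (not taken from the review itself) that finds a metabolite-to-metabolite route through a toy bipartite metabolite/reaction graph using networkx; all metabolite and reaction names are illustrative.

```python
# Toy metabolic network as a directed bipartite graph: substrate -> reaction
# and reaction -> product edges. Names are illustrative only.
import networkx as nx

G = nx.DiGraph()
reactions = {
    "hexokinase": (["glucose", "ATP"], ["glucose-6-phosphate", "ADP"]),
    "pgi": (["glucose-6-phosphate"], ["fructose-6-phosphate"]),
    "pfk": (["fructose-6-phosphate", "ATP"], ["fructose-1,6-bisphosphate", "ADP"]),
}
for rxn, (substrates, products) in reactions.items():
    for s in substrates:
        G.add_edge(s, rxn)
    for p in products:
        G.add_edge(rxn, p)

# Shortest route between two metabolites, alternating metabolite/reaction nodes
path = nx.shortest_path(G, "glucose", "fructose-1,6-bisphosphate")
print(" -> ".join(path))
```

    Real pathfinding tools additionally down-weight or exclude hub "currency" metabolites such as ATP, which would otherwise create biologically meaningless shortcuts.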

  19. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    ERIC Educational Resources Information Center

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  20. Comprehensive Micromechanics-Analysis Code - Version 4.0

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Bednarcyk, B. A.

    2005-01-01

    Version 4.0 of the Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC) has been developed as an improved means of computational simulation of advanced composite materials. The previous version of MAC/GMC was described in "Comprehensive Micromechanics-Analysis Code" (LEW-16870), NASA Tech Briefs, Vol. 24, No. 6 (June 2000), page 38. To recapitulate: MAC/GMC is a computer program that predicts the elastic and inelastic thermomechanical responses of continuous and discontinuous composite materials with arbitrary internal microstructures and reinforcement shapes. The predictive capability of MAC/GMC rests on a model known as the generalized method of cells (GMC) - a continuum-based model of micromechanics that provides closed-form expressions for the macroscopic response of a composite material in terms of the properties, sizes, shapes, and responses of the individual constituents or phases that make up the material. Enhancements in version 4.0 include a capability for modeling thermomechanically and electromagnetically coupled ("smart") materials; a more-accurate (high-fidelity) version of the GMC; a capability to simulate discontinuous plies within a laminate; additional constitutive models of materials; expanded yield-surface-analysis capabilities; and expanded failure-analysis and life-prediction capabilities on both the microscopic and macroscopic scales.
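
    The idea of predicting macroscopic response from constituent properties can be illustrated, far more crudely than GMC does it, with the elementary Voigt and Reuss bounds; the constituent values below are hypothetical.

```python
# Illustrative only: Voigt (isostrain) and Reuss (isostress) bounds on the
# effective Young's modulus of a two-phase composite. Much simpler than the
# generalized method of cells, but the same constituents-to-macroscale idea.
E_fiber, E_matrix = 230.0, 3.5   # GPa; hypothetical carbon fiber / epoxy
v_fiber = 0.6                    # fiber volume fraction

E_voigt = v_fiber * E_fiber + (1 - v_fiber) * E_matrix          # upper bound
E_reuss = 1.0 / (v_fiber / E_fiber + (1 - v_fiber) / E_matrix)  # lower bound
print(f"Voigt bound: {E_voigt:.1f} GPa, Reuss bound: {E_reuss:.1f} GPa")
```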

  1. Advanced technology airfoil research, volume 2. [conferences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A comprehensive review of airfoil research is presented. The major thrust of the research is in three areas: development of computational aerodynamic codes for airfoil analysis and design, development of experimental facilities and test techniques, and all types of airfoil applications.

  2. Computational nanoelectronics towards: design, analysis, synthesis, and fundamental limits

    NASA Technical Reports Server (NTRS)

    Klimeck, G.

    2003-01-01

    This seminar will review the development of a comprehensive nanoelectronic modeling tool (NEMO 1-D and NEMO 3-D) and its application to high-speed electronics (resonant tunneling diodes) and IR detectors and lasers (quantum dots and 1-D heterostructures).

  3. Comprehensive Flood Plain Studies Using Spatial Data Management Techniques.

    DTIC Science & Technology

    1978-06-01

    Hydrologic Engineering Center computer programs that forecast urban storm water quality and dynamic in-stream water quality response to waste...determination. Water Quality: The water quality analysis planned for the pilot study includes urban storm water quality forecasting and in-stream...analysis is performed under the direction of Tony Thomas, Chief, Research Branch, by Jess Abbott for storm water quality analysis, R. G. Willey for

  4. Updated Lagrangian finite element formulations of various biological soft tissue non-linear material models: a comprehensive procedure and review.

    PubMed

    Townsend, Molly T; Sarigul-Klijn, Nesrin

    2016-01-01

    Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper offers a comprehensive updated Lagrangian formulation procedure for various non-linear material models for the finite element analysis of biological soft tissues, including definitions of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.
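
    As a minimal illustration of the kind of constitutive evaluation such formulations require (an assumed standard textbook model, not necessarily the paper's own), the sketch below computes the Cauchy stress of a compressible neo-Hookean solid from a deformation gradient.

```python
# Assumed compressible neo-Hookean model (illustrative, not the paper's):
# sigma = (mu/J)(B - I) + (lam * ln J / J) I, with B = F F^T, J = det F.
import numpy as np

def neo_hookean_cauchy(F, mu=1.0, lam=10.0):
    """Cauchy stress from a 3x3 deformation gradient F."""
    J = np.linalg.det(F)
    B = F @ F.T
    I = np.eye(3)
    return (mu / J) * (B - I) + (lam * np.log(J) / J) * I

# 10% uniaxial stretch along x
F = np.diag([1.1, 1.0, 1.0])
print(neo_hookean_cauchy(F))
```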

  5. Comprehensive silicon solar cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

    The development of an efficient, comprehensive Si solar cell modeling program capable of a simulation accuracy of 5 percent or better is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) the analytical method used to represent the physical system; (2) the phenomena submodels that comprise the simulation of the system; (3) coding of the analysis and the phenomena submodels; (4) a coding scheme that results in efficient use of the CPU so that CPU costs are low; and (5) a simulation program modularized with respect to the structures that may be analyzed, the addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.

  6. Advanced Technology Airfoil Research, volume 1, part 1. [conference on development of computational codes and test facilities

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A comprehensive review of all NASA airfoil research, conducted both in-house and under grant and contract, as well as a broad spectrum of airfoil research outside of NASA is presented. Emphasis is placed on the development of computational aerodynamic codes for airfoil analysis and design, the development of experimental facilities and test techniques, and all types of airfoil applications.

  7. A CS1 pedagogical approach to parallel thinking

    NASA Astrophysics Data System (ADS)

    Rague, Brian William

    Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.

  8. A practice course to cultivate students' comprehensive ability of photoelectricity

    NASA Astrophysics Data System (ADS)

    Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang

    2017-08-01

    After the studying of many theoretical courses, it's important and urgent for the students from specialty of optoelectronic information science and engineering to cultivate their comprehensive ability of photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) for the purpose that students can integrate their knowledge of optics, electronics and computer programming to design, install and debug an optoelectronic system with independent functions. Eight years of practice shows that this practice course can train students' ability of analysis, design/development and debugging of photoelectric system, improve their ability in document retrieval, design proposal and summary report writing, teamwork, innovation consciousness and skill.

  9. Preliminary Findings on the Computer-Administered Multiple-Choice Online Causal Comprehension Assessment, a Diagnostic Reading Comprehension Test

    ERIC Educational Resources Information Center

    Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen

    2018-01-01

    The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…

  10. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  11. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences

    PubMed Central

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627

  12. Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks

    PubMed Central

    Kaltenbacher, Barbara; Hasenauer, Jan

    2017-01-01

    Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
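
    The estimation problem itself can be sketched in a few lines as a least-squares fit of ODE parameters to time-course data. Note that this naive sketch does not implement the paper's adjoint sensitivity analysis, which is precisely what makes gradient computation scale to genome-size models; the toy network and all values below are invented.

```python
# Least-squares fit of two rate constants in a toy A -> B -> 0 network.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def rhs(t, x, k1, k2):            # toy two-species reaction network
    a, b = x
    return [-k1 * a, k1 * a - k2 * b]

t_obs = np.linspace(0, 10, 20)
true = (0.8, 0.3)
sol = solve_ivp(rhs, (0, 10), [1.0, 0.0], t_eval=t_obs, args=true)
data = sol.y[1] + np.random.default_rng(0).normal(0, 0.01, t_obs.size)

def objective(theta):             # sum of squared residuals on species B
    s = solve_ivp(rhs, (0, 10), [1.0, 0.0], t_eval=t_obs, args=tuple(theta))
    return np.sum((s.y[1] - data) ** 2)

fit = minimize(objective, x0=[0.5, 0.5], method="Nelder-Mead")
print("estimated k1, k2:", fit.x)
```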

  13. On the Computation of Comprehensive Boolean Gröbner Bases

    NASA Astrophysics Data System (ADS)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal $I$ in a Boolean polynomial ring $\mathbb{B}(\bar{A}, \bar{X})$ with main variables $\bar{X}$ and parameters $\bar{A}$ can be obtained by simply computing a usual Boolean Gröbner basis of $I$, regarding both $\bar{X}$ and $\bar{A}$ as variables under a certain block term order with $\bar{X} \gg \bar{A}$. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field $\mathbb{GF}_2$, enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over $\mathbb{GF}_2$. Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
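
    A sketch of the reduction described above, assuming sympy's groebner with modulus=2 as a stand-in for a dedicated Boolean Gröbner engine (the example ideal is invented): Boolean behaviour is enforced by adjoining the field equations $v^2 - v$ for every variable, and a lex order with the main variables first plays the role of the block order.

```python
# Illustrative reduction: compute a "Boolean" Groebner basis in an ordinary
# polynomial ring over GF(2) by adjoining the field equations v**2 - v,
# which force every variable to take Boolean values. Example ideal invented.
from sympy import symbols, groebner

x, y, a = symbols("x y a")                 # x, y: main variables; a: parameter
polys = [x*y + a*x, x + y + a]             # an illustrative ideal
field_eqs = [v**2 - v for v in (x, y, a)]  # Boolean (idempotency) constraints

# lex order with x > y > a acts like a block order with main variables first
gb = groebner(polys + field_eqs, x, y, a, order="lex", modulus=2)
for g in gb.exprs:
    print(g)
```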

  14. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review

    PubMed Central

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to inter-observer variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literatures. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast (DIC), fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation. PMID:26742143

  15. A Computational Clonal Analysis of the Developing Mouse Limb Bud

    PubMed Central

    Marcon, Luciano; Arqués, Carlos G.; Torres, Miguel S.; Sharpe, James

    2011-01-01

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses. PMID:21347315

  16. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  17. A Program for Computing Steady Inviscid Three-Dimensional Supersonic Flow on Reentry Vehicles. Volume I: Analysis and Programming

    DTIC Science & Technology

    1977-02-11

    A comprehensive computational procedure is presented for predicting the...Aeroballistic Reentry Technology (ART) program with some of the fundamental analytical and numerical work supported by NSWC Independent Research Funds. Most of...the Aerospace Corporation. The authors gratefully acknowledge the efforts of Mr. R. Feldhuhn, NSWC coordinator for the ART program, who was responsible

  18. Comprehensive Computer-Based Instructional Programs: What Works for Educationally Disadvantaged Students?

    ERIC Educational Resources Information Center

    Swan, Karen; And Others

    The Computer Pilot Program of the Division of Computer Information Services of the New York City Board of Education was designed to investigate the claim that comprehensive computer-based instruction (CBI) might best be used to improve the basic skills of educationally disadvantaged students. This ongoing project is designed to identify…

  19. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.

  20. Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications

    NASA Technical Reports Server (NTRS)

    Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet

    2004-01-01

    The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are composed of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes, applied independently, to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).
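
    The look-up-table mechanism these comprehensive codes rely on amounts to multidimensional interpolation; the sketch below interpolates a lift coefficient over angle of attack and Mach number, with all table values invented for illustration.

```python
# 2-D airfoil coefficient table lookup, the kind of operation comprehensive
# rotorcraft codes perform at every blade section. Table values are invented.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

alpha = np.array([0.0, 4.0, 8.0, 12.0])   # angle of attack, deg
mach = np.array([0.3, 0.5, 0.7])          # Mach number
# cl[i, j] = lift coefficient at alpha[i], mach[j] (hypothetical data)
cl = np.array([[0.00, 0.00, 0.00],
               [0.44, 0.48, 0.55],
               [0.86, 0.93, 1.02],
               [1.20, 1.25, 1.10]])

cl_table = RegularGridInterpolator((alpha, mach), cl)
print(cl_table([[6.0, 0.6]]))   # interpolated cl at alpha = 6 deg, M = 0.6
```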

  1. Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?

    ERIC Educational Resources Information Center

    Fountaine, Drew

    This paper takes a look at some recent studies on utilization of technical tools, primarily personal computers and software, for improving fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…

  2. Competencies in Organizational E-Learning: Concepts and Tools

    ERIC Educational Resources Information Center

    Sicilia, Miguel-Angel, Ed.

    2007-01-01

    "Competencies in Organizational E-Learning: Concepts and Tools" provides a comprehensive view of the way competencies can be used to drive organizational e-learning, including the main conceptual elements, competency gap analysis, advanced related computing topics, the application of semantic Web technologies, and the integration of competencies…

  3. HART-II: Prediction of Blade-Vortex Interaction Loading

    NASA Technical Reports Server (NTRS)

    Lim, Joon W.; Tung, Chee; Yu, Yung H.; Burley, Casey L.; Brooks, Thomas; Boyd, Doug; vanderWall, Berend; Schneider, Oliver; Richard, Hugues; Raffel, Markus

    2003-01-01

    During the HART-I data analysis, the need for comprehensive wake data was identified, including vortex creation and aging and vortex re-development after blade-vortex interaction. In October 2001, the US Army AFDD, NASA Langley, the German DLR, the French ONERA and the Dutch DNW performed the HART-II test as an international joint effort. The main objective was to focus on rotor wake measurement using a PIV technique along with comprehensive data on blade deflections, airloads, and acoustics. Three prediction teams made preliminary correlation efforts with HART-II data: a joint US team of US Army AFDD and NASA Langley, the German DLR, and the French ONERA. The predicted results showed significant improvements over the HART-I predictions, computed several years ago, indicating a better understanding of complicated wake modeling in comprehensive rotorcraft analysis. All three teams demonstrated satisfactory prediction capabilities in general, though there were slight variations in prediction accuracy across disciplines.

  4. An evaluation of a computer code based on linear acoustic theory for predicting helicopter main rotor noise

    NASA Astrophysics Data System (ADS)

    Davis, S. J.; Egolf, T. A.

    1980-07-01

    Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs-Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free field conditions. Results of the correlation show that the Farrassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done identifying and incorporating additional noise mechanisms into the code.

  5. Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance

    ERIC Educational Resources Information Center

    Higgins, Jennifer; Russell, Michael; Hoffmann, Thomas

    2005-01-01

    To examine the impact of transitioning 4th grade reading comprehension assessments to the computer, 219 fourth graders were randomly assigned to take a one-hour reading comprehension assessment on paper, on a computer using scrolling text to navigate through passages, or on a computer using paging text to navigate through passages. This study…

  6. Computing a Comprehensible Model for Spam Filtering

    NASA Astrophysics Data System (ADS)

    Ruiz-Sepúlveda, Amparo; Triviño-Rodriguez, José L.; Morales-Bueno, Rafael

    In this paper, we describe the application of the Decision Tree Boosting (DTB) learning model to spam email filtering. This classification task implies learning in a high-dimensional feature space, so it is an example of how the DTB algorithm performs in such problems. In [1], it has been shown that hypotheses computed by the DTB model are more comprehensible than those computed by other ensemble methods. Hence, this paper tries to show that the DTB algorithm maintains the same comprehensibility of hypotheses in high-dimensional feature space problems while achieving the performance of other ensemble methods. Four traditional evaluation measures (precision, recall, F1 and accuracy) have been considered for performance comparison between DTB and other models usually applied to spam email filtering. The size of the hypothesis computed by DTB is smaller and more comprehensible than the hypotheses computed by AdaBoost and Naïve Bayes.
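
    As a stand-in for the paper's DTB model (which is not reproduced here), the sketch below boosts decision trees with scikit-learn's AdaBoost on synthetic high-dimensional data and reports the same four evaluation measures.

```python
# Not the paper's DTB implementation: boosted decision trees via
# scikit-learn's AdaBoost (the default base learner is a depth-1 tree)
# on synthetic high-dimensional "spam-like" data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# High-dimensional feature space, mimicking bag-of-words spam features
X, y = make_classification(n_samples=2000, n_features=500,
                           n_informative=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

# The four evaluation measures used in the paper
for name, fn in [("precision", precision_score), ("recall", recall_score),
                 ("F1", f1_score), ("accuracy", accuracy_score)]:
    print(f"{name}: {fn(y_te, pred):.3f}")
```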

  7. Nationwide disturbance attribution on NASA’s earth exchange: experiences in a high-end computing environment

    Treesearch

    J. Chris Toney; Karen G. Schleeweis; Jennifer Dungan; Andrew Michaelis; Todd Schroeder; Gretchen G. Moisen

    2015-01-01

    The North American Forest Dynamics (NAFD) project’s Attribution Team is completing nationwide processing of historic Landsat data to provide a comprehensive annual, wall-to-wall analysis of US disturbance history, with attribution, over the last 25+ years. Per-pixel time series analysis based on a new nonparametric curve fitting algorithm yields several metrics useful...

  8. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
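
    A heavily simplified Monte Carlo sketch of the PTHA idea described above: sample sources, map each to a local runup with scatter, and accumulate an annual exceedance curve. Every distribution and constant below is invented for illustration only.

```python
# Toy Monte Carlo hazard curve: all rates, scalings and scatter are invented.
import numpy as np

rng = np.random.default_rng(1)
n_events = 100_000
annual_rate = 0.1                        # assumed rate of tsunamigenic events

# Gutenberg-Richter-like magnitudes, truncated to M 7-9
mags = 7.0 + rng.exponential(scale=0.5, size=n_events)
mags = mags[mags <= 9.0]

# Hypothetical magnitude-to-runup scaling with lognormal scatter
median_runup = 10 ** (0.8 * (mags - 7.0)) * 0.5        # meters
runup = median_runup * rng.lognormal(0.0, 0.5, size=mags.size)

# Annual rate that runup exceeds each threshold (the hazard curve)
for h in (0.5, 1.0, 2.0, 5.0, 10.0):
    rate = annual_rate * np.mean(runup > h)
    print(f"runup > {h:4.1f} m: annual exceedance rate ~ {rate:.4f}")
```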

  9. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  10. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...
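
    The core RUSLE relation such a toolchain evaluates per grid cell is a product of empirical factors, A = R·K·LS·C·P; the sketch below uses invented factor values.

```python
# RUSLE average annual soil loss as a product of empirical factors.
# All factor values below are hypothetical, for illustration only.
R = 120.0   # rainfall-runoff erosivity
K = 0.3     # soil erodibility
LS = 1.4    # slope length-steepness factor
C = 0.2     # cover-management factor
P = 1.0     # support practice factor

A = R * K * LS * C * P   # tons/acre/year in customary US units
print(f"predicted soil loss: {A:.1f}")
```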

  11. Chi-Square Statistics, Tests of Hypothesis and Technology.

    ERIC Educational Resources Information Center

    Rochowicz, John A.

    The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
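
    The computation the paper advocates automating is a one-liner with scipy: a chi-square test of independence on a contingency table returns the statistic and p-value directly (the counts below are hypothetical).

```python
# Chi-square test of independence on a 2x2 contingency table.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([[30, 10],    # e.g., pass/fail counts for two groups
                     [20, 25]])
stat, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {stat:.3f}, dof = {dof}, p = {p_value:.4f}")
```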

  12. Studies for development of novel quinazolinones: New biomarker for EGFR

    NASA Astrophysics Data System (ADS)

    Aggarwal, Swati; Sinha, Deepa; Tiwari, Anjani Kumar; Pooja, Pooja; Kaul, Ankur; Singh, Gurmeet; Mishra, Anil Kumar

    2015-05-01

    The binding capabilities of a series of novel quinazolinone molecules were established through a comprehensive computational methodology as well as by in vitro analysis. The main focus of this work was to gain more insight into the interactions with the crystal structure of PDB ID:

  13. Computational studies on the excited state properties of citrinin and application in fluorescence analysis

    USDA-ARS?s Scientific Manuscript database

    Citrinin is a mycotoxin of increasing concern that is produced by fungi associated with maize, red yeast rice, and other agricultural commodities. A comprehensive time-dependent density functional study on the excited state properties of citrinin was conducted to identify parameters for reliable det...

  14. CODAP: Source Program Listings for the Univac 1108.

    ERIC Educational Resources Information Center

    Weissmuller, Johnny J.; And Others

    Documentation of the Univac 1108 Comprehensive Occupational Data Analysis Programs (CODAP) system is being published in a series of three technical reports covering the control card and programing aspects of the system, which is a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational…

  15. CODAP: Control Card Specifications for the Univac 1108.

    ERIC Educational Resources Information Center

    Stacey, William D.; And Others

    The document is one of three in a series of technical reports covering the control card and programing aspects of the Comprehensive Occupational Data Analysis Programs (CODAP), a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational information. The document contains control card…

  16. MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications

    DTIC Science & Technology

    2007-05-23

    [Extraction residue from a defense acquisition life-cycle figure (Pre-Systems Acquisition, Critical Design Review, LRIP/IOT&E, FRP Decision Review, FOC, Operations & Support, Sustainment) omitted.] Acronyms defined in the report include: CMS2 – Comprehensive Munitions & Sensor Server; CSAT – C4ISR Static Analysis Tool; C4ISR – Command & Control, Communications, Computers

  17. Synchronous Computer-Mediated Communication and Interaction: A Research Synthesis and Meta-Analysis

    ERIC Educational Resources Information Center

    Ziegler, Nicole

    2013-01-01

    The interaction approach to second language acquisition (SLA) suggests that changes that occur during conversation facilitate second language development by providing learners with opportunities to receive modified comprehensible input and interactional feedback, to produce output, and to notice gaps between their interlanguage and the target…

  18. What's the Story? A Computational Analysis of Narrative Competence in Autism

    ERIC Educational Resources Information Center

    Lee, Michelle; Martin, Gary E.; Hogan, Abigail; Hano, Deanna; Gordon, Peter C.; Losh, Molly

    2018-01-01

    Individuals with autism spectrum disorder demonstrate narrative (i.e. storytelling) difficulties which can significantly impact their ability to form and maintain social relationships. However, existing research has not comprehensively documented these impairments in more open-ended, emotionally evocative situations common to daily interactions.…

  19. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low-speed wind tunnel environment. An analysis of the existing tools, their strengths, and their limitations is presented. Improvements to the existing analysis approaches were made, and software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three-dimensional separated wake data is developed, presenting the complete steps from the basic mathematical concept to the applicable engineering equations. An extensive experimental study was conducted: three representative body types were studied in varying ground effect conditions, and a detailed qualitative wake analysis using wake imaging and two- and three-dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed, along with significant post-processing and data conditioning tools and a precision analysis. The quality of the data is shown to correlate directly with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty; these are quantified when possible, and the accuracy of the computed results is seen to improve significantly. When post-processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained, and guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively; its effect on the accuracy of the computations, as well as the incompatibility of wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy: the aerodynamic drag coefficient is computed to within one percent of the balance-measured value for the best cases.
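
    For reference, the classical control-volume starting point for wake-survey drag in incompressible flow is the momentum-deficit integral below; the dissertation's working equations refine this with separated-wake and wind-tunnel-wall corrections not reproduced here.

```latex
% Classical wake-integral drag: momentum deficit plus static-pressure term,
% integrated over the wake survey plane (incompressible flow).
D = \rho \iint_{\text{wake}} u \left( U_\infty - u \right) \, \mathrm{d}y \, \mathrm{d}z
  + \iint_{\text{wake}} \left( p_\infty - p \right) \, \mathrm{d}y \, \mathrm{d}z
```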

  20. An interactive web-based application for Comprehensive Analysis of RNAi-screen Data.

    PubMed

    Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B; Germain, Ronald N; Smith, Jennifer A; Simpson, Kaylene J; Martin, Scott E; Buehler, Eugen; Fraser, Iain D C

    2016-02-23

    RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight into microRNA (miRNA) activity based on siRNA seed enrichment.
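
    Two of the workflow steps named above, normalization and threshold-based hit selection, can be sketched as follows (this is not CARD's actual pipeline; the plate data are simulated).

```python
# Per-plate z-score normalization and a simple |z| > 3 hit threshold.
import numpy as np

rng = np.random.default_rng(0)
n_plates, wells_per_plate = 10, 384
raw = rng.normal(1000, 150, size=(n_plates, wells_per_plate))
raw[3, 42] = 250.0                     # plant one strong "hit"

# Per-plate z-scores correct plate-to-plate systematic bias
z = (raw - raw.mean(axis=1, keepdims=True)) / raw.std(axis=1, keepdims=True)

hits = np.argwhere(np.abs(z) > 3.0)
for plate, well in hits:
    print(f"plate {plate}, well {well}: z = {z[plate, well]:.2f}")
```

    Production pipelines typically prefer robust variants (median/MAD-based z-scores) and apply off-target and expression filters before calling hits.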

  1. An interactive web-based application for Comprehensive Analysis of RNAi-screen Data

    PubMed Central

    Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B.; Germain, Ronald N.; Smith, Jennifer A.; Simpson, Kaylene J.; Martin, Scott E.; Beuhler, Eugen; Fraser, Iain D. C.

    2016-01-01

    RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight into microRNA (miRNA) activity based on siRNA seed enrichment. PMID:26902267

  2. The role of sustained attention and display medium in reading comprehension among adolescents with ADHD and without it.

    PubMed

    Stern, Pnina; Shalev, Lilach

    2013-01-01

    Difficulties in reading comprehension are common in children and adolescents with Attention Deficit/Hyperactivity Disorder (ADHD). The current study aimed at investigating the relation between sustained attention and reading comprehension among adolescents with and without ADHD. Another goal was to examine the impact of two manipulations of the text on the efficiency of reading comprehension: Spacing (standard- vs. double-spacing) and Type of presentation (computer screen vs. hard copy). Reading comprehension of two groups of adolescents (participants with ADHD and normal controls) was assessed and compared in four different conditions (standard printed, spaced printed, standard on computer screen, spaced on computer screen). In addition, participants completed a visual sustained attention task. Significant differences in reading comprehension and in sustained attention were obtained between the two groups. Also, a significant correlation was obtained between sustained attention and reading comprehension. Moreover, a significant interaction was revealed between presentation-type, spacing and level of sustained attention on reading comprehension. Implications for reading intervention and the importance of early assessment of attention functioning are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Comprehension and retrieval of failure cases in airborne observatories

    NASA Technical Reports Server (NTRS)

    Alvarado, Sergio J.; Mock, Kenrick J.

    1995-01-01

    This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.

  4. Comprehension and retrieval of failure cases in airborne observatories

    NASA Astrophysics Data System (ADS)

    Alvarado, Sergio J.; Mock, Kenrick J.

    1995-05-01

    This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.

  5. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0037: Prognosis-Based Control Reconfiguration for an Aircraft with Faulty Actuator to Enable Performance in a Degraded State

    DTIC Science & Technology

    2010-12-01

    computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g. suspensions, bodies) with hydraulic or...complex, comprehensive mechanical systems can be simulated in real time by parallel computers; examples include multibody systems, brake systems...hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in

  6. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be attained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach taken to meet these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for the design and integration of nonlinear control systems.

  7. Tissue classification for laparoscopic image understanding based on multispectral texture analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena

    2016-03-01

    Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.
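
    A minimal sketch of the feature combination the study argues for, texture plus reflectance spectrum, assuming scikit-image and scikit-learn as stand-ins for the authors' actual descriptors and classifier; all patch data below are synthetic.

```python
# Texture histogram (LBP) concatenated with the mean per-band spectrum,
# fed to a standard classifier. Synthetic data, illustrative only.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def patch_features(patch):
    """patch: (H, W, n_bands) multispectral image patch."""
    gray = patch.mean(axis=2)                          # collapse bands for texture
    lbp = local_binary_pattern(gray, P=8, R=1.0, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    spectrum = patch.mean(axis=(0, 1))                 # mean reflectance per band
    return np.concatenate([hist, spectrum])

rng = np.random.default_rng(0)
# Two fake "tissue types" differing in band intensity
patches = [rng.normal(loc, 0.1, size=(32, 32, 8))
           for loc in (0.3, 0.6) for _ in range(20)]
labels = [0] * 20 + [1] * 20
X = np.array([patch_features(p) for p in patches])
clf = SVC().fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```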

  8. Enhancing Next-Generation Sequencing-Guided Cancer Care Through Cognitive Computing.

    PubMed

    Patel, Nirali M; Michelini, Vanessa V; Snell, Jeff M; Balu, Saianand; Hoyle, Alan P; Parker, Joel S; Hayward, Michele C; Eberhard, David A; Salazar, Ashley H; McNeillie, Patrick; Xu, Jia; Huettner, Claudia S; Koyama, Takahiko; Utro, Filippo; Rhrissorrakrai, Kahn; Norel, Raquel; Bilal, Erhan; Royyuru, Ajay; Parida, Laxmi; Earp, H Shelton; Grilley-Olson, Juneko E; Hayes, D Neil; Harvey, Stephen J; Sharpless, Norman E; Kim, William Y

    2018-02-01

    Using next-generation sequencing (NGS) to guide cancer therapy has created challenges in analyzing and reporting large volumes of genomic data to patients and caregivers. Specifically, providing current, accurate information on newly approved therapies and open clinical trials requires considerable manual curation performed mainly by human "molecular tumor boards" (MTBs). The purpose of this study was to determine the utility of cognitive computing as performed by Watson for Genomics (WfG) compared with a human MTB. One thousand eighteen patient cases that previously underwent targeted exon sequencing at the University of North Carolina (UNC) and subsequent analysis by the UNCseq informatics pipeline and the UNC MTB between November 7, 2011, and May 12, 2015, were analyzed with WfG, a cognitive computing technology for genomic analysis. Using a WfG-curated actionable gene list, we identified additional genomic events of potential significance (not discovered by traditional MTB curation) in 323 (32%) patients. The majority of these additional genomic events were considered actionable based upon their ability to qualify patients for biomarker-selected clinical trials. Indeed, the opening of a relevant clinical trial within 1 month prior to WfG analysis provided the rationale for identification of a new actionable event in nearly a quarter of the 323 patients. This automated analysis took <3 minutes per case. These results demonstrate that the interpretation and actionability of somatic NGS results are evolving too rapidly to rely solely on human curation. Molecular tumor boards empowered by cognitive computing could potentially improve patient care by providing a rapid, comprehensive approach for data analysis and consideration of up-to-date availability of clinical trials. The results of this study demonstrate that the interpretation and actionability of somatic next-generation sequencing results are evolving too rapidly to rely solely on human curation. Molecular tumor boards empowered by cognitive computing can significantly improve patient care by providing a fast, cost-effective, and comprehensive approach for data analysis in the delivery of precision medicine. Patients and physicians who are considering enrollment in clinical trials may benefit from the support of such tools applied to genomic data. © AlphaMed Press 2017.

  9. An evaluation of a computer code based on linear acoustic theory for predicting helicopter main rotor noise. [CH-53A and S-76 helicopters

    NASA Technical Reports Server (NTRS)

    Davis, S. J.; Egolf, T. A.

    1980-01-01

    Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs-Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free field conditions. Results of the correlation show that the Farrassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done identifying and incorporating additional noise mechanisms into the code.

  10. Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks

    PubMed Central

    2014-01-01

    Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes the current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226
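
    As one concrete instance of the module-detection theme (not the authors' frequent-subgraph method), the sketch below runs networkx's greedy modularity optimization on a small synthetic network with two dense "complexes".

```python
# Community (module) detection on a toy interaction network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Two dense cliques joined by a single cross edge
G = nx.Graph()
G.add_edges_from(nx.complete_graph(range(0, 5)).edges)
G.add_edges_from(nx.complete_graph(range(5, 10)).edges)
G.add_edge(4, 5)

for i, module in enumerate(greedy_modularity_communities(G)):
    print(f"module {i}: {sorted(module)}")
```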

  11. The building loads analysis system thermodynamics (BLAST) program, Version 2. 0: input booklet. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sowell, E.

    1979-06-01

    The Building Loads Analysis and System Thermodynamics (BLAST) program is a comprehensive set of subprograms for predicting energy consumption in buildings. There are three major subprograms: (1) the space load predicting subprogram, which computes hourly space loads in a building or zone based on user input and hourly weather data; (2) the air distribution system simulation subprogram, which uses the computed space load and user inputs describing the building air-handling system to calculate hot water or steam, chilled water, and electric energy demands; and (3) the central plant simulation program, which simulates boilers, chillers, onsite power generating equipment and solar energy systems and computes monthly and annual fuel and electrical power consumption and plant life cycle cost.

  12. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  14. Assessment of the information content of patterns: an algorithm

    NASA Astrophysics Data System (ADS)

    Daemi, M. Farhang; Beurle, R. L.

    1991-12-01

    A preliminary investigation confirmed the possibility of assessing the translational and rotational information content of simple artificial images. The calculation is tedious, and for more realistic patterns it is essential to implement the method on a computer. This paper describes an algorithm developed for this purpose which confirms the results of the preliminary investigation. Use of the algorithm facilitates much more comprehensive analysis of the combined effect of continuous rotation and fine translation, and paves the way for analysis of more realistic patterns. Owing to the volume of calculation involved in these algorithms, extensive computing facilities were necessary. The major part of the work was carried out using an ICL 3900 series mainframe computer as well as other powerful workstations such as a RISC architecture MIPS machine.

  15. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  16. Computer-Based and Paper-Based Reading Comprehension in Adolescents with Typical Language Development and Language-Learning Disabilities

    ERIC Educational Resources Information Center

    Srivastava, Pradyumn; Gray, Shelley

    2012-01-01

    Purpose: With the global expansion of technology, our reading platform has shifted from traditional text to hypertext, yet little consideration has been given to how this shift might help or hinder students' reading comprehension. The purpose of this study was to compare reading comprehension of computer-based and paper-based texts in adolescents…

  17. The Effect of Computer-Assisted Language Learning on Reading Comprehension in an Iranian EFL Context

    ERIC Educational Resources Information Center

    Saeidi, Mahnaz; Yusefi, Mahsa

    2012-01-01

    This study is an attempt to examine the effect of computer-assisted language learning (CALL) on reading comprehension in an Iranian English as a foreign language (EFL) context. It was hypothesized that CALL has an effect on reading comprehension. Forty female learners of English at intermediate level after administering a proficiency test were…

  18. Computational Text Analysis: A More Comprehensive Approach to Determine Readability of Reading Materials

    ERIC Educational Resources Information Center

    Aziz, Anealka; Fook, Chan Yuen; Alsree, Zubaida

    2010-01-01

    Reading materials are considered having high readability if readers are interested to read the materials, understand the content of the materials and able to read the materials fluently. In contrast, reading materials with low readability discourage readers from reading the materials, create difficulties for readers to understand the content of…

  19. A Formative Analysis of Resources Used to Learn Software

    ERIC Educational Resources Information Center

    Kay, Robin

    2007-01-01

    A comprehensive, formal comparison of resources used to learn computer software has yet to be researched. Understanding the relative strengths and weakness of resources would provide useful guidance to teachers and students. The purpose of the current study was to explore the effectiveness of seven key resources: human assistance, the manual, the…

  20. Computational analysis of aircraft pressure relief doors

    NASA Astrophysics Data System (ADS)

    Schott, Tyler

    Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRDs to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. Further, the study provides aircraft manufacturers with valuable insight into the impact that operating conditions and geometric configurations have on PRD performance and how the information can be used to assist future research and development of PRD design.

  1. Surface electrical properties experiment, Part 3

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A complete unified discussion of the electromagnetic response of a plane stratified structure is reported. A detailed and comprehensive analysis of the theoretical parts of the electromagnetic problem is given. The numerical problem of computing values of the electromagnetic field strengths is discussed. It is shown that the analysis of the conductive media is not very far removed from the theoretical analysis, and the numerical difficulties are not as acute as for the low-loss problem. For Vol. 1, see N75-15570; for Vol. 2, see N75-15571.

  2. HiC-bench: comprehensive and reproducible Hi-C data analysis designed for parameter exploration and benchmarking.

    PubMed

    Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis

    2017-01-05

    Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and only offer few quality assessment options. Additionally, availability of a multitude of tools makes scientists wonder how these tools and associated parameters can be optimally used, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need assurance that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, scoring and annotation of specific interactions using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis. To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers, exploring different matrix correction methods, parameter settings and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
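
    The combinatorial parameter exploration described above can be pictured with a small sketch (hypothetical task names and settings, not HiC-bench's actual configuration format): every combination of tool and parameter value becomes one benchmark branch.

        from itertools import product

        # Assumed example choices; the real pipeline tasks and options differ.
        aligners = ["bowtie2", "bwa"]
        corrections = ["iterative", "naive"]
        resolutions = [40_000, 100_000]

        for aligner, correction, resolution in product(aligners, corrections, resolutions):
            branch = f"{aligner}.{correction}.{resolution // 1000}kb"
            print("would launch pipeline branch:", branch)   # 2 x 2 x 2 = 8 branches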

  3. Improvements to a method for the geometrically nonlinear analysis of compressively loaded stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Stoll, Frederick

    1993-01-01

    The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.

  4. Development and application of an analysis of axisymmetric body effects on helicopter rotor aerodynamics using modified slender body theory

    NASA Technical Reports Server (NTRS)

    Yamauchi, G.; Johnson, W.

    1984-01-01

    A computationally efficient body analysis designed to couple with a comprehensive helicopter analysis is developed in order to calculate the body-induced aerodynamic effects on rotor performance and loads. A modified slender body theory is used as the body model. With the objective of demonstrating the accuracy, efficiency, and application of the method, the analysis at this stage is restricted to axisymmetric bodies at zero angle of attack. By comparing with results from an exact analysis for simple body shapes, it is found that the modified slender body theory provides an accurate potential flow solution for moderately thick bodies, with only a 10%-20% increase in computational effort over that of an isolated rotor analysis. The computational ease of this method provides a means for routine assessment of body-induced effects on a rotor. Results are given for several configurations that typify those being used in the Ames 40- by 80-Foot Wind Tunnel and in the rotor-body aerodynamic interference tests being conducted at Ames. A rotor-hybrid airship configuration is also analyzed.

  5. The Impact of Item Dependency on the Efficiency of Testing and Reliability of Student Scores from a Computer Adaptive Assessment of Reading Comprehension

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Foorman, Barbara R.; Truckenmiller, Adrea J.

    2017-01-01

    The objective of the present study was to evaluate the extent to which students who took a computer adaptive test of reading comprehension accounting for testlet effects were administered fewer passages and had a more precise estimate of their reading comprehension ability compared to students in the control condition. A randomized controlled…

  6. System analysis in rotorcraft design: The past decade

    NASA Technical Reports Server (NTRS)

    Galloway, Thomas L.

    1988-01-01

    Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.

  7. PLANS; a finite element program for nonlinear analysis of structures. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Pifko, A.; Armen, H., Jr.; Levy, A.; Levine, H.

    1977-01-01

    The PLANS system, rather than being one comprehensive computer program, is a collection of finite element programs used for the nonlinear analysis of structures. This collection of programs evolved and is based on the organizational philosophy in which classes of analyses are treated individually based on the physical problem class to be analyzed. Each of the independent finite element computer programs of PLANS, with an associated element library, can be individually loaded and used to solve the problem class of interest. A number of programs have been developed for material nonlinear behavior alone and for combined geometric and material nonlinear behavior. The usage, capabilities, and element libraries of the current programs include: (1) plastic analysis of built-up structures where bending and membrane effects are significant, (2) three dimensional elastic-plastic analysis, (3) plastic analysis of bodies of revolution, and (4) material and geometric nonlinear analysis of built-up structures.

  8. An Assessment of the State-of-the-art in Multidisciplinary Aeromechanical Analyses

    NASA Technical Reports Server (NTRS)

    Datta, Anubhav; Johnson, Wayne

    2008-01-01

    This paper presents a survey of the current state-of-the-art in multidisciplinary aeromechanical analyses which integrate advanced Computational Structural Dynamics (CSD) and Computational Fluid Dynamics (CFD) methods. The application areas to be surveyed include fixed wing aircraft, turbomachinery, and rotary wing aircraft. The objective of the authors in the present paper, together with a companion paper on requirements, is to lay out a path for a High Performance Computing (HPC) based next generation comprehensive rotorcraft analysis. From this survey of the key technologies in other application areas it is possible to identify the critical technology gaps that stem from unique rotorcraft requirements.

  9. Similarity network fusion for aggregating data types on a genomic scale.

    PubMed

    Wang, Bo; Mezlini, Aziz M; Demir, Feyyaz; Fiume, Marc; Tu, Zhuowen; Brudno, Michael; Haibe-Kains, Benjamin; Goldenberg, Anna

    2014-03-01

    Recent technologies have made it cost-effective to collect diverse types of genome-wide data. Computational methods are needed to combine these data to create a comprehensive view of a given disease or a biological process. Similarity network fusion (SNF) solves this problem by constructing networks of samples (e.g., patients) for each available data type and then efficiently fusing these into one network that represents the full spectrum of underlying data. For example, to create a comprehensive view of a disease given a cohort of patients, SNF computes and fuses patient similarity networks obtained from each of their data types separately, taking advantage of the complementarity in the data. We used SNF to combine mRNA expression, DNA methylation and microRNA (miRNA) expression data for five cancer data sets. SNF substantially outperforms single data type analysis and established integrative approaches when identifying cancer subtypes and is effective for predicting survival.
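
    The fusion step at the heart of SNF can be sketched in a few lines of numpy. This is a simplified toy version of the published update (it iterates on full row-normalized matrices, whereas the method proper also uses kNN-sparsified kernels); each network is repeatedly replaced by its own diffusion of the average of the others:

        import numpy as np

        def normalize(w):
            """Row-normalize a similarity matrix."""
            return w / w.sum(axis=1, keepdims=True)

        def snf(matrices, iterations=20):
            """Toy similarity network fusion over m similarity matrices."""
            p = [normalize(w) for w in matrices]
            m = len(p)
            for _ in range(iterations):
                p = [normalize(p[v] @ (sum(p[k] for k in range(m) if k != v) / (m - 1)) @ p[v].T)
                     for v in range(m)]
            return sum(p) / m            # the fused network

        rng = np.random.default_rng(0)
        a, b = rng.random((5, 5)), rng.random((5, 5))
        print(snf([a + a.T, b + b.T]).round(2))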

  10. A comprehensive pathway map of epidermal growth factor receptor signaling

    PubMed Central

    Oda, Kanae; Matsuoka, Yukiko; Funahashi, Akira; Kitano, Hiroaki

    2005-01-01

    The epidermal growth factor receptor (EGFR) signaling pathway is one of the most important pathways that regulate growth, survival, proliferation, and differentiation in mammalian cells. Reflecting this importance, it is one of the best-investigated signaling systems, both experimentally and computationally, and several computational models have been developed for dynamic analysis. A map of molecular interactions of the EGFR signaling system is a valuable resource for research in this area. In this paper, we present a comprehensive pathway map of EGFR signaling and other related pathways. The map reveals that the overall architecture of the pathway is a bow-tie (or hourglass) structure with several feedback loops. The map is created using CellDesigner software that enables us to graphically represent interactions using a well-defined and consistent graphical notation, and to store it in Systems Biology Markup Language (SBML). PMID:16729045

  11. Driver comprehension of managed lane signing.

    DOT National Transportation Integrated Search

    2009-09-01

    A statewide survey of driver comprehension of managed lane signing is reported. Computer-based surveys were conducted using video clips of computer animations as well as still images of signs. The surveys were conducted in four Texas cities with a to...

  12. Comprehensive helicopter analysis: A state of the art review

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1978-01-01

    An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.

  13. Preliminary Design of ICI-based Multimedia for Reconceptualizing Electric Conceptions at Universitas Pendidikan Indonesia

    NASA Astrophysics Data System (ADS)

    Samsudin, A.; Suhandi, A.; Rusdiana, D.; Kaniawati, I.

    2016-08-01

    Interactive Conceptual Instruction (ICI) based multimedia has been developed to make electric concepts more concrete and the learning more meaningful. The initial design of the ICI-based multimedia is a computer application that allows users to explore all of the electric concepts in both conceptual and practical terms. Pre-service physics teachers should be provided with learning that optimizes the conceptions they hold by re-conceptualizing concepts in Basic Physics II, especially the concepts about electricity. To collect and analyze the data genuinely and comprehensively, the researchers utilized the ADDIE development method, which has comprehensive steps: analysis, design, development, implementation, and evaluation. The ADDIE steps were used to describe the work comprehensively, from the analysis phase through the evaluation phase. Based on data analysis, it can be concluded that ICI-based multimedia could effectively increase pre-service physics teachers' understanding of electric conceptions for re-conceptualizing electric conceptions at Universitas Pendidikan Indonesia.

  14. Vectorial Representations of Meaning for a Computational Model of Language Comprehension

    ERIC Educational Resources Information Center

    Wu, Stephen Tze-Inn

    2010-01-01

    This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…

  15. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  16. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and protocol validation for distributed computer and communication systems using a computer aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together. This is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  17. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distribution of scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output is predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while managing the uncertainties of scientific conclusions derived from such capabilities. This talk will provide an overview of JPL's efforts in developing a comprehensive architectural approach to data science.

  18. Feasibility Study for an Air Force Environmental Model and Data Exchange. Volume 4. Appendix G. Model Review and Index-Air Multimedia and Other Models, Plus Data Bases.

    DTIC Science & Technology

    1983-07-01

    [Only fragmentary search-snippet text survives for this record. The recoverable fragments mention an ORNL trace-contaminant analysis project applied to the movement of heavy metals through a forested watershed; computer cartography and site design aids; management information systems for facility planning, construction and operation; and an index of air models by category (comprehensive, spills/heavy gas, regional, reactive pollutants, special purpose, and rocket firing), plus data bases.]

  19. Los Alamos National Laboratory Economic Analysis Capability Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith; Pasqualini, Donatella

    Los Alamos National Laboratory has developed two types of models to compute the economic impact of infrastructure disruptions. FastEcon is a fast running model that estimates first-order economic impacts of large scale events such as hurricanes and floods and can be used to identify the amount of economic activity that occurs in a specific area. LANL's Computable General Equilibrium (CGE) model estimates more comprehensive static and dynamic economic impacts of a broader array of events and captures the interactions between sectors and industries when estimating economic impacts.

  20. Online Reading Practices of Students Who Are Deaf/Hard of Hearing

    ERIC Educational Resources Information Center

    Donne, Vicki; Rugg, Natalie

    2015-01-01

    This study sought to investigate reading perceptions, computer use perceptions, and online reading comprehension strategy use of 26 students who are deaf/hard of hearing in grades 4 through 8 attending public school districts in a tri-state area of the U.S. Students completed an online questionnaire and descriptive analysis indicated that students…

  1. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin. ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.

  2. Analyzing comprehensive QoS with security constraints for services composition applications in wireless sensor networks.

    PubMed

    Xiong, Naixue; Wu, Zhao; Huang, Yannong; Xu, Degang

    2014-12-01

    Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) is confronted with severe challenges due to the open, dynamic, and complex natures of WSNs. Most previous research separated various QoS indices into different fields and studied them individually due to the computational complexity. This approach ignores the mutual influence between these QoS indices, and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) shows the speediness and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique, the vector universal generating function (VUGF), which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subjected to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs.
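
    In sketch form (standard UGF notation, assumed rather than quoted from the paper), the classic UGF represents a discrete performance variable X that takes value g_k with probability p_k as

        u(z) = \sum_{k=1}^{K} p_k \, z^{g_k},

    and composition operators over such functions propagate the distribution through the composed service. The VUGF idea is to let the exponent be a vector of QoS indices (delay, reliability, security, and so on) so that several mutually dependent metrics are carried through the same composition simultaneously.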

  4. Comprehensive analysis of the optical Kerr coefficient of graphene

    DOE PAGES

    Soh, Daniel B. S.; Hamerly, Ryan; Mabuchi, Hideo

    2016-08-25

    We present a comprehensive analysis of the nonlinear optical Kerr effect in graphene. We directly solve the S-matrix element to calculate the absorption rate, utilizing the Volkov-Keldysh-type crystal wave functions. We then convert to the nonlinear refractive index coefficients through the Kramers-Kronig relation. In this formalism, the source of Kerr nonlinearity is the interplay of optical fields that cooperatively drive the transition from valence to conduction band. This formalism makes it possible to identify and compute the rates of distinct nonlinear processes that contribute to the Kerr nonlinear refractive index coefficient. The four identified mechanisms are two-photon absorption, Raman transition, self-coupling, and quadratic ac Stark effect. As a result, we present a comparison of our theory with recent experimental and theoretical results.
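
    The conversion step relies on a nonlinear Kramers-Kronig relation; one commonly used form (assumed notation, not transcribed from the paper) obtains the nonlinear refractive index n_2 from the nonlinear absorption spectrum alpha_2 by a principal-value integral:

        n_2(\omega) = \frac{c}{\pi} \, \mathrm{P.V.}\! \int_0^{\infty} \frac{\alpha_2(\omega')}{\omega'^{2} - \omega^{2}} \, d\omega'

    Computing the absorption rates first and dispersing them afterwards is what lets the four mechanisms be separated before they are combined into the total Kerr coefficient.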

  5. Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications

    NASA Technical Reports Server (NTRS)

    Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei

    1999-01-01

    A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low temperature fluid flows on the pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to a pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) an in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. A pure-conduction analysis gives insufficient information about the overall thermal distribution. Combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D modeling of the pump using multiple materials is needed. A comprehensive and accurate model can be developed to take into account the effect of multi-phase flow in the cooling flow loop, and the magnetic interactions.

  6. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
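
    The shape of the computation explains the burden: one sequential power flow per time step, with controller state carried forward. A skeleton of that loop is sketched below (all names are hypothetical placeholders; production studies drive a distribution solver such as OpenDSS instead):

        # Hypothetical QSTS skeleton: one power flow per 1-second time step.
        controller_state = {"tap_position": 0}

        def solve_power_flow(t, state):
            """Placeholder for an unbalanced distribution power flow solve."""
            return {"feeder_head_voltage": 1.0}            # assumed result

        def update_controllers(result, state):
            """Placeholder for time-dependent controls (regulators, PV inverters)."""
            return state

        for t in range(3600):                              # demo: one hour of seconds
            result = solve_power_flow(t, controller_state)
            controller_state = update_controllers(result, controller_state)
        # A full year at this resolution is ~31.5 million sequential solves;
        # the time dependence between steps is what resists parallelization.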

  7. Enhancements to the Network Repair Level Analysis (NRLA) Model Using Marginal Analysis Techniques and Centralized Intermediate Repair Facility (CIRF) Maintenance Concepts.

    DTIC Science & Technology

    1983-12-01

    ...while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5500 lines of computer code. It consists of a main...associated with C alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support...

  8. The Temporal Dimension of Linguistic Prediction

    ERIC Educational Resources Information Center

    Chow, Wing Yee

    2013-01-01

    This thesis explores how predictions about upcoming language inputs are computed during real-time language comprehension. Previous research has demonstrated humans' ability to use rich contextual information to compute linguistic prediction during real-time language comprehension, and it has been widely assumed that contextual information can…

  9. A randomized, controlled trial of interactive, multimedia software for patient colonoscopy education.

    PubMed

    Shaw, M J; Beebe, T J; Tomshine, P A; Adlis, S A; Cass, O W

    2001-02-01

    The purpose of our study was to assess the effectiveness of computer-assisted instruction (CAI) in patients having colonoscopies. We conducted a randomized, controlled trial in a large, multispecialty clinic. Eighty-six patients were referred for colonoscopies. The interventions were standard education versus standard education plus CAI, and the outcome measures were anxiety, comprehension, and satisfaction. Computer-assisted instruction had no effect on patients' anxiety. The group receiving CAI demonstrated better overall comprehension (p < 0.001). However, comprehension of certain aspects of serious complications and appropriate postsedation behavior was unaffected by educational method. Patients in the CAI group were more likely to indicate satisfaction with the amount of information provided when compared with the standard education counterparts (p = 0.001). Overall satisfaction was unaffected by educational method. Computer-assisted instruction for colonoscopy provided better comprehension and greater satisfaction with the adequacy of education than standard education. Computer-assisted instruction helps physicians meet their educational responsibilities with no decrement to the interpersonal aspects of the patient-physician relationship.

  10. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    NASA Astrophysics Data System (ADS)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
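
    The division of labor the article describes can be made concrete with a toy sketch: the quantum computer's job is to find the period r of a^x mod N; the rest is classical arithmetic. Below, the period is found by brute force (exactly the step Shor's algorithm accelerates), and the factors follow from two gcd computations when r is even and a^{r/2} is not congruent to -1 mod N:

        from math import gcd

        def period(a, n):
            """Classically find the order r of a modulo n (the quantum subroutine's job)."""
            x, r = a % n, 1
            while x != 1:
                x, r = (x * a) % n, r + 1
            return r

        def shor_classical(n, a):
            r = period(a, n)
            if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
                return None                  # unlucky choice of a; rerun with another
            return gcd(pow(a, r // 2) - 1, n), gcd(pow(a, r // 2) + 1, n)

        print(shor_classical(15, 7))         # -> (3, 5)

    The None branch is the repeated-runs caveat in the abstract: an unlucky base a forces another run of the algorithm.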

  11. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied from a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
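
    A minimal sketch of the kind of t-based interval such a method produces (invented numbers, not the paper's data): given n CFD results obtained by perturbing one input within its stated tolerance, bound the uncertainty of the mean heat transfer coefficient.

        import statistics
        from scipy.stats import t

        # Assumed heat transfer coefficients (W/m^2-K) from 5 perturbed CFD runs.
        h = [52.1, 49.8, 51.3, 50.6, 48.9]
        n = len(h)
        mean, s = statistics.mean(h), statistics.stdev(h)
        half_width = t.ppf(0.975, df=n - 1) * s / n ** 0.5   # 95% two-sided interval
        print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2-K")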

  12. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    PubMed

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis (MFA-ILE) in isotope labeling experiments. The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be speeded up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when it is compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
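
    In generic optimization notation (a sketch, not the authors' exact formulation), the speed-up works as follows: to get the gradient of J(\theta) = \tfrac{1}{2} \lVert y(x(\theta)) - y_{\mathrm{obs}} \rVert^2 subject to the balance equations f(x, \theta) = 0, solve a single adjoint system instead of one sensitivity system per flux parameter:

        \left( \frac{\partial f}{\partial x} \right)^{\top} \lambda
            = - \left( \frac{\partial y}{\partial x} \right)^{\top} \left( y(x) - y_{\mathrm{obs}} \right),
        \qquad
        \nabla_{\theta} J = \left( \frac{\partial f}{\partial \theta} \right)^{\top} \lambda .

    One adjoint solve replaces as many sensitivity solves as there are parameters, which is where the reported reduction in computation time comes from.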

  13. Enhanced computer vision with Microsoft Kinect sensor: a review.

    PubMed

    Han, Jungong; Shao, Ling; Xu, Dong; Shotton, Jamie

    2013-10-01

    With the invention of the low-cost Microsoft Kinect sensor, high-resolution depth and visual (RGB) sensing has become available for widespread use. The complementary nature of the depth and visual information provided by the Kinect sensor opens up new opportunities to solve fundamental problems in computer vision. This paper presents a comprehensive review of recent Kinect-based computer vision algorithms and applications. The reviewed approaches are classified according to the type of vision problems that can be addressed or enhanced by means of the Kinect sensor. The covered topics include preprocessing, object tracking and recognition, human activity analysis, hand gesture analysis, and indoor 3-D mapping. For each category of methods, we outline their main algorithmic contributions and summarize their advantages/differences compared to their RGB counterparts. Finally, we give an overview of the challenges in this field and future research trends. This paper is expected to serve as a tutorial and source of references for Kinect-based computer vision researchers.

  14. Faster computation of exact RNA shape probabilities.

    PubMed

    Janssen, Stefan; Giegerich, Robert

    2010-03-01

    Abstract shape analysis allows efficient computation of a representative sample of low-energy foldings of an RNA molecule. More comprehensive information is obtained by computing shape probabilities, accumulating the Boltzmann probabilities of all structures within each abstract shape. Such information is superior to free energies because it is independent of sequence length and base composition. However, up to this point, computation of shape probabilities evaluates all shapes simultaneously and comes with a computation cost which is exponential in the length of the sequence. We devise an approach called RapidShapes that computes the shapes above a specified probability threshold T by generating a list of promising shapes and constructing specialized folding programs for each shape to compute its share of Boltzmann probability. This aims at a heuristic improvement of runtime, while still computing exact probability values. Evaluating this approach and several substrategies, we find that only a small proportion of shapes have to be actually computed. For an RNA sequence of length 400, this leads, depending on the threshold, to a 10- to 138-fold speed-up compared with the previous complete method. Thus, probabilistic shape analysis has become feasible in medium-scale applications, such as the screening of RNA transcripts in a bacterial genome. RapidShapes is available via http://bibiserv.cebitec.uni-bielefeld.de/rnashapes
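
    The quantity at stake has a compact definition (standard notation, assumed): the probability of an abstract shape \pi is the Boltzmann mass of all structures s of the sequence that map to that shape,

        P(\pi) = \frac{1}{Z} \sum_{s \,:\, \mathrm{shape}(s) = \pi} e^{-E(s)/RT},
        \qquad
        Z = \sum_{s} e^{-E(s)/RT} .

    RapidShapes's gain comes from evaluating the numerator only for shapes that could plausibly exceed the threshold T, rather than for all shapes at once.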

  15. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
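
    A two-dimensional flavor of the simplest configuration described — a homogeneous Poisson point process for fracture centers, marked with random orientations and trace lengths — can be sketched as follows (the mark distributions are assumed for illustration, not taken from the software):

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_dfn(intensity, region=(100.0, 100.0)):
            """Homogeneous Poisson marked point process for 2-D fracture traces."""
            n = rng.poisson(intensity * region[0] * region[1])      # fracture count
            centers = rng.uniform((0.0, 0.0), region, size=(n, 2))  # locations
            strikes = rng.uniform(0.0, np.pi, size=n)               # assumed orientations
            lengths = rng.lognormal(mean=1.0, sigma=0.5, size=n)    # assumed trace lengths
            return centers, strikes, lengths

        centers, strikes, lengths = simulate_dfn(intensity=0.01)
        print(len(centers), "fractures simulated")

    The non-homogeneous, cluster and Cox cases replace the constant intensity with a spatially varying or random one; the virtual scanline and window sampling tools then operate on the simulated traces.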

  16. Navier-Stokes and Comprehensive Analysis Performance Predictions of the NREL Phase VI Experiment

    NASA Technical Reports Server (NTRS)

    Duque, Earl P. N.; Burklund, Michael D.; Johnson, Wayne

    2003-01-01

    A vortex lattice code, CAMRAD II, and a Reynolds-averaged Navier-Stokes code, OVERFLOW-D2, were used to predict the aerodynamic performance of a two-bladed horizontal axis wind turbine. All computations were compared with experimental data that was collected at the NASA Ames Research Center 80- by 120-Foot Wind Tunnel. Computations were performed for both axial as well as yawed operating conditions. Various stall delay models and dynamic stall models were used by the CAMRAD II code. Comparisons between the experimental data and computed aerodynamic loads show that the OVERFLOW-D2 code can accurately predict the power and spanwise loading of a wind turbine rotor.

  17. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    PubMed

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological properties and social properties, and competition is the basic form of language evolution. The rise and decline of one language is a result of competition between languages. Moreover, this rise and decline directly influences the diversity of human culture. Mathematics and computer modeling for language competition have been a popular topic in the fields of linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in the research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Next, most language competition models are based on the assumption that one language in the model is stronger than the other. These studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of two languages. A third issue with current studies is that many studies have an evolution result where the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made of the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
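
    The Lotka-Volterra starting point the authors build on is easy to state and simulate (the coefficients below are illustrative, not the paper's): speaker populations x and y grow logistically and suppress each other through competition terms, and competition coefficients below 1 permit the coexistence outcome the paper is after.

        from scipy.integrate import solve_ivp

        r1, r2 = 0.05, 0.05        # growth rates of the two speaker populations
        k1, k2 = 1.0, 1.0          # carrying capacities
        a12, a21 = 0.6, 0.6        # competition coefficients (< 1 allows coexistence)

        def competition(t, u):
            x, y = u
            return [r1 * x * (1 - (x + a12 * y) / k1),
                    r2 * y * (1 - (y + a21 * x) / k2)]

        sol = solve_ivp(competition, (0.0, 2000.0), [0.2, 0.3])
        print(sol.y[:, -1])        # tends to the stable coexistence equilibrium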

  18. A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.

    PubMed

    Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram

    2017-11-01

    Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MSn proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analyses are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MSn data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectra for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures, and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan fragmentation is necessary for high quality MSn spectrum annotation in CID and HCD fragmentation modes. Additionally, they confirm the suitability of GlycoPAT to analyze shotgun glycoproteomics data.
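
    The decoy construction in the FDR scheme — scramble the peptide, perturb the glycan mass — is simple enough to sketch (a toy version; the perturbation range and all names are assumed, and this is not GlycoPAT's code):

        import random

        def make_decoy(peptide, glycan_mass, max_shift=30.0, seed=None):
            """Toy decoy glycopeptide: shuffle residues and shift the glycan mass."""
            rng = random.Random(seed)
            residues = list(peptide)
            rng.shuffle(residues)
            return "".join(residues), glycan_mass + rng.uniform(-max_shift, max_shift)

        decoy_seq, decoy_mass = make_decoy("NVSEK", 1216.42, seed=1)
        print(decoy_seq, round(decoy_mass, 2))

    Scoring decoys alongside targets yields the score distribution under the null hypothesis, from which the FDR at any score threshold follows.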

  19. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in the future.

  20. Multiscale Multifunctional Progressive Fracture of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Minnetyan, L.

    2012-01-01

    A new approach is described for evaluating fracture in composite structures. This approach is independent of classical fracture mechanics parameters such as fracture toughness. It relies on computational simulation and is programmed in a stand-alone integrated computer code. It is multiscale and multifunctional because it includes composite mechanics for the composite behavior and finite element analysis for predicting the structural response. It contains seven modules: layered composite mechanics (micro, macro, laminate), finite element, updating scheme, local fracture, global fracture, stress-based failure modes, and fracture progression. The computer code is called CODSTRAN (COmposite Durability STRuctural ANalysis). It is used in the present paper to evaluate the global fracture of four composite shell problems and one composite built-up structure. Results for the composite shells show that global fracture is enhanced when internal pressure is combined with shear loads.

  1. Fundamentals of digital filtering with applications in geophysical prospecting for oil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesko, A.

    This book is a comprehensive work bringing together the important mathematical foundations and computing techniques for numerical filtering methods. The first two parts of the book introduce the techniques, fundamental theory and applications, while the third part treats specific applications in geophysical prospecting. Discussion is limited to linear filters, but takes in related fields such as correlational and spectral analysis.
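
    As a flavor of the material, a linear (FIR) filter is just the convolution of the input trace with the filter's impulse response. The snippet below is my own minimal Python illustration of that idea, not an example from the book; the synthetic signal and tap count are arbitrary.

      import numpy as np

      # Synthetic noisy trace standing in for a digitized seismic record.
      t = np.linspace(0.0, 1.0, 500)
      trace = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)

      # 11-tap moving-average FIR filter: the output is the convolution of
      # the input with the impulse response, the essence of linear filtering.
      taps = np.ones(11) / 11
      smoothed = np.convolve(trace, taps, mode="same")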

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul

    An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In this study, we report a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate, under a joint framework, the existing solutions for HPC and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  3. The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension

    ERIC Educational Resources Information Center

    Ertem, Ihsan Seyit

    2010-01-01

    This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned to one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…

  4. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model using an enhanced modern structured analysis (EMSA) methodology. Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  5. Remote sensing image ship target detection method based on visual attention model

    NASA Astrophysics Data System (ADS)

    Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong

    2017-11-01

    Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so such methods have high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. On this basis, a method for ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, thereby improving the detection efficiency of ship targets in remote sensing images.
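
    The center-surround mechanism behind bottom-up saliency models of this kind can be sketched compactly. This is a generic illustration of the principle, not the authors' model; the two Gaussian scales and the threshold are assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def saliency_map(image):
          """Crude center-surround saliency: the difference between a fine-scale
          and a coarse-scale Gaussian blur highlights small, bright targets
          (e.g., ships) against a smooth sea background."""
          center = gaussian_filter(image, sigma=2)      # fine scale
          surround = gaussian_filter(image, sigma=16)   # coarse scale
          sal = np.abs(center - surround)
          return sal / (sal.max() + 1e-9)               # normalize to [0, 1]

      # Detection then inspects only pixels whose saliency exceeds a threshold,
      # instead of sliding a window over the entire image.
      img = np.zeros((256, 256)); img[100:104, 120:130] = 1.0   # toy "ship"
      candidates = np.argwhere(saliency_map(img) > 0.5)
      print(len(candidates))   # number of candidate pixels to examine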

  6. Vibration analysis of rotor blades with pendulum absorbers

    NASA Technical Reports Server (NTRS)

    Murthy, V. R.; Hammond, C. E.

    1979-01-01

    A comprehensive vibration analysis of rotor blades with spherical pendulum absorbers is presented. Linearized equations of motion for small oscillations about the steady-state deflection of a spherical pendulum on elastic rotor blades undergoing coupled flapwise bending, chordwise bending, and torsional vibrations are obtained. A transmission matrix formulation is given to determine the natural vibrational characteristics of rotor blades with spherical or simple flapping pendulum absorbers. The natural frequencies and mode shapes of a hingeless rotor blade with a spherical pendulum are computed.

  7. Drug target inference through pathway analysis of genomics data

    PubMed Central

    Ma, Haisu; Zhao, Hongyu

    2013-01-01

    Statistical modeling coupled with bioinformatics is commonly used for drug discovery. Although there exist many approaches for single-target-based drug design and target inference, recent years have seen a paradigm shift to system-level pharmacological research. Pathway analysis of genomics data represents one promising direction for computational inference of drug targets. This article aims at providing a comprehensive review of the evolving issues in this field, covering methodological developments, their pros and cons, as well as future research directions. PMID:23369829

  8. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

    A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.

  9. Comprehensive Materials and Morphologies Study of Ion Traps (COMMIT) for Scalable Quantum Computation

    DTIC Science & Technology

    2012-04-21

    Trapped ion systems are extremely promising for large-scale quantum computation, but face a vexing problem with motional quantum ... The typical shortest wavelengths needed for ion traps range from 194 nm for Hg+ to 493 nm for Ba+, corresponding to photon energies of 6.4-2.5 eV ... the photoelectric effect.

  10. [Mobile phone-computer wireless interactive graphics transmission technology and its medical application].

    PubMed

    Huang, Shuo; Liu, Jing

    2010-05-01

    Application of clinical digital medical imaging has raised many tough issues to tackle, such as data storage, management, and information sharing. Here we investigated a mobile-phone-based medical image management system capable of personal medical imaging information storage, management, and comprehensive health information analysis. The technologies related to the management system, spanning wireless transmission technology, the technical capabilities of the phone in mobile health care, and the management of a mobile medical database, were discussed. Taking medical infrared image transmission between phone and computer as an example, the working principle of the present system was demonstrated.

  11. Comprehensive analysis of a Radiology Operations Management computer system.

    PubMed

    Arenson, R L; London, J W

    1979-11-01

    The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.

  12. Computer assisted analysis of research-based teaching method in English newspaper reading teaching

    NASA Astrophysics Data System (ADS)

    Jie, Zheng

    2017-06-01

    In recent years, the teaching of English newspaper reading has been developing rapidly, yet the teaching effect of existing courses is not ideal. This paper applies the research-based teaching model to English newspaper reading instruction, investigates the current situation in higher vocational colleges, and analyzes the problems. It designs a teaching model for English newspaper reading and carries out a computer-assisted empirical study. The results show that this teaching mode can stimulate learners' interest and comprehensively improve their newspaper reading ability.

  13. RFA Guardian: Comprehensive Simulation of Radiofrequency Ablation Treatment of Liver Tumors.

    PubMed

    Voglreiter, Philip; Mariappan, Panchatcharam; Pollari, Mika; Flanagan, Ronan; Blanco Sequeiros, Roberto; Portugaller, Rupert Horst; Fütterer, Jurgen; Schmalstieg, Dieter; Kolesnik, Marina; Moche, Michael

    2018-01-15

    The RFA Guardian is a comprehensive application for high-performance patient-specific simulation of radiofrequency ablation of liver tumors. We address a wide range of usage scenarios. These include pre-interventional planning, sampling of the parameter space for uncertainty estimation, treatment evaluation and, in the worst case, failure analysis. The RFA Guardian is the first of its kind that exhibits sufficient performance for simulating treatment outcomes during the intervention. We achieve this by combining a large number of high-performance image processing, biomechanical simulation and visualization techniques into a generalized technical workflow. Further, we wrap the feature set into a single, integrated application, which exploits all available resources of standard consumer hardware, including massively parallel computing on graphics processing units. This allows us to predict or reproduce treatment outcomes on a single personal computer with high computational performance and high accuracy. The resulting low demand for infrastructure enables easy and cost-efficient integration into the clinical routine. We present a number of evaluation cases from the clinical practice where users performed the whole technical workflow from patient-specific modeling to final validation and highlight the opportunities arising from our fast, accurate prediction techniques.

  14. Reading Research in 1984: Comprehension, Computers, Communication. Fifth Yearbook of the American Reading Forum.

    ERIC Educational Resources Information Center

    McNinch, George H., Ed.; And Others

    Conference presentations of research on reading comprehension, reading instruction, computer applications in reading instruction, and reading theory are compiled in this yearbook. Titles and authors of some of the articles are as follows: "A Rationale for Teaching Children with Limited English Proficiency" (M. Zintz); "Preliminary Development of a…

  15. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    ERIC Educational Resources Information Center

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  16. Comparison of Computed and Measured Vortex Evolution for a UH-60A Rotor in Forward Flight

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim Uddin; Yamauchi, Gloria K.; Kao, David L.

    2013-01-01

    A Computational Fluid Dynamics (CFD) simulation using the Navier-Stokes equations was performed to determine the evolutionary and dynamical characteristics of the vortex flowfield for a highly flexible aeroelastic UH-60A rotor in forward flight. The experimental wake data were acquired using Particle Image Velocimetry (PIV) during a test of the full-scale UH-60A rotor in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The PIV measurements were made in a stationary cross-flow plane at 90 deg rotor azimuth. The CFD simulation was performed using the OVERFLOW CFD solver loosely coupled with the rotorcraft comprehensive code CAMRAD II. Characteristics of vortices captured in the PIV plane from different blades are compared with CFD calculations. The blade airloads were calculated using two different turbulence models. A limited spatial, temporal, and CFD/comprehensive-code coupling sensitivity analysis was performed in order to verify the unsteady helicopter simulations with a moving rotor grid system.

  17. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  18. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

    Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25, yielding an incremental cost-utility ratio of -$12,716 and implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only meant that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
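
    The headline ratio follows directly from the two baseline estimates; a one-line check using the paper's reported numbers:

      delta_cost = -3179.0   # CT angiography saves $3,179 versus Doppler only
      delta_qaly = 0.25      # and gains 0.25 quality-adjusted life-years
      icur = delta_cost / delta_qaly
      print(icur)            # -12716.0: negative cost per QALY gained, so CTA dominates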

  19. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    PubMed

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  20. The NJOY Nuclear Data Processing System, Version 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.
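
    The central operation in producing multigroup constants from pointwise data is flux-weighted group averaging: sigma_g equals the integral of sigma(E)*phi(E) dE over the group divided by the integral of phi(E) dE. The sketch below is a toy numpy illustration of that standard formula, not NJOY code; the cross section, weighting spectrum, and group edges are invented.

      import numpy as np

      def trapz(y, x):  # explicit trapezoidal rule, for NumPy-version independence
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      def collapse(energy, sigma, flux, edges):
          """Flux-weighted multigroup averaging over each group interval."""
          out = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              m = (energy >= lo) & (energy <= hi)
              out.append(trapz(sigma[m] * flux[m], energy[m]) /
                         trapz(flux[m], energy[m]))
          return np.array(out)

      E = np.logspace(-5, 7, 20000)        # eV; toy pointwise energy grid
      sigma = 1.0 + 50.0 / np.sqrt(E)      # toy 1/v-like cross section (barns)
      phi = 1.0 / E                        # toy slowing-down weighting spectrum
      print(collapse(E, sigma, phi, np.array([1e-5, 1.0, 1e3, 1e7])))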

  1. Kids & Media @ the New Millennium: A Kaiser Family Foundation Report. A Comprehensive National Analysis of Children's Media Use. Executive Summary.

    ERIC Educational Resources Information Center

    Roberts, Donald F.

    A study examined media use patterns among a large, nationally representative sample of children ages 2-18 and explored how children choose and interact with the whole array of media available to them, including television, movies, computers, music, video games, radio, magazines, books, and newspapers. The goal was to provide a solid base…

  2. "Listen and Understand What I Am Saying": Church-Listening as a Challenge for Non-Native Listeners of English in the United Kingdom

    ERIC Educational Resources Information Center

    Malmström, Hans

    2015-01-01

    This article uses computer-assisted analysis to study the listening environment provided by Bible readings and preaching during church services. It focuses on the vocabulary size needed to comprehend 95% and 98% of the running words of the input (lexical coverage levels indicating comprehension in connection with listening) and on the place of…

  3. TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.

    PubMed

    Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo

    2018-06-15

    We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.

  4. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    PubMed

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
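
    The z-score comparison at the heart of this approach is standard Fisher machinery. Since DGCA itself is an R package, the following Python snippet is only a schematic of the underlying statistic (DGCA additionally calibrates significance by permutation rather than the normal approximation used here):

      import numpy as np
      from scipy import stats

      def diff_corr_z(x1, y1, x2, y2):
          """Differential correlation via Fisher's z: z_i = atanh(r_i) with
          variance 1/(n_i - 3); the statistic is the standardized difference
          (z1 - z2) / sqrt(1/(n1-3) + 1/(n2-3))."""
          r1 = np.corrcoef(x1, y1)[0, 1]
          r2 = np.corrcoef(x2, y2)[0, 1]
          se = np.sqrt(1.0 / (len(x1) - 3) + 1.0 / (len(x2) - 3))
          z = (np.arctanh(r1) - np.arctanh(r2)) / se
          return z, 2 * stats.norm.sf(abs(z))   # two-sided p-value

      rng = np.random.default_rng(0)
      x = rng.normal(size=60)
      z, p = diff_corr_z(x, x + rng.normal(size=60),                # correlated pair
                         rng.normal(size=40), rng.normal(size=40))  # uncorrelated pair
      print(z, p)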

  5. Intelligent redundant actuation system requirements and preliminary system design

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Geiger, L. J.; Harris, J.

    1985-01-01

    Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors which are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.

  6. Operating manual for coaxial injection combustion model. [for the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.

    1974-01-01

    An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.

  7. Solving subsurface structural problems using a computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.M.

    1987-02-01

    Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three-point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data, can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden of selecting appropriate techniques, computation methods, and interpretations still lies with the explorationist user.

  8. Design optimization of axial flow hydraulic turbine runner: Part II - multi-objective constrained optimization method

    NASA Astrophysics Data System (ADS)

    Peng, Guoyi; Cao, Shuliang; Ishizuka, Masaru; Hayama, Shinji

    2002-06-01

    This paper is concerned with the design optimization of axial flow hydraulic turbine runner blade geometry. In order to obtain a better design with good performance, a new comprehensive performance optimization procedure is presented, combining a multi-variable, multi-objective constrained optimization model with a Q3D inverse computation and a performance prediction procedure. Based on careful analysis of the inverse design of the axial hydraulic turbine runner, the total hydraulic loss and the cavitation coefficient are taken as the optimization objectives, and a comprehensive objective function is defined using weight factors. Parameters of a newly proposed blade bound circulation distribution function and parameters describing the positions of the blade leading and trailing edges in the meridional flow passage are taken as optimization variables. The optimization procedure has been applied to the design optimization of a Kaplan runner with a specific speed of 440 kW. Numerical results show that the performance of the designed runner is successfully improved through the optimization computation. The optimization model is validated and exhibits good convergence. With the multi-objective optimization model, it is possible to control the performance of the designed runner by adjusting the values of the weight factors defining the comprehensive objective function.
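
    The weighted-sum scalarization described above reduces the two criteria to a single objective that a standard optimizer can minimize. The sketch below is schematic: the two quadratic "evaluators" merely stand in for the Q3D inverse computation and performance prediction, and the weight values are arbitrary assumptions.

      from scipy.optimize import minimize

      def hydraulic_loss(x):        # toy stand-in for the loss prediction
          return (x[0] - 0.3) ** 2 + 0.10

      def cavitation_coeff(x):      # toy stand-in for the cavitation prediction
          return (x[1] - 0.7) ** 2 + 0.05

      def comprehensive_objective(x, w_loss=0.6, w_cav=0.4):
          # weighted sum of the two minimized criteria; adjusting the weight
          # factors steers the trade-off in the designed runner's performance
          return w_loss * hydraulic_loss(x) + w_cav * cavitation_coeff(x)

      best = minimize(comprehensive_objective, x0=[0.5, 0.5],
                      bounds=[(0.0, 1.0), (0.0, 1.0)])
      print(best.x)                 # optimum of the toy problem, near (0.3, 0.7)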

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  10. Developments in REDES: The rocket engine design expert system

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) is being developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  11. Developments in REDES: The Rocket Engine Design Expert System

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) was developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP; a nozzle design program named RAO; a regenerative cooling channel performance evaluation code named RTE; and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES was built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  12. Using Primary Language Support via Computer to Improve Reading Comprehension Skills of First-Grade English Language Learners

    ERIC Educational Resources Information Center

    Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle

    2012-01-01

    Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 first-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…

  13. The Effect of Gloss Type and Mode on Iranian EFL Learners' Reading Comprehension

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Ahmadi, Negar

    2012-01-01

    This study investigated the effects of three gloss conditions, namely traditional non-CALL marginal gloss, computer-based audio gloss, and computer-based extended audio gloss, on the reading comprehension of Iranian EFL learners. To this end, three experimental groups and one control group, each comprising 15 participants, took part in this study.…

  14. Instructional Effectiveness of a Computer-Supported Program for Teaching Reading Comprehension Strategies

    ERIC Educational Resources Information Center

    Ponce, Hector R.; Lopez, Mario J.; Mayer, Richard E.

    2012-01-01

    This article examines the effectiveness of a computer-based instructional program (e-PELS) aimed at direct instruction in a collection of reading comprehension strategies. In e-PELS, students learn to highlight and outline expository passages based on various types of text structures (such as comparison or cause-and-effect) as well as to…

  15. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  16. Comprehensive Experimental and Computational Analysis of Binding Energy Hot Spots at the NF-κB Essential Modulator (NEMO)/IKKβ Protein-Protein Interface

    PubMed Central

    Golden, Mary S.; Cote, Shaun M.; Sayeg, Marianna; Zerbe, Brandon S.; Villar, Elizabeth A.; Beglov, Dmitri; Sazinsky, Stephen L.; Georgiadis, Rosina M.; Vajda, Sandor; Kozakov, Dima; Whitty, Adrian

    2013-01-01

    We report a comprehensive analysis of binding energy hot spots at the protein-protein interaction (PPI) interface between NF-κB Essential Modulator (NEMO) and IκB kinase subunit β (IKKβ), an interaction that is critical for NF-κB pathway signaling, using experimental alanine scanning mutagenesis and also the FTMap method for computational fragment screening. The experimental results confirm that the previously identified NBD region of IKKβ contains the highest concentration of hot spot residues, the strongest of which are W739, W741 and L742 (ΔΔG = 4.3, 3.5 and 3.2 kcal/mol, respectively). The region occupied by these residues defines a potentially druggable binding site on NEMO that extends for ~16 Å to additionally include the regions that bind IKKβ L737 and F734. NBD residues D738 and S740 are also important for binding but do not make direct contact with NEMO, instead likely acting to stabilize the active conformation of surrounding residues. We additionally found two previously unknown hot spot regions centered on IKKβ residues L708/V709 and L719/I723. The computational approach successfully identified all three hot spot regions on IKKβ. Moreover, the method was able to accurately quantify the energetic importance of all hot spot residues involving direct contact with NEMO. Our results provide new information to guide the discovery of small molecule inhibitors that target the NEMO/IKKβ interaction. They additionally clarify the structural and energetic complementarity between “pocket-forming” and “pocket occupying” hot spot residues, and further validate computational fragment mapping as a method for identifying hot spots at PPI interfaces. PMID:23506214

  17. Knowledge-Based Environmental Context Modeling

    NASA Astrophysics Data System (ADS)

    Pukite, P. R.; Challou, D. J.

    2017-12-01

    As we move from the oil-age to an energy infrastructure based on renewables, the need arises for new educational tools to support the analysis of geophysical phenomena and their behavior and properties. Our objective is to present models of these phenomena to make them amenable for incorporation into more comprehensive analysis contexts. Starting at the level of a college-level computer science course, the intent is to keep the models tractable and therefore practical for student use. Based on research performed via an open-source investigation managed by DARPA and funded by the Department of Interior [1], we have adapted a variety of physics-based environmental models for a computer-science curriculum. The original research described a semantic web architecture based on patterns and logical archetypal building-blocks (see figure) well suited for a comprehensive environmental modeling framework. The patterns span a range of features that cover specific land, atmospheric and aquatic domains intended for engineering modeling within a virtual environment. The modeling engine contained within the server relied on knowledge-based inferencing capable of supporting formal terminology (through NASA JPL's Semantic Web for Earth and Environmental Technology (SWEET) ontology and a domain-specific language) and levels of abstraction via integrated reasoning modules. One of the key goals of the research was to simplify models that were ordinarily computationally intensive to keep them lightweight enough for interactive or virtual environment contexts. The breadth of the elements incorporated is well-suited for learning as the trend toward ontologies and applying semantic information is vital for advancing an open knowledge infrastructure. As examples of modeling, we have covered such geophysics topics as fossil-fuel depletion, wind statistics, tidal analysis, and terrain modeling, among others. Techniques from the world of computer science will be necessary to promote efficient use of our renewable natural resources. [1] C2M2L (Component, Context, and Manufacturing Model Library) Final Report, https://doi.org/10.13140/RG.2.1.4956.3604

  18. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    PubMed

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
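
    The SMAA computation itself amounts to Monte Carlo integration over the uncertain criterion measurements and the feasible weight space. The sketch below illustrates the mechanics with invented numbers and a plain additive value model; it is not the authors' model or data.

      import numpy as np

      rng = np.random.default_rng(0)
      alts = ["placebo", "fluoxetine", "venlafaxine"]

      # Toy criterion means: treatment response (benefit) and three adverse
      # drug reaction incidences (risks); all values are invented.
      mean = np.array([[0.30, 0.05, 0.04, 0.03],
                       [0.55, 0.20, 0.15, 0.10],
                       [0.60, 0.25, 0.22, 0.12]])
      sign = np.array([+1, -1, -1, -1])      # maximize benefit, minimize risks

      hits = np.zeros((3, 3))
      for _ in range(10000):
          meas = rng.normal(mean, 0.03)      # sample the clinical evidence
          w = rng.dirichlet(np.ones(4))      # uniform weights (no preferences)
          value = (sign * w * meas).sum(axis=1)
          for rank, a in enumerate(np.argsort(-value)):
              hits[a, rank] += 1

      first_rank = hits[:, 0] / 10000        # rank-1 acceptability indices
      print(dict(zip(alts, first_rank)))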

  19. AgRISTARS. Supporting research: Algorithms for scene modelling

    NASA Technical Reports Server (NTRS)

    Rassbach, M. E. (Principal Investigator)

    1982-01-01

    The requirements for a comprehensive analysis of LANDSAT or other visual data scenes are defined. The development of a general model of a scene and a computer algorithm for finding the particular model for a given scene is discussed. The modelling system includes a boundary analysis subsystem, which detects all the boundaries and lines in the image and builds a boundary graph; a continuous variation analysis subsystem, which finds gradual variations not well approximated by a boundary structure; and a miscellaneous features analysis, which includes texture, line parallelism, etc. The noise reduction capabilities of this method and its use in image rectification and registration are discussed.

  20. Identification and Analysis of Critical Gaps in Nuclear Fuel Cycle Codes Required by the SINEMA Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adrian Miron; Joshua Valentine; John Christenson

    2009-10-01

    The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of the advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), Unviery of Cincinnati in collaboration with Idaho State University carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFCmore » codes was compiled into a relational database that allows easy access to various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.« less

  1. Can the computer replace the adult for storybook reading? A meta-analysis on the effects of multimedia stories as compared to sharing print stories with an adult.

    PubMed

    Takacs, Zsofia K; Swart, Elise K; Bus, Adriana G

    2014-01-01

    The present meta-analysis challenges the notion that young children necessarily need adult scaffolding in order to understand a narrative story and learn words as long as they encounter optimally designed multimedia stories. Including 29 studies and 1272 children, multimedia stories were found more beneficial than encounters with traditional story materials that did not include the help of an adult for story comprehension (g+ = 0.40, k = 18) as well as vocabulary (g+ = 0.30, k = 11). However, no significant differences were found between the learning outcomes of multimedia stories and sharing traditional print-like stories with an adult. It is concluded that multimedia features like animated illustrations, background music and sound effects provide similar scaffolding of story comprehension and word learning as an adult.
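
    For reference, the g+ values reported here are pooled standardized mean differences of the Hedges' g type, i.e. Cohen's d with a small-sample correction. A minimal computation (generic formulas, not the study's meta-analytic code; the numbers below are invented):

      import math

      def hedges_g(m1, sd1, n1, m2, sd2, n2):
          """Hedges' g: Cohen's d times the correction J = 1 - 3/(4*df - 1)."""
          df = n1 + n2 - 2
          sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
          d = (m1 - m2) / sp
          return (1 - 3 / (4 * df - 1)) * d

      # e.g., a multimedia-story group versus a print-only control group:
      print(hedges_g(10.2, 3.0, 40, 9.0, 3.1, 40))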

  2. Continuous Trailing-Edge Flaps for Primary Flight Control of a Helicopter Main Rotor

    NASA Technical Reports Server (NTRS)

    Thornburgh, Robert P.; Kreshock, Andrew R.; Wilbur, Matthew L.; Sekula, Martin K.; Shen, Jinwei

    2014-01-01

    The use of continuous trailing-edge flaps (CTEFs) for primary flight control of a helicopter main rotor is studied. A practical, optimized bimorph design with Macro-Fiber Composite actuators is developed for CTEF control, and a coupled structures and computational fluid dynamics methodology is used to study the fundamental behavior of an airfoil with CTEFs. These results are used within a comprehensive rotorcraft analysis model to study the control authority requirements of the CTEFs when utilized for primary flight control of a utility class helicopter. A study of the effect of blade root pitch index (RPI) on CTEF control authority is conducted, and the impact of structural and aerodynamic model complexity on the comprehensive analysis results is presented. The results show that primary flight control using CTEFs is promising; however, a more viable option may include the control of blade RPI, as well.

  3. Can the computer replace the adult for storybook reading? A meta-analysis on the effects of multimedia stories as compared to sharing print stories with an adult

    PubMed Central

    Takacs, Zsofia K.; Swart, Elise K.; Bus, Adriana G.

    2014-01-01

    The present meta-analysis challenges the notion that young children necessarily need adult scaffolding in order to understand a narrative story and learn words as long as they encounter optimally designed multimedia stories. Including 29 studies and 1272 children, multimedia stories were found more beneficial than encounters with traditional story materials that did not include the help of an adult for story comprehension (g+ = 0.40, k = 18) as well as vocabulary (g+ = 0.30, k = 11). However, no significant differences were found between the learning outcomes of multimedia stories and sharing traditional print-like stories with an adult. It is concluded that multimedia features like animated illustrations, background music and sound effects provide similar scaffolding of story comprehension and word learning as an adult. PMID:25520684

  4. Cost effectiveness of the addition of a comprehensive CT scan to the abdomen and pelvis for the detection of cancer after unprovoked venous thromboembolism.

    PubMed

    Coyle, Kathryn; Carrier, Marc; Lazo-Langner, Alejandro; Shivakumar, Sudeep; Zarychanski, Ryan; Tagalakis, Vicky; Solymoss, Susan; Routhier, Nathalie; Douketis, James; Coyle, Douglas

    2017-03-01

    Unprovoked venous thromboembolism (VTE) can be the first manifestation of cancer. It is unclear if extensive screening for occult cancer including a comprehensive computed tomography (CT) scan of the abdomen/pelvis is cost-effective in this patient population. The objectives were to assess the health-care-related costs, number of missed cancer cases, and health-related utility values of a limited screening strategy with and without the addition of a comprehensive CT scan of the abdomen/pelvis, and to identify to what extent testing should be done in these circumstances to allow early detection of occult cancers. A cost-effectiveness analysis was conducted using data collected alongside the SOME randomized controlled trial, which compared extensive occult cancer screening including a CT of the abdomen/pelvis with a more limited screening strategy in patients with a first unprovoked VTE. Analyses were conducted with a one-year time horizon from a Canadian health care perspective. The primary analysis was based on complete cases, with sensitivity analysis using appropriate multiple imputation methods to account for missing data. Data from a total of 854 patients with a first unprovoked VTE were included in these analyses. The addition of a comprehensive CT scan was associated with higher costs ($551 CDN) with no improvement in utility values or in the number of missed cancers. Results were consistent when adopting multiple imputation methods. The addition of a comprehensive CT scan of the abdomen/pelvis for the screening of occult cancer in patients with unprovoked VTE is not cost effective, as it is both more costly and no more effective in detecting occult cancer. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. A Review of Methods for Analysis of the Expected Value of Information.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
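
    The regression-based estimator the authors favor is easy to sketch: regress each strategy's simulated net benefit on the parameter(s) of interest, then compare the mean of the per-draw maximum of the fitted values with the maximum of their means. The toy example below uses a cubic polynomial as the smoother (published implementations typically use GAM or Gaussian-process regression) and an invented two-strategy decision model.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 10000
      theta = rng.normal(size=n)     # parameter of (partial) interest
      psi = rng.normal(size=n)       # remaining uncertainty

      # Toy net benefit for two strategies across the probabilistic draws.
      nb = np.column_stack([np.full(n, 1000.0),
                            900.0 + 150.0 * theta + 80.0 * psi])

      # Regression step: fitted values approximate E[NB_d | theta].
      fitted = np.column_stack([np.polyval(np.polyfit(theta, nb[:, d], 3), theta)
                                for d in range(nb.shape[1])])

      evppi = fitted.max(axis=1).mean() - fitted.mean(axis=0).max()
      print(round(evppi, 1))         # per-decision value of learning theta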

  6. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data

    PubMed Central

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie

    2018-01-01

    Translation is a key regulatory step linking the transcriptome and proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to dataset collection. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. Three translation indices (the translation ratio, the elongation velocity index, and translational efficiency) can be calculated to quantitatively evaluate translation initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from the complex searching, analysis and comparison of huge sequencing datasets without needing local computational power. PMID:29106630
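
    The translation indices reduce to ratios of normalized read densities; for instance, translational efficiency is commonly computed as the ribosome-protected (or translating-mRNA) read density over the mRNA-seq read density for the same gene. A hedged sketch of that calculation (TranslatomeDB's actual pipeline is built on FANSe3 and edgeR; the counts below are invented):

      def rpkm(reads, gene_len_bp, total_mapped_reads):
          """Reads per kilobase of transcript per million mapped reads."""
          return reads / (gene_len_bp / 1e3) / (total_mapped_reads / 1e6)

      # Toy gene: Ribo-seq/RNC-seq density divided by mRNA-seq density.
      te = rpkm(850, 1500, 2.0e7) / rpkm(400, 1500, 2.5e7)
      print(te)   # TE > 1 suggests relatively efficient translation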

  7. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  8. Corporate funding and ideological polarization about climate change

    PubMed Central

    Farrell, Justin

    2016-01-01

    Drawing on large-scale computational data and methods, this research demonstrates how polarization efforts are influenced by a patterned network of political and financial actors. These dynamics, which have been notoriously difficult to quantify, are illustrated here with a computational analysis of climate change politics in the United States. The comprehensive data include all individual and organizational actors in the climate change countermovement (164 organizations), as well as all written and verbal texts produced by this network between 1993–2013 (40,785 texts, more than 39 million words). Two main findings emerge. First, that organizations with corporate funding were more likely to have written and disseminated texts meant to polarize the climate change issue. Second, and more importantly, that corporate funding influences the actual thematic content of these polarization efforts, and the discursive prevalence of that thematic content over time. These findings provide new, and comprehensive, confirmation of dynamics long thought to be at the root of climate change politics and discourse. Beyond the specifics of climate change, this paper has important implications for understanding ideological polarization more generally, and the increasing role of private funding in determining why certain polarizing themes are created and amplified. Lastly, the paper suggests that future studies build on the novel approach taken here that integrates large-scale textual analysis with social networks. PMID:26598653

  9. Corporate funding and ideological polarization about climate change.

    PubMed

    Farrell, Justin

    2016-01-05

    Drawing on large-scale computational data and methods, this research demonstrates how polarization efforts are influenced by a patterned network of political and financial actors. These dynamics, which have been notoriously difficult to quantify, are illustrated here with a computational analysis of climate change politics in the United States. The comprehensive data include all individual and organizational actors in the climate change countermovement (164 organizations), as well as all written and verbal texts produced by this network between 1993-2013 (40,785 texts, more than 39 million words). Two main findings emerge. First, that organizations with corporate funding were more likely to have written and disseminated texts meant to polarize the climate change issue. Second, and more importantly, that corporate funding influences the actual thematic content of these polarization efforts, and the discursive prevalence of that thematic content over time. These findings provide new, and comprehensive, confirmation of dynamics long thought to be at the root of climate change politics and discourse. Beyond the specifics of climate change, this paper has important implications for understanding ideological polarization more generally, and the increasing role of private funding in determining why certain polarizing themes are created and amplified. Lastly, the paper suggests that future studies build on the novel approach taken here that integrates large-scale textual analysis with social networks.

  10. Improved Helicopter Rotor Performance Prediction through Loose and Tight CFD/CSD Coupling

    NASA Astrophysics Data System (ADS)

    Ickes, Jacob C.

    Helicopters and other Vertical Take-Off or Landing (VTOL) vehicles exhibit an interesting combination of structural dynamic and aerodynamic phenomena which together drive the rotor performance. The combination of factors involved make simulating the rotor a challenging and multidisciplinary effort, and one which is still an active area of interest in the industry because of the money and time it could save during design. Modern tools allow the prediction of rotorcraft physics from first principles. Analysis of the rotor system with this level of accuracy provides the understanding necessary to improve its performance. There has historically been a divide between the comprehensive codes which perform aeroelastic rotor simulations using simplified aerodynamic models, and the very computationally intensive Navier-Stokes Computational Fluid Dynamics (CFD) solvers. As computer resources become more available, efforts have been made to replace the simplified aerodynamics of the comprehensive codes with the more accurate results from a CFD code. The objective of this work is to perform aeroelastic rotorcraft analysis using first-principles simulations for both fluids and structural predictions using tools available at the University of Toledo. Two separate codes are coupled together in both loose coupling (data exchange on a periodic interval) and tight coupling (data exchange each time step) schemes. To allow the coupling to be carried out in a reliable and efficient way, a Fluid-Structure Interaction code was developed which automatically performs primary functions of loose and tight coupling procedures. Flow phenomena such as transonics, dynamic stall, locally reversed flow on a blade, and Blade-Vortex Interaction (BVI) were simulated in this work. Results of the analysis show aerodynamic load improvement due to the inclusion of the CFD-based airloads in the structural dynamics analysis of the Computational Structural Dynamics (CSD) code. Improvements came in the form of improved peak/trough magnitude prediction, better phase prediction of these locations, and a predicted signal with a frequency content more like the flight test data than the CSD code acting alone. Additionally, a tight coupling analysis was performed as a demonstration of the capability and unique aspects of such an analysis. This work shows that away from the center of the flight envelope, the aerodynamic modeling of the CSD code can be replaced with a more accurate set of predictions from a CFD code with an improvement in the aerodynamic results. The better predictions come at substantially increased computational costs between 1,000 and 10,000 processor-hours.
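
    The control flow of the two coupling schemes can be illustrated abstractly. The sketch below is not the author's code: cfd_airloads and csd_deflections are trivial stand-ins for the full Navier-Stokes and structural solvers, shown only to contrast per-revolution (loose) against per-time-step (tight) data exchange.

```python
# A schematic sketch (not the author's code) contrasting loose and tight
# CFD/CSD coupling. The two functions are placeholders so the control flow runs.
def cfd_airloads(deflection):        # placeholder airload at one azimuth
    return 0.1 * deflection + 1.0

def csd_deflections(airloads):       # placeholder structural response
    mean_load = sum(airloads) / len(airloads)
    return [0.5 * mean_load] * len(airloads)

steps_per_rev, n_revs = 360, 4
deflections = [0.0] * steps_per_rev

# Loose coupling: blade motion is frozen for a full revolution of CFD, then the
# CSD solver updates the motion once per revolution (periodic data exchange).
for rev in range(n_revs):
    airloads = [cfd_airloads(d) for d in deflections]
    deflections = csd_deflections(airloads)

# Tight coupling: fluids and structures exchange data at every time step.
d = 0.0
for step in range(steps_per_rev * n_revs):
    d = 0.5 * cfd_airloads(d)        # immediate structural update from airload
print("loose:", round(deflections[0], 4), "| tight:", round(d, 4))
```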

  11. Communications among data and science centers

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The ability to electronically access and query the contents of remote computer archives is of singular importance in the space and earth sciences; the present evaluation of such on-line information networks' development status foresees swift expansion of their data capabilities and complexity, in view of the volumes of data that will continue to be generated by NASA missions. The U.S. National Space Science Data Center (NSSDC) manages NASA's largest science computer network, the Space Physics Analysis Network; a comprehensive account is given of the structure of NSSDC international access through BITNET, and of connections to the NSSDC available in the Americas via the International X.25 network.

  12. Computational and Experimental Analysis of the Secretome of Methylococcus capsulatus (Bath)

    PubMed Central

    Indrelid, Stine; Mathiesen, Geir; Jacobsen, Morten; Lea, Tor; Kleiveland, Charlotte R.

    2014-01-01

    The Gram-negative methanotroph Methylococcus capsulatus (Bath) was recently demonstrated to abrogate inflammation in a murine model of inflammatory bowel disease, suggesting interactions with cells involved in maintaining mucosal homeostasis and emphasizing the importance of understanding the many properties of M. capsulatus. Secreted proteins determine how bacteria may interact with their environment, and a comprehensive knowledge of such proteins is therefore vital to understand bacterial physiology and behavior. The aim of this study was to systematically analyze protein secretion in M. capsulatus (Bath) by identifying the secretion systems present and the respective secreted substrates. Computational analysis revealed that in addition to previously recognized type II secretion systems and a type VII secretion system, a type Vb (two-partner) secretion system and putative type I secretion systems are present in M. capsulatus (Bath). In silico analysis suggests that the diverse secretion systems in M. capsulatus transport proteins likely to be involved in adhesion, colonization, nutrient acquisition and homeostasis maintenance. Results of the computational analysis were verified and extended by an experimental approach showing that, in addition, an uncharacterized protein and putative moonlighting proteins are released to the medium during exponential growth of M. capsulatus (Bath). PMID:25479164

  13. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis (GSA) method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in saving computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important versus unimportant input factors.
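
    The core of VARS is the directional variogram, gamma(h) = 0.5 E[(y(x + h e_i) - y(x))^2], estimated from paired model runs along each input dimension. A minimal sketch follows, using an invented three-input quadratic test function rather than the paper's hydrological model.

```python
# A minimal sketch of the directional-variogram idea behind VARS. The test
# function is an invented stand-in for an expensive environmental model.
import numpy as np

rng = np.random.default_rng(0)

def model(x):                         # hypothetical 3-input test function
    return 4.0 * x[0]**2 + 0.5 * x[1] + 0.01 * x[2]

def directional_variogram(dim, h, n=2000):
    """gamma(h) along one input dimension from paired model evaluations."""
    x = rng.uniform(0, 1, size=(n, 3))
    xh = x.copy()
    xh[:, dim] = np.clip(xh[:, dim] + h, 0, 1)
    dy = np.apply_along_axis(model, 1, xh) - np.apply_along_axis(model, 1, x)
    return 0.5 * np.mean(dy**2)

# Larger variogram values flag more sensitive (important) inputs.
for i in range(3):
    print(f"x{i}: gamma(0.1) = {directional_variogram(i, 0.1):.5f}")
```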

  14. Computational principles of working memory in sentence comprehension.

    PubMed

    Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A

    2006-10-01

    Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
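
    The decay and interference principles can be made concrete with the base-level activation equation from the ACT-R tradition the model builds on, B = ln(sum_j (now - t_j)^(-d)). The sketch below uses illustrative numbers only; it is not the authors' implementation.

```python
# A minimal sketch of two memory principles in such models: activation decay
# over time and a sigmoid mapping from activation to retrieval probability.
import math

def base_activation(retrieval_times, now, d=0.5):
    """Base-level activation: ln(sum over past uses of (now - t)^-d)."""
    return math.log(sum((now - t) ** (-d) for t in retrieval_times))

# An item used recently and often is more active than a distal, single-use one.
print(base_activation([0.5, 1.0, 2.0], now=2.5))   # recent, frequent
print(base_activation([0.5], now=2.5))             # distal, single use

def retrieval_prob(activation, threshold=0.0, noise=0.4):
    """Probability of successful retrieval given noisy activation."""
    return 1.0 / (1.0 + math.exp(-(activation - threshold) / noise))

print(retrieval_prob(base_activation([0.5], now=2.5)))
```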

  15. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  16. Marcus canonical integral for non-Gaussian processes and its computation: pathwise simulation and tau-leaping algorithm.

    PubMed

    Li, Tiejun; Min, Bin; Wang, Zhiming

    2013-03-14

    The stochastic integral ensuring the Newton-Leibniz chain rule is essential in stochastic energetics. The Marcus canonical integral has this property and can be understood as the Wong-Zakai type smoothing limit when the driving process is non-Gaussian. However, this important concept seems not to be well known to physicists. In this paper, we discuss the Marcus integral for non-Gaussian processes and its computation in the context of stochastic energetics. We give a comprehensive introduction to the Marcus integral and compare three equivalent definitions in the literature. We introduce the exact pathwise simulation algorithm and give the error analysis. We show how to compute the thermodynamic quantities based on the pathwise simulation algorithm. We highlight the information hidden in the Marcus mapping, which plays the key role in determining thermodynamic quantities. We further propose the tau-leaping algorithm, which advances the process with deterministic time steps when the tau-leaping condition is satisfied. The numerical experiments and efficiency analysis show that it is very promising.
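
    The pathwise construction can be sketched as follows: between jumps of a compound-Poisson driver the state follows the drift, and each jump is applied through the Marcus mapping, i.e., by integrating dphi/ds = g(phi) * (jump size) from s = 0 to 1. The drift, noise coefficient, and parameters below are illustrative choices, not the paper's examples.

```python
# A minimal sketch of pathwise simulation for a Marcus SDE driven by a
# compound-Poisson process. Functions and parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: -x              # drift
g = lambda x: 1.0 + 0.1 * x   # multiplicative noise coefficient

def marcus_map(x, jump, n_sub=100):
    """Marcus mapping: integrate dphi/ds = g(phi)*jump from phi(0)=x to s=1."""
    phi, ds = x, 1.0 / n_sub
    for _ in range(n_sub):    # simple Euler integration of the mapping ODE
        phi += g(phi) * jump * ds
    return phi

T, lam, x, t = 5.0, 2.0, 1.0, 0.0   # horizon, jump rate, initial state, clock
while True:
    tau = rng.exponential(1.0 / lam)         # waiting time to the next jump
    if t + tau > T:
        x += f(x) * (T - t)                  # crude Euler drift to the horizon
        break
    x += f(x) * tau                          # crude Euler drift between jumps
    x = marcus_map(x, rng.normal(0.0, 0.5))  # apply the Marcus jump update
    t += tau
print("X(T) ~", round(x, 4))
```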

  17. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  18. LOX/LH2 vane pump for auxiliary propulsion systems

    NASA Technical Reports Server (NTRS)

    Hemminger, J. A.; Ulbricht, T. E.

    1985-01-01

    Positive displacement pumps offer potential efficiency advantages over centrifugal pumps for future low thrust space missions. Low flow rate applications, such as space station auxiliary propulsion or dedicated low thrust orbiter transfer vehicles, are typical of missions where low flow and high head rise challenge centrifugal pumps. The positive displacement vane pump for pumping of LOX and LH2 is investigated. This effort has included: (1) a testing program in which pump performance was investigated for differing pump clearances and for differing pump materials while pumping LN2, LOX, and LH2; and (2) an analysis effort, in which a comprehensive pump performance analysis computer code was developed and exercised. An overview of the theoretical framework of the performance analysis computer code is presented, along with a summary of analysis results. Experimental results are presented for the pump operating in liquid nitrogen. Included are data on the effects of pump clearance, speed, and pressure rise on pump performance. Pump suction performance is also presented.

  19. A Position on a Computer Literacy Course.

    ERIC Educational Resources Information Center

    Self, Charles C.

    A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…

  20. Scientific Visualization, Seeing the Unseeable

    ScienceCinema

    LBNL

    2017-12-09

    June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  1. Computerized Presentation of Text: Effects on Children's Reading of Informational Material. Special Issue on Reading Comprehension ? Part II

    ERIC Educational Resources Information Center

    Kerr, Matthew A.; Symons, Sonya E.

    2006-01-01

    This study examined whether children's reading rate, comprehension, and recall are affected by computer presentation of text. Participants were 60 grade five students, who each read two expository texts, one in a traditional print format and the other from a computer monitor, which used a common scrolling text interface. After reading each text,…

  2. The Effects of Targeted English Language Arts Instruction Using Multimedia Applications on Grade Three Students' Reading Comprehension, Attitude toward Computers, and Attitude toward School

    ERIC Educational Resources Information Center

    Swerdloff, Matthew

    2013-01-01

    The purpose of this study was to investigate the specific effects of targeted English Language Arts (ELA) instruction using multimedia applications. Student reading comprehension, student attitude toward computers, and student attitude toward school were measured in this study. The study also examined the perceptions, of selected students, of the…

  3. Using a Computer-Adapted, Conceptually Based History Text to Increase Comprehension and Problem-Solving Skills of Students with Disabilities

    ERIC Educational Resources Information Center

    Twyman, Todd; Tindal, Gerald

    2006-01-01

    The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…

  4. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering, and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game-changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: development of use case studies for science workflows; creation of a taxonomy and structure for describing science computing requirements; and characterization of agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements, and our characterizations, as well as our interview process, what we learned, and how we plan to improve our materials after using them in the first round of interviews in the Earth Science modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering, and flight at NASA.

  5. Computation of rotor aerodynamic loads in forward flight using a full-span free wake analysis

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Wachspress, Daniel A.; Boschitsch, Alexander H.; Chua, Kiat

    1990-01-01

    The development of an advanced computational analysis of unsteady aerodynamic loads on isolated helicopter rotors in forward flight is described. The primary technical focus of the development was the implementation of a freely distorting filamentary wake model composed of curved vortex elements laid out along contours of constant vortex sheet strength in the wake. This model captures the wake generated by the full span of each rotor blade and makes possible a unified treatment of the shed and trailed vorticity in the wake. This wake model was coupled to a modal analysis of the rotor blade dynamics and a vortex lattice treatment of the aerodynamic loads to produce a comprehensive model for rotor performance and air loads in forward flight dubbed RotorCRAFT (Computation of Rotor Aerodynamics in Forward Flight). The technical background on the major components of this analysis are discussed and the correlation of predictions of performance, trim, and unsteady air loads with experimental data from several representative rotor configurations is examined. The primary conclusions of this study are that the RotorCRAFT analysis correlates well with measured loads on a variety of configurations and that application of the full span free wake model is required to capture several important features of the vibratory loading on rotor blades in forward flight.
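
    The building block of any such free-wake analysis is the Biot-Savart law for a vortex element. RotorCRAFT uses curved elements laid out along contours of constant sheet strength; the sketch below shows only the textbook straight-segment formula, with a small core term added to avoid the singularity, as an illustration of the underlying computation.

```python
# A minimal sketch of the Biot-Savart induced velocity from a straight vortex
# segment of strength gamma -- the textbook building block of free-wake codes.
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-6):
    """Velocity at point p induced by a vortex segment running from a to b."""
    r1, r2 = p - a, p - b
    n1, n2 = np.linalg.norm(r1), np.linalg.norm(r2)
    cross = np.cross(r1, r2)
    # Small core term regularizes the denominator near the filament.
    denom = n1 * n2 * (n1 * n2 + np.dot(r1, r2)) + core
    return (gamma / (4.0 * np.pi)) * cross * (n1 + n2) / denom

p = np.array([0.0, 1.0, 0.0])
v = segment_induced_velocity(p, np.array([-1.0, 0.0, 0.0]),
                             np.array([1.0, 0.0, 0.0]), gamma=1.0)
print(v)   # points in +z for this geometry (right-hand rule)
```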

  6. Advancement and Implementation of Integrated Computational Materials Engineering (ICME) for Aerospace Applications

    DTIC Science & Technology

    2010-03-01

    [Fragments extracted from a maturity-assessment table: thermal history modeled with a mature Abaqus FEM engine applied within ABAQUS; residual stress and distortion of unknown maturity for HTC; continuum FEM for fluid flow, heat flow, and stress analysis; mold fill with a mature FEM implementation; thermal and mushy-zone history noted as needing development.] The committee's ICME vision is comprehensive, expansive, and involves the entire materials community. The scope of this white paper is...

  7. Research on a Frame-Based Model of Reading Comprehension. Final Report.

    ERIC Educational Resources Information Center

    Goldstein, Ira

    This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artifical intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…

  8. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  9. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  10. The Association Between Nutrition Facts Label Utilization and Comprehension among Latinos in Two East Los Angeles Neighborhoods

    PubMed Central

    Sharif, Mienah Z.; Rizzo, Shemra; Prelip, Michael L; Glik, Deborah C; Belin, Thomas R; Langellier, Brent A; Kuo, Alice A.; Garza, Jeremiah R; Ortega, Alexander N

    2014-01-01

    Background: The Nutrition Facts label can facilitate healthy dietary practices. There is a dearth of research on Latinos’ utilization and comprehension of the Nutrition Facts label. Objective: To measure Nutrition Facts label use and comprehension and to identify their correlates among Latinos in East Los Angeles. Design: Cross-sectional interviewer-administered survey using computer assisted personal interview (CAPI) software conducted in either English or Spanish in the participant’s home. Participants/Setting: Eligibility criteria were: living in a household within the block clusters identified, being age 18 or over, speaking English or Spanish, identifying as Latino and as the household’s main food purchaser and preparer. Analyses were based on 269 eligible respondents. Statistical analyses performed: Chi-square test and multivariate logistic regression analysis assessed the association between the main outcomes and demographics. Multiple imputation addressed missing data. Results: Sixty percent reported using the label; only 13% showed adequate comprehension of the label. Utilization was associated with being female, speaking Spanish and being below the poverty line. Comprehension was associated with younger age, not being married, and higher education. Utilization was not associated with comprehension. Conclusions: Latinos who are using the Nutrition Facts label are not correctly interpreting the available information. Targeted education is needed to improve Nutrition Facts label use and comprehension, to directly improve diet, particularly among males, older Latinos, and those with less than a high school education. PMID:24974172

  11. The association between nutrition facts label utilization and comprehension among Latinos in two east Los Angeles neighborhoods.

    PubMed

    Sharif, Mienah Z; Rizzo, Shemra; Prelip, Michael L; Glik, Deborah C; Belin, Thomas R; Langellier, Brent A; Kuo, Alice A; Garza, Jeremiah R; Ortega, Alexander N

    2014-12-01

    The Nutrition Facts label can facilitate healthy dietary practices. There is a dearth of research on Latinos' utilization and comprehension of the Nutrition Facts label. To measure use and comprehension of the Nutrition Facts label and to identify correlates among Latinos in East Los Angeles, CA. Cross-sectional interviewer-administered survey using computer-assisted personal interview software, conducted in either English or Spanish in the participant's home. Eligibility criteria were: living in a household within the block clusters identified, being age 18 years or older, speaking English or Spanish, identifying as Latino and as the household's main food purchaser and preparer. Analyses were based on 269 eligible respondents. The χ² test and multivariate logistic regression analysis assessed the associations among the main outcomes and demographics. Multiple imputation addressed missing data. Sixty percent reported using the label; only 13% showed adequate comprehension of the label. Utilization was associated with being female, speaking Spanish, and being below the poverty line. Comprehension was associated with younger age, not being married, and higher education. Utilization was not associated with comprehension. Latinos who are using the Nutrition Facts label are not correctly interpreting the available information. Targeted education is needed to improve use and comprehension of the Nutrition Facts label to directly improve diet, particularly among males, older Latinos, and those with less than a high school education. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  12. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data.

    PubMed

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie; Zhang, Gong

    2018-01-04

    Translation is a key regulatory step, linking transcriptome and proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. The translation indices (translation ratio, elongation velocity index, and translational efficiency) can be calculated to quantitatively evaluate translational initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from complex searching, analysis, and comparison of huge sequencing datasets without requiring local computational power. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
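
    One of these indices, translational efficiency, is essentially the ratio of translating-mRNA abundance (RNC-seq) to total mRNA abundance (mRNA-seq) after length and depth normalization. A minimal sketch with invented gene names and read counts, using RPKM normalization as one plausible choice:

```python
# A minimal sketch of a translational-efficiency calculation: the ratio of
# RNC-seq to mRNA-seq abundance per gene. All counts are invented placeholders.
def rpkm(reads, gene_len_bp, total_reads):
    """Reads per kilobase of transcript per million mapped reads."""
    return reads * 1e9 / (gene_len_bp * total_reads)

rnc  = {"GENE1": (900, 1500), "GENE2": (120, 2400)}   # (reads, gene length bp)
mrna = {"GENE1": (600, 1500), "GENE2": (800, 2400)}
rnc_total, mrna_total = 2_000_000, 3_000_000          # library sizes

for gene in rnc:
    te = rpkm(*rnc[gene], rnc_total) / rpkm(*mrna[gene], mrna_total)
    print(f"{gene}: TE = {te:.2f}")
```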

  13. The CMC/3DPNS computer program for prediction of three-dimension, subsonic, turbulent aerodynamic juncture region flow. Volume 2: Users' manual

    NASA Technical Reports Server (NTRS)

    Manhardt, P. D.

    1982-01-01

    The CMC fluid mechanics program system was developed to translate finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow field analysis. Data procedures for the CMC three-dimensional parabolic Navier-Stokes (PNS) algorithm are presented. General data procedures are described, along with a standard juncture corner flow test case data deck. A listing of the data deck and an explanation of grid generation methodology are presented. Tabulations of all commands and variables available to the user are described. These are in alphabetical order with cross-reference numbers which refer to storage addresses.

  14. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  15. Low-speed Aerodynamic Investigations of a Hybrid Wing Body Configuration

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.; Gatlin, Gregory M.; Jenkins, Luther N.; Murphy, Patrick C.; Carter, Melissa B.

    2014-01-01

    Two low-speed static wind tunnel tests and a water tunnel static and dynamic forced-motion test have been conducted on a hybrid wing-body (HWB) twinjet configuration. These tests, in addition to computational fluid dynamics (CFD) analysis, have provided a comprehensive dataset of the low-speed aerodynamic characteristics of this nonproprietary configuration. In addition to force and moment measurements, the tests included surface pressures, flow visualization, and off-body particle image velocimetry measurements. This paper will summarize the results of these tests and highlight the data that is available for code comparison or additional analysis.

  16. CARD 2017: expansion and model-centric curation of the comprehensive antibiotic resistance database

    PubMed Central

    Jia, Baofeng; Raphenya, Amogelang R.; Alcock, Brian; Waglechner, Nicholas; Guo, Peiyao; Tsang, Kara K.; Lago, Briony A.; Dave, Biren M.; Pereira, Sheldon; Sharma, Arjun N.; Doshi, Sachin; Courtot, Mélanie; Lo, Raymond; Williams, Laura E.; Frye, Jonathan G.; Elsayegh, Tariq; Sardar, Daim; Westman, Erin L.; Pawlowski, Andrew C.; Johnson, Timothy A.; Brinkman, Fiona S.L.; Wright, Gerard D.; McArthur, Andrew G.

    2017-01-01

    The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins and mutations involved in AMR. CARD is ontologically structured, model centric, and spans the breadth of AMR drug classes and resistance mechanisms, including intrinsic, mutation-driven and acquired resistance. It is built upon the Antibiotic Resistance Ontology (ARO), a custom built, interconnected and hierarchical controlled vocabulary allowing advanced data sharing and organization. Its design allows the development of novel genome analysis tools, such as the Resistance Gene Identifier (RGI) for resistome prediction from raw genome sequence. Recent improvements include extensive curation of additional reference sequences and mutations, development of a unique Model Ontology and accompanying AMR detection models to power sequence analysis, new visualization tools, and expansion of the RGI for detection of emergent AMR threats. CARD curation is updated monthly based on an interplay of manual literature curation, computational text mining, and genome analysis. PMID:27789705

  17. The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance

    ERIC Educational Resources Information Center

    Fan, Quyin

    2010-01-01

    Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…

  18. Effectiveness evaluation of STOL transport operations (phase 2). [computer simulation program of commercial short haul aircraft operations

    NASA Technical Reports Server (NTRS)

    Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.

    1974-01-01

    A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.

  19. Comprehensive Benchmark Suite for Simulation of Particle Laden Flows Using the Discrete Element Method with Performance Profiles from the Multiphase Flow with Interface eXchanges (MFiX) Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Peiyuan; Brown, Timothy; Fullmer, William D.

    Five benchmark problems are developed and simulated with the computational fluid dynamics and discrete element model code MFiX. The benchmark problems span dilute and dense regimes, consider statistically homogeneous and inhomogeneous (both clusters and bubbles) particle concentrations and a range of particle and fluid dynamic computational loads. Several variations of the benchmark problems are also discussed to extend the computational phase space to cover granular (particles only), bidisperse and heat transfer cases. A weak scaling analysis is performed for each benchmark problem and, in most cases, the scalability of the code appears reasonable up to approximately 10³ cores. Profiling of the benchmark problems indicates that the most substantial computational time is being spent on particle-particle force calculations, drag force calculations and interpolating between discrete particle and continuum fields. Hardware performance analysis was also carried out showing significant Level 2 cache miss ratios and a rather low degree of vectorization. These results are intended to serve as a baseline for future developments to the code as well as a preliminary indicator of where to best focus performance optimizations.
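
    The weak-scaling metric used in such analyses keeps the work per core fixed, so the ideal runtime is flat and efficiency is T(1)/T(N). A minimal sketch with invented timings (not MFiX measurements):

```python
# A minimal sketch of weak-scaling efficiency: problem size grows with core
# count, so efficiency = T(1) / T(N). Timings below are invented placeholders.
timings = {1: 100.0, 8: 104.0, 64: 115.0, 512: 142.0, 1024: 180.0}  # cores -> s

t1 = timings[1]
for cores, t in timings.items():
    print(f"{cores:>5} cores: weak-scaling efficiency = {t1 / t:.2%}")
```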

  20. Computer Program for Analysis, Design and Optimization of Propulsion, Dynamics, and Kinematics of Multistage Rockets

    NASA Astrophysics Data System (ADS)

    Lali, Mehdi

    2009-03-01

    A comprehensive computer program is designed in MATLAB to analyze, design and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-staging rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of the rocket that was previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations, carried out through sophisticated algorithms using numerical methods and incremental integration. Innovative techniques are applied to calculate the optimal parameters for the engine and to design the "optimal pitch program." The program is stand-alone in that it calculates almost every design parameter related to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as educational and research purposes.
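
    The staging arithmetic such a program automates rests on the Tsiolkovsky equation summed over serial stages, delta-v = sum_i Isp_i * g0 * ln(m0_i / mf_i). A short worked example with invented stage data follows (sketched in Python rather than the program's MATLAB):

```python
# A minimal worked example of multistage delta-v via the Tsiolkovsky equation.
# Stage masses and Isp values are invented placeholders.
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_dv(isp, m0, mf):
    """Tsiolkovsky: dv = Isp * g0 * ln(m0 / mf) for one stage."""
    return isp * G0 * math.log(m0 / mf)

# (Isp [s], ignition mass [kg], burnout mass [kg]) for each serial stage;
# each stage's masses include everything stacked above it.
stages = [(280.0, 120_000.0, 45_000.0),
          (320.0, 30_000.0, 11_000.0),
          (450.0, 8_000.0, 3_200.0)]

total = sum(stage_dv(*s) for s in stages)
print(f"total delta-v ~ {total:.0f} m/s")   # roughly 9.9 km/s for these numbers
```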

  1. A brief overview of computational structures technology related activities at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.

    1992-01-01

    The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails: exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance; formulation and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting emerging advances in computer technology; and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.

  2. Systems cell biology

    PubMed Central

    Mast, Fred D.; Ratushny, Alexander V.

    2014-01-01

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336

  3. Calculating Mass Diffusion in High-Pressure Binary Fluids

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Harstad, Kenneth

    2004-01-01

    A comprehensive mathematical model of mass diffusion has been developed for binary fluids at high pressures, including critical and supercritical pressures. Heretofore, diverse expressions, valid for limited parameter ranges, have been used to correlate high-pressure binary mass-diffusion-coefficient data. This model will likely be especially useful in the computational simulation and analysis of combustion phenomena in diesel engines, gas turbines, and liquid rocket engines, wherein mass diffusion at high pressure plays a major role.

  4. Time-variant random interval natural frequency analysis of structures

    NASA Astrophysics Data System (ADS)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle hybrid random interval structural natural frequency problems. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of natural frequency with respect to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces computational cost. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under concrete intrinsic creep effects with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, progressively more complex in both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy, and efficiency of the proposed method.
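
    The Chebyshev surrogate step can be sketched in isolation: sample an expensive frequency computation at Chebyshev nodes over an interval input, fit a polynomial, and bound the frequency by evaluating the cheap surrogate densely. The frequency function below is an invented stand-in, not the paper's structural model.

```python
# A minimal sketch of a Chebyshev surrogate over one interval variable,
# used to bound a natural frequency cheaply. The model is invented.
import numpy as np
from numpy.polynomial import chebyshev as C

def natural_freq(E):            # hypothetical expensive model: freq vs stiffness
    return 12.0 * np.sqrt(E) / (1.0 + 0.01 * E)

E_lo, E_hi = 180.0, 220.0       # interval variable (e.g., Young's modulus, GPa)
# Nine Chebyshev points mapped from [-1, 1] to [E_lo, E_hi]
nodes = 0.5 * (E_hi + E_lo) + 0.5 * (E_hi - E_lo) * np.cos(
    np.pi * (2 * np.arange(9) + 1) / 18)
coeffs = C.chebfit(nodes, natural_freq(nodes), deg=8)

# Dense evaluation of the cheap surrogate gives the frequency bounds.
E_dense = np.linspace(E_lo, E_hi, 2001)
f_dense = C.chebval(E_dense, coeffs)
print(f"frequency bounds over interval: [{f_dense.min():.3f}, {f_dense.max():.3f}]")
```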

  5. POSTOP: Postbuckled open-stiffener optimum panels-theory and capability

    NASA Technical Reports Server (NTRS)

    Dickson, J. N.; Biggers, S. B.

    1984-01-01

    The computer program POSTOP was developed to serve as an aid in the analysis and sizing of stiffened composite panels that are loaded in the postbuckling regime. A comprehensive set of analysis routines was coupled to a widely used optimization program to produce this sizing code. POSTOP is intended for the preliminary design of metal or composite panels with open-section stiffeners, subjected to multiple combined biaxial compression (or tension), shear and normal pressure load cases. Longitudinal compression, however, is assumed to be the dominant loading. Temperature, initial bow eccentricity and load eccentricity effects are included. The panel geometry is assumed to be repetitive over several bays in the longitudinal (stiffener) direction as well as in the transverse direction. Analytical routines are included to compute panel stiffnesses, strains, local and panel buckling loads, and skin/stiffener interface stresses. The resulting program is applicable to stiffened panels as commonly used in fuselage, wing, or empennage structures. The analysis procedures and rationale for the assumptions used therein are described in detail.

  6. Template construction grammar: from visual scene description to language comprehension and agrammatism.

    PubMed

    Barrès, Victor; Lee, Jinyong

    2014-01-01

    How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community.

  7. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  8. Cloud Service Selection Using Multicriteria Decision Analysis

    PubMed Central

    Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645

  9. Cloud service selection using multicriteria decision analysis.

    PubMed

    Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.
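
    As one concrete instance of the MCDA techniques surveyed, the sketch below ranks three hypothetical cloud services with TOPSIS (closeness to an ideal solution). The decision matrix, weights, and criteria senses are invented placeholders, not values from the paper.

```python
# A minimal TOPSIS sketch for cloud service selection. All inputs are invented.
import numpy as np

# rows: services; columns: cost, latency (lower-better), availability (higher-better)
X = np.array([[0.12, 45.0, 99.90],
              [0.09, 80.0, 99.50],
              [0.20, 30.0, 99.99]])
w = np.array([0.4, 0.3, 0.3])              # criterion weights
benefit = np.array([False, False, True])   # sense of each criterion

R = X / np.linalg.norm(X, axis=0)          # vector-normalize each column
V = R * w                                  # weighted normalized matrix
ideal = np.where(benefit, V.max(0), V.min(0))   # best value per criterion
anti  = np.where(benefit, V.min(0), V.max(0))   # worst value per criterion
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)        # 1 = ideal, 0 = anti-ideal
print("ranking (best first):", np.argsort(-closeness))
```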

  10. Computational analysis of microRNA function in heart development.

    PubMed

    Liu, Ganqiang; Ding, Min; Chen, Jiajia; Huang, Jinyan; Wang, Haiyun; Jing, Qing; Shen, Bairong

    2010-09-01

    Emerging evidence suggests that specific spatio-temporal microRNA (miRNA) expression is required for heart development. In recent years, hundreds of miRNAs have been discovered. In contrast, functional annotations are available only for a very small fraction of these regulatory molecules. In order to provide a global perspective for the biologists who study the relationship between differentially expressed miRNAs and heart development, we employed computational analysis to uncover the specific cellular processes and biological pathways targeted by miRNAs in mouse heart development. Here, we utilized Gene Ontology (GO) categories, KEGG Pathway, and GeneGo Pathway Maps as a gene functional annotation system for miRNA target enrichment analysis. The target genes of miRNAs were found to be enriched in functional categories and pathway maps in which miRNAs could play important roles during heart development. Meanwhile, we developed miRHrt (http://sysbio.suda.edu.cn/mirhrt/), a database aiming to provide a comprehensive resource of miRNA function in regulating heart development. These computational analysis results effectively illustrated the correlation of differentially expressed miRNAs with cellular functions and heart development. We hope that the identified novel heart development-associated pathways and the database presented here would facilitate further understanding of the roles and mechanisms of miRNAs in heart development.
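
    Target enrichment analyses of this kind commonly rest on a hypergeometric test: the probability that k or more of n target genes fall in a functional category of size K drawn from a genome of N genes. The abstract does not name the paper's exact test, so the sketch below (with invented counts) shows this common choice only as an illustration.

```python
# A minimal sketch of a hypergeometric enrichment test for miRNA target genes.
# All counts are invented placeholders.
from scipy.stats import hypergeom

N, K = 20000, 300     # genome size; genes annotated to the category
n, k = 150, 12        # miRNA target genes; targets inside the category

# P(X >= k) is the survival function evaluated at k - 1.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value = {p_value:.3e}")
```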

  11. Neural source dynamics of brain responses to continuous stimuli: Speech processing from acoustics to comprehension.

    PubMed

    Brodbeck, Christian; Presacco, Alessandro; Simon, Jonathan Z

    2018-05-15

    Human experience often involves continuous sensory information that unfolds over time. This is true in particular for speech comprehension, where continuous acoustic signals are processed over seconds or even minutes. We show that brain responses to such continuous stimuli can be investigated in detail, for magnetoencephalography (MEG) data, by combining linear kernel estimation with minimum norm source localization. Previous research has shown that the requirement to average data over many trials can be overcome by modeling the brain response as a linear convolution of the stimulus and a kernel, or response function, and estimating a kernel that predicts the response from the stimulus. However, such analysis has been typically restricted to sensor space. Here we demonstrate that this analysis can also be performed in neural source space. We first computed distributed minimum norm current source estimates for continuous MEG recordings, and then computed response functions for the current estimate at each source element, using the boosting algorithm with cross-validation. Permutation tests can then assess the significance of individual predictor variables, as well as features of the corresponding spatio-temporal response functions. We demonstrate the viability of this technique by computing spatio-temporal response functions for speech stimuli, using predictor variables reflecting acoustic, lexical and semantic processing. Results indicate that processes related to comprehension of continuous speech can be differentiated anatomically as well as temporally: acoustic information engaged auditory cortex at short latencies, followed by responses over the central sulcus and inferior frontal gyrus, possibly related to somatosensory/motor cortex involvement in speech perception; lexical frequency was associated with a left-lateralized response in auditory cortex and subsequent bilateral frontal activity; and semantic composition was associated with bilateral temporal and frontal brain activity. We conclude that this technique can be used to study the neural processing of continuous stimuli in time and anatomical space with the millisecond temporal resolution of MEG. This suggests new avenues for analyzing neural processing of naturalistic stimuli, without the necessity of averaging over artificially short or truncated stimuli. Copyright © 2018 Elsevier Inc. All rights reserved.
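
    The kernel-estimation step models the response as a convolution of the stimulus with an unknown response function and solves for that function from time-lagged copies of the stimulus. The authors use boosting with cross-validation; the sketch below substitutes ridge regression for brevity and runs on synthetic signals, so it illustrates the modeling idea rather than the paper's algorithm.

```python
# A minimal sketch of temporal response function (TRF) estimation by
# regularized least squares on lagged stimulus copies. Signals are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n, n_lags = 5000, 40
stim = rng.normal(size=n)                        # e.g., an acoustic envelope
true_trf = np.exp(-np.arange(n_lags) / 8.0) * np.sin(np.arange(n_lags) / 4.0)
resp = np.convolve(stim, true_trf)[:n] + 0.5 * rng.normal(size=n)

# Design matrix of lagged stimulus values: X[t, k] = stim[t - k]
X = np.column_stack([np.concatenate([np.zeros(k), stim[:n - k]])
                     for k in range(n_lags)])

lam = 10.0                                       # ridge penalty
trf_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_lags), X.T @ resp)
print("correlation with true kernel:", np.corrcoef(trf_hat, true_trf)[0, 1].round(3))
```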

  12. Normal-Gamma-Bernoulli Peak Detection for Analysis of Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry Data.

    PubMed

    Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang

    2017-01-01

    Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much greater separation power for the analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide application of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized using the gamma distribution, and a new peak detection algorithm using the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches are introduced: the fast Fourier transform (FFT) and the first- and second-order delta methods (D1 and D2). The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs best in terms of both computational expense and peak detection performance.
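
    The core numerical difficulty is evaluating the density of a normal-plus-gamma sum, which has no closed form. The sketch below approximates that density by FFT-based numerical convolution of the two component densities on a grid; the parameter values are arbitrary, and this is a conceptual stand-in rather than the authors' algorithm.

      import numpy as np
      from scipy.stats import norm, gamma
      from scipy.signal import fftconvolve

      def normal_gamma_pdf(x, mu, sigma, shape, scale):
          """Density of Z = Normal(mu, sigma^2) + Gamma(shape, scale) on grid x,
          approximated by FFT-based numerical convolution."""
          dx = x[1] - x[0]
          f_n = norm.pdf(x, loc=mu, scale=sigma)
          g = np.arange(0, x[-1] - x[0] + dx, dx)     # gamma support grid
          f_g = gamma.pdf(g, a=shape, scale=scale)
          return fftconvolve(f_n, f_g)[:len(x)] * dx  # truncate 'full' output

      x = np.linspace(-5.0, 40.0, 4000)
      pdf = normal_gamma_pdf(x, mu=0.0, sigma=1.0, shape=2.0, scale=3.0)
      print(pdf.sum() * (x[1] - x[0]))                # ~1: integrates to unity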

  13. Coupled CFD/CSD Analysis of an Active-Twist Rotor in a Wind Tunnel with Experimental Validation

    NASA Technical Reports Server (NTRS)

    Massey, Steven J.; Kreshock, Andrew R.; Sekula, Martin K.

    2015-01-01

    An unsteady Reynolds averaged Navier-Stokes analysis loosely coupled with a comprehensive rotorcraft code is presented for a second-generation active-twist rotor. High fidelity Navier-Stokes results for three configurations: an isolated rotor, a rotor with fuselage, and a rotor with fuselage mounted in a wind tunnel, are compared to lifting-line theory based comprehensive rotorcraft code calculations and wind tunnel data. Results indicate that CFD/CSD predictions of flapwise bending moments are in good agreement with wind tunnel measurements for configurations with a fuselage, and that modeling the wind tunnel environment does not significantly enhance computed results. Actuated rotor results for the rotor with fuselage configuration are also validated for predictions of vibratory blade loads and fixed-system vibratory loads. Varying levels of agreement with wind tunnel measurements are observed for blade vibratory loads, depending on the load component (flap, lag, or torsion) and the harmonic being examined. Predicted trends in fixed-system vibratory loads are in good agreement with wind tunnel measurements.

  14. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    NASA Technical Reports Server (NTRS)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
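
    The complex-variable (complex-step) approach mentioned above is worth a concrete illustration, since it is what makes the structural sensitivities essentially exact: perturbing the input along the imaginary axis avoids the subtractive cancellation of finite differences. The toy response function below is an assumption for demonstration only.

      import numpy as np

      def complex_step_derivative(f, x, h=1e-30):
          """df/dx ~= Im(f(x + i*h)) / h, accurate to machine precision."""
          return np.imag(f(x + 1j * h)) / h

      f = lambda x: x * np.sin(x)                    # stand-in response function
      print(complex_step_derivative(f, 1.2))         # complex-step sensitivity
      print(np.sin(1.2) + 1.2 * np.cos(1.2))         # analytic check

    Because no difference of nearby values is taken, the step h can be made extremely small without roundoff error, which is why the method serves as a trustworthy reference for verifying adjoint gradients.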

  15. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  16. Application of a new model for groundwater age distributions: Modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin, California

    USGS Publications Warehouse

    Ginn, T.R.; Woolfenden, L.

    2002-01-01

    A project for modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin aquifer in California is discussed. The Rialto-Colton aquifer has been divided into four primary and significant flowpaths following the general direction of groundwater flow from NW to SE. The introductory investigation includes sophisticated chemical reaction modeling with highly simplified flow-path simulation. A comprehensive reactive transport model with the established set of geochemical reactions over the whole aquifer will also be developed to treat both reactions and transport realistically. This will be accomplished using HBGC123D, extended with an isotopic calculation step, to compute carbon-14 (C14) and stable carbon-13 (C13) contents of the water. Computed carbon contents will also be calibrated against measured carbon contents to assess the amount of imported recharge into the Linden pond.
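
    As a reminder of the arithmetic behind the C14 step, a conventional radiocarbon age follows from the decay law using the Libby mean life; the sketch below shows that calculation. It is a deliberate simplification (no delta-C13 normalization or calendar calibration) and is not the HBGC123D implementation.

      import math

      LIBBY_MEAN_LIFE = 8033.0   # years, from the Libby half-life of 5568 yr

      def radiocarbon_age(fraction_modern):
          """Conventional C-14 age (years BP) from normalized activity A/A0."""
          return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

      print(radiocarbon_age(0.80))   # ~1792 yr BP for 80% of modern activity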

  17. Analytical Methodology for Predicting the Onset of Widespread Fatigue Damage in Fuselage Structure

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Newman, James C., Jr.; Piascik, Robert S.; Starnes, James H., Jr.

    1996-01-01

    NASA has developed a comprehensive analytical methodology for predicting the onset of widespread fatigue damage in fuselage structure. The determination of the number of flights and operational hours of aircraft service life that are related to the onset of widespread fatigue damage includes analyses for crack initiation, fatigue crack growth, and residual strength. Therefore, the computational capability required to predict analytically the onset of widespread fatigue damage must be able to represent a wide range of crack sizes from the material (microscale) level to the global structural-scale level. NASA studies indicate that the fatigue crack behavior in aircraft structure can be represented conveniently by the following three analysis scales: small three-dimensional cracks at the microscale level, through-the-thickness two-dimensional cracks at the local structural level, and long cracks at the global structural level. The computational requirements for each of these three analysis scales are described in this paper.
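
    Schematically, the fatigue crack growth portion of such a methodology is an integration of a crack-growth law between the initial and critical crack sizes. The sketch below integrates the generic Paris law; the constants are illustrative, and this is a textbook stand-in rather than the specific NASA multi-scale methodology.

      import numpy as np

      def cycles_to_grow(a0, af, C, m, delta_sigma, Y=1.0, n=100_000):
          """Cycles to grow a crack from a0 to af under the Paris law
          da/dN = C * (dK)^m, with dK = Y * delta_sigma * sqrt(pi * a)."""
          a = np.linspace(a0, af, n)                 # crack length, m
          dK = Y * delta_sigma * np.sqrt(np.pi * a)  # MPa*sqrt(m)
          integrand = 1.0 / (C * dK**m)              # dN/da
          # trapezoid rule for N = integral of dN/da over [a0, af]
          return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a))

      # Illustrative aluminum-like values only (a in m, delta_sigma in MPa)
      print(cycles_to_grow(a0=1e-3, af=1e-2, C=1e-11, m=3.0, delta_sigma=100.0))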

  18. Performance optimization of helicopter rotor blades

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.

    1991-01-01

    As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general-purpose optimization program CONMIN and approximate analyses. Sensitivity analyses consisting of derivatives of the objective function and constraints are carried out by forward finite differences. The procedure is applied to a test problem that is an analytical model of a wind tunnel model of a utility rotor blade.
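
    The sensitivity step above (forward finite differences of the objective and constraints) is simple enough to state in a few lines; the sketch below shows the generic pattern, with a toy surrogate objective standing in for the actual hover-power analysis.

      import numpy as np

      def forward_diff_gradient(f, x, rel_step=1e-6):
          """Derivatives of f at x by forward finite differences."""
          x = np.asarray(x, dtype=float)
          f0 = f(x)
          grad = np.zeros_like(x)
          for i in range(x.size):
              h = rel_step * max(abs(x[i]), 1.0)     # scaled step
              xp = x.copy()
              xp[i] += h
              grad[i] = (f(xp) - f0) / h
          return grad

      # Toy surrogate: "hover power" vs. taper ratio and root chord
      power = lambda v: (v[0] - 0.6)**2 + 2.0 * (v[1] - 1.1)**2
      print(forward_diff_gradient(power, [0.5, 1.0]))   # ~[-0.2, -0.4]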

  19. The utility of ERTS-1 data for applications in land use classification. [Texas Gulf Coast

    NASA Technical Reports Server (NTRS)

    Dornbach, J. E.; Mckain, G. E.

    1974-01-01

    A comprehensive study has been undertaken to determine the extent to which conventional image interpretation and computer-aided (spectral pattern recognition) analysis techniques using ERTS-1 data could be used to detect, identify (classify), locate, and measure current land use over large geographic areas. It can be concluded that most of the level 1 and 2 categories in USGS Circular no. 671 can be detected in the Houston-Gulf Coast area using a combination of both techniques for analysis. These capabilities could be exercised over larger geographic areas; however, certain factors such as different vegetative cover and topography may have to be considered in other geographic regions. The best results in identification (classification), location, and measurement of level 1 and 2 type categories appear to be obtainable through automatic data processing of multispectral scanner computer compatible tapes.

  20. Analysis on trust influencing factors and trust model from multiple perspectives of online Auction

    NASA Astrophysics Data System (ADS)

    Yu, Wang

    2017-10-01

    Current reputation models do not fully address online auction trading, so they cannot completely reflect the reputation status of users and may cause operability problems. To evaluate user trust in online auctions correctly, a trust computing model based on multiple influencing factors is established. It aims to overcome the inefficiency of current trust computing methods and the limitations of traditional theoretical trust models. The improved model comprehensively considers the trust degree evaluation factors of three types of participants, according to the different participation modes of online auctioneers, to improve the accuracy, effectiveness, and robustness of the trust degree. Experiments test the efficiency and performance of the model under different scales of malicious users, in environments like eBay and the Sporas model. The experimental results show that the model proposed in this paper makes up for the deficiencies of existing models and has better feasibility.

  1. Computational knowledge integration in biopharmaceutical research.

    PubMed

    Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim

    2003-09-01

    An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.

  2. T7 lytic phage-displayed peptide libraries: construction and diversity characterization.

    PubMed

    Krumpe, Lauren R H; Mori, Toshiyuki

    2014-01-01

    In this chapter, we describe the construction of T7 bacteriophage (phage)-displayed peptide libraries and the diversity analyses of random amino acid sequences obtained from the libraries. We used commercially available reagents, Novagen's T7Select system, to construct the libraries. Using a combination of biotinylated extension primer and streptavidin-coupled magnetic beads, we were able to prepare library DNA without applying gel purification, resulting in extremely high ligation efficiencies. Further, we describe the use of bioinformatics tools to characterize library diversity. Amino acid frequency and positional amino acid diversity and hydropathy are estimated using the REceptor LIgand Contacts website http://relic.bio.anl.gov. Peptide net charge analysis and peptide hydropathy analysis are conducted using the Genetics Computer Group Wisconsin Package computational tools. A comprehensive collection of the estimated number of recombinants and titers of T7 phage-displayed peptide libraries constructed in our lab is included.
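
    The positional diversity metric referenced above can be made concrete with per-position Shannon entropy over an aligned set of library peptides; the sketch below is a generic version of that computation, not the RELIC implementation, and the four peptides are made up.

      import math
      from collections import Counter

      def positional_diversity(peptides):
          """Per-position Shannon entropy (bits) of equal-length peptides;
          higher entropy means a more diverse library position."""
          entropies = []
          for i in range(len(peptides[0])):
              counts = Counter(p[i] for p in peptides)
              total = sum(counts.values())
              entropies.append(-sum((c / total) * math.log2(c / total)
                                    for c in counts.values()))
          return entropies

      print(positional_diversity(["ACDEF", "ACDQF", "PCDEF", "AWDEF"]))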

  3. Extraction and Analysis of Display Data

    NASA Technical Reports Server (NTRS)

    Land, Chris; Moye, Kathryn

    2008-01-01

    The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.

  4. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.

  5. CMS results in the Combined Computing Readiness Challenge CCRC'08

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Bauerdick, L.; CMS Collaboration

    2009-12-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests called the Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - was also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The results achieved in CCRC'08 - focusing on the distributed workflows - are presented and discussed.

  6. Toward a unified account of comprehension and production in language development.

    PubMed

    McCauley, Stewart M; Christiansen, Morten H

    2013-08-01

    Although Pickering & Garrod (P&G) argue convincingly for a unified system for language comprehension and production, they fail to explain how such a system might develop. Using a recent computational model of language acquisition as an example, we sketch a developmental perspective on the integration of comprehension and production. We conclude that only through development can we fully understand the intertwined nature of comprehension and production in adult processing.

  7. The Nature of Computer Assisted Learning.

    ERIC Educational Resources Information Center

    Whiting, John

    Computer assisted learning (CAL) is an old technology which has generated much new interest. Computers can: reduce data to a directly comprehensible form; reduce administration; communicate worldwide and exchange, store, and retrieve data; and teach. The computer's limitation is in its dependence on the user's ability and perceptive nature.…

  8. The Effectiveness of Electronic Text and Pictorial Graphic Organizers to Improve Comprehension Related to Functional Skills

    ERIC Educational Resources Information Center

    Douglas, Karen H.; Ayres, Kevin M.; Langone, John; Bramlett, Virginia Bell

    2011-01-01

    This study evaluated the effects of a computer-based instructional program to assist three students with mild to moderate intellectual disabilities in using pictorial graphic organizers as aids for increasing comprehension of electronic text-based recipes. Student comprehension of recipes was measured by their ability to verbally retell recipe…

  9. A comprehensive approach to identify dominant controls of the behavior of a land surface-hydrology model across various hydroclimatic conditions

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, Amin; Elshamy, Mohamed; Yassin, Fuad; Razavi, Saman; Wheater, Howard; Pietroniro, Al

    2017-04-01

    Complex physically-based environmental models are being increasingly used as the primary tool for watershed planning and management due to advances in computational power and data acquisition. Model sensitivity analysis plays a crucial role in understanding the behavior of these complex models and improving their performance. Due to the non-linearity and interactions within these complex models, global sensitivity analysis (GSA) techniques should be adopted to provide a comprehensive understanding of model behavior and identify its dominant controls. In this study we adopt a multi-basin, multi-criteria GSA approach to systematically assess the behavior of the Modélisation Environmentale-Surface et Hydrologie (MESH) modelling system across various hydroclimatic conditions in Canada, including areas in the Great Lakes Basin, Mackenzie River Basin, and South Saskatchewan River Basin. MESH is a semi-distributed physically-based coupled land surface-hydrology modelling system developed by Environment and Climate Change Canada (ECCC) for various water resources management purposes in Canada. We use a novel method, called Variogram Analysis of Response Surfaces (VARS), to perform sensitivity analysis. VARS is a variogram-based GSA technique that can efficiently provide a spectrum of sensitivity information across a range of scales within the parameter space. We use multiple metrics to identify dominant controls of model response (e.g. streamflow) to model parameters under various conditions such as high flows, low flows, and flow volume. We also investigate the influence of initial conditions on model behavior as part of this study. Our preliminary results suggest that this type of GSA can significantly help with estimating model parameters, decreasing the computational burden of calibration, and reducing prediction uncertainty.

  10. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

    Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
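
    The within-study evaluation described above follows a standard pattern: cross-validate a classifier on quantitative microbiome profiles. The sketch below reproduces that pattern on synthetic stand-in data (random relative abundances and labels), so the reported AUC should hover near chance; it is not the authors' framework.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.dirichlet(np.ones(200), size=120)   # 120 samples x 200 species
      y = rng.integers(0, 2, size=120)            # case/control labels

      clf = RandomForestClassifier(n_estimators=500, random_state=0)
      scores = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
      print(scores.mean())                        # ~0.5 on pure noise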

  11. Systematic pharmacogenomics analysis of a Malay whole genome: proof of concept for personalized medicine.

    PubMed

    Salleh, Mohd Zaki; Teh, Lay Kek; Lee, Lian Shien; Ismet, Rose Iszati; Patowary, Ashok; Joshi, Kandarp; Pasha, Ayesha; Ahmed, Azni Zain; Janor, Roziah Mohd; Hamzah, Ahmad Sazali; Adam, Aishah; Yusoff, Khalid; Hoh, Boon Peng; Hatta, Fazleen Haslinda Mohd; Ismail, Mohamad Izwan; Scaria, Vinod; Sivasubbu, Sridhar

    2013-01-01

    With higher throughput and lower cost in sequencing, second-generation sequencing technology has immense potential for translation into clinical practice and for the realization of pharmacogenomics-based patient care. The systematic analysis of whole genome sequences to assess patient-to-patient variability in pharmacokinetic and pharmacodynamic responses to drugs would be the next step in future medicine, in line with the vision of personalized medicine. Genomic DNA obtained from a 55-year-old, self-declared healthy, anonymous male of Malay descent was sequenced. The subject's mother died of lung cancer and the father had a history of schizophrenia and died at the age of 65. A systematic, intuitive computational workflow/pipeline integrating a custom algorithm in tandem with large datasets of variant annotations and gene functions for genetic variations with pharmacogenomics impact was developed. A comprehensive pathway map of drug transport, metabolism and action was used as a template to map non-synonymous variations with potential functional consequences. Over 3 million known variations and 100,898 novel variations in the Malay genome were identified. Further in-depth pharmacogenetics analysis revealed a total of 607 unique variants in 563 proteins, with the eventual identification of 4 drug transport genes, 2 drug metabolizing enzyme genes and 33 target genes harboring deleterious SNVs involved in pharmacological pathways, which could have a potential role in clinical settings. The current study successfully unravels the potential of personal genome sequencing in understanding the functionally relevant variations with potential influence on drug transport, metabolism and differential therapeutic outcomes. These will be essential for realizing personalized medicine through the use of a comprehensive computational pipeline for systematic data mining and analysis.

  12. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
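
    The variogram idea at the heart of VARS can be illustrated compactly: for a given parameter direction, estimate gamma(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2] over a range of perturbation scales h (IVARS then integrates this curve over h). The sketch below is a conceptual toy with a made-up model, not VARS-TOOL.

      import numpy as np

      def directional_variogram(f, dim, h_values, n_dims, n_pairs=2000, seed=0):
          """Estimate gamma(h) along parameter `dim` for a model f on [0,1]^d."""
          rng = np.random.default_rng(seed)
          gammas = []
          for h in h_values:
              x = rng.random((n_pairs, n_dims))
              xh = x.copy()
              xh[:, dim] += h                       # shifted companion points
              d = np.array([f(a) - f(b) for a, b in zip(xh, x)])
              gammas.append(0.5 * np.mean(d**2))
          return np.array(gammas)

      f = lambda p: np.sin(3 * p[0]) + 0.2 * p[1]   # parameter 0 dominates
      hs = np.linspace(0.01, 0.5, 6)
      print(directional_variogram(f, dim=0, h_values=hs, n_dims=2))  # large
      print(directional_variogram(f, dim=1, h_values=hs, n_dims=2))  # small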

  13. On Target Localization Using Combined RSS and AoA Measurements

    PubMed Central

    Beko, Marko; Dinis, Rui

    2018-01-01

    This work reviews existing solutions to the problem of target localization in wireless sensor networks (WSNs) utilizing integrated measurements, namely received signal strength (RSS) and angle of arrival (AoA). The problem of RSS/AoA-based target localization has recently become very popular in the research community, owing to its great applicability potential and relatively low implementation cost. Therefore, a comprehensive study of the state-of-the-art (SoA) solutions and their detailed analysis is presented. The work begins by considering the SoA approaches based on convex relaxation techniques (more computationally complex in general) and then covers less computationally complex approaches as well, such as those based on the generalized trust region sub-problem framework and linear least squares. A detailed analysis of the computational complexity of each solution is given, and an extensive set of simulation results is presented. Finally, the main conclusions are summarized, and a set of aspects and trends that might be interesting for future research in this area is identified. PMID:29671832
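
    To make the RSS half of the problem concrete, the sketch below solves a small localization instance under the usual log-distance path-loss model with nonlinear least squares. The anchor layout and parameters are fabricated for illustration; AoA measurements would enter as additional residual terms.

      import numpy as np
      from scipy.optimize import least_squares

      anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
      P0, gamma, d0 = -40.0, 3.0, 1.0               # path-loss model parameters
      true_x = np.array([3.0, 7.0])

      rng = np.random.default_rng(1)
      dists = np.linalg.norm(anchors - true_x, axis=1)
      rss = P0 - 10 * gamma * np.log10(dists / d0) + rng.normal(0, 1.0, 4)

      def residuals(x):
          d = np.linalg.norm(anchors - x, axis=1)
          return rss - (P0 - 10 * gamma * np.log10(d / d0))

      est = least_squares(residuals, x0=np.array([5.0, 5.0]))
      print(est.x)                                   # close to (3, 7)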

  14. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the rapid development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architectures tend to provide centralized solutions to end users, with all the required resources offered by large enterprises or special agencies; from the perspective of resource utilization, this is a closed framework. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. An open computing platform is therefore needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access, and utilize those models through cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.

  15. Evaluation of the chondral modeling theory using fe-simulation and numeric shape optimization

    PubMed Central

    Plochocki, Jeffrey H; Ward, Carol V; Smith, Douglas E

    2009-01-01

    The chondral modeling theory proposes that hydrostatic pressure within articular cartilage regulates joint size, shape, and congruence through regional variations in rates of tissue proliferation. The purpose of this study is to develop a computational model using a nonlinear two-dimensional finite element analysis in conjunction with numeric shape optimization to evaluate the chondral modeling theory. The model employed in this analysis is generated from an MR image of the medial portion of the tibiofemoral joint in a subadult male. Stress-regulated morphological changes are simulated until skeletal maturity and evaluated against the chondral modeling theory. The computed results are found to support the chondral modeling theory. The shape-optimized model exhibits increased joint congruence, broader stress distributions in articular cartilage, and a relative decrease in joint diameter. The results for the computational model correspond well with experimental data and provide valuable insights into the mechanical determinants of joint growth. The model also provides a crucial first step toward developing a comprehensive model that can be employed to test the influence of mechanical variables on joint conformation. PMID:19438771

  16. Design and Analysis of a Neuromemristive Reservoir Computing Architecture for Biosignal Processing

    PubMed Central

    Kudithipudi, Dhireesha; Saleh, Qutaiba; Merkel, Cory; Thesing, James; Wysocki, Bryant

    2016-01-01

    Reservoir computing (RC) is gaining traction in several signal processing domains, owing to its non-linear stateful computation, spatiotemporal encoding, and reduced training complexity over recurrent neural networks (RNNs). Previous studies have shown the effectiveness of software-based RCs for a wide spectrum of applications. A parallel body of work indicates that realizing RNN architectures using custom integrated circuits and reconfigurable hardware platforms yields significant improvements in power and latency. In this research, we propose a neuromemristive RC architecture, with doubly twisted toroidal structure, that is validated for biosignal processing applications. We exploit the device mismatch to implement the random weight distributions within the reservoir and propose mixed-signal subthreshold circuits for energy efficiency. A comprehensive analysis is performed to compare the efficiency of the neuromemristive RC architecture in both digital (reconfigurable) and subthreshold mixed-signal realizations. Both Electroencephalogram (EEG) and Electromyogram (EMG) biosignal benchmarks are used for validating the RC designs. The proposed RC architecture demonstrated an accuracy of 90 and 84% for epileptic seizure detection and EMG prosthetic finger control, respectively. PMID:26869876
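
    Hardware aside, the algorithmic core of reservoir computing is small: a fixed random recurrent network plus a trained linear readout. The sketch below is a minimal software echo state network on a toy one-step prediction task; it is meant only to illustrate the computation that the neuromemristive substrate realizes physically, with all sizes chosen arbitrarily.

      import numpy as np

      rng = np.random.default_rng(0)
      n_res, T = 100, 1000
      u = np.sin(np.arange(T) * 0.1) + 0.1 * rng.standard_normal(T)

      W_in = rng.uniform(-0.5, 0.5, n_res)
      W = rng.standard_normal((n_res, n_res))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

      states = np.zeros((T, n_res))
      x = np.zeros(n_res)
      for t in range(T):
          x = np.tanh(W @ x + W_in * u[t])              # fixed reservoir dynamics
          states[t] = x

      # Only the linear readout is trained (ridge regression) to predict u[t+1]
      X, y = states[:-1], u[1:]
      w_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_res), X.T @ y)
      print(np.corrcoef(X @ w_out, y)[0, 1])            # readout fit quality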

  17. SIMAP--a comprehensive database of pre-calculated protein sequence similarities, domains, annotations and clusters.

    PubMed

    Rattei, Thomas; Tischler, Patrick; Götz, Stefan; Jehl, Marc-André; Hoser, Jonathan; Arnold, Roland; Conesa, Ana; Mewes, Hans-Werner

    2010-01-01

    The prediction of protein function as well as the reconstruction of evolutionary genesis employing sequence comparison at large is still the most powerful tool in sequence analysis. Due to the exponential growth of the number of known protein sequences and the subsequent quadratic growth of the similarity matrix, the computation of the Similarity Matrix of Proteins (SIMAP) becomes a computational intensive task. The SIMAP database provides a comprehensive and up-to-date pre-calculation of the protein sequence similarity matrix, sequence-based features and sequence clusters. As of September 2009, SIMAP covers 48 million proteins and more than 23 million non-redundant sequences. Novel features of SIMAP include the expansion of the sequence space by including databases such as ENSEMBL as well as the integration of metagenomes based on their consistent processing and annotation. Furthermore, protein function predictions by Blast2GO are pre-calculated for all sequences in SIMAP and the data access and query functions have been improved. SIMAP assists biologists to query the up-to-date sequence space systematically and facilitates large-scale downstream projects in computational biology. Access to SIMAP is freely provided through the web portal for individuals (http://mips.gsf.de/simap/) and for programmatic access through DAS (http://webclu.bio.wzw.tum.de/das/) and Web-Service (http://mips.gsf.de/webservices/services/SimapService2.0?wsdl).

  18. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. SuperPILOT: A Comprehensive Computer-Assisted Instruction Programming Language for the Apple II Computer.

    ERIC Educational Resources Information Center

    Falleur, David M.

    This presentation describes SuperPILOT, an extended version of Apple PILOT, a programming language for developing computer-assisted instruction (CAI) with the Apple II computer that includes the features of its early PILOT (Programmed Inquiry, Learning or Teaching) ancestors together with new features that make use of the Apple computer's advanced…

  20. NuGO contributions to GenePattern

    PubMed Central

    Reiff, C.; Mayer, C.; Müller, M.

    2008-01-01

    NuGO, the European Nutrigenomics Organization, utilizes 31 powerful computers for, e.g., data storage and analysis. These so-called black boxes (NBXses) are located at the sites of different partners. NuGO decided to use GenePattern as the preferred genomic analysis tool on each NBX. To handle the custom made Affymetrix NuGO arrays, new NuGO modules are added to GenePattern. These NuGO modules execute the latest Bioconductor version ensuring up-to-date annotations and access to the latest scientific developments. The following GenePattern modules are provided by NuGO: NuGOArrayQualityAnalysis for comprehensive quality control, NuGOExpressionFileCreator for import and normalization of data, LimmaAnalysis for identification of differentially expressed genes, TopGoAnalysis for calculation of GO enrichment, and GetResultForGo for retrieval of information on genes associated with specific GO terms. All together, these NuGO modules allow comprehensive, up-to-date, and user friendly analysis of Affymetrix data. A special feature of the NuGO modules is that for analysis they allow the use of either the standard Affymetrix or the MBNI custom CDF-files, which remap probes based on current knowledge. In both cases a .chip-file is created to enable GSEA analysis. The NuGO GenePattern installations are distributed as binary Ubuntu (.deb) packages via the NuGO repository. PMID:19034553

  1. NuGO contributions to GenePattern.

    PubMed

    De Groot, P J; Reiff, C; Mayer, C; Müller, M

    2008-12-01

    NuGO, the European Nutrigenomics Organization, utilizes 31 powerful computers for, e.g., data storage and analysis. These so-called black boxes (NBXses) are located at the sites of different partners. NuGO decided to use GenePattern as the preferred genomic analysis tool on each NBX. To handle the custom made Affymetrix NuGO arrays, new NuGO modules are added to GenePattern. These NuGO modules execute the latest Bioconductor version ensuring up-to-date annotations and access to the latest scientific developments. The following GenePattern modules are provided by NuGO: NuGOArrayQualityAnalysis for comprehensive quality control, NuGOExpressionFileCreator for import and normalization of data, LimmaAnalysis for identification of differentially expressed genes, TopGoAnalysis for calculation of GO enrichment, and GetResultForGo for retrieval of information on genes associated with specific GO terms. All together, these NuGO modules allow comprehensive, up-to-date, and user friendly analysis of Affymetrix data. A special feature of the NuGO modules is that for analysis they allow the use of either the standard Affymetrix or the MBNI custom CDF-files, which remap probes based on current knowledge. In both cases a .chip-file is created to enable GSEA analysis. The NuGO GenePattern installations are distributed as binary Ubuntu (.deb) packages via the NuGO repository.

  2. MIPS: analysis and annotation of proteins from whole genomes in 2005

    PubMed Central

    Mewes, H. W.; Frishman, D.; Mayer, K. F. X.; Münsterkötter, M.; Noubibou, O.; Pagel, P.; Rattei, T.; Oesterheld, M.; Ruepp, A.; Stümpflen, V.

    2006-01-01

    The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system are maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines (i) the development of suitable comprehensive data structures and database technology, communication and query tools to include a wide range of different types of information enabling the representation of complex information such as functional modules or networks (Genome Research Environment System), (ii) the development of databases covering computable information such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix and the CABiNet network analysis framework and (iii) the compilation and manual annotation of information related to interactions such as protein–protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described and the detailed descriptions of our projects can be accessed through the MIPS WWW server (http://mips.gsf.de). PMID:16381839

  3. MIPS: analysis and annotation of proteins from whole genomes in 2005.

    PubMed

    Mewes, H W; Frishman, D; Mayer, K F X; Münsterkötter, M; Noubibou, O; Pagel, P; Rattei, T; Oesterheld, M; Ruepp, A; Stümpflen, V

    2006-01-01

    The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system are maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines (i) the development of suitable comprehensive data structures and database technology, communication and query tools to include a wide range of different types of information enabling the representation of complex information such as functional modules or networks (Genome Research Environment System), (ii) the development of databases covering computable information such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix and the CABiNet network analysis framework and (iii) the compilation and manual annotation of information related to interactions such as protein-protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described and the detailed descriptions of our projects can be accessed through the MIPS WWW server (http://mips.gsf.de).

  4. Predicting and understanding comprehensive drug-drug interactions via semi-nonnegative matrix factorization.

    PubMed

    Yu, Hui; Mao, Kui-Tao; Shi, Jian-Yu; Huang, Hua; Chen, Zhi; Dong, Kai; Yiu, Siu-Ming

    2018-04-11

    Drug-drug interactions (DDIs) often cause unexpected and even adverse drug reactions, so it is important to identify DDIs before drugs reach the market. However, preclinical identification of DDIs requires much money and time. Computational approaches have demonstrated their ability to predict potential DDIs on a large scale by utilizing pre-market drug properties (e.g. chemical structure). Nevertheless, none of them can predict the two comprehensive types of DDIs, enhancive and degressive DDIs, which respectively increase and decrease the activities of the interacting drugs. There is also a lack of systematic analysis of the structural relationships among known DDIs. Revealing such relationships is very important, because it helps to understand how DDIs occur. Both the prediction of comprehensive DDIs and the discovery of structural relationships among them provide important guidance when making a co-prescription. In this work, treating a set of comprehensive DDIs as a signed network, we design a novel model (DDINMF) for the prediction of enhancive and degressive DDIs based on semi-nonnegative matrix factorization. DDINMF achieves good performance on both conventional DDI prediction (AUROC = 0.872 and AUPR = 0.605) and comprehensive DDI prediction (AUROC = 0.796 and AUPR = 0.579), and shows its superiority when compared with two state-of-the-art approaches. Finally, representing DDIs as a binary network and a signed network respectively, an analysis based on NMF reveals crucial knowledge hidden among DDIs. Our approach is able to predict not only conventional binary DDIs but also comprehensive DDIs. More importantly, it reveals several key points about the DDI network: (1) both the binary and signed networks show fairly clear clusters, in which both drug degree and the difference between positive and negative degree show significant distributions; (2) drugs with large degrees tend to have a larger difference between positive and negative degree; (3) although the binary DDI network contains no information about enhancive and degressive DDIs at all, it implies some of their relationships in the comprehensive DDI matrix; (4) the occurrence of signs indicating enhancive and degressive DDIs is not random, because the comprehensive DDI network exhibits structural balance.
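
    For readers unfamiliar with semi-nonnegative matrix factorization, the sketch below implements the generic Ding-style semi-NMF (X ~ F G^T with F unconstrained, so X may carry signs as in a signed DDI matrix, and G >= 0). It is a conceptual illustration of the factorization family underlying DDINMF on fabricated data, not the authors' exact algorithm.

      import numpy as np

      def pos(A): return (np.abs(A) + A) / 2
      def neg(A): return (np.abs(A) - A) / 2

      def semi_nmf(X, k, n_iter=200, seed=0):
          """Semi-NMF multiplicative updates: X ~ F @ G.T, G >= 0."""
          rng = np.random.default_rng(seed)
          G = rng.random((X.shape[1], k))
          for _ in range(n_iter):
              F = X @ G @ np.linalg.pinv(G.T @ G)   # least-squares F step
              XtF, FtF = X.T @ F, F.T @ F
              G *= np.sqrt((pos(XtF) + G @ neg(FtF)) /
                           (neg(XtF) + G @ pos(FtF) + 1e-12))
          return F, G

      X = np.random.default_rng(1).standard_normal((30, 30))
      X = (X + X.T) / 2                             # toy symmetric signed matrix
      F, G = semi_nmf(X, k=5)
      print(np.linalg.norm(X - F @ G.T) / np.linalg.norm(X))  # relative error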

  5. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems

    PubMed Central

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D.

    2016-01-01

    In modern networked control applications, confidentiality and integrity are important features to address in order to protect against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim of enabling secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication Code (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental results for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718

  6. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems.

    PubMed

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D

    2016-07-25

    In modern networked control applications, confidentiality and integrity are important features to address in order to protect against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim of enabling secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication Code (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental results for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems.
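
    The computational-overhead question above is easy to get a feel for in user space: time the tag computation per message. The sketch below is a back-of-the-envelope Python measurement with assumed key and frame sizes, not the paper's kernel-level automotive benchmark.

      import hashlib
      import hmac
      import os
      import time

      key = os.urandom(32)
      payload = os.urandom(64)                      # a small TT-style frame

      t0 = time.perf_counter()
      for _ in range(100_000):
          tag = hmac.new(key, payload, hashlib.sha256).digest()
      dt = (time.perf_counter() - t0) / 100_000
      print(f"~{dt * 1e6:.2f} us per HMAC-SHA256 tag, {len(tag)}-byte tag")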

  7. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
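
    The non-intrusive machinery discussed above can be shown end to end on a one-dimensional toy: project a model response y = f(xi), with xi standard normal, onto probabilists' Hermite polynomials via Gauss-Hermite quadrature, then read the mean and variance off the chaos coefficients. The response function below is an assumption chosen so the exact moments are known for comparison.

      import numpy as np
      from numpy.polynomial.hermite_e import hermegauss, hermeval
      from math import factorial, sqrt, pi

      f = lambda x: np.exp(0.3 * x)                 # toy "simulation" response

      order, nq = 6, 20
      nodes, weights = hermegauss(nq)               # weight function exp(-x^2/2)
      fx = f(nodes)

      coeffs = np.zeros(order + 1)
      for k in range(order + 1):
          He_k = hermeval(nodes, np.eye(order + 1)[k])      # k-th Hermite poly
          coeffs[k] = (weights @ (fx * He_k)) / (sqrt(2 * pi) * factorial(k))

      mean_pc = coeffs[0]
      var_pc = sum(factorial(k) * coeffs[k]**2 for k in range(1, order + 1))
      print(mean_pc, var_pc)                        # chaos-based moments
      print(np.exp(0.045), (np.exp(0.09) - 1) * np.exp(0.09))  # exact values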

  8. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    PubMed Central

    Beatty, Perrin H.; Klein, Matthias S.; Fischer, Jeffrey J.; Lewis, Ian A.; Muench, Douglas G.; Good, Allen G.

    2016-01-01

    A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields. PMID:27735856
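
    Flux balance analysis, mentioned above, reduces to a linear program: maximize a target flux subject to steady-state mass balance S v = 0 and capacity bounds. The three-reaction toy network below is fabricated for illustration and is not a real nitrogen metabolism model.

      import numpy as np
      from scipy.optimize import linprog

      # R1: uptake -> A      R2: A -> B      R3 (biomass): B ->
      S = np.array([[1, -1,  0],     # metabolite A balance
                    [0,  1, -1]])    # metabolite B balance
      c = np.array([0, 0, -1.0])     # linprog minimizes, so negate biomass flux
      bounds = [(0, 10), (0, 8), (0, None)]   # uptake <= 10, R2 <= 8

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print(res.x)                   # [8, 8, 8]: limited by the tightest bound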

  9. Aerodynamic preliminary analysis system. Part 2: User's manual and program description

    NASA Technical Reports Server (NTRS)

    Divan, P.; Dunn, K.; Kojima, J.

    1978-01-01

    A comprehensive aerodynamic analysis program based on linearized potential theory is described. The solution treats thickness and attitude problems at subsonic and supersonic speeds. Three-dimensional configurations, with or without jet flaps, having multiple nonplanar surfaces of arbitrary planform and open or closed slender bodies of noncircular contour are analyzed. Longitudinal and lateral-directional static and rotary derivative solutions are generated. The analysis is implemented on a time-sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis problem. A nominal case computation time of 45 CPU seconds on the CDC 175 for a 200-panel simulation indicates that the program provides an efficient analysis for systematically performing various aerodynamic configuration tradeoff and evaluation studies.

  10. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    …project and scientific advances made towards each of the research thrusts throughout the project duration. … Cloud computing enables … possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task … Security considerations, however, stand in the way of harnessing the full benefits of cloud computing and prevent clients from …

  11. Systems cell biology.

    PubMed

    Mast, Fred D; Ratushny, Alexander V; Aitchison, John D

    2014-09-15

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.

  12. Automatic design of optical systems by digital computer

    NASA Technical Reports Server (NTRS)

    Casad, T. A.; Schmidt, L. F.

    1967-01-01

    Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.

  13. A simulation model for wind energy storage systems. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Chan, Y. K.

    1977-01-01

    A comprehensive computer program for the modeling of wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) was developed. The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind source/storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for convergence of variables. The SIMWEST program, as described, runs on UNIVAC 1100 series computers.

  14. Computational methods to extract meaning from text and advance theories of human cognition.

    PubMed

    McNamara, Danielle S

    2011-01-01

    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.

  15. Wall Shear Stress Distribution in a Patient-Specific Cerebral Aneurysm Model using Reduced Order Modeling

    NASA Astrophysics Data System (ADS)

    Han, Suyue; Chang, Gary Han; Schirmer, Clemens; Modarres-Sadeghi, Yahya

    2016-11-01

    We construct a reduced-order model (ROM) to study Wall Shear Stress (WSS) distributions in image-based, patient-specific aneurysm models. The magnitude of WSS has been shown to be a critical factor in the growth and rupture of human aneurysms. We start the process by running a training case using a Computational Fluid Dynamics (CFD) simulation with time-varying flow parameters chosen so that they cover the range of parameters of interest. The method of snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases from the training CFD simulation. The resulting ROM enables us to study the flow patterns and the WSS distributions over a range of system parameters very efficiently with a relatively small number of modes. This enables comprehensive analysis of the model system across a range of physiological conditions without the need to re-compute the simulation for small changes in the system parameters.
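
    A minimal snapshot-POD sketch (with synthetic data standing in for the CFD snapshots; the sizes and the 99% energy threshold are invented) shows the mechanics of building the reduced-order basis via the SVD:

      import numpy as np

      rng = np.random.default_rng(1)
      n_points, n_snapshots = 2000, 50   # hypothetical mesh size / snapshot count
      X = rng.standard_normal((n_points, 3)) @ rng.standard_normal((3, n_snapshots))
      X += 0.01 * rng.standard_normal((n_points, n_snapshots))   # noise floor

      X_mean = X.mean(axis=1, keepdims=True)
      U, s, Vt = np.linalg.svd(X - X_mean, full_matrices=False)  # POD modes in U

      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.99)) + 1   # modes for 99% snapshot energy
      print(f"{r} POD modes retain {energy[r-1]:.4f} of the energy")

      # Reduced-order reconstruction of snapshot j from r modal coefficients:
      j = 0
      a = U[:, :r].T @ (X[:, [j]] - X_mean)
      x_rom = X_mean + U[:, :r] @ a
      err = np.linalg.norm(x_rom - X[:, [j]]) / np.linalg.norm(X[:, [j]])
      print("relative reconstruction error:", err)

    In the study's setting, the WSS field for new parameter values is then expressed in this small basis instead of re-running the full CFD solver.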

  16. Computers and the Primary Curriculum 3-13.

    ERIC Educational Resources Information Center

    Crompton, Rob, Ed.

    This book is a comprehensive and practical guide to the use of computers across a wide age range. Extensive use is made of photographs, illustrations, cartoons, and samples of children's work to demonstrate the versatility of computer use in schools. An introduction by Rob Crompton placing computer use within the educational context of the United…

  17. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  18. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  19. Effects of Strength of Accent on an L2 Interactive Lecture Listening Comprehension Test

    ERIC Educational Resources Information Center

    Ockey, Gary J.; Papageorgiou, Spiros; French, Robert

    2016-01-01

    This article reports on a study which aimed to determine the effect of strength of accent on listening comprehension of interactive lectures. Test takers (N = 21,726) listened to an interactive lecture given by one of nine speakers and responded to six comprehension items. The test taker responses were analyzed with the Rasch computer program…

  20. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies of mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University and Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
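
    A simplified, hypothetical expected-loss computation in the spirit of CSES (the actual system chains further matrices for component dependencies and threat emergence; all figures below are invented):

      import numpy as np

      stakes = np.array([          # $ lost per requirement failure, per stakeholder
          [8000, 2000,  500],      # stakeholder A
          [1000, 9000, 3000],      # stakeholder B
      ])
      p_fail = np.array([0.01, 0.002, 0.05])   # failure probability per requirement

      mean_failure_cost = stakes @ p_fail      # expected $ loss per stakeholder
      print(mean_failure_cost)                 # -> [109. 178.]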

  1. Software to model AXAF-I image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  2. Statistical models of lunar rocks and regolith

    NASA Technical Reports Server (NTRS)

    Marcus, A. H.

    1973-01-01

    The mathematical, statistical, and computational approaches used in the investigation of the interrelationship of lunar fragmental material, regolith, lunar rocks, and lunar craters are described. The first two phases of the work explored the sensitivity of the production model of fragmental material to mathematical assumptions, and then completed earlier studies on the survival of lunar surface rocks with respect to competing processes. The third phase combined earlier work into a detailed statistical analysis and probabilistic model of regolith formation by lithologically distinct layers, interpreted as modified crater ejecta blankets. The fourth phase of the work dealt with problems encountered in combining the results of the entire project into a comprehensive, multipurpose computer simulation model for the craters and regolith. Highlights of each phase of research are given.

  3. Excited state dynamics of thiophene and bithiophene: new insights into theoretically challenging systems.

    PubMed

    Prlj, Antonio; Curchod, Basile F E; Corminboeuf, Clémence

    2015-06-14

    The computational elucidation and proper description of the ultrafast deactivation mechanisms of simple organic electronic units, such as thiophene and its oligomers, is as challenging as it is contentious. A comprehensive excited state dynamics analysis of these systems utilizing reliable electronic structure approaches is currently lacking, with earlier pictures of the photochemistry of these systems having been conceived on the basis of high-level static computations or lower-level dynamics trajectories. Here a detailed surface hopping molecular dynamics study of thiophene and bithiophene using the algebraic diagrammatic construction to second order (ADC(2)) method is presented. Our findings illustrate that ring puckering plays an important role in thiophene photochemistry and that photostability increases upon dimerization into bithiophene.

  4. Process and representation in graphical displays

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne

    1993-01-01

    Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph?; and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use?; and what are the specific processing skills required?

  5. A Plan for Community College Instructional Computing.

    ERIC Educational Resources Information Center

    Howard, Alan; And Others

    This document presents a comprehensive plan for future growth in instructional computing in the Washington community colleges. Two chapters define the curriculum objectives and content recommended for instructional courses in the community colleges which require access to computing facilities. The courses described include data processing…

  6. The Comprehension Problems of Children with Poor Reading Comprehension Despite Adequate Decoding: A Meta-Analysis

    ERIC Educational Resources Information Center

    Spencer, Mercedes; Wagner, Richard K.

    2018-01-01

    The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language…

  7. Human Expertise Helps Computer Classify Images

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1991-01-01

    Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.

  8. System Proposal for Mass Transit Service Quality Control Based on GPS Data

    PubMed Central

    Padrón, Gabino; Cristóbal, Teresa; Alayón, Francisco; Quesada-Arencibia, Alexis; García, Carmelo R.

    2017-01-01

    Quality is an essential aspect of public transport. In the case of regular public passenger transport by road, punctuality and regularity are criteria used to assess quality of service. Calculating metrics related to these criteria continuously over time and comprehensively across the entire transport network requires the handling of large amounts of data. This article describes a system for continuously and comprehensively monitoring punctuality and regularity. The system uses location data acquired continuously in the vehicles and automatically transferred for analysis. These data are processed intelligently by elements that are commonly used by transport operators: GPS-based tracking system, onboard computer and wireless networks for mobile data communications. The system was tested on a transport company, for which we measured the punctuality of one of the routes that it operates; the results are presented in this article. PMID:28621745

  9. System Proposal for Mass Transit Service Quality Control Based on GPS Data.

    PubMed

    Padrón, Gabino; Cristóbal, Teresa; Alayón, Francisco; Quesada-Arencibia, Alexis; García, Carmelo R

    2017-06-16

    Quality is an essential aspect of public transport. In the case of regular public passenger transport by road, punctuality and regularity are criteria used to assess quality of service. Calculating metrics related to these criteria continuously over time and comprehensively across the entire transport network requires the handling of large amounts of data. This article describes a system for continuously and comprehensively monitoring punctuality and regularity. The system uses location data acquired continuously in the vehicles and automatically transferred for analysis. These data are processed intelligently by elements that are commonly used by transport operators: GPS-based tracking system, onboard computer and wireless networks for mobile data communications. The system was tested on a transport company, for which we measured the punctuality of one of the routes that it operates; the results are presented in this article.
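
    A hedged sketch of the punctuality metric described in the two records above (the tolerance window and the data layout are invented for illustration; the article defines its own criteria):

      from datetime import datetime

      # (scheduled, observed) departure times at timing points, from GPS events
      stop_events = [
          (datetime(2017, 6, 1, 8, 0),  datetime(2017, 6, 1, 8, 2)),
          (datetime(2017, 6, 1, 8, 15), datetime(2017, 6, 1, 8, 14)),
          (datetime(2017, 6, 1, 8, 30), datetime(2017, 6, 1, 8, 41)),
      ]

      def is_punctual(scheduled, observed, early_s=60, late_s=300):
          # punctual if departure is at most 60 s early or 300 s late
          dev = (observed - scheduled).total_seconds()
          return -early_s <= dev <= late_s

      punctuality = sum(is_punctual(s, o) for s, o in stop_events) / len(stop_events)
      print(f"punctuality: {punctuality:.0%}")   # -> 67%

    Regularity metrics follow the same pattern but compare observed headways between consecutive vehicles against the scheduled headway.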

  10. Automated quantitative assessment of proteins' biological function in protein knowledge bases.

    PubMed

    Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter

    2008-01-01

    Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection, and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report on a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence while highlighting the extent to which their functional data are comprehended. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.

  11. Process computerization of No. 13 blast furnace at Gary works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherman, G.J.; Zmierski, M.L.; Hyle, F.W.

    1993-10-01

    No. 13 blast furnace underwent extensive modifications to the process control system during the second reline. This represents a major shift in control philosophy from hardwired relays and analog controllers to a totally integrated computer control system. The new system created the opportunity for comprehensive diagnostic analysis and increased flexibility for control strategy modifications. The goal of achieving maximum production with minimal delay was accomplished by integrated testing and comprehensive operator and maintenance training. Normal production was reached within four days of blow-in, and the design production capacity of 7800 NTHM/day was met in the third month of operation. Record furnace productivity of 8931 NTHM/day (8.56 NTHM/100 cfwv) was achieved in less than five months of operation, and a further record of 9062 NTHM/day (8.68 NTHM/100 cfwv) was set in April 1992.

  12. Partial polarization: a comprehensive student exercise

    NASA Astrophysics Data System (ADS)

    Topasna, Gregory A.; Topasna, Daniela M.

    2015-10-01

    We present a comprehensive student exercise in partial polarization. Students are first introduced to the concept of partial polarization using the Fresnel equations. Next, MATHCAD is used to compute and graph the reflectance for dielectric materials. The students then design and construct a simple, easy-to-use collimated light source for their experiment, which is performed on an optical breadboard using optical components typically found in an optics lab above the introductory level. The students obtain reflection data that are compared with their model by a nonlinear least-squares fit using EXCEL. Sources of error and uncertainty are discussed, and students present a final written report. In this one exercise students learn how an experiment is constructed "from the ground up". They gain practical experience in data modeling and analysis, working with optical equipment, machining and construction, and preparing a final presentation.

  13. Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity: Performance of the "i-ROP" System and Image Features Associated With Expert Diagnosis.

    PubMed

    Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F

    2015-11-01

    We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease diagnosis suggests that it may be feasible to develop a fully automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has the potential to complement clinical ROP diagnosis by ophthalmologists.
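
    The record does not give the exact feature formulas, but a common point-based tortuosity measure of the kind used in such systems is the arc length of a vessel centerline divided by its chord length (a hypothetical stand-in for the i-ROP features):

      import numpy as np

      def tortuosity(points):
          # points: (N, 2) array of ordered centerline coordinates
          seg = np.diff(points, axis=0)
          arc_length = np.sum(np.linalg.norm(seg, axis=1))
          chord = np.linalg.norm(points[-1] - points[0])
          return arc_length / chord   # 1.0 for a straight vessel, >1 if tortuous

      t = np.linspace(0, 2 * np.pi, 200)
      wavy = np.column_stack([t, 0.3 * np.sin(3 * t)])   # synthetic vessel
      print(f"tortuosity: {tortuosity(wavy):.3f}")       # > 1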

  14. Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae.

    PubMed

    Toma, Milan; Bloodworth, Charles H; Pierce, Eric L; Einstein, Daniel R; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2017-03-01

    The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations.

  15. Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae

    PubMed Central

    Toma, Milan; Bloodworth, Charles H.; Pierce, Eric L.; Einstein, Daniel R.; Cochran, Richard P.; Yoganathan, Ajit P.; Kunzelman, Karyn S.

    2016-01-01

    The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations. PMID:27624659

  16. Numerosity as a topological invariant.

    PubMed

    Kluth, Tobias; Zetzsche, Christoph

    2016-01-01

    The ability to quickly recognize the number of objects in our environment is a fundamental cognitive function. However, it is far from clear which computations and which actual neural processing mechanisms are used to provide us with such a skill. Here we try to provide a detailed and comprehensive analysis of this issue, which comprises both the basic mathematical foundations and the peculiarities imposed by the structure of the visual system and by the neural computations provided by the visual cortex. We suggest that numerosity should be considered as a mathematical invariant. Making use of concepts from mathematical topology--like connectedness, Betti numbers, and the Gauss-Bonnet theorem--we derive the basic computations suited for the computation of this invariant. We show that the computation of numerosity is possible in a neurophysiologically plausible fashion using only computational elements which are known to exist in the visual cortex. We further show that a fundamental feature of numerosity perception, its Weber property, arises naturally, assuming noise in the basic neural operations. The model is tested on an extended data set (made publicly available). It is hoped that our results can provide a general framework for future research on the invariance properties of the numerosity system.
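
    The paper's core claim, that numerosity can be computed as the 0th Betti number (the count of connected components), is easy to demonstrate outside the neural setting; here scipy's component labeling stands in for the proposed cortical operations:

      import numpy as np
      from scipy import ndimage

      scene = np.zeros((20, 20), dtype=int)   # binary "scene" with three objects
      scene[2:5, 2:5] = 1
      scene[10:14, 6:9] = 1
      scene[15:18, 15:19] = 1

      labels, numerosity = ndimage.label(scene)   # b0 = number of components
      print("numerosity:", numerosity)            # -> 3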

  17. Administrative Uses of Computers in the Schools.

    ERIC Educational Resources Information Center

    Bluhm, Harry P.

    This book, intended for school administrators, provides a comprehensive account of how computer information systems can enable administrators at both middle and top management levels to manage the educational enterprise. It can be used as a textbook in an educational administration course emphasizing computer technology in education, an…

  18. Demographic Computer Library.

    ERIC Educational Resources Information Center

    Shaw, David C.; Johnson, Dorothy M.

    The complete comprehension of this paper requires a firm grasp of both mathematical demography and FORTRAN programming. The paper aims at the establishment of a language with which complex demographic manipulations can be briefly expressed in a form intelligible both to demographic analysts and to computers. The Demographic Computer Library (DCL)…

  19. Care and Handling of Computer Magnetic Storage Media.

    ERIC Educational Resources Information Center

    Geller, Sidney B.

    Intended for use by data processing installation managers, operating personnel, and technical staff, this publication provides a comprehensive set of care and handling guidelines for the physical/chemical preservation of computer magnetic storage media--principally computer magnetic tapes--and their stored data. Emphasis is placed on media…

  20. The Reality of National Computer Networking for Higher Education. Proceedings of the 1978 EDUCOM Fall Conference. EDUCOM Series in Computing and Telecommunications in Higher Education 3.

    ERIC Educational Resources Information Center

    Emery, James C., Ed.

    A comprehensive review of the current status, prospects, and problems of computer networking in higher education is presented from the perspectives of both computer users and network suppliers. Several areas of computer use are considered including applications for instruction, research, and administration in colleges and universities. In the…

  1. Differential network analysis reveals the genome-wide landscape of estrogen receptor modulation in hormonal cancers

    PubMed Central

    Hsiao, Tzu-Hung; Chiu, Yu-Chiao; Hsu, Pei-Yin; Lu, Tzu-Pin; Lai, Liang-Chuan; Tsai, Mong-Hsun; Huang, Tim H.-M.; Chuang, Eric Y.; Chen, Yidong

    2016-01-01

    Several mutual information (MI)-based algorithms have been developed to identify dynamic gene-gene and function-function interactions governed by key modulators (genes, proteins, etc.). Due to intensive computation, however, these methods rely heavily on prior knowledge and are limited in genome-wide analysis. We present the modulated gene/gene set interaction (MAGIC) analysis to systematically identify genome-wide modulation of interaction networks. Based on a novel statistical test employing conjugate Fisher transformations of correlation coefficients, MAGIC features fast computation and adaption to variations of clinical cohorts. In simulated datasets MAGIC achieved greatly improved computation efficiency and overall superior performance than the MI-based method. We applied MAGIC to construct the estrogen receptor (ER) modulated gene and gene set (representing biological function) interaction networks in breast cancer. Several novel interaction hubs and functional interactions were discovered. ER+ dependent interaction between TGFβ and NFκB was further shown to be associated with patient survival. The findings were verified in independent datasets. Using MAGIC, we also assessed the essential roles of ER modulation in another hormonal cancer, ovarian cancer. Overall, MAGIC is a systematic framework for comprehensively identifying and constructing the modulated interaction networks in a whole-genome landscape. MATLAB implementation of MAGIC is available for academic uses at https://github.com/chiuyc/MAGIC. PMID:26972162
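
    The core statistical move, comparing a gene-gene correlation between two cohorts through Fisher's z-transformation, can be sketched as a standard two-sample z-test (the MAGIC statistic itself is more elaborate, and the numbers below are invented):

      import numpy as np
      from scipy import stats

      def fisher_z(r):
          return np.arctanh(r)   # 0.5 * log((1 + r) / (1 - r))

      def differential_correlation_test(r1, n1, r2, n2):
          # z-test for H0: rho1 == rho2, given sample correlations and sizes
          z = (fisher_z(r1) - fisher_z(r2)) / np.sqrt(1/(n1 - 3) + 1/(n2 - 3))
          p = 2 * stats.norm.sf(abs(z))
          return z, p

      # e.g., a gene pair correlated at r = 0.6 in ER-high samples vs r = 0.1
      # in ER-low samples:
      z, p = differential_correlation_test(0.6, 120, 0.1, 110)
      print(f"z = {z:.2f}, p = {p:.2e}")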

  2. Isotopically Nonstationary Metabolic Flux Analysis (INST-MFA) of Photosynthesis and Photorespiration in Plants.

    PubMed

    Ma, Fangfang; Jazmin, Lara J; Young, Jamey D; Allen, Doug K

    2017-01-01

    Photorespiration is a central component of photosynthesis; however, to better understand its role it should be viewed in the context of an integrated metabolic network rather than a series of individual reactions that operate independently. Isotopically nonstationary 13C metabolic flux analysis (INST-MFA), which is based on transient labeling studies at metabolic steady state, offers a comprehensive platform to quantify plant central metabolism. In this chapter, we describe the application of INST-MFA to investigate metabolism in leaves. Leaves are an autotrophic tissue, assimilating CO2 over a diurnal period, implying that the metabolic steady state is limited to less than 12 h and thus requiring an INST-MFA approach. This strategy results in a comprehensive unified description of photorespiration, Calvin cycle, sucrose and starch synthesis, tricarboxylic acid (TCA) cycle, and amino acid biosynthetic fluxes. We present protocols for the experimental aspects of the labeling studies: transient 13CO2 labeling of leaf tissue, sample quenching and extraction, mass spectrometry (MS) analysis of isotopic labeling data, and measurement of sucrose and amino acids in vascular exudates, and we provide details on the computational flux estimation using INST-MFA.

  3. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the systems biology markup language (SBML) and graphical notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0 based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025

  4. Analysis of a Hovering Rotor in Icing Conditions

    NASA Technical Reports Server (NTRS)

    Narducci, Robert; Kreeger, Richard E.

    2012-01-01

    A high fidelity analysis method is proposed to evaluate the ice accumulation and the ensuing rotor performance degradation for a helicopter flying through an icing cloud. The process uses computational fluid dynamics (CFD) coupled to a rotorcraft comprehensive code to establish the aerodynamic environment of a trimmed rotor prior to icing. Based on local aerodynamic conditions along the rotor span and accounting for the azimuthal variation, an ice accumulation analysis using NASA's Lewice3D code is made to establish the ice geometry. Degraded rotor performance is quantified by repeating the high fidelity rotor analysis with updates which account for ice shape and mass. The process is applied on a full-scale UH-1H helicopter in hover using data recorded during the Helicopter Icing Flight Test Program.

  5. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System. The simulator is designed for running on parellel computers and distributed (networked) computer systems, but ca...

  6. Information transfer satellite concept study. Volume 4: computer manual

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.
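
    A toy steepest-descent loop over an invented two-variable quadratic surrogate for total system cost (the variable names and coefficients are illustrative only; STAMP's cost model and constraint handling are far more detailed):

      import numpy as np

      def total_cost(x):
          antenna_dia, tx_power = x     # notional spacecraft design variables
          return (2.0 * (antenna_dia - 3.0)**2 + 0.5 * (tx_power - 8.0)**2
                  + 0.3 * antenna_dia * tx_power)   # subsystem coupling term

      def grad(f, x, h=1e-6):           # central-difference gradient
          g = np.zeros_like(x)
          for i in range(x.size):
              e = np.zeros_like(x); e[i] = h
              g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
          return g

      x = np.array([1.0, 1.0])
      for _ in range(500):              # fixed step, safe for this smooth cost
          x = x - 0.1 * grad(total_cost, x)
      print("design point:", x, "cost:", total_cost(x))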

  7. Health information technology and physician-patient interactions: impact of computers on communication during outpatient primary care visits.

    PubMed

    Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard

    2005-01-01

    The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01-2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01-2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05-2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06-2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28-2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits.

  8. Computer-generated vs. physician-documented history of present illness (HPI): results of a blinded comparison.

    PubMed

    Almario, Christopher V; Chey, William; Kaung, Aung; Whitman, Cynthia; Fuller, Garth; Reid, Mark; Nguyen, Ken; Bolus, Roger; Dennis, Buddy; Encarnacion, Rey; Martinez, Bibiana; Talley, Jennifer; Modi, Rushaba; Agarwal, Nikhil; Lee, Aaron; Kubomoto, Scott; Sharma, Gobind; Bolus, Sally; Chang, Lin; Spiegel, Brennan M R

    2015-01-01

    Healthcare delivery now mandates shorter visits with higher documentation requirements, undermining the patient-provider interaction. To improve clinic visit efficiency, we developed a patient-provider portal that systematically collects patient symptoms using a computer algorithm called Automated Evaluation of Gastrointestinal Symptoms (AEGIS). AEGIS also automatically "translates" the patient report into a full narrative history of present illness (HPI). We aimed to compare the quality of computer-generated vs. physician-documented HPIs. We performed a cross-sectional study with a paired sample design among individuals visiting outpatient adult gastrointestinal (GI) clinics for evaluation of active GI symptoms. Participants first underwent usual care and then subsequently completed AEGIS. Each individual thereby had both a physician-documented and a computer-generated HPI. Forty-eight blinded physicians assessed HPI quality across six domains using 5-point scales: (i) overall impression, (ii) thoroughness, (iii) usefulness, (iv) organization, (v) succinctness, and (vi) comprehensibility. We compared HPI scores within patient using a repeated measures model. Seventy-five patients had both computer-generated and physician-documented HPIs. The mean overall impression score for computer-generated HPIs was higher than physician HPIs (3.68 vs. 2.80; P<0.001), even after adjusting for physician and visit type, location, mode of transcription, and demographics. Computer-generated HPIs were also judged more complete (3.70 vs. 2.73; P<0.001), more useful (3.82 vs. 3.04; P<0.001), better organized (3.66 vs. 2.80; P<0.001), more succinct (3.55 vs. 3.17; P<0.001), and more comprehensible (3.66 vs. 2.97; P<0.001). Computer-generated HPIs were of higher overall quality, better organized, and more succinct, comprehensible, complete, and useful compared with HPIs written by physicians during usual care in GI clinics.

  9. Four PPPPerspectives on computational creativity in theory and in practice

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2016-04-01

    Computational creativity is the modelling, simulating or replicating of creativity computationally. In examining and learning from these "creative systems", from what perspective should the creativity of a system be considered? Are we interested in the creativity of the system's output? Or of its creative processes? Features of the system? Or how it operates within its environment? Traditionally computational creativity has focused more on creative systems' products or processes, though this focus has widened recently. Creativity research offers the Four Ps of creativity: Person/Producer, Product, Process and Press/Environment. This paper presents the Four Ps, explaining each in the context of creativity research and how it relates to computational creativity. To illustrate the usefulness of the Four Ps in taking broader perspectives on creativity in its computational treatment, the concepts of novelty and value are explored using the Four Ps, highlighting aspects of novelty and value that may otherwise be overlooked. Analysis of recent research in computational creativity finds that although each of the Four Ps appears in the body of computational creativity work, individual pieces of work often do not acknowledge all Four Ps, missing opportunities to widen their work's relevance. We can see, though, that high-status computational creativity papers do typically address all Four Ps. This paper argues that the broader views of creativity afforded by the Four Ps are vital in guiding us towards more comprehensively useful computational investigations of creativity.

  10. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  11. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis, as well as data standardization and data publication. All methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides multiple fast communication channels for access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  12. I Use the Computer to ADVANCE Advances in Comprehension-Strategy Research.

    ERIC Educational Resources Information Center

    Blohm, Paul J.

    Merging the instructional implications drawn from theory and research on the interactive reading model, schemata, and metacognition with computer-based instruction seems a natural approach for actively involving students in reading and learning from text. Computer-based graphic organizers guide students' preview or review of lengthy…

  13. A Framework for Understanding Physics Students' Computational Modeling Practices

    ERIC Educational Resources Information Center

    Lunk, Brandon Robert

    2012-01-01

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…

  14. Learner Assessment Methods Using a Computer Based Interactive Videodisc System.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper focuses on item design considerations faced by instructional designers and evaluators when using computer videodisc delivery systems as a means of assessing learner comprehension and competencies. Media characteristics of various interactive computer/videodisc training systems are briefly discussed as well as reasons for using such…

  15. Chapter 11. Quality evaluation of apple by computer vision

    USDA-ARS?s Scientific Manuscript database

    Apple is one of the most consumed fruits in the world, and there is a critical need for enhanced computer vision technology for quality assessment of apples. This chapter gives a comprehensive review on recent advances in various computer vision techniques for detecting surface and internal defects ...

  16. Aerodynamic preliminary analysis system. Part 1: Theory. [linearized potential theory

    NASA Technical Reports Server (NTRS)

    Bonner, E.; Clever, W.; Dunn, K.

    1978-01-01

    A comprehensive aerodynamic analysis program based on linearized potential theory is described. The solution treats thickness and attitude problems at subsonic and supersonic speeds. Three dimensional configurations with or without jet flaps having multiple non-planar surfaces of arbitrary planform and open or closed slender bodies of non-circular contour may be analyzed. Longitudinal and lateral-directional static and rotary derivative solutions may be generated. The analysis was implemented on a time sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis problem. Nominal case computation time of 45 CPU seconds on the CDC 175 for a 200 panel simulation indicates the program provides an efficient analysis for systematically performing various aerodynamic configuration tradeoff and evaluation studies.

  17. Integrated Composite Analyzer (ICAN): Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Chamis, C. C.

    1986-01-01

    The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.

  18. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    NASA Astrophysics Data System (ADS)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to significant idle time of computational resources and, in turn, slows the pace of scientific research. This paper presents three approaches to studying the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach analyzes computing resource utilization statistics, which makes it possible to identify different typical classes of programs, to explore the structure of the supercomputer job flow, and to track overall trends in supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since the efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are detected. For each approach, results obtained in practice at the Supercomputer Center of Moscow State University are demonstrated.
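
    A minimal sketch of the third approach, flagging abnormally inefficient jobs with robust z-scores on a per-job utilization metric (the metric, data, and threshold are invented, not those used at the MSU center):

      import numpy as np

      cpu_util = np.array([0.82, 0.79, 0.91, 0.85, 0.05, 0.88, 0.77, 0.12, 0.84])

      median = np.median(cpu_util)
      mad = np.median(np.abs(cpu_util - median))    # robust spread estimate
      robust_z = 0.6745 * (cpu_util - median) / mad

      abnormal = np.flatnonzero(robust_z < -3.5)    # far below typical usage
      print("abnormal job indices:", abnormal)      # -> [4 7]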

  19. Variogram Analysis of Response surfaces (VARS): A New Framework for Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2015-12-01

    Earth and environmental systems models (EESMs) are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. Complexity and dimensionality are manifested by introducing many different factors in EESMs (i.e., model parameters, forcings, boundary conditions, etc.) to be identified. Sensitivity Analysis (SA) provides an essential means for characterizing the role and importance of such factors in producing the model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to 'variogram analysis', that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are limiting cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
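
    The variogram idea at the heart of VARS can be sketched directly (this is the directional variogram gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2], not the full STAR-VARS algorithm; the toy model is invented):

      import numpy as np

      def model(x):                     # toy 2-factor response surface
          return np.sin(3.0 * x[..., 0]) + 0.1 * x[..., 1]**2

      rng = np.random.default_rng(2)
      base = rng.uniform(0.0, 1.0, size=(5000, 2))   # base points in [0, 1]^2

      for i in range(2):
          for h in (0.1, 0.3):
              step = np.zeros(2); step[i] = h
              gamma = 0.5 * np.mean((model(base + step) - model(base))**2)
              print(f"factor {i}, scale h = {h}: gamma = {gamma:.4f}")
      # The sin factor shows much larger variogram values at both scales, i.e.,
      # it dominates the sensitivity across the factor space.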

  20. Relating dynamic brain states to dynamic machine states: Human and machine solutions to the speech recognition problem

    PubMed Central

    Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth

    2017-01-01

    There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744

  1. Computer Game Play as an Imaginary Stage for Reading: Implicit Spatial Effects of Computer Games Embedded in Hard Copy Books

    ERIC Educational Resources Information Center

    Smith, Glenn Gordon

    2012-01-01

    This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games; the other group read it in regular book format…

  2. An Assessment of Security Vulnerabilities Comprehension of Cloud Computing Environments: A Quantitative Study Using the Unified Theory of Acceptance and Use

    ERIC Educational Resources Information Center

    Venkatesh, Vijay P.

    2013-01-01

    The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…

  3. Synthesizing Results From Empirical Research on Computer-Based Scaffolding in STEM Education: A Meta-Analysis.

    PubMed

    Belland, Brian R; Walker, Andrew E; Kim, Nam Ju; Lefler, Mason

    2017-04-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has synthesized the results of these studies. This review addresses that need by synthesizing the results of 144 experimental studies (333 outcomes) on the effects of computer-based scaffolding designed to assist the full range of STEM learners (primary through adult education) as they navigated ill-structured, problem-centered curricula. Results of our random-effects meta-analysis (a) indicate that computer-based scaffolding showed a consistently positive (ḡ = 0.46) effect on cognitive outcomes across various contexts of use, scaffolding characteristics, and levels of assessment and (b) shed light on many scaffolding debates, including the roles of customization (i.e., fading and adding) and context-specific support. Specifically, scaffolding's influence on cognitive outcomes did not vary on the basis of context-specificity, presence or absence of scaffolding change, and logic by which scaffolding change is implemented. Scaffolding's influence was greatest when measured at the principles level and among adult learners. Still, scaffolding's effect was substantial and significantly greater than zero across all age groups and assessment levels. These results suggest that scaffolding is a highly effective intervention across levels of these characteristics and can be designed in many different ways while remaining highly effective.
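
    The pooled effect (ḡ = 0.46) comes from a random-effects model. As a hedged illustration of how such a pooled estimate is formed (not the authors' analysis code), the sketch below implements the classic DerSimonian-Laird estimator; the input effect sizes and variances are invented toy values.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size.

    effects: per-study Hedges' g values; variances: their sampling variances.
    Returns the pooled estimate and its standard error.
    """
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

print(random_effects_pool([0.3, 0.5, 0.6, 0.4], [0.02, 0.05, 0.04, 0.03]))
```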

  4. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
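
    The retrieval pipeline named here (feature representation, indexing, searching) reduces to a compact core. The following is a generic cosine-similarity retrieval loop, not any specific system from the review; the 128-dimensional features and corpus size are placeholders.

```python
import numpy as np

def build_index(feature_matrix):
    """L2-normalize stored feature vectors so dot product = cosine similarity."""
    norms = np.linalg.norm(feature_matrix, axis=1, keepdims=True)
    return feature_matrix / np.clip(norms, 1e-12, None)

def search(index, query, top_k=5):
    """Return indices of the top_k most similar stored images."""
    q = query / np.linalg.norm(query)
    scores = index @ q                      # cosine similarities
    return np.argsort(-scores)[:top_k]

rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 128))  # e.g. CNN descriptors, one per image
index = build_index(features)
print(search(index, features[42]))         # image 42 should rank itself first
```

    Real large-scale systems replace the brute-force scan with approximate nearest-neighbor indexing (hashing, trees, or quantization), which is exactly the feature-indexing stage the review surveys.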

  5. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom, and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
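
    The Kreisselmeier-Steinhauser approach mentioned above aggregates multiple objectives or constraints into a single smooth envelope function, KS(f) = f_max + (1/rho) ln(sum_i exp(rho (f_i - f_max))). A minimal sketch of this standard formula follows (with the usual overflow-safe shift); the sample objective values and the draw-down parameter rho are illustrative, not values from the study.

```python
import numpy as np

def ks_aggregate(objectives, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of multiple objectives/constraints.

    Smooth, differentiable upper bound on max(objectives); larger rho
    tracks the maximum more closely. The shifted form avoids overflow.
    """
    f = np.asarray(objectives, dtype=float)
    f_max = f.max()
    return f_max + np.log(np.sum(np.exp(rho * (f - f_max)))) / rho

print(ks_aggregate([0.2, 0.9, 0.5]))            # close to 0.9
print(ks_aggregate([0.2, 0.9, 0.5], rho=5.0))   # softer envelope
```

    Because the envelope is differentiable everywhere, gradient-based nonlinear programming can be applied where a raw max() of objectives could not.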

  6. Speed and accuracy improvements in FLAASH atmospheric correction of hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Perkins, Timothy; Adler-Golden, Steven; Matthew, Michael W.; Berk, Alexander; Bernstein, Lawrence S.; Lee, Jamine; Fox, Marsha

    2012-11-01

    Remotely sensed spectral imagery of the earth's surface can be used to fullest advantage when the influence of the atmosphere has been removed and the measurements are reduced to units of reflectance. Here, we provide a comprehensive summary of the latest version of the Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes atmospheric correction algorithm. We also report some new code improvements for speed and accuracy. These include the re-working of the original algorithm in C-language code parallelized with message passing interface and containing a new radiative transfer look-up table option, which replaces executions of the MODTRAN model. With computation times now as low as ~10 s per image per computer processor, automated, real-time, on-board atmospheric correction of hyper- and multi-spectral imagery is within reach.
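
    A radiative-transfer look-up table trades pre-computation for per-pixel speed: expensive model runs populate a grid once, and each pixel then costs only an interpolation. The sketch below illustrates that general idea with SciPy; the axes, grid, and radiance values are invented stand-ins, not FLAASH's actual table or MODTRAN output.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Axes of a toy radiative-transfer look-up table (placeholder quantities):
# water vapor column, aerosol visibility, and view angle.
h2o = np.linspace(0.5, 5.0, 10)          # g/cm^2
vis = np.linspace(5.0, 100.0, 8)         # km
ang = np.linspace(0.0, 60.0, 7)          # degrees

grid = np.meshgrid(h2o, vis, ang, indexing="ij")
radiance = 0.1 * grid[0] + 0.02 * grid[1] + 0.01 * grid[2]  # stand-in table

lut = RegularGridInterpolator((h2o, vis, ang), radiance)

# One cheap interpolation per pixel replaces a full radiative-transfer run.
pixels = np.array([[2.3, 40.0, 15.0],
                   [1.1, 80.0, 42.5]])
print(lut(pixels))
```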

  7. Computer-Assisted Diagnosis of the Sleep Apnea-Hypopnea Syndrome: A Review

    PubMed Central

    Alvarez-Estevez, Diego; Moret-Bonillo, Vicente

    2015-01-01

    Automatic diagnosis of the Sleep Apnea-Hypopnea Syndrome (SAHS) has become an important area of research due to the growing interest in the field of sleep medicine and the costs associated with its manual diagnosis. The growing number and heterogeneity of the different techniques, however, make it somewhat difficult to adequately follow recent developments. A literature review within the area of computer-assisted diagnosis of SAHS has been performed comprising the last 15 years of research in the field. Screening approaches, methods for the detection and classification of respiratory events, comprehensive diagnostic systems, and an outline of current commercial approaches are reviewed. An overview of the different methods is presented together with validation analysis and critical discussion of the current state of the art. PMID:26266052

  8. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    NASA Astrophysics Data System (ADS)

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-06-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening.

  9. Negative correlates of computer game play in adolescents.

    PubMed

    Colwell, J; Payne, J

    2000-08-01

    There is some concern that playing computer games may be associated with social isolation, lowered self-esteem, and aggression among adolescents. Measures of these variables were included in a questionnaire completed by 204 year eight students at a North London comprehensive school. Principal components analysis of a scale to assess needs fulfilled by game play provided some support for the notion of 'electronic friendship' among boys, but there was no evidence that game play leads to social isolation. Play was not linked to self-esteem in girls, but a negative relationship was obtained between self-esteem and frequency of play in boys. However, self-esteem was not associated with total exposure to game play. Aggression scores were not related to the number of games with aggressive content named among three favourite games, but they were positively correlated with total exposure to game play. A multiple regression analysis revealed that sex and total game play exposure each accounted for a significant but small amount of the variance in aggression scores. The positive correlation between playing computer games and aggression provides some justification for further investigation of the causal hypothesis, and possible methodologies are discussed.
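
    For readers unfamiliar with the analysis, the multiple regression reported here has the generic form "aggression ~ sex + total exposure". The sketch below fits such a model by ordinary least squares on simulated data; every coefficient and distribution in it is invented for illustration and implies nothing about the study's actual estimates.

```python
import numpy as np

def ols_summary(X, y, names):
    """Ordinary least squares fit with R^2, mirroring a two-predictor model."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1 - resid.var() / y.var()
    return dict(zip(["intercept"] + names, beta)), r2

rng = np.random.default_rng(2)
n = 204                                          # sample size in the study
sex = rng.integers(0, 2, n)                      # 0 = girl, 1 = boy (toy coding)
exposure = rng.gamma(2.0, 3.0, n)                # invented exposure distribution
aggression = 1.0 + 0.8 * sex + 0.15 * exposure + rng.normal(0, 2, n)
print(ols_summary(np.column_stack([sex, exposure]), aggression,
                  ["sex", "exposure"]))
```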

  10. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user community such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  11. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    PubMed Central

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-01-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening. PMID:28621308

  12. DataView: a computational visualisation system for multidisciplinary design and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting underlying information from huge volumes of multivariate data have become essential to all decision-making processes in sectors such as finance, government, medical care, climate analysis, industry, and science. Remarkably, visualisation is recognised as a fundamental technology that supports human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of the multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapt to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate the effectiveness of the visualisation techniques.
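
    Scalar-field visualisation over an unstructured mesh, one of the capabilities described, can be demonstrated compactly with Matplotlib. This is a generic illustration, not DataView itself; the mesh and the field are synthetic.

```python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.tri as tri

# Toy unstructured (triangular) mesh with a scalar field at the nodes.
rng = np.random.default_rng(3)
x, y = rng.uniform(-1, 1, (2, 500))
field = np.exp(-(x**2 + y**2)) * np.cos(3 * x)   # stand-in solver output

mesh = tri.Triangulation(x, y)                   # Delaunay triangulation
fig, ax = plt.subplots()
contours = ax.tricontourf(mesh, field, levels=20)
fig.colorbar(contours, label="scalar field")
ax.triplot(mesh, lw=0.2, color="k")              # overlay the mesh itself
plt.show()
```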

  13. An interactive environment for agile analysis and visualization of ChIP-sequencing data.

    PubMed

    Lerdrup, Mads; Johansen, Jens Vilstrup; Agrawal-Singh, Shuchi; Hansen, Klaus

    2016-04-01

    To empower experimentalists with a means for fast and comprehensive chromatin immunoprecipitation sequencing (ChIP-seq) data analyses, we introduce an integrated computational environment, EaSeq. The software combines the exploratory power of genome browsers with an extensive set of interactive and user-friendly tools for genome-wide abstraction and visualization. It enables experimentalists to easily extract information and generate hypotheses from their own data and public genome-wide datasets. For demonstration purposes, we performed meta-analyses of public Polycomb ChIP-seq data and established a new screening approach to analyze more than 900 datasets from mouse embryonic stem cells for factors potentially associated with Polycomb recruitment. EaSeq, which is freely available and works on a standard personal computer, can substantially increase the throughput of many analysis workflows, facilitate transparency and reproducibility by automatically documenting and organizing analyses, and enable a broader group of scientists to gain insights from ChIP-seq data.

  14. The Comprehensive Competencies Program Reference Manual. Volume I. Introduction.

    ERIC Educational Resources Information Center

    Taggart, Robert

    Chapter 1 of this reference manual is a summary of the comprehensive competencies program (CCP). It describes this system for organizing, implementing, managing, and efficiently delivering individualized self-paced instruction, combined with group and experience-based learning activities, using computer-assisted instruction. (The CCP covers not…

  15. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    PubMed

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to address essentiality, one of the most noteworthy questions and most broadly used concepts in biology. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
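
    The kind of computation CentiServer performs is straightforward to illustrate. The sketch below uses NetworkX (not CentiServer or its centiserve R package) to compute three of the many catalogued indices and export them as CSV, mirroring the comma-separated output described; the example graph is a stock toy network.

```python
import csv
import networkx as nx

# Toy network standing in for, e.g., a protein-interaction graph.
g = nx.karate_club_graph()

# Three of the many centrality indices catalogued by CentiServer.
centralities = {
    "degree": nx.degree_centrality(g),
    "betweenness": nx.betweenness_centrality(g),
    "closeness": nx.closeness_centrality(g),
}

# Write one row per node, in the same spirit as a CSV export.
with open("centralities.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["node"] + list(centralities))
    for node in g.nodes:
        writer.writerow([node] + [round(centralities[c][node], 4)
                                  for c in centralities])
```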

  16. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to address essentiality, one of the most noteworthy questions and most broadly used concepts in biology. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/. PMID:26571275

  17. Examination of a Rotorcraft Noise Prediction Method and Comparison to Flight Test Data

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Greenwood, Eric; Watts, Michael E.; Lopes, Leonard V.

    2017-01-01

    With a view that rotorcraft noise should be included in the preliminary design process, a relatively fast noise prediction method is examined in this paper. A comprehensive rotorcraft analysis is combined with a noise prediction method to compute several noise metrics of interest. These predictions are compared to flight test data. Results show that inclusion of only the main rotor noise will produce results that severely underpredict integrated metrics of interest. Inclusion of the tail rotor frequency content is essential for accurately predicting these integrated noise metrics.

  18. Advanced technology airfoil research, volume 1, part 2

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This compilation contains papers presented at the NASA Conference on Advanced Technology Airfoil Research held at Langley Research Center on March 7-9, 1978, which have unlimited distribution. This conference provided a comprehensive review of all NASA airfoil research, conducted in-house and under grant and contract. A broad spectrum of airfoil research outside of NASA was also reviewed. The major thrust of the technical sessions were in three areas: development of computational aerodynamic codes for airfoil analysis and design, development of experimental facilities and test techniques, and all types of airfoil applications.

  19. Theoretical analysis of microwave propagation

    NASA Astrophysics Data System (ADS)

    Parl, S.; Malaga, A.

    1984-04-01

    This report documents a comprehensive investigation of microwave propagation. The structure of line-of-sight multipath is determined and the impact on practical diversity is discussed. A new model of diffraction propagation over multiple rounded obstacles is developed. A troposcatter model valid at microwave frequencies is described. New results for the power impulse response, delay spread, and Doppler spread are developed. A two-component model separating large- and small-scale scatter effects is proposed. The prediction techniques for diffraction and troposcatter have been implemented in a computer program intended as a tool for analyzing propagation experiments.
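
    Delay spread, one of the new results mentioned, is a second-moment statistic of the power impulse response. A minimal sketch of the standard RMS delay-spread computation follows; the two-path profile values are invented for illustration.

```python
import numpy as np

def rms_delay_spread(delays_ns, powers_mw):
    """RMS delay spread of a power impulse response (power delay profile)."""
    p = np.asarray(powers_mw, dtype=float)
    t = np.asarray(delays_ns, dtype=float)
    mean_delay = np.sum(p * t) / np.sum(p)                 # first moment
    second = np.sum(p * t**2) / np.sum(p)                  # second moment
    return np.sqrt(second - mean_delay**2)

# Toy two-path troposcatter-like profile: strong early path, weak late path.
print(rms_delay_spread([0.0, 50.0, 120.0], [1.0, 0.4, 0.05]))
```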

  20. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  1. Computer Games Application within Alternative Classroom Goal Structures: Cognitive, Metacognitive, and Affective Evaluation

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2008-01-01

    This article reports findings on a study of educational computer games used within various classroom situations. Employing an across-stage, mixed method model, the study examined whether educational computer games, in comparison to traditional paper-and-pencil drills, would be more effective in facilitating comprehensive math learning outcomes,…

  2. Benefits and Drawbacks of Computer-Based Assessment and Feedback Systems: Student and Educator Perspectives

    ERIC Educational Resources Information Center

    Debuse, Justin C. W.; Lawley, Meredith

    2016-01-01

    Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…

  3. Comprehensive Outlook for Managed Pines Using Simulated Treatment Experiments-Planted Loblolly Pine (COMPUTE_P-LOB): A User's Guide

    Treesearch

    R.B. Ferguson; V. Clark Baldwin

    1987-01-01

    Complete instructions for user operation of COMPUTE_P-LOB, a growth and yield prediction system providing volume and weight yields in stand and stock table format, including detailed examples of computer input and output. A complete program listing is provided.

  4. Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy

    1997-01-01

    Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…

  5. Heterogeneity in Health Care Computing Environments

    PubMed Central

    Sengupta, Soumitra

    1989-01-01

    This paper discusses issues of heterogeneity in computer systems, networks, databases, and presentation techniques, and the problems it creates in developing integrated medical information systems. The need for institutional, comprehensive goals is emphasized. Using the Columbia-Presbyterian Medical Center's computing environment as the case study, various steps to solve the heterogeneity problem are presented.

  6. Training and Generalization Effects of a Reading Comprehension Learning Strategy on Computer and Paper-Pencil Assessments

    ERIC Educational Resources Information Center

    Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa

    2016-01-01

    Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…

  7. DAMBE7: New and Improved Tools for Data Analysis in Molecular Biology and Evolution.

    PubMed

    Xia, Xuhua

    2018-06-01

    DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux, and Macintosh computers. New functions include simultaneous imputation of missing distances and phylogeny (paving the way to build large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrected multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca (last accessed April 17, 2018).
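
    The gamma shape parameter mentioned here controls rate heterogeneity over sites. As a hedged illustration of how such a model is typically discretized (this is the standard median-based discrete-gamma construction, not DAMBE's estimator), the sketch below builds k equal-probability rate categories with SciPy.

```python
from scipy.stats import gamma

def discrete_gamma_rates(alpha, k=4):
    """Median-based discrete gamma rate categories.

    Shape and inverse scale are both alpha, so the mean rate is 1; a
    small alpha means strong rate heterogeneity over sites.
    """
    # Midpoint quantiles of k equal-probability bins.
    quantiles = [(2 * i + 1) / (2 * k) for i in range(k)]
    rates = gamma.ppf(quantiles, a=alpha, scale=1.0 / alpha)
    mean = sum(rates) / k
    return [r / mean for r in rates]          # renormalize to mean 1

print(discrete_gamma_rates(alpha=0.5))        # strong heterogeneity
print(discrete_gamma_rates(alpha=5.0))        # nearly uniform rates
```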

  8. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  9. Conformational and vibrational reassessment of solid paracetamol

    NASA Astrophysics Data System (ADS)

    Amado, Ana M.; Azevedo, Celeste; Ribeiro-Claro, Paulo J. A.

    2017-08-01

    This work provides an answer to the urge for more detailed and accurate knowledge of the vibrational spectrum of the widely used analgesic/antipyretic drug commonly known as paracetamol. A comprehensive spectroscopic analysis - including infrared, Raman, and inelastic neutron scattering (INS) - is combined with a computational approach which takes into account the effects of intermolecular interactions in the solid state. This allows a full reassessment of the vibrational assignments for paracetamol, thus preventing the propagation of incorrect data analysis and misassignments already found in the literature. In particular, the vibrational modes involving the hydrogen-bonded N–H and O–H groups are correctly reallocated to bands shifted by up to 300 cm−1 relative to previous assignments.

  10. Direct biomechanical modeling of trabecular bone using a nonlinear manifold-based volumetric representation

    NASA Astrophysics Data System (ADS)

    Jin, Dakai; Lu, Jia; Zhang, Xiaoliu; Chen, Cheng; Bai, ErWei; Saha, Punam K.

    2017-03-01

    Osteoporosis is associated with increased fracture risk. Recent advancement in the area of in vivo imaging allows segmentation of trabecular bone (TB) microstructures, which is a known key determinant of bone strength and fracture risk. An accurate biomechanical modelling of TB micro-architecture provides a comprehensive summary measure of bone strength and fracture risk. In this paper, a new direct TB biomechanical modelling method using nonlinear manifold-based volumetric reconstruction of the trabecular network is presented. It is accomplished in two sequential modules. The first module reconstructs a nonlinear manifold-based volumetric representation of TB networks from three-dimensional digital images. Specifically, it starts with the fuzzy digital segmentation of a TB network, and computes its surface and curve skeletons. An individual trabecula is identified as a topological segment in the curve skeleton. Using geometric analysis, smoothing and optimization techniques, the algorithm generates smooth, curved, and continuous representations of individual trabeculae glued at their junctions. Also, the method generates a geometrically consistent TB volume at junctions. In the second module, a direct computational biomechanical stress-strain analysis is applied on the reconstructed TB volume to predict mechanical measures. The accuracy of the method was examined using micro-CT imaging of cadaveric distal tibia specimens (N = 12). A high linear correlation (r = 0.95) between TB volume computed using the new manifold-modelling algorithm and that directly derived from the voxel-based micro-CT images was observed. Young's modulus (YM) was computed using direct mechanical analysis on the TB manifold-model over a cubical volume of interest (VOI), and its correlation with the YM, computed using micro-CT based conventional finite-element analysis over the same VOI, was examined. A moderate linear correlation (r = 0.77) was observed between the two YM measures. These preliminary results show the accuracy of the new nonlinear manifold modelling algorithm for TB, and demonstrate the feasibility of a new direct mechanical stress-strain analysis on a nonlinear manifold model of a highly complex biological structure.

  11. Revalidation of the Selection Instrument for Flight Training

    DTIC Science & Technology

    2017-07-01

    The AACog composite is measured by the following SIFT subscales: the Mechanical Comprehension Test (MCT), which assesses understanding of applied mechanical science; the Math Skills Test (MST), which assesses the examinee's computational skill and mathematical aptitude; and the Reading Comprehension Test.

  12. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  13. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with limited user community such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  14. Comprehensive review: Computational modelling of schizophrenia.

    PubMed

    Valton, Vincent; Romaniuk, Liana; Douglas Steele, J; Lawrie, Stephen; Seriès, Peggy

    2017-12-01

    Computational modelling has been used to address: (1) the variety of symptoms observed in schizophrenia using abstract models of behavior (e.g. Bayesian models - top-down descriptive models of psychopathology); (2) the causes of these symptoms using biologically realistic models involving abnormal neuromodulation and/or receptor imbalance (e.g. connectionist and neural networks - bottom-up realistic models of neural processes). These different levels of analysis have been used to answer different questions (i.e. understanding behavioral vs. neurobiological anomalies) about the nature of the disorder. As such, these computational studies have mostly supported diverging hypotheses of schizophrenia's pathophysiology, resulting in a literature that is not always expanding coherently. Some of these hypotheses are however ripe for revision using novel empirical evidence. Here we present a review that first synthesizes the literature of computational modelling for schizophrenia and psychotic symptoms into categories supporting the dopamine, glutamate, GABA, dysconnection and Bayesian inference hypotheses respectively. Secondly, we compare model predictions against the accumulated empirical evidence and finally we identify specific hypotheses that have been left relatively under-investigated. Copyright © 2017. Published by Elsevier Ltd.

  15. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with limited user community such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  16. Effective Instruction for Persisting Dyslexia in Upper Grades: Adding Hope Stories and Computer Coding to Explicit Literacy Instruction.

    PubMed

    Thompson, Robert; Tanimoto, Steve; Lyman, Ruby Dawn; Geselowitz, Kira; Begay, Kristin Kawena; Nielsen, Kathleen; Nagy, William; Abbott, Robert; Raskind, Marshall; Berninger, Virginia

    2018-05-01

    Children in grades 4 to 6 (N = 14) who despite early intervention had persisting dyslexia (impaired word reading and spelling) were assessed before and after computerized reading and writing instruction aimed at subword, word, and syntax skills shown in four prior studies to be effective for treating dyslexia. During the 12 two-hour sessions, held once a week after school, they first completed HAWK Letters in Motion© for manuscript and cursive handwriting, HAWK Words in Motion© for phonological, orthographic, and morphological coding for word reading and spelling, and HAWK Minds in Motion© for sentence reading comprehension and written sentence composing. A reading comprehension activity in which sentences were presented one word at a time or one added word at a time was introduced. Next, to instill hope that they could overcome their struggles with reading and spelling, they read and discussed stories about the struggles of Buckminster Fuller, who overcame early disabilities to make important contributions to society. Finally, they engaged in the new Kokopelli's World (KW)©, blocks-based online lessons, to learn computer coding in introductory programming by creating stories in sentence blocks (Tanimoto and Thompson 2016). Participants improved significantly in hallmark word decoding and spelling deficits of dyslexia, three syntax skills (oral construction, listening comprehension, and written composing), reading comprehension (with decoding as covariate), handwriting, orthographic and morphological coding, orthographic loop, and inhibition (focused attention). They answered more reading comprehension questions correctly when they had read sentences presented one word at a time (eliminating both regressions out and regressions in during saccades) than when presented one added word at a time (eliminating only regressions out during saccades). Indicators of improved self-efficacy that they could learn to read and write were observed. Reminders to pay attention and stay on task that were needed before computer coding was added were not needed afterward.

  17. THE COMPREHENSION OF RAPID SPEECH BY THE BLIND, PART III.

    ERIC Educational Resources Information Center

    FOULKE, EMERSON

    A review of the research on the comprehension of rapid speech by the blind identifies five methods of speech compression--speech changing, electromechanical sampling, computer sampling, speech synthesis, and frequency dividing with the harmonic compressor. The speech changing and electromechanical sampling methods and the necessary apparatus have…

  18. The Bilingual Language Interaction Network for Comprehension of Speech

    ERIC Educational Resources Information Center

    Shook, Anthony; Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can…

  19. Prediction, Error, and Adaptation during Online Sentence Comprehension

    ERIC Educational Resources Information Center

    Fine, Alex Brabham

    2013-01-01

    A fundamental challenge for human cognition is perceiving and acting in a world in which the statistics that characterize available sensory data are non-stationary. This thesis focuses on this problem specifically in the domain of sentence comprehension, where linguistic variability poses computational challenges to the processes underlying…

  20. Keeping It Simple: The Case for E-Mail.

    ERIC Educational Resources Information Center

    Haimovic, Gila

    The Open University of Israel (OUI) is a distance education institution that offers over 250 computer-mediated courses through the Internet. All OUI students must pass an English reading comprehension exemption exam or take the University's English reading comprehension courses. Because reading instruction differs from content instruction,…

  1. Computer Assisted Instruction to Promote Comprehension in Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Stetter, Maria Earman; Hughes, Marie Tejero

    2011-01-01

    Reading comprehension is a crucial skill for the academic success of all students. Very often, students with learning disabilities struggle with reading skills, and since students learn new information in school by reading, these difficulties often increase the academic struggles students with learning disabilities face. The current study examined…

  2. Deriving Empirically-Based Design Guidelines for Advanced Learning Technologies that Foster Disciplinary Comprehension

    ERIC Educational Resources Information Center

    Poitras, Eric; Trevors, Gregory

    2012-01-01

    Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…

  3. The Role of Working Memory in Metaphor Production and Comprehension

    ERIC Educational Resources Information Center

    Chiappe, Dan L.; Chiappe, Penny

    2007-01-01

    The following tested Kintsch's [Kintsch, W. (2000). "Metaphor comprehension: a computational theory." "Psychonomic Bulletin & Review," 7, 257-266 and Kintsch, W. (2001). "Predication." "Cognitive Science," 25, 173-202] Predication Model, which predicts that working memory capacity is an important factor in metaphor processing. In support of his…

  4. Interactive Video Listening Comprehension in Foreign Language Instruction: Development and Evaluation.

    ERIC Educational Resources Information Center

    Fischer, Robert

    The report details development, at Southwest Texas State University and later at Pennsylvania State University, of a computer authoring system ("Libra") enabling foreign language faculty to develop multimedia lessons focusing on listening comprehension. Staff at Southwest Texas State University first developed a Macintosh version of the…

  5. Effect of Hypertextual Reading on Academic Success and Comprehension Skills

    ERIC Educational Resources Information Center

    Durukan, Erhan

    2014-01-01

    As computer technology developed, hypertexts emerged as an influential environment for developing language skills. This study aims to evaluate a text prepared in a hypertextual environment and its effects on academic success and comprehension skills. In this study, "preliminary test final test control group experimental pattern" was used…

  6. Event-Based Plausibility Immediately Influences On-Line Language Comprehension

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional…

  7. Computers, the Human Mind, and My In-Laws' House.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1996-01-01

    Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

  8. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing

    PubMed Central

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

    A comprehensive effort toward low-cost sequencing in the past few years has led to the growth of complete genome databases. In parallel with this effort, and driven by a strong need, fast and cost-effective methods and applications have been developed to accelerate sequence analysis. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce, as parallel and distributed computing tools running on commodity hardware, are used in this pipeline. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of detected unique and common DNA signatures of the target database brings opportunities to improve the identification process not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis. PMID:26884678
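
    Stripped of the Hadoop machinery, the core of the pipeline is a map step that emits k-mers and a reduce step that aggregates them per database. The sketch below shows that logic on toy sequences in plain Python; GkmerG's actual sharding, file formats, and cluster execution are not represented.

```python
from itertools import chain

def kmers(sequence, k):
    """Yield all overlapping k-mers of one sequence (the 'map' step)."""
    seq = sequence.upper()
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy target and non-target 'databases'; real inputs would be whole genomes
# sharded across a Hadoop cluster, with the aggregation done in reducers.
target = ["ATGCGATACGCTT", "GGATGCGATT"]
nontarget = ["TTTTACGCAAAA"]

k = 5
target_sets = [set(kmers(s, k)) for s in target]
nontarget_kmers = set(chain.from_iterable(kmers(s, k) for s in nontarget))

# Common signatures: k-mers shared by every target record but absent from
# the non-target database; unique signatures are found analogously.
common = set.intersection(*target_sets)
print(sorted(common - nontarget_kmers))   # -> ['ATGCG', 'GCGAT', 'TGCGA']
```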

  9. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing.

    PubMed

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

    A comprehensive effort toward low-cost sequencing in the past few years has led to the growth of complete genome databases. In parallel with this effort, and driven by a strong need, fast and cost-effective methods and applications have been developed to accelerate sequence analysis. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce, as parallel and distributed computing tools running on commodity hardware, are used in this pipeline. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of detected unique and common DNA signatures of the target database brings opportunities to improve the identification process not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis.

  10. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations

    PubMed Central

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-01-01

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era. PMID:27649151

  11. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations.

    PubMed

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-09-14

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era.

  12. Clinical applications of cone beam computed tomography in endodontics: A comprehensive review.

    PubMed

    Cohenca, Nestor; Shemesh, Hagay

    2015-06-01

    Cone beam computed tomography (CBCT) is a new technology that produces three-dimensional (3D) digital imaging at reduced cost and less radiation for the patient than traditional CT scans. It also delivers faster and easier image acquisition. By providing a 3D representation of the maxillofacial tissues in a cost- and dose-efficient manner, a better preoperative assessment can be obtained for diagnosis and treatment. This comprehensive review presents current applications of CBCT in endodontics. Specific case examples illustrate the difference in treatment planning with traditional periapical radiography versus CBCT technology.

  13. Comprehensive Thematic T-Matrix Reference Database: A 2014-2015 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2015-01-01

    The T-matrix method is one of the most versatile and efficient direct computer solvers of the macroscopic Maxwell equations and is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper is the seventh update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists a number of earlier publications overlooked previously.

  14. Development of a COTS-Based Computing Environment Blueprint Application at KSC

    NASA Technical Reports Server (NTRS)

    Ghansah, Isaac; Boatright, Bryan

    1996-01-01

    This paper describes a blueprint that can be used for developing a distributed computing environment (DCE) for NASA in general, and the Kennedy Space Center (KSC) in particular. A comprehensive, open, secure, integrated, and multi-vendor DCE such as OSF DCE has been suggested. Design issues, as well as recommendations for each component have been given. Where necessary, modifications were suggested to fit the needs of KSC. This was done in the areas of security and directory services. Readers requiring a more comprehensive coverage are encouraged to refer to the eight-chapter document prepared for this work.

  15. MNE software for processing MEG and EEG data

    PubMed Central

    Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals originating from neural currents in the brain. Using these signals to characterize and locate brain activity is a challenging task, as evidenced by several decades of methodological contributions. MNE, whose name stems from its capability to compute cortically-constrained minimum-norm current estimates from M/EEG data, is a software package that provides comprehensive analysis tools and workflows including preprocessing, source estimation, time–frequency analysis, statistical analysis, and several methods to estimate functional connectivity between distributed brain regions. The present paper gives detailed information about the MNE package and describes typical use cases while also warning about potential caveats in analysis. The MNE package is a collaborative effort of multiple institutes striving to implement and share best methods and to facilitate distribution of analysis pipelines to advance reproducibility of research. Full documentation is available at http://martinos.org/mne. PMID:24161808
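
    As a concrete illustration of the workflow named above (preprocessing, epoching, and minimum-norm source estimation), a minimal MNE-Python session might look like the sketch below. The file names refer to the MNE sample dataset, and the filter and epoching parameters are illustrative assumptions, not recommendations from the paper.

        # Minimal MNE-Python sketch: filter, epoch, and compute a
        # minimum-norm estimate (MNE). Parameter choices are illustrative.
        import mne

        # Load raw M/EEG data and band-pass filter it (preprocessing).
        raw = mne.io.read_raw_fif("sample_audvis_raw.fif", preload=True)
        raw.filter(l_freq=1.0, h_freq=40.0)

        # Epoch the data around stimulus events and average to an evoked response.
        events = mne.find_events(raw, stim_channel="STI 014")
        epochs = mne.Epochs(raw, events, event_id=1, tmin=-0.2, tmax=0.5)
        evoked = epochs.average()

        # Build an inverse operator from a precomputed forward solution and a
        # noise covariance, then apply it to obtain source estimates.
        cov = mne.compute_covariance(epochs, tmax=0.0)
        fwd = mne.read_forward_solution("sample_audvis-meg-oct-6-fwd.fif")
        inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, cov)
        stc = mne.minimum_norm.apply_inverse(evoked, inv, method="MNE")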

  16. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-and-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting its development process along with the estimation method itself.

  17. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology

    PubMed Central

    Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.

    2017-01-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625

  18. Framework to model neutral particle flux in convex high aspect ratio structures using one-dimensional radiosity

    NASA Astrophysics Data System (ADS)

    Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried

    2017-02-01

    We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures, as well as of the particle sticking probability, on the neutral particle flux.
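
    The one-dimensional radiosity idea can be sketched compactly: discretize the sidewall by depth, let each segment receive direct flux from the opening plus diffusely re-emitted flux from the opposite wall, and solve the resulting linear system. The sketch below does this for a two-dimensional trench of unit width; the view-factor kernels, the neglect of the trench floor, and all names are illustrative assumptions, not the authors' implementation.

        # 1-D radiosity sketch for neutral transport in a 2-D trench of unit
        # width: solve B = E + (1 - s) * F @ B for the wall flux profile B.
        import numpy as np

        def neutral_wall_flux(aspect_ratio, n_seg=400, sticking=0.1):
            z = (np.arange(n_seg) + 0.5) * aspect_ratio / n_seg  # segment centers
            dz = aspect_ratio / n_seg

            # Direct flux: 2-D view factor from a wall element at depth z to
            # the trench mouth (isotropic arrival at the opening assumed).
            E = 0.5 * (1.0 - z / np.sqrt(z**2 + 1.0))

            # Wall-to-wall coupling: strip-to-strip view factor between
            # parallel plates separated by the unit trench width.
            zi, zj = np.meshgrid(z, z, indexing="ij")
            F = 0.5 * dz / ((zi - zj) ** 2 + 1.0) ** 1.5

            # Non-sticking particles are re-emitted diffusely, weight (1 - s).
            B = np.linalg.solve(np.eye(n_seg) - (1.0 - sticking) * F, E)
            return z, B

        z, B = neutral_wall_flux(aspect_ratio=20.0, sticking=0.05)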

  19. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology.

    PubMed

    Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M

    2017-04-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.

  20. The CMC/3DPNS computer program for prediction of three-dimensional, subsonic, turbulent aerodynamic juncture region flow. Volume 3: Programmers' manual

    NASA Technical Reports Server (NTRS)

    Orzechowski, J. A.

    1982-01-01

    The CMC fluid mechanics program system was developed to translate the theoretical evolution of finite element numerical solution methodology for nonlinear field problems into a versatile computer code for comprehensive flow field analysis. A detailed view of the code from the standpoint of the computer programmer is presented. A system macro-flowchart and detailed flowcharts of the routines a theoretician/user must interact with to modify the operation of the program are presented. All subroutines and details of their usage, primarily for the input and output routines, are described. Integer and real scalars are defined, together with a cross-reference list denoting subroutine usage for these scalars. Entry points into the dynamic storage vector IZ and the lengths of each vector are described alongside the scalar definitions. A listing of the routines peculiar to the standard test case and a listing of the input deck and printout for this case are included.

  1. Revealing the vectors of cellular identity with single-cell genomics

    PubMed Central

    Wagner, Allon; Regev, Aviv; Yosef, Nir

    2017-01-01

    Single-cell genomics has now made it possible to create a comprehensive atlas of human cells. At the same time, it has reopened definitions of a cell’s identity and type and of the ways in which they are regulated by the cell’s molecular circuitry. Emerging computational analysis methods, especially in single-cell RNA sequencing (scRNA-seq), have already begun to reveal, in a data-driven way, the diverse simultaneous facets of a cell’s identity, from a taxonomy of discrete cell types to continuous dynamic transitions and spatial locations. These developments will eventually allow a cell to be represented as a superposition of ‘basis vectors’, each determining a different (but possibly dependent) aspect of cellular organization and function. However, computational methods must also overcome considerable challenges—from handling technical noise and data scale to forming new abstractions of biology. As the scale of single-cell experiments continues to increase, new computational approaches will be essential for constructing and characterizing a reference map of cell identities. PMID:27824854

  2. SIMPLEX: Cloud-Enabled Pipeline for the Comprehensive Analysis of Exome Sequencing Data

    PubMed Central

    Fischer, Maria; Snajder, Rene; Pabinger, Stephan; Dander, Andreas; Schossig, Anna; Zschocke, Johannes; Trajanoski, Zlatko; Stocker, Gernot

    2012-01-01

    In recent studies, exome sequencing has proven to be a successful screening tool for the identification of candidate genes causing rare genetic diseases. Although underlying targeted sequencing methods are well established, necessary data handling and focused, structured analysis still remain demanding tasks. Here, we present a cloud-enabled autonomous analysis pipeline, which comprises the complete exome analysis workflow. The pipeline combines several in-house developed and published applications to perform the following steps: (a) initial quality control, (b) intelligent data filtering and pre-processing, (c) sequence alignment to a reference genome, (d) SNP and DIP detection, (e) functional annotation of variants using different approaches, and (f) detailed report generation during various stages of the workflow. The pipeline connects the selected analysis steps, exposes all available parameters for customized usage, performs required data handling, and distributes computationally expensive tasks either on a dedicated high-performance computing infrastructure or on the Amazon cloud environment (EC2). The presented application has already been used in several research projects including studies to elucidate the role of rare genetic diseases. The pipeline is continuously tested and is publicly available under the GPL as a VirtualBox or Cloud image at http://simplex.i-med.ac.at; additional supplementary data is provided at http://www.icbi.at/exome. PMID:22870267

  3. CAPER 3.0: A Scalable Cloud-Based System for Data-Intensive Analysis of Chromosome-Centric Human Proteome Project Data Sets.

    PubMed

    Yang, Shuai; Zhang, Xinlei; Diao, Lihong; Guo, Feifei; Wang, Dan; Liu, Zhongyang; Li, Honglei; Zheng, Junjie; Pan, Jingshan; Nice, Edouard C; Li, Dong; He, Fuchu

    2015-09-04

    The Chromosome-centric Human Proteome Project (C-HPP) aims to catalog genome-encoded proteins using a chromosome-by-chromosome strategy. As the C-HPP proceeds, the increasing requirement for data-intensive analysis of the MS/MS data poses a challenge to the proteomic community, especially small laboratories lacking computational infrastructure. To address this challenge, we have upgraded the previous CAPER browser to a new version, CAPER 3.0, which is a scalable cloud-based system for data-intensive analysis of C-HPP data sets. CAPER 3.0 uses cloud computing technology to facilitate MS/MS-based peptide identification. In particular, it can use both public and private clouds, facilitating the analysis of C-HPP data sets. CAPER 3.0 provides a graphical user interface (GUI) to help users transfer data, configure jobs, track progress, and visualize the results comprehensively. These features enable users without programming expertise to easily conduct data-intensive analysis using CAPER 3.0. Here, we illustrate the usage of CAPER 3.0 with four specific mass spectral data-intensive problems: detecting novel peptides, identifying single amino acid variants (SAVs) derived from known missense mutations, identifying sample-specific SAVs, and identifying exon-skipping events. CAPER 3.0 is available at http://prodigy.bprc.ac.cn/caper3.

  4. A mathematical framework for yield (vs. rate) optimization in constraint-based modeling and applications in metabolic engineering.

    PubMed

    Klamt, Steffen; Müller, Stefan; Regensburger, Georg; Zanghellini, Jürgen

    2018-05-01

    The optimization of metabolic rates (as linear objective functions) represents the methodical core of flux-balance analysis techniques, which have become a standard tool for the study of genome-scale metabolic models. Besides (growth and synthesis) rates, metabolic yields are key parameters for the characterization of biochemical transformation processes, especially in the context of biotechnological applications. However, yields are ratios of rates, and hence the optimization of yields (as nonlinear objective functions) under arbitrary linear constraints is not possible with current flux-balance analysis techniques. Despite the fundamental importance of yields in constraint-based modeling, a comprehensive mathematical framework for yield optimization is still missing. We present a mathematical theory that allows one to systematically compute and analyze yield-optimal solutions of metabolic models under arbitrary linear constraints. In particular, we formulate yield optimization as a linear-fractional program. For practical computations, we transform the linear-fractional yield optimization problem to a (higher-dimensional) linear problem. Its solutions determine the solutions of the original problem and can be used to predict yield-optimal flux distributions in genome-scale metabolic models. For the theoretical analysis, we consider the linear-fractional problem directly. Most importantly, we show that the yield-optimal solution set (like the rate-optimal solution set) is determined by (yield-optimal) elementary flux vectors of the underlying metabolic model. However, yield- and rate-optimal solutions may differ from each other, and hence optimal (biomass or product) yields are not necessarily obtained at solutions with optimal (growth or synthesis) rates. Moreover, we discuss phase planes/production envelopes and yield spaces; in particular, we prove that yield spaces are convex and provide algorithms for their computation. We illustrate our findings with a small example and demonstrate their relevance for metabolic engineering with realistic models of E. coli. We develop a comprehensive mathematical framework for yield optimization in metabolic models. Our theory is particularly useful for the study and rational modification of cell factories designed under given yield and/or rate requirements.
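
    The reduction from a linear-fractional program to a linear one mentioned above is, in its simplest form, the classical Charnes-Cooper transformation. The sketch below applies it to a toy flux model with scipy; the stoichiometry, bounds, and objective vectors are assumptions for illustration, not the paper's framework or models.

        # Hedged sketch: maximize the yield c_num@v / d_den@v subject to
        # S@v = 0 and 0 <= v <= v_max, via the Charnes-Cooper substitution
        # x = t*v with t = 1/(d_den@v).
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([[1, -1, -1,  0],    # metabolite A: uptake, conversion, product
                      [0,  1,  0, -1]])   # metabolite B: conversion, export
        c_num = np.array([0.0, 0.0, 1.0, 0.0])  # numerator: product synthesis rate
        d_den = np.array([1.0, 0.0, 0.0, 0.0])  # denominator: substrate uptake rate
        v_max = 10.0
        n = S.shape[1]

        # Linear program in (x, t): max c_num@x s.t. S@x = 0, d_den@x = 1,
        # x - v_max*t <= 0, x >= 0, t >= 0.
        A_eq = np.vstack([np.hstack([S, np.zeros((S.shape[0], 1))]),
                          np.hstack([d_den, [0.0]])])
        b_eq = np.concatenate([np.zeros(S.shape[0]), [1.0]])
        A_ub = np.hstack([np.eye(n), -v_max * np.ones((n, 1))])
        b_ub = np.zeros(n)

        res = linprog(c=-np.concatenate([c_num, [0.0]]),  # linprog minimizes
                      A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (n + 1))
        x, t = res.x[:n], res.x[n]
        v_opt = x / t                       # yield-optimal flux distribution
        best_yield = (c_num @ v_opt) / (d_den @ v_opt)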

  5. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  6. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  7. Computer versus Paper-Based Reading: A Case Study in English Language Teaching Context

    ERIC Educational Resources Information Center

    Solak, Ekrem

    2014-01-01

    This research aims to determine the preference of prospective English teachers in performing computer- and paper-based reading tasks and to what extent computer- and paper-based reading influence their reading speed, accuracy, and comprehension. The research was conducted at a state-run university's English Language Teaching Department in Turkey. The…

  8. Identifying Discriminating Variables between Teachers Who Fully Integrate Computers and Teachers with Limited Integration

    ERIC Educational Resources Information Center

    Mueller, Julie; Wood, Eileen; Willoughby, Teena; Ross, Craig; Specht, Jacqueline

    2008-01-01

    Given the prevalence of computers in education today, it is critical to understand teachers' perspectives regarding computer integration in their classrooms. The current study surveyed a random sample of a heterogeneous group of 185 elementary and 204 secondary teachers in order to provide a comprehensive summary of teacher characteristics and…

  9. Investigating an Innovative Computer Application to Improve L2 Word Recognition from Speech

    ERIC Educational Resources Information Center

    Matthews, Joshua; O'Toole, John Mitchell

    2015-01-01

    The ability to recognise words from the aural modality is a critical aspect of successful second language (L2) listening comprehension. However, little research has been reported on computer-mediated development of L2 word recognition from speech in L2 learning contexts. This report describes the development of an innovative computer application…

  10. An Investigation of the Effectiveness of Computer Simulation Programs as Tutorial Tools for Teaching Population Ecology at University.

    ERIC Educational Resources Information Center

    Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.

    1999-01-01

    Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…

  11. Errors and Intelligence in Computer-Assisted Language Learning: Parsers and Pedagogues. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Heift, Trude; Schulze, Mathias

    2012-01-01

    This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…

  12. Stormwater quality processes for three land-use areas in Broward County, Florida

    USGS Publications Warehouse

    Mattraw, H.C.; Miller, Robert A.

    1981-01-01

    Systematic collection and chemical analysis of stormwater runoff samples from three small urban areas in Broward County, Florida, were obtained between 1974 and 1977. Thirty or more runoff-constituent loads were computed for each of the homogeneous land-use areas. The areas sampled were single-family residential, highway, and a commercial shopping center. Rainfall, runoff, and nutrient and metal analyses were stored in a data-management system. The data-management system permitted computation of loads, publication of basic-data reports, and the interfacing of environmental and load information with a comprehensive statistical analysis system. Seven regression models relating water-quality loads to characteristics of peak discharge, antecedent conditions, season, storm duration, and rainfall intensity were constructed for each of the three sites. Total water-quality loads were computed for the collection period by summing loads for individual storms. Loads for unsampled storms were estimated by using the regression models and records of storm precipitation. Loadings, in pounds per day per acre of hydraulically effective impervious area, were computed for the three land-use types. Total nitrogen, total phosphorus, and total residue loadings were highest in the residential area. Chemical oxygen demand and total lead loadings were highest in the commercial area. Loadings of atmospheric fallout on each watershed were estimated by bulk precipitation samples collected at the highway and commercial sites. (USGS)
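
    The regression-based load estimation described above can be sketched in a few lines: fit a log-linear model of load against storm characteristics, then apply it to unsampled storms. The synthetic data, predictor choices, and coefficients below are assumptions for illustration, not the USGS models.

        # Log-linear storm-load regression sketch: fit load vs. storm
        # characteristics, then estimate an unsampled storm.
        import numpy as np

        rng = np.random.default_rng(0)
        n_storms = 30

        peak_discharge = rng.lognormal(1.0, 0.5, n_storms)   # stand-in, cfs
        rain_intensity = rng.lognormal(0.0, 0.4, n_storms)   # stand-in, in/hr
        duration = rng.lognormal(1.5, 0.3, n_storms)         # stand-in, hr

        # Synthetic "observed" loads with multiplicative noise.
        load = (0.4 * peak_discharge**0.8 * rain_intensity**0.3
                * rng.lognormal(0.0, 0.2, n_storms))

        # Fit log(load) = b0 + b1 log(Qp) + b2 log(I) + b3 log(D).
        X = np.column_stack([np.ones(n_storms), np.log(peak_discharge),
                             np.log(rain_intensity), np.log(duration)])
        coef, *_ = np.linalg.lstsq(X, np.log(load), rcond=None)

        # Estimate the load of an unsampled storm from its characteristics.
        x_new = np.array([1.0, np.log(12.0), np.log(0.8), np.log(4.0)])
        estimated_load = float(np.exp(x_new @ coef))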

  13. Viscous-flow analysis of a subsonic transport aircraft high-lift system and correlation with flight data

    NASA Technical Reports Server (NTRS)

    Potter, R. C.; Vandam, C. P.

    1995-01-01

    High-lift system aerodynamics has been gaining attention in recent years. In an effort to improve aircraft performance, comprehensive studies of multi-element airfoil systems are being undertaken in wind-tunnel and flight experiments. Recent developments in Computational Fluid Dynamics (CFD) offer a relatively inexpensive alternative for studying complex viscous flows by numerically solving the Navier-Stokes (N-S) equations. Current limitations in computer resources restrict practical high-lift N-S computations to two dimensions, but CFD predictions can yield tremendous insight into flow structure, interactions between airfoil elements, and effects of changes in airfoil geometry or free-stream conditions. These codes are very accurate when compared to strictly 2D data provided by wind-tunnel testing, as will be shown here. Yet, additional challenges must be faced in the analysis of a production aircraft wing section, such as that of the NASA Langley Transport Systems Research Vehicle (TSRV). A primary issue is the sweep theory used to correlate 2D predictions with 3D flight results, accounting for sweep, taper, and finite wing effects. Other computational issues addressed here include the effects of surface roughness of the geometry, cove shape modeling, grid topology, and transition specification. The sensitivity of the flow to changing free-stream conditions is investigated. In addition, the effects of Gurney flaps on the aerodynamic characteristics of the airfoil system are predicted.

  14. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the algorithmic execution of an avionics system guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
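
    The critical-path-based allocation idea can be illustrated with a simple list-scheduling sketch: rank tasks by the length of their longest downstream chain and greedily place each on the processing element that can start it earliest. The toy task graph, costs, and names below are illustrative assumptions, not the report's avionics task graph.

        # Critical-path-driven list scheduling of a task DAG onto processors.
        # Task DAG: name -> (execution time, successors).
        tasks = {
            "sense":     (1, ["estimate", "guidance"]),
            "integrate": (3, ["estimate"]),
            "estimate":  (2, ["guidance"]),
            "guidance":  (4, []),
        }
        preds = {t: [] for t in tasks}
        for t, (_, succs) in tasks.items():
            for s in succs:
                preds[s].append(t)

        def cp_length(name, memo={}):
            """Priority: task time plus the longest downstream chain."""
            if name not in memo:
                time, succs = tasks[name]
                memo[name] = time + max((cp_length(s) for s in succs), default=0)
            return memo[name]

        n_proc = 2
        proc_free = [0.0] * n_proc   # when each processing element is free
        finish = {}                  # task -> finish time
        schedule = []

        # Schedule in topological order, highest critical-path priority first.
        remaining = set(tasks)
        while remaining:
            ready = [t for t in remaining if all(p in finish for p in preds[t])]
            task = max(ready, key=cp_length)
            earliest = max((finish[p] for p in preds[task]), default=0.0)
            proc = min(range(n_proc), key=lambda p: max(proc_free[p], earliest))
            start = max(proc_free[proc], earliest)
            proc_free[proc] = start + tasks[task][0]
            finish[task] = proc_free[proc]
            schedule.append((task, proc, start, finish[task]))
            remaining.remove(task)

        print(schedule)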

  15. Computer architecture for efficient algorithmic executions in real-time systems: new technology for avionics systems and advanced space vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, C.C.; Youngblood, J.N.; Saha, A.

    1987-12-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the algorithmic execution of an avionics system guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  16. Computational 3-D Model of the Human Respiratory System

    EPA Science Inventory

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  17. Large-scale data integration framework provides a comprehensive view on glioblastoma multiforme.

    PubMed

    Ovaska, Kristian; Laakso, Marko; Haapa-Paananen, Saija; Louhimo, Riku; Chen, Ping; Aittomäki, Viljami; Valo, Erkka; Núñez-Fontarnau, Javier; Rantanen, Ville; Karinen, Sirkku; Nousiainen, Kari; Lahesmaa-Korpinen, Anna-Maria; Miettinen, Minna; Saarinen, Lilli; Kohonen, Pekka; Wu, Jianmin; Westermarck, Jukka; Hautaniemi, Sampsa

    2010-09-07

    Coordinated efforts to collect large-scale data sets provide a basis for systems-level understanding of complex diseases. In order to translate these fragmented and heterogeneous data sets into knowledge and medical benefits, advanced computational methods for data analysis, integration and visualization are needed. We introduce a novel data integration framework, Anduril, for translating fragmented large-scale data into testable predictions. The Anduril framework allows rapid integration of heterogeneous data with state-of-the-art computational methods and existing knowledge in bio-databases. Anduril automatically generates thorough summary reports and a website that shows the most relevant features of each gene at a glance, allows sorting of data based on different parameters, and provides direct links to more detailed data on genes, transcripts or genomic regions. Anduril is open-source; all methods and documentation are freely available. We have integrated multidimensional molecular and clinical data from 338 subjects having glioblastoma multiforme, one of the deadliest and most poorly understood cancers, using Anduril. The central objective of our approach is to identify genetic loci and genes that have significant survival effect. Our results suggest several novel genetic alterations linked to glioblastoma multiforme progression and, more specifically, reveal Moesin as a novel glioblastoma multiforme-associated gene that has a strong survival effect and whose depletion in vitro significantly inhibited cell proliferation. All analysis results are available as a comprehensive website. Our results demonstrate that integrated analysis and visualization of multidimensional and heterogeneous data by Anduril enables drawing conclusions on functional consequences of large-scale molecular data. Many of the identified genetic loci and genes having significant survival effect have not been reported earlier in the context of glioblastoma multiforme. Thus, in addition to generally applicable novel methodology, our results provide several glioblastoma multiforme candidate genes for further studies. Anduril is available at http://csbi.ltdk.helsinki.fi/anduril/, and the glioblastoma multiforme analysis results are available at http://csbi.ltdk.helsinki.fi/anduril/tcga-gbm/.

  18. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
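
    As a sense of scale for the medium-resolution case study, a topic-modeling pass over a corpus of abstracts takes only a few lines with standard tools. The sketch below uses scikit-learn's LDA on a tiny invented corpus; the documents, topic count, and parameter choices are placeholders, not the case study's data or settings.

        # Topic-modeling sketch: bag-of-words + latent Dirichlet allocation.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        abstracts = [
            "hydraulic fracturing water use and wastewater management",
            "community perceptions of energy development and land use",
            "life cycle water intensity of shale gas production",
        ]

        # Bag-of-words representation of the abstracts.
        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(abstracts)

        # Fit a small topic model; n_components would be tuned on a real corpus.
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        doc_topics = lda.fit_transform(X)

        # Top terms per topic reveal thematic structure (and its change over
        # time, if the model is fit on time-sliced subsets of the corpus).
        terms = vec.get_feature_names_out()
        for k, weights in enumerate(lda.components_):
            top = [terms[i] for i in weights.argsort()[::-1][:5]]
            print(f"topic {k}:", ", ".join(top))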

  19. Predicting reading comprehension academic achievement in late adolescents with velo-cardio-facial (22q11.2 deletion) syndrome (VCFS): A longitudinal study

    PubMed Central

    Antshel, Kevin M.; Hier, Bridget O.; Fremont, Wanda; Faraone, Stephen V.; Kates, Wendy R.

    2015-01-01

    Background: The primary objective of the current study was to examine the childhood predictors of adolescent reading comprehension in velo-cardio-facial syndrome (VCFS). Although much research has focused on mathematics skills among individuals with VCFS, no studies have examined predictors of reading comprehension. Methods: 69 late adolescents with VCFS, 23 siblings of youth with VCFS and 30 community controls participated in a longitudinal research project and had repeat neuropsychological test batteries and psychiatric evaluations every 3 years. The Wechsler Individual Achievement Test – 2nd edition (WIAT-II) Reading Comprehension subtest served as our primary outcome variable. Results: Consistent with previous research, children and adolescents with VCFS had mean reading comprehension scores on the WIAT-II which were approximately two standard deviations below the mean and word reading scores approximately one standard deviation below the mean. A more novel finding is that relative to both control groups, individuals with VCFS demonstrated a longitudinal decline in reading comprehension abilities yet a slight increase in word reading abilities. In the combined control sample, WISC-III FSIQ, WIAT-II Word Reading, WISC-III Vocabulary and CVLT-C List A Trial 1 accounted for 75% of the variance in Time 3 WIAT-II Reading Comprehension scores. In the VCFS sample, WISC-III FSIQ, BASC-Teacher Aggression, CVLT-C Intrusions, Tower of London, Visual Span Backwards, WCST non-perseverative errors, WIAT-II Word Reading and WISC-III Freedom from Distractibility index accounted for 85% of the variance in Time 3 WIAT-II Reading Comprehension scores. A principal component analysis with promax rotation computed on the statistically significant Time 1 predictor variables in the VCFS sample resulted in three factors: Word reading decoding / Interference control, Self-Control / Self-Monitoring and Working Memory. Conclusions: Childhood predictors of late adolescent reading comprehension in VCFS differ in some meaningful ways from predictors in the non-VCFS population. These results offer some guidance for how best to consider intervention efforts to improve reading comprehension in the VCFS population. PMID:24861691

  20. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  1. Image-Based Macro-Micro Finite Element Models of a Canine Femur with Implant Design Implications

    NASA Astrophysics Data System (ADS)

    Ghosh, Somnath; Krishnan, Ganapathi; Dyce, Jonathan

    2006-06-01

    In this paper, a comprehensive model of a bone-cement-implant assembly is developed for a canine cemented femoral prosthesis system. Various steps in this development entail profiling the canine femur contours by computed tomography (CT) scanning, computer aided design (CAD) reconstruction of the canine femur from CT images, CAD modeling of the implant from implant blueprints, and CAD modeling of the interface cement. Finite element analysis of the macroscopic assembly is conducted for stress analysis in individual components of the system, accounting for variation in density and material properties in the porous bone material. A sensitivity analysis is conducted with the macroscopic model to investigate the effect of implant design variables on the stress distribution in the assembly. Subsequently, rigorous microstructural analysis of the bone incorporating the morphological intricacies is conducted. Various steps in this development include acquisition of the bone microstructural data from histological serial sectioning, stacking of sections to obtain 3D renderings of void distributions, microstructural characterization and determination of properties and, finally, microstructural stress analysis using a 3D Voronoi cell finite element method. Generation of the simulated microstructure and analysis by the 3D Voronoi cell finite element model provides a new way of modeling complex microstructures and correlating them to morphological characteristics. An inverse calculation of the material parameters of bone, combining macroscopic experiments with microstructural characterization and analysis, provides a new approach to evaluating properties without having to do experiments at this scale. Finally, the microstructural stresses in the femur are computed using the 3D Voronoi cell finite element method to study the stress distribution at the scale of the bone porosity. A significant difference is observed between the macroscopic stresses and the peak microscopic stresses at different locations.

  2. The Relation between Thematic Role Computing and Semantic Relatedness Processing during On-Line Sentence Comprehension

    PubMed Central

    Li, Xiaoqing; Zhao, Haiyan; Lu, Yong

    2014-01-01

    Sentence comprehension involves timely computing different types of relations between its verbs and noun arguments, such as morphosyntactic, semantic, and thematic relations. Here, we used EEG technique to investigate the potential differences in thematic role computing and lexical-semantic relatedness processing during on-line sentence comprehension, and the interaction between these two types of processes. Mandarin Chinese sentences were used as materials. The basic structure of those sentences is “Noun+Verb+‘le’+a two-character word”, with the Noun being the initial argument. The verb disambiguates the initial argument as an agent or a patient. Meanwhile, the initial argument and the verb are highly or lowly semantically related. The ERPs at the verbs revealed that: relative to the agent condition, the patient condition evoked a larger N400 only when the argument and verb were lowly semantically related; however, relative to the high-relatedness condition, the low-relatedness condition elicited a larger N400 regardless of the thematic relation; although both thematic role variation and semantic relatedness variation elicited N400 effects, the N400 effect elicited by the former was broadly distributed and reached maximum over the frontal electrodes, and the N400 effect elicited by the latter had a posterior distribution. In addition, the brain oscillations results showed that, although thematic role variation (patient vs. agent) induced power decreases around the beta frequency band (15–30 Hz), semantic relatedness variation (low-relatedness vs. high-relatedness) induced power increases in the theta frequency band (4–7 Hz). These results suggested that, in the sentence context, thematic role computing is modulated by the semantic relatedness between the verb and its argument; semantic relatedness processing, however, is in some degree independent from the thematic relations. Moreover, our results indicated that, during on-line sentence comprehension, thematic role computing and semantic relatedness processing are mediated by distinct neural systems. PMID:24755643

  3. Student Use of Physics to Make Sense of Incomplete but Functional VPython Programs in a Lab Setting

    NASA Astrophysics Data System (ADS)

    Weatherford, Shawn A.

    2011-12-01

    Computational activities in Matter & Interactions, an introductory calculus-based physics course, have the instructional goal of providing students with the experience of applying the same set of a small number of fundamental principles to model a wide range of physical systems. However there are significant instructional challenges for students to build computer programs under limited time constraints, especially for students who are unfamiliar with programming languages and concepts. Prior attempts at designing effective computational activities were successful at having students ultimately build working VPython programs under the tutelage of experienced teaching assistants in a studio lab setting. A pilot study revealed that students who completed these computational activities had significant difficultly repeating the exact same tasks and further, had difficulty predicting the animation that would be produced by the example program after interpreting the program code. This study explores the interpretation and prediction tasks as part of an instructional sequence where students are asked to read and comprehend a functional, but incomplete program. Rather than asking students to begin their computational tasks with modifying program code, we explicitly ask students to interpret an existing program that is missing key lines of code. The missing lines of code correspond to the algebraic form of fundamental physics principles or the calculation of forces which would exist between analogous physical objects in the natural world. Students are then asked to draw a prediction of what they would see in the simulation produced by the VPython program and ultimately run the program to evaluate the students' prediction. This study specifically looks at how the participants use physics while interpreting the program code and creating a whiteboard prediction. This study also examines how students evaluate their understanding of the program and modification goals at the beginning of the modification task. While working in groups over the course of a semester, study participants were recorded while they completed three activities using these incomplete programs. Analysis of the video data showed that study participants had little difficulty interpreting physics quantities, generating a prediction, or determining how to modify the incomplete program. Participants did not base their prediction solely from the information from the incomplete program. When participants tried to predict the motion of the objects in the simulation, many turned to their knowledge of how the system would evolve if it represented an analogous real-world physical system. For example, participants attributed the real-world behavior of springs to helix objects even though the program did not include calculations for the spring to exert a force when stretched. Participants rarely interpreted lines of code in the computational loop during the first computational activity, but this changed during latter computational activities with most participants using their physics knowledge to interpret the computational loop. Computational activities in the Matter & Interactions curriculum were revised in light of these findings to include an instructional sequence of tasks to build a comprehension of the example program. The modified activities also ask students to create an additional whiteboard prediction for the time-evolution of the real-world phenomena which the example program will eventually model. 
This thesis shows how comprehension tasks identified by Palinscar and Brown (1984) as effective in improving reading comprehension are also effective in helping students apply their physics knowledge to interpret a computer program that attempts to model a real-world phenomenon, and to identify errors in their understanding of the use, or omission, of fundamental physics principles in a computational model.
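
    For context, the style of incomplete program described above can be sketched as follows: a ball hanging from a helix, with the Matter & Interactions momentum-principle loop in place and the spring-force lines, the kind of physics students were asked to supply or interpret, marked explicitly. Object names and values are illustrative assumptions, not the study's actual activity code.

        # Ball-on-spring sketch in the Matter & Interactions style.
        # Runs until the browser window is closed.
        from vpython import sphere, helix, vector, rate, mag, norm

        ceiling = vector(0, 0.5, 0)
        ball = sphere(pos=vector(0, -0.1, 0), radius=0.03)
        ball.m = 0.3                      # mass in kg (custom attribute)
        ball.p = vector(0, 0, 0)          # initial momentum
        spring = helix(pos=ceiling, axis=ball.pos - ceiling)

        k, L0 = 12.0, 0.3                 # stiffness (N/m), relaxed length (m)
        g = vector(0, -9.8, 0)
        dt = 0.001

        while True:
            rate(1000)
            # Spring force: without these two lines the helix stretches but
            # exerts no force, as study participants noticed.
            stretch = mag(ball.pos - ceiling) - L0
            F_spring = -k * stretch * norm(ball.pos - ceiling)
            # Momentum principle and position update.
            F_net = F_spring + ball.m * g
            ball.p = ball.p + F_net * dt
            ball.pos = ball.pos + (ball.p / ball.m) * dt
            spring.axis = ball.pos - ceiling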

  4. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
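
    The core computation described above, the average tendency to fixate the target as a function of time and a decision moment derived from it, can be sketched as below. The synthetic data, the smoothing window, and the threshold rule are assumptions for illustration; the authors' exact criterion is not reproduced here.

        # Sketch: per-sample fixation tendency averaged over trials, plus a
        # simple threshold rule standing in for the "decision moment".
        import numpy as np

        rng = np.random.default_rng(1)
        n_trials, n_samples = 40, 300       # e.g. 300 samples at 250 Hz

        # Synthetic data: +1 = target fixated, -1 = competitor. The drift
        # mimics a growing preference for the target during the sentence.
        drift = np.clip(np.linspace(-0.2, 0.9, n_samples), 0.0, None)
        fixations = np.where(
            rng.random((n_trials, n_samples)) < 0.5 + 0.5 * drift, 1, -1)

        # Average tendency to fixate the target, per time sample.
        tendency = fixations.mean(axis=0)

        # Decision moment: first time the smoothed tendency passes a threshold.
        window = 25
        smooth = np.convolve(tendency, np.ones(window) / window, mode="same")
        above = smooth > 0.5
        decision_idx = int(np.argmax(above)) if above.any() else None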

  5. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  6. A comprehensive comparison of tools for differential ChIP-seq analysis

    PubMed Central

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland

    2016-01-01

    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273

  7. PATRIC: the Comprehensive Bacterial Bioinformatics Resource with a Focus on Human Pathogenic Species

    PubMed Central

    Gillespie, Joseph J.; Wattam, Alice R.; Cammer, Stephen A.; Gabbard, Joseph L.; Shukla, Maulik P.; Dalay, Oral; Driscoll, Timothy; Hix, Deborah; Mane, Shrinivasrao P.; Mao, Chunhong; Nordberg, Eric K.; Scott, Mark; Schulman, Julie R.; Snyder, Eric E.; Sullivan, Daniel E.; Wang, Chunxia; Warren, Andrew; Williams, Kelly P.; Xue, Tian; Seung Yoo, Hyun; Zhang, Chengdong; Zhang, Yan; Will, Rebecca; Kenyon, Ronald W.; Sobral, Bruno W.

    2011-01-01

    Funded by the National Institute of Allergy and Infectious Diseases, the Pathosystems Resource Integration Center (PATRIC) is a genomics-centric relational database and bioinformatics resource designed to assist scientists in infectious-disease research. Specifically, PATRIC provides scientists with (i) a comprehensive bacterial genomics database, (ii) a plethora of associated data relevant to genomic analysis, and (iii) an extensive suite of computational tools and platforms for bioinformatics analysis. While the primary aim of PATRIC is to advance the knowledge underlying the biology of human pathogens, all publicly available genome-scale data for bacteria are compiled and continually updated, thereby enabling comparative analyses to reveal the basis for differences between infectious, free-living, and commensal species. Herein we summarize the major features available at PATRIC, dividing the resources into two major categories: (i) organisms, genomes, and comparative genomics and (ii) recurrent integration of community-derived associated data. Additionally, we present two experimental designs typical of bacterial genomics research and report on the execution of both projects using only PATRIC data and tools. These applications encompass a broad range of the data and analysis tools available, illustrating practical uses of PATRIC for the biologist. Finally, a summary of PATRIC's outreach activities, collaborative endeavors, and future research directions is provided. PMID:21896772

  8. Comprehensive analysis of a medication dosing error related to CPOE.

    PubMed

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and a usability inspection of the ordering system's interface. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in the interaction among human and system agents, faults that methods limited in scope to a single analytical domain would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  9. Evolution, Nature, Uses and Issues in the Creation of Local School District Comprehensive Information Systems.

    ERIC Educational Resources Information Center

    Hathaway, Walter E.

    Efficient and convenient comprehensive information systems, long kept from coming into being by a variety of obstacles, are now made possible by the concept of distributive processing and the technology of micro- and mini-computer networks. Such systems can individualize instruction, group students efficiently, cut administrative costs, streamline…

  10. "To Gloss or Not To Gloss": An Investigation of Reading Comprehension Online.

    ERIC Educational Resources Information Center

    Lomika, Lara L.

    1998-01-01

    Investigated effects of multimedia reading software on reading comprehension. Twelve college students enrolled in a second semester French course were instructed to think aloud during reading of text on the computer screen. They read text under one of three conditions: full glossing, limited glossing, no glossing. Suggests computerized reading…

  11. Cognitive Load for Configuration Comprehension in Computer-Supported Geometry Problem Solving: An Eye Movement Perspective

    ERIC Educational Resources Information Center

    Lin, John Jr-Hung; Lin, Sunny S. J.

    2014-01-01

    The present study investigated (a) whether the perceived cognitive load was different when geometry problems with various levels of configuration comprehension were solved and (b) whether eye movements in comprehending geometry problems showed sources of cognitive loads. In the first investigation, three characteristics of geometry configurations…

  12. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  13. Text Simplification and Comprehensible Input: A Case for an Intuitive Approach

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, David; McNamara, Danielle S.

    2012-01-01

    Texts are routinely simplified to make them more comprehensible for second language learners. However, the effects of simplification upon the linguistic features of texts remain largely unexplored. Here we examine the effects of one type of text simplification: intuitive text simplification. We use the computational tool, Coh-Metrix, to examine…

  14. Who Benefits from a Low versus High Guidance CSCL Script and Why?

    ERIC Educational Resources Information Center

    Mende, Stephan; Proske, Antje; Körndle, Hermann; Narciss, Susanne

    2017-01-01

    Computer-supported collaborative learning (CSCL) scripts can foster learners' deep text comprehension. However, this depends on (a) the extent to which the learning activities targeted by a script promote deep text comprehension and (b) whether the guidance level provided by the script is adequate to induce the targeted learning activities…

  15. The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology

    ERIC Educational Resources Information Center

    Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo

    2014-01-01

    The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…

  16. Cognitive correlates of pragmatic language comprehension in adult traumatic brain injury: A systematic review and meta-analyses.

    PubMed

    Rowley, Dane A; Rogish, Miles; Alexander, Timothy; Riggs, Kevin J

    2017-01-01

    Effective pragmatic comprehension of language is critical for successful communication and interaction, but this ability is routinely impaired following Traumatic Brain Injury (TBI) (1,2). Individual studies have investigated the cognitive domains associated with impaired pragmatic comprehension, but there remains little understanding of the relative importance of these domains in contributing to pragmatic comprehension impairment following TBI. This paper presents a systematic meta-analytic review of the observed correlations between pragmatic comprehension and cognitive processes following TBI. Five meta-analyses were computed, which quantified the relationship between pragmatic comprehension and five key cognitive constructs (declarative memory; working memory; attention; executive functions; social cognition). Significant moderate-to-strong correlations were found between all cognitive measures and pragmatic comprehension, with declarative memory the strongest correlate. Thus, our findings indicate that pragmatic comprehension in TBI is associated with an array of domain-general cognitive processes, and as such, deficits in these cognitive domains may underlie pragmatic comprehension difficulties following TBI. The clinical implications of these findings are discussed.
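
    The pooling step behind such meta-analyses of correlations is typically performed by Fisher z-transforming each study-level r, averaging with inverse-variance weights, and back-transforming. A minimal fixed-effect sketch in Python (the study values below are invented, and the paper's random-effects and moderator analyses are not reproduced):

        import math

        def pool_correlations(studies):
            """Inverse-variance pooling of correlations via Fisher z.

            studies: list of (r, n) pairs, one per study.
            Returns the pooled correlation and its 95% CI.
            """
            zs = [(0.5 * math.log((1 + r) / (1 - r)), n) for r, n in studies]
            weights = [n - 3 for _, n in zs]  # var(z) = 1 / (n - 3)
            z_bar = sum(w * z for (z, _), w in zip(zs, weights)) / sum(weights)
            se = math.sqrt(1.0 / sum(weights))
            to_r = math.tanh  # inverse of the Fisher z-transform
            return to_r(z_bar), (to_r(z_bar - 1.96 * se), to_r(z_bar + 1.96 * se))

        # Hypothetical study-level correlations between declarative memory
        # and pragmatic comprehension, as (r, sample size) pairs:
        print(pool_correlations([(0.55, 40), (0.62, 25), (0.48, 33)]))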

  17. Health Information Technology and Physician-Patient Interactions: Impact of Computers on Communication during Outpatient Primary Care Visits

    PubMed Central

    Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard

    2005-01-01

    Objective: The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. Design: This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) were collected after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Results: Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01–2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01–2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05–2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06–2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28–2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. Conclusion: The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits. PMID:15802484
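
    The odds ratios reported above can be illustrated with the standard Wald computation on a 2x2 table. A sketch with invented counts (the study itself used regression modeling, so this is only the textbook calculation, not a reanalysis):

        import math

        def odds_ratio(a, b, c, d):
            """Odds ratio and 95% Wald CI for a 2x2 table:
                          satisfied   not satisfied
            post-HIT          a             b
            baseline          c             d
            """
            or_ = (a * d) / (b * c)
            se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo = math.exp(math.log(or_) - 1.96 * se_log)
            hi = math.exp(math.log(or_) + 1.96 * se_log)
            return or_, (lo, hi)

        # Hypothetical counts for "satisfied with visit overall":
        print(odds_ratio(120, 40, 100, 53))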

  18. Computer program for analysis of split-Stirling-cycle cryogenic coolers

    NASA Technical Reports Server (NTRS)

    Brown, M. T.; Russo, S. C.

    1983-01-01

    A computer program for predicting the detailed thermodynamic performance of split-Stirling-cycle refrigerators has been developed. The mathematical model includes the refrigerator cold head, free-displacer/regenerator, gas transfer line, and provision for modeling a mechanical or thermal compressor. To allow for dynamic processes (such as aerodynamic friction and heat transfer), temperature, pressure, and mass flow rate are varied by sub-dividing the refrigerator into an appropriate number of fluid and structural control volumes. Of special importance to the modeling of cryogenic coolers is the inclusion of real gas properties and the allowance for variation of thermo-physical properties, such as thermal conductivities, specific heats and viscosities, with temperature and/or pressure. The resulting model, therefore, comprehensively simulates the split-cycle cooler both spatially and temporally by reflecting the effects of dynamic processes and real material properties.

  19. Fitting Computers into the Curriculum.

    ERIC Educational Resources Information Center

    Rodgers, Robert J.; And Others

    This paper provides strategies and insights that should be weighed and perhaps included in any proposal for integrating computers into a comprehensive school curriculum. The strategies include six basic stages: Initiation, Needs Assessment, Master Plan, Logistic-Specifics, Implementation, and Evaluation. The New Brunswick (New Jersey) Public…

  20. 75 FR 70899 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection... Annual Burden Hours: 2,952. Public Computer Center Reports (Quarterly and Annually) Number of Respondents... specific to Infrastructure and Comprehensive Community Infrastructure, Public Computer Center, and...

  1. The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences

    USDA-ARS?s Scientific Manuscript database

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...

  2. NASA aerodynamics program

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Schairer, Edward; Hicks, Gary; Wander, Stephen; Blankson, Isiaiah; Rose, Raymond; Olson, Lawrence; Unger, George

    1990-01-01

    Presented here is a comprehensive review of the following aerodynamics elements: computational methods and applications, computational fluid dynamics (CFD) validation, transition and turbulence physics, numerical aerodynamic simulation, drag reduction, test techniques and instrumentation, configuration aerodynamics, aeroacoustics, aerothermodynamics, hypersonics, subsonic transport/commuter aviation, fighter/attack aircraft and rotorcraft.

  3. RAP: RNA-Seq Analysis Pipeline, a new cloud-based NGS web application

    PubMed Central

    2015-01-01

    Background The study of RNA has been dramatically improved by the introduction of Next Generation Sequencing platforms, which allow massive and cheap sequencing of selected RNA fractions and also provide information on strand orientation (RNA-Seq). The complexity of transcriptomes and of their regulatory pathways makes RNA-Seq one of the most complex fields of NGS application, addressing several aspects of the expression process (e.g. identification and quantification of expressed genes and transcripts, alternative splicing and polyadenylation, fusion genes and trans-splicing, and post-transcriptional events). Moreover, the huge volume of data generated by NGS platforms introduces unprecedented computational and technological challenges for efficiently analyzing and storing sequence data and results. Methods In order to provide researchers with an effective and friendly resource for analyzing RNA-Seq data, we present here RAP (RNA-Seq Analysis Pipeline), a cloud computing web application implementing a complete but modular analysis workflow. This pipeline integrates both state-of-the-art bioinformatics tools for RNA-Seq analysis and in-house developed scripts to offer the user a comprehensive strategy for data analysis. RAP is able to perform quality checks (adopting FastQC and NGS QC Toolkit), identify and quantify expressed genes and transcripts (with Tophat, Cufflinks and HTSeq), and detect alternative splicing events (using SpliceTrap) and chimeric transcripts (with ChimeraScan). The pipeline can also identify splicing junctions and constitutive or alternative polyadenylation sites (implementing custom analysis modules) and call statistically significant differences in gene and transcript expression, splicing pattern and polyadenylation site usage (using Cuffdiff2 and DESeq). Results Through a user-friendly web interface, the RAP workflow can be suitably customized by the user, and it is automatically executed on our cloud computing environment. This strategy allows access to bioinformatics tools and computational resources without requiring specific bioinformatics and IT skills. RAP provides a set of tabular and graphical results that can be helpful for browsing, filtering and exporting analyzed data according to the user's needs. PMID:26046471

  4. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis

    PubMed Central

    Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon

    2018-01-01

    Background With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. Objective This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. Methods We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. Results The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. Conclusions In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. PMID:29305341

  5. Computer Ratio and Student Achievement in Reading and Math in a North Carolina School District

    ERIC Educational Resources Information Center

    Preswood, Erica

    2017-01-01

    This longitudinal research project explored the relationship between a 1:1 computing initiative and student achievement on the North Carolina End of Grade Reading Comprehension and Math tests in the study school district. The purpose of this research study was to determine if the implementation of a 1:1 computing initiative impacted student…

  6. Effects of Static Visuals and Computer-Generated Animations in Facilitating Immediate and Delayed Achievement in the EFL Classroom

    ERIC Educational Resources Information Center

    Lin, Huifen; Chen, Tsuiping; Dwyer, Francis M.

    2006-01-01

    The purpose of this experimental study was to compare the effects of using static visuals versus computer-generated animation to enhance learners' comprehension and retention of a content-based lesson in a computer-based learning environment for learning English as a foreign language (EFL). Fifty-eight students from two EFL reading sections were…

  7. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    ERIC Educational Resources Information Center

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays there is a growing need for highly qualified computer science educators in modern educational environments. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  8. Computing Support for Basic Research in Perception and Cognition

    DTIC Science & Technology

    1988-12-07

    hearing aids and cochlear implants, this suggests that certain types of proposed coding schemes, specifically those employing periodicity tuning in... developing a computer model of the interaction of declarative and procedural knowledge in skill acquisition. In the Visual Psychophysics Laboratory... Psycholinguistics Laboratory a computer model of text comprehension and recall has been constructed and several experiments have been completed that verify basic

  9. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  10. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  11. Assessment of technical condition of concrete pavement by the example of district road

    NASA Astrophysics Data System (ADS)

    Linek, M.; Nita, P.; Żebrowski, W.; Wolka, P.

    2018-05-01

    The article presents a comprehensive assessment of the condition of a concrete pavement. The analyses covered a district road located in the Swietokrzyskie province, in use for 11 years. Comparative analyses were conducted twice. The first analysis was carried out after 9 years of pavement operation, in 2015; in order to assess the extent of pavement degradation, the tests were repeated in 2017. Within the scope of field research, the traffic intensity within the analysed road section was determined. Visual assessment of pavement condition was conducted according to the guidelines included in SOSN-B. Visual assessment can be extended by ground-penetrating radar measurements, which allow a comprehensive assessment of structural changes across the pavement's entire thickness and length. The assessment also included performance parameters, i.e. pavement regularity, surface roughness and texture. Extending the test results with observations of the internal structure of the concrete composite by means of a scanning electron microscope allows the parameters of the internal structure of the hardened concrete to be assessed, and supplementing these observations with computed tomography scans provides comprehensive information on possible discontinuities in the composite structure. From the analysis of the obtained results, conclusions concerning the condition of the analysed pavement were drawn. It was determined that the pavement exhibits high performance parameters, its condition is good, and it does not require any repairs. Maintenance treatment was suggested in order to extend the period of proper operation of the analysed pavement.

  12. Mooring Mechanics. A Comprehensive Computer Study. Volume II. Three Dimensional Dynamic Analysis of Moored and Drifting Buoy Systems

    DTIC Science & Technology

    1976-12-01

    (Excerpt from the report's three-dimensional dynamic analysis: relations between the Earth-frame and body-frame velocities of points p and w on the moored system, and an added-mass force term. The equations themselves are not recoverable from the garbled OCR of this record.)

  13. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.
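
    TIPA's exact algorithm is given in the paper; purely as an illustration of the underlying task, the phase before and after a stimulus can be estimated by least-squares fitting of a sinusoid with a known period (synthetic data below; this is not the TIPA procedure itself):

        import numpy as np
        from scipy.optimize import curve_fit

        def fit_phase(t, y, period):
            """Least-squares phase (radians) of a sinusoid with known period."""
            model = lambda t, a, phi, c: a * np.cos(2 * np.pi * t / period - phi) + c
            (a, phi, c), _ = curve_fit(model, t, y, p0=[np.ptp(y) / 2, 0.0, np.mean(y)])
            return phi

        rng = np.random.default_rng(0)
        t = np.arange(0.0, 72.0, 0.5)  # hours
        pre = np.cos(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)
        post = np.cos(2 * np.pi * (t - 2) / 24) + 0.1 * rng.standard_normal(t.size)
        # wrap the phase difference into (-pi, pi] before converting to hours
        dphi = (fit_phase(t, post, 24) - fit_phase(t, pre, 24) + np.pi) % (2 * np.pi) - np.pi
        print(f"estimated shift: {dphi * 24 / (2 * np.pi):.2f} h")  # ~2 h delay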

  14. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As a major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
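
    GelQuant itself is distributed as annotated IGOR Pro macro code; as a rough Python analogue of the basic densitometric step (ROI extraction, flat background subtraction, band integration), with file names, ROI coordinates and the background rule all invented for illustration:

        import numpy as np
        from imageio.v3 import imread  # assumes the gel is saved as a grayscale image

        def band_volume(image_path, rows, cols):
            """Integrated intensity of one band in a rectangular ROI.

            rows, cols: (start, stop) pixel ranges of the ROI. The ROI minimum
            is subtracted as a flat background, a simplification; GelQuant's
            actual background model may differ.
            """
            img = imread(image_path).astype(float)
            roi = img[rows[0]:rows[1], cols[0]:cols[1]]
            roi = roi.max() - roi  # dark bands on a light background become peaks
            return float((roi - roi.min()).sum())

        # Relative purity estimate from two hypothetical bands in one lane:
        target = band_volume("gel.png", (10, 60), (100, 140))
        impurity = band_volume("gel.png", (10, 60), (150, 190))
        print(f"purity ~ {target / (target + impurity):.1%}")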

  15. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
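
    Codes such as OCA-P and VISA-II model transients, flaw distributions and embrittlement correlations in detail; the probabilistic core, however, can be illustrated as a Monte Carlo comparison of applied stress intensity against a fluence-degraded toughness. A toy sketch (all distributions and the degradation law are invented for illustration only):

        import numpy as np

        rng = np.random.default_rng(1)

        def failure_probability(fluence, n=200_000):
            """Toy estimate of P(K_applied > K_Ic) at a given neutron fluence."""
            k_ic = rng.normal(200.0, 20.0, n)        # unirradiated toughness, MPa*m^0.5
            k_ic *= np.exp(-0.35 * fluence)          # hypothetical embrittlement law
            k_applied = rng.lognormal(4.0, 0.3, n)   # applied K from sampled transients
            return float(np.mean(k_applied > k_ic))

        for f in (0.0, 1.0, 2.0, 3.0):               # fluence in arbitrary units
            print(f, failure_probability(f))         # failure probability grows with damage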

  16. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  17. Full-Spectrum-Analysis Isotope ID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Harding, Lee; Thoreson, Gregory G.

    2017-06-28

    FSAIsotopeID analyzes gamma ray spectra to identify radioactive isotopes (radionuclides). The algorithm fits the entire spectrum with combinations of pre-computed templates for a comprehensive set of radionuclides with varying thicknesses and compositions of shielding materials. The isotope identification algorithm is suitable for the analysis of spectra collected by gamma-ray sensors ranging from medium-resolution detectors, such as NaI, to high-resolution detectors, such as HPGe. In addition to analyzing static measurements, the isotope identification algorithm is applied to radiation search applications. The search subroutine maintains a running background spectrum that is passed to the isotope identification algorithm, and it also selects temporal integration periods that optimize responsiveness and sensitivity. Gain stabilization is supported for both types of applications.
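
    The fitting step described here amounts to expressing the measured spectrum as a non-negative combination of precomputed nuclide/shielding templates, which scipy's NNLS solver illustrates compactly (the template library below is random stand-in data, not FSAIsotopeID's actual response functions):

        import numpy as np
        from scipy.optimize import nnls

        channels, n_templates = 1024, 5
        rng = np.random.default_rng(7)
        templates = rng.random((channels, n_templates))  # stand-in template library
        truth = np.array([50.0, 0.0, 12.0, 0.0, 0.0])    # nuclides 0 and 2 present
        spectrum = rng.poisson(templates @ truth).astype(float)

        coeffs, _ = nnls(templates, spectrum)            # non-negative least squares
        detected = [i for i, c in enumerate(coeffs) if c > 1.0]  # crude threshold
        print(coeffs.round(1), detected)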

  18. Milestones in Rotorcraft Aeromechanics

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2011-01-01

    The subject of this paper is milestones in rotorcraft aeromechanics. Aeromechanics covers much of what the engineer needs: performance, loads, vibration, stability, flight dynamics, noise. These topics cover many of the key performance attributes, and many of the often-encountered problems in rotorcraft designs. A milestone is a critical achievement, a turning point, an event marking a significant change or stage in development. The milestones identified and discussed include the beginnings of aeromechanics with autogyro analysis, ground resonance, aeromechanics books, unsteady aerodynamics and airloads, nonuniform inflow and wakes, beams and dynamics, comprehensive analysis, computational fluid dynamics, and rotor airloads tests. The focus on milestones limits the scope of the history, but allows the author to acknowledge his choices for key steps in the development of the science and engineering of rotorcraft.

  19. Analysis, calculation and utilization of the k-balance attribute in interdependent networks

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Li, Qing; Wang, Dan; Xu, Mingwei

    2018-05-01

    Interdependent networks, where two networks depend on each other, are becoming more and more significant in modern systems. From previous work, it can be concluded that interdependent networks are more vulnerable than a single network, so their robustness deserves special attention. In this paper, we propose a metric of robustness from a new perspective: balance. First, we define the balance-coefficient of the interdependent system. Based on precise analysis and derivation, we prove several significant theorems and provide an efficient algorithm to compute the balance-coefficient. Finally, we propose an optimal solution that reduces the balance-coefficient in order to enhance the robustness of a given system. Comprehensive experiments confirm the efficiency of our algorithms.

  20. Data consistency checks for Jefferson Lab Experiment E00-002

    NASA Astrophysics Data System (ADS)

    Telfeyan, John; Niculescu, Gabriel; Niculescu, Ioana

    2006-10-01

    Jefferson Lab experiment E00-002 aims to measure inclusive electron-proton and electron-deuteron scattering cross sections at low Q squared and moderately low Bjorken x. Data in this kinematic region will further our understanding of the transition between the perturbative and non-perturbative regimes of Quantum Chromodynamics (QCD). As part of the data analysis effort underway at James Madison University (JMU), a comprehensive set of checks and tests was implemented. These tests ensure the quality and consistency of the experimental data, as well as providing, where appropriate, correction factors between the experimental apparatus as used and its idealized computer-simulated representation. This contribution will outline this testing procedure as implemented in the JMU analysis, highlighting the most important features and results.

  1. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS), phase 1

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The large-signal behaviors of a regulator depend largely on the type of power circuit topology and control. Thus, for maximum flexibility, it is best to develop models for each functional block as independent modules. A regulator can then be configured by collecting appropriate pre-defined modules for each functional block. In order to complete the component model generation for a comprehensive spacecraft power system, the following modules were developed: solar array switching unit and control; shunt regulators; and battery discharger. The capability of each module is demonstrated using a simplified Direct Energy Transfer (DET) system. Large-signal behaviors of solar array power systems were analyzed. Stability of the solar array system operating points with a nonlinear load is analyzed. The state-plane analysis illustrates trajectories of the system operating point under various conditions. Stability and transient responses of the system operating near the solar array's maximum power point are also analyzed. The solar array system mode of operation is described using the DET spacecraft power system. The DET system is simulated for various operating conditions. Transfer of the software program CAMAPPS (Computer Aided Modeling and Analysis of Power Processing Systems) to NASA/GSFC (Goddard Space Flight Center) was accomplished.

  2. Differential gene and transcript expression analysis of RNA-seq experiments with TopHat and Cufflinks

    PubMed Central

    Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior

    2012-01-01

    Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036
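
    The protocol is run from the command line; a condensed sketch of its core steps for a two-condition comparison follows (file names are placeholders, and the published protocol adds cuffmerge and CummeRbund visualization steps omitted here):

        import subprocess

        steps = [
            # align reads for each condition against an annotated genome
            ["tophat", "-p", "8", "-G", "genes.gtf", "-o", "tophat_C1",
             "genome", "C1_R1.fastq", "C1_R2.fastq"],
            ["tophat", "-p", "8", "-G", "genes.gtf", "-o", "tophat_C2",
             "genome", "C2_R1.fastq", "C2_R2.fastq"],
            # assemble transcripts per condition
            ["cufflinks", "-p", "8", "-o", "clout_C1", "tophat_C1/accepted_hits.bam"],
            ["cufflinks", "-p", "8", "-o", "clout_C2", "tophat_C2/accepted_hits.bam"],
            # test genes and transcripts for differential expression
            ["cuffdiff", "-o", "diff_out", "-p", "8", "genes.gtf",
             "tophat_C1/accepted_hits.bam", "tophat_C2/accepted_hits.bam"],
        ]
        for cmd in steps:
            subprocess.run(cmd, check=True)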

  3. The aquatic animals' transcriptome resource for comparative functional analysis.

    PubMed

    Chou, Chih-Hung; Huang, Hsi-Yuan; Huang, Wei-Chih; Hsu, Sheng-Da; Hsiao, Chung-Der; Liu, Chia-Yu; Chen, Yu-Hung; Liu, Yu-Chen; Huang, Wei-Yun; Lee, Meng-Lin; Chen, Yi-Chang; Huang, Hsien-Da

    2018-05-09

    Aquatic animals have great economic and ecological importance. Among them, non-model organisms have been studied with regard to eco-toxicity, stress biology, and environmental adaptation. Due to recent advances in next-generation sequencing techniques, large amounts of RNA-seq data for aquatic animals are publicly available. However, no comprehensive resource currently exists for the analysis, unification, and integration of these datasets. This study uses computational approaches to build a new resource of transcriptomic maps for aquatic animals. This aquatic animal transcriptome map database, dbATM, provides de novo transcriptome assemblies, gene annotation and comparative analysis for more than twenty aquatic organisms without a draft genome. To improve assembly quality, three computational tools (Trinity, Oases and SOAPdenovo-Trans) were employed to enhance individual transcriptome assembly, and CAP3 and CD-HIT-EST software were then used to merge these three assembled transcriptomes. In addition, functional annotation analysis provides valuable clues to gene characteristics, including full-length transcript coding regions, conserved domains, gene ontology and KEGG pathways. Furthermore, all aquatic animal genes are provided for comparative genomics tasks such as constructing homologous gene groups, building BLAST databases and performing phylogenetic analysis. In conclusion, we have established a resource for non-model aquatic animals, which are of great economic and ecological importance, providing transcriptomic information including functional annotation and comparative transcriptome analysis. The database is now publicly accessible through the URL http://dbATM.mbc.nctu.edu.tw/ .

  4. On Mediation in Virtual Learning Environments.

    ERIC Educational Resources Information Center

    Davies, Larry; Hassan, W. Shukry

    2001-01-01

    Discusses concepts of mediation and focuses on the importance of implementing comprehensive virtual learning environments. Topics include education and technology as they relate to cultural change, social institutions, the Internet and computer-mediated communication, software design and human-computer interaction, the use of MOOs, and language.…

  5. Comprehensive Thematic T-Matrix Reference Database: A 2015-2017 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-01-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  6. Comprehensive thematic T-matrix reference database: A 2015-2017 update

    NASA Astrophysics Data System (ADS)

    Mishchenko, Michael I.; Zakharova, Nadezhda T.; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-11-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  7. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    PubMed

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-08-18

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing correlations between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
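
    The correlation step can be outlined with networkx: compute a street-network centrality, attach facility counts to nodes, and correlate the two. The toy grid below stands in for the actual Nanjing road network, and the facility counts are invented (the paper additionally uses weighted network kernel density estimation, not reproduced here):

        import random
        import networkx as nx
        from scipy.stats import pearsonr

        G = nx.grid_2d_graph(10, 10)                 # stand-in street graph
        centrality = nx.betweenness_centrality(G)

        random.seed(3)
        facilities = {v: random.randint(0, 3) for v in G.nodes}  # invented counts

        nodes = list(G.nodes)
        r, p = pearsonr([centrality[v] for v in nodes],
                        [facilities[v] for v in nodes])
        print(f"r = {r:.2f}, p = {p:.3f}")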

  8. New method for designing serial resonant power converters

    NASA Astrophysics Data System (ADS)

    Hinov, Nikolay

    2017-12-01

    The current work presents a comprehensive method for the design of series resonant power converters. The method is based on a new, simplified approach to the analysis of this kind of power electronic device: a resonant mode of operation is assumed when deriving the relation between input and output voltage, regardless of the actual operating mode (switching frequency below or above the resonant frequency). This approach is named the `quasiresonant method of analysis', because it treats all operational modes as if they were resonant modes. The error introduced by this hypothesis was estimated and compared against the classic analysis. The quasiresonant method offers two main advantages, speed and ease of design of the presented power circuits, and is hence very useful in practice and in teaching power electronics. Its applicability is demonstrated with mathematical modelling and computer simulation.
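
    For comparison with the quasiresonant simplification, the standard fundamental-harmonic result for a series RLC tank loaded by a resistance R gives a voltage gain of 1/sqrt(1 + Q^2 (f/fr - fr/f)^2), with Q = 2*pi*fr*L/R. A sketch evaluating it (component values are arbitrary examples; this is the classical approximation, not the paper's method):

        import math

        def gain(f, fr, q):
            """|Vout/Vin| of a series RLC tank with resistive load."""
            x = f / fr
            return 1.0 / math.sqrt(1.0 + q**2 * (x - 1.0 / x)**2)

        L, C = 100e-6, 100e-9                        # 100 uH, 100 nF
        fr = 1.0 / (2 * math.pi * math.sqrt(L * C))  # ~50.3 kHz resonance
        for f in (0.8 * fr, fr, 1.2 * fr):
            print(f"{f / 1e3:7.1f} kHz  gain = {gain(f, fr, 3.0):.3f}")  # peaks at fr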

  9. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  10. methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.

    PubMed

    Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia

    2015-09-29

    Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.

  11. Cross-language differences in the brain network subserving intelligible speech.

    PubMed

    Ge, Jianqiao; Peng, Gang; Lyu, Bingjiang; Wang, Yi; Zhuo, Yan; Niu, Zhendong; Tan, Li Hai; Leff, Alexander P; Gao, Jia-Hong

    2015-03-10

    How is language processed in the brain by native speakers of different languages? Is there one brain system for all languages or are different languages subserved by different brain systems? The first view emphasizes commonality, whereas the second emphasizes specificity. We investigated the cortical dynamics involved in processing two very diverse languages: a tonal language (Chinese) and a nontonal language (English). We used functional MRI and dynamic causal modeling analysis to compute and compare brain network models exhaustively with all possible connections among nodes of language regions in temporal and frontal cortex and found that the information flow from the posterior to anterior portions of the temporal cortex was commonly shared by Chinese and English speakers during speech comprehension, whereas the inferior frontal gyrus received neural signals from the left posterior portion of the temporal cortex in English speakers and from the bilateral anterior portion of the temporal cortex in Chinese speakers. Our results revealed that, although speech processing is largely carried out in the common left hemisphere classical language areas (Broca's and Wernicke's areas) and anterior temporal cortex, speech comprehension across different language groups depends on how these brain regions interact with each other. Moreover, the right anterior temporal cortex, which is crucial for tone processing, is equally important as its left homolog, the left anterior temporal cortex, in modulating the cortical dynamics in tone language comprehension. The current study pinpoints the importance of the bilateral anterior temporal cortex in language comprehension that is downplayed or even ignored by popular contemporary models of speech comprehension.

  12. Cross-language differences in the brain network subserving intelligible speech

    PubMed Central

    Ge, Jianqiao; Peng, Gang; Lyu, Bingjiang; Wang, Yi; Zhuo, Yan; Niu, Zhendong; Tan, Li Hai; Leff, Alexander P.; Gao, Jia-Hong

    2015-01-01

    How is language processed in the brain by native speakers of different languages? Is there one brain system for all languages or are different languages subserved by different brain systems? The first view emphasizes commonality, whereas the second emphasizes specificity. We investigated the cortical dynamics involved in processing two very diverse languages: a tonal language (Chinese) and a nontonal language (English). We used functional MRI and dynamic causal modeling analysis to compute and compare brain network models exhaustively with all possible connections among nodes of language regions in temporal and frontal cortex and found that the information flow from the posterior to anterior portions of the temporal cortex was commonly shared by Chinese and English speakers during speech comprehension, whereas the inferior frontal gyrus received neural signals from the left posterior portion of the temporal cortex in English speakers and from the bilateral anterior portion of the temporal cortex in Chinese speakers. Our results revealed that, although speech processing is largely carried out in the common left hemisphere classical language areas (Broca’s and Wernicke’s areas) and anterior temporal cortex, speech comprehension across different language groups depends on how these brain regions interact with each other. Moreover, the right anterior temporal cortex, which is crucial for tone processing, is equally important as its left homolog, the left anterior temporal cortex, in modulating the cortical dynamics in tone language comprehension. The current study pinpoints the importance of the bilateral anterior temporal cortex in language comprehension that is downplayed or even ignored by popular contemporary models of speech comprehension. PMID:25713366

  13. RAP: RNA-Seq Analysis Pipeline, a new cloud-based NGS web application.

    PubMed

    D'Antonio, Mattia; D'Onorio De Meo, Paolo; Pallocca, Matteo; Picardi, Ernesto; D'Erchia, Anna Maria; Calogero, Raffaele A; Castrignanò, Tiziana; Pesole, Graziano

    2015-01-01

    The study of RNA has been dramatically improved by the introduction of Next Generation Sequencing platforms, which allow massive and cheap sequencing of selected RNA fractions and also provide information on strand orientation (RNA-Seq). The complexity of transcriptomes and of their regulatory pathways makes RNA-Seq one of the most complex fields of NGS application, addressing several aspects of the expression process (e.g. identification and quantification of expressed genes and transcripts, alternative splicing and polyadenylation, fusion genes and trans-splicing, and post-transcriptional events). In order to provide researchers with an effective and friendly resource for analyzing RNA-Seq data, we present here RAP (RNA-Seq Analysis Pipeline), a cloud computing web application implementing a complete but modular analysis workflow. This pipeline integrates both state-of-the-art bioinformatics tools for RNA-Seq analysis and in-house developed scripts to offer the user a comprehensive strategy for data analysis. RAP is able to perform quality checks (adopting FastQC and NGS QC Toolkit), identify and quantify expressed genes and transcripts (with Tophat, Cufflinks and HTSeq), and detect alternative splicing events (using SpliceTrap) and chimeric transcripts (with ChimeraScan). The pipeline can also identify splicing junctions and constitutive or alternative polyadenylation sites (implementing custom analysis modules) and call statistically significant differences in gene and transcript expression, splicing pattern and polyadenylation site usage (using Cuffdiff2 and DESeq). Through a user-friendly web interface, the RAP workflow can be suitably customized by the user, and it is automatically executed on our cloud computing environment. This strategy allows access to bioinformatics tools and computational resources without requiring specific bioinformatics and IT skills. RAP provides a set of tabular and graphical results that can be helpful for browsing, filtering and exporting analyzed data according to the user's needs.

  14. What Constitutes a "Good" Sensitivity Analysis? Elements and Tools for a Robust Sensitivity Analysis with Reduced Computational Cost

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin

    2016-04-01

    Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful for improving the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the "global sensitivity" of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to "variogram analysis", that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show that the STAR-VARS approach provides reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
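
    The object at the heart of VARS is the variogram of the response surface, gamma(h) = 0.5 * E[(y(x + h) - y(x))^2], evaluated over a range of lags h. A bare-bones empirical version for a single factor follows (this is only the variogram computation, not the STAR-VARS sampling strategy):

        import numpy as np

        def empirical_variogram(x, y, lags, tol=1e-6):
            """gamma(h) for a response y sampled on a 1-D factor grid x."""
            gamma = []
            for h in lags:
                sq = [(y[j] - y[i]) ** 2
                      for i in range(len(x)) for j in range(i + 1, len(x))
                      if abs(abs(x[j] - x[i]) - h) < tol]
                gamma.append(0.5 * np.mean(sq))
            return np.array(gamma)

        x = np.linspace(0.0, 1.0, 21)            # one factor; others held fixed
        y = np.sin(2 * np.pi * x) + 0.1 * x      # stand-in model response
        print(empirical_variogram(x, y, lags=[0.05, 0.1, 0.2]).round(4))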

  15. Comprehensive Modeling and Visualization of Cardiac Anatomy and Physiology from CT Imaging and Computer Simulations

    PubMed Central

    Sun, Peng; Zhou, Haoyin; Ha, Seongmin; Hartaigh, Bríain ó; Truong, Quynh A.; Min, James K.

    2016-01-01

    In clinical cardiology, both anatomy and physiology are needed to diagnose cardiac pathologies. CT imaging and computer simulations provide valuable and complementary data for this purpose. However, it remains challenging to gain useful information from the large amount of high-dimensional diverse data. The current tools are not adequately integrated to visualize anatomic and physiologic data from a complete yet focused perspective. We introduce a new computer-aided diagnosis framework, which allows for comprehensive modeling and visualization of cardiac anatomy and physiology from CT imaging data and computer simulations, with a primary focus on ischemic heart disease. The following visual information is presented: (1) Anatomy from CT imaging: geometric modeling and visualization of cardiac anatomy, including four heart chambers, left and right ventricular outflow tracts, and coronary arteries; (2) Function from CT imaging: motion modeling, strain calculation, and visualization of four heart chambers; (3) Physiology from CT imaging: quantification and visualization of myocardial perfusion and contextual integration with coronary artery anatomy; (4) Physiology from computer simulation: computation and visualization of hemodynamics (e.g., coronary blood velocity, pressure, shear stress, and fluid forces on the vessel wall). Feedback from cardiologists has confirmed the practical utility of integrating these features for the purpose of computer-aided diagnosis of ischemic heart disease. PMID:26863663

  16. SCISEAL: A CFD code for analysis of fluid dynamic forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh; Przekwas, Andrzej

    1994-01-01

    A viewgraph presentation is made of the objectives, capabilities, and test results of the computer code SCISEAL. Currently, the seal code has: a finite volume, pressure-based integration scheme; colocated variables with strong conservation approach; high-order spatial differencing, up to third-order; up to second-order temporal differencing; a comprehensive set of boundary conditions; a variety of turbulence models and surface roughness treatment; moving grid formulation for arbitrary rotor whirl; rotor dynamic coefficients calculated by the circular whirl and numerical shaker methods; and small perturbation capabilities to handle centered and eccentric seals.

  17. Aeromechanics and man-machine integration technology opportunities for rotorcraft of the 1990s and beyond

    NASA Technical Reports Server (NTRS)

    Kerr, Andrew W.

    1989-01-01

    Programs related to rotorcraft aeromechanics and man-machine integration are discussed which will support advanced army rotorcraft design. In aeromechanics, recent advances in computational fluid dynamics will be used to characterize the complex unsteady flowfields of rotorcraft, and a second-generation comprehensive helicopter analysis system will be used along with models of aerodynamics, engines, and control systems to study the structural dynamics of rotor/body configurations. The man-machine integration program includes the development of advanced cockpit design technology and the evaluation of cockpit and mission equipment concepts in a real-time full-combat environment.

  18. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, and supported by a comprehensive database repository for validated program values.

  19. The menu-setting problem and subsidized prices: drug formulary illustration.

    PubMed

    Olmstead, T; Zeckhauser, R

    1999-10-01

    The menu-setting problem (MSP) determines the goods and services an institution offers and the prices charged. It appears widely in health care, from choosing the services an insurance arrangement offers, to selecting the health plans an employer proffers. The challenge arises because purchases are subsidized, and consumers (or their physician agents) may make cost-ineffective choices. The intuitively comprehensible MSP model--readily solved by computer using actual data--helps structure thinking and support decision making about such problems. The analysis uses drug formularies--lists of approved drugs in a plan or institution--to illustrate the framework.

  20. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.
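
    One standard building block of such resistivity-log evaluation (a textbook relation, not necessarily the method claimed in this patent) is Archie's equation for water saturation; anomalously low computed water saturation in a porous interval is one classic indicator of hydrocarbons or, in this context, gas hydrate:

        def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
            """Water saturation from Archie's equation.

            rt: true formation resistivity (ohm-m), rw: brine resistivity (ohm-m),
            phi: porosity (fraction); a, m, n are empirical constants.
            """
            return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

        print(round(archie_sw(rt=40.0, rw=0.3, phi=0.35), 2))  # ~0.25 water saturation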

  1. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    PubMed

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  2. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    PubMed

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed specifically to compute key GNSS data quality parameters that affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of individual stations persists over extended time periods. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
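
    As one example of the kind of quality parameter such a method can compute (a hedged sketch; the paper's exact algorithms may differ), the geometry-free carrier-phase combination isolates the ionospheric delay and should vary smoothly between epochs, so abrupt jumps flag cycle slips:

        # Cycle-slip rate from the geometry-free (L1 - L2) phase combination.
        # The jump threshold is an illustrative assumption.
        import numpy as np

        C = 299_792_458.0                  # speed of light (m/s)
        F1, F2 = 1575.42e6, 1227.60e6      # GPS L1/L2 frequencies (Hz)
        LAM1, LAM2 = C / F1, C / F2        # carrier wavelengths (m)

        def cycle_slip_rate(phi1, phi2, threshold_m=0.15):
            """phi1, phi2: carrier phase per epoch, in cycles."""
            gf = LAM1 * np.asarray(phi1) - LAM2 * np.asarray(phi2)  # meters
            jumps = np.abs(np.diff(gf))
            return np.count_nonzero(jumps > threshold_m) / max(len(jumps), 1)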

  3. Minfi: a flexible and comprehensive Bioconductor package for the analysis of Infinium DNA methylation microarrays

    PubMed Central

    Aryee, Martin J.; Jaffe, Andrew E.; Corrada-Bravo, Hector; Ladd-Acosta, Christine; Feinberg, Andrew P.; Hansen, Kasper D.; Irizarry, Rafael A.

    2014-01-01

    Motivation: The recently released Infinium HumanMethylation450 array (the ‘450k’ array) provides a high-throughput assay to quantify DNA methylation (DNAm) at ∼450 000 loci across a range of genomic features. Although less comprehensive than high-throughput sequencing-based techniques, this product is more cost-effective and promises to be the most widely used DNAm high-throughput measurement technology over the next several years. Results: Here we describe a suite of computational tools that incorporate state-of-the-art statistical techniques for the analysis of DNAm data. The software is structured to easily adapt to future versions of the technology. We include methods for preprocessing, quality assessment and detection of differentially methylated regions from the kilobase to the megabase scale. We show how our software provides a powerful and flexible development platform for future methods. We also illustrate how our methods empower the technology to make discoveries previously thought to be possible only with sequencing-based methods. Availability and implementation: http://bioconductor.org/packages/release/bioc/html/minfi.html. Contact: khansen@jhsph.edu; rafa@jimmy.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24478339

  4. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment

    NASA Astrophysics Data System (ADS)

    Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M.

    2017-02-01

    Nowadays, the vibration analysis of rotating machine signals is a well-established methodology, rooted in powerful tools offered, in particular, by the theory of cyclostationary (CS) processes. Among them, the squared envelope spectrum (SES) is probably the most popular tool for detecting random CS components, which are typical symptoms, for instance, of rolling element bearing faults. Recent research has shifted towards extending existing CS tools - originally devised for constant speed conditions - to the case of variable speed conditions. Many of these works combine the SES with computed order tracking after some preprocessing steps. The principal object of this paper is to organize these dispersed research efforts into a structured, comprehensive framework. Three original contributions are furnished. First, a model of rotating machine signals is introduced which sheds light on the various components to be expected in the SES. Second, a critical comparison is made of three sophisticated methods used for suppressing the deterministic part: the improved synchronous average, cepstrum prewhitening, and the generalized synchronous average. Third, a general envelope enhancement methodology which combines the latter two techniques with a time-domain filtering operation is revisited. All theoretical findings are experimentally validated on simulated and real-world vibration signals.
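
    For reference, a minimal constant-speed SES computation via the analytic signal is sketched below; the paper's variable-speed machinery (order tracking, resampling, deterministic-part suppression) is omitted here.

        # Squared envelope spectrum (SES) of a vibration signal sampled at fs Hz.
        import numpy as np
        from scipy.signal import hilbert

        def squared_envelope_spectrum(x, fs):
            env2 = np.abs(hilbert(x)) ** 2           # squared envelope
            env2 = env2 - env2.mean()                # remove DC before the FFT
            ses = np.abs(np.fft.rfft(env2)) / len(x)
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            return freqs, ses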

  5. A Comprehensive Method for GNSS Data Quality Determination to Improve Ionospheric Data Analysis

    PubMed Central

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-01-01

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed specifically to compute key GNSS data quality parameters that affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of individual stations persists over extended time periods. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis. PMID:25196005

  6. A Fine-Tuned Look at White Space Variation in Desktop Publishing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    This investigation of the use of white space in print-based, computer-generated text focused on the point at which the white space interferes with reading speed and comprehension. It was hypothesized that reading speed and comprehension would be significantly greater when text was wrapped tightly around the graphic than when it had one-half inch…

  7. The Effects of Hypertext Gloss on Comprehension and Vocabulary Retention under Incidental and Intentional Learning Conditions

    ERIC Educational Resources Information Center

    Zandieh, Zeinab; Jafarigohar, Manoochehr

    2012-01-01

    The present study investigated comprehension, immediate and delayed vocabulary retention under incidental and intentional learning conditions via computer mediated hypertext gloss. One hundred and eighty four (N = 184) intermediate students of English as a foreign language at an English school participated in the study. They were randomly assigned…

  8. Reading Linear Texts on Paper versus Computer Screen: Effects on Reading Comprehension

    ERIC Educational Resources Information Center

    Mangen, Anne; Walgermo, Bente R.; Bronnick, Kolbjorn

    2013-01-01

    Objective: To explore effects of the technological interface on reading comprehension in a Norwegian school context. Participants: 72 tenth graders from two different primary schools in Norway. Method: The students were randomized into two groups, where the first group read two texts (1400-2000 words) in print, and the other group read the same…

  9. CALL--Enhanced L2 Listening Skills--Aiming for Automatization in a Multimedia Environment

    ERIC Educational Resources Information Center

    Mayor, Maria Jesus Blasco

    2009-01-01

    Computer Assisted Language Learning (CALL) and L2 listening comprehension skill training are bound together for good. A macroskill neglected for decades, listening comprehension is now considered crucial for L2 acquisition. Thus, this paper attempts to offer the latest information on processing theories and L2 listening…

  10. Children's Media Comprehension: The Relationship between Media Platform, Executive Functioning Abilities, and Age

    ERIC Educational Resources Information Center

    Menkes, Susan M.

    2012-01-01

    Children's media comprehension was compared for material presented on television, computer, or touchscreen tablet. One hundred and thirty-two children were equally distributed across 12 groups defined by age (4- or 6-years-olds), gender, and the three media platforms. Executive functioning as measured by attentional control, cognitive…

  11. A CALL-Based Lesson Plan for Teaching Reading Comprehension to Iranian Intermediate EFL Learners

    ERIC Educational Resources Information Center

    Khoshsima, Hooshang; Khosravani, Mahboobeh

    2014-01-01

    The main purpose of this descriptive research is to provide a CALL (Computer-Assisted Language Learning)-based lesson plan for teaching reading comprehension to Iranian intermediate EFL learners. CALL is a relatively new way of learning and teaching language, and research has shown that CALL generally has positive effects in educational contexts. Although teachers…

  12. Eye vs. Text Movement: Which Technique Leads to Faster Reading Comprehension?

    ERIC Educational Resources Information Center

    Abdellah, Antar Solhy

    2009-01-01

    Eye fixation is a frequent problem that faces foreign language learners and hinders the flow of their reading comprehension. Although students are usually advised to read fast/skim to overcome this problem, eye fixation persists. The present study investigates the effect of using a paper-based program as compared to computer-based software in…

  13. Adaptive and Maladaptive Strategy Use in Computer-Assisted Language Learning Activities for Listening Comprehension

    ERIC Educational Resources Information Center

    McBride, Cara

    2008-01-01

    College students of English as a foreign language (EFL) in Chile participated in an online mini course designed to improve their listening comprehension. There were four experimental conditions: A) one in which participants listened to fast dialogues; B) one in which participants listened to slow dialogues; C) one in which participants were given…

  14. Experimental Evaluation of Computer Assisted Self-Assessment of Reading Comprehension: Effects on Reading Achievement and Attitude.

    ERIC Educational Resources Information Center

    Vollands, Stacy R.; And Others

    A study evaluated the effect that software for self-assessment and management of reading practice had on reading achievement and motivation in two primary schools in Aberdeen, Scotland. The program utilized was The Accelerated Reader (AR), which was designed to enable curriculum-based assessment of reading comprehension within the classroom. Students…

  15. The science of rotator cuff tears: translating animal models to clinical recommendations using simulation analysis.

    PubMed

    Mannava, Sandeep; Plate, Johannes F; Tuohy, Christopher J; Seyler, Thorsten M; Whitlock, Patrick W; Curl, Walton W; Smith, Thomas L; Saul, Katherine R

    2013-07-01

    The purpose of this article is to review basic science studies using various animal models for rotator cuff research and to describe structural, biomechanical, and functional changes to muscle following rotator cuff tears. The use of computational simulations to translate the findings from animal models to human scale is further detailed. A comprehensive review was performed of the basic science literature describing the use of animal models and simulation analysis to examine muscle function following rotator cuff injury and repair in the ageing population. The findings from various studies of rotator cuff pathology emphasize the importance of preventing permanent muscular changes with detrimental results. In vivo muscle function, electromyography, and passive muscle-tendon unit properties were studied before and after supraspinatus tenotomy in a rodent rotator cuff injury model (acute vs chronic). Then, a series of simulation experiments were conducted using a validated computational human musculoskeletal shoulder model to assess both passive and active tension of rotator cuff repairs based on surgical positioning. Outcomes of rotator cuff repair may be improved by earlier surgical intervention, with lower surgical repair tensions and fewer electromyographic neuromuscular changes. An integrated approach of animal experiments, computer simulation analyses, and clinical studies may allow us to gain a fundamental understanding of the underlying pathology and interpret the results for clinical translation.

  16. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as winding flux linkages and voltages; average, cogging, and ripple torques; stator core flux densities; core losses; efficiencies; and saturated machine winding inductances are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of a design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable when compared to prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
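
    The optimization layer can be illustrated independently of the finite element solver. The sketch below runs a differential evolution search over a two-variable design space, with a cheap analytic surrogate standing in for the electromagnetic solver; the variables, bounds, and objective are purely illustrative, not from the dissertation.

        # Differential evolution over a toy design space; in the dissertation the
        # objective would be evaluated by the finite element-based solver.
        import numpy as np
        from scipy.optimize import differential_evolution

        def surrogate_objective(x):
            magnet_t, slot_w = x  # magnet thickness, slot width (mm), assumed
            ripple = (magnet_t - 4.0) ** 2 + 0.5 * (slot_w - 8.0) ** 2
            core_loss = 0.1 * magnet_t * slot_w
            return ripple + core_loss  # scalarized multi-objective placeholder

        bounds = [(2.0, 8.0), (4.0, 12.0)]  # per-variable design limits
        result = differential_evolution(surrogate_objective, bounds, seed=0)
        print(result.x, result.fun)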

  17. 3D visualization of membrane failures in fuel cells

    NASA Astrophysics Data System (ADS)

    Singh, Yadvinder; Orfino, Francesco P.; Dutta, Monica; Kjeang, Erik

    2017-03-01

    Durability issues in fuel cells, due to chemical and mechanical degradation, are potential impediments to their commercialization. Hydrogen leak development across degraded fuel cell membranes is deemed a lifetime-limiting failure mode and potential safety issue that requires thorough characterization for devising effective mitigation strategies. The scope and depth of failure analysis have, however, been limited by the 2D nature of conventional imaging. In the present work, X-ray computed tomography is introduced as a novel, non-destructive technique for 3D failure analysis. Its capability to acquire true 3D images of membrane damage is demonstrated for the very first time. This approach has enabled unique and in-depth analysis resulting in novel findings regarding the membrane degradation mechanism; these are: significant, exclusive membrane fracture development independent of catalyst layers, localized thinning at crack sites, and demonstration of the critical impact of cracks on fuel cell durability. Evidence of crack initiation within the membrane is demonstrated, and a possible new failure mode different from typical mechanical crack development is identified. X-ray computed tomography is hereby established as a breakthrough approach for comprehensive 3D characterization and reliable failure analysis of fuel cell membranes, and could readily be extended to electrolyzers and flow batteries having similar structure.

  18. Failure analysis of fuel cell electrodes using three-dimensional multi-length scale X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Pokhrel, A.; El Hannach, M.; Orfino, F. P.; Dutta, M.; Kjeang, E.

    2016-10-01

    X-ray computed tomography (XCT), a non-destructive technique, is proposed for three-dimensional, multi-length scale characterization of complex failure modes in fuel cell electrodes. Comparative tomography data sets are acquired for a conditioned beginning of life (BOL) and a degraded end of life (EOL) membrane electrode assembly subjected to cathode degradation by voltage cycling. Micro length scale analysis shows a five-fold increase in crack size and 57% thickness reduction in the EOL cathode catalyst layer, indicating widespread action of carbon corrosion. Complementary nano length scale analysis shows a significant reduction in porosity, increased pore size, and dramatically reduced effective diffusivity within the remaining porous structure of the catalyst layer at EOL. Collapsing of the structure is evident from the combination of thinning and reduced porosity, as uniquely determined by the multi-length scale approach. Additionally, a novel image processing based technique developed for nano scale segregation of pore, ionomer, and Pt/C dominated voxels shows an increase in ionomer volume fraction, Pt/C agglomerates, and severe carbon corrosion at the catalyst layer/membrane interface at EOL. In summary, XCT based multi-length scale analysis enables detailed information needed for comprehensive understanding of the complex failure modes observed in fuel cell electrodes.
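
    A drastically simplified stand-in for the voxel segregation step reads as follows (the intensity thresholds and the synthetic volume are assumptions; the authors' actual image-processing technique is more involved):

        # Classify XCT voxels as pore / ionomer / Pt/C by intensity thresholds
        # and report volume fractions.
        import numpy as np

        rng = np.random.default_rng(0)
        volume = rng.random((64, 64, 64))   # placeholder reconstructed volume

        labels = np.digitize(volume, [0.3, 0.7])  # 0=pore, 1=ionomer, 2=Pt/C

        for idx, name in enumerate(("pore", "ionomer", "Pt/C")):
            frac = np.count_nonzero(labels == idx) / labels.size
            print(f"{name}: {frac:.3f}")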

  19. Integrated web visualizations for protein-protein interaction databases.

    PubMed

    Jeanquartier, Fleur; Jean-Quartier, Claire; Holzinger, Andreas

    2015-06-16

    Understanding living systems is crucial for curing diseases. To achieve this task we have to understand biological networks based on protein-protein interactions. Bioinformatics has produced a great number of databases and tools that support analysts in exploring protein-protein interactions at an integrated level for knowledge discovery. They provide predictions and correlations, indicate possibilities for future experimental research, and fill the gaps to complete the picture of biochemical processes. Numerous large databases of protein-protein interactions are used to gain insights into answering some of the many questions of systems biology. Many computational resources integrate interaction data with additional information on molecular background. However, the vast number of diverse bioinformatics resources poses an obstacle to the goal of understanding. We present a survey of databases that enable the visual analysis of protein networks. We selected M=10 out of N=53 resources supporting visualization and tested them against the following set of criteria: interoperability, data integration, quantity of possible interactions, data visualization quality, and data coverage. The study reveals differences in usability, visualization features, and quality, as well as in the quantity of interactions. StringDB is the recommended first choice. CPDB presents a comprehensive dataset and IntAct lets the user change the network layout. A comprehensive comparison table is available via the web; the supplementary table can be accessed at http://tinyurl.com/PPI-DB-Comparison-2015. Only some web resources featuring graph visualization can be successfully applied to interactive visual analysis of protein-protein interactions. The study results underline the necessity for further enhancements of visualization integration in biochemical analysis tools. Identified challenges include data comprehensiveness, confidence, interactive features, and visualization maturity.
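
    For readers who want to query the survey's first-choice resource programmatically, a hedged sketch against STRING's public REST API follows (the endpoint layout reflects STRING's documentation at the time of writing; verify against the current docs before relying on it):

        # Fetch the human TP53 interaction network from STRING as TSV.
        import requests

        resp = requests.get(
            "https://string-db.org/api/tsv/network",
            params={"identifiers": "TP53", "species": 9606},
            timeout=30,
        )
        resp.raise_for_status()
        for line in resp.text.splitlines()[:5]:  # header + first interactions
            print(line)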

  20. Towards pervasive computing in health care - a literature review.

    PubMed

    Orwat, Carsten; Graefe, Andreas; Faulwasser, Timm

    2008-06-19

    The evolving concepts of pervasive computing, ubiquitous computing, and ambient intelligence are increasingly influencing health care and medicine. Summarizing published research, this literature review provides an overview of recent developments and implementations of pervasive computing systems in health care. It also highlights some of the experiences reported from deployment processes. There is no clear definition of pervasive computing in the current literature, so specific inclusion criteria for selecting articles about relevant systems were developed. Searches were conducted in four scientific databases, alongside manual journal searches, for the period 2002 to 2006. The articles included present prototypes, case studies, pilot studies, clinical trials, and systems already in routine use. The searches identified 69 articles describing 67 different systems. In a quantitative analysis, these systems were categorized by project status, health care settings, user groups, improvement aims, and system features (i.e., component types, data gathering, data transmission, system functions). The focus is on the types of systems implemented, their frequency of occurrence, and their characteristics. Qualitative analyses were performed on deployment issues, such as organizational and personnel issues, privacy and security issues, and financial issues. This paper provides comprehensive access to the literature of this emerging field by addressing specific topics of application settings, system features, and deployment experiences. Both an overview and an analysis of the literature on a broad and heterogeneous range of systems are provided. Most systems are described at their prototype stages. Deployment issues, such as implications for organization or personnel, privacy concerns, or financial issues, are rarely mentioned, though their solution is regarded as decisive in moving promising systems to regular operation. There is a need for further research on the deployment of pervasive computing systems, including clinical studies, economic and social analyses, user studies, etc.
