Science.gov

Sample records for accurate computational tools

  1. CAFE: A Computer Tool for Accurate Simulation of the Regulatory Pool Fire Environment for Type B Packages

    SciTech Connect

    Gritzo, L.A.; Koski, J.A.; Suo-Anttila, A.J.

    1999-03-16

    The Container Analysis Fire Environment computer code (CAFE) is intended to provide Type B package designers with an enhanced engulfing fire boundary condition when combined with the PATRAN/P-Thermal commercial code. Historically, an engulfing fire boundary condition has been modeled as σT⁴, where σ is the Stefan-Boltzmann constant and T is the fire temperature. The CAFE code includes the necessary chemistry, thermal radiation, and fluid mechanics to model an engulfing fire. Effects included are the local cooling of gases that form a protective boundary layer that reduces the incoming radiant heat flux to values lower than expected from a simple σT⁴ model. In addition, the effect of object shape on mixing that may increase the local fire temperature is included. Both high and low temperature regions that depend upon the local availability of oxygen are also calculated. Thus the competing effects that can both increase and decrease the local values of radiant heat flux are included in a manner that is not predictable a priori. The CAFE package consists of a group of computer subroutines that can be linked to workstation-based thermal analysis codes in order to predict package performance during regulatory and other accident fire scenarios.
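
    A minimal sketch of the classical σT⁴ boundary condition that the abstract says CAFE improves upon; it is not the CAFE code itself, and the 800 °C fire temperature used below is only an illustrative assumption.

    ```python
    # Classical engulfing-fire radiant boundary condition q = sigma * T^4.
    # This is the simple model that CAFE refines, not the CAFE package itself.

    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def radiant_heat_flux(fire_temperature_k: float) -> float:
        """Blackbody radiant heat flux (W/m^2) for a fire at the given temperature."""
        return SIGMA * fire_temperature_k ** 4

    if __name__ == "__main__":
        t_fire = 1073.15  # ~800 C pool-fire temperature (illustrative assumption)
        print(f"sigma*T^4 at {t_fire:.0f} K = {radiant_heat_flux(t_fire) / 1e3:.1f} kW/m^2")
    ```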

  2. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  3. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
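
    To make the partitioning trade-off concrete, here is a toy cost model (not the authors' model) that predicts parallel time as the slowest partition's work plus a per-boundary communication charge; the workload distribution and the communication cost constant are assumptions.

    ```python
    import numpy as np

    # Toy illustration of the mapping problem: split a 1D grid with non-uniform
    # per-cell work among processors and predict time as max(partition work) plus a
    # fixed cost per internal boundary. Constants are illustrative assumptions.

    def predicted_time(work_per_cell, cuts, comm_cost=5.0):
        parts = np.split(work_per_cell, cuts)
        compute = max(p.sum() for p in parts)      # slowest processor dominates
        comm = comm_cost * (len(parts) - 1)        # one exchange per internal boundary
        return compute + comm

    rng = np.random.default_rng(0)
    work = rng.uniform(1.0, 10.0, size=1000)       # irregular workload
    even_cuts = [250, 500, 750]                    # naive equal-cell partition
    quartiles = np.cumsum(work)[-1] * np.array([0.25, 0.5, 0.75])
    balanced_cuts = np.searchsorted(np.cumsum(work), quartiles)
    print("equal-cell partition:", predicted_time(work, even_cuts))
    print("work-balanced partition:", predicted_time(work, list(balanced_cuts)))
    ```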

  4. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  5. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic 'machine,' is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking, but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems, which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

  6. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
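
    As a rough illustration of one measurement the abstract mentions, the sketch below computes an azimuthally averaged power spectrum of a toy map with plain numpy; LensTools provides its own, more careful routines, and the Gaussian random "convergence map" here is purely an assumption.

    ```python
    import numpy as np

    # Azimuthally averaged power spectrum of a toy 2D map; illustrative only,
    # not the LensTools implementation.

    def power_spectrum_2d(kappa, n_bins=20):
        n = kappa.shape[0]
        fk = np.fft.fftshift(np.fft.fft2(kappa))
        power = (np.abs(fk) ** 2 / n ** 2).ravel()
        ky, kx = np.indices(kappa.shape) - n // 2
        k = np.hypot(kx, ky).ravel()
        edges = np.linspace(1, k.max(), n_bins + 1)
        which = np.digitize(k, edges)
        return edges[:-1], np.array([power[which == i].mean() for i in range(1, n_bins + 1)])

    kappa = np.random.default_rng(1).normal(size=(256, 256))  # toy Gaussian "map"
    k_bins, pk = power_spectrum_2d(kappa)
    print(pk[:5])
    ```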

  7. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  8. Accurate atom-mapping computation for biochemical reactions.

    PubMed

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to take into account the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
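
    For flavor only, the sketch below treats atom mapping as a plain minimum-cost assignment of reactant atoms to product atoms; the actual MWED formulation is a MILP over bond edits, and the cost matrix here is an invented example rather than real bond propensities.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Simplified stand-in for atom mapping: minimum-cost bipartite assignment.
    # The MWED method in the paper is a richer MILP; costs below are made up.

    cost = np.array([
        [0.0, 2.0, 3.0],   # reactant atom 0 vs product atoms 0..2
        [2.0, 0.5, 2.5],
        [3.0, 2.5, 0.2],
    ])
    rows, cols = linear_sum_assignment(cost)
    print(list(zip(rows.tolist(), cols.tolist())), "total cost:", cost[rows, cols].sum())
    ```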

  9. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-06-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing computing efficiency at present. In order to solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd-degree NURBS curve and can interpolate 3rd-degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion while considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
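
    A hedged sketch of the key numerical step described above: solving a quartic in the parameter increment and keeping its smallest positive real root. The coefficients below are placeholders, not the ones derived in the paper, and the root finding is numeric rather than the paper's analytic solution.

    ```python
    import numpy as np

    # Pick the smallest positive real root of a quartic a4*du^4 + ... + a0 = 0.
    # Placeholder coefficients; the paper derives them from the NURBS geometry.

    def solve_parameter_increment(coeffs):
        roots = np.roots(coeffs)
        real = roots[np.abs(roots.imag) < 1e-12].real
        positive = real[real > 0]
        return positive.min() if positive.size else None

    print(solve_parameter_increment([1.0, -0.3, 0.02, 1.5e-3, -1e-6]))
    ```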

  10. Fast, Accurate RF Propagation Modeling and Simulation Tool for Highly Cluttered Environments

    SciTech Connect

    Kuruganti, Phani Teja

    2007-01-01

    As network centric warfare and distributed operations paradigms unfold, there is a need for robust, fast wireless network deployment tools. These tools must take into consideration the terrain of the operating theater, and facilitate specific modeling of end to end network performance based on accurate RF propagation predictions. It is well known that empirical models cannot provide accurate, site specific predictions of radio channel behavior. In this paper an event-driven wave propagation simulation is proposed as a computationally efficient technique for predicting critical propagation characteristics of RF signals in cluttered environments. Convincing validation and simulator performance studies confirm the suitability of this method for indoor and urban area RF channel modeling. By integrating our RF propagation prediction tool, RCSIM, with popular packet-level network simulators, we are able to construct an end to end network analysis tool for wireless networks operated in built-up urban areas.

  11. Fast, accurate, robust and Open Source Brain Extraction Tool (OSBET)

    NASA Astrophysics Data System (ADS)

    Namias, R.; Donnelly Kehoe, P.; D'Amato, J. P.; Nagel, J.

    2015-12-01

    The removal of non-brain regions in neuroimaging is a critical task to perform a favorable preprocessing. The skull-stripping depends on different factors including the noise level in the image, the anatomy of the subject being scanned and the acquisition sequence. For these and other reasons, an ideal brain extraction method should be fast, accurate, user friendly, open-source and knowledge based (to allow for interaction with the algorithm in case the expected outcome is not being obtained), producing stable results and making it possible to automate the process for large datasets. There are already a large number of validated tools to perform this task but none of them meets the desired characteristics. In this paper we introduce an open-source brain extraction tool (OSBET), composed of four steps that use simple, well-known operations (optimal thresholding, binary morphology, labeling and geometrical analysis) and aims to assemble all the desired features. We present an experiment comparing OSBET with six other state-of-the-art techniques against a publicly available dataset consisting of 40 T1-weighted 3D scans and their corresponding manually segmented images. OSBET achieved both a short execution time and excellent accuracy, obtaining the best Dice coefficient. Further validation should be performed, for instance, in unhealthy populations, to generalize its usage for clinical purposes.
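
    A hedged sketch of the kind of pipeline the abstract outlines (thresholding, binary morphology, labeling, geometric selection), applied to a synthetic 3D volume; it is not the OSBET implementation, and the threshold and structuring element are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    # Toy skull-stripping-style pipeline on synthetic data; illustrative only.

    rng = np.random.default_rng(0)
    volume = rng.normal(0.2, 0.05, size=(64, 64, 64))
    volume[16:48, 16:48, 16:48] += 0.6                 # bright "brain-like" region

    mask = volume > 0.5                                # a real tool would choose this threshold from the data
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3, 3)))
    labels, n_components = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n_components + 1))
    brain_mask = labels == (int(np.argmax(sizes)) + 1) # keep the largest connected component
    print("components:", n_components, "| brain-mask voxels:", int(brain_mask.sum()))
    ```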

  12. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of an MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  13. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of an MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  14. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572

  15. CgWind: A high-order accurate simulation tool for wind turbines and wind farms

    SciTech Connect

    Chand, K K; Henshaw, W D; Lundquist, K A; Singer, M A

    2010-02-22

    CgWind is a high-fidelity large eddy simulation (LES) tool designed to meet the modeling needs of wind turbine and wind park engineers. This tool combines several advanced computational technologies in order to model accurately the complex and dynamic nature of wind energy applications. The composite grid approach provides high-quality structured grids for the efficient implementation of high-order accurate discretizations of the incompressible Navier-Stokes equations. Composite grids also provide a natural mechanism for modeling bodies in relative motion and complex geometry. Advanced algorithms such as matrix-free multigrid, compact discretizations and approximate factorization will allow CgWind to perform highly resolved calculations efficiently on a wide class of computing resources. Also in development are nonlinear LES subgrid-scale models required to simulate the many interacting scales present in large wind turbine applications. This paper outlines our approach, the current status of CgWind and future development plans.

  16. TACT: The Action Computation Tool

    NASA Astrophysics Data System (ADS)

    Sanders, Jason L.; Binney, James

    2015-12-01

    The Action Computation Tool (TACT) tests methods for estimating actions, angles and frequencies of orbits in both axisymmetric and triaxial potentials, including general spherical potentials, analytic potentials (Isochrone and Harmonic oscillator), axisymmetric Stackel fudge, average generating function from orbit (AvGF), and others. It is written in C++; code is provided to compile the routines into a Python library. TM (ascl:1512.014) and LAPACK are required to access some features.
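
    As a small illustration of one analytic case TACT handles, the sketch below evaluates the textbook closed-form radial action of the isochrone potential (see Binney & Tremaine); the G = M = b = 1 units are an assumption, and this is only one of the many estimation methods the package offers.

    ```python
    import numpy as np

    # Closed-form radial action of the isochrone potential for a bound orbit (E < 0):
    # J_r = GM / sqrt(-2E) - 0.5 * (L + sqrt(L^2 + 4*GM*b)).  Units are illustrative.

    def isochrone_radial_action(E, L, GM=1.0, b=1.0):
        return GM / np.sqrt(-2.0 * E) - 0.5 * (L + np.sqrt(L ** 2 + 4.0 * GM * b))

    print(isochrone_radial_action(E=-0.3, L=0.5))
    ```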

  17. Foundational Tools for Petascale Computing

    SciTech Connect

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  18. Slim hole MWD tool accurately measures downhole annular pressure

    SciTech Connect

    Burban, B.; Delahaye, T.

    1994-02-14

    Measurement-while-drilling of downhole pressure accurately determines annular pressure losses from circulation and drillstring rotation and helps monitor swab and surge pressures during tripping. In early 1993, two slim-hole wells (3.4 in. and 3 in. diameter) were drilled with continuous real-time electromagnetic wave transmission of downhole temperature and annular pressure. The data were obtained during all stages of the drilling operation and proved useful for operations personnel. The use of real-time measurements demonstrated the characteristic hydraulic effects of pressure surges induced by drillstring rotation in the small slim-hole annulus under field conditions. The interest in this information is not restricted to the slim-hole geometry. Monitoring or estimating downhole pressure is a key element for drilling operations. Except in special cases, no real-time measurements of downhole annular pressure during drilling and tripping have been used on an operational basis. The hydraulic effects are significant in conventional-geometry wells (3 1/2-in. drill pipe in a 6-in. hole). This paper describes the tool and the results from the field test.

  19. Computational tools for protein modeling.

    PubMed

    Xu, D; Xu, Y; Uberbacher, E C

    2000-07-01

    Protein modeling is playing a more and more important role in protein and peptide sciences due to improvements in modeling methods, advances in computer technology, and the huge amount of biological data becoming available. Modeling tools can often predict the structure and shed some light on the function and its underlying mechanism. They can also provide insight to design experiments and suggest possible leads for drug design. This review attempts to provide a comprehensive introduction to major computer programs, especially on-line servers, for protein modeling. The review covers the following aspects: (1) protein sequence comparison, including sequence alignment/search, sequence-based protein family classification, domain parsing, and phylogenetic classification; (2) sequence annotation, including annotation/prediction of hydrophobic profiles, transmembrane regions, active sites, signaling sites, and secondary structures; (3) protein structure analysis, including visualization, geometry analysis, structure comparison/classification, dynamics, and electrostatics; (4) three-dimensional structure prediction, including homology modeling, fold recognition using threading, ab initio prediction, and docking. We will address what a user can expect from the computer tools in terms of their strengths and limitations. We will also discuss the major challenges and the future trends in the field. A collection of the links of tools can be found at http://compbio.ornl.gov/structure/resource/.

  20. Computers: Tools of Oppression, Tools of Liberation.

    ERIC Educational Resources Information Center

    Taylor, Jefferey H.

    This paper contends that students who are learning to use computers can benefit from having an overview of the history and social context of computers. The paper highlights some milestones in the history of computers, from ancient times to ENIAC to Altair to Bill Gates to the Internet. It also suggests some things for students to think about and…

  1. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    SciTech Connect

    Nakhleh, Luay

    2014-03-12

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools that implements the mathematical models as well as the algorithms developed.

  2. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of the meteors' velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the computed orbits of meteoroids and the theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is then imperative to determine how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the different measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integrations using the Borovicka et al. (2007) model. Different optimization techniques have also been investigated in order to pick the most suitable one to solve the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of the different velocity computations seems to show that, while the MPF is by far the best method to solve the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimate errors for noisy
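
    For context only, the sketch below shows the simplest kind of velocity estimate such comparisons start from: a constant-velocity least-squares fit of position along the trajectory versus time. It is not the Ceplecha, Borovicka, or Gural implementation, and the synthetic decelerating trajectory and noise level are assumptions.

    ```python
    import numpy as np

    # Naive straight-line fit of along-track position vs time; a decelerating meteor
    # makes this underestimate the pre-atmospheric velocity. Synthetic data only.

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 0.6, 40)                      # s
    pos = 35.0e3 * t - 0.5 * 4.0e3 * t ** 2            # m, constant-deceleration model (assumed)
    pos += rng.normal(0.0, 20.0, size=t.size)          # measurement noise (assumed)

    slope, intercept = np.polyfit(t, pos, 1)           # constant-velocity fit
    print(f"least-squares mean velocity ~ {slope / 1e3:.2f} km/s (true initial: 35 km/s)")
    ```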

  3. Neutron supermirrors: an accurate theory for layer thickness computation

    NASA Astrophysics Data System (ADS)

    Bray, Michael

    2001-11-01

    We present a new theory for the computation of Super-Mirror stacks, using accurate formulas derived from the classical optics field. Approximations are introduced into the computation, but at a later stage than existing theories, providing a more rigorous treatment of the problem. The final result is a continuous thickness stack, whose properties can be determined at the outset of the design. We find that the well-known fourth power dependence of number of layers versus maximum angle is (of course) asymptotically correct. We find a formula giving directly the relation between desired reflectance, maximum angle, and number of layers (for a given pair of materials). Note: The author of this article, a classical opticist, has limited knowledge of the Neutron world, and begs forgiveness for any shortcomings, erroneous assumptions and/or misinterpretation of previous authors' work on the subject.

  4. IVUSAngio tool: a publicly available software for fast and accurate 3D reconstruction of coronary arteries.

    PubMed

    Doulaverakis, Charalampos; Tsampoulatidis, Ioannis; Antoniadis, Antonios P; Chatzizisis, Yiannis S; Giannopoulos, Andreas; Kompatsiaris, Ioannis; Giannoglou, George D

    2013-11-01

    There is an ongoing research and clinical interest in the development of reliable and easily accessible software for the 3D reconstruction of coronary arteries. In this work, we present the architecture and validation of IVUSAngio Tool, an application which performs fast and accurate 3D reconstruction of the coronary arteries by using intravascular ultrasound (IVUS) and biplane angiography data. The 3D reconstruction is based on the fusion of the detected arterial boundaries in IVUS images with the 3D IVUS catheter path derived from the biplane angiography. The IVUSAngio Tool suite integrates all the intermediate processing and computational steps and provides a user-friendly interface. It also offers additional functionality, such as automatic selection of the end-diastolic IVUS images, semi-automatic and automatic IVUS segmentation, vascular morphometric measurements, graphical visualization of the 3D model and export in a format compatible with other computer-aided design applications. Our software was applied and validated in 31 human coronary arteries yielding quite promising results. Collectively, the use of IVUSAngio Tool significantly reduces the total processing time for 3D coronary reconstruction. IVUSAngio Tool is distributed as free software, publicly available to download and use.

  5. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon; Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (next to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  6. An Accurate and Dynamic Computer Graphics Muscle Model

    NASA Technical Reports Server (NTRS)

    Levine, David Asher

    1997-01-01

    A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.

  7. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  8. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  9. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  10. Photoacoustic computed tomography without accurate ultrasonic transducer responses

    NASA Astrophysics Data System (ADS)

    Sheng, Qiwei; Wang, Kun; Xia, Jun; Zhu, Liren; Wang, Lihong V.; Anastasio, Mark A.

    2015-03-01

    Conventional photoacoustic computed tomography (PACT) image reconstruction methods assume that the object and surrounding medium are described by a constant speed-of-sound (SOS) value. In order to accurately recover fine structures, SOS heterogeneities should be quantified and compensated for during PACT reconstruction. To address this problem, several groups have proposed hybrid systems that combine PACT with ultrasound computed tomography (USCT). In such systems, a SOS map is reconstructed first via USCT. Consequently, this SOS map is employed to inform the PACT reconstruction method. Additionally, the SOS map can provide structural information regarding tissue, which is complementary to the functional information from the PACT image. We propose a paradigm shift in the way that images are reconstructed in hybrid PACT-USCT imaging. Inspired by our observation that information about the SOS distribution is encoded in PACT measurements, we propose to jointly reconstruct the absorbed optical energy density and SOS distributions from a combined set of USCT and PACT measurements, thereby reducing the two reconstruction problems into one. This innovative approach has several advantages over conventional approaches in which PACT and USCT images are reconstructed independently: (1) Variations in the SOS will automatically be accounted for, optimizing PACT image quality; (2) The reconstructed PACT and USCT images will possess minimal systematic artifacts because errors in the imaging models will be optimally balanced during the joint reconstruction; (3) Due to the exploitation of information regarding the SOS distribution in the full-view PACT data, our approach will permit high-resolution reconstruction of the SOS distribution from sparse array data.

  11. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  12. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; Wang, Zhong

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
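
    A hedged sketch of one ingredient named in the abstract, a tetranucleotide frequency (TNF) vector for a contig; MetaBAT combines such frequencies with abundance-based probabilistic distances, whereas this toy function shows only the raw k-mer counting step on a made-up sequence.

    ```python
    from collections import Counter
    from itertools import product

    # Tetranucleotide frequency vector of a contig; illustrative helper only.

    def tetranucleotide_frequency(seq: str):
        kmers = ["".join(p) for p in product("ACGT", repeat=4)]
        counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
        total = max(sum(counts[k] for k in kmers), 1)
        return {k: counts[k] / total for k in kmers}

    tnf = tetranucleotide_frequency("ACGTACGTGGCCAATTACGT")  # toy contig
    print(sum(tnf.values()), tnf["ACGT"])
    ```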

  13. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    SciTech Connect

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; Wang, Zhong

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.

  14. Groupware: A Tool for Interpersonal Computing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McLellan, Hilary

    Computer networks have provided a foundation for interpersonal computing, and new tools are emerging, the centerpiece of which is called "groupware." Groupware technology is reviewed, and the theoretical framework that will underlie interpersonal collaborative computing is discussed. Groupware can consist of hardware, software, services, and…

  15. Methods for Efficiently and Accurately Computing Quantum Mechanical Free Energies for Enzyme Catalysis.

    PubMed

    Kearns, F L; Hudson, P S; Boresch, S; Woodcock, H L

    2016-01-01

    Enzyme activity is inherently linked to free energies of transition states, ligand binding, protonation/deprotonation, etc.; these free energies, and thus enzyme function, can be affected by residue mutations, allosterically induced conformational changes, and much more. Therefore, being able to predict free energies associated with enzymatic processes is critical to understanding and predicting their function. Free energy simulation (FES) has historically been a computational challenge as it requires both the accurate description of inter- and intramolecular interactions and adequate sampling of all relevant conformational degrees of freedom. The hybrid quantum mechanical molecular mechanical (QM/MM) framework is the current tool of choice when accurate computations of macromolecular systems are essential. Unfortunately, robust and efficient approaches that employ the high levels of computational theory needed to accurately describe many reactive processes (i.e., ab initio, DFT), while also including explicit solvation effects and accounting for extensive conformational sampling, are essentially nonexistent. In this chapter, we will give a brief overview of two recently developed methods that mitigate several major challenges associated with QM/MM FES: the QM non-Boltzmann Bennett's acceptance ratio method and the QM nonequilibrium work method. We will also describe usage of these methods to calculate free energies associated with (1) relative properties and (2) along reaction paths, using simple test cases with relevance to enzymes.
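
    For orientation, here is a minimal sketch of the textbook exponential (Zwanzig) free-energy estimator that underlies many QM/MM reweighting schemes; it is not the non-Boltzmann Bennett or nonequilibrium-work method described in the chapter, and the energy differences below are synthetic.

    ```python
    import numpy as np

    # Zwanzig estimator: dF = -kT * ln < exp(-dU / kT) >, averaged over samples
    # from the low-level (e.g., MM) ensemble. Synthetic data, illustrative only.

    def zwanzig_delta_f(delta_u, kT=0.593):     # kT in kcal/mol near 298 K
        return -kT * np.log(np.mean(np.exp(-np.asarray(delta_u) / kT)))

    rng = np.random.default_rng(0)
    delta_u = rng.normal(1.0, 0.5, size=5000)   # assumed MM -> QM/MM energy differences
    print(f"estimated dF = {zwanzig_delta_f(delta_u):.3f} kcal/mol")
    ```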

  16. Methods for Efficiently and Accurately Computing Quantum Mechanical Free Energies for Enzyme Catalysis.

    PubMed

    Kearns, F L; Hudson, P S; Boresch, S; Woodcock, H L

    2016-01-01

    Enzyme activity is inherently linked to free energies of transition states, ligand binding, protonation/deprotonation, etc.; these free energies, and thus enzyme function, can be affected by residue mutations, allosterically induced conformational changes, and much more. Therefore, being able to predict free energies associated with enzymatic processes is critical to understanding and predicting their function. Free energy simulation (FES) has historically been a computational challenge as it requires both the accurate description of inter- and intramolecular interactions and adequate sampling of all relevant conformational degrees of freedom. The hybrid quantum mechanical molecular mechanical (QM/MM) framework is the current tool of choice when accurate computations of macromolecular systems are essential. Unfortunately, robust and efficient approaches that employ the high levels of computational theory needed to accurately describe many reactive processes (i.e., ab initio, DFT), while also including explicit solvation effects and accounting for extensive conformational sampling, are essentially nonexistent. In this chapter, we will give a brief overview of two recently developed methods that mitigate several major challenges associated with QM/MM FES: the QM non-Boltzmann Bennett's acceptance ratio method and the QM nonequilibrium work method. We will also describe usage of these methods to calculate free energies associated with (1) relative properties and (2) along reaction paths, using simple test cases with relevance to enzymes. PMID:27498635

  17. An Accurate and Computationally Efficient Model for Membrane-Type Circular-Symmetric Micro-Hotplates

    PubMed Central

    Khan, Usman; Falconi, Christian

    2014-01-01

    Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation-tools for the optimization. In order to circumvent these issues, here, we propose a model for practical circular-symmetric micro-hot-plates which takes advantage of modified Bessel functions, computationally efficient matrix-approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated to the undesired heating in the electrical contacts, are small (e.g., few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally-easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214
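
    As a loose illustration of the mathematical ingredient named above, the sketch below evaluates a radial temperature profile built from the modified Bessel functions I0 and K0, which arise in the steady-state heat balance of a membrane with lateral conduction and linearized losses; the thermal length scale and weighting coefficients are arbitrary placeholders, not the matrix-determined values of the published model.

    ```python
    import numpy as np
    from scipy.special import i0, k0

    # Homogeneous solutions of the radial membrane heat-balance equation are
    # combinations of I0(r/lam) and K0(r/lam); values below are placeholders.

    lam = 2.0e-4                                  # assumed thermal length scale (m)
    r = np.linspace(1e-6, 1e-3, 200)              # radial positions (m)
    A, B = 1.0, 0.05                              # placeholder boundary-condition weights
    excess_temperature = A * i0(r / lam) + B * k0(r / lam)
    print(excess_temperature[:3])
    ```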

  18. Measurement of Fracture Geometry for Accurate Computation of Hydraulic Conductivity

    NASA Astrophysics Data System (ADS)

    Chae, B.; Ichikawa, Y.; Kim, Y.

    2003-12-01

    Fluid flow in a rock mass is controlled by the geometry of fractures, which is mainly characterized by roughness, aperture and orientation. Fracture roughness and aperture were observed by a new confocal laser scanning microscope (CLSM; Olympus OLS1100). The wavelength of the laser is 488 nm, and the laser scanning is managed by a light polarization method using two galvano-meter scanner mirrors. The system improves resolution in the light axis (namely z) direction because of the confocal optics. The sampling is managed at a spacing of 2.5 μm along the x and y directions. The highest measurement resolution in the z direction is 0.05 μm, which is more accurate than other methods. For the roughness measurements, core specimens of coarse and fine grained granites were provided. Measurements were performed along three scan lines on each fracture surface. The measured data were represented as 2-D and 3-D digital images showing detailed features of roughness. Spectral analyses by the fast Fourier transform (FFT) were performed to characterize the roughness data quantitatively and to identify influential frequencies of roughness. The FFT results showed that components of low frequencies were dominant in the fracture roughness. This study also verifies that spectral analysis is a good approach to understand the complicated characteristics of fracture roughness. For the aperture measurements, digital images of the aperture were acquired under applying five stages of uniaxial normal stresses. This method can characterize the response of aperture directly using the same specimen. Results of measurements show that reduction values of aperture are different at each part due to the rough geometry of the fracture walls. Laboratory permeability tests were also conducted to evaluate changes of hydraulic conductivities related to aperture variation due to different stress levels. The results showed non-uniform reduction of hydraulic conductivity under increase of the normal stress and different values of
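
    A hedged sketch of the FFT spectral-analysis step described above, applied to a synthetic roughness profile sampled every 2.5 μm; the profile and its dominant wavelength are illustrative assumptions, not CLSM measurements.

    ```python
    import numpy as np

    # Spectral analysis of a synthetic roughness profile sampled at 2.5 um spacing.

    dx = 2.5e-6                                        # sampling interval (m)
    x = np.arange(4096) * dx
    profile = 2e-6 * np.sin(2 * np.pi * x / 500e-6)    # assumed 500 um dominant wavelength
    profile += 1e-7 * np.random.default_rng(0).normal(size=x.size)

    spectrum = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dx)              # spatial frequencies (1/m)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]
    print(f"dominant wavelength ~ {1.0 / dominant * 1e6:.0f} um")  # roughly 500 um expected
    ```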

  19. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  20. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  1. Accurate real-time depth control for CP-SSOCT distal sensor based handheld microsurgery tools

    PubMed Central

    Cheon, Gyeong Woo; Huang, Yong; Cha, Jaepyeng; Gehlbach, Peter L.; Kang, Jin U.

    2015-01-01

    This paper presents a novel intuitive targeting and tracking scheme that utilizes a common-path swept source optical coherence tomography (CP-SSOCT) distal sensor integrated handheld microsurgical tool. To achieve micron-order precision control, a reliable and accurate OCT distal sensing method is required; simultaneously, a prediction algorithm is necessary to compensate for the system delay associated with the computational, mechanical and electronic latencies. Due to the multi-layered structure of the retina, it is necessary to develop effective surface detection methods rather than simple peak detection. To achieve this, a shifted cross-correlation method is applied for surface detection in order to increase robustness and accuracy in distal sensing. A predictor based on a Kalman filter was implemented for more precise motion compensation. The performance was first evaluated using an established dry phantom consisting of stacked cellophane tape. This was followed by evaluation in an ex-vivo bovine retina model to assess system accuracy and precision. The results demonstrate highly accurate depth targeting with less than 5 μm RMSE depth locking. PMID:26137393
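
    A hedged sketch of the idea behind shifted cross-correlation surface tracking: the depth shift between two A-scans is taken from the lag of their cross-correlation peak. The synthetic A-scans and the 12-pixel shift are assumptions; the actual CP-SSOCT processing and Kalman-filter predictor are more involved.

    ```python
    import numpy as np

    # Estimate the axial shift between two A-scans from the cross-correlation peak.

    def estimate_shift(ref, cur):
        corr = np.correlate(cur - cur.mean(), ref - ref.mean(), mode="full")
        return np.argmax(corr) - (len(ref) - 1)        # lag of the correlation peak

    depth = np.arange(1024)
    surface = np.exp(-0.5 * ((depth - 300) / 8.0) ** 2)  # synthetic reference A-scan
    shifted = np.roll(surface, 12)                       # surface moved 12 pixels deeper
    print("estimated shift (pixels):", estimate_shift(surface, shifted))
    ```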

  2. Accurately measuring MPI broadcasts in a computational grid

    SciTech Connect

    Karonis N T; de Supinski, B R

    1999-05-06

    An MPI library's implementation of broadcast communication can significantly affect the performance of applications built with that library. In order to choose between similar implementations or to evaluate available libraries, accurate measurements of broadcast performance are required. As we demonstrate, existing methods for measuring broadcast performance are either inaccurate or inadequate. Fortunately, we have designed an accurate method for measuring broadcast performance, even in a challenging grid environment. Measuring broadcast performance is not easy. Simply sending one broadcast after another allows them to proceed through the network concurrently, thus resulting in inaccurate per broadcast timings. Existing methods either fail to eliminate this pipelining effect or eliminate it by introducing overheads that are as difficult to measure as the performance of the broadcast itself. This problem becomes even more challenging in grid environments. Latencies along different links can vary significantly. Thus, an algorithm's performance is difficult to predict from its communication pattern. Even when accurate prediction is possible, the pattern is often unknown. Our method introduces a measurable overhead to eliminate the pipelining effect, regardless of variations in link latencies. Accurate measurements make it possible to choose between different available implementations. Also, accurate and complete measurements could guide use of a given implementation to improve application performance. These choices will become even more important as grid-enabled MPI libraries [6, 7] become more common since bad choices are likely to cost significantly more in grid environments. In short, the distributed processing community needs accurate, succinct and complete measurements of collective communications performance. Since successive collective communications can often proceed concurrently, accurately measuring them is difficult. Some benchmarks use knowledge of the communication algorithm to predict the
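
    To make the pipelining issue concrete, here is an mpi4py sketch of the barrier-separated timing loop commonly used to prevent broadcasts from overlapping; as the abstract notes, the barrier itself adds an overhead that is hard to separate out, and this is not the measurement method proposed by the authors. The payload size and trial count are assumptions.

    ```python
    from mpi4py import MPI
    import numpy as np

    # Barrier-separated broadcast timing: prevents pipelining between successive
    # broadcasts, but the measured time includes the barrier cost.

    comm = MPI.COMM_WORLD
    buf = np.zeros(1 << 16, dtype=np.float64)      # 512 KiB payload (assumption)
    trials = 100

    comm.Barrier()
    t0 = MPI.Wtime()
    for _ in range(trials):
        comm.Bcast(buf, root=0)
        comm.Barrier()                             # blocks pipelining, adds overhead
    elapsed = (MPI.Wtime() - t0) / trials
    if comm.rank == 0:
        print(f"avg time per broadcast + barrier: {elapsed * 1e6:.1f} us")
    ```

    Run with, e.g., `mpiexec -n 4 python bcast_timing.py` (script name assumed).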

  3. Computer assisted blast design and assessment tools

    SciTech Connect

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  4. Graphical arterial blood gas visualization tool supports rapid and accurate data interpretation.

    PubMed

    Doig, Alexa K; Albert, Robert W; Syroid, Noah D; Moon, Shaun; Agutter, Jim A

    2011-04-01

    A visualization tool that integrates numeric information from an arterial blood gas report with novel graphics was designed for the purpose of promoting rapid and accurate interpretation of acid-base data. A study compared data interpretation performance when arterial blood gas results were presented in a traditional numerical list versus the graphical visualization tool. Critical-care nurses (n = 15) and nursing students (n = 15) were significantly more accurate identifying acid-base states and assessing trends in acid-base data when using the graphical visualization tool. Critical-care nurses and nursing students using traditional numerical data had an average accuracy of 69% and 74%, respectively. Using the visualization tool, average accuracy improved to 83% for critical-care nurses and 93% for nursing students. Analysis of response times demonstrated that the visualization tool might help nurses overcome the "speed/accuracy trade-off" during high-stress situations when rapid decisions must be rendered. Perceived mental workload was significantly reduced for nursing students when they used the graphical visualization tool. In this study, the effects of implementing the graphical visualization were greater for nursing students than for critical-care nurses, which may indicate that the experienced nurses needed more training and use of the new technology prior to testing to show similar gains. Results of the objective and subjective evaluations support the integration of this graphical visualization tool into clinical environments that require accurate and timely interpretation of arterial blood gas data.

  5. Computational Tools to Accelerate Commercial Development

    SciTech Connect

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  6. A computational tool to design and generate crystal structures

    NASA Astrophysics Data System (ADS)

    Ferreira, R. C.; Vieira, M. B.; Dantas, S. O.; Lobosco, M.

    2014-03-01

    The evolution of computers, more specifically the increase in storage and data processing capacity, has allowed the construction of computational tools for the simulation of physical and chemical phenomena. Thus, practical experiments are being replaced, in some cases, by computational ones. In this context, we can highlight models used to simulate different phenomena on the atomic scale. The construction of these simulators requires developers to study and define accurate and reliable models. This complexity is often reflected in the construction of complex simulators, which simulate a limited group of structures. Such structures are sometimes expressed in a fixed manner using a limited set of geometric shapes. This work proposes a computational tool that aims to generate a set of crystal structures. The proposed tool consists of (a) a programming language, which is used to describe the structures using their characteristic functions and CSG (Constructive Solid Geometry) operators, and (b) a compiler/interpreter that examines the source code written in the proposed language and generates the objects accordingly. This tool enables the generation of an unrestricted number of structures, which can be incorporated in simulators such as the Monte Carlo Spin Engine, developed by our group at UFJF.
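
    A minimal sketch of the characteristic-function idea: each primitive is a function returning true inside the solid, CSG operators combine those functions, and lattice sites are kept where the composed function holds. The simple-cubic lattice and the specific primitives are assumptions for illustration; the record's tool uses its own language and compiler.

```python
import numpy as np

# Characteristic functions: True inside the solid, False outside.
def sphere(cx, cy, cz, r):
    return lambda p: ((p[:, 0] - cx)**2 + (p[:, 1] - cy)**2 + (p[:, 2] - cz)**2) <= r * r

def box(xmin, xmax, ymin, ymax, zmin, zmax):
    return lambda p: ((p[:, 0] >= xmin) & (p[:, 0] <= xmax) &
                      (p[:, 1] >= ymin) & (p[:, 1] <= ymax) &
                      (p[:, 2] >= zmin) & (p[:, 2] <= zmax))

# CSG operators combine characteristic functions.
def union(a, b):        return lambda p: a(p) | b(p)
def intersection(a, b): return lambda p: a(p) & b(p)
def difference(a, b):   return lambda p: a(p) & ~b(p)

def generate_sites(shape, lattice_constant=1.0, extent=10):
    """Keep the simple-cubic lattice sites whose positions satisfy the
    characteristic function of the composed solid."""
    axis = np.arange(-extent, extent + 1) * lattice_constant
    grid = np.stack(np.meshgrid(axis, axis, axis, indexing='ij'), axis=-1)
    points = grid.reshape(-1, 3)
    return points[shape(points)]

# Example: a spherical nanoparticle with a box-shaped notch removed.
solid = difference(sphere(0, 0, 0, 8.0), box(0, 9, -9, 9, -9, 9))
sites = generate_sites(solid, lattice_constant=1.0, extent=10)
print(len(sites), "lattice sites generated")
```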

  7. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections: azimuthal, cylindrical, Mercator, Lambert, and sinusoidal. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view, and export maps to other platforms.
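
    For reference, the textbook spherical forms of three of the projections named above can be written in a few lines. These are generic formulas, not MAPPER's implementation, and the Earth radius is an assumed constant.

```python
import math

R = 6371.0  # mean Earth radius in kilometres (illustrative)

def mercator(lat_deg, lon_deg):
    """Spherical Mercator; undefined at the poles."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return R * lon, R * math.log(math.tan(math.pi / 4 + lat / 2))

def sinusoidal(lat_deg, lon_deg):
    """Equal-area sinusoidal projection."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return R * lon * math.cos(lat), R * lat

def azimuthal_equidistant(lat_deg, lon_deg, lat0_deg=90.0, lon0_deg=0.0):
    """Azimuthal equidistant projection about a chosen centre point."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    c = math.acos(math.sin(lat0) * math.sin(lat) +
                  math.cos(lat0) * math.cos(lat) * math.cos(lon - lon0))
    k = c / math.sin(c) if c else 1.0
    x = R * k * math.cos(lat) * math.sin(lon - lon0)
    y = R * k * (math.cos(lat0) * math.sin(lat) -
                 math.sin(lat0) * math.cos(lat) * math.cos(lon - lon0))
    return x, y

print(mercator(48.85, 2.35))              # Paris on a Mercator map
print(sinusoidal(48.85, 2.35))
print(azimuthal_equidistant(48.85, 2.35)) # centred on the North Pole
```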

  8. Final Report: Correctness Tools for Petascale Computing

    SciTech Connect

    Mellor-Crummey, John

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  9. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    PubMed

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  10. Use of Monocrystalline Silicon as Tool Material for Highly Accurate Blanking of Thin Metal Foils

    SciTech Connect

    Hildering, Sven; Engel, Ulf; Merklein, Marion

    2011-05-04

    The trend towards miniaturisation of metallic mass-production components combined with increased component functionality is still unbroken. Manufacturing these components by forming and blanking offers economical and ecological advantages combined with the needed accuracy. The complexity of producing tools with geometries below 50 {mu}m by conventional manufacturing methods becomes disproportionately higher. Expensive serial finishing operations are required to achieve an adequate surface roughness combined with accurate geometry details. A novel approach for producing such tools is the use of advanced etching technologies for monocrystalline silicon that are well established in microsystems technology. High-precision vertical geometries with a width down to 5 {mu}m are possible. The present study shows a novel concept using this potential for the blanking of thin copper foils with monocrystalline silicon as the tool material. A self-contained machine tool with compact outer dimensions was designed to avoid tensile stresses in the brittle silicon punch by an accurate, careful alignment of the punch, die and metal foil. A microscopic analysis of the monocrystalline silicon punch shows appropriate properties regarding flank angle, edge geometry and surface quality for the blanking process. Using a monocrystalline silicon punch with a width of 70 {mu}m, blanking experiments on as-rolled copper foils with a thickness of 20 {mu}m demonstrate the general applicability of this material for micro production processes.

  11. Equilibrium gas flow computations. I - Accurate and efficient calculation of equilibrium gas properties

    NASA Technical Reports Server (NTRS)

    Liu, Yen; Vinokur, Marcel

    1989-01-01

    This paper treats the accurate and efficient calculation of thermodynamic properties of arbitrary gas mixtures for equilibrium flow computations. New improvements in the Stupochenko-Jaffe model for the calculation of thermodynamic properties of diatomic molecules are presented. A unified formulation of equilibrium calculations for gas mixtures in terms of irreversible entropy is given. Using a highly accurate thermochemical database, a new, efficient and vectorizable search algorithm is used to construct piecewise interpolation procedures which generate accurate thermodynamic variables and their derivatives, as required by modern computational algorithms. Results are presented for equilibrium air and compared with those given by the Srinivasan program.
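
    The flavour of the approach--tabulate a property once, then evaluate a piecewise interpolant and its derivative cheaply inside the flow solver--can be sketched as follows. The tabulated specific-heat values are invented for illustration, and the cubic spline stands in for the paper's own interpolation procedures.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Tabulated specific heat of a gas versus temperature (illustrative numbers,
# standing in for a high-accuracy thermochemical database).
T_table = np.linspace(300.0, 6000.0, 40)                    # K
cp_table = 1000.0 + 0.15 * T_table - 1.5e-5 * T_table**2    # J/(kg K)

# Piecewise-cubic interpolant: cheap to evaluate inside a flow solver and
# differentiable, so dcp/dT comes from the same fit.
cp_spline = CubicSpline(T_table, cp_table)
dcp_spline = cp_spline.derivative()

T_query = np.array([512.3, 1875.0, 4321.9])
print("cp     :", cp_spline(T_query))
print("dcp/dT :", dcp_spline(T_query))
```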

  12. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684

  13. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  14. Computer-Based Cognitive Tools: Description and Design.

    ERIC Educational Resources Information Center

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  15. The Clinical Impact of Accurate Cystine Calculi Characterization Using Dual-Energy Computed Tomography.

    PubMed

    Haley, William E; Ibrahim, El-Sayed H; Qu, Mingliang; Cernigliaro, Joseph G; Goldfarb, David S; McCollough, Cynthia H

    2015-01-01

    Dual-energy computed tomography (DECT) has recently been suggested as the imaging modality of choice for kidney stones due to its ability to provide information on stone composition. Standard postprocessing of the dual-energy images accurately identifies uric acid stones, but not other types. Cystine stones can be identified from DECT images when analyzed with advanced postprocessing. This case report describes clinical implications of accurate diagnosis of cystine stones using DECT.

  16. VISTA - computational tools for comparative genomics

    SciTech Connect

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate the capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  17. VISTA: computational tools for comparative genomics.

    PubMed

    Frazer, Kelly A; Pachter, Lior; Poliakov, Alexander; Rubin, Edward M; Dubchak, Inna

    2004-07-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here, we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/vista/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, to submit their own sequences of interest to several VISTA servers for various types of comparative analysis and to obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kb interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  18. Tools for remote computing in accelerator control

    NASA Astrophysics Data System (ADS)

    Anderssen, Pal S.; Frammery, Veronique; Wilcke, Rainer

    1990-08-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The Network Compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the Network Compiler is a single Interface Description File provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the Network Compiler, the Interface Description File automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file.

  19. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  20. Computer Series, 101: Accurate Equations of State in Computational Chemistry Projects.

    ERIC Educational Resources Information Center

    Albee, David; Jones, Edward

    1989-01-01

    Discusses the use of computers in chemistry courses at the United States Military Academy. Provides two examples of computer projects: (1) equations of state, and (2) solving for molar volume. Presents BASIC and PASCAL listings for the second project. Lists 10 applications for physical chemistry. (MVL)
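
    As a worked example in the spirit of the "solving for molar volume" project (in Python rather than the BASIC and PASCAL of the article), Newton's method applied to the van der Waals equation of state might look like this. The choice of van der Waals and the CO2 constants are assumptions for illustration.

```python
# Solve the van der Waals equation of state for molar volume with Newton's
# method: f(V) = (P + a/V**2) * (V - b) - R*T = 0.
R = 0.082057  # L atm / (mol K)

def molar_volume(P, T, a, b, V0=None, tol=1e-10, max_iter=100):
    V = V0 if V0 is not None else R * T / P      # ideal-gas starting guess
    for _ in range(max_iter):
        f = (P + a / V**2) * (V - b) - R * T
        df = P - a / V**2 + 2 * a * b / V**3     # df/dV
        V_new = V - f / df
        if abs(V_new - V) < tol:
            return V_new
        V = V_new
    raise RuntimeError("Newton iteration did not converge")

# CO2 at 1 atm and 300 K (a, b from standard tables): roughly 24.6 L/mol.
print(molar_volume(P=1.0, T=300.0, a=3.59, b=0.0427))
```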

  1. Computational and Physical Quality Assurance Tools for Radiotherapy

    NASA Astrophysics Data System (ADS)

    Graves, Yan Jiang

    Radiation therapy aims at delivering a prescribed amount of radiation dose to cancerous targets while sparing dose to normal organs. Treatment planning and delivery in modern radiotherapy are highly complex. To ensure the accuracy of the dose delivered to a patient, a quality assurance (QA) procedure is needed before the actual treatment delivery. This dissertation aims at developing computational and physical tools to facilitate the QA process. In Chapter 2, we developed a fast and accurate computational QA tool using a graphics processing unit based Monte Carlo (MC) dose engine. This QA tool aims at identifying any errors in the treatment planning stage and the machine delivery process by comparing three dose distributions: the planned dose computed by a treatment planning system, and the planned and delivered doses reconstructed using the MC method. Within this tool, several modules have been built: (1) a denoising algorithm to smooth the MC calculated dose, along with an investigation of the effects of statistical uncertainty in MC simulations on a commonly used dose comparison metric; (2) a linear accelerator source model with a semi-automatic commissioning process; and (3) a fluence generation module. With all these modules, a web application for this QA tool with a user-friendly interface has been developed to provide users with easy access to our tool, facilitating its clinical utilization. Even after an initial treatment plan fulfills the QA requirements, a patient may experience inter-fractional anatomy variations, which compromise the initial plan optimality. To resolve this issue, adaptive radiotherapy (ART) has been proposed, where the treatment plan is redesigned based on the most recent patient anatomy. In Chapter 3, we constructed a physical deformable head and neck (HN) phantom with in-vivo dosimetry capability. This phantom resembles HN patient geometry and simulates tumor shrinkage with a high level of realism. The ground truth deformation field can be measured
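
    A much-simplified sketch of the dose-comparison step: compare a planned and a reconstructed dose distribution voxel by voxel and report the fraction of voxels within a relative tolerance. This stands in for the distance-aware gamma-type metrics used in clinical QA and is not the dissertation's tool; the tolerance, low-dose cutoff and toy dose arrays are assumptions.

```python
import numpy as np

def dose_pass_rate(dose_ref, dose_eval, dose_tol=0.03, low_dose_cut=0.10):
    """Fraction of voxels whose evaluated dose agrees with the reference
    within a relative tolerance.  A simplified stand-in for the gamma index
    commonly used in plan QA (no distance-to-agreement search)."""
    ref_max = dose_ref.max()
    mask = dose_ref > low_dose_cut * ref_max          # ignore very low doses
    rel_diff = np.abs(dose_eval[mask] - dose_ref[mask]) / ref_max
    return np.mean(rel_diff <= dose_tol)

# Toy example: planned dose vs. an "MC-reconstructed" dose with noise and a
# small systematic shift standing in for a delivery error.
rng = np.random.default_rng(0)
planned = np.clip(rng.normal(2.0, 0.5, size=(64, 64, 32)), 0.0, None)
delivered = planned * 1.01 + rng.normal(0.0, 0.02, planned.shape)
print(f"pass rate: {dose_pass_rate(planned, delivered):.3f}")
```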

  2. High-order computational fluid dynamics tools for aircraft design.

    PubMed

    Wang, Z J

    2014-08-13

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  3. High-order computational fluid dynamics tools for aircraft design.

    PubMed

    Wang, Z J

    2014-08-13

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items.

  4. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  5. Stable, accurate and efficient computation of normal modes for horizontal stratified models

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chen, Xiaofei

    2016-06-01

    We propose an adaptive root-determining strategy that is very useful when dealing with trapped modes or Stoneley modes whose energies become very insignificant on the free surface in the presence of low-velocity layers or fluid layers in the model. Loss of modes in these cases or inaccuracy in the calculation of these modes may then be easily avoided. Built upon the generalized reflection/transmission coefficients, the concept of "family of secular functions" that we herein call "adaptive mode observers" is thus naturally introduced to implement this strategy, the underlying idea of which has been distinctly noted for the first time and may be generalized to other applications such as free oscillations or applied to other methods in use when these cases are encountered. Additionally, we have made further improvements upon the generalized reflection/transmission coefficient method; mode observers associated with only the free surface and low-velocity layers (and the fluid/solid interface if the model contains fluid layers) are adequate to guarantee no loss and high precision at the same time of any physically existent modes without excessive calculations. Finally, the conventional definition of the fundamental mode is reconsidered, which is entailed in the cases under study. Some computational aspects are remarked on. With the additional help afforded by our superior root-searching scheme and the possibility of speeding up the calculation using a smaller number of layers aided by the concept of "turning point", our algorithm is remarkably efficient as well as stable and accurate and can be used as a powerful tool for widely related applications.
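
    The general idea of hunting for roots of a secular function without losing closely spaced or near-tangent modes can be sketched as below. The toy dispersion function and the scan/bisection strategy are illustrative only and are not the generalized reflection/transmission "adaptive mode observers" of this record.

```python
import numpy as np
from scipy.optimize import brentq

def find_modes(secular, c_min, c_max, coarse_steps=2000, refine_steps=8):
    """Locate roots of a secular (dispersion) function by scanning for sign
    changes and, where the coarse grid shows none, re-scanning the cell more
    finely before refining each bracketed root with brentq."""
    cs = np.linspace(c_min, c_max, coarse_steps)
    vals = np.array([secular(c) for c in cs])
    roots = []
    for c0, c1, f0, f1 in zip(cs[:-1], cs[1:], vals[:-1], vals[1:]):
        if f0 == 0.0:
            roots.append(c0)
        elif f0 * f1 < 0.0:
            roots.append(brentq(secular, c0, c1))
        else:
            # possible near-tangent or closely spaced roots: refine the cell
            sub = np.linspace(c0, c1, refine_steps + 1)
            fs = np.array([secular(c) for c in sub])
            for a, b, fa, fb in zip(sub[:-1], sub[1:], fs[:-1], fs[1:]):
                if fa * fb < 0.0:
                    roots.append(brentq(secular, a, b))
    return roots

# Toy secular function with closely spaced zeros, standing in for a real
# surface-wave dispersion function at a fixed frequency.
secular = lambda c: np.sin(50.0 / c) * (c - 2.8)
print(find_modes(secular, 1.0, 5.0))
```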

  6. Stable, accurate and efficient computation of normal modes for horizontal stratified models

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chen, Xiaofei

    2016-08-01

    We propose an adaptive root-determining strategy that is very useful when dealing with trapped modes or Stoneley modes whose energies become very insignificant on the free surface in the presence of low-velocity layers or fluid layers in the model. Loss of modes in these cases or inaccuracy in the calculation of these modes may then be easily avoided. Built upon the generalized reflection/transmission coefficients, the concept of `family of secular functions' that we herein call `adaptive mode observers' is thus naturally introduced to implement this strategy, the underlying idea of which has been distinctly noted for the first time and may be generalized to other applications such as free oscillations or applied to other methods in use when these cases are encountered. Additionally, we have made further improvements upon the generalized reflection/transmission coefficient method; mode observers associated with only the free surface and low-velocity layers (and the fluid/solid interface if the model contains fluid layers) are adequate to guarantee no loss and high precision at the same time of any physically existent modes without excessive calculations. Finally, the conventional definition of the fundamental mode is reconsidered, which is entailed in the cases under study. Some computational aspects are remarked on. With the additional help afforded by our superior root-searching scheme and the possibility of speeding up the calculation using a smaller number of layers aided by the concept of `turning point', our algorithm is remarkably efficient as well as stable and accurate and can be used as a powerful tool for widely related applications.

  7. Computer as Research Tools 4.Use Your PC More Effectively

    NASA Astrophysics Data System (ADS)

    Baba, Hajime

    This article introduces useful tools for personal computers: electronic dictionaries, a full-text search system, simple usage of a preprint server, and a numeric computation language for applications in engineering and science.

  8. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  9. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  10. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  11. UP-TORR: online tool for accurate and Up-to-Date annotation of RNAi Reagents.

    PubMed

    Hu, Yanhui; Roesel, Charles; Flockhart, Ian; Perkins, Lizabeth; Perrimon, Norbert; Mohr, Stephanie E

    2013-09-01

    RNA interference (RNAi) is a widely adopted tool for loss-of-function studies but RNAi results only have biological relevance if the reagents are appropriately mapped to genes. Several groups have designed and generated RNAi reagent libraries for studies in cells or in vivo for Drosophila and other species. At first glance, matching RNAi reagents to genes appears to be a simple problem, as each reagent is typically designed to target a single gene. In practice, however, the reagent-gene relationship is complex. Although the sequences of oligonucleotides used to generate most types of RNAi reagents are static, the reference genome and gene annotations are regularly updated. Thus, at the time a researcher chooses an RNAi reagent or analyzes RNAi data, the most current interpretation of the RNAi reagent-gene relationship, as well as related information regarding specificity (e.g., predicted off-target effects), can be different from the original interpretation. Here, we describe a set of strategies and an accompanying online tool, UP-TORR (for Updated Targets of RNAi Reagents; www.flyrnai.org/up-torr), useful for accurate and up-to-date annotation of cell-based and in vivo RNAi reagents. Importantly, UP-TORR automatically synchronizes with gene annotations daily, retrieving the most current information available, and for Drosophila, also synchronizes with the major reagent collections. Thus, UP-TORR allows users to choose the most appropriate RNAi reagents at the onset of a study, as well as to perform the most appropriate analyses of results of RNAi-based studies.

  12. A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.

    PubMed

    Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D

    2014-02-01

    In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose by studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the use of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of color moment invariants. PMID:24216719

  13. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    ERIC Educational Resources Information Center

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  14. Widgets on the Web: Using Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Miller, Darcy; Brown, Abbie; Robinson, LeAnne

    2002-01-01

    This article describes Widgets, computer-based learning tools designed to meet the instructional need for computer-based flexible tools that can be used across ability levels. It discusses development of three Widgets for students with mild disabilities: a multiplication with sets Widget, a number sense Widget, and a dollars and cents Widget.…

  15. MicroRNA-200 Family Profile: A Promising Ancillary Tool for Accurate Cancer Diagnosis.

    PubMed

    Liu, Xiaodong; Zhang, Jianhua; Xie, Botao; Li, Hao; Shen, Jihong; Chen, Jianheng

    2016-01-01

    Cancer is one of the most threatening diseases in the world, and great interest has been paid to discovering accurate and noninvasive methods for cancer diagnosis. The value of the microRNA-200 (miRNA-200, miR-200) family has been revealed in many studies. However, the results from various studies were inconsistent, and thus a meta-analysis was designed and performed to assess the overall value of miRNA-200 in cancer diagnosis. Relevant studies were searched electronically from the following databases: PubMed, Embase, Web of Science, the Cochrane Library, and Chinese National Knowledge Infrastructure. Keywords combining "miR-200," "cancer," and "diagnosis" in any field were used for searching relevant studies. Then, the pooled sensitivity, specificity, area under the curve (AUC), and partial AUC were calculated using the random-effects model. Heterogeneity among individual studies was also explored by subgroup analyses. A total of 28 studies from 18 articles with an overall sample size of 3676 subjects (2097 patients and 1579 controls) were included in this meta-analysis. The overall sensitivity and specificity with 95% confidence intervals (95% CIs) are 0.709 (95% CI: 0.657-0.755) and 0.667 (95% CI: 0.617-0.713), respectively. Additionally, the AUC and partial AUC for the pooled data are 0.735 and 0.627, respectively. Subgroup analyses revealed that using the miRNA-200 family for cancer diagnosis is more effective in white than in Asian ethnic groups. In addition, cancer diagnosis by miRNA using circulating specimens is more effective than that using noncirculating specimens. Finally, miRNA is more accurate in diagnosing endometrial cancer than other types of cancer, and some miR-200 family members (miR-200b and miR-429) have superior diagnostic accuracy compared with other miR-200 family members. In conclusion, profiling of the miRNA-200 family is likely to be a valuable tool in cancer detection and diagnosis.

  16. Computers as a Language Learning Tool.

    ERIC Educational Resources Information Center

    Ruschoff, Bernd

    1984-01-01

    Describes a computer-assisted language learning project at the University of Wuppertal (West Germany). It is hoped that teachers can overcome two handicaps of the past--lack of teacher awareness of current audio-visual technical aids and unsophisticated computer hardware--by getting the opportunity to familiarize…

  17. Computing tools for accelerator design calculations

    SciTech Connect

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost-effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  18. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h^-1 Gpc box sampled with 2048^3 particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (~1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc^-1, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h^-1 Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.

  19. Fast and accurate computation of system matrix for area integral model-based algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Zhang, Shunli; Zhang, Dinghua; Gong, Hao; Ghasemalizadeh, Omid; Wang, Ge; Cao, Guohua

    2014-11-01

    Iterative algorithms, such as the algebraic reconstruction technique (ART), are popular for image reconstruction. For iterative reconstruction, the area integral model (AIM) is more accurate for better reconstruction quality than the line integral model (LIM). However, the computation of the system matrix for AIM is more complex and time-consuming than that for LIM. Here, we propose a fast and accurate method to compute the system matrix for AIM. First, we calculate the intersection of each boundary line of a narrow fan-beam with pixels in a recursive and efficient manner. Then, by grouping the beam-pixel intersection area into six types according to the slopes of the two boundary lines, we analytically compute the intersection area of the narrow fan-beam with the pixels in a simple algebraic fashion. Overall, experimental results show that our method is about three times faster than the Siddon algorithm and about two times faster than the distance-driven model (DDM) in computation of the system matrix. The reconstruction speed of our AIM-based ART is also faster than the LIM-based ART that uses the Siddon algorithm and DDM-based ART, for one iteration. The fast reconstruction speed of our method was accomplished without compromising the image quality.
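
    The area-integral idea--weight each pixel by the area it shares with a narrow fan-beam--can be illustrated with generic polygon clipping: clip the pixel square against the two half-planes bounded by the beam edges and take the shoelace area. The six-type analytical classification described above is replaced here by a general-purpose routine, so this is only a sketch of the model, not of the paper's fast method.

```python
import numpy as np

def clip_halfplane(poly, a, b, c):
    """Sutherland-Hodgman clip of polygon `poly` (list of (x, y) vertices)
    against the half-plane a*x + b*y + c >= 0."""
    out = []
    n = len(poly)
    for i in range(n):
        p, q = poly[i], poly[(i + 1) % n]
        fp = a * p[0] + b * p[1] + c
        fq = a * q[0] + b * q[1] + c
        if fp >= 0:
            out.append(p)
        if fp * fq < 0:                       # edge crosses the boundary line
            t = fp / (fp - fq)
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def polygon_area(poly):
    """Shoelace formula."""
    if len(poly) < 3:
        return 0.0
    x = np.array([p[0] for p in poly])
    y = np.array([p[1] for p in poly])
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def beam_pixel_area(ix, iy, lower, upper):
    """Area of pixel (ix, iy), treated as a unit square, lying between the two
    boundary lines of a narrow fan-beam.  Each line is given as (a, b, c) with
    the beam on the side where a*x + b*y + c >= 0."""
    pixel = [(ix, iy), (ix + 1, iy), (ix + 1, iy + 1), (ix, iy + 1)]
    clipped = clip_halfplane(pixel, *lower)
    clipped = clip_halfplane(clipped, *upper)
    return polygon_area(clipped)

# Beam bounded below by y >= 0.3 and above by y <= x; expected area 0.245.
print(beam_pixel_area(0, 0, lower=(0.0, 1.0, -0.3), upper=(1.0, -1.0, 0.0)))
```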

  20. High resolution DEM from Tandem-X interferometry: an accurate tool to characterize volcanic activity

    NASA Astrophysics Data System (ADS)

    Albino, Fabien; Kervyn, Francois

    2013-04-01

    The Tandem-X mission was launched by the German agency (DLR) in June 2010. It is a new-generation high resolution SAR sensor mainly dedicated to topographic applications. For the purpose of our research, focused on the study of volcano-tectonic activity in the Kivu Rift area, a set of Tandem-X bistatic radar images was used to produce a high resolution InSAR DEM of the Virunga Volcanic Province (VVP). The VVP is part of the Western branch of the African rift, situated at the boundary between D.R. Congo, Rwanda and Uganda. It has two highly active volcanoes, Nyiragongo and Nyamulagira. A first task concerns the quantitative assessment of the vertical accuracy that can be achieved with these new data. The new DEMs are compared to other spaceborne datasets (SRTM, ASTER) but also to field measurements given by differential GPS. Multi-temporal radar acquisitions allow us to produce several DEMs of the same area. This appeared to be very useful in an active volcanic context where new geomorphological features (faults, fissures, volcanic cones and lava flows) appear continuously through time. For example, since the year 2000, the time of the SRTM acquisition, there has been one eruption at Nyiragongo (2002) and six eruptions at Nyamulagira (2001, 2002, 2004, 2006, 2010 and 2011), which all induce large changes in the landscape with the emplacement of new lava fields and scoria cones. From our repetitive Tandem-X DEM production, we have a tool to identify and also quantify, in terms of size and volume, all the topographic changes related to this past volcanic activity. These parameters are high-value information to improve the understanding of the Virunga volcanoes; the accurate estimation of erupted volume and knowledge of structural features associated with past eruptions are key parameters to understand the volcanic system, to improve hazard assessment, and finally to contribute to risk mitigation in a densely populated area.
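
    Quantifying topographic change from repeated DEMs reduces, at its simplest, to differencing co-registered grids and integrating the elevation change over the cell area. The sketch below does exactly that on synthetic data; the noise floor, cell size and toy lava flow are assumptions, and real processing (co-registration, InSAR phase unwrapping, error propagation) is omitted.

```python
import numpy as np

def lava_volume_from_dems(dem_before, dem_after, cell_size_m, noise_floor_m=2.0):
    """Estimate the volume of new deposits by differencing two co-registered
    DEMs of the same area.  Elevation changes smaller than `noise_floor_m`
    are treated as DEM noise and ignored."""
    dh = dem_after - dem_before
    dh = np.where(np.abs(dh) < noise_floor_m, 0.0, dh)
    emplaced = np.nansum(np.clip(dh, 0.0, None)) * cell_size_m**2
    removed = np.nansum(np.clip(dh, None, 0.0)) * cell_size_m**2
    return emplaced, removed          # volumes in cubic metres

# Toy 200x200 grid with a synthetic 8 m-thick lava flow over ~1200 cells.
rng = np.random.default_rng(1)
before = rng.normal(1500.0, 1.0, (200, 200))
after = before + rng.normal(0.0, 1.0, before.shape)
after[80:110, 40:80] += 8.0
v_plus, v_minus = lava_volume_from_dems(before, after, cell_size_m=12.0)
print(f"emplaced: {v_plus / 1e6:.2f} Mm^3, removed: {v_minus / 1e6:.2f} Mm^3")
```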

  1. Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method.

    PubMed

    Zhao, Yan; Cao, Liangcai; Zhang, Hao; Kong, Dezhao; Jin, Guofan

    2015-10-01

    Fast calculation and correct depth cue are crucial issues in the calculation of computer-generated hologram (CGH) for high quality three-dimensional (3-D) display. An angular-spectrum based algorithm for layer-oriented CGH is proposed. Angular spectra from each layer are synthesized as a layer-corresponded sub-hologram based on the fast Fourier transform without paraxial approximation. The proposed method can avoid the huge computational cost of the point-oriented method and yield accurate predictions of the whole diffracted field compared with other layer-oriented methods. CGHs of versatile formats of 3-D digital scenes, including computed tomography and 3-D digital models, are demonstrated with precise depth performance and advanced image quality. PMID:26480062
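
    A minimal numpy sketch of non-paraxial angular-spectrum propagation, plus a naive layer-by-layer synthesis that back-propagates each depth layer to the hologram plane and sums the fields. The random initial phase per layer and the omission of occlusion handling and phase-only encoding are simplifying assumptions, not the authors' algorithm.

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    """Propagate a sampled complex field u0 a distance z using the
    non-paraxial angular-spectrum transfer function."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2j * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz * z) * (arg > 0)               # evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(u0) * H)

def layer_oriented_hologram(layers, depths, wavelength, dx):
    """Sum the fields back-propagated from each depth layer to the hologram
    plane (a sketch of the layer-by-layer synthesis idea only)."""
    hologram = np.zeros_like(layers[0], dtype=complex)
    for amp, z in zip(layers, depths):
        # random initial phase per layer is a common, assumed choice
        field = amp * np.exp(2j * np.pi * np.random.rand(*amp.shape))
        hologram += angular_spectrum_propagate(field, wavelength, dx, -z)
    return hologram

# Toy scene: two 256x256 layers at 50 mm and 80 mm from the hologram plane.
layers = [np.zeros((256, 256)) for _ in range(2)]
layers[0][100:140, 100:140] = 1.0
layers[1][60:90, 160:200] = 1.0
cgh = layer_oriented_hologram(layers, depths=[0.05, 0.08],
                              wavelength=532e-9, dx=8e-6)
```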

  2. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  3. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher-order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme. The MacCormack 2-4 scheme is second-order accurate in time and fourth-order accurate in space, and is stable for CFL numbers below 2/3. Current research has shown that the method can give accurate results but does exhibit significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson-type second-, third-, and fourth-order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research has found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second- and third-order terms in the damping. Use of second- and fourth-order terms reduced the magnitude of dispersion errors, but not as effectively as the second- and third-order combination. The program was coded using Thinking Machines' CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
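
    For a scalar model problem, one Gottlieb-Turkel-style 2-4 MacCormack step can be written compactly; the sketch below advects a smooth pulse on a periodic grid at CFL = 0.25, the value quoted above. It is offered only to show the predictor-corrector structure--the cited work solves the full compressible Navier-Stokes equations and adds artificial viscosity, neither of which is reproduced here.

```python
import numpy as np

def maccormack_24_step(u, c, dt, dx):
    """One Gottlieb-Turkel 2-4 MacCormack step for u_t + c u_x = 0 on a
    periodic grid: a one-sided (7, -8, 1)/6 forward-biased predictor followed
    by the mirrored backward-biased corrector (second order in time, fourth
    order in space)."""
    lam = c * dt / (6.0 * dx)
    f = c * u
    # predictor
    up = u - lam * (-np.roll(f, -2) + 8.0 * np.roll(f, -1) - 7.0 * f)
    fp = c * up
    # corrector
    return 0.5 * (u + up - lam * (7.0 * fp - 8.0 * np.roll(fp, 1)
                                  + np.roll(fp, 2)))

# Advect a smooth pulse once around a periodic domain at CFL = 0.25.
nx, c = 400, 1750.0                      # jet speed in cm/s, as quoted above
dx = 0.1                                 # cm
dt = 0.25 * dx / c
x = np.arange(nx) * dx
u = np.exp(-0.5 * ((x - 0.5 * nx * dx) / (5 * dx))**2)
u_exact = u.copy()
for _ in range(int(round(nx * dx / (c * dt)))):
    u = maccormack_24_step(u, c, dt, dx)
print("max error after one period:", np.abs(u - u_exact).max())
```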

  4. Accurate guidance for percutaneous access to a specific target in soft tissues: preclinical study of computer-assisted pericardiocentesis.

    PubMed

    Chavanon, O; Barbe, C; Troccaz, J; Carrat, L; Ribuot, C; Noirclerc, M; Maitrasse, B; Blin, D

    1999-06-01

    In the field of percutaneous access to soft tissues, our project was to improve classical pericardiocentesis by performing accurate guidance to a selected target, according to a model of the pericardial effusion acquired through three-dimensional (3D) data recording. Required hardware is an echocardiographic device and a needle, both linked to a 3D localizer, and a computer. After acquiring echographic data, a modeling procedure allows definition of the optimal puncture strategy, taking into consideration the mobility of the heart, by determining a stable region, whatever the period of the cardiac cycle. A passive guidance system is then used to reach the planned target accurately, generally a site in the middle of the stable region. After validation on a dynamic phantom and a feasibility study in dogs, an accuracy and reliability analysis protocol was realized on pigs with experimental pericardial effusion. Ten consecutive successful punctures using various trajectories were performed on eight pigs. Nonbloody liquid was collected from pericardial effusions in the stable region (5 to 9 mm wide) within 10 to 15 minutes from echographic acquisition to drainage. Accuracy of at least 2.5 mm was demonstrated. This study demonstrates the feasibility of computer-assisted pericardiocentesis. Beyond the simple improvement of the current technique, this method could be a new way to reach the heart or a new tool for percutaneous access and image-guided puncture of soft tissues. Further investigation will be necessary before routine human application.

  5. RapMap: a rapid, sensitive and accurate tool for mapping RNA-seq reads to transcriptomes

    PubMed Central

    Srivastava, Avi; Sarkar, Hirak; Gupta, Nitish; Patro, Rob

    2016-01-01

    Motivation: The alignment of sequencing reads to a transcriptome is a common and important step in many RNA-seq analysis tasks. When aligning RNA-seq reads directly to a transcriptome (as is common in the de novo setting or when a trusted reference annotation is available), care must be taken to report the potentially large number of multi-mapping locations per read. This can pose a substantial computational burden for existing aligners, and can considerably slow downstream analysis. Results: We introduce a novel concept, quasi-mapping, and an efficient algorithm implementing this approach for mapping sequencing reads to a transcriptome. By attempting only to report the potential loci of origin of a sequencing read, and not the base-to-base alignment by which it derives from the reference, RapMap—our tool implementing quasi-mapping—is capable of mapping sequencing reads to a target transcriptome substantially faster than existing alignment tools. The algorithm we use to implement quasi-mapping uses several efficient data structures and takes advantage of the special structure of shared sequence prevalent in transcriptomes to rapidly provide highly-accurate mapping information. We demonstrate how quasi-mapping can be successfully applied to the problems of transcript-level quantification from RNA-seq reads and the clustering of contigs from de novo assembled transcriptomes into biologically meaningful groups. Availability and implementation: RapMap is implemented in C++11 and is available as open-source software, under GPL v3, at https://github.com/COMBINE-lab/RapMap. Contact: rob.patro@cs.stonybrook.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27307617
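
    The quasi-mapping idea--report candidate transcripts of origin without computing a base-to-base alignment--can be caricatured with a plain k-mer lookup. The toy below is not RapMap's data structure or algorithm; the k-mer length, hit threshold and sequences are arbitrary, and it exists only to show how shared sequence produces multi-mapping reports.

```python
from collections import defaultdict

K = 31  # k-mer length (an illustrative choice)

def build_kmer_index(transcripts):
    """Map every k-mer to the set of transcripts containing it."""
    index = defaultdict(set)
    for name, seq in transcripts.items():
        for i in range(len(seq) - K + 1):
            index[seq[i:i + K]].add(name)
    return index

def candidate_loci(read, index, min_hits=2):
    """Report transcripts sharing at least `min_hits` k-mers with the read --
    potential loci of origin only, with no base-to-base alignment."""
    hits = defaultdict(int)
    for i in range(len(read) - K + 1):
        for t in index.get(read[i:i + K], ()):
            hits[t] += 1
    return sorted(t for t, n in hits.items() if n >= min_hits)

transcripts = {
    "txA": "ACGT" * 40,
    "txB": "ACGT" * 10 + "TTTTGGGGCCCCAAAA" * 8,
}
index = build_kmer_index(transcripts)
read = ("ACGT" * 40)[37:37 + 75]     # simulated 75 bp read drawn from txA
# Prints ['txA', 'txB']: both transcripts share the repeat, which is exactly
# the multi-mapping situation the abstract highlights.
print(candidate_loci(read, index))
```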

  6. An accurate Fortran code for computing hydrogenic continuum wave functions at a wide range of parameters

    NASA Astrophysics Data System (ADS)

    Peng, Liang-You; Gong, Qihuang

    2010-12-01

    The accurate computation of hydrogenic continuum wave functions is very important in many branches of physics, such as electron-atom collisions, cold atom physics, and atomic ionization in strong laser fields. Although various algorithms and codes already exist, most of them are only reliable in certain ranges of parameters. In some practical applications, accurate continuum wave functions need to be calculated at extremely low energies, large radial distances and/or large angular momentum numbers. Here we provide such a code, which can generate accurate hydrogenic continuum wave functions and the corresponding Coulomb phase shifts over a wide range of parameters. Without any essential restriction on the angular momentum number, the present code is able to give reliable results at the electron energy range [10,10] eV for radial distances of [10,10] a.u. We also find the present code to be very efficient, and it should find numerous applications in many fields such as strong field physics. Program summary: Program title: HContinuumGautchi; Catalogue identifier: AEHD_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHD_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 1233; No. of bytes in distributed program, including test data, etc.: 7405; Distribution format: tar.gz; Programming language: Fortran90 in fixed format; Computer: AMD Processors; Operating system: Linux; RAM: 20 MBytes; Classification: 2.7, 4.5; Nature of problem: The accurate computation of atomic continuum wave functions is very important in many research fields such as strong field physics and cold atom physics. Although various algorithms and codes already exist, most of them are applicable and reliable only in a certain range of parameters. We present here an accurate FORTRAN program for
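
    When validating such a code, it can be convenient to cross-check a few values against an independent arbitrary-precision implementation. The sketch below does this with mpmath's Coulomb wave functions (assuming mpmath is available with coulombf/coulombg); the energy, angular momenta and radii are arbitrary test points, and the parameter conventions should be checked against the Fortran code's documentation.

```python
# Cross-checking a few hydrogenic continuum values with mpmath's arbitrary-
# precision Coulomb wave functions (an independent reference, not the
# Fortran code described above).
from mpmath import mp, mpf, sqrt, coulombf, coulombg, nstr

mp.dps = 30                      # work with 30 significant digits

Z = 1                            # hydrogenic charge
E = mpf('1e-4')                  # electron energy in atomic units (hartree)
k = sqrt(2 * E)                  # wavenumber
eta = -Z / k                     # Sommerfeld parameter (attractive Coulomb)

for l in (0, 5, 20):
    for r in (mpf(1), mpf(100), mpf(10000)):   # radial distance in a.u.
        F = coulombf(l, eta, k * r)            # regular Coulomb function F_l
        G = coulombg(l, eta, k * r)            # irregular Coulomb function G_l
        print(f"l={l:2d}  r={float(r):8.0f}  F={nstr(F, 8)}  G={nstr(G, 8)}")
```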

  7. Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?

    ERIC Educational Resources Information Center

    Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine

    2014-01-01

    Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…

  8. Time-accurate Navier-Stokes computations of classical two-dimensional edge tone flow fields

    NASA Technical Reports Server (NTRS)

    Liu, B. L.; O'Farrell, J. M.; Jones, Jess H.

    1990-01-01

    Time-accurate Navier-Stokes computations were performed to study a Class II (acoustic) whistle, the edge tone, and gain knowledge of the vortex-acoustic coupling mechanisms driving production of these tones. Results were obtained by solving the full Navier-Stokes equations for laminar compressible air flow of a two-dimensional jet issuing from a slit interacting with a wedge. Cases considered were determined by varying the distance from the slit to the edge. Flow speed was kept constant at 1750 cm/sec as was the slit thickness of 0.1 cm, corresponding to conditions in the experiments of Brown. Excellent agreement was obtained in all four edge tone stage cases between the present computational results and the experimentally obtained results of Brown. Specific edge tone generated phenomena and further confirmation of certain theories concerning these phenomena were brought to light in this analytical simulation of edge tones.

  9. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing software identical to that on large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  10. A Computer-Based Tool for Introducing Turfgrass Species.

    ERIC Educational Resources Information Center

    Fermanian, T. W.; Wehner, D. J.

    1995-01-01

    Describes a self-contained computer application constructed using the SuperCard development tool which introduces the characteristics of turfgrass species and their optimum environments. Evaluates students' gain in understanding turf species characteristics through this approach. (LZ)

  11. Astronaut's tool for withdrawing/replacing computer cards

    NASA Technical Reports Server (NTRS)

    West, R. L.

    1969-01-01

    Symmetrical tool allows astronauts to withdraw and replace Apollo Telescope Mount control computer cards. It is easily manipulated by a gloved hand, provides positive locking of a withdrawn card, and has a visible locking device.

  12. Accurate and computationally efficient mixing models for the simulation of turbulent mixing with PDF methods

    NASA Astrophysics Data System (ADS)

    Meyer, Daniel W.; Jenny, Patrick

    2013-08-01

    Different simulation methods are applicable to study turbulent mixing. When applying probability density function (PDF) methods, turbulent transport and chemical reactions appear in closed form, which is not the case in second moment closure methods (RANS). Moreover, PDF methods provide the entire joint velocity-scalar PDF instead of a limited set of moments. In PDF methods, however, a mixing model is required to account for molecular diffusion. In joint velocity-scalar PDF methods, mixing models should also account for the joint velocity-scalar statistics, which is often underappreciated in applications. The interaction by exchange with the conditional mean (IECM) model accounts for these joint statistics, but requires velocity-conditional scalar means that are expensive to compute in spatially three-dimensional settings. In this work, two alternative mixing models are presented that provide more accurate PDF predictions at reduced computational cost compared to the IECM model, since no conditional moments have to be computed. All models are tested for different mixing benchmark cases and their computational efficiencies are inspected thoroughly. The benchmark cases involve statistically homogeneous and inhomogeneous settings dealing with three streams that are characterized by two passive scalars. The inhomogeneous case clearly illustrates the importance of accounting for joint velocity-scalar statistics in the mixing model. Failure to do so leads to significant errors in the resulting scalar means, variances and other statistics.
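
    As a concrete illustration of what a particle mixing model does, the sketch below implements the classical IEM model (relaxation of each particle's scalar toward the unconditional mean). It is deliberately simpler than the IECM and the two new models discussed above, which condition on velocity; all numerical values are arbitrary.

    ```python
    # Minimal particle-mixing sketch. For clarity this implements the classical
    # IEM model (relaxation toward the unconditional scalar mean); the IECM model
    # discussed above relaxes toward velocity-conditioned means instead, which is
    # what makes it expensive in three dimensions. All values are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    phi = rng.uniform(0.0, 1.0, size=10_000)    # particle scalar values
    C_phi, omega, dt = 2.0, 50.0, 1e-3          # mixing constant, turbulence frequency, time step

    for _ in range(200):
        mean = phi.mean()
        phi += -0.5 * C_phi * omega * (phi - mean) * dt   # IEM relaxation step

    print(phi.mean(), phi.var())   # the mean is conserved while the variance decays
    ```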

  13. Accurate identification and compensation of geometric errors of 5-axis CNC machine tools using double ball bar

    NASA Astrophysics Data System (ADS)

    Lasemi, Ali; Xue, Deyi; Gu, Peihua

    2016-05-01

    Five-axis CNC machine tools are widely used in manufacturing of parts with free-form surfaces. Geometric errors of machine tools have significant effects on the quality of manufactured parts. This research focuses on development of a new method to accurately identify geometric errors of 5-axis CNC machines, especially the errors due to rotary axes, using the magnetic double ball bar. A theoretical model for identification of geometric errors is provided. In this model, both position-independent errors and position-dependent errors are considered as the error sources. This model is simplified by identification and removal of the correlated and insignificant error sources of the machine. Insignificant error sources are identified using the sensitivity analysis technique. Simulation results reveal that the simplified error identification model can result in more accurate estimations of the error parameters. Experiments on a 5-axis CNC machine tool also demonstrate significant reduction in the volumetric error after error compensation.

  14. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to further both the theoretical understanding of separation processes involving nonideal mixtures and the development of computer tools (algorithms) for them. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  15. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools are used to develop an optimized process from rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  16. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  17. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  18. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.

  19. The Use of Computer Tools to Support Meaningful Learning

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…

  20. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  1. Scratch as a computational modelling tool for teaching physics

    NASA Astrophysics Data System (ADS)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.

  2. DeconMSn: A Software Tool for accurate parent ion monoisotopic mass determination for tandem mass spectra

    SciTech Connect

    Mayampurath, Anoop M.; Jaitly, Navdeep; Purvine, Samuel O.; Monroe, Matthew E.; Auberry, Kenneth J.; Adkins, Joshua N.; Smith, Richard D.

    2008-04-01

    We present a new software tool for tandem MS analyses that: • accurately calculates the monoisotopic mass and charge of high–resolution parent ions • accurately operates regardless of the mass selected for fragmentation • performs independent of instrument settings • enables optimal selection of search mass tolerance for high mass accuracy experiments • is open source and thus can be tailored to individual needs • incorporates a SVM-based charge detection algorithm for analyzing low resolution tandem MS spectra • creates multiple output data formats (.dta, .MGF) • handles .RAW files and .mzXML formats • compatible with SEQUEST, MASCOT, X!Tandem

  3. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists that is caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of the computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models must have. In this setting, users and developers depend on an entire software distribution, possibly involving multiple compilers and special instructions specific to the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open-source scientific software distribution.

  4. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  5. CoMOGrad and PHOG: From Computer Vision to Fast and Accurate Protein Tertiary Structure Retrieval

    PubMed Central

    Karim, Rezaul; Aziz, Mohd. Momin Al; Shatabda, Swakkhar; Rahman, M. Sohel; Mia, Md. Abul Kashem; Zaman, Farhana; Rakin, Salman

    2015-01-01

    The number of entries in a structural database of proteins is increasing day by day. Methods for retrieving protein tertiary structures from such a large database have turned out to be the key to comparative analysis of structures, which plays an important role in understanding proteins and their functions. In this paper, we present fast and accurate methods for the retrieval of proteins having tertiary structures similar to a query protein from a large database. Our proposed methods borrow ideas from the field of computer vision. The speed and accuracy of our methods come from two newly introduced features, the co-occurrence matrix of the oriented gradient and the pyramid histogram of oriented gradient, and the use of Euclidean distance as the distance measure. Experimental results clearly indicate the superiority of our approach in both running time and accuracy. Our method is readily available for use from this website: http://research.buet.ac.bd:8080/Comograd/. PMID:26293226
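
    The retrieval pipeline, gradient-orientation descriptors compared by Euclidean distance, can be sketched generically as below. This is not the paper's CoMOGrad/PHOG implementation; plain HOG features from scikit-image and random 64x64 arrays stand in for the actual structure-derived 2D representations.

    ```python
    # Hedged sketch of the retrieval pipeline only: gradient-orientation features
    # compared by Euclidean distance. Plain HOG descriptors and random 64x64
    # arrays stand in for the paper's CoMOGrad/PHOG features computed on
    # structure-derived 2D representations.
    import numpy as np
    from skimage.feature import hog

    rng = np.random.default_rng(1)
    database = [rng.random((64, 64)) for _ in range(100)]    # stand-in structure "images"
    query = database[42] + 0.01 * rng.random((64, 64))       # noisy copy of entry 42

    def descriptor(img):
        return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

    feats = np.stack([descriptor(img) for img in database])
    dist = np.linalg.norm(feats - descriptor(query), axis=1)  # Euclidean distance, as in the paper
    print("best match:", int(np.argmin(dist)))                # expected: 42
    ```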

  6. Accurate Time-Dependent Traveling-Wave Tube Model Developed for Computational Bit-Error-Rate Testing

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    2001-01-01

    The phenomenal growth of the satellite communications industry has created a large demand for traveling-wave tubes (TWT's) operating with unprecedented specifications requiring the design and production of many novel devices in record time. To achieve this, the TWT industry heavily relies on computational modeling. However, the TWT industry's computational modeling capabilities need to be improved because there are often discrepancies between measured TWT data and that predicted by conventional two-dimensional helical TWT interaction codes. This limits the analysis and design of novel devices or TWT's with parameters differing from what is conventionally manufactured. In addition, the inaccuracy of current computational tools limits achievable TWT performance because optimized designs require highly accurate models. To address these concerns, a fully three-dimensional, time-dependent, helical TWT interaction model was developed using the electromagnetic particle-in-cell code MAFIA (Solution of MAxwell's equations by the Finite-Integration-Algorithm). The model includes a short section of helical slow-wave circuit with excitation fed by radiofrequency input/output couplers, and an electron beam contained by periodic permanent magnet focusing. A cutaway view of several turns of the three-dimensional helical slow-wave circuit with input/output couplers is shown. This has been shown to be more accurate than conventionally used two-dimensional models. The growth of the communications industry has also imposed a demand for increased data rates for the transmission of large volumes of data. To achieve increased data rates, complex modulation and multiple access techniques are employed requiring minimum distortion of the signal as it is passed through the TWT. Thus, intersymbol interference (ISI) becomes a major consideration, as well as suspected causes such as reflections within the TWT. To experimentally investigate effects of the physical TWT on ISI would be

  7. Plant computer applications 'design and implementation tools' set

    SciTech Connect

    Anikanov, S. S.; Stolyetniy, I. V.; Tregubov, M. I.; Guslyakov, O. L.; Gladkov, Y. I.

    2006-07-01

    This paper describes the functionality of an application development tool set intended to support the full scope of the NPP plant computer applications' design process. The Application Development Tools' Set (ADTS), described in this paper, refers to a set of tools intended to capture functional requirements for applications and support the design process from definition of the design basis up to final testing of the developed applications. Several tools developed by Westinghouse facilitate the design of application software at different stages of the design process. These are: NAPDT - Nuclear Application Development Tool; SDODT - Simplified Display Object Development Tool; OPAL - test case execution and documenting tool. The main idea of ADTS is to combine the aforementioned tools with other commercial off-the-shelf (COTS) software into one software environment to facilitate and expedite the development of NPP plant computer applications. The combination of the software tools included in ADTS satisfies industry requirements for application software intended for use in Category B and C systems /2 - 4/. (authors)

  8. Numerical Computation of a Continuous-thrust State Transition Matrix Incorporating Accurate Hardware and Ephemeris Models

    NASA Technical Reports Server (NTRS)

    Ellison, Donald; Conway, Bruce; Englander, Jacob

    2015-01-01

    A significant body of work exists showing that providing a nonlinear programming (NLP) solver with expressions for the problem constraint gradient substantially increases the speed of program execution and can also improve the robustness of convergence, especially for local optimizers. Calculation of these derivatives is often accomplished through the computation of the spacecraft's state transition matrix (STM). If the two-body gravitational model is employed, as is often done in the context of preliminary design, closed-form expressions for these derivatives may be provided. If a high-fidelity dynamics model, which might include perturbing forces such as the gravitational effect from multiple third bodies and solar radiation pressure, is used, then these STMs must be computed numerically. We present a method for the power hardware model and a full ephemeris model. An adaptive-step embedded eighth-order Dormand-Prince numerical integrator is discussed, and a method for the computation of the time-of-flight derivatives in this framework is presented. The use of these numerically calculated derivatives offers a substantial improvement over finite differencing in the context of a global optimizer. Specifically, the inclusion of these STMs into the low-thrust mission design tool chain in use at NASA Goddard Space Flight Center allows for an increased preliminary mission design cadence.
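
    A stripped-down version of numerical STM propagation can be written in a few lines: integrate the state together with the variational equations using an eighth-order Dormand-Prince method. The sketch below uses SciPy's DOP853 with plain two-body dynamics only; the authors' tool additionally models thruster power hardware and a full ephemeris, which are not reproduced here, and the initial orbit is an arbitrary example.

    ```python
    # Stripped-down sketch: propagate a two-body state together with its state
    # transition matrix (STM) by integrating the variational equations with
    # SciPy's DOP853 (an 8th-order Dormand-Prince method). The tool described
    # above also models thruster power hardware and a full ephemeris, which are
    # not reproduced here; the initial orbit is an arbitrary example.
    import numpy as np
    from scipy.integrate import solve_ivp

    MU = 398600.4418  # km^3/s^2, Earth

    def dynamics(t, y):
        r, v = y[:3], y[3:6]
        Phi = y[6:].reshape(6, 6)
        rn = np.linalg.norm(r)
        a = -MU * r / rn**3
        G = MU * (3.0 * np.outer(r, r) / rn**5 - np.eye(3) / rn**3)  # d(accel)/d(position)
        A = np.zeros((6, 6))
        A[:3, 3:] = np.eye(3)
        A[3:, :3] = G
        return np.concatenate([v, a, (A @ Phi).ravel()])

    y0 = np.concatenate([[7000.0, 0.0, 0.0, 0.0, 7.546, 0.0], np.eye(6).ravel()])
    sol = solve_ivp(dynamics, (0.0, 3600.0), y0, method="DOP853", rtol=1e-10, atol=1e-12)
    Phi_end = sol.y[6:, -1].reshape(6, 6)
    print(Phi_end[0, 3])   # sensitivity of the final x position to the initial x velocity
    ```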

  9. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  10. Development of highly accurate approximate scheme for computing the charge transfer integral.

    PubMed

    Pershin, Anton; Szalay, Péter G

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the "exact" scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the "exact" calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature. PMID:26298117
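
    The Taylor-expansion idea can be illustrated numerically: evaluate the expensive quantity at a few displaced geometries, build a low-order expansion about the reference point, and reuse it across the coordinate range. The stand-in function J(x) below is invented for illustration and is not an actual transfer integral.

    ```python
    # Numerical illustration of the Taylor-expansion idea: build a second-order
    # expansion of an "expensive" quantity J(x) about the reference geometry from
    # a few central-difference evaluations, then reuse it across the coordinate
    # range. The stand-in J below is invented and is not a real transfer integral.
    import numpy as np

    def J(x):                        # stand-in for an expensive EOM-CC evaluation
        return 0.10 * np.exp(-0.5 * x) * (1.0 + 0.2 * x**2)

    h = 1e-3
    J0 = J(0.0)
    dJ = (J(h) - J(-h)) / (2.0 * h)              # first derivative at the reference point
    d2J = (J(h) - 2.0 * J0 + J(-h)) / h**2       # second derivative at the reference point

    x = np.linspace(-0.5, 0.5, 11)
    taylor = J0 + dJ * x + 0.5 * d2J * x**2
    print(np.max(np.abs(taylor - J(x))))         # small over this displacement range
    ```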

  11. Parallel Higher-order Finite Element Method for Accurate Field Computations in Wakefield and PIC Simulations

    SciTech Connect

    Candel, A.; Kabel, A.; Lee, L.; Li, Z.; Limborg, C.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Ko, K.; /SLAC

    2009-06-19

    Over the past years, SLAC's Advanced Computations Department (ACD), under SciDAC sponsorship, has developed a suite of 3D (2D) parallel higher-order finite element (FE) codes, T3P (T2P) and Pic3P (Pic2P), aimed at accurate, large-scale simulation of wakefields and particle-field interactions in radio-frequency (RF) cavities of complex shape. The codes are built on the FE infrastructure that supports SLAC's frequency domain codes, Omega3P and S3P, to utilize conformal tetrahedral (triangular) meshes, higher-order basis functions and quadratic geometry approximation. For time integration, they adopt an unconditionally stable implicit scheme. Pic3P (Pic2P) extends T3P (T2P) to treat charged-particle dynamics self-consistently using the PIC (particle-in-cell) approach, the first such implementation on a conformal, unstructured grid using Whitney basis functions. Examples from applications to the International Linear Collider (ILC), Positron Electron Project-II (PEP-II), Linac Coherent Light Source (LCLS) and other accelerators will be presented to compare the accuracy and computational efficiency of these codes versus their counterparts using structured grids.

  12. Development of highly accurate approximate scheme for computing the charge transfer integral

    SciTech Connect

    Pershin, Anton; Szalay, Péter G.

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the “exact” scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the “exact” calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature.

  13. Virtual Cell: computational tools for modeling in cell biology

    PubMed Central

    Resasco, Diana C.; Gao, Fei; Morgan, Frank; Novak, Igor L.; Schaff, James C.; Slepchenko, Boris M.

    2011-01-01

    The Virtual Cell (VCell) is a general computational framework for modeling physico-chemical and electrophysiological processes in living cells. Developed by the National Resource for Cell Analysis and Modeling at the University of Connecticut Health Center, it provides automated tools for simulating a wide range of cellular phenomena in space and time, both deterministically and stochastically. These computational tools allow one to couple electrophysiology and reaction kinetics with transport mechanisms, such as diffusion and directed transport, and map them onto spatial domains of various shapes, including irregular three-dimensional geometries derived from experimental images. In this article, we review new robust computational tools recently deployed in VCell for treating spatially resolved models. PMID:22139996

  14. Majority vote and other problems when using computational tools.

    PubMed

    Vihinen, Mauno

    2014-08-01

    Computational tools are essential for most of our research. To use these tools, one needs to know how they work. Problems in the application of computational methods to variation analysis can appear at several stages and affect, for example, the interpretation of results. Such cases are discussed along with suggestions for how to avoid them. The problems include incomplete reporting of methods, especially about the use of prediction tools; method selection on unscientific grounds and without consulting independent method performance assessments; extending the application area of methods outside their intended purpose; use of the same data several times for obtaining a majority vote; and filtering of datasets so that variants of interest are excluded. All these issues can be avoided by discontinuing the use of software tools as black boxes.

  15. Accurate molecular structure and spectroscopic properties for nucleobases: A combined computational - microwave investigation of 2-thiouracil as a case study

    PubMed Central

    Puzzarini, Cristina; Biczysko, Malgorzata; Barone, Vincenzo; Peña, Isabel; Cabezas, Carlos; Alonso, José L.

    2015-01-01

    The computational composite scheme purposely set up for accurately describing the electronic structure and spectroscopic properties of small biomolecules has been applied to the first study of the rotational spectrum of 2-thiouracil. The experimental investigation was made possible thanks to the combination of the laser ablation technique with Fourier Transform Microwave spectrometers. The joint experimental – computational study allowed us to determine an accurate molecular structure and spectroscopic properties for the title molecule but, more importantly, it demonstrates a reliable approach for the accurate investigation of isolated small biomolecules. PMID:24002739

  16. A computer tool to support in design of industrial Ethernet.

    PubMed

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply using the Power over Ethernet (PoE) concept, and wireless links) and the occupation rate (the amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool's software presents a detailed view of the network to the user, points out possible problems in the network, and provides an extremely friendly environment.
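
    The occupation-rate check described above amounts to comparing the traffic generated per controller scan with the link capacity available in one scan time. A back-of-the-envelope version is sketched below; all parameter values (frame overhead, payload size, link rate) are made-up examples, not the tool's actual model.

    ```python
    # Back-of-the-envelope version of the occupation-rate check described above:
    # compare the data volume exchanged per controller scan with what the link
    # can carry in one scan time. All numbers are made-up examples, not the
    # tool's actual model.
    def occupation_rate(bytes_per_scan, scan_time_s, link_rate_bps=100e6,
                        overhead_bytes_per_frame=38, payload_per_frame=1500):
        frames = -(-bytes_per_scan // payload_per_frame)         # ceiling division
        bits = (bytes_per_scan + frames * overhead_bytes_per_frame) * 8
        return bits / (link_rate_bps * scan_time_s)              # fraction of capacity used

    print(f"{occupation_rate(bytes_per_scan=20_000, scan_time_s=0.010):.1%}")  # ~16% of a 100 Mbit/s link
    ```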

  17. Review of parallel computing methods and tools for FPGA technology

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radosław; Linczuk, Maciej; Pozniak, Krzysztof; Romaniuk, Ryszard

    2013-10-01

    Parallel computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated using parallel computing techniques. Specialized parallel computer architectures are used for accelerating specific tasks. High-Energy Physics Experiments measuring systems often use FPGAs for fine-grained computation. FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This paper presents existing methods and tools for fine-grained computation implemented in FPGA using Behavioral Description and High Level Programming Languages.

  18. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
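
    SPARSKIT itself is a Fortran library, but the basic chores it provides, building a sparse matrix, converting between storage formats, printing simple statistics, and applying the matrix, can be sketched with scipy.sparse as a rough modern analogue:

    ```python
    # SPARSKIT itself is Fortran; as a rough modern analogue, scipy.sparse covers
    # the same basic chores: building a matrix, converting storage formats, and
    # printing simple statistics before using the matrix.
    import numpy as np
    import scipy.sparse as sp

    A = sp.random(1000, 1000, density=0.002, format="coo", random_state=0)

    csr = A.tocsr()                                   # COO -> CSR format conversion
    print("nonzeros:", csr.nnz)
    print("avg nonzeros per row:", csr.nnz / csr.shape[0])
    print("bandwidth:", int(np.max(np.abs(A.row - A.col))))  # simple structural statistic

    y = csr @ np.ones(csr.shape[1])                   # sparse matrix-vector product
    print("||A*1||_2 =", np.linalg.norm(y))
    ```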

  19. Tool Use and Performance: Relationships between Tool- and Learner-Related Characteristics in a Computer-Based Learning Environment

    ERIC Educational Resources Information Center

    Juarez-Collazo, Norma A.; Elen, Jan; Clarebout, Geraldine

    2013-01-01

    It is still unclear which tool and learner characteristics influence tool use, and consequently performance, in computer-based learning environments (CBLEs), and how they do so. This study examines the relationships between tool-related characteristics (tool presentation: non-/embedded tool and instructional cues: non-/explained tool functionality) and…

  20. Accurate computation of surface stresses and forces with immersed boundary methods

    NASA Astrophysics Data System (ADS)

    Goza, Andres; Liska, Sebastian; Morley, Benjamin; Colonius, Tim

    2016-09-01

    Many immersed boundary methods solve for surface stresses that impose the velocity boundary conditions on an immersed body. These surface stresses may contain spurious oscillations that make them ill-suited for representing the physical surface stresses on the body. Moreover, these inaccurate stresses often lead to unphysical oscillations in the history of integrated surface forces such as the coefficient of lift. While the errors in the surface stresses and forces do not necessarily affect the convergence of the velocity field, it is desirable, especially in fluid-structure interaction problems, to obtain smooth and convergent stress distributions on the surface. To this end, we show that the equation for the surface stresses is an integral equation of the first kind whose ill-posedness is the source of spurious oscillations in the stresses. We also demonstrate that for sufficiently smooth delta functions, the oscillations may be filtered out to obtain physically accurate surface stresses. The filtering is applied as a post-processing procedure, so that the convergence of the velocity field is unaffected. We demonstrate the efficacy of the method by computing stresses and forces that converge to the physical stresses and forces for several test problems.
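
    The filtering idea can be illustrated in one dimension: a surface stress contaminated by a grid-scale oscillation is convolved with a smooth, compact kernel, recovering the underlying smooth distribution. The sketch below uses a simple Gaussian filter as a stand-in for the regularized delta function used in the actual method; all values are synthetic.

    ```python
    # One-dimensional illustration of the post-processing idea: a surface stress
    # contaminated by a grid-scale oscillation is smoothed with a compact kernel,
    # recovering the underlying smooth distribution. A Gaussian filter stands in
    # for the smooth regularized delta function used in the actual method; all
    # values are synthetic.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    n = 200
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)   # surface parameter
    stress = np.sin(s) + 0.3 * np.cos(3 * s)                # smooth "physical" stress
    noisy = stress + 0.4 * (-1.0) ** np.arange(n)           # spurious grid-scale oscillation

    filtered = gaussian_filter1d(noisy, sigma=2.0, mode="wrap")

    print("error before filtering:", np.abs(noisy - stress).max())
    print("error after filtering: ", np.abs(filtered - stress).max())
    ```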

  1. Aeroacoustic Flow Phenomena Accurately Captured by New Computational Fluid Dynamics Method

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.

    2002-01-01

    One of the challenges in the computational fluid dynamics area is the accurate calculation of aeroacoustic phenomena, especially in the presence of shock waves. One such phenomenon is "transonic resonance," where an unsteady shock wave at the throat of a convergent-divergent nozzle results in the emission of acoustic tones. The space-time Conservation-Element and Solution-Element (CE/SE) method developed at the NASA Glenn Research Center can faithfully capture the shock waves, their unsteady motion, and the generated acoustic tones. The CE/SE method is a revolutionary new approach to the numerical modeling of physical phenomena where features with steep gradients (e.g., shock waves, phase transition, etc.) must coexist with those having weaker variations. The CE/SE method does not require the complex interpolation procedures (that allow for the possibility of a shock between grid cells) used by many other methods to transfer information between grid cells. These interpolation procedures can add too much numerical dissipation to the solution process. Thus, while shocks are resolved, weaker waves, such as acoustic waves, are washed out.

  2. Multimedia Instructional Tools and Student Learning in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra Laier

    2013-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology into the classroom. Multimedia instructional tools (MMIT) have been identified as a way to provide student-centered active-learning instructional material to students. MMITs are common in introductory computer applications courses based on…

  3. Integrating Computer-Assisted Translation Tools into Language Learning

    ERIC Educational Resources Information Center

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  4. Computer Mathematical Tools: Practical Experience of Learning to Use Them

    ERIC Educational Resources Information Center

    Semenikhina, Elena; Drushlyak, Marina

    2014-01-01

    The article contains general information about the use of specialized mathematics software in the preparation of math teachers. The authors indicate the reasons to study the mathematics software. In particular, they analyze the possibility of presenting basic mathematical courses using mathematical computer tools from both a teacher and a student,…

  5. Optimizing odor identification testing as quick and accurate diagnostic tool for Parkinson's disease

    PubMed Central

    Mahlknecht, Philipp; Pechlaner, Raimund; Boesveldt, Sanne; Volc, Dieter; Pinter, Bernardette; Reiter, Eva; Müller, Christoph; Krismer, Florian; Berendse, Henk W.; van Hilten, Jacobus J.; Wuschitz, Albert; Schimetta, Wolfgang; Högl, Birgit; Djamshidian, Atbin; Nocker, Michael; Göbel, Georg; Gasperi, Arno; Kiechl, Stefan; Willeit, Johann; Poewe, Werner

    2016-01-01

    Introduction: The aim of this study was to evaluate odor identification testing as a quick, cheap, and reliable tool to identify PD. Methods: Odor identification with the 16‐item Sniffin' Sticks test (SS‐16) was assessed in a total of 646 PD patients and 606 controls from three European centers (A, B, and C), as well as 75 patients with atypical parkinsonism or essential tremor and in a prospective cohort of 24 patients with idiopathic rapid eye movement sleep behavior disorder (center A). Reduced odor sets most discriminative for PD were determined in a discovery cohort derived from a random split of PD patients and controls from center A using L1‐regularized logistic regression. Diagnostic accuracy was assessed in the rest of the patients/controls as validation cohorts. Results: Olfactory performance was lower in PD patients compared with controls and non‐PD patients in all cohorts (each P < 0.001). Both the full SS‐16 and a subscore of the top eight discriminating odors (SS‐8) were associated with an excellent discrimination of PD from controls (areas under the curve ≥0.90; sensitivities ≥83.3%; specificities ≥82.0%) and from non‐PD patients (areas under the curve ≥0.91; sensitivities ≥84.1%; specificities ≥84.0%) in all cohorts. This remained unchanged when patients with >3 years of disease duration were excluded from analysis. All 8 incident PD cases among patients with idiopathic rapid eye movement sleep behavior disorder were predicted with the SS‐16 and the SS‐8 (sensitivity, 100%; positive predictive value, 61.5%). Conclusions: Odor identification testing provides excellent diagnostic accuracy in the distinction of PD patients from controls and diagnostic mimics. A reduced set of eight odors could be used as a quick tool in the workup of patients presenting with parkinsonism and for PD risk indication. © 2016 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and

  6. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0

  7. Facilitating the selection and creation of accurate interatomic potentials with robust tools and characterization

    NASA Astrophysics Data System (ADS)

    Trautt, Zachary T.; Tavazza, Francesca; Becker, Chandler A.

    2015-10-01

    The Materials Genome Initiative seeks to significantly decrease the cost and time of development and integration of new materials. Within the domain of atomistic simulations, several roadblocks stand in the way of reaching this goal. While the NIST Interatomic Potentials Repository hosts numerous interatomic potentials (force fields), researchers cannot immediately determine the best choice(s) for their use case. Researchers developing new potentials, specifically those in restricted environments, lack a comprehensive portfolio of efficient tools capable of calculating and archiving the properties of their potentials. This paper elucidates one solution to these problems, which uses Python-based scripts that are suitable for rapid property evaluation and human knowledge transfer. Calculation results are visible on the repository website, which reduces the time required to select an interatomic potential for a specific use case. Furthermore, property evaluation scripts are being integrated with modern platforms to improve discoverability and access of materials property data. To demonstrate these scripts and features, we will discuss the automation of stacking fault energy calculations and their application to additional elements. While the calculation methodology was developed previously, we are using it here as a case study in simulation automation and property calculations. We demonstrate how the use of Python scripts allows for rapid calculation in a more easily managed way where the calculations can be modified, and the results presented in user-friendly and concise ways. Additionally, the methods can be incorporated into other efforts, such as openKIM.
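
    The style of Python-scripted property evaluation described above can be sketched with ASE, using its toy EMT calculator as a stand-in for a real interatomic potential; the repository's actual scripts (e.g., for stacking fault energies) are more involved. The example scans the fcc lattice constant of Cu and reports the energy minimum.

    ```python
    # Sketch of Python-scripted property evaluation in the spirit described
    # above, with ASE's toy EMT calculator standing in for a real interatomic
    # potential (the repository's actual scripts, e.g. for stacking fault
    # energies, are more involved): scan the fcc lattice constant of Cu and
    # report the energy minimum.
    import numpy as np
    from ase.build import bulk
    from ase.calculators.emt import EMT

    def energy_per_atom(a):
        atoms = bulk("Cu", "fcc", a=a)
        atoms.calc = EMT()
        return atoms.get_potential_energy() / len(atoms)

    a_grid = np.linspace(3.4, 3.8, 21)
    energies = [energy_per_atom(a) for a in a_grid]
    print("estimated equilibrium lattice constant:", a_grid[int(np.argmin(energies))])
    ```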

  8. Analysis and accurate reconstruction of incomplete data in X-ray differential phase-contrast computed tomography.

    PubMed

    Fu, Jian; Tan, Renbo; Chen, Liyuan

    2014-01-01

    X-ray differential phase-contrast computed tomography (DPC-CT) is a powerful physical and biochemical analysis tool. In practical applications, DPC-CT often faces challenges from insufficient data caused by few-view sampling, bad or missing detector channels, or a limited scanning angular range. These occur quite frequently because of experimental constraints from imaging hardware, scanning geometry, and the exposure dose delivered to living specimens. In this work, we analyze the influence of incomplete data on DPC-CT image reconstruction. Then, a reconstruction method is developed and investigated for incomplete-data DPC-CT. It is based on an algebraic iterative reconstruction technique, which minimizes the image total variation and permits accurate tomographic imaging with less data. This work comprises a numerical study of the method and its experimental verification using a dataset measured at the W2 beamline of the storage ring DORIS III equipped with a Talbot-Lau interferometer. The numerical and experimental results demonstrate that the presented method can handle incomplete data. It will be of interest for a wide range of DPC-CT applications in medicine, biology, and nondestructive testing.

  9. Raman Spectroscopy Provides a Powerful Diagnostic Tool for Accurate Determination of Albumin Glycation

    PubMed Central

    Dingari, Narahara Chari; Horowitz, Gary L.; Kang, Jeon Woong; Dasari, Ramachandra R.; Barman, Ishan

    2012-01-01

    We present the first demonstration of glycated albumin detection and quantification using Raman spectroscopy without the addition of reagents. Glycated albumin is an important marker for monitoring the long-term glycemic history of diabetics, especially as its concentrations, in contrast to glycated hemoglobin levels, are unaffected by changes in erythrocyte life times. Clinically, glycated albumin concentrations show a strong correlation with the development of serious diabetes complications including nephropathy and retinopathy. In this article, we propose and evaluate the efficacy of Raman spectroscopy for determination of this important analyte. By utilizing the pre-concentration obtained through drop-coating deposition, we show that glycation of albumin leads to subtle, but consistent, changes in vibrational features, which with the help of multivariate classification techniques can be used to discriminate glycated albumin from the unglycated variant with 100% accuracy. Moreover, we demonstrate that the calibration model developed on the glycated albumin spectral dataset shows high predictive power, even at substantially lower concentrations than those typically encountered in clinical practice. In fact, the limit of detection for glycated albumin measurements is calculated to be approximately four times lower than its minimum physiological concentration. Importantly, in relation to the existing detection methods for glycated albumin, the proposed method is also completely reagent-free, requires barely any sample preparation and has the potential for simultaneous determination of glycated hemoglobin levels as well. Given these key advantages, we believe that the proposed approach can provide a uniquely powerful tool for quantification of glycation status of proteins in biopharmaceutical development as well as for glycemic marker determination in routine clinical diagnostics in the future. PMID:22393405
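
    The multivariate classification step can be sketched on synthetic spectra; this is not the authors' data or model, and the 5 cm^-1 band shift and noise level are invented for illustration. A cross-validated linear SVM separates the two spectral classes.

    ```python
    # Sketch of the multivariate classification step on synthetic spectra (this
    # is not the authors' data or model; the 5 cm^-1 band shift and noise level
    # are invented): a cross-validated linear SVM separating two spectral classes.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    wn = np.linspace(800, 1800, 500)                     # wavenumber axis (cm^-1)

    def spectrum(center):
        peak = np.exp(-0.5 * ((wn - center) / 15.0) ** 2)
        return peak + 0.05 * rng.normal(size=wn.size)    # one band plus noise

    X = np.array([spectrum(1450) for _ in range(60)] + [spectrum(1455) for _ in range(60)])
    y = np.array([0] * 60 + [1] * 60)                    # 0 = "unglycated", 1 = "glycated" (labels illustrative)

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```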

  10. A general and accurate approach for computing the statistical power of the transmission disequilibrium test for complex disease genes.

    PubMed

    Chen, W M; Deng, H W

    2001-07-01

    Transmission disequilibrium test (TDT) is a nuclear family-based analysis that can test linkage in the presence of association. It has gained extensive attention in theoretical investigation and in practical application; in both cases, the accuracy and generality of the power computation of the TDT are crucial. Despite extensive investigations, previous approaches for computing the statistical power of the TDT are neither accurate nor general. In this paper, we develop a general and highly accurate approach to analytically compute the power of the TDT. We compare the results from our approach with those from several other recent papers, all against the results obtained from computer simulations. We show that the results computed from our approach are more accurate than or at least the same as those from other approaches. More importantly, our approach can handle various situations, which include (1) families that consist of one or more children and that have any configuration of affected and nonaffected sibs; (2) families ascertained through the affection status of parent(s); (3) any mixed sample with different types of families in (1) and (2); (4) the marker locus is not a disease susceptibility locus; and (5) existence of allelic heterogeneity. We implement this approach in a user-friendly computer program: TDT Power Calculator. Its applications are demonstrated. The approach and the program developed here should be significant for theoreticians to accurately investigate the statistical power of the TDT in various situations, and for empirical geneticists to plan efficient studies using the TDT.
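
    For orientation, the textbook single-child approximation of TDT power (far less general than the approach described above) treats the statistic as noncentral chi-square with one degree of freedom and noncentrality n(2*tau - 1)^2 for n heterozygous parents transmitting the risk allele with probability tau:

    ```python
    # Textbook single-child approximation of TDT power (far less general than
    # the approach described above): the statistic is treated as noncentral
    # chi-square with one degree of freedom and noncentrality n*(2*tau - 1)**2
    # for n heterozygous parents transmitting the risk allele with probability tau.
    from scipy.stats import chi2, ncx2

    def tdt_power(n_het_parents, tau, alpha=0.05):
        crit = chi2.ppf(1.0 - alpha, df=1)               # significance threshold
        ncp = n_het_parents * (2.0 * tau - 1.0) ** 2     # noncentrality parameter
        return ncx2.sf(crit, df=1, nc=ncp)               # power = P(statistic > threshold)

    print(f"power: {tdt_power(n_het_parents=200, tau=0.6):.3f}")
    ```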

  11. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  12. The role of customized computational tools in product development.

    SciTech Connect

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  13. A tangible programming tool for children to cultivate computational thinking.

    PubMed

    Wang, Danli; Wang, Tingting; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5-9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  14. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  15. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    PubMed Central

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  16. A tangible programming tool for children to cultivate computational thinking.

    PubMed

    Wang, Danli; Wang, Tingting; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5-9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    PubMed

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  18. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  19. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment) powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  20. Procedure for computer-controlled milling of accurate surfaces of revolution for millimeter and far-infrared mirrors

    NASA Technical Reports Server (NTRS)

    Emmons, Louisa; De Zafra, Robert

    1991-01-01

    A simple method for milling accurate off-axis parabolic mirrors with a computer-controlled milling machine is discussed. For machines with a built-in circle-cutting routine, an exact paraboloid can be milled with few computer commands and without the use of the spherical or linear approximations. The proposed method can be adapted easily to cut off-axis sections of elliptical or spherical mirrors.

  1. Final Report for Foundational Tools for Petascale Computing

    SciTech Connect

    Hollingsworth, Jeff

    2015-02-12

    This project concentrated on various aspects of creating tool infrastructure to make it easier to program large-scale parallel computers. This project was collaborative with the University of Wisconsin and closely related to the project DE-SC0002606 (“Tools for the Development of High Performance Energy Applications and Systems”) . The research conducted during this project is summarized in this report. The complete details of the work are available in the ten publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (at www.dyninst.org). It also supported the Ph.D. studies of three students and one research staff member.

  2. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
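    As a hedged illustration of the REST-style access pattern described in this record, the snippet below shows the generic shape of such a query. The endpoint URL and parameter names are hypothetical placeholders, not the actual Ice2Ocean API.

    ```python
    # Hypothetical example of querying a REST data service of the kind described
    # above; the base URL and parameters are placeholders, not a real endpoint.
    import requests

    BASE_URL = "https://example.org/ice2ocean/api"   # hypothetical endpoint

    resp = requests.get(f"{BASE_URL}/runoff",
                        params={"basin": "gulf_of_alaska", "year": 2014},
                        timeout=30)
    resp.raise_for_status()
    runoff = resp.json()          # e.g. a list of {date, discharge} records
    print(len(runoff), "records returned")
    ```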

  3. Accelerating Battery Design Using Computer-Aided Engineering Tools: Preprint

    SciTech Connect

    Pesaran, A.; Heon, G. H.; Smith, K.

    2011-01-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics tools to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  4. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before. PMID:24688685
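    As a hedged illustration of the enrichment-analysis step described in this record, the sketch below tests a single pathway for over-representation of altered metabolites with a hypergeometric test. The counts are invented for illustration, and the specific tools reviewed in the paper may use different statistics or corrections.

    ```python
    # Minimal pathway over-representation test (hypergeometric), a common form of
    # the "enrichment analysis" described above.  All counts are illustrative.
    from scipy.stats import hypergeom

    N_background = 500   # metabolites in the reference (background) set
    n_altered    = 40    # metabolites flagged as significantly altered
    K_pathway    = 25    # metabolites annotated to the pathway of interest
    k_hits       = 8     # altered metabolites falling in that pathway

    # P(X >= k_hits) when drawing n_altered metabolites without replacement
    p_value = hypergeom.sf(k_hits - 1, N_background, K_pathway, n_altered)
    print(f"pathway enrichment p-value ~ {p_value:.3g}")
    ```

    In practice such a test is repeated for every pathway in the database, followed by multiple-testing correction.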

  5. Reliability automation tool (RAT) for fault tolerance computation

    NASA Astrophysics Data System (ADS)

    Singh, N. S. S.; Hamid, N. H.; Asirvadam, V. S.

    2012-09-01

    As CMOS transistors are reduced in size, circuits built from these nano-scale transistors naturally become less reliable. This reduction in reliability, which is a measure of circuit performance, has brought many challenges to the design of modern logic integrated circuits. Reliability modeling is therefore an increasingly important subject to be considered in designing modern logic integrated circuits, and this drives a need to compute reliability measures for nano-scale circuits. This paper looks into the development of a reliability automation tool (RAT) for circuit reliability computation. The tool is developed in the Matlab programming language based on the reliability evaluation model called the Probabilistic Transfer Matrix (PTM). RAT allows users to significantly speed up the reliability assessment of nano-scale circuits. Users provide a circuit's netlist as the input to RAT for its reliability computation. The netlist specifies the circuit's description in terms of a Gate Profile Matrix (GPM), an Adjacency Computation Matrix (ACM) and a Grid Layout Matrix (GLM). GPM, ACM and GLM indicate the types of logic gates, the interconnections between these logic gates, and the layout matrix of these logic gates, respectively, in a given circuit design. Here, the reliability assessment by RAT is carried out on a Full Adder circuit as the benchmark test circuit.
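    To make the PTM idea concrete, here is a minimal sketch, not the RAT implementation itself (which is written in Matlab and driven by GPM/ACM/GLM netlists): two gate PTMs are composed in series and circuit reliability is evaluated under a uniform input distribution. The gate set and error probability are assumptions for illustration only.

    ```python
    # Sketch of Probabilistic Transfer Matrix (PTM) based reliability estimation.
    # Gate set, error probability, and the two-gate circuit are illustrative.
    import numpy as np

    def gate_ptm(itm, eps):
        """Turn an ideal transfer matrix (rows: input patterns, cols: output
        patterns) into a PTM in which the gate fails with probability eps."""
        itm = np.asarray(itm, dtype=float)
        return itm * (1.0 - eps) + (1.0 - itm) * eps / (itm.shape[1] - 1)

    EPS = 0.05
    ITM_NAND = [[0, 1], [0, 1], [0, 1], [1, 0]]   # inputs 00,01,10,11 -> out 0/1
    ITM_NOT  = [[0, 1], [1, 0]]                    # input 0,1         -> out 0/1

    # Serial composition of PTMs is ordinary matrix multiplication.
    ptm_circuit = gate_ptm(ITM_NAND, EPS) @ gate_ptm(ITM_NOT, EPS)  # NAND then NOT = AND
    itm_circuit = np.array(ITM_NAND) @ np.array(ITM_NOT)

    # Reliability: probability (over uniformly distributed inputs) that the noisy
    # circuit produces the ideal output.
    input_dist = np.full(ptm_circuit.shape[0], 1.0 / ptm_circuit.shape[0])
    reliability = float(input_dist @ (ptm_circuit * itm_circuit).sum(axis=1))
    print(f"circuit reliability ~ {reliability:.4f}")
    ```

    Parallel gate stages would combine via Kronecker products, and wiring/fan-out would need their own transfer matrices, which is where a netlist-driven tool such as RAT becomes valuable.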

  6. Development and Application of a Predictive Computational Tool for Short-Pulse, High-Intensity Target Interactions

    SciTech Connect

    Town, R J; Chung, H; Langdon, A B; Lasinski, B F; Lund, S M; McCandless, B C; Still, C H; Tabak, M

    2007-01-26

    The widely differing spatial, temporal, and density scales needed to accurately model the fast ignition process and other short-pulse laser-plasma interactions leads to a computationally challenging project that is difficult to solve using a single code. This report summarizes the work performed on a three year LDRD to couple together three independent codes using PYTHON to build a new integrated computational tool. An example calculation using this new model is described.

  7. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  8. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    PubMed Central

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
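    As a sketch of the simplest, single-objective end of this design space, the snippet below back-translates a short protein by greedily choosing the most frequent codon for each residue. The tiny usage table is invented for illustration; real tools combine this objective with many others, as the review describes.

    ```python
    # Naive single-objective codon "optimization": pick the most frequent codon
    # per amino acid.  The usage table below is a made-up illustration, not a
    # real organism's codon-usage data.
    CODON_USAGE = {
        "M": {"ATG": 1.00},
        "K": {"AAA": 0.74, "AAG": 0.26},
        "L": {"CTG": 0.47, "TTA": 0.14, "CTT": 0.12},
        "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
    }

    def naive_codon_optimize(protein):
        """Greedy back-translation: highest-frequency codon for each residue."""
        return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get) for aa in protein)

    print(naive_codon_optimize("MKL*"))   # -> ATGAAACTGTAA
    ```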

  9. Tzanck smear as an accurate and rapid diagnostic tool for cutaneous alternariosis in a renal transplant recipient.

    PubMed

    Karataş Toğral, A; Güleç, A T

    2016-10-01

    Alternaria species are becoming increasingly important opportunistic pathogens in recipients of solid organ transplant, as it has been shown that dissemination with systemic involvement is not as rare as previously reported. Therefore, rapid and accurate diagnosis is necessary for appropriate patient management. We report a patient with renal transplant who developed recurrent cutaneous alternariosis. Tzanck smear successfully and very rapidly revealed hyphae and spores in both the primary and subsequent lesions. Furthermore, Tzanck smear provided guidance for histopathological examination of the second lesion, which failed to disclose the fungal elements until additional deeper serial sections were performed. The present case emphasizes that the Tzanck smear is a useful clinical tool leading to the immediate correct diagnosis even in deep fungal infections. PMID:27663148

  10. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  11. Computed-tomography-based finite-element models of long bones can accurately capture strain response to bending and torsion.

    PubMed

    Varghese, Bino; Short, David; Penmetsa, Ravi; Goswami, Tarun; Hangartner, Thomas

    2011-04-29

    Finite element (FE) models of long bones constructed from computed-tomography (CT) data are emerging as an invaluable tool in the field of bone biomechanics. However, the performance of such FE models is highly dependent on the accurate capture of geometry and appropriate assignment of material properties. In this study, a combined numerical-experimental study is performed comparing FE-predicted surface strains with strain-gauge measurements. Thirty-six major, cadaveric, long bones (humerus, radius, femur and tibia), which cover a wide range of bone sizes, were tested under three-point bending and torsion. The FE models were constructed from trans-axial volumetric CT scans, and the segmented bone images were corrected for partial-volume effects. The material properties (Young's modulus for cortex, density-modulus relationship for trabecular bone and Poisson's ratio) were calibrated by minimizing the error between experiments and simulations among all bones. The R² values of the measured strains versus load under three-point bending and torsion were 0.96-0.99 and 0.61-0.99, respectively, for all bones in our dataset. The errors of the calculated FE strains in comparison to those measured using strain gauges in the mechanical tests ranged from -6% to 7% under bending and from -37% to 19% under torsion. The observation of comparatively low errors and high correlations between the FE-predicted strains and the experimental strains, across the various types of bones and loading conditions (bending and torsion), validates our approach to bone segmentation and our choice of material properties.
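    The calibration step described here is, in essence, a least-squares fit of material parameters against the strain-gauge data. The sketch below shows that pattern with scipy.optimize; fe_predicted_strain is a hypothetical surrogate standing in for an actual FE solve, and all numbers are illustrative.

    ```python
    # Schematic parameter calibration: minimize the mismatch between (surrogate)
    # FE-predicted strains and measured strains across all specimens.
    import numpy as np
    from scipy.optimize import minimize

    def fe_predicted_strain(params, specimen):
        """Hypothetical stand-in for an FE run; a real calibration would call
        the FE solver here with the candidate material parameters."""
        E_cortex, nu = params
        return specimen["load"] * specimen["geometry_factor"] / E_cortex * (1 + 0.1 * nu)

    specimens = [{"load": 500.0, "geometry_factor": 2.1e-3, "measured": 8.0e-5},
                 {"load": 300.0, "geometry_factor": 3.4e-3, "measured": 7.5e-5}]

    def total_error(params):
        return sum((fe_predicted_strain(params, s) - s["measured"]) ** 2 for s in specimens)

    result = minimize(total_error, x0=[15e3, 0.3], method="Nelder-Mead")
    print("calibrated [E_cortex, nu]:", result.x)
    ```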

  12. An accurate and efficient computation method of the hydration free energy of a large, complex molecule.

    PubMed

    Yoshidome, Takashi; Ekimoto, Toru; Matubayasi, Nobuyuki; Harano, Yuichi; Kinoshita, Masahiro; Ikeguchi, Mitsunori

    2015-05-01

    The hydration free energy (HFE) is a crucially important physical quantity for discussing various chemical processes in aqueous solutions. Although an explicit-solvent computation with molecular dynamics (MD) simulations is a preferable treatment of the HFE, a huge computational load has been inevitable for large, complex solutes like proteins. In the present paper, we propose an efficient computation method for the HFE. In our method, the HFE is computed as a sum of ⟨U_UV⟩/2 (⟨U_UV⟩ is the ensemble average of the sum of pair interaction energies between the solute and water molecules) and the water reorganization term mainly reflecting the excluded volume effect. Since ⟨U_UV⟩ can readily be computed through an MD simulation of the system composed of the solute and water, an efficient computation of the latter term leads to a reduction of the computational load. We demonstrate that the water reorganization term can quantitatively be calculated using the morphometric approach (MA), which expresses the term as a linear combination of the four geometric measures of a solute and the corresponding coefficients determined with the energy representation (ER) method. Since the MA enables us to finish the computation of the solvent reorganization term in less than 0.1 s once the coefficients are determined, the use of the MA enables us to provide an efficient computation of the HFE even for large, complex solutes. Through the applications, we find that our method has almost the same quantitative performance as the ER method with a substantial reduction of the computational load. PMID:25956125
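    Schematically, and using generic morphometric-approach notation rather than the paper's exact symbols, the decomposition reads:

    ```latex
    % Sketch of the decomposition described above (generic MA notation).
    \mu_{\mathrm{hyd}} \;\approx\; \tfrac{1}{2}\,\langle U_{UV} \rangle \;+\; \mu_{\mathrm{reorg}},
    \qquad
    \mu_{\mathrm{reorg}} \;\approx\; c_1 V_{\mathrm{ex}} + c_2 A + c_3 C + c_4 X ,
    ```

    where V_ex, A, C and X are the solute's excluded volume, surface area, and integrated mean and Gaussian curvatures, and the coefficients c_i are determined with the energy-representation method.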

  13. Limited rotational and rovibrational line lists computed with highly accurate quartic force fields and ab initio dipole surfaces.

    PubMed

    Fortenberry, Ryan C; Huang, Xinchuan; Schwenke, David W; Lee, Timothy J

    2014-02-01

    In this work, computational procedures are employed to compute the rotational and rovibrational spectra and line lists for H2O, CO2, and SO2. Building on the established use of quartic force fields, MP2 and CCSD(T) Dipole Moment Surfaces (DMSs) are computed for each system of study in order to produce line intensities as well as the transition energies. The computed results exhibit a clear correlation to reference data available in the HITRAN database. Additionally, even though CCSD(T) DMSs produce more accurate intensities as compared to experiment, the use of MP2 DMSs results in reliable line lists that are still comparable to experiment. The use of the less computationally costly MP2 method is beneficial in the study of larger systems where use of CCSD(T) would be more costly. PMID:23692860

  14. Symmetry-Based Computational Tools for Magnetic Crystallography

    NASA Astrophysics Data System (ADS)

    Perez-Mato, J. M.; Gallego, S. V.; Tasci, E. S.; Elcoro, L.; de la Flor, G.; Aroyo, M. I.

    2015-07-01

    In recent years, two important advances have opened new doors for the characterization and determination of magnetic structures. Firstly, researchers have produced computer-readable listings of the magnetic or Shubnikov space groups. Secondly, they have extended and applied the superspace formalism, which is presently the standard approach for the description of nonmagnetic incommensurate structures and their symmetry, to magnetic structures. These breakthroughs have been the basis for the subsequent development of a series of computer tools that allow a more efficient and comprehensive application of magnetic symmetry, both commensurate and incommensurate. Here we briefly review the capabilities of these computation instruments and present the fundamental concepts on which they are based, providing various examples. We show how these tools facilitate the use of symmetry arguments expressed as either a magnetic space group or a magnetic superspace group and allow the exploration of the possible magnetic orderings associated with one or more propagation vectors in a form that complements and goes beyond the traditional representation method. Special focus is placed on the programs available online at the Bilbao Crystallographic Server ( http://www.cryst.ehu.es ).

  15. Applying computer simulation models as learning tools in fishery management

    USGS Publications Warehouse

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  16. Brain-computer interfaces: a powerful tool for scientific inquiry

    PubMed Central

    Wander, Jeremiah D; Rao, Rajesh P N

    2014-01-01

    Brain-computer interfaces (BCIs) are devices that record from the nervous system, provide input directly to the nervous system, or do both. Sensory BCIs such as cochlear implants have already had notable clinical success and motor BCIs have shown great promise for helping patients with severe motor deficits. Clinical and engineering outcomes aside, BCIs can also be tremendously powerful tools for scientific inquiry into the workings of the nervous system. They allow researchers to inject and record information at various stages of the system, permitting investigation of the brain in vivo and facilitating the reverse engineering of brain function. Most notably, BCIs are emerging as a novel experimental tool for investigating the tremendous adaptive capacity of the nervous system. PMID:24709603

  17. Brain-computer interfaces: a powerful tool for scientific inquiry.

    PubMed

    Wander, Jeremiah D; Rao, Rajesh P N

    2014-04-01

    Brain-computer interfaces (BCIs) are devices that record from the nervous system, provide input directly to the nervous system, or do both. Sensory BCIs such as cochlear implants have already had notable clinical success and motor BCIs have shown great promise for helping patients with severe motor deficits. Clinical and engineering outcomes aside, BCIs can also be tremendously powerful tools for scientific inquiry into the workings of the nervous system. They allow researchers to inject and record information at various stages of the system, permitting investigation of the brain in vivo and facilitating the reverse engineering of brain function. Most notably, BCIs are emerging as a novel experimental tool for investigating the tremendous adaptive capacity of the nervous system.

  18. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts has been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tool CAPTools. Steps in parallelizing this code and the requirements for achieving better performance are discussed. The generated parallel version achieved reasonably good performance, for example a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing the necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these steps is still necessary. Nevertheless, the development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve processing efficiency.

  19. Computer Aided Safety Assessment (CASA) Tool for ISS Payloads

    NASA Astrophysics Data System (ADS)

    Hochstein, Jason; Festa, Fabrizio

    2010-09-01

    In an effort to streamline the processes established by the partners of the International Space Station (ISS) to certify the safety of hardware and experiments destined for the Station, the European Space Agency’s (ESA) ISS System Safety Team is developing the Computer Aided Safety Assessment (CASA) tool suite. These software tools guide payload developers through the creation of two types of standard payload hazard reports via a series of questions following a predetermined logic. The responses provided by the user are used by the CASA system to complete the majority of each hazard report required for payload flight safety reviews, employing consistent, approved descriptions of most hazards, hazard causes, controls and verification methods. Though some manual inputs will still be required to complete these reports, working with CASA will considerably reduce the time agency safety authorities need to review the documentation.

  20. A computer aided engineering tool for ECLS systems

    NASA Technical Reports Server (NTRS)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analysis interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  1. Computational tools for epitope vaccine design and evaluation.

    PubMed

    He, Linling; Zhu, Jiang

    2015-04-01

    Rational approaches will be required to develop universal vaccines for viral pathogens such as human immunodeficiency virus, hepatitis C virus, and influenza, for which empirical approaches have failed. The main objective of a rational vaccine strategy is to design novel immunogens that are capable of inducing long-term protective immunity. In practice, this requires structure-based engineering of the target neutralizing epitopes and a quantitative readout of vaccine-induced immune responses. Therefore, computational tools that can facilitate these two areas have played increasingly important roles in rational vaccine design in recent years. Here we review the computational techniques developed for protein structure prediction and antibody repertoire analysis, and demonstrate how they can be applied to the design and evaluation of epitope vaccines.

  2. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  3. A fourth order accurate finite difference scheme for the computation of elastic waves

    NASA Technical Reports Server (NTRS)

    Bayliss, A.; Jordan, K. E.; Lemesurier, B. J.; Turkel, E.

    1986-01-01

    A finite difference scheme for elastic waves is introduced. The model is based on the first order system of equations for the velocities and stresses. The differencing is fourth order accurate on the spatial derivatives and second order accurate in time. The model is tested on a series of examples including the Lamb problem, scattering from plane interfaces, and scattering from a fluid-elastic interface. The scheme is shown to be effective for these problems. The accuracy and stability are insensitive to the Poisson ratio. For the class of problems considered here it is found that the fourth order scheme requires from two-thirds to one-half the resolution of a typical second order scheme to give comparable accuracy.
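    For concreteness, a generic fourth-order central-difference approximation of a spatial derivative (an illustrative stencil, not necessarily the paper's exact discretization) is:

    ```latex
    % Standard fourth-order central-difference stencil on a uniform grid.
    \left.\frac{\partial u}{\partial x}\right|_{x_i} \approx
    \frac{-u_{i+2} + 8\,u_{i+1} - 8\,u_{i-1} + u_{i-2}}{12\,\Delta x}
    \;+\; \mathcal{O}\!\left(\Delta x^{4}\right),
    ```

    paired with a second-order (e.g. leap-frog style) update in time.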

  4. Tools for 3D scientific visualization in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as descriptions of other hardware for digital video and film recording.

  5. Performance Evaluation Tools for Next Generation Scalable Computing Platforms

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar; Craw, James (Technical Monitor)

    1995-01-01

    The Federal High Performance Computing and Communications (HPCC) Program continues to focus on R&D in a wide range of high performance computing and communications technologies. Using its accomplishments of the past four years as building blocks towards a Global Information Infrastructure (GII), an Implementation Plan that identifies six Strategic Focus Areas for R&D has been proposed. This white paper argues that a new generation of system software and programming tools must be developed to support these focus areas, so that the R&D we invest in today can lead to a technology pay-off a decade from now. The Global Computing Infrastructure (GCI) in the year 2000 and beyond would consist of thousands of powerful computing nodes connected via high-speed networks across the globe. Users will be able to obtain computing and information services from the GCI as easily as plugging a toaster into an electrical outlet anywhere in the country. Developing and managing the GCI requires performance prediction and monitoring capabilities that do not exist. Various accomplishments in this field today must be integrated and expanded to support this vision.

  6. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  7. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  8. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids, and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an HTML file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  9. An Accurate Method for Computing the Absorption of Solar Radiation by Water Vapor

    NASA Technical Reports Server (NTRS)

    Chou, M. D.

    1980-01-01

    The method is based upon molecular line parameters and makes use of a far wing scaling approximation and k distribution approach previously applied to the computation of the infrared cooling rate due to water vapor. Taking into account the wave number dependence of the incident solar flux, the solar heating rate is computed for the entire water vapor spectrum and for individual absorption bands. The accuracy of the method is tested against line by line calculations. The method introduces a maximum error of 0.06 C/day. The method has the additional advantage over previous methods in that it can be applied to any portion of the spectral region containing the water vapor bands. The integrated absorptances and line intensities computed from the molecular line parameters were compared with laboratory measurements. The comparison reveals that, among the three different sources, absorptance is the largest for the laboratory measurements.
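    The k-distribution idea referenced here can be summarized, in generic form, as replacing the line-by-line spectral integration over a band by an integral over the cumulative distribution of absorption coefficients:

    ```latex
    % Generic k-distribution form of the band-averaged transmission over
    % (scaled) absorber amount u; a sketch, not the paper's exact parameterization.
    \overline{T}(u) \;=\; \frac{1}{\Delta\nu}\int_{\Delta\nu} e^{-k_\nu u}\,d\nu
    \;=\; \int_{0}^{1} e^{-k(g)\,u}\,dg
    \;\approx\; \sum_{i} w_i\, e^{-k_i u},
    \qquad \sum_i w_i = 1 ,
    ```

    with u the scaled water-vapor amount; the heating rate then follows from the divergence of the band-integrated fluxes.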

  10. A Scalable and Accurate Targeted Gene Assembly Tool (SAT-Assembler) for Next-Generation Sequencing Data

    PubMed Central

    Zhang, Yuan; Sun, Yanni; Cole, James R.

    2014-01-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209

  11. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209
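    As a toy illustration of the overlap-graph construction mentioned in both records above, the sketch below finds exact suffix-prefix overlaps between reads by brute force; SAT-Assembler itself restricts this step to reads recruited by family-specific homology search and uses far more efficient, approximate matching.

    ```python
    # Toy suffix-prefix overlap detection for a handful of reads (illustrative only).
    def best_overlap(a, b, min_len=3):
        """Length of the longest suffix of `a` that equals a prefix of `b`."""
        for k in range(min(len(a), len(b)), min_len - 1, -1):
            if a[-k:] == b[:k]:
                return k
        return 0

    reads = ["ATGGCGTAC", "CGTACCTTA", "CCTTAGGAC"]
    edges = [(i, j, best_overlap(reads[i], reads[j]))
             for i in range(len(reads)) for j in range(len(reads)) if i != j]
    edges = [e for e in edges if e[2] > 0]
    print(edges)   # directed overlap edges: (from_read, to_read, overlap_length)
    ```

    Traversing the resulting directed graph along high-confidence edges is what yields the assembled contigs.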

  12. Time-Accurate Computations of Isolated Circular Synthetic Jets in Crossflow

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Schaeffler, N. W.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2007-01-01

    Results from unsteady Reynolds-averaged Navier-Stokes computations are described for two different synthetic jet flows issuing into a turbulent boundary layer crossflow through a circular orifice. In one case the jet effect is mostly contained within the boundary layer, while in the other case the jet effect extends beyond the boundary layer edge. Both cases have momentum flux ratios less than 2. Several numerical parameters are investigated, and some lessons learned regarding the CFD methods for computing these types of flow fields are summarized. Results in both cases are compared to experiment.

  13. Time-Accurate Computations of Isolated Circular Synthetic Jets in Crossflow

    NASA Technical Reports Server (NTRS)

    Rumsey, Christoper L.; Schaeffler, Norman W.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2005-01-01

    Results from unsteady Reynolds-averaged Navier-Stokes computations are described for two different synthetic jet flows issuing into a turbulent boundary layer crossflow through a circular orifice. In one case the jet effect is mostly contained within the boundary layer, while in the other case the jet effect extends beyond the boundary layer edge. Both cases have momentum flux ratios less than 2. Several numerical parameters are investigated, and some lessons learned regarding the CFD methods for computing these types of flow fields are outlined. Results in both cases are compared to experiment.

  14. Computer subroutine ISUDS accurately solves large system of simultaneous linear algebraic equations

    NASA Technical Reports Server (NTRS)

    Collier, G.

    1967-01-01

    Computer program, an Iterative Scheme Using a Direct Solution, obtains double precision accuracy using a single-precision coefficient matrix. ISUDS solves a system of equations written in matrix form as AX equals B, where A is a square non-singular coefficient matrix, X is a vector, and B is a vector.
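    The underlying idea, mixed-precision iterative refinement, can be sketched as follows (an illustrative reconstruction in Python/NumPy, not the original subroutine): factor A once in single precision, then repeatedly solve for a correction computed from the double-precision residual.

    ```python
    # Sketch of mixed-precision iterative refinement for AX = B.
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    def iterative_refinement(A, b, iters=5):
        lu, piv = lu_factor(A.astype(np.float32))        # single-precision factorization
        x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
        for _ in range(iters):
            r = b - A @ x                                 # residual in double precision
            dx = lu_solve((lu, piv), r.astype(np.float32)).astype(np.float64)
            x += dx                                       # correct the solution
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 200))
    b = rng.standard_normal(200)
    x = iterative_refinement(A, b)
    print("residual norm:", np.linalg.norm(b - A @ x))
    ```

    For a reasonably well-conditioned A, the corrections shrink rapidly and the solution reaches double-precision accuracy even though the factorization is stored in single precision.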

  15. Time-Accurate Computation of Viscous Flow Around Deforming Bodies Using Overset Grids

    SciTech Connect

    Fast, P; Henshaw, W D

    2001-04-02

    Dynamically evolving boundaries and deforming bodies interacting with a flow are commonly encountered in fluid dynamics. However, the numerical simulation of flows with dynamic boundaries is difficult with current methods. We propose a new method for studying such problems. The key idea is to use the overset grid method with a thin, body-fitted grid near the deforming boundary, while using fixed Cartesian grids to cover most of the computational domain. Our approach combines the strengths of earlier moving overset grid methods for rigid body motion, and unstructured grid methods for flow-structure interactions. Large scale deformation of the flow boundaries can be handled without a global regridding, and in a computationally efficient way. In terms of computational cost, even a full overset grid regridding is significantly cheaper than a full regridding of an unstructured grid for the same domain, especially in three dimensions. Numerical studies are used to verify accuracy and convergence of our flow solver. As a computational example, we consider two-dimensional incompressible flow past a flexible filament with prescribed dynamics.

  16. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  17. Virtual temporal bone: creation and application of a new computer-based teaching tool.

    PubMed

    Mason, T P; Applebaum, E L; Rasmussen, M; Millman, A; Evenhouse, R; Panko, W

    2000-02-01

    The human temporal bone is a 3-dimensionally complex anatomic region with many unique qualities that make anatomic teaching and learning difficult. Current teaching tools have proved only partially adequate for the needs of the aspiring otologic surgeon in learning this anatomy. We used a variety of computerized image processing and reconstruction techniques to reconstruct an anatomically accurate 3-dimensional computer model of the human temporal bone from serial histologic sections. The model is viewed with a specialized visualization system that allows it to be manipulated easily in a stereoscopic virtual environment. The model may then be interactively studied from any viewpoint, greatly simplifying the task of conceptualizing and learning this anatomy. The system also provides for simultaneous computer networking that can bring distant participants into a single shared virtual teaching environment. Future directions of the project are discussed. PMID:10652385

  18. A Microanalysis of Pair Problem Solving With and Without a Computer Tool.

    ERIC Educational Resources Information Center

    Derry, Sharon; And Others

    The social interactions that occur during pair problem solving were studied using a computer tool and without using the tool. The computer tool (the TAPS system) is an instructional system that presents complex word problems and provides a graphics user interface with tools for constructing problem trees (network structures showing…

  19. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    NASA Astrophysics Data System (ADS)

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-04-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers’ understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools within their own science investigations; discussed general technology issues; and explored, evaluated, and taught their peers about a particular modeling tool. Preservice teachers expanded their vision of the software available and the role that software can play in science teaching, but desired fun, easy-to-use software with scientifically accurate information within a clear, familiar learning task. Such conflict provided a fruitful platform for discussion and for potentially advancing preservice teachers’ pedagogical and epistemological understandings.

  20. Enabling fast, stable and accurate peridynamic computations using multi-time-step integration

    DOE PAGES

    Lindsay, P.; Parks, M. L.; Prakash, A.

    2016-04-13

    Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined, and several numerical examples are presented to corroborate the findings.

  1. Matrix-vector multiplication using digital partitioning for more accurate optical computing

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable with or greater than those of electronic computers, as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
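    The sketch below illustrates the digital-partitioning algorithm numerically: operands are split into base-B digit planes, the low-precision partial matrix-vector products are formed plane by plane (the step an analog optical processor would perform), and the partial results are recombined with powers of B. The base and digit count are arbitrary illustration values.

    ```python
    # Digital partitioning of an integer matrix-vector product into low-precision
    # partial products (illustrative; the optical hardware is of course absent).
    import numpy as np

    B = 16          # digit base, i.e. precision of a single analog pass
    NDIGITS = 3     # number of digit planes per operand

    def to_digits(x, base=B, ndigits=NDIGITS):
        """Decompose non-negative integers into ndigits base-`base` digit planes."""
        digits = []
        for _ in range(ndigits):
            digits.append(x % base)
            x = x // base
        return digits          # digits[k] carries weight base**k

    rng = np.random.default_rng(1)
    M = rng.integers(0, B**NDIGITS, size=(4, 4))
    v = rng.integers(0, B**NDIGITS, size=4)

    M_planes, v_planes = to_digits(M), to_digits(v)
    result = np.zeros(4, dtype=np.int64)
    for i, Mi in enumerate(M_planes):
        for j, vj in enumerate(v_planes):
            # each partial product needs only low-precision arithmetic
            result += (B ** (i + j)) * (Mi @ vj)

    assert np.array_equal(result, M @ v)   # recombination reproduces the exact product
    print(result)
    ```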

  2. Accurate computation and continuation of homoclinic and heteroclinic orbits for singular perturbation problems

    NASA Technical Reports Server (NTRS)

    Vaughan, William W.; Friedman, Mark J.; Monteiro, Anand C.

    1993-01-01

    In earlier papers, Doedel and the authors have developed a numerical method and derived error estimates for the computation of branches of heteroclinic orbits for a system of autonomous ordinary differential equations in R(exp n). The idea of the method is to reduce a boundary value problem on the real line to a boundary value problem on a finite interval by using a local (linear or higher order) approximation of the stable and unstable manifolds. A practical limitation for the computation of homoclinic and heteroclinic orbits has been the difficulty in obtaining starting orbits. Typically these were obtained from a closed form solution or via a homotopy from a known solution. Here we consider extensions of our algorithm which allow us to obtain starting orbits on the continuation branch in a more systematic way as well as make the continuation algorithm more flexible. In applications, we use the continuation software package AUTO in combination with some initial value software. The examples considered include computation of homoclinic orbits in a singular perturbation problem and in a turbulent fluid boundary layer in the wall region problem.
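    In generic notation (a sketch of the standard projection boundary conditions used in this line of work, not necessarily the authors' exact formulation), the truncation of a connecting orbit from x₋ to x₊ to a finite interval [-T, T] takes the form:

    ```latex
    % Projection boundary conditions for a truncated connecting orbit (sketch).
    \dot{x} = f(x,\lambda), \qquad
    L_s(x_-)^{\mathsf{T}}\bigl(x(-T) - x_-\bigr) = 0, \qquad
    L_u(x_+)^{\mathsf{T}}\bigl(x(T) - x_+\bigr) = 0 ,
    ```

    where the columns of L_s(x_-) span the stable left eigenspace of the linearization at x_- and the columns of L_u(x_+) span the unstable left eigenspace at x_+, so that the truncated orbit leaves x_- along its unstable manifold and approaches x_+ along its stable manifold.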

  3. Iofetamine I 123 single photon emission computed tomography is accurate in the diagnosis of Alzheimer's disease

    SciTech Connect

    Johnson, K.A.; Holman, B.L.; Rosen, T.J.; Nagel, J.S.; English, R.J.; Growdon, J.H.

    1990-04-01

    To determine the diagnostic accuracy of iofetamine hydrochloride I 123 (IMP) with single photon emission computed tomography in Alzheimer's disease, we studied 58 patients with AD and 15 age-matched healthy control subjects. We used a qualitative method to assess regional IMP uptake in the entire brain and to rate image data sets as normal or abnormal without knowledge of subjects' clinical classification. The sensitivity and specificity of IMP with single photon emission computed tomography in AD were 88% and 87%, respectively. In 15 patients with mild cognitive deficits (Blessed Dementia Scale score, less than or equal to 10), sensitivity was 80%. With the use of a semiquantitative measure of regional cortical IMP uptake, the parietal lobes were the most functionally impaired in AD and the most strongly associated with the patients' Blessed Dementia Scale scores. These results indicated that IMP with single photon emission computed tomography may be a useful adjunct in the clinical diagnosis of AD in early, mild disease.
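
    For readers less familiar with these measures, sensitivity and specificity are defined from the true/false positive and negative counts,

        \text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP},

    so the reported 88% sensitivity corresponds to roughly 0.88 x 58 ≈ 51 of the 58 AD scans read as abnormal, and the 87% specificity to roughly 0.87 x 15 ≈ 13 of the 15 control scans read as normal (approximate counts inferred from the percentages; the exact counts are not given in the record).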

  4. Accurate and Scalable O(N) Algorithm for First-Principles Molecular-Dynamics Computations on Large Parallel Computers

    SciTech Connect

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    We present the first truly scalable first-principles molecular dynamics algorithm with O(N) complexity and controllable accuracy, capable of simulating systems with finite band gaps of sizes that were previously impossible with this degree of accuracy. By avoiding global communications, we provide a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wave functions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 101 952 atoms on 23 328 processors, with a wall-clock time of the order of 1 min per molecular dynamics time step and numerical error on the forces of less than 7x10^-4 Ha/Bohr.

  5. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink-jet nozzles, etc., and the market is only just starting up. Establishing these products for a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economic constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  6. Materials by numbers: Computations as tools of discovery

    PubMed Central

    Landman, Uzi

    2005-01-01

    Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold, a metal that in bulk form is notorious for being chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210

  7. Strategies and computational tools for improving randomized protein libraries.

    PubMed

    Patrick, Wayne M; Firth, Andrew E

    2005-10-01

    In the last decade, directed evolution has become a routine approach for engineering proteins with novel or altered properties. Concurrently, a trend away from purely 'blind' randomization strategies and towards more 'semi-rational' approaches has also become apparent. In this review, we discuss ways in which structural information and predictive computational tools are playing an increasingly important role in guiding the design of randomized libraries: web servers such as ConSurf-HSSP and SCHEMA allow the prediction of sites to target for producing functional variants, while algorithms such as GLUE, PEDEL and DRIVeR are useful for estimating library completeness and diversity. In addition, we review recent methodological developments that facilitate the construction of unbiased libraries, which are inherently more diverse than biased libraries and therefore more likely to yield improved variants.
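
    For orientation, a minimal sketch of the kind of statistic such tools report is given below; it assumes L independent, equiprobable draws from V possible variants, whereas the actual programs use more detailed models of mutation bias and library construction.

        import math

        def expected_distinct_variants(library_size, diversity):
            """Expected number of distinct variants obtained when sampling
            `library_size` clones uniformly from `diversity` possible variants."""
            return diversity * (1.0 - math.exp(-library_size / diversity))

        # e.g. a 1e6-clone library drawn from 1e7 theoretically possible variants
        print(expected_distinct_variants(1e6, 1e7))   # ~9.5e5 distinct variants sampled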

  8. An adaptive grid method for computing time accurate solutions on structured grids

    NASA Technical Reports Server (NTRS)

    Bockelie, Michael J.; Smith, Robert E.; Eiseman, Peter R.

    1991-01-01

    The solution method consists of three parts: a grid movement scheme; an unsteady Euler equation solver; and a temporal coupling routine that links the dynamic grid to the Euler solver. The grid movement scheme is an algebraic method containing grid controls that generate a smooth grid that resolves the severe solution gradients and the sharp transitions in the solution gradients. The temporal coupling is performed with a grid prediction correction procedure that is simple to implement and provides a grid that does not lag the solution in time. The adaptive solution method is tested by computing the unsteady inviscid solutions for a one-dimensional shock tube and a two-dimensional shock-vortex interaction.

  9. Gravitational Focusing and the Computation of an Accurate Moon/Mars Cratering Ratio

    NASA Technical Reports Server (NTRS)

    Matney, Mark J.

    2006-01-01

    There have been a number of attempts to use asteroid populations to simultaneously compute cratering rates on the Moon and bodies elsewhere in the Solar System to establish the cratering ratio (e.g., [1],[2]). These works use current asteroid orbit population databases combined with collision rate calculations based on orbit intersections alone. As recent work on meteoroid fluxes [3] has highlighted, however, collision rates alone are insufficient to describe the cratering rates on planetary surfaces - especially planets with stronger gravitational fields than the Moon, such as Earth and Mars. Such calculations also need to include the effects of gravitational focusing, whereby the spatial density of the slower-moving impactors is preferentially "focused" by the gravity of the body. This leads overall to higher fluxes and cratering rates, and is highly dependent on the detailed velocity distributions of the impactors. In this paper, a comprehensive gravitational focusing algorithm originally developed to describe fluxes of interplanetary meteoroids [3] is applied to the collision rates and cratering rates of populations of asteroids and long-period comets to compute better cratering ratios for terrestrial bodies in the Solar System. These results are compared to the calculations of other researchers.
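
    For reference, the standard two-body gravitational-focusing factor enhances the collision cross-section of a body of radius R and mass M as

        \sigma = \pi R^2 \left( 1 + \frac{v_{\mathrm{esc}}^2}{v_\infty^2} \right), \qquad v_{\mathrm{esc}}^2 = \frac{2 G M}{R},

    where v_\infty is the impactor's velocity far from the body; slower impactors are focused more strongly, raising the flux and cratering rate. The algorithm cited above goes further by integrating this effect over the full impactor velocity distribution, which the simple expression alone does not capture.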

  10. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and that thus enables its application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
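
    The Green-Kubo relation referred to here expresses the thermal conductivity tensor in terms of the equilibrium heat-flux autocorrelation function,

        \kappa_{\alpha\beta} = \frac{V}{k_B T^2} \int_0^{\infty} \langle J_\alpha(t)\, J_\beta(0) \rangle \, dt,

    where V is the cell volume, T the temperature and J the heat flux; the slow convergence of this time integral with simulation length and cell size is precisely what the accelerated scheme addresses.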

  11. Computational tools for Brassica-Arabidopsis comparative genomics.

    PubMed

    Beckett, Paul; Bancroft, Ian; Trick, Martin

    2005-01-01

    Recent advances, such as the availability of extensive genome survey sequence (GSS) data and draft physical maps, are radically transforming the means by which we can dissect Brassica genome structure and systematically relate it to the Arabidopsis model. Hitherto, our view of the co-linearities between these closely related genomes had been largely inferred from comparative RFLP data, necessitating substantial interpolation and expert interpretation. Sequencing of the Brassica rapa genome by the Multinational Brassica Genome Project will, however, enable an entirely computational approach to this problem. Meanwhile we have been developing databases and bioinformatics tools to support our work in Brassica comparative genomics, including a recently completed draft physical map of B. rapa integrated with anchor probes derived from the Arabidopsis genome sequence. We are also exploring new ways to display the emerging Brassica-Arabidopsis sequence homology data. We have mapped all publicly available Brassica sequences in silico to the Arabidopsis TIGR v5 genome sequence and published this in the ATIDB database that uses Generic Genome Browser (GBrowse). This in silico approach potentially identifies all paralogous sequences and so we colour-code the significance of the mappings and offer an integrated, real-time multiple alignment tool to partition them into paralogous groups. The MySQL database driving GBrowse can also be directly interrogated, using the powerful API offered by the Perl Bio::DB::GFF methods, facilitating a wide range of data-mining possibilities.

  12. TRAC, a collaborative computer tool for tracer-test interpretation

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Klinka, T.; Thiéry, D.; Buscarlet, E.; Binet, S.; Jozja, N.; Défarge, C.; Leclerc, B.; Fécamp, C.; Ahumada, Y.; Elsass, J.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
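
    As an example of the kind of analytical solution such a tool typically assembles (the record does not list TRAC's specific formulas, so this is only a representative textbook case), the one-dimensional advection-dispersion response to an instantaneous injection of tracer mass M over a flow cross-section A is

        C(x,t) = \frac{M}{A \sqrt{4 \pi D t}} \exp\!\left( -\frac{(x - v t)^2}{4 D t} \right),

    with pore velocity v and longitudinal dispersion coefficient D; fitting measured breakthrough curves to such expressions yields the hydrodispersive parameters mentioned above.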

  13. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  14. Two-component density functional theory within the projector augmented-wave approach: Accurate and self-consistent computations of positron lifetimes and momentum distributions

    NASA Astrophysics Data System (ADS)

    Wiktor, Julia; Jomard, Gérald; Torrent, Marc

    2015-09-01

    Many techniques have been developed in the past in order to compute positron lifetimes in materials from first principles. However, there is still a lack of a fast and accurate self-consistent scheme that could handle accurately the forces acting on the ions induced by the presence of the positron. We will show in this paper that we have reached this goal by developing the two-component density functional theory within the projector augmented-wave (PAW) method in the open-source code abinit. This tool offers the accuracy of the all-electron methods with the computational efficiency of the plane-wave ones. We can thus deal with supercells that contain a few hundred to thousands of atoms to study point defects as well as more extended defect clusters. Moreover, using the PAW basis set allows us to use techniques able to, for instance, treat strongly correlated systems or spin-orbit coupling, which are necessary to study heavy elements, such as the actinides or their compounds.
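
    For context, in two-component DFT the positron lifetime is usually obtained from an annihilation rate of the standard form

        \lambda = \frac{1}{\tau} = \pi r_e^2 c \int n_-(\mathbf{r})\, n_+(\mathbf{r})\, \gamma\big(n_-(\mathbf{r})\big)\, d\mathbf{r},

    where n_- and n_+ are the electron and positron densities, r_e is the classical electron radius and γ is an enhancement factor accounting for electron-positron correlation. This is the textbook expression; the specific enhancement model implemented in abinit is not detailed in this record.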

  15. A distributed computing tool for generating neural simulation databases.

    PubMed

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
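
    A minimal sketch of the screen-saver work loop described above; the endpoint names and the command line are hypothetical placeholders, since the actual NeuronPM protocol and its NEURON invocation are not documented in this record.

        import json
        import subprocess
        import time
        import urllib.request

        SERVER = "http://example.org/neuron-cluster"   # hypothetical server URL

        def fetch_job():
            # ask the server for the next parameter set to simulate (hypothetical endpoint)
            with urllib.request.urlopen(SERVER + "/next_job") as resp:
                return json.loads(resp.read())

        def report_result(job_id, payload):
            # post the simulation output back to the server (hypothetical endpoint)
            req = urllib.request.Request(
                SERVER + "/results/" + str(job_id),
                data=json.dumps(payload).encode(),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)

        while True:
            job = fetch_job()
            if not job:
                time.sleep(60)                       # no work available: idle and poll again
                continue
            # run the simulator on the downloaded model file (placeholder command line)
            run = subprocess.run(["nrniv", job["model_file"]], capture_output=True, text=True)
            report_result(job["id"], {"stdout": run.stdout})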

  16. SUPIN: A Computational Tool for Supersonic Inlet Design

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2016-01-01

    A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.

  17. Fast and accurate CMB computations in non-flat FLRW universes

    SciTech Connect

    Lesgourgues, Julien; Tram, Thomas

    2014-09-01

    We present a new method for calculating CMB anisotropies in a non-flat Friedmann universe, relying on a very stable algorithm for the calculation of hyperspherical Bessel functions, that can be pushed to arbitrary precision levels. We also introduce a new approximation scheme which gradually takes over in the flat space limit and leads to significant reductions of the computation time. Our method is implemented in the Boltzmann code class. It can be used to benchmark the accuracy of the camb code in curved space, which is found to match expectations. For default precision settings, corresponding to 0.1% for scalar temperature spectra and 0.2% for scalar polarisation spectra, our code is two to three times faster, depending on curvature. We also simplify the temperature and polarisation source terms significantly, so the different contributions to the C_ℓ's are easy to identify inside the code.

  18. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    NASA Technical Reports Server (NTRS)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  19. A model for the accurate computation of the lateral scattering of protons in water.

    PubMed

    Bellinzona, E V; Ciocca, M; Embriaco, A; Ferrari, A; Fontana, A; Mairani, A; Parodi, K; Rotondi, A; Sala, P; Tessonnier, T

    2016-02-21

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy of the MC codes based on Molière theory, with a much shorter computing time.

  20. A model for the accurate computation of the lateral scattering of protons in water

    NASA Astrophysics Data System (ADS)

    Bellinzona, E. V.; Ciocca, M.; Embriaco, A.; Ferrari, A.; Fontana, A.; Mairani, A.; Parodi, K.; Rotondi, A.; Sala, P.; Tessonnier, T.

    2016-02-01

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy of the MC codes based on Molière theory, with a much shorter computing time.

  1. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour

    PubMed Central

    Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  2. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.

    PubMed

    Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  3. Computational Chemical Imaging for Cardiovascular Pathology: Chemical Microscopic Imaging Accurately Determines Cardiac Transplant Rejection

    PubMed Central

    Tiwari, Saumya; Reddy, Vijaya B.; Bhargava, Rohit; Raman, Jaishankar

    2015-01-01

    Rejection is a common problem after cardiac transplants leading to significant number of adverse events and deaths, particularly in the first year of transplantation. The gold standard to identify rejection is endomyocardial biopsy. This technique is complex, cumbersome and requires a lot of expertise in the correct interpretation of stained biopsy sections. Traditional histopathology cannot be used actively or quickly during cardiac interventions or surgery. Our objective was to develop a stain-less approach using an emerging technology, Fourier transform infrared (FT-IR) spectroscopic imaging to identify different components of cardiac tissue by their chemical and molecular basis aided by computer recognition, rather than by visual examination using optical microscopy. We studied this technique in assessment of cardiac transplant rejection to evaluate efficacy in an example of complex cardiovascular pathology. We recorded data from human cardiac transplant patients’ biopsies, used a Bayesian classification protocol and developed a visualization scheme to observe chemical differences without the need of stains or human supervision. Using receiver operating characteristic curves, we observed probabilities of detection greater than 95% for four out of five histological classes at 10% probability of false alarm at the cellular level while correctly identifying samples with the hallmarks of the immune response in all cases. The efficacy of manual examination can be significantly increased by observing the inherent biochemical changes in tissues, which enables us to achieve greater diagnostic confidence in an automated, label-free manner. We developed a computational pathology system that gives high contrast images and seems superior to traditional staining procedures. This study is a prelude to the development of real time in situ imaging systems, which can assist interventionists and surgeons actively during procedures. PMID:25932912

  5. Time-Accurate Computational Fluid Dynamics Simulation of a Pair of Moving Solid Rocket Boosters

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Williams, Brandon R.

    2011-01-01

    Since the Columbia accident, the threat to the Shuttle launch vehicle from debris during the liftoff timeframe has been assessed by the Liftoff Debris Team at NASA/MSFC. In addition to engineering methods of analysis, CFD-generated flow fields during the liftoff timeframe have been used in conjunction with 3-DOF debris transport methods to predict the motion of liftoff debris. Early models made use of a quasi-steady flow field approximation with the vehicle positioned at a fixed location relative to the ground; however, a moving overset mesh capability has recently been developed for the Loci/CHEM CFD software which enables higher-fidelity simulation of the Shuttle transient plume startup and liftoff environment. The present work details the simulation of the launch pad and mobile launch platform (MLP) with truncated solid rocket boosters (SRBs) moving in a prescribed liftoff trajectory derived from Shuttle flight measurements. Using Loci/CHEM, time-accurate RANS and hybrid RANS/LES simulations were performed for the timeframe T0+0 to T0+3.5 seconds, which consists of SRB startup to a vehicle altitude of approximately 90 feet above the MLP. Analysis of the transient flowfield focuses on the evolution of the SRB plumes in the MLP plume holes and the flame trench, impingement on the flame deflector, and especially impingement on the MLP deck resulting in upward flow which is a transport mechanism for debris. The results show excellent qualitative agreement with the visual record from past Shuttle flights, and comparisons to pressure measurements in the flame trench and on the MLP provide confidence in these simulation capabilities.

  6. An accurate and scalable O(N) algorithm for First-Principles Molecular Dynamics computations on petascale computers and beyond

    NASA Astrophysics Data System (ADS)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-03-01

    We present a truly scalable First-Principles Molecular Dynamics algorithm with O(N) complexity and fully controllable accuracy, capable of simulating systems of sizes that were previously impossible with this degree of accuracy. By avoiding global communication, we have extended W. Kohn's condensed matter "nearsightedness" principle to a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wavefunctions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 100,000 atoms on 100,000 processors, with a wall-clock time of the order of one minute per molecular dynamics time step. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  7. On accurate computations of slowly convergent atomic properties in few-electron ions and electron-electron correlations

    NASA Astrophysics Data System (ADS)

    Frolov, Alexei M.; Wardlaw, David M.

    2016-09-01

    We discuss an approach to accurate numerical computations of slowly convergent properties in two-electron atoms/ions, which include the negatively charged Ps^- (e^- e^+ e^-) and H^- ions, the He atom and positively charged, helium-like ions from Li^+ to Ni^26+. All these ions are considered in their ground 1^1S state(s). The slowly convergent properties selected in this study include the electron-nucleus ⟨r_eN^2k⟩ and electron-electron ⟨r_ee^2k⟩ expectation values for k = 2, 3, 4 and 5.

  8. Highly Accurate Frequency Calculations of Crab Cavities Using the VORPAL Computational Framework

    SciTech Connect

    Austin, T.M.; Cary, J.R.; Bellantoni, L.; /Argonne

    2009-05-01

    We have applied the Werner-Cary method [J. Comp. Phys. 227, 5200-5214 (2008)] for extracting modes and mode frequencies from time-domain simulations of crab cavities, as are needed for the ILC and the beam delivery system of the LHC. This method for frequency extraction relies on a small number of simulations, and post-processing using the SVD algorithm with Tikhonov regularization. The time-domain simulations were carried out using the VORPAL computational framework, which is based on the eminently scalable finite-difference time-domain algorithm. A validation study was performed on an aluminum model of the 3.9 GHz RF separators built originally at Fermi National Accelerator Laboratory in the US. Comparisons with measurements of the A15 cavity show that this method can provide accuracy to within 0.01% of experimental results after accounting for manufacturing imperfections. To capture the near degeneracies, two simulations, requiring in total a few hours on 600 processors, were employed. This method has applications across many areas including obtaining MHD spectra from time-domain simulations.
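
    The regularized least-squares step at the heart of such post-processing can be written compactly via the SVD. The sketch below shows only this generic Tikhonov solve, as a simplified stand-in for, not a reproduction of, the Werner-Cary mode-extraction procedure.

        import numpy as np

        def tikhonov_solve(A, b, lam):
            """Solve min_x ||A x - b||^2 + lam^2 ||x||^2 using the SVD of A."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            filt = s / (s**2 + lam**2)      # filter factors damp the small singular values
            return Vt.T @ (filt * (U.T @ b))

        # fit a slightly noisy overdetermined system
        A = np.random.rand(50, 10)
        x_true = np.random.rand(10)
        b = A @ x_true + 1e-3 * np.random.randn(50)
        x_est = tikhonov_solve(A, b, lam=1e-2)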

  9. Improved targeting device and computer navigation for accurate placement of brachytherapy needles

    SciTech Connect

    Pappas, Ion P.I.; Ryan, Paul; Cossmann, Peter; Kowal, Jens; Borgeson, Blake; Caversaccio, Marco

    2005-06-15

    Successful treatment of skull base tumors with interstitial brachytherapy requires high targeting accuracy for the brachytherapy needles to avoid harming vital anatomical structures. To enable safe placement of the needles in this area, we developed an image-based planning and navigation system for brachytherapy, which includes a custom-made mechanical positioning arm that allows rough and fine adjustment of the needle position. The fine-adjustment mechanism consists of an XYZ microstage at the base of the arm and a needle holder with two fine-adjustable inclinations. The rotation axes of the inclinations cross at the tip of the needle so that the inclinational adjustments do not interfere with the translational adjustments. A vacuum cushion and a noninvasive fixation frame are used for the head immobilization. To avoid mechanical bending of the needles due to the weight of attached tracking markers, which would be detrimental for targeting accuracy, only a single LED marker on the tail of the needle is used. An experimental phantom-based targeting study with this setup demonstrated that a positioning accuracy of 1.4 mm (rms) can be achieved. The study showed that the proposed setup allows brachytherapy needles to be easily aligned and inserted with high targeting accuracy according to a preliminary plan. The achievable accuracy is higher than if the needles are inserted manually. The proposed system can be linked to a standard afterloader and standard dosimetry planning module. The associated additional effort is reasonable for the clinical practice and therefore the proposed procedure provides a promising tool for the safe treatment of tumors in the skull base area.

  10. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    PubMed Central

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational. PMID:25615870

  11. Towards an accurate and computationally-efficient modelling of Fe(II)-based spin crossover materials.

    PubMed

    Vela, Sergi; Fumanal, Maria; Ribas-Arino, Jordi; Robert, Vincent

    2015-07-01

    The DFT + U methodology is regarded as one of the most-promising strategies to treat the solid state of molecular materials, as it may provide good energetic accuracy at a moderate computational cost. However, a careful parametrization of the U-term is mandatory since the results may be dramatically affected by the selected value. Herein, we benchmarked the Hubbard-like U-term for seven Fe(II)N6-based pseudo-octahedral spin crossover (SCO) compounds, using as a reference an estimation of the electronic enthalpy difference (ΔHelec) extracted from experimental data (T1/2, ΔS and ΔH). The parametrized U-value obtained for each of those seven compounds ranges from 2.37 eV to 2.97 eV, with an average value of U = 2.65 eV. Interestingly, we have found that this average value can be taken as a good starting point since it leads to an unprecedented mean absolute error (MAE) of only 4.3 kJ mol(-1) in the evaluation of ΔHelec for the studied compounds. Moreover, by comparing our results on the solid state and the gas phase of the materials, we quantify the influence of the intermolecular interactions on the relative stability of the HS and LS states, with an average effect of ca. 5 kJ mol(-1), whose sign cannot be generalized. Overall, the findings reported in this manuscript pave the way for future studies devoted to understanding the crystalline phase of SCO compounds, or the adsorption of individual molecules on organic or metallic surfaces, in which the rational incorporation of the U-term within DFT + U yields the required energetic accuracy that is dramatically missing when using bare-DFT functionals.

  12. Accurate micro-computed tomography imaging of pore spaces in collagen-based scaffold.

    PubMed

    Zidek, Jan; Vojtova, Lucy; Abdel-Mohsen, A M; Chmelik, Jiri; Zikmund, Tomas; Brtnikova, Jana; Jakubicek, Roman; Zubal, Lukas; Jan, Jiri; Kaiser, Jozef

    2016-06-01

    In this work we have used X-ray micro-computed tomography (μCT) as a method to observe the morphology of 3D porous pure collagen and collagen-composite scaffolds useful in tissue engineering. Two aspects of visualizations were taken into consideration: improvement of the scan and investigation of its sensitivity to the scan parameters. Due to the low material density, some parts of collagen scaffolds are invisible in a μCT scan. Therefore, here we present different contrast agents, which increase the contrast of the scanned biopolymeric sample for μCT visualization. The increase of contrast of collagenous scaffolds was performed with ceramic hydroxyapatite microparticles (HAp), silver ions (Ag(+)) and silver nanoparticles (Ag-NPs). Since a relatively small change in imaging parameters (e.g. in 3D volume rendering, threshold value and μCT acquisition conditions) leads to a completely different visualized pattern, we have optimized these parameters to obtain the most realistic picture for visual and qualitative evaluation of the biopolymeric scaffold. Moreover, scaffold images were stereoscopically visualized in order to better see the 3D biopolymer composite scaffold morphology. However, the optimized visualization has some discontinuities in zoomed view, which can be problematic for further analysis of interconnected pores by commonly used numerical methods. Therefore, we applied the locally adaptive method to solve the discontinuities issue. The combination of contrast agents and imaging techniques presented in this paper helps us to better understand the structure and morphology of the biopolymeric scaffold that is crucial in the design of new biomaterials useful in tissue engineering. PMID:27153826

  13. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    NASA Astrophysics Data System (ADS)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expansive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in presence of flood, requires modelling the behavior of different objects in the scene in order to associate them to flood or no flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
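
    A drastically simplified, hypothetical illustration of probabilistic data fusion in this spirit, reduced to a two-evidence Bayes update for a single pixel; the variables and numbers are invented for illustration and do not reproduce the authors' Bayesian-network model.

        def flood_posterior(prior, p_sar_given_flood, p_sar_given_dry,
                            p_lowland_given_flood, p_lowland_given_dry):
            """Posterior probability of flooding for a pixel given two (assumed independent)
            pieces of evidence: a dark SAR backscatter signature and low terrain elevation."""
            like_flood = p_sar_given_flood * p_lowland_given_flood * prior
            like_dry = p_sar_given_dry * p_lowland_given_dry * (1.0 - prior)
            return like_flood / (like_flood + like_dry)

        # illustrative (made-up) numbers: dark SAR return and low-lying terrain observed
        print(flood_posterior(prior=0.2,
                              p_sar_given_flood=0.9, p_sar_given_dry=0.2,
                              p_lowland_given_flood=0.8, p_lowland_given_dry=0.4))  # ~0.69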

  14. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-lasting issue to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly outline the formulations and developments of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends along the method developments and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304) and the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
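
    As a reminder of what an Ewald-type decomposition looks like, the Coulomb energy of a neutral periodic cell is split into a short-ranged real-space sum, a smooth reciprocal-space sum and a self-term (Gaussian units, real-space part shown in its minimum-image form):

        E = \frac{1}{2} \sum_{i \neq j} q_i q_j \frac{\operatorname{erfc}(\alpha r_{ij})}{r_{ij}}
            + \frac{2\pi}{V} \sum_{\mathbf{k} \neq 0} \frac{e^{-k^2/4\alpha^2}}{k^2}
              \left| \sum_j q_j e^{i \mathbf{k}\cdot\mathbf{r}_j} \right|^2
            - \frac{\alpha}{\sqrt{\pi}} \sum_i q_i^2,

    where α sets the split between the two sums; truncation-type and mean-field-type methods can be viewed as modifying or approximating one of these pieces.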

  15. Methods for Computing Accurate Atomic Spin Moments for Collinear and Noncollinear Magnetism in Periodic and Nonperiodic Materials.

    PubMed

    Manz, Thomas A; Sholl, David S

    2011-12-13

    The partitioning of electron spin density among atoms in a material gives atomic spin moments (ASMs), which are important for understanding magnetic properties. We compare ASMs computed using different population analysis methods and introduce a method for computing density derived electrostatic and chemical (DDEC) ASMs. Bader and DDEC ASMs can be computed for periodic and nonperiodic materials with either collinear or noncollinear magnetism, while natural population analysis (NPA) ASMs can be computed for nonperiodic materials with collinear magnetism. Our results show Bader, DDEC, and (where applicable) NPA methods give similar ASMs, but different net atomic charges. Because they are optimized to reproduce both the magnetic field and the chemical states of atoms in a material, DDEC ASMs are especially suitable for constructing interaction potentials for atomistic simulations. We describe the computation of accurate ASMs for (a) a variety of systems using collinear and noncollinear spin DFT, (b) highly correlated materials (e.g., magnetite) using DFT+U, and (c) various spin states of ozone using coupled cluster expansions. The computed ASMs are in good agreement with available experimental results for a variety of periodic and nonperiodic materials. Examples considered include the antiferromagnetic metal organic framework Cu3(BTC)2, several ozone spin states, mono- and binuclear transition metal complexes, ferri- and ferro-magnetic solids (e.g., Fe3O4, Fe3Si), and simple molecular systems. We briefly discuss the theory of exchange-correlation functionals for studying noncollinear magnetism. A method for finding the ground state of systems with highly noncollinear magnetism is introduced. We use these methods to study the spin-orbit coupling potential energy surface of the single molecule magnet Fe4C40H52N4O12, which has highly noncollinear magnetism, and find that it contains unusual features that give a new interpretation to experimental data.

  16. A More Accurate and Efficient Technique Developed for Using Computational Methods to Obtain Helical Traveling-Wave Tube Interaction Impedance

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    1999-01-01

    The phenomenal growth of commercial communications has created a great demand for traveling-wave tube (TWT) amplifiers. Although the helix slow-wave circuit remains the mainstay of the TWT industry because of its exceptionally wide bandwidth, until recently it has been impossible to accurately analyze a helical TWT using its exact dimensions because of the complexity of its geometrical structure. For the first time, an accurate three-dimensional helical model was developed that allows accurate prediction of TWT cold-test characteristics including operating frequency, interaction impedance, and attenuation. This computational model, which was developed at the NASA Lewis Research Center, allows TWT designers to obtain a more accurate value of interaction impedance than is possible using experimental methods. Obtaining helical slow-wave circuit interaction impedance is an important part of the design process for a TWT because it is related to the gain and efficiency of the tube. This impedance cannot be measured directly; thus, conventional methods involve perturbing a helical circuit with a cylindrical dielectric rod placed on the central axis of the circuit and obtaining the difference in resonant frequency between the perturbed and unperturbed circuits. A mathematical relationship has been derived between this frequency difference and the interaction impedance (ref. 1). However, because of the complex configuration of the helical circuit, deriving this relationship involves several approximations. In addition, this experimental procedure is time-consuming and expensive, but until recently it was widely accepted as the most accurate means of determining interaction impedance. The advent of an accurate three-dimensional helical circuit model (ref. 2) made it possible for Lewis researchers to fully investigate standard approximations made in deriving the relationship between measured perturbation data and interaction impedance. The most prominent approximations made

  17. Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools

    ERIC Educational Resources Information Center

    Jeon, Moongee

    2014-01-01

    This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…

  18. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of declarative…

  19. Computer Instrumentation and the New Tools of Science.

    ERIC Educational Resources Information Center

    Snyder, H. David

    1990-01-01

    The impact and uses of new technologies in science teaching are discussed. Included are computers, software, sensors, integrated circuits, computer signal access, and computer interfaces. Uses and advantages of these new technologies are suggested. (CW)

  20. Navigating Traditional Chinese Medicine Network Pharmacology and Computational Tools

    PubMed Central

    Chen, Jia-Lei; Xu, Li-Wen

    2013-01-01

    The concept of “network target” has ushered in a new era in the field of traditional Chinese medicine (TCM). As a new research approach, network pharmacology is based on the analysis of network models and systems biology. Taking advantage of advancements in systems biology, a high degree of integration data analysis strategy and interpretable visualization provides deeper insights into the underlying mechanisms of TCM theories, including the principles of herb combination, biological foundations of herb or herbal formulae action, and molecular basis of TCM syndromes. In this study, we review several recent developments in TCM network pharmacology research and discuss their potential for bridging the gap between traditional and modern medicine. We briefly summarize the two main functional applications of TCM network models: understanding/uncovering and predicting/discovering. In particular, we focus on how TCM network pharmacology research is conducted and highlight different computational tools, such as network-based and machine learning algorithms, and sources that have been proposed and applied to the different steps involved in the research process. To make network pharmacology research commonplace, some basic network definitions and analysis methods are presented. PMID:23983798

  1. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
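
    A minimal sketch of the computation CPA performs, independent of the Excel/R implementation described here (the variable names and cutoffs are illustrative only):

        def conditional_probability(stressor, response, x_cut, y_cut):
            """Estimate P(response >= y_cut | stressor >= x_cut) from paired observations."""
            pairs = [(s, r) for s, r in zip(stressor, response) if s >= x_cut]
            if not pairs:
                return float("nan")
            return sum(r >= y_cut for _, r in pairs) / len(pairs)

        # e.g. probability of a degraded-condition score given contamination above a cutoff
        stressor = [0.1, 0.4, 0.6, 0.9, 1.2]
        response = [0.2, 0.3, 0.7, 0.8, 0.9]
        print(conditional_probability(stressor, response, x_cut=0.5, y_cut=0.6))   # -> 1.0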

  2. Development of a computer tool to detect and classify nodules in ultrasound breast images

    NASA Astrophysics Data System (ADS)

    Marcomini, Karem D.; Carneiro, Antonio O.; Schiabel, Homero

    2014-03-01

Due to the high incidence rate of breast cancer in women, many procedures have been developed to assist diagnosis and early detection. Currently, ultrasonography has proved to be a useful tool in distinguishing benign and malignant masses. In this context, computer-aided diagnosis schemes provide the specialist with a more accurate and reliable second opinion, minimizing the visual subjectivity between observers. Thus, we propose the application of an automatic detection method based on the active contour technique in order to delineate the contour of the lesion precisely and provide a better understanding of its morphology. For this, a total of 144 phantom images were segmented and subjected to morphological opening and closing operations to smooth the edges. Morphological features were then extracted and selected as input parameters for a Multilayer Perceptron neural classifier, which achieved 95.34% correct classification of the data and an Az of 0.96.
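
    The processing chain described above (edge smoothing by morphological opening/closing, extraction of morphological features, classification with a Multilayer Perceptron) can be sketched as follows; this is only an illustration on synthetic masks using scikit-image and scikit-learn, with hypothetical features and parameters, not the authors' pipeline or data.

    ```python
    # Illustrative sketch of the described post-processing chain on synthetic binary
    # masks: morphological opening/closing, shape-feature extraction, MLP classification.
    # Library choices (scikit-image, scikit-learn) and the features are assumptions.
    import numpy as np
    from skimage.draw import ellipse
    from skimage.morphology import binary_opening, binary_closing, disk
    from skimage.measure import label, regionprops
    from sklearn.neural_network import MLPClassifier

    def synthetic_mask(irregular, rng):
        """Rough stand-in for a segmented lesion: smooth or irregular outline."""
        img = np.zeros((128, 128), dtype=bool)
        rr, cc = ellipse(64, 64, 30, 38 if irregular else 20, shape=img.shape)
        img[rr, cc] = True
        if irregular:                                # add speckle-like boundary noise
            img ^= rng.random(img.shape) < 0.03
        return img

    def shape_features(mask):
        mask = binary_closing(binary_opening(mask, disk(3)), disk(3))   # smooth the edges
        props = max(regionprops(label(mask)), key=lambda p: p.area)     # main lesion region
        return [props.eccentricity, props.solidity,
                props.perimeter ** 2 / (4.0 * np.pi * props.area)]      # compactness-like ratio

    rng = np.random.default_rng(1)
    X = [shape_features(synthetic_mask(irregular=bool(i % 2), rng=rng)) for i in range(60)]
    y = [i % 2 for i in range(60)]                   # 0 = smooth outline, 1 = irregular outline

    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X[:40], y[:40])
    print("held-out accuracy:", clf.score(X[40:], y[40:]))
    ```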

  3. Physics education through computational tools: the case of geometrical and physical optics

    NASA Astrophysics Data System (ADS)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-09-01

Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we show the pedagogic strategies implemented for the geometrical and physical optics courses for students of optometry. We describe the use of the GeoGebra software for the geometrical optics class and of new in-house software for the physical optics class, written in the high-level programming language Python, together with the corresponding activities developed for each of these applets.
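
    In the spirit of the in-house Python applets mentioned above (but not the authors' code), a minimal physical-optics example: the Fraunhofer double-slit intensity pattern, with the wavelength and slit geometry chosen arbitrarily for illustration.

    ```python
    # Illustrative physical-optics sketch: far-field (Fraunhofer) double-slit pattern.
    # Slit width a, separation d and wavelength lam are arbitrary assumed values.
    import numpy as np

    lam = 633e-9            # wavelength [m]
    a, d = 40e-6, 250e-6    # slit width and centre-to-centre separation [m]
    theta = np.linspace(-0.02, 0.02, 2001)       # diffraction angle [rad]

    beta = np.pi * a * np.sin(theta) / lam
    alpha = np.pi * d * np.sin(theta) / lam
    envelope = np.sinc(beta / np.pi) ** 2        # np.sinc(x) = sin(pi*x)/(pi*x)
    intensity = envelope * np.cos(alpha) ** 2    # normalized double-slit intensity

    print("central intensity:", intensity[theta.size // 2])
    print("first envelope minima near theta = +/-", lam / a, "rad")
    ```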

  4. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

Porting applications to high performance parallel computers is always a challenging task. It is time consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g., MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a sufficiently thorough data dependence analysis. To overcome the deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO is aimed at taking advantage of detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of

  5. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990s, when it was first distributed as part of the SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been mainly written by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code into a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPMs). We will describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPMs: attenuation tables, built-in models and generalized attenuation models. In the case of built-in models, there is, by default, a set ready to use in CRISIS, but additional custom GMPMs
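
    CRISIS itself handles a rich set of source geometries, seismicity models and GMPMs; the toy sketch below only illustrates the underlying Poissonian hazard integral for a single point source with a truncated Gutenberg-Richter magnitude distribution and a generic lognormal GMPE. All numerical values are assumptions for illustration, not CRISIS inputs.

    ```python
    # Toy probabilistic seismic hazard sketch (not CRISIS): annual exceedance rate of
    # peak ground acceleration for one Poissonian point source. All parameters assumed.
    import numpy as np
    from scipy.stats import norm

    # Source: activity rate and truncated Gutenberg-Richter magnitude distribution
    nu, b, m_min, m_max = 0.05, 1.0, 5.0, 7.5       # events/yr above m_min
    mags = np.linspace(m_min, m_max, 200)
    beta = b * np.log(10.0)
    pdf_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

    # Generic GMPE: ln PGA[g] = c0 + c1*M - c2*ln(R + 10), with lognormal scatter sigma
    R, c0, c1, c2, sigma = 30.0, -3.5, 1.0, 1.2, 0.6
    ln_median = c0 + c1 * mags - c2 * np.log(R + 10.0)
    dm = mags[1] - mags[0]

    for a in (0.05, 0.1, 0.2, 0.4, 0.8):            # PGA levels in g
        # P(PGA > a | m) integrated over magnitude, scaled by the activity rate
        p_exceed = norm.sf((np.log(a) - ln_median) / sigma)
        lam = nu * np.sum(p_exceed * pdf_m) * dm
        print(f"PGA > {a:4.2f} g: annual rate = {lam:.3e}, "
              f"50-yr exceedance prob = {1.0 - np.exp(-50.0 * lam):.3f}")
    ```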

  6. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    SciTech Connect

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.

  7. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  8. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    SciTech Connect

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  9. IDO Scheme for Accurate Computation of Seismic Waves I. Plane-Wave Response of a Vertically Heterogeneous Medium

    NASA Astrophysics Data System (ADS)

    Ohkawauchi, K.; Takenaka, H.

    2006-12-01

We propose a new method for the calculation of seismic wave propagation using the interpolated differential operator (IDO; Aoki, 1997), a numerical method for solving partial differential equations based on a highly accurate interpolation of the profiles of the independent variables over a local area. It improves the accuracy of wave computation because the local interpolation can represent high-order behavior of the wave field between grid points. In addition, the locality of this approach makes exact treatment of boundary conditions possible. In this study, we address the computation of plane-wave responses of vertically heterogeneous structure models. We then solve the elastodynamic equation for plane waves derived by Tanaka and Takenaka (2005). The equations to be solved in our method are not only the velocity-stress equations but also the corresponding ones integrated over each cell between adjacent grid points. We use two staggered-grid systems, which can be non-uniform, and then discretize the governing equations using a finite-difference scheme that is second-order accurate in time and the second-order Hermite interpolation in space. In this method, the second-order Hermite interpolation of particle velocity or stress is obtained from the values at the two adjacent grid points and the integrated value over the cell between them. The original and integrated quantities are advanced in time, and in the following time step they are computed on the grid system alternate to the one used in the current time step. In the implementation of the free-surface boundary condition, all field quantities are located exactly on the free surface. Their computational accuracy is of the same order as in the rest of the spatial domain. We implement the interface condition in a similar way to the free-surface condition. We used some simple models to test the scheme. The results showed that the waveforms calculated by our method fit the
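
    A minimal illustration of the cell-wise reconstruction idea the scheme relies on: recovering a local quadratic from the two endpoint values and the cell integral. This is only the interpolation building block, not the full velocity-stress IDO solver, and the test function is an arbitrary example.

    ```python
    # Minimal illustration of an IDO-style local reconstruction (not the full scheme):
    # recover a quadratic on one cell from its two endpoint values and its cell integral.
    import numpy as np

    def quadratic_from_values_and_integral(f0, f1, cell_integral, h):
        """Coefficients (a, b, c) of p(xi) = a + b*xi + c*xi**2 on xi in [0, 1],
        matching p(0)=f0, p(1)=f1 and the integral of f over the cell of width h."""
        mean = cell_integral / h                 # average of f over the cell
        a = f0
        c = 3.0 * (f0 + f1 - 2.0 * mean)
        b = f1 - f0 - c
        return a, b, c

    # Verify on an exact quadratic f(x) = 2 + x + 4*x**2 over the cell [0, h]
    h = 0.5
    f = lambda x: 2.0 + x + 4.0 * x**2
    integral = 2.0 * h + h**2 / 2.0 + 4.0 * h**3 / 3.0
    a, b, c = quadratic_from_values_and_integral(f(0.0), f(h), integral, h)

    xi = np.linspace(0.0, 1.0, 5)
    reconstructed = a + b * xi + c * xi**2
    print(np.allclose(reconstructed, f(xi * h)))   # True: exact for quadratics
    ```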

  10. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, masscons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to
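
    G-CAT itself is a MATLAB-based tool with over 120 filter states; the toy Python sketch below only illustrates the core idea of propagating a covariance matrix through a linearized model instead of running many Monte Carlo trajectories, using an assumed two-state constant-velocity system.

    ```python
    # Toy illustration of linearized covariance propagation (the idea behind a
    # covariance-analysis tool, not G-CAT itself): a 2-state constant-velocity model.
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])                 # state transition (position, velocity)
    Q = np.diag([0.0, 1e-4])                   # process noise (accelerometer-like error)

    P = np.diag([1.0, 0.01])                   # initial knowledge covariance
    for _ in range(100):
        P = F @ P @ F.T + Q                    # one covariance propagation step

    # The 1-sigma "error ellipse" axes come directly from the propagated covariance,
    # in contrast to estimating them from thousands of Monte Carlo runs.
    sigmas = np.sqrt(np.linalg.eigvalsh(P))
    print("1-sigma ellipse semi-axes after 100 steps:", sigmas)
    ```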

  11. TestMaker: A Computer-Based Test Development Tool.

    ERIC Educational Resources Information Center

    Gibbs, William J.; Lario-Gibbs, Annette M.

    This paper discusses a computer-based prototype called TestMaker that enables educators to create computer-based tests. Given the functional needs of faculty, the host of research implications computer technology has for assessment, and current educational perspectives such as constructivism and their impact on testing, the purposes for developing…

  12. Magnetic Resonance Imaging and GeneXpert: A Rapid and Accurate Diagnostic Tool for the Management of Tuberculosis of the Spine

    PubMed Central

    Chhabra, Harvinder Singh; Mahajan, Rajat; Chabra, Tarun; Batra, Sahil

    2016-01-01

Study Design: Retrospective study. Purpose: The aim of this study was to analyze various diagnostic tools, including GeneXpert, for the management of tuberculosis of the spine. Overview of Literature: Traditional diagnostic methods of microscopy, histology, and culture have low sensitivity and specificity for the management of tuberculosis of the spine. Methods: Of the 262 treated cases of spinal tuberculosis, data on 1-year follow-up were available for 217 cases. Of these, only 145 cases with a confirmed diagnosis were selected for retrospective analysis. Results: In 145 of the 217 patients (66.80%), diagnosis was confirmed on the basis of a culture. Of the 145 patients with a confirmed diagnosis, 98 (66.20%) patients were diagnosed on the basis of clinical presentation, whereas 123 (84.8%) exhibited a typical magnetic resonance imaging (MRI) picture. In 99 surgically treated patients, the diagnosis was confirmed on the basis of an intraoperative tissue biopsy. Among the 46 patients treated conservatively, 35 underwent a transpedicular biopsy, 4 patients underwent computed tomography-guided biopsy, 6 patients were diagnosed on the basis of material obtained from a cold abscess, and 1 patient underwent an open biopsy. The sensitivity of culture for the detection of Mycobacterium tuberculosis was 66.80% (145/217) in our patients. Among the cases in which GeneXpert was used, the sensitivity for the detection of Mycobacterium tuberculosis was 93.4% (43/46). Moreover, the sensitivity of GeneXpert to detect rifampicin resistance was 100% (7/7) in our study. Conclusions: The majority of patients with tuberculosis of the spine can be diagnosed on the basis of a typical radiological presentation via MRI. In our study, 84.8% of cases exhibited typical MRI findings. For patients presenting with atypical MRI features, a rapid and accurate diagnosis is possible by combining GeneXpert with MRI. The combined use of MRI and GeneXpert is a rapid and highly sensitive tool to diagnose

  13. BioBloom tools: fast, accurate and memory-efficient host species sequence screening using bloom filters

    PubMed Central

    Chu, Justin; Sadeghi, Sara; Raymond, Anthony; Jackman, Shaun D.; Nip, Ka Ming; Mar, Richard; Mohamadi, Hamid; Butterfield, Yaron S.; Robertson, A. Gordon; Birol, Inanç

    2014-01-01

Large datasets can be screened for sequences from a specific organism, quickly and with low memory requirements, by a data structure that supports time- and memory-efficient set membership queries. Bloom filters offer such queries but require that false positives be controlled. We present BioBloom Tools, a Bloom filter-based sequence-screening tool that is faster than BWA, Bowtie 2 (popular alignment algorithms) and FACS (a membership query algorithm). It delivers accuracies comparable with these tools, controls false positives and has low memory requirements. Availability and implementation: www.bcgsc.ca/platform/bioinfo/software/biobloomtools Contact: cjustin@bcgsc.ca or ibirol@bcgsc.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25143290
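
    Not the BioBloom Tools implementation, but a minimal pure-Python Bloom filter over k-mers that illustrates the time- and memory-efficient membership queries described above; the bit-array size, hash construction and k-mer length are arbitrary assumptions.

    ```python
    # Minimal Bloom-filter sketch for k-mer membership (illustrative only, not
    # BioBloom Tools): bit array plus multiple hashes; false positives are possible,
    # false negatives are not.
    import hashlib

    class BloomFilter:
        def __init__(self, n_bits=1 << 20, n_hashes=4):
            self.n_bits, self.n_hashes = n_bits, n_hashes
            self.bits = bytearray(n_bits // 8)

        def _positions(self, item):
            for i in range(self.n_hashes):
                digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
                yield int.from_bytes(digest[:8], "big") % self.n_bits

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos // 8] |= 1 << (pos % 8)

        def __contains__(self, item):
            return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

    def kmers(seq, k=21):
        return (seq[i:i + k] for i in range(len(seq) - k + 1))

    host = BloomFilter()
    reference = "ACGT" * 100                    # stand-in for a host reference sequence
    for km in kmers(reference):
        host.add(km)

    read = "ACGT" * 6                           # a 24-bp read sharing host k-mers
    print("read matches host filter:", all(km in host for km in kmers(read)))
    ```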

  14. The modified card agglutination test: an accurate tool for detecting anaplasmosis in Columbian black-tailed deer.

    PubMed

    Howarth, A; Hokama, Y; Amerault, T E

    1976-07-01

    Inoculation of susceptible calves confirmed that the modified card agglutination test accurately detected the anaplasmosis infection status of each of 35 Columbian black-tailed deer (Odocoileus hemionus columbianus). Anaplasma marginale, and specific antibodies, were demonstrated only in calves which received blood from deer that were positive by the card test. The modified card agglutination testing of deer serum was performed in the manner recommended for testing cattle serum with bovine-origin antigen and bovine serum factor.

  15. Professors' and students' perceptions and experiences of computational simulations as learning tools

    NASA Astrophysics Data System (ADS)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  16. Accurate molecular structure and spectroscopic properties of nucleobases: a combined computational-microwave investigation of 2-thiouracil as a case study.

    PubMed

    Puzzarini, Cristina; Biczysko, Malgorzata; Barone, Vincenzo; Peña, Isabel; Cabezas, Carlos; Alonso, José L

    2013-10-21

    The computational composite scheme purposely set up for accurately describing the electronic structure and spectroscopic properties of small biomolecules has been applied to the first study of the rotational spectrum of 2-thiouracil. The experimental investigation was made possible thanks to the combination of the laser ablation technique with Fourier transform microwave spectrometers. The joint experimental-computational study allowed us to determine the accurate molecular structure and spectroscopic properties of the title molecule, but more importantly, it demonstrates a reliable approach for the accurate investigation of isolated small biomolecules.

  17. Accurate 3d Textured Models of Vessels for the Improvement of the Educational Tools of a Museum

    NASA Astrophysics Data System (ADS)

    Soile, S.; Adam, K.; Ioannidis, C.; Georgopoulos, A.

    2013-02-01

Besides the demonstration of the findings, modern museums organize educational programs which aim at experience and knowledge sharing combined with entertainment rather than at pure learning. Toward that effort, 2D and 3D digital representations are gradually replacing the traditional recording of the findings through photos or drawings. The present paper refers to a project that aims to create 3D textured models of two lekythoi that are exhibited in the National Archaeological Museum of Athens in Greece; on the surfaces of these lekythoi, scenes of the adventures of Odysseus are depicted. The project is expected to support the production of an educational movie and some other relevant interactive educational programs for the museum. The creation of accurate developments of the paintings and of accurate 3D models is the basis for the visualization of the adventures of the mythical hero. The data collection was made by using a structured light scanner consisting of two machine vision cameras that are used for the determination of the geometry of the object, a high resolution camera for the recording of the texture, and a DLP projector. The creation of the final accurate 3D textured model is a complicated and laborious procedure which includes the collection of geometric data, the creation of the surface, the noise filtering, the merging of individual surfaces, the creation of a c-mesh, the creation of the UV map, the provision of the texture and, finally, the general processing of the 3D textured object. For a better result, a combination of commercial and in-house software made for the automation of various steps of the procedure was used. The results derived from the above procedure were especially satisfactory in terms of accuracy and quality of the model. However, the procedure proved to be time-consuming, while the use of various software packages requires the services of a specialist.

  18. WASTE REDUCTION USING COMPUTER-AIDED DESIGN TOOLS

    EPA Science Inventory

    Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized.
    Process simulators can be effective tools i...

  19. Fine structure in proton radioactivity: An accurate tool to ascertain the breaking of axial symmetry in {sup 145}Tm

    SciTech Connect

    Arumugam, P.; Ferreira, L. S.; Maglione, E.

    2008-10-15

With a proper formalism for proton emission from triaxially deformed nuclei, we perform exact calculations of decay widths for the decays to the ground and first excited 2{sup +} states in the daughter nucleus. Our results for the rotational spectrum, decay width and fine structure in the case of the nucleus {sup 145}Tm lead for the first time to an accurate identification of triaxial deformation using proton emission. This work also demonstrates the advantage of proton emission over conventional probes for studying nuclear structure at the proton drip line.

  20. A Writer's Tool: Computing as a Mode of Inventing.

    ERIC Educational Resources Information Center

    Burns, Hugh

    Computer assisted instruction can be used for stimulating rhetorical invention in English composition. The computer program is responsible for the direction of the inquiry and the motivational sequence while the writer is responsible for the content. The resulting interaction raises to the conscious level what writers already know about their…

  1. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  2. Computer Art--A New Tool in Advertising Graphics.

    ERIC Educational Resources Information Center

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  3. ATLAS Distributed Computing Monitoring tools after full 2 years of LHC data taking

    NASA Astrophysics Data System (ADS)

    Schovancová, Jaroslava

    2012-12-01

    This paper details a variety of Monitoring tools used within ATLAS Distributed Computing during the first 2 years of LHC data taking. We discuss tools used to monitor data processing from the very first steps performed at the CERN Analysis Facility after data is read out of the ATLAS detector, through data transfers to the ATLAS computing centres distributed worldwide. We present an overview of monitoring tools used daily to track ATLAS Distributed Computing activities ranging from network performance and data transfer throughput, through data processing and readiness of the computing services at the ATLAS computing centres, to the reliability and usability of the ATLAS computing centres. The described tools provide monitoring for issues of varying levels of criticality: from identifying issues with the instant online monitoring to long-term accounting information.

  4. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  5. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  6. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    SciTech Connect

    Grout, R. W.

    2013-10-01

This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification processes and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
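
    A toy sketch of the Photon Monte Carlo idea (not the IHT library): isotropic photon bundles emitted from a point source are absorbed by whichever opaque spherical particle they hit first. The particle layout and bundle count are assumed for illustration.

    ```python
    # Toy photon Monte Carlo sketch (not IHT): isotropic bundles from a point emitter,
    # absorbed by whichever opaque spherical particle they hit first. Geometry assumed.
    import numpy as np

    rng = np.random.default_rng(2)
    centers = np.array([[2.0, 0.0, 0.0],
                        [0.0, 3.0, 0.0],
                        [-4.0, 0.0, 1.0]])
    radii = np.array([0.5, 1.0, 0.8])

    def ray_sphere_hit(direction, center, radius):
        """Distance along the ray from the origin to the sphere, or inf if missed."""
        b = -2.0 * direction @ center
        c = center @ center - radius**2
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return np.inf
        t = (-b - np.sqrt(disc)) / 2.0
        return t if t > 0.0 else np.inf

    n_bundles = 50_000
    absorbed = np.zeros(len(radii), dtype=int)
    for _ in range(n_bundles):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)                  # isotropic emission direction
        hits = [ray_sphere_hit(d, c, r) for c, r in zip(centers, radii)]
        k = int(np.argmin(hits))
        if np.isfinite(hits[k]):
            absorbed[k] += 1                    # nearest particle absorbs the bundle

    print("absorbed fraction per particle:", absorbed / n_bundles)
    ```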

  7. Further Uses of the Analog Computer as a Teaching Tool

    ERIC Educational Resources Information Center

    Shonle, John I.

    1976-01-01

    Discusses the use of an analog computer oscilloscope to illustrate the transition from underdamped to overdamped for the simple harmonic oscillator, the maximum range for a projectile, and the behavior of charged particles in crossed electric and magnetic fields. (MLH)

  8. Computer Databases as an Educational Tool in the Basic Sciences.

    ERIC Educational Resources Information Center

    Friedman, Charles P.; And Others

    1990-01-01

    The University of North Carolina School of Medicine developed a computer database, INQUIRER, containing scientific information in bacteriology, and then integrated the database into routine educational activities for first-year medical students in their microbiology course. (Author/MLW)

  9. Library-based statistical reproduction as a tool for computationally efficient climate model emulation

    NASA Astrophysics Data System (ADS)

    Castruccio, S.; McInerney, D.; Stein, M. L.; Moyer, E. J.

    2011-12-01

The computational demands of modern general circulation models (GCMs) limit their use in a number of areas. Model comparisons, understanding of the physics of climate behavior, and policy analysis would all benefit greatly from a means of reproducing the behavior of a full GCM with lower computational requirements. We show here that library-based statistical modeling can be used to accurately emulate GCM output for arbitrary trajectories of concentration of CO2. To demonstrate this, we constructed a library of runs made with the NCAR Community Climate System Model version 3 (CCSM3) at T31 resolution, and use a subset of the library and a simple statistical model that accounts for temporal autocorrelation and semilinear dependence on the past forcing history to emulate independent scenarios. The library to date consists of 18 forcing scenarios, both realistic (linear and logistic increases) and unrealistic (instantaneous increases or decreases), with most scenarios run with 5 different initial conditions and the longest run over 3000 years in duration. We show that given a trajectory of CO2 concentrations, we can reproduce annual temperature and precipitation in several-hundred-year climate projections at scales from global to subcontinental to an accuracy within the intrinsic short-term variability of model output. Both the abilities and limitations of the fit shed light on physical climate processes. The statistical fit captures the characteristic responses of transient climates that depend on the rate of change of radiative forcing, including suppression of precipitation in conditions of rapid increases in radiative forcing. On the other hand, the same fit cannot be used to emulate conditions of rising and falling radiative forcing, showing basic differences in the physics of transient responses. Statistical fits are accurate on both global and subcontinental (32 regions worldwide) scales, with the regional fits demonstrating clear superiority over a linear
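
    A toy version of the library-based emulation idea (not the authors' fitted statistical model): a semilinear response to log CO2 is fitted on a synthetic "library" run and then used to emulate an independent forcing scenario; the response magnitude, AR(1) noise and scenarios are all assumptions.

    ```python
    # Toy statistical-emulator sketch: global temperature as a linear function of
    # log CO2 plus AR(1) internal variability, trained on a synthetic "library" run
    # and applied to a new scenario without rerunning the (stand-in) GCM.
    import numpy as np

    rng = np.random.default_rng(3)
    years = np.arange(300)

    def gcm_like(co2):
        """Synthetic stand-in for GCM output: 3 K per CO2 doubling plus AR(1) noise."""
        forced = 3.0 * np.log2(co2 / co2[0])
        noise = np.zeros_like(forced)
        for t in range(1, len(noise)):
            noise[t] = 0.6 * noise[t - 1] + rng.normal(0, 0.15)
        return forced + noise

    co2_library = 280.0 * np.exp(0.005 * years)     # "library" forcing scenario
    temp_library = gcm_like(co2_library)

    # Fit the semilinear response T = a + b * log(CO2) by least squares
    X = np.column_stack([np.ones_like(years, dtype=float), np.log(co2_library)])
    a, b = np.linalg.lstsq(X, temp_library, rcond=None)[0]

    # Emulate an independent scenario (logistic-like stabilization)
    co2_new = 280.0 + 560.0 / (1.0 + np.exp(-(years - 150) / 30.0))
    emulated = a + b * np.log(co2_new)
    print("emulated warming at year 300: %.2f K" % (emulated[-1] - emulated[0]))
    ```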

  10. EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning

    ERIC Educational Resources Information Center

    Kitchakarn, Orachorn

    2015-01-01

    The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regards, some variables which might be potential antecedents of attitudes toward computer including gender, experience of using computers and perceived abilities in using programs were examined.…

  11. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  12. Evaluating Tablet Computers as a Survey Tool in Rural Communities

    PubMed Central

    Newell, Steve M.; Logan, Henrietta L.; Guo, Yi; Marks, John G.; Shepperd, James A.

    2015-01-01

Purpose: Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants’ responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. Methods: We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida’s state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Findings: Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants’ usability ratings. Conclusions: Overall, participants reported similar scale responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. PMID:25243953

  13. Measurement Model for Division as a Tool in Computing Applications

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Strock, Tracy

    2002-01-01

    The paper describes the use of a spreadsheet in a mathematics teacher education course. It shows how the tool can serve as a link between seemingly disconnected mathematical concepts. The didactical triad of using a spreadsheet as an agent, consumer, and amplifier of mathematical activities allows for an extended investigation of simple yet…

  14. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  15. Computational tool for simulation of power and refrigeration cycles

    NASA Astrophysics Data System (ADS)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modeling the optimal changes for best performance. There is also a big boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because cogeneration heat sources are very different and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform Qt environment and running on the Windows and Linux operating systems. The tool allows the design of custom power cycles, selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculation of the plant efficiency, identification of the flow fractions in each branch and, finally, generation of a very educational report in PDF format via the LaTeX tool.
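
    The abstract names CoolProp as the property backend; the sketch below is a minimal ideal Rankine-cycle efficiency calculation using CoolProp's Python bindings rather than the C++/Qt tool itself, with the fluid and state points chosen arbitrarily for illustration.

    ```python
    # Minimal ideal Rankine-cycle efficiency sketch using the CoolProp property
    # library (the abstract's backend). Fluid and state points are assumed values.
    from CoolProp.CoolProp import PropsSI

    fluid = "Water"
    p_cond, p_boil, T_turbine_in = 10e3, 8e6, 773.15   # Pa, Pa, K

    # 1 -> 2: saturated liquid pumped isentropically (treated as incompressible)
    h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)
    v1 = 1.0 / PropsSI("D", "P", p_cond, "Q", 0, fluid)
    h2 = h1 + v1 * (p_boil - p_cond)

    # 2 -> 3: heat addition in the boiler to superheated steam
    h3 = PropsSI("H", "P", p_boil, "T", T_turbine_in, fluid)
    s3 = PropsSI("S", "P", p_boil, "T", T_turbine_in, fluid)

    # 3 -> 4: isentropic expansion in the turbine to condenser pressure
    h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)

    w_net = (h3 - h4) - (h2 - h1)          # J/kg
    q_in = h3 - h2
    print(f"thermal efficiency: {w_net / q_in:.3f}")
    ```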

  16. Conceptually enhanced simulations: A computer tool for science teaching

    NASA Astrophysics Data System (ADS)

    Snir, Joseph; Smith, Carol; Grosslight, Lorraine

    1993-06-01

    In this paper, we consider a way computer simulations can be used to address the problem of teaching for conceptual change and understanding. After identifying three levels of understanding of a natural phenomenon (concrete, conceptual, and metaconceptual) that need to be addressed in school science, and classifying computer model systems and simulations more generally in terms of the design choices facing the programmer, we argue that there are ways to design computer simulations that can make them more powerful than laboratory models. In particular, computer simulations that provide an explicit representation for a set of interrelated concepts allow students to perceive what cannot be directly observed in laboratory experiments: representations for the concepts and ideas used for interpreting the experiment. Further, by embedding the relevant physical laws directly into the program code, these simulations allow for genuine discoveries. We describe how we applied these ideas in developing a computer simulation for a particular set of purposes: to help students grasp the distinction between mass and density and to understand the phenomenon of flotation in terms of these concepts. Finally, we reflect on the kinds of activities such conceptually enhanced simulations allow that may be important in bringing about the desired conceptual change.
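
    A minimal example of what "embedding the relevant physical laws directly into the program code" can look like for the mass/density/flotation case discussed above; the objects and fluids are invented for illustration.

    ```python
    # Minimal example of embedding the flotation law directly in code, in the spirit
    # of the conceptually enhanced simulations discussed (objects and fluids assumed).
    objects = {"wax block": (45.0, 50.0),        # name: (mass in g, volume in cm^3)
               "aluminum cube": (27.0, 10.0),
               "hollow steel ball": (30.0, 60.0)}
    fluids = {"water": 1.00, "oil": 0.92}        # density in g/cm^3

    for fluid, rho_fluid in fluids.items():
        for name, (mass, volume) in objects.items():
            rho = mass / volume                  # density is mass per unit volume
            verdict = "floats" if rho < rho_fluid else "sinks"
            print(f"{name} (rho={rho:.2f} g/cm^3) {verdict} in {fluid}")
    ```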

  17. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  18. Distributed design tools: Mapping targeted design tools onto a Web-based distributed architecture for high-performance computing

    SciTech Connect

    Holmes, V.P.; Linebarger, J.M.; Miller, D.J.; Poore, C.A.

    1999-11-30

    Design Tools use a Web-based Java interface to guide a product designer through the design-to-analysis cycle for a specific, well-constrained design problem. When these Design Tools are mapped onto a Web-based distributed architecture for high-performance computing, the result is a family of Distributed Design Tools (DDTs). The software components that enable this mapping consist of a Task Sequencer, a generic Script Execution Service, and the storage of both data and metadata in an active, object-oriented database called the Product Database Operator (PDO). The benefits of DDTs include improved security, reliability, scalability (in both problem size and computing hardware), robustness, and reusability. In addition, access to the PDO unlocks its wide range of services for distributed components, such as lookup and launch capability, persistent shared memory for communication between cooperating services, state management, event notification, and archival of design-to-analysis session data.

  19. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    SciTech Connect

    Grout, Ray

    2013-09-17

IHT is a toolkit for computing radiative heat exchange between particles. The algorithm is based on the 'Photon Monte Carlo' approach described by Wang and Modest and implemented as a library that can be interfaced with a variety of CFD codes to analyze radiative heat transfer in particle-laden flows.

  20. Computer Assisted Reading Instruction: New Tools for New Experiences.

    ERIC Educational Resources Information Center

    Sponder, Barry

    A Language Experience Approach (LEA) to reading is based on the premise that a child's thinking naturally leads to talking, writing, and eventually reading. Information technologies offer powerful support for learning, but teachers and parents must learn to use these technologies effectively. Three types of computer applications that are…

  1. Computer Vision Tools for Finding Images and Video Sequences.

    ERIC Educational Resources Information Center

    Forsyth, D. A.

    1999-01-01

    Computer vision offers a variety of techniques for searching for pictures in large collections of images. Appearance methods compare images based on the overall content of the image using certain criteria. Finding methods concentrate on matching subparts of images, defined in a variety of ways, in hope of finding particular objects. These ideas…

  2. Coordinated Computer-Supported Collaborative Learning: Awareness and Awareness Tools

    ERIC Educational Resources Information Center

    Janssen, Jeroen; Bodemer, Daniel

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members' activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to cognitive group awareness (e.g., information…

  3. Integrated computational materials engineering: Tools, simulations and new applications

    DOE PAGES

    Madison, Jonathan D.

    2016-03-30

    Here, Integrated Computational Materials Engineering (ICME) is a relatively new methodology full of tremendous potential to revolutionize how science, engineering and manufacturing work together. ICME was motivated by the desire to derive greater understanding throughout each portion of the development life cycle of materials, while simultaneously reducing the time between discovery to implementation [1,2].

  4. Countering Deterministic Tools: A Critical Theory Approach to Computers & Composition.

    ERIC Educational Resources Information Center

    Kimme Hea, Amy C.

    A writing instructor has grappled with how both to integrate and to complicate critical perspectives on technology in the writing classroom. In collaboration with another instructor, a computer classroom pedagogy was constructed emphasizing imperatives of cultural studies practice as outlined by James Berlin. The pedagogy is similar to Berlin's…

  5. Computer Generated Optical Illusions: A Teaching and Research Tool.

    ERIC Educational Resources Information Center

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  6. Computational study of the reactions of methanol with the hydroperoxyl and methyl radicals. 2. Accurate thermal rate constants.

    PubMed

    Alecu, I M; Truhlar, Donald G

    2011-12-29

    Multistructural canonical variational-transition-state theory with multidimensional tunneling (MS-CVT/MT) is employed to calculate thermal rate constants for the abstraction of hydrogen atoms from both positions of methanol by the hydroperoxyl and methyl radicals over the temperature range 100-3000 K. The M08-HX hybrid meta-generalized gradient approximation density functional and M08-HX with specific reaction parameters, both with the maug-cc-pVTZ basis set, were validated in part 1 of this study (Alecu, I. M.; Truhlar, D. G. J. Phys. Chem. A2011, 115, 2811) against highly accurate CCSDT(2)(Q)/CBS calculations for the energetics of these reactions, and they are used here to compute the properties of all stationary points and the energies, gradients, and Hessians of nonstationary points along each considered reaction path. The internal rotations in some of the transition states are found to be highly anharmonic and strongly coupled to each other, and they generate multiple structures (conformations) whose contributions are included in the partition function. It is shown that the previous estimates for these rate constants used to build kinetic models for the combustion of methanol, some of which were based on transition state theory calculations with one-dimensional tunneling corrections and harmonic-oscillator approximations or separable one-dimensional hindered rotor treatments of torsions, are appreciably different than the ones presently calculated using MS-CVT/MT. The rate constants obtained from the best MS-CVT/MT calculations carried out in this study, in which the important effects of corner cutting due to small and large reaction path curvature are captured via a microcanonical optimized multidimensional tunneling (μOMT) treatment, are recommended for future refinement of the kinetic model for methanol combustion. PMID:22059377
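
    The study itself uses MS-CVT/MT; the snippet below only illustrates the simpler conventional transition-state-theory (Eyring) expression with a multiplicative tunneling factor, i.e., the kind of estimate that the multistructural variational treatment with multidimensional tunneling refines. The barrier height and transmission coefficient are assumed numbers.

    ```python
    # Conventional transition-state-theory (Eyring) rate constant with a multiplicative
    # tunneling transmission coefficient -- an illustration of what MS-CVT/MT refines,
    # not the paper's method. Barrier height and kappa(T) are assumed.
    import numpy as np
    from scipy.constants import k as k_B, h, R

    def tst_rate(T, delta_G_act_kJ, kappa=1.0):
        """k(T) in s^-1 (unimolecular form): kappa * (k_B*T/h) * exp(-dG'/(R*T))."""
        return kappa * (k_B * T / h) * np.exp(-delta_G_act_kJ * 1e3 / (R * T))

    for T in (300.0, 1000.0, 2000.0):
        # Hypothetical free-energy barrier of 80 kJ/mol; crude tunneling factor at low T
        kappa = 1.0 + (300.0 / T) ** 2          # placeholder transmission coefficient
        print(f"T = {T:6.1f} K   k = {tst_rate(T, 80.0, kappa):.3e} s^-1")
    ```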

  7. Computational finite element bone mechanics accurately predicts mechanical competence in the human radius of an elderly population.

    PubMed

    Mueller, Thomas L; Christen, David; Sandercott, Steve; Boyd, Steven K; van Rietbergen, Bert; Eckstein, Felix; Lochmüller, Eva-Maria; Müller, Ralph; van Lenthe, G Harry

    2011-06-01

    High-resolution peripheral quantitative computed tomography (HR-pQCT) is clinically available today and provides a non-invasive measure of 3D bone geometry and micro-architecture with unprecedented detail. In combination with microarchitectural finite element (μFE) models it can be used to determine bone strength using a strain-based failure criterion. Yet, images from only a relatively small part of the radius are acquired and it is not known whether the region recommended for clinical measurements does predict forearm fracture load best. Furthermore, it is questionable whether the currently used failure criterion is optimal because of improvements in image resolution, changes in the clinically measured volume of interest, and because the failure criterion depends on the amount of bone present. Hence, we hypothesized that bone strength estimates would improve by measuring a region closer to the subchondral plate, and by defining a failure criterion that would be independent of the measured volume of interest. To answer our hypotheses, 20% of the distal forearm length from 100 cadaveric but intact human forearms was measured using HR-pQCT. μFE bone strength was analyzed for different subvolumes, as well as for the entire 20% of the distal radius length. Specifically, failure criteria were developed that provided accurate estimates of bone strength as assessed experimentally. It was shown that distal volumes were better in predicting bone strength than more proximal ones. Clinically speaking, this would argue to move the volume of interest for the HR-pQCT measurements even more distally than currently recommended by the manufacturer. Furthermore, new parameter settings using the strain-based failure criterion are presented providing better accuracy for bone strength estimates.
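
    A toy version of a commonly used strain-based failure criterion of the kind discussed (scale a linear µFE solution until a set fraction of tissue exceeds a critical effective strain); the strain distribution, thresholds and reference load are assumptions, and this is not the refined criteria developed in the paper.

    ```python
    # Toy strain-based failure criterion (Pistoia-type, illustrative only): scale a
    # linear-elastic muFE solution until a set fraction of bone tissue exceeds a
    # critical effective strain. All numbers are assumed.
    import numpy as np

    rng = np.random.default_rng(4)
    # Effective element strains from a hypothetical linear muFE run at a 1 kN reference load
    strains_ref = rng.lognormal(mean=np.log(2.0e-3), sigma=0.4, size=100_000)

    critical_strain = 7.0e-3          # 0.7 % effective strain
    critical_fraction = 0.02          # failure when 2 % of tissue exceeds it
    reference_load_kN = 1.0

    # Linearity: strains scale with load, so find the load scale at which the
    # (1 - critical_fraction) quantile of the strain field reaches the critical strain.
    q = np.quantile(strains_ref, 1.0 - critical_fraction)
    failure_load_kN = reference_load_kN * critical_strain / q
    print(f"estimated failure load: {failure_load_kN:.2f} kN")
    ```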

  8. Computer-Aided Protein Directed Evolution: a Review of Web Servers, Databases and other Computational Tools for Protein Engineering.

    PubMed

    Verma, Rajni; Schwaneberg, Ulrich; Roccatano, Danilo

    2012-01-01

The combination of computational and directed evolution methods has proven a winning strategy for protein engineering. We refer to this approach as computer-aided protein directed evolution (CAPDE) and the review summarizes the recent developments in this rapidly growing field. We restrict ourselves to an overview of the availability, usability and limitations of web servers, databases and other computational tools proposed in the last five years. The goal of this review is to provide concise information about currently available computational resources to assist the design of directed-evolution-based protein engineering experiments.

  9. Accurate determination of human serum transferrin isoforms: Exploring metal-specific isotope dilution analysis as a quantitative proteomic tool.

    PubMed

    Busto, M Estela del Castillo; Montes-Bayón, Maria; Sanz-Medel, Alfredo

    2006-12-15

Carbohydrate-deficient transferrin (CDT) measurements are considered a reliable marker for chronic alcohol consumption, and its use is becoming extensive in forensic medicine. However, CDT is not a single molecular entity but refers to a group of sialic acid-deficient transferrin isoforms from mono- to trisialotransferrin. Thus, the development of methods to analyze accurately and precisely individual transferrin isoforms in biological fluids such as serum is of increasing importance. The present work illustrates the use of ICPMS isotope dilution analysis for the quantification of transferrin isoforms once saturated with iron and separated by anion exchange chromatography (Mono Q 5/50) using a mobile phase consisting of a gradient of ammonium acetate (0-250 mM) in 25 mM Tris-acetic acid (pH 6.5). Species-specific and species-unspecific spikes have been explored. In the first part of the study, the use of postcolumn addition of a solution of 200 ng mL(-1) isotopically enriched iron (57Fe, 95%) in 25 mM sodium citrate/citric acid (pH 4) permitted the quantification of individual sialoforms of transferrin (from S2 to S5) in human serum samples of healthy individuals as well as alcoholic patients. Second, the species-specific spike method was performed by synthesizing an isotopically enriched standard of saturated transferrin (saturated with 57Fe). The characterization of the spike was performed by postcolumn reverse isotope dilution analysis (that is, by postcolumn addition of a solution of 200 ng mL(-1) natural iron in sodium citrate/citric acid of pH 4). Also, the stability of the transferrin spike was tested over one week, with negligible species transformation. Finally, the enriched transferrin was used to quantify the individual isoforms in the same serum samples, obtaining results comparable to those of postcolumn isotope dilution and to those previously published in the literature, demonstrating the suitability of both strategies for quantitative transferrin
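
    A minimal sketch of the isotope dilution arithmetic behind the method: the measured 56Fe/57Fe ratio of the blend of natural iron (carried by the transferrin isoform) and the enriched 57Fe spike is solved for the amount of natural iron. The spike abundances, spike amount and measured ratios are example numbers, not the paper's data.

    ```python
    # Minimal isotope-dilution sketch (illustrative numbers only): solve the 56Fe/57Fe
    # ratio balance of a blend of natural iron and an enriched 57Fe spike for the
    # moles of natural-abundance iron contributed by the transferrin peak.
    a56_nat, a57_nat = 0.9175, 0.0212     # approximate natural Fe isotope abundances
    a56_sp,  a57_sp  = 0.04,   0.95       # assumed abundances of the enriched 57Fe spike

    def natural_iron_moles(n_spike, ratio_measured):
        """Moles of natural-abundance Fe given spike moles and the measured 56/57 ratio."""
        return n_spike * (ratio_measured * a57_sp - a56_sp) / (a56_nat - ratio_measured * a57_nat)

    n_spike = 1.0e-9                      # mol of spike Fe mixed into the eluting peak
    for R in (0.5, 1.0, 2.0):             # hypothetical measured 56Fe/57Fe blend ratios
        print(f"measured 56/57 = {R:3.1f}  ->  natural Fe = {natural_iron_moles(n_spike, R):.2e} mol")
    ```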

  10. Computer aided systems human engineering: A hypermedia tool

    NASA Technical Reports Server (NTRS)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  11. Lilith: A scalable secure tool for massively parallel distributed computing

    SciTech Connect

    Armstrong, R.C.; Camp, L.J.; Evensky, D.A.; Gentile, A.C.

    1997-06-01

    Changes in high performance computing have necessitated the ability to utilize and interrogate potentially many thousands of processors. The ASCI (Advanced Strategic Computing Initiative) program conducted by the United States Department of Energy, for example, envisions thousands of distinct operating systems connected by low-latency gigabit-per-second networks. In addition, multiple systems of this kind will be linked via high-capacity networks with latencies as low as the speed of light will allow. Code which spans systems of this sort must be scalable; yet constructing such code, whether for applications, debugging, or maintenance, is an unsolved problem. Lilith is a research software platform that attempts to address this problem and meet these needs. Presently, Lilith exists as a test-bed, written in Java, for various spanning algorithms and security schemes. The test-bed software has, and enforces, hooks allowing implementation and testing of various security schemes.

  12. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  13. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  14. Present status of computational tools for maglev development

    SciTech Connect

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  15. COMPASS: A general purpose computer aided scheduling tool

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Fox, Barry; Culbert, Chris

    1991-01-01

    COMPASS is a generic scheduling system developed by McDonnell Douglas under the direction of the Software Technology Branch at JSC. COMPASS is intended to illustrate the latest advances in scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to potential NASA Space Station Freedom standards. COMPASS has some unique characteristics that distinguish it from commercial products. These characteristics are discussed and used to illustrate some differences between scheduling tools.

  16. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
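
    As a rough illustration of the kind of Markov-chain reliability analysis described above (not the authors' COSMIC-FFP-based model), the sketch below treats each component as a state in an absorbing discrete-time Markov chain with one "success" and one "failure" absorbing state, and computes the probability of ending in the success state. The transition probabilities are purely illustrative.

```python
import numpy as np

# Transition matrix over states [comp0, comp1, success, failure];
# probabilities here are purely illustrative.
P = np.array([
    [0.00, 0.95, 0.00, 0.05],   # component 0 -> component 1 or failure
    [0.00, 0.00, 0.97, 0.03],   # component 1 -> success or failure
    [0.00, 0.00, 1.00, 0.00],   # success (absorbing)
    [0.00, 0.00, 0.00, 1.00],   # failure (absorbing)
])

Q = P[:2, :2]                       # transitions among transient states
R = P[:2, 2:]                       # transient -> absorbing transitions
N = np.linalg.inv(np.eye(2) - Q)    # fundamental matrix
B = N @ R                           # absorption probabilities per start state

print("Reliability starting at component 0:", B[0, 0])   # 0.95 * 0.97 = 0.9215
```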

  17. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.
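
    A minimal sketch of the kind of allocation decision such tools evaluate (the cluster sizes, request loads, and first-fit policy below are illustrative assumptions, not the strategies studied in the paper): each incoming session needs a block of processing capacity, and the manager places it on the first cluster with enough free capacity.

```python
# Hypothetical first-fit placement of SDR transceiver sessions onto clusters.
clusters = [{"id": i, "capacity": 100.0, "used": 0.0} for i in range(4)]

def allocate(load):
    """Place a session needing `load` units on the first cluster that fits."""
    for c in clusters:
        if c["capacity"] - c["used"] >= load:
            c["used"] += load
            return c["id"]
    return None   # request blocked; a smarter algorithm might repack or queue

for session_load in [30, 55, 20, 40, 70, 25]:
    print(session_load, "->", allocate(session_load))
```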

  18. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    PubMed Central

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, analysis, and evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 (Clustered Regularly Interspaced Short Palindromic Repeats) system was shown to have the potential to also regulate gene expression at both the transcriptional and post-transcriptional levels in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of current design approaches. PMID:25566532
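
    As a small example of the kind of rule that siRNA design tools encode, the sketch below checks two commonly cited guidelines, a moderate GC content and the absence of long single-nucleotide runs, for a candidate 19-nt guide sequence. The exact thresholds are illustrative and vary between published design rules.

```python
def passes_basic_sirna_rules(guide, gc_min=0.30, gc_max=0.52, max_run=4):
    """Very small subset of commonly cited siRNA design guidelines (illustrative)."""
    guide = guide.upper()
    gc = (guide.count("G") + guide.count("C")) / len(guide)
    if not (gc_min <= gc <= gc_max):
        return False
    # Reject homopolymer runs of max_run or more identical nucleotides
    for base in "ACGU":
        if base * max_run in guide:
            return False
    return True

print(passes_basic_sirna_rules("GCAUCUGAAUCGGAUACUU"))   # True for this example
```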

  19. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using think aloud technique and video recording, we captured their computer screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found the experts' modeling processes followed the linear sequence built in the modeling program with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered, and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  20. Brain–computer interface technology as a tool to augment plasticity and outcomes for neurological rehabilitation

    PubMed Central

    Dobkin, Bruce H

    2007-01-01

    Brain–computer interfaces (BCIs) are a rehabilitation tool for tetraplegic patients that aim to improve quality of life by augmenting communication, control of the environment, and self-care. The neurobiology of both rehabilitation and BCI control depends upon learning to modify the efficacy of spared neural ensembles that represent movement, sensation and cognition through progressive practice with feedback and reward. To serve patients, BCI systems must become safe, reliable, cosmetically acceptable, quickly mastered with minimal ongoing technical support, and highly accurate even in the face of mental distractions and the uncontrolled environment beyond a laboratory. BCI technologies may raise ethical concerns if their availability affects the decisions of patients who become locked-in with brain stem stroke or amyotrophic lateral sclerosis to be sustained with ventilator support. If BCI technology becomes flexible and affordable, volitional control of cortical signals could be employed for the rehabilitation of motor and cognitive impairments in hemiplegic or paraplegic patients by offering on-line feedback about cortical activity associated with mental practice, motor intention, and other neural recruitment strategies during progressive task-oriented practice. Clinical trials with measures of quality of life will be necessary to demonstrate the value of near-term and future BCI applications. PMID:17095557

  1. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  2. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of the hand-held tool serves to control the speed of the fastener interface mechanism and the torque that this mechanism applies to fasteners, and to monitor the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.
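
    Purely as an illustrative sketch (not the patented control method), a torque-limited fastening loop of the kind such in-tool software might run could look like the following; the sensor/actuator callables `read_torque`, `read_speed`, and `set_motor_duty` are hypothetical placeholders.

```python
import time

def run_fastener(target_torque, target_speed, read_torque, read_speed,
                 set_motor_duty, timeout_s=5.0):
    """Drive the fastener until the target torque is reached, logging history.
    All hardware-facing callables are hypothetical placeholders."""
    history = []
    duty = 0.2
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        torque, speed = read_torque(), read_speed()
        history.append((time.monotonic() - start, torque, speed))
        if torque >= target_torque:
            set_motor_duty(0.0)          # stop at target torque
            return True, history
        # Simple proportional speed correction toward the target speed
        duty += 0.01 * (target_speed - speed) / max(target_speed, 1e-6)
        set_motor_duty(min(max(duty, 0.0), 1.0))
    set_motor_duty(0.0)
    return False, history                # timed out before reaching torque
```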

  3. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented, stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace of hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor-scale model, a feature-level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk describes these elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders-of-magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large-scale computational chemistry efforts are critical, as experiments alone cannot meet database needs due to the difficulties associated with such measurements.
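
    For reference, the level-set formulation mentioned above evolves the feature surface as the zero contour of a field φ; a standard statement of the evolution equation (independent of any particular simulator) is

    $$ \frac{\partial \phi}{\partial t} + F\,\lvert \nabla \phi \rvert = 0, $$

    where F is the local etch or deposition rate normal to the surface, supplied by the reactor-scale and surface-chemistry models, and the feature profile at time t is the set where φ(x, t) = 0.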

  4. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, enabled us to generate a model that overlaps the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. PMID:26427894
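
    The AAM itself is model-driven, but a much simpler intensity-based baseline conveys what "segmenting brain matter in CT" means in practice. The sketch below thresholds a slice on an approximate soft-tissue Hounsfield-unit window and keeps the largest connected component; the HU bounds are rough assumptions, and this is a crude baseline, not the authors' statistical model.

```python
import numpy as np
from scipy import ndimage as ndi

def rough_brain_mask(ct_hu, lo=0, hi=80):
    """Crude brain-parenchyma mask from a CT slice in Hounsfield units.
    The window [lo, hi] is an approximate assumption, not a validated setting."""
    soft_tissue = (ct_hu > lo) & (ct_hu < hi)      # excludes air and bone
    labels, n = ndi.label(soft_tissue)
    if n == 0:
        return soft_tissue
    sizes = ndi.sum(soft_tissue, labels, index=range(1, n + 1))
    largest = 1 + int(np.argmax(sizes))            # keep largest connected blob
    return labels == largest
```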

  5. Abacus: a computational tool for extracting and pre-processing spectral count data for label-free quantitative proteomic analysis.

    PubMed

    Fermin, Damian; Basrur, Venkatesha; Yocum, Anastasia K; Nesvizhskii, Alexey I

    2011-04-01

    We describe Abacus, a computational tool for extracting spectral counts from MS/MS data sets. The program aggregates data from multiple experiments, adjusts spectral counts to accurately account for peptides shared across multiple proteins, and performs common normalization steps. It can also output the spectral count data at the gene level, thus simplifying the integration and comparison between gene and protein expression data. Abacus is compatible with the widely used Trans-Proteomic Pipeline suite of tools and comes with a graphical user interface making it easy to interact with the program. The main aim of Abacus is to streamline the analysis of spectral count data by providing an automated, easy to use solution for extracting this information from proteomic data sets for subsequent, more sophisticated statistical analysis.
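
    One common way to "adjust spectral counts for peptides shared across multiple proteins" is to split each shared peptide's counts among its candidate proteins in proportion to their unique-peptide evidence; the sketch below shows that idea in isolation (a simplification, not Abacus's exact adjustment).

```python
from collections import defaultdict

def adjust_counts(unique_counts, shared_peptides):
    """unique_counts: {protein: unique spectral counts}
    shared_peptides: list of (count, [proteins sharing the peptide])
    Returns adjusted counts with shared counts split by unique evidence."""
    adjusted = defaultdict(float, unique_counts)
    for count, proteins in shared_peptides:
        total_unique = sum(unique_counts.get(p, 0) for p in proteins)
        for p in proteins:
            if total_unique > 0:
                adjusted[p] += count * unique_counts.get(p, 0) / total_unique
            else:
                adjusted[p] += count / len(proteins)   # no evidence: split evenly
    return dict(adjusted)

print(adjust_counts({"P1": 8, "P2": 2}, [(5, ["P1", "P2"])]))
# {'P1': 12.0, 'P2': 3.0}
```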

  6. Computational space physics as a capacity building tool

    NASA Astrophysics Data System (ADS)

    Toit Strauss, Du; Potgieter, Marius; Moeketsi, Daniel; Kopp, Andreas; Weigel, Bob

    2012-07-01

    Scientific capacity building consists of two parts: (1) Building research infrastructure (hardware) and (2) fostering the scientific know-how to use this infrastructure optimally (the software and application components). The latter is also referred to as human capital development and is the focus of this presentation. We will discuss a capacity building program for computational space physics that was successfully implemented in South Africa. We will also discuss the challenges that face such a program in developing countries and how this program can be used as a template for other developing regions.

  7. Simulation tools for computer-aided design and numerical investigations of high-power gyrotrons

    NASA Astrophysics Data System (ADS)

    Damyanova, M.; Balabanova, E.; Kern, S.; Illy, S.; Sabchevski, S.; Thumm, M.; Vasileva, E.; Zhelyazkov, I.

    2012-03-01

    Modelling and simulation are essential tools for computer-aided design (CAD), analysis and optimization of high-power gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) and current drive (ECCD) of magnetically confined plasmas in the thermonuclear reactor ITER. In this communication, we present the current status of our simulation tools and discuss their further development.

  8. Which Way Will the Wind Blow? Networked Computer Tools for Studying the Weather.

    ERIC Educational Resources Information Center

    Fishman, Barry J.; D'Amico, Laura M.

    A suite of networked computer tools within a pedagogical framework was designed to enhance earth science education at the high school level. These tools give students access to live satellite images, weather maps, and other scientific data dealing with the weather, and make it easy for students to make their own weather forecasts by creating…

  9. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    ERIC Educational Resources Information Center

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  10. An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.

    ERIC Educational Resources Information Center

    Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.

    1999-01-01

    Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)

  11. The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2008-01-01

    This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…

  12. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  13. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  14. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    EPA Science Inventory

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  15. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  16. Computational Tools for Interpreting Ion Channel pH-Dependence

    PubMed Central

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) – Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903
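
    A compact way to see the pH-coupled versus pH-sensor distinction is through the single-site titration curve; under the usual Henderson-Hasselbalch assumption, the protonated fraction of a titratable group is

    $$ f_{\mathrm{prot}}(\mathrm{pH}) \;=\; \frac{1}{1 + 10^{\,\mathrm{pH} - \mathrm{p}K_a}}, $$

    so a mutation that shifts the pKa moves the midpoint (transition pH) of the curve, whereas a change in the amplitude of the functional transition reflects how strongly the group is coupled to gating, consistent with the distinction drawn above.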

  17. Computational Tools for Interpreting Ion Channel pH-Dependence.

    PubMed

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) - Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903

  18. Computational tool for morphological analysis of cultured neonatal rat cardiomyocytes.

    PubMed

    Leite, Maria Ruth C R; Cestari, Idágene A; Cestari, Ismar N

    2015-08-01

    This study describes the development and evaluation of a semiautomatic myocyte edge-detector using digital image processing. The algorithm was developed in Matlab 6.0 using the SDC Morphology Toolbox. Its conceptual basis is mathematical morphology theory together with the watershed and Euclidean distance transformations. The algorithm enables the user to select cells within an image for automatic detection of their borders and calculation of their surface areas; these areas are determined by summing the pixels within each myocyte's boundaries. The algorithm was applied to images of cultured ventricular myocytes from neonatal rats. The edge-detector allowed the identification and quantification of morphometric alterations in cultured isolated myocytes induced by 72 hours of exposure to a hypertrophic agent (50 μM phenylephrine). There was a significant increase in the mean surface area of the phenylephrine-treated cells compared with the control cells (p<0.05), corresponding to cellular hypertrophy of approximately 50%. In conclusion, this edge-detector provides a rapid, repeatable and accurate measurement of cell surface areas in a standardized manner.
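
    The abstract's pipeline (threshold, Euclidean distance transform, watershed, pixel-count areas) can be reproduced in outline with standard open-source libraries; the sketch below uses scikit-image/SciPy rather than the authors' Matlab/SDC Morphology Toolbox implementation, and the input filename and peak-separation parameter are placeholders.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, io, measure
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Hypothetical input: a grayscale micrograph of cultured myocytes.
img = io.imread("myocytes.png", as_gray=True)
mask = img > filters.threshold_otsu(img)          # foreground = cells

# Euclidean distance transform + watershed to separate touching cells
distance = ndi.distance_transform_edt(mask)
blobs, _ = ndi.label(mask)
coords = peak_local_max(distance, labels=blobs, min_distance=10)
markers = np.zeros(distance.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
segmented = watershed(-distance, markers, mask=mask)

# Surface area of each segmented cell, in pixels
areas = [region.area for region in measure.regionprops(segmented)]
print(sorted(areas, reverse=True)[:10])
```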

  19. Computational Tools for Interpreting Ion Channel pH-Dependence.

    PubMed

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) - Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone.

  20. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc Flash Protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.

  1. Using Artificial Intelligence in Education: Computer-Based Tools for Instructional Development.

    ERIC Educational Resources Information Center

    Perez, Ray S.; Seidel, Robert J.

    1990-01-01

    Discussion of the use of artificial intelligence in computer-based instruction focuses on training development for the U.S. Army. Topics discussed include the Systems Approach to Training (SAT); knowledge acquisition; domain expertise; intelligent computer-assisted instruction; software tools and decision aids; and expert systems. (10 references)…

  2. An Instructor's Guide to Collaborative Writing with CECE Talk: A Computer Network Tool.

    ERIC Educational Resources Information Center

    Neuwirth, Christine M.; And Others

    Describing a computer network communication tool which allows users to communicate concurrently across networked, advanced-function workstations, this guide presents information on how to use the Center for Educational Computing in English (CECE) Talk in the writing classroom. The guide focuses on three topics: (1) introducing CECE Talk to…

  3. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    ERIC Educational Resources Information Center

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  4. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  5. Tool or Science? The History of Computing at the Norwegian University of Science and Technology

    NASA Astrophysics Data System (ADS)

    Nordal, Ola

    One may characterize the history of computing at the Norwegian University of Science and Technology by a tension between the computer as a tool in other disciplines and computer science as a discipline in itself. This tension has been latent from the pioneering period of the 1950s until today. This paper shows how this tension was expressed in the early attempts to take up computing at the University, and how it gave the Division of Computer Science a fairly rough start when it opened in 1972.

  6. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code can focus primarily on the tool functionality, thus greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  7. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  8. Bio++: efficient extensible libraries and tools for computational molecular evolution.

    PubMed

    Guéguen, Laurent; Gaillard, Sylvain; Boussau, Bastien; Gouy, Manolo; Groussin, Mathieu; Rochette, Nicolas C; Bigot, Thomas; Fournier, David; Pouyet, Fanny; Cahais, Vincent; Bernard, Aurélien; Scornavacca, Céline; Nabholz, Benoît; Haudry, Annabelle; Dachary, Loïc; Galtier, Nicolas; Belkhir, Khalid; Dutheil, Julien Y

    2013-08-01

    Efficient algorithms and programs for the analysis of the ever-growing amount of biological sequence data are strongly needed in the genomics era. The pace at which new data and methodologies are generated calls for the use of pre-existing, optimized yet extensible code, typically distributed as libraries or packages. This motivated the Bio++ project, aiming at developing a set of C++ libraries for sequence analysis, phylogenetics, population genetics, and molecular evolution. The main attractiveness of Bio++ is the extensibility and reusability of its components through its object-oriented design, without compromising the computational efficiency of the underlying methods. We present here the second major release of the libraries, which provides an extended set of classes and methods. These extensions notably provide built-in access to sequence databases and new data structures for handling and manipulating sequences from the omics era, such as multiple genome alignments and sequencing read libraries. More complex models of sequence evolution, such as mixture models and generic n-tuple alphabets, are also included.

  9. Computational tools for analysing structural changes in proteins in solution.

    PubMed

    Noé, Frank; Schwarzl, Sonja M; Fischer, Stefan; Smith, Jeremy C

    2003-01-01

    Many important structural changes in proteins involve long-time dynamics, which are outside the timescale presently accessible by a straightforward integration of Newton's equations of motion. This problem is addressed with minimisation-based algorithms, which are applied to possible reaction pathways using atomic-detail models. For reasons of efficiency, an implicit treatment of solvent is imperative. We present the charge reparameterisation protocol, a method that approximates the interaction energies obtained by a numerical solution of the Poisson-Boltzmann equation. Furthermore, we present a number of methods that can be used to compute possible reaction pathways associated with a particular conformational change. Two of them, the self-penalty walk and the nudged elastic band method, define an objective function, which is minimised to find optimal paths. A third method, conjugate peak refinement, is a heuristic method, which finds minimum energy paths without the use of an explicit objective function. Finally, we discuss problems and limitations with these methods and give a perspective on future research.
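
    To make the path-optimisation idea concrete, the nudged elastic band method mentioned above relaxes a discretised path {R_i} by moving each image under a force that keeps only the parallel component of the artificial springs and the perpendicular component of the true force (a standard formulation, not specific to this paper):

    $$ \mathbf{F}_i \;=\; k\left(\lvert \mathbf{R}_{i+1}-\mathbf{R}_i\rvert - \lvert \mathbf{R}_i-\mathbf{R}_{i-1}\rvert\right)\hat{\boldsymbol{\tau}}_i \;-\; \left[\nabla V(\mathbf{R}_i) - \left(\nabla V(\mathbf{R}_i)\cdot\hat{\boldsymbol{\tau}}_i\right)\hat{\boldsymbol{\tau}}_i\right], $$

    where τ̂_i is the local tangent to the path and k the spring constant; at convergence the images trace an approximate minimum-energy path between the two conformations.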

  10. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on the creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. The OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through the standard programs MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states) and then outputs the converted data tables in the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations at each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing data.
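
    The core conversion step, quantitative values to the two-state characters required by parsimony programs, can be sketched as follows; the "normal range" rule used here (within 2 SD of the healthy-control mean) is an illustrative assumption, not necessarily the rule OmicsTract implements.

```python
import numpy as np

def to_binary_matrix(values, control_idx, n_sd=2.0):
    """values: 2D array, rows = specimens, columns = features (genes, m/z, ...).
    Rows listed in control_idx are healthy controls defining the normal range.
    Returns a 0/1 matrix: 0 = within the control range, 1 = aberrant."""
    controls = values[control_idx]
    mean, sd = controls.mean(axis=0), controls.std(axis=0, ddof=1)
    lo, hi = mean - n_sd * sd, mean + n_sd * sd
    return ((values < lo) | (values > hi)).astype(int)

data = np.array([[10.1, 5.0], [9.8, 5.2], [10.3, 4.9], [15.7, 5.1], [9.9, 9.4]])
print(to_binary_matrix(data, control_idx=[0, 1, 2]))
```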

  11. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS collection promotes code portability, reusability, reduction of duplicate efforts, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  12. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
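
    The arithmetic behind the tool's live feedback is essentially a generation-weighted sum over the chosen technologies; the sketch below shows that calculation with made-up shares and emission factors (the real tool's data, costs, and constraints are not reproduced here).

```python
# Hypothetical portfolio: share of annual generation and CO2 intensity (t/MWh).
portfolio = {
    "wind":          {"share": 0.30, "co2_t_per_mwh": 0.01},
    "nuclear":       {"share": 0.25, "co2_t_per_mwh": 0.01},
    "natural_gas":   {"share": 0.25, "co2_t_per_mwh": 0.40},
    "coal_with_ccs": {"share": 0.20, "co2_t_per_mwh": 0.15},
}
annual_generation_mwh = 1.0e8   # illustrative total demand

assert abs(sum(t["share"] for t in portfolio.values()) - 1.0) < 1e-9
avg_intensity = sum(t["share"] * t["co2_t_per_mwh"] for t in portfolio.values())
print("Average intensity (t CO2/MWh):", avg_intensity)
print("Annual emissions (t CO2):", avg_intensity * annual_generation_mwh)
```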

  13. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  14. Development of generalized mapping tools to improve implementation of data driven computer simulations (04-ERD-083)

    SciTech Connect

    Ramirez, A; Pasyanos, M; Franz, G A

    2004-09-17

    The Stochastic Engine (SE) is a data-driven computer simulation tool for predicting the characteristics of complex systems. The SE integrates accurate simulators with the Markov chain Monte Carlo (MCMC) approach (a stochastic inverse technique) to identify alternative models that are consistent with available data and ranks these alternatives according to their probabilities. Implementation of the SE is currently cumbersome owing to the need to customize the pre-processing and processing steps that are required for a specific application. This project widens the applicability of the Stochastic Engine by generalizing some aspects of the method (i.e., model-to-data transformation types, configuration, and model representation). We have generalized several of the transformations that are necessary to match the observations to proposed models. These transformations are general enough that they are not tied to any single application. This approach provides a framework that increases the efficiency of the SE implementation. The overall goal is to reduce response time and make the approach as "plug-and-play" as possible, and it will result in the rapid accumulation of new data types for a host of both earth science and non-earth science problems. When adapting the SE approach to a specific application, various pre-processing and processing steps are typically needed to run a specific problem. Many of these steps are common to a wide variety of specific applications. Here we list and describe several data transformations that are common to a variety of subsurface inverse problems. A subset of these steps has been developed in a generalized form such that they could be used with little or no modification in a wide variety of specific applications. This work was funded by the LDRD Program (tracking number 04-ERD-083).
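
    The MCMC step at the heart of the Stochastic Engine can be illustrated with a generic random-walk Metropolis sampler: propose a perturbed model, run the forward simulator, and accept or reject based on how well the prediction matches the data. The Gaussian likelihood, step size, and toy forward model below are generic illustrations, not the SE's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis(forward, data, sigma, m0, n_steps=5000, step=0.1):
    """Random-walk Metropolis over model parameters m (illustrative)."""
    def log_like(m):
        resid = forward(m) - data
        return -0.5 * np.sum((resid / sigma) ** 2)
    m, ll = np.asarray(m0, float), log_like(m0)
    chain = []
    for _ in range(n_steps):
        m_prop = m + step * rng.standard_normal(m.shape)
        ll_prop = log_like(m_prop)
        if np.log(rng.random()) < ll_prop - ll:   # accept with prob min(1, ratio)
            m, ll = m_prop, ll_prop
        chain.append(m.copy())
    return np.array(chain)

# Toy forward model: straight line y = a*x + b observed with noise
x = np.linspace(0, 1, 20)
data = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(x.shape)
chain = metropolis(lambda m: m[0] * x + m[1], data, sigma=0.05, m0=[0.0, 0.0])
print(chain[len(chain) // 2:].mean(axis=0))   # posterior mean near [2.0, 1.0]
```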

  15. Time-Accurate, Unstructured-Mesh Navier-Stokes Computations with the Space-Time CESE Method

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2006-01-01

    Application of the newly emerged space-time conservation element solution element (CESE) method to compressible Navier-Stokes equations is studied. In contrast to Euler equation solvers, several issues such as boundary conditions, numerical dissipation, and grid stiffness warrant systematic investigations and validations. Non-reflecting boundary conditions applied at the truncated boundary are also investigated from the standpoint of acoustic wave propagation. Validations of the numerical solutions are performed by comparing with exact solutions for steady-state as well as time-accurate viscous flow problems. The test cases cover a broad speed regime for problems ranging from acoustic wave propagation to 3D hypersonic configurations. Model problems pertinent to hypersonic configurations demonstrate the effectiveness of the CESE method in treating flows with shocks, unsteady waves, and separations. Good agreement with exact solutions suggests that the space-time CESE method provides a viable alternative for time-accurate Navier-Stokes calculations of a broad range of problems.

  16. Grinding tool optimization in computer controlled grinding of SiC aspheric mirror

    NASA Astrophysics Data System (ADS)

    Song, Ci; Lu, Yi; Peng, Yanglin

    2014-11-01

    The shape, size and material of the grinding tool affect not only the machining accuracy but also the machining efficiency in computer-controlled grinding. The hardness of the SiC aspheric mirror and the misfit between the grinding tool and the workpiece further emphasize the importance of grinding tool optimization. By analyzing the misfit between the grinding tool and the aspheric optic theoretically, as well as the wear characteristics and behaviour of the grinding tool experimentally, this manuscript establishes rules for grinding tool optimization that satisfy different machining objectives. On this basis, the grinding tool was optimized for the grinding process of a SiC off-axis aspheric mirror (634 mm × 560 mm). The simulation provides a reasonable grinding tool for the off-axis aspheric grinding, and good results (a large material removal rate and effective edge-error figuring) are obtained when the optimized grinding tool is applied. Both the simulations and the experiments demonstrate the feasibility and correctness of the grinding tool optimization method.
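
    A common starting point for relating tool choice to material removal in such grinding and figuring processes is Preston's relation (quoted here as general background, not as the optimization rule derived in the paper):

    $$ \frac{\mathrm{d}z}{\mathrm{d}t} \;=\; k_p\, p\, v, $$

    where dz/dt is the local material removal rate, p the contact pressure, v the relative tool-workpiece speed, and k_p the Preston coefficient, which depends on the tool material, abrasive and workpiece; tool size and shape enter through how p and the dwell time are distributed over the aspheric surface.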

  17. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600

  18. Performance of computational tools in evaluating the functional impact of laboratory-induced amino acid mutations.

    PubMed

    Gray, Vanessa E; Kukurba, Kimberly R; Kumar, Sudhir

    2012-08-15

    Site-directed mutagenesis is frequently used by scientists to investigate the functional impact of amino acid mutations in the laboratory. Over 10,000 such laboratory-induced mutations have been reported in the UniProt database along with the outcomes of functional assays. Here, we explore the performance of state-of-the-art computational tools (Condel, PolyPhen-2 and SIFT) in correctly annotating the function-altering potential of 10,913 laboratory-induced mutations from 2372 proteins. We find that computational tools are very successful in diagnosing laboratory-induced mutations that elicit significant functional change in the laboratory (up to 92% accuracy). But, these tools consistently fail in correctly annotating laboratory-induced mutations that show no functional impact in the laboratory assays. Therefore, the overall accuracy of computational tools for laboratory-induced mutations is much lower than that observed for the naturally occurring human variants. We tested and rejected the possibilities that the preponderance of changes to alanine and the presence of multiple base-pair mutations in the laboratory were the reasons for the observed discordance between the performance of computational tools for natural and laboratory mutations. Instead, we discover that the laboratory-induced mutations occur predominately at the highly conserved positions in proteins, where the computational tools have the lowest accuracy of correct prediction for variants that do not impact function (neutral). Therefore, the comparisons of experimental-profiling results with those from computational predictions need to be sensitive to the evolutionary conservation of the positions harboring the amino acid change. PMID:22685075

  19. The Vicious Worm: a computer-based Taenia solium education tool.

    PubMed

    Johansen, Maria Vang; Trevisan, Chiara; Braae, Uffe Christian; Magnussen, Pascal; Ertel, Rebekka Lund; Mejer, Helena; Saarnak, Christopher F L

    2014-08-01

    Ignorance is a major obstacle for the effective control of diseases. To provide evidence-based knowledge about prevention and control of Taenia solium cysticercosis, we have developed a computer-based education tool: 'The Vicious Worm'. The tool targets policy makers, professionals, and laypeople, and comprises educational materials including illustrated short stories, videos, and scientific texts designed for the different target groups. We suggest that evidence-based health education is included as a specific control measure in any control programme.

  20. Scale up tools in reactive extrusion and compounding processes. Could 1D-computer modeling be helpful?

    NASA Astrophysics Data System (ADS)

    Pradel, J.-L.; David, C.; Quinebèche, S.; Blondel, P.

    2014-05-01

    Industrial scale-up (or scale-down) of compounding and reactive extrusion processes is one of the most critical R&D challenges. Indeed, most high-performance polymers are obtained through reactive compounding involving chemistry: free-radical grafting, in situ compatibilization, rheology control... but also side reactions: oxidation, branching, chain scission... As described by basic Arrhenius and kinetic laws, the competition between all chemical reactions depends on the residence time distribution and temperature. To ensure the best possible scale-up methodology, we therefore need tools to match the thermal history of the formulation along the screws from a lab-scale twin-screw extruder to an industrial one. This paper compares standard scale-up laws with the use of computer modeling software such as Ludovic®, applied to and compared with experimental data. Scaling from one compounding line to another by applying general rules (for example, constant specific mechanical energy) shows differences between experimental and computed data, and the error depends on the screw speed range. For more accurate prediction, 1D computer modeling can be used to optimize the process conditions and ensure the best scaled-up product, especially in temperature-sensitive reactive extrusion processes. When the product temperature along the screws is the key variable, Ludovic® software can help compute the temperature profile along the screws and extrapolate conditions, and even the screw profile, to industrial extruders.
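
    The scale-up argument above rests on matching the integrated thermal and kinetic history rather than a single operating number; in its simplest form, for a reaction whose rate constant follows Arrhenius' law, the conversion achieved along the screws scales as

    $$ k(T) = A\,e^{-E_a/RT}, \qquad X \;\propto\; \int_0^{t_r} k\big(T(t)\big)\,\mathrm{d}t, $$

    so two lines give comparable products only if the temperature profile T(t), weighted over the residence time distribution, produces similar integrals for the desired reaction and for the side reactions, whose different activation energies make the competition temperature-sensitive. This proportionality is a first-order sketch, not Ludovic®'s model.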

  1. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  2. Prostate cancer nodal oligometastasis accurately assessed using prostate-specific membrane antigen positron emission tomography-computed tomography and confirmed histologically following robotic-assisted lymph node dissection

    PubMed Central

    O’Kane, Dermot B.; Lawrentschuk, Nathan; Bolton, Damien M.

    2016-01-01

    We herein present a case of a 76-year-old gentleman in whom prostate-specific membrane antigen positron emission tomography-computed tomography (PSMA PET-CT) was used to accurately detect prostate cancer (PCa) pelvic lymph node (LN) metastasis in the setting of biochemical recurrence following definitive treatment for PCa. The positive PSMA PET-CT result was confirmed with histological examination of the involved pelvic LNs following pelvic LN dissection. PMID:27141207

  3. Computer-based tools for decision support at the Hanford Site

    SciTech Connect

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools that provide technical information and management decision support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure that supports the models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that much technical information is available, but it is not reaching the decision-makers in a usable form. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions are largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  5. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  6. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  7. Development of a Computational High-Throughput Tool for the Quantitative Examination of Dose-Dependent Histological Features

    PubMed Central

    Nault, Rance; Colbry, Dirk; Brandenberger, Christina; Harkema, Jack R.; Zacharewski, Timothy R.

    2015-01-01

    High-resolution digitization of histology slides facilitates the development of computational alternatives to manual quantitation of features of interest. We developed a MATLAB-based quantitative histological analysis tool (QuHAnT) for the high-throughput assessment of distinguishable histological features. QuHAnT validation was demonstrated by comparison with manual quantitation using liver sections from mice orally gavaged with sesame oil vehicle or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD; 0.001–30 µg/kg) every 4 days for 28 days, which elicits hepatic steatosis with mild fibrosis. A quality control module of QuHAnT reduced the number of quantifiable Oil Red O (ORO)-stained images from 3,123 to 2,756. Increased ORO staining was measured at 10 and 30 µg/kg TCDD with a high correlation between manual and computational volume densities (Vv), although the dynamic range of QuHAnT was 10-fold greater. Additionally, QuHAnT determined the size of each ORO vacuole, which could not be accurately quantitated by visual examination or manual point counting. PicroSirius Red quantitation demonstrated superior collagen deposition detection due to the ability to consider all images within each section. QuHAnT dramatically reduced analysis time and facilitated the comprehensive assessment of features, improving accuracy and sensitivity, and represents a complementary tool for tissue/cellular features that are difficult and tedious to assess via subjective or semiquantitative methods. PMID:25274660
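
    As a rough illustration of the kind of measurement described above, the following sketch estimates an Oil Red O (ORO) volume density (Vv) as the stained-area fraction of a thresholded RGB image and reports per-vacuole sizes. It is not QuHAnT (which is MATLAB-based); the redness threshold and the synthetic stand-in image are assumptions for illustration only.

```python
# Sketch: estimate Vv as the ORO-positive area fraction of an RGB image and
# report per-vacuole sizes. Not QuHAnT; threshold and test image are made up.
import numpy as np
from skimage import measure

def oro_volume_density(rgb, red_excess_threshold=40):
    """Return (area fraction of ORO-positive pixels, list of per-vacuole pixel counts)."""
    rgb = rgb.astype(float)
    red_excess = rgb[..., 0] - rgb[..., 1:].mean(axis=-1)   # redness of each pixel
    mask = red_excess > red_excess_threshold                # crude ORO-positive mask
    vv = mask.mean()                                        # area fraction ~ volume density
    labels = measure.label(mask)
    sizes = [region.area for region in measure.regionprops(labels)]
    return vv, sizes

# Synthetic stand-in for a section image: pale background with a few red "vacuoles".
rng = np.random.default_rng(0)
img = np.full((256, 256, 3), 200.0) + rng.normal(0, 5, (256, 256, 3))
yy, xx = np.ogrid[:256, :256]
for cy, cx, rad in [(60, 60, 10), (150, 120, 6), (200, 220, 14)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= rad ** 2] = (220.0, 60.0, 60.0)

vv, sizes = oro_volume_density(img)
print(f"Vv = {vv:.4f}, vacuoles = {len(sizes)}, sizes (px) = {sorted(sizes)}")
```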

  8. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration.

  9. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  10. Examining the effects of computational tools on students' understanding of thermodynamics of material concepts and representations

    NASA Astrophysics Data System (ADS)

    Ogunwuyi, Oluwatosin

    Technology is becoming a more critical agent for supporting learning as well as research in science and engineering. In particular, technology-based tools in the form of simulations and virtual environments support learning using mathematical models and computational methods. The purpose of this research is to: (a) measure the value added in conveying thermodynamics of materials concepts with a blended learning environment using computational simulation tools with lectures; and (b) characterize students' use of representational forms to convey their conceptual understanding of core concepts within a learning environment that blended the Gibbs computational resource and traditional lectures. A mixed-methods approach was implemented that included the use of statistical analysis to compare student test performance as a result of interacting with the Gibbs tool and the use of Grounded Theory inductive analysis to explore students' use of representational forms to express their understanding of thermodynamics of materials concepts. Results for the quantitative study revealed positive gains in students' conceptual understanding before and after interacting with the Gibbs tool for the majority of the concepts tested. In addition, insight gained from the qualitative analysis helped provide understanding about how students utilized representational forms in communicating their understanding of thermodynamics of materials concepts. Knowledge of how novice students construct meaning in this context will provide insight for engineering education instructors and researchers in understanding students' learning processes in the context of educational environments that integrate expert simulation tools as part of their instructional resources for foundational domain knowledge.

  12. A visualization tool for parallel and distributed computing using the Lilith framework

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Wyckoff, P.

    1998-05-01

    The authors present a visualization tool, called Lilith Lights, for the monitoring and debugging of codes run in a parallel and distributed computing environment. This tool can be used both for debugging parallel codes and for resource management of clusters. It was developed under Lilith, a framework for creating scalable software tools for distributed computing. The use of Lilith provides scalable, non-invasive debugging, as opposed to other commonly used software debugging and visualization tools. Furthermore, by implementing the visualization tool in software rather than in hardware (as available on some MPPs), Lilith Lights is easily transferable to other machines and well adapted for use on distributed clusters of machines. The information provided in a clustered environment can further be used for resource management of the cluster. In this paper, the authors introduce Lilith Lights, discuss its use on the Computational Plant cluster at Sandia National Laboratories, show its design and development under the Lilith framework, and present metrics for resource use and performance.

  13. Comparison of functional MRI image realignment tools using a computer-generated phantom.

    PubMed

    Morgan, V L; Pickens, D R; Hartmann, S L; Price, R R

    2001-09-01

    This study discusses the development of a computer-generated phantom to compare the effects of image realignment programs on functional MRI (fMRI) pixel activation. The phantom is a whole-head MRI volume with added random noise, activation, and motion. It allows simulation of realistic head motions with controlled areas of activation. Without motion, the phantom shows the effects of realignment on motion-free data sets. Prior to realignment, the phantom illustrates some activation corruption due to motion. Finally, three widely used realignment packages are examined. The results showed that the most accurate algorithms are able to increase specificity through accurate realignment while maintaining sensitivity through effective resampling techniques. In fact, accurate realignment alone is not a powerful indicator of the most effective algorithm in terms of true activation.

  14. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    ERIC Educational Resources Information Center

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…

  15. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-aided process engineering has become established in industry as a design tool, particularly with the establishment of the CAPE-OPEN software specifications for process simulation environments. CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  16. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses, as they should be effective in increasing student knowledge and positively impacting motivation and learning strategies without increasing costs. This…

  17. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    ERIC Educational Resources Information Center

    Kees, Martin; Schor, Clifton

    1981-01-01

    An inexpensive computer graphics system (Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating various anomalies of binocular vision with graphical and statistical techniques and for treating them with lenses, prisms, and orthoptics. (MLW)

  18. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a base of videos whose aim is to support the development of tools to support genuine social inclusion of the deaf.

  19. Utilizing Computer-Mediated Communication Tools for Problem-Based Learning

    ERIC Educational Resources Information Center

    Lo, Hao-Chang

    2009-01-01

    This study aims to strategically use computer-mediated communication (CMC) tools to build online communication environments for problem-based learning (PBL). A six-stage process was proposed for online PBL learning in this study: 1) identifying the problem, 2) brainstorming, 3) collecting and analyzing information, 4) synthesizing information, 5)…

  20. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  1. An Evaluation of the Webquest as a Computer-Based Learning Tool

    ERIC Educational Resources Information Center

    Hassanien, Ahmed

    2006-01-01

    This paper explores the preparation and use of an internet activity for undergraduate learners in higher education (HE). It evaluates the effectiveness of using webquest as a computer-based learning (CBL) tool to support students to learn in HE. The evaluation undertaken offers insights into learner perceptions concerning the ease of use of the…

  2. On the accurate direct computation of the isothermal compressibility for normal quantum simple fluids: application to quantum hard spheres.

    PubMed

    Sesé, Luis M

    2012-06-28

    A systematic study of the direct computation of the isothermal compressibility of normal quantum fluids is presented by analyzing the solving of the Ornstein-Zernike integral (OZ2) equation for the pair correlations between the path-integral necklace centroids. A number of issues related to the accuracy that can be achieved via this sort of procedure have been addressed, paying particular attention to the finite-N effects and to the definition of significant error bars for the estimates of isothermal compressibilities. Extensive path-integral Monte Carlo computations for the quantum hard-sphere fluid (QHS) have been performed in the (N, V, T) ensemble under temperature and density conditions for which dispersion effects dominate the quantum behavior. These computations have served to obtain the centroid correlations, which have been processed further via the numerical solving of the OZ2 equation. To do so, Baxter-Dixon-Hutchinson's variational procedure, complemented with Baumketner-Hiwatari's grand-canonical corrections, has been used. The virial equation of state has also been obtained and several comparisons between different versions of the QHS equation of state have been made. The results show the reliability of the procedure based on isothermal compressibilities discussed herein, which can then be regarded as a useful and quick means of obtaining the equation of state for fluids under quantum conditions involving strong repulsive interactions.
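
    For context, the quantity at stake can be obtained from the centroid pair correlation function through the standard compressibility relation rho*kB*T*kappa_T = 1 + rho * integral of 4*pi*r^2 [g(r) - 1] dr. The sketch below evaluates this integral for a made-up g(r); it is a generic illustration only, not the paper's OZ2/variational procedure or its grand-canonical corrections.

```python
# Generic compressibility-route sketch: reduced isothermal compressibility from a
# tabulated pair correlation function g(r). The g(r) used here is a placeholder.
import numpy as np

def reduced_compressibility(r, g, rho):
    """Return rho*kB*T*kappa_T from g(r) at number density rho."""
    h = g - 1.0
    integral = np.trapz(4.0 * np.pi * r**2 * h, r)
    return 1.0 + rho * integral

r = np.linspace(0.01, 8.0, 2000)     # radial grid in units of the sphere diameter
g = np.where(r < 1.0, 0.0,           # crude hard-core + damped oscillations
             1.0 + 0.3 * np.exp(-(r - 1.0)) * np.cos(6.0 * (r - 1.0)))
print(reduced_compressibility(r, g, rho=0.6))
```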

  3. High order accurate and low dissipation method for unsteady compressible viscous flow computation on helicopter rotor in forward flight

    NASA Astrophysics Data System (ADS)

    Xu, Li; Weng, Peifen

    2014-02-01

    An improved fifth-order weighted essentially non-oscillatory (WENO-Z) scheme combined with the moving overset grid technique has been developed to compute unsteady compressible viscous flows over a helicopter rotor in forward flight. To handle periodic rotation and pitching of the rotor and the relative motion between rotor blades, the moving overset grid technique is extended, and a special judgement criterion is introduced near the odd surface of the blade grid when searching for donor cells with the Inverse Map method. The WENO-Z scheme is adopted for reconstructing the left and right state values, with the Roe Riemann solver updating the inviscid fluxes, and is compared with the monotone upwind scheme for scalar conservation laws (MUSCL) and the classical WENO scheme. Since the WENO schemes require a six-point stencil to build the fifth-order flux, a method of three layers of fringes for hole boundaries and artificial external boundaries is proposed to carry out flow-information exchange between the chimera grids. The time advance of the unsteady solution is performed by a fully implicit dual time stepping method with Newton-type LU-SGS subiteration, where the solutions of the pseudo-steady computation are used as the initial fields of the unsteady flow computation. Numerical results for a fixed-pitch rotor and a periodically varying-pitch rotor in forward flight show that the approach effectively captures the vortex wake with low dissipation and reaches periodic solutions quickly.
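
    For reference, the sketch below shows the core of a fifth-order WENO-Z reconstruction for a single scalar stencil, using the Jiang-Shu smoothness indicators and the tau_5 global indicator of Borges et al. It is a generic illustration of the scheme named above, not the paper's rotor-flow solver.

```python
# Generic fifth-order WENO-Z reconstruction of the left-biased value at x_{i+1/2}
# from the five cell averages v[i-2..i+2] of a scalar quantity.
import numpy as np

def weno_z(v, eps=1e-40, q=2):
    vm2, vm1, v0, vp1, vp2 = v
    # Candidate third-order reconstructions on the three sub-stencils
    p0 = (2*vm2 - 7*vm1 + 11*v0) / 6.0
    p1 = (-vm1 + 5*v0 + 2*vp1) / 6.0
    p2 = (2*v0 + 5*vp1 - vp2) / 6.0
    # Jiang-Shu smoothness indicators
    b0 = 13/12*(vm2 - 2*vm1 + v0)**2 + 0.25*(vm2 - 4*vm1 + 3*v0)**2
    b1 = 13/12*(vm1 - 2*v0 + vp1)**2 + 0.25*(vm1 - vp1)**2
    b2 = 13/12*(v0 - 2*vp1 + vp2)**2 + 0.25*(3*v0 - 4*vp1 + vp2)**2
    tau5 = abs(b0 - b2)                      # WENO-Z global smoothness measure
    d = np.array([0.1, 0.6, 0.3])            # ideal (linear) weights
    alpha = d * (1.0 + (tau5 / (np.array([b0, b1, b2]) + eps))**q)
    w = alpha / alpha.sum()
    return w @ np.array([p0, p1, p2])

print(weno_z(np.array([1.0, 1.0, 1.0, 0.0, 0.0])))  # stencil containing a jump
```

    In a real flow solver this reconstruction would typically be applied to characteristic or component-wise flux variables at every interface, but the weighting logic is the same.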

  4. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  5. InteractoMIX: a suite of computational tools to exploit interactomes in biological and clinical research.

    PubMed

    Poglayen, Daniel; Marín-López, Manuel Alejandro; Bonet, Jaume; Fornes, Oriol; Garcia-Garcia, Javier; Planas-Iglesias, Joan; Segura, Joan; Oliva, Baldo; Fernandez-Fuentes, Narcis

    2016-06-15

    Virtually all the biological processes that occur inside or outside cells are mediated by protein-protein interactions (PPIs). Hence, the charting and description of the PPI network, initially in organisms, the interactome, but more recently in specific tissues, is essential to fully understand cellular processes both in health and disease. The study of PPIs is also at the heart of renewed efforts in the medical and biotechnological arena in the quest for new therapeutic targets and drugs. Here, we present a mini review of 11 computational tools and resources developed by us to address different aspects of PPIs: from the interactome level to their atomic 3D structural details. We provide details on each specific resource, its aims and purpose, and compare it with equivalent tools in the literature. All the tools are presented in a centralized, one-stop web site: InteractoMIX (http://interactomix.com). PMID:27284060

  6. The Berlin Brain--Computer Interface: accurate performance from first-session in BCI-naïve subjects.

    PubMed

    Blankertz, Benjamin; Losch, Florian; Krauledat, Matthias; Dornhege, Guido; Curio, Gabriel; Müller, Klaus-Robert

    2008-10-01

    The Berlin Brain--Computer Interface (BBCI) project develops a noninvasive BCI system whose key features are: 1) the use of well-established motor competences as control paradigms; 2) high-dimensional features from multichannel EEG; and 3) advanced machine-learning techniques. Spatio-spectral changes of sensorimotor rhythms are used to discriminate imagined movements (left hand, right hand, and foot). A previous feedback study [M. Krauledat, K.-R. Müller, and G. Curio. (2007) The non-invasive Berlin brain--computer Interface: Fast acquisition of effective performance in untrained subjects. NeuroImage. [Online]. 37(2), pp. 539--550. Available: http://dx.doi.org/10.1016/j.neuroimage.2007.01.051] with ten subjects provided preliminary evidence that the BBCI system can be operated at high accuracy for subjects with less than five prior BCI exposures. Here, we demonstrate in a group of 14 fully BCI-naïve subjects that 8 out of 14 BCI novices can perform at >84% accuracy in their very first BCI session, and a further four subjects at >70%. Thus, 12 out of 14 BCI-novices had significant above-chance level performances without any subject training even in the first session, as based on an optimized EEG analysis by advanced machine-learning algorithms. PMID:18838371
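
    As an aside, whether a given accuracy is "above chance" in a two-class BCI task can be checked with a one-sided binomial test against the 50% chance level, as sketched below. The number of feedback trials is an assumption for illustration and is not taken from the study.

```python
# One-sided binomial test of classification accuracy against the 50% chance level.
# The trial count is an assumed illustrative value.
from scipy.stats import binomtest

n_trials = 100                      # assumed number of feedback trials per subject
for accuracy in (0.70, 0.84):
    k = round(accuracy * n_trials)
    p = binomtest(k, n_trials, p=0.5, alternative="greater").pvalue
    print(f"accuracy {accuracy:.0%}: one-sided p = {p:.2e}")
```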

  7. Extending peripersonal space representation without tool-use: evidence from a combined behavioral-computational approach.

    PubMed

    Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa

    2015-01-01

    Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biologically plausible model to explain plasticity in PPS representation after tool-use, which is

  9. Computational study of the reactions of methanol with the hydroperoxyl and methyl radicals. 1. Accurate thermochemistry and barrier heights.

    PubMed

    Alecu, I M; Truhlar, Donald G

    2011-04-01

    The reactions of CH(3)OH with the HO(2) and CH(3) radicals are important in the combustion of methanol and are prototypes for reactions of heavier alcohols in biofuels. The reaction energies and barrier heights for these reaction systems are computed with CCSD(T) theory extrapolated to the complete basis set limit using correlation-consistent basis sets, both augmented and unaugmented, and further refined by including a fully coupled treatment of the connected triple excitations, a second-order perturbative treatment of quadruple excitations (by CCSDT(2)(Q)), core-valence corrections, and scalar relativistic effects. It is shown that the M08-HX and M08-SO hybrid meta-GGA density functionals can achieve sub-kcal mol(-1) agreement with the high-level ab initio results, identifying these functionals as important potential candidates for direct dynamics studies on the rates of these and homologous reaction systems. PMID:21405059
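
    For readers unfamiliar with basis-set extrapolation, the sketch below shows a generic two-point inverse-cubic extrapolation of correlation energies to the complete basis set limit. The specific formula and the energies used are illustrative assumptions, not the paper's protocol or values, and the paper's further refinements (higher excitations, core-valence and relativistic corrections) are beyond this sketch.

```python
# Generic two-point CBS extrapolation assuming E_corr(n) = E_CBS + A / n**3.
def cbs_two_point(e_x, e_y, x, y):
    """Extrapolate correlation energies e_x, e_y with basis cardinal numbers x < y."""
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

# Hypothetical correlation energies (hartree) for cc-pVTZ (x=3) and cc-pVQZ (y=4).
print(cbs_two_point(-0.5234, -0.5391, 3, 4))
```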

  10. Blast-induced biomechanical loading of the rat: an experimental and anatomically accurate computational blast injury model.

    PubMed

    Sundaramurthy, Aravind; Alai, Aaron; Ganpule, Shailesh; Holmberg, Aaron; Plougonven, Erwan; Chandra, Namas

    2012-09-01

    Blast waves generated by improvised explosive devices (IEDs) cause traumatic brain injury (TBI) in soldiers and civilians. In vivo animal models that use shock tubes are extensively used in laboratories to simulate field conditions, to identify mechanisms of injury, and to develop injury thresholds. In this article, we place rats in different locations along the length of the shock tube (i.e., inside, outside, and near the exit), to examine the role of animal placement location (APL) in the biomechanical load experienced by the animal. We found that the biomechanical load on the brain and internal organs in the thoracic cavity (lungs and heart) varied significantly depending on the APL. When the specimen is positioned outside, organs in the thoracic cavity experience a higher pressure for a longer duration, in contrast to APL inside the shock tube. This in turn will possibly alter the injury type, severity, and lethality. We found that the optimal APL is where the Friedlander waveform is first formed inside the shock tube. Once the optimal APL was determined, the effect of the incident blast intensity on the surface and intracranial pressure was measured and analyzed. Noticeably, surface and intracranial pressure increases linearly with the incident peak overpressures, though surface pressures are significantly higher than the other two. Further, we developed and validated an anatomically accurate finite element model of the rat head. With this model, we determined that the main pathway of pressure transmission to the brain was through the skull and not through the snout; however, the snout plays a secondary role in diffracting the incoming blast wave towards the skull.

  12. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information-processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For accessibility, we adopted SSL-VPN (Secure Sockets Layer virtual private network) technology for access beyond firewalls. For security, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism, set a fine-grained access control policy for shared tools and data, and used shared-key encryption to protect tools and data against leakage to third parties. For usability, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. Using WebDAV (Web-based Distributed Authoring and Versioning), users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on the Grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA, and the prototype system was applied for trial use in the first period of GNEP.

  13. Lilith: A Java framework for the development of scalable tools for high performance distributed computing platforms

    SciTech Connect

    Evensky, D.A.; Gentile, A.C.; Armstrong, R.C.

    1998-03-19

    Increasingly, high performance computing constitutes the use of very large heterogeneous clusters of machines. The use and maintenance of such clusters are subject to complexities of communication between the machines in a time-efficient and secure manner. Lilith is a general purpose tool that provides a highly scalable, secure, and easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. Lilith is written in Java, taking advantage of Java's unique features of loading and distributing code dynamically, its platform independence, its thread support, and its provision of graphical components to facilitate easy-to-use resultant tools. The authors describe the use of Lilith in a tool developed for the maintenance of the large distributed cluster at their institution and present details of the Lilith architecture and user API for the general user development of scalable tools.

  14. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way of evaluating likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network-based methodology for modelling adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
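
    A minimal sketch of the path-evaluation idea described above: along one adversary path, interruption requires detection at an element early enough that the delay remaining downstream still exceeds the response time. The detection probabilities, delays, and response time below are illustrative assumptions, not values or logic from the authors' tool, and timing variability is deliberately ignored.

```python
# Simplified probability-of-interruption calculation along a single adversary path.
def probability_of_interruption(elements, response_time):
    """elements: list of (p_detect, delay_seconds) in path order."""
    remaining_after = [sum(d for _, d in elements[i + 1:]) for i in range(len(elements))]
    p_no_prior_detection, p_interrupt = 1.0, 0.0
    for (p_det, _), remaining in zip(elements, remaining_after):
        if remaining >= response_time:            # detection here is still timely
            p_interrupt += p_no_prior_detection * p_det
        p_no_prior_detection *= (1.0 - p_det)
    return p_interrupt

path = [(0.9, 10), (0.5, 90), (0.3, 240), (0.2, 60)]  # (P_D, delay s) per element
print(probability_of_interruption(path, response_time=300))
```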

  15. MegaMiner: A Tool for Lead Identification Through Text Mining Using Chemoinformatics Tools and Cloud Computing Environment.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu

    2015-01-01

    Virtual screening is an indispensable tool to cope with the massive amount of data being tossed by the high throughput omics technologies. With the objective of enhancing the automation capability of virtual screening process a robust portal termed MegaMiner has been built using the cloud computing platform wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like and docking scores. Textual chemical structural data representation is fraught with ambiguity in the absence of a global identifier. We have used a combination of statistical models, chemical dictionary and regular expression for building a disease specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria has been carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool for not only identifying hidden relationships between various biological and chemical entities but also for building better corpus and ontologies. PMID:26138567
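
    The dictionary-plus-regular-expression matching step mentioned above can be sketched as follows; the dictionary entries and the example sentence are made up, and this is not MegaMiner's actual implementation.

```python
# Toy dictionary-based term tagging with a single compiled regular expression.
import re

dictionary = {"malaria", "plasmodium falciparum", "artemisinin", "chloroquine"}
# Longer phrases first so multi-word terms win over their single-word substrings.
pattern = re.compile(
    r"\b(" + "|".join(map(re.escape, sorted(dictionary, key=len, reverse=True))) + r")\b",
    flags=re.IGNORECASE,
)

text = ("Resistance of Plasmodium falciparum malaria to chloroquine has renewed "
        "interest in artemisinin-based scaffolds.")
print([m.group(0) for m in pattern.finditer(text)])
```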

  17. Computational neurobiology is a useful tool in translational neurology: the example of ataxia

    PubMed Central

    Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.

    2014-01-01

    Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice, and after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585

  18. Video analysis of projectile motion using tablet computers as experimental tools

    NASA Astrophysics Data System (ADS)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.

  19. Noncontrast computed tomography can predict the outcome of shockwave lithotripsy via accurate stone measurement and abdominal fat distribution determination.

    PubMed

    Geng, Jiun-Hung; Tu, Hung-Pin; Shih, Paul Ming-Chen; Shen, Jung-Tsung; Jang, Mei-Yu; Wu, Wen-Jen; Li, Ching-Chia; Chou, Yii-Her; Juan, Yung-Shun

    2015-01-01

    Urolithiasis is a common disease of the urinary system. Extracorporeal shockwave lithotripsy (SWL) has become one of the standard treatments for renal and ureteral stones; however, the success rates range widely and failure of stone disintegration may cause additional outlay, alternative procedures, and even complications. We used the data available from noncontrast abdominal computed tomography (NCCT) to evaluate the impact of stone parameters and abdominal fat distribution on calculus-free rates following SWL. We retrospectively reviewed 328 patients who had urinary stones and had undergone SWL from August 2012 to August 2013. All of them received pre-SWL NCCT; 1 month after SWL, radiography was arranged to evaluate the condition of the fragments. These patients were classified into stone-free group and residual stone group. Unenhanced computed tomography variables, including stone attenuation, abdominal fat area, and skin-to-stone distance (SSD) were analyzed. In all, 197 (60%) were classified as stone-free and 132 (40%) as having residual stone. The mean ages were 49.35 ± 13.22 years and 55.32 ± 13.52 years, respectively. On univariate analysis, age, stone size, stone surface area, stone attenuation, SSD, total fat area (TFA), abdominal circumference, serum creatinine, and the severity of hydronephrosis revealed statistical significance between these two groups. From multivariate logistic regression analysis, the independent parameters impacting SWL outcomes were stone size, stone attenuation, TFA, and serum creatinine. [Adjusted odds ratios and (95% confidence intervals): 9.49 (3.72-24.20), 2.25 (1.22-4.14), 2.20 (1.10-4.40), and 2.89 (1.35-6.21) respectively, all p < 0.05]. In the present study, stone size, stone attenuation, TFA and serum creatinine were four independent predictors for stone-free rates after SWL. These findings suggest that pretreatment NCCT may predict the outcomes after SWL. Consequently, we can use these predictors for selecting
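
    The multivariate step reported above amounts to a logistic regression of stone-free status on the four independent predictors, with adjusted odds ratios obtained by exponentiating the fitted coefficients. The sketch below shows that calculation on synthetic data; the DataFrame, column names, and effect sizes are assumptions, and no patient data are reproduced.

```python
# Logistic regression with adjusted odds ratios and 95% CIs, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "stone_size_mm": rng.normal(8, 3, n),
    "attenuation_hu": rng.normal(900, 250, n),
    "total_fat_area_cm2": rng.normal(250, 80, n),
    "creatinine_mg_dl": rng.normal(1.0, 0.3, n),
})
# Synthetic outcome: larger and denser stones are less likely to be stone-free.
logit = -(0.25 * (df.stone_size_mm - 8) + 0.003 * (df.attenuation_hu - 900))
df["stone_free"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["stone_size_mm", "attenuation_hu",
                        "total_fat_area_cm2", "creatinine_mg_dl"]])
fit = sm.Logit(df["stone_free"], X).fit(disp=0)
table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
table.columns = ["adjusted OR", "2.5%", "97.5%"]
print(table)
```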

  20. Computational Study of the Reactions of Methanol with the Hydroperoxyl and Methyl Radicals. Part I: Accurate Thermochemistry and Barrier Heights

    SciTech Connect

    Alecu, I. M.; Truhlar, D. G.

    2011-04-07

    The reactions of CH3OH with the HO2 and CH3 radicals are important in the combustion of methanol and are prototypes for reactions of heavier alcohols in biofuels. The reaction energies and barrier heights for these reaction systems are computed with CCSD(T) theory extrapolated to the complete basis set limit using correlation-consistent basis sets, both augmented and unaugmented, and further refined by including a fully coupled treatment of the connected triple excitations, a second-order perturbative treatment of quadruple excitations (by CCSDT(2)Q), core–valence corrections, and scalar relativistic effects. It is shown that the M08-HX and M08-SO hybrid meta-GGA density functionals can achieve sub-kcal mol-1 agreement with the high-level ab initio results, identifying these functionals as important potential candidates for direct dynamics studies on the rates of these and homologous reaction systems.

  1. An efficient and accurate technique to compute the absorption, emission, and transmission of radiation by the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Lindner, Bernhard Lee; Ackerman, Thomas P.; Pollack, James B.

    1990-01-01

    CO2 comprises 95 pct. of the composition of the Martian atmosphere. However, the Martian atmosphere also has a high aerosol content, with dust particle sizes varying from less than 0.2 to greater than 3.0 microns. CO2 is an active absorber and emitter at near-IR and IR wavelengths; the near-IR absorption bands of CO2 provide significant heating of the atmosphere, and the 15 micron band provides rapid cooling. Including both CO2 and aerosol radiative transfer simultaneously in a model is difficult: aerosol radiative transfer requires a multiple-scattering code, while CO2 radiative transfer must deal with complex wavelength structure. As an alternative to the pure-atmosphere treatment used in most models, which causes inaccuracies, a treatment called the exponential sum or k-distribution approximation was developed. The chief advantage of the exponential-sum approach is that the integration over k-space of f(k) can be computed more quickly than the integration of k_ν over frequency. The exponential-sum approach is superior to the photon path distribution and emissivity techniques for dusty conditions. This study was the first application of the exponential-sum approach to Martian conditions.
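
    The exponential-sum (k-distribution) idea is that a band transmittance over absorber amount u is approximated by a short weighted sum of grey terms, T(u) ≈ sum_i w_i exp(-k_i u), each of which can be handed to a multiple-scattering code. The sketch below uses made-up k_i and w_i values purely for illustration; they are not a fitted CO2 band model.

```python
# Toy exponential-sum (k-distribution) band transmittance.
import numpy as np

k = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0])   # grey absorption coefficients (per unit u)
w = np.array([0.40, 0.30, 0.18, 0.09, 0.03])  # quadrature weights, summing to 1

def band_transmittance(u):
    return np.sum(w * np.exp(-k * u))

for u in (0.1, 1.0, 10.0):                    # absorber amounts along a path
    print(u, round(band_transmittance(u), 4))
```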

  2. Coronary Computed Tomographic Angiography Does Not Accurately Predict the Need of Coronary Revascularization in Patients with Stable Angina

    PubMed Central

    Hong, Sung-Jin; Her, Ae-Young; Suh, Yongsung; Won, Hoyoun; Cho, Deok-Kyu; Cho, Yun-Hyeong; Yoon, Young-Won; Lee, Kyounghoon; Kang, Woong Chol; Kim, Yong Hoon; Kim, Sang-Wook; Shin, Dong-Ho; Kim, Jung-Sun; Kim, Byeong-Keuk; Ko, Young-Guk; Choi, Byoung-Wook; Choi, Donghoon; Jang, Yangsoo

    2016-01-01

    Purpose To evaluate the ability of coronary computed tomographic angiography (CCTA) to predict the need of coronary revascularization in symptomatic patients with stable angina who were referred to a cardiac catheterization laboratory for coronary revascularization. Materials and Methods Pre-angiography CCTA findings were analyzed in 1846 consecutive symptomatic patients with stable angina, who were referred to a cardiac catheterization laboratory at six hospitals and were potential candidates for coronary revascularization between July 2011 and December 2013. The number of patients requiring revascularization was determined based on the severity of coronary stenosis as assessed by CCTA. This was compared to the actual number of revascularization procedures performed in the cardiac catheterization laboratory. Results Based on CCTA findings, coronary revascularization was indicated in 877 (48%) and not indicated in 969 (52%) patients. Of the 877 patients indicated for revascularization by CCTA, only 600 (68%) underwent the procedure, whereas 285 (29%) of the 969 patients not indicated for revascularization, as assessed by CCTA, underwent the procedure. When the coronary arteries were divided into 15 segments using the American Heart Association coronary tree model, the sensitivity, specificity, positive predictive value, and negative predictive value of CCTA for therapeutic decision making on a per-segment analysis were 42%, 96%, 40%, and 96%, respectively. Conclusion CCTA-based assessment of coronary stenosis severity does not sufficiently differentiate between coronary segments requiring revascularization versus those not requiring revascularization. Conventional coronary angiography should be considered to determine the need of revascularization in symptomatic patients with stable angina. PMID:27401637
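
    For clarity, the per-segment metrics quoted above follow from a 2x2 table of CCTA-based decisions versus revascularization actually performed. The sketch below shows the computation; the counts are placeholders chosen only so the output reproduces the quoted 42%/96%/40%/96%, not the study's actual per-segment numbers.

```python
# Sensitivity, specificity, PPV, and NPV from a 2x2 table of placeholder counts.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(diagnostic_metrics(tp=420, fp=630, fn=580, tn=14000))
```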

  3. Staging of osteonecrosis of the jaw requires computed tomography for accurate definition of the extent of bony disease.

    PubMed

    Bedogni, Alberto; Fedele, Stefano; Bedogni, Giorgio; Scoletta, Matteo; Favia, Gianfranco; Colella, Giuseppe; Agrillo, Alessandro; Bettini, Giordana; Di Fede, Olga; Oteri, Giacomo; Fusco, Vittorio; Gabriele, Mario; Ottolenghi, Livia; Valsecchi, Stefano; Porter, Stephen; Petruzzi, Massimo; Arduino, Paolo; D'Amato, Salvatore; Ungari, Claudio; Fung Polly, Pok-Lam; Saia, Giorgia; Campisi, Giuseppina

    2014-09-01

    Management of osteonecrosis of the jaw associated with antiresorptive agents is challenging, and outcomes are unpredictable. The severity of disease is the main guide to management, and can help to predict prognosis. Most available staging systems for osteonecrosis, including the widely-used American Association of Oral and Maxillofacial Surgeons (AAOMS) system, classify severity on the basis of clinical and radiographic findings. However, clinical inspection and radiography are limited in their ability to identify the extent of necrotic bone disease compared with computed tomography (CT). We have organised a large multicentre retrospective study (known as MISSION) to investigate the agreement between the AAOMS staging system and the extent of osteonecrosis of the jaw (focal compared with diffuse involvement of bone) as detected on CT. We studied 799 patients with detailed clinical phenotyping who had CT images taken. Features of diffuse bone disease were identified on CT within all AAOMS stages (20%, 8%, 48%, and 24% of patients in stages 0, 1, 2, and 3, respectively). Of the patients classified as stage 0, 110/192 (57%) had diffuse disease on CT, and about 1 in 3 with CT evidence of diffuse bone disease was misclassified by the AAOMS system as having stages 0 and 1 osteonecrosis. In addition, more than a third of patients with AAOMS stage 2 (142/405, 35%) had focal bone disease on CT. We conclude that the AAOMS staging system does not correctly identify the extent of bony disease in patients with osteonecrosis of the jaw.

  4. A hybrid stochastic-deterministic computational model accurately describes spatial dynamics and virus diffusion in HIV-1 growth competition assay.

    PubMed

    Immonen, Taina; Gibson, Richard; Leitner, Thomas; Miller, Melanie A; Arts, Eric J; Somersalo, Erkki; Calvetti, Daniela

    2012-11-01

    We present a new hybrid stochastic-deterministic, spatially distributed computational model to simulate growth competition assays on a relatively immobile monolayer of peripheral blood mononuclear cells (PBMCs), commonly used for determining ex vivo fitness of human immunodeficiency virus type-1 (HIV-1). The novel features of our approach include incorporation of viral diffusion through a deterministic diffusion model while simulating cellular dynamics via a stochastic Markov chain model. The model accounts for multiple infections of target cells, CD4-downregulation, and the delay between the infection of a cell and the production of new virus particles. The minimum threshold level of infection induced by a virus inoculum is determined via a series of dilution experiments, and is used to determine the probability of infection of a susceptible cell as a function of local virus density. We illustrate how this model can be used for estimating the distribution of cells infected by either a single virus type or two competing viruses. Our model captures experimentally observed variation in the fitness difference between two virus strains, and suggests a way to minimize variation and dual infection in experiments.
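
    A minimal one-dimensional sketch of the hybrid idea is given below: free virus diffuses deterministically on a grid while infection of the cells at each site is a stochastic event whose probability saturates with local virus density. All parameter values, the geometry, and the infection law are invented for illustration and are far simpler than the published model.

      import numpy as np

      rng = np.random.default_rng(0)

      # 1-D toy version of a hybrid scheme: deterministic virus diffusion plus stochastic,
      # density-dependent infection of cells. Parameters are made up.
      n, steps, dt, dx = 100, 200, 0.1, 1.0
      D, decay = 1.0, 0.02                         # virus diffusion coefficient and clearance rate
      burst, p_max, v_half = 5.0, 0.3, 2.0

      virus = np.zeros(n); virus[n // 2] = 50.0    # initial inoculum in the centre
      state = np.zeros(n, dtype=int)               # 0 = susceptible, 1 = infected

      for _ in range(steps):
          # Deterministic part: explicit finite-difference diffusion plus clearance.
          lap = (np.roll(virus, 1) - 2 * virus + np.roll(virus, -1)) / dx**2
          virus = np.clip(virus + dt * (D * lap - decay * virus), 0.0, None)
          virus += burst * dt * (state == 1)       # infected sites shed new virus

          # Stochastic part: susceptible sites become infected with a probability that
          # depends on the local virus density (a Markov transition per time step).
          p_inf = p_max * virus / (v_half + virus)
          newly = (state == 0) & (rng.random(n) < p_inf * dt)
          state[newly] = 1

      print(f"infected sites: {state.sum()} / {n}")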

  5. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  6. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
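
    The sketch below is a generic discrete-event queue simulation in the spirit of the tool described: jobs request CPUs and a runtime and start in FIFO order whenever enough CPUs are free. The workload, machine size, and scheduling policy are invented; this is not the NCCS model.

      import heapq, random

      # Generic discrete-event sketch of a batch queue (illustrative only).
      random.seed(1)
      TOTAL_CPUS = 512
      events, t = [], 0.0
      for _ in range(200):
          t += random.expovariate(1 / 30.0)                      # mean interarrival 30 min
          cpus = random.choice([16, 64, 128, 256])
          runtime = random.uniform(60, 24 * 60)                  # 1 hour to 1 day
          heapq.heappush(events, (t, "arrive", cpus, runtime))

      waiting, free, wait_times = [], TOTAL_CPUS, []
      while events:
          now, kind, cpus, runtime = heapq.heappop(events)
          if kind == "arrive":
              waiting.append((now, cpus, runtime))
          else:                                                  # "finish": release CPUs
              free += cpus
          while waiting and waiting[0][1] <= free:               # FIFO with head-of-line blocking
              submitted, c, r = waiting.pop(0)
              free -= c
              wait_times.append(now - submitted)
              heapq.heappush(events, (now + r, "finish", c, r))

      print(f"mean queue wait: {sum(wait_times) / len(wait_times):.1f} min, "
            f"max: {max(wait_times):.1f} min")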

  7. IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  8. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as those found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.
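
    The prediction model described is, in essence, a probability model over genotypes at six SNPs. The Python sketch below shows the general form of such a multinomial logistic calculation. The SNP names are those commonly listed for the assay and should be checked against the original publication; the coefficients are invented for illustration and are not the published IrisPlex parameters.

      import numpy as np

      snps = ["rs12913832", "rs1800407", "rs12896399", "rs16891982", "rs1393350", "rs12203592"]
      genotype = np.array([2, 0, 1, 0, 1, 0])     # minor-allele counts for one individual
      print("genotype:", dict(zip(snps, genotype.tolist())))

      # Coefficients for the blue and intermediate classes against a brown reference class.
      # These numbers are invented; the published model parameters must be used in practice.
      beta0 = np.array([-4.0, -1.5])
      betas = np.array([[3.5, 0.5, 0.3, 1.0, 0.4, 0.5],
                        [1.5, 0.2, 0.1, 0.4, 0.2, 0.2]])

      logits = beta0 + betas @ genotype
      expl = np.exp(np.append(logits, 0.0))       # brown (reference class) has logit 0
      probs = expl / expl.sum()
      for colour, p in zip(["blue", "intermediate", "brown"], probs):
          print(f"P({colour}) = {p:.2f}")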

  9. Computational Tools for Predictive Modeling of Properties in Complex Actinide Systems

    SciTech Connect

    Autschbach, Jochen; Govind, Niranjan; Atta Fynn, Raymond; Bylaska, Eric J.; Weare, John H.; de Jong, Wibe A.

    2015-03-30

    In this chapter we focus on methodological and computational aspects that are key to accurately modeling the spectroscopic and thermodynamic properties of molecular systems containing actinides within the density functional theory (DFT) framework. Our focus is on properties that require either an accurate relativistic all-electron description or an accurate description of the dynamical behavior of actinide species in an environment at finite temperature, or both. The implementation of the methods and the calculations discussed in this chapter were done with the NWChem software suite (Valiev et al. 2010). In the first two sections we discuss two methods that account for relativistic effects, the ZORA and the X2C Hamiltonian. Section 1.2.1 discusses the implementation of the approximate relativistic ZORA Hamiltonian and its extension to magnetic properties. Section 1.3 focuses on the exact X2C Hamiltonian and the application of this methodology to obtain accurate molecular properties. In Section 1.4 we examine the role of a dynamical environment at finite temperature as well as the presence of other ions on the thermodynamics of hydrolysis and exchange reaction mechanisms. Finally, Section 1.5 discusses the modeling of XAS (EXAFS, XANES) properties in realistic environments accounting for both the dynamics of the system and (for XANES) the relativistic effects.

  10. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    ERIC Educational Resources Information Center

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  11. Distributed computing as a virtual supercomputer: Tools to run and manage large-scale BOINC simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Harvey, M. J.; de Fabritiis, Gianni

    2010-08-01

    Distributed computing (DC) projects tackle large computational problems by exploiting the donated processing power of thousands of volunteered computers, connected through the Internet. To efficiently employ the computational resources of one of the world's largest DC efforts, GPUGRID, the project scientists require tools that handle hundreds of thousands of tasks which run asynchronously and generate gigabytes of data every day. We describe RBoinc, an interface that allows computational scientists to embed the DC methodology into the daily work-flow of high-throughput experiments. By extending the Berkeley Open Infrastructure for Network Computing (BOINC), the leading open-source middleware for current DC projects, with mechanisms to submit and manage large-scale distributed computations from individual workstations, RBoinc turns distributed grids into cost-effective virtual resources that can be employed by researchers in work-flows similar to conventional supercomputers. The GPUGRID project is currently using RBoinc for all of its in silico experiments based on molecular dynamics methods, including the determination of binding free energies and free energy profiles in all-atom models of biomolecules.

  12. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Astrophysics Data System (ADS)

    Wiedeman, Robert A.; Wen, Doong; McCracken, Albert G.

    1988-05-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  13. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Technical Reports Server (NTRS)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  14. GlycoMinestruct: a new bioinformatics tool for highly accurate mapping of the human N-linked and O-linked glycoproteomes by incorporating structural features

    PubMed Central

    Li, Fuyi; Li, Chen; Revote, Jerico; Zhang, Yang; Webb, Geoffrey I.; Li, Jian; Song, Jiangning; Lithgow, Trevor

    2016-01-01

    Glycosylation plays an important role in cell-cell adhesion, ligand-binding and subcellular recognition. Current approaches for predicting protein glycosylation are primarily based on sequence-derived features, while little work has been done to systematically assess the importance of structural features to glycosylation prediction. Here, we propose a novel bioinformatics method called GlycoMinestruct (http://glycomine.erc.monash.edu/Lab/GlycoMine_Struct/) for improved prediction of human N- and O-linked glycosylation sites by combining sequence and structural features in an integrated computational framework with a two-step feature-selection strategy. Experiments indicated that GlycoMinestruct outperformed NGlycPred, the only predictor that incorporated both sequence and structure features, achieving AUC values of 0.941 and 0.922 for N- and O-linked glycosylation, respectively, on an independent test dataset. We applied GlycoMinestruct to screen the human structural proteome and obtained high-confidence predictions for N- and O-linked glycosylation sites. GlycoMinestruct can be used as a powerful tool to expedite the discovery of glycosylation events and substrates to facilitate hypothesis-driven experimental studies. PMID:27708373

  15. Computer tools in the discovery of HIV-I integrase inhibitors

    PubMed Central

    Liao, Chenzhong; Nicklaus, Marc C

    2010-01-01

    Computer-aided drug design (CADD) methodologies have made great advances and contributed significantly to the discovery and/or optimization of many clinically used drugs in recent years. CADD tools have likewise been applied to the discovery of inhibitors of HIV-I integrase, a difficult and worthwhile target for the development of efficient anti-HIV drugs. This article reviews the application of CADD tools, including pharmacophore search, quantitative structure–activity relationships, model building of integrase complexed with viral DNA and quantum-chemical studies in the discovery of HIV-I integrase inhibitors. Different structurally diverse integrase inhibitors have been identified by, or with significant help from, various CADD tools. PMID:21426160

  16. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
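
    To make the flavour of such a formulation concrete, the toy Python sketch below (using scipy) poses a landing-gear placement as a small constrained optimization: place the main gear a distance behind the CG and at a half-track to minimize a crude size proxy, subject to tip-back, turnover, and nose-load constraints. The geometry, limits, and objective are invented stand-ins and are not the paper's formulation.

      import numpy as np
      from scipy.optimize import minimize

      h, l_n = 2.0, 10.0                # gear height and CG-to-nose-gear distance (m), assumed

      def objective(v):
          x_m, t = v
          return x_m + t                # crude proxy for gear/structure size

      cons = [
          {"type": "ineq", "fun": lambda v: np.degrees(np.arctan(v[0] / h)) - 15.0},   # tip-back >= 15 deg
          {"type": "ineq", "fun": lambda v: v[1] - h / np.tan(np.radians(63.0))},      # turnover limit
          {"type": "ineq", "fun": lambda v: v[0] / (v[0] + l_n) - 0.08},               # nose load >= 8%
          {"type": "ineq", "fun": lambda v: 0.15 - v[0] / (v[0] + l_n)},               # nose load <= 15%
      ]
      res = minimize(objective, x0=[1.5, 2.0], method="SLSQP", constraints=cons,
                     bounds=[(0.1, 5.0), (0.1, 5.0)])
      x_m, t = res.x
      print(f"main gear {x_m:.2f} m aft of CG, half-track {t:.2f} m, objective {res.fun:.2f}")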

  17. Accelerating Design of Batteries Using Computer-Aided Engineering Tools (Presentation)

    SciTech Connect

    Pesaran, A.; Kim, G. H.; Smith, K.

    2010-11-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics tools to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  18. On the Development of a Computer Based Diagnostic Assessment Tool to Help in Teaching and Learning Process

    ERIC Educational Resources Information Center

    Ahmad, Afaq; Al-Mashari, Ahmed; Al-Lawati, Ali

    2010-01-01

    This paper presents a computer based diagnostic tool developed to facilitate the learning process. The developed tool is capable of generating possible error syndromes associated with the answers received. The developed tool simulates the error pattern of the test results and then accordingly models the action plan to help in children's learning…

  19. Dental students' attitudes toward the design of a computer-based treatment planning tool.

    PubMed

    Foster, Lea; Knox, Kathy; Rung, Andrea; Mattheos, Nikos

    2011-11-01

    The purpose of this study was to identify and evaluate the attitudes of a cohort of fourth- and fifth-year dental students (n=53) at Griffith University in Australia to a proposed computer-based Case Study and Treatment Planning (CSTP) tool. The tool would allow students to work through the process of comprehensive, multidisciplinary treatment planning for patients in a structured and logical manner. A questionnaire was designed to investigate the students' perceived needs, attitudes, and factors deemed to be important in the design of such a tool. Students responded on a seven-point Likert scale ranging from 1 (strongly disagree) to 7 (strongly agree). The survey was supplemented by two focus groups, one of fourth-year and one of fifth-year students. The survey results indicated strong agreement that there is a need for such a tool (fourth-year mean=6.24; fifth-year mean=5.75) and the likelihood that it would be used after hours and for extra treatment planning practice (fourth-year mean=5.82; fifth-year mean=5.45). The themes that emerged from the focus groups revealed students' agreement that a CSTP tool would be beneficial both for training and for faculty assessment of students' treatment planning skills. The type of concerns raised included whether a rigid treatment planning template might hamper the flexibility needed to deal with complex patient cases. Additionally, there was some concern that students' personal interaction with tutors would be reduced if this mode of computer-based treatment planning were to be used exclusively. In conclusion, the overall attitude of dental students was positive towards a CSTP tool. This study's findings provide guidance as to how such software could be developed and which features to include.

  20. Computer programing for geosciences: Teach your students how to make tools

    NASA Astrophysics Data System (ADS)

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  1. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool to teach parallel programming. In this paper, we will focus on some fundamental features of aCe C.

  2. General purpose computational tools for simulation and analysis of medium-energy backscattering spectra

    NASA Astrophysics Data System (ADS)

    Weller, Robert A.

    1999-06-01

    This paper describes a suite of computational tools for general-purpose ion-solid calculations, which has been implemented in the platform-independent computational environment Mathematica®. Although originally developed for medium energy work (beam energies < 300 keV), they are suitable for general, classical, non-relativistic calculations. Routines are available for stopping power, Rutherford and Lenz-Jensen (screened) cross sections, sputtering yields, small-angle multiple scattering, and back-scattering-spectrum simulation and analysis. Also included are a full range of supporting functions, as well as easily accessible atomic mass and other data on all the stable isotopes in the periodic table. The functions use common calling protocols, recognize elements and isotopes by symbolic names and, wherever possible, return symbolic results for symbolic inputs, thereby facilitating further computation. A new paradigm for the representation of backscattering spectra is introduced.
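
    A Python sketch (the suite described above is written in Mathematica) of one of the listed routines is given below: the unscreened Rutherford differential cross section in the centre-of-mass frame. The screening (Lenz-Jensen) correction and kinematic conversion that a full package would also provide are omitted.

      import numpy as np

      E2 = 1.43996  # e^2 in MeV*fm (Coulomb constant times elementary charge squared)

      def rutherford(z1, z2, energy_mev, theta_deg):
          """Rutherford dσ/dΩ = (Z1*Z2*e²/4E)² / sin⁴(θ/2), in barns/steradian."""
          theta = np.radians(theta_deg)
          a = z1 * z2 * E2 / (4.0 * energy_mev)          # distance-like factor in fm
          return (a**2 / np.sin(theta / 2.0)**4) / 100.0  # 1 barn = 100 fm²

      # Example: 270 keV He (Z=2) backscattered from Si (Z=14) at 150 degrees.
      print(f"{rutherford(2, 14, 0.270, 150.0):.1f} b/sr")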

  3. Investigation of the "Convince Me" Computer Environment as a Tool for Critical Argumentation about Public Policy Issues

    ERIC Educational Resources Information Center

    Adams, Stephen T.

    2003-01-01

    The "Convince Me" computer environment supports critical thinking by allowing users to create and evaluate computer-based representations of arguments. This study investigates theoretical and design considerations pertinent to using "Convince Me" as an educational tool to support reasoning about public policy issues. Among computer environments…

  4. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  5. Ring polymer molecular dynamics fast computation of rate coefficients on accurate potential energy surfaces in local configuration space: Application to the abstraction of hydrogen from methane

    NASA Astrophysics Data System (ADS)

    Meng, Qingyong; Chen, Jun; Zhang, Dong H.

    2016-04-01

    To compute rate coefficients of the H/D + CH4 → H2/HD + CH3 reactions quickly and accurately, we propose a segmented strategy for fitting a suitable potential energy surface (PES), on which ring-polymer molecular dynamics (RPMD) simulations are performed. On the basis of the recently developed permutation invariant polynomial neural-network approach [J. Li et al., J. Chem. Phys. 142, 204302 (2015)], PESs in local configuration spaces are constructed. In this strategy, the global PES is divided into three parts, comprising the asymptotic, intermediate, and interaction regions along the reaction coordinate. Since fewer fitting parameters are involved in the local PESs, the computational efficiency of evaluating the PES routine is enhanced by a factor of ~20 compared with that of the global PES. On the interaction part, the RPMD computational time for the transmission coefficient can be reduced further by cutting off the redundant part of the child trajectories. For H + CH4, good agreement is found among the present RPMD rates, previous simulations, and experimental results. For D + CH4, on the other hand, qualitative agreement between the present RPMD and experimental results is predicted.


  6. Assessing students' learning and decision-making skills using high performance web-based computational tools

    NASA Astrophysics Data System (ADS)

    Martin, Akilah

    Using web-based computational tools in classrooms in conjunction with advanced computing models provides the opportunity for students to learn about large-scale processes, such as state, regional, and global environmental issues, that are difficult to incorporate into student learning exercises with present basic models. These tools aided in bridging the gap between multi-field scale models and enhanced student learning. The expectation was that students would improve their decision-making skills by solving realistic and large-scale (multi-field) environmental issues, made possible through faster computation time, larger datasets, larger scale (multi-field), and predictions over longer time periods using the Century soil organic carbon model. The Century Model was linked to a web-based series of functional pages through which students could run the model. In this project, 239 undergraduate students' learning and decision-making skills using high performance classroom computing tools were assessed. Among the many Century Model parameters, the students were able to alter four variables (climate, crop, tillage, and soil texture), and they were able to simulate several scenarios simultaneously. The results of the study revealed that the pretest for the four courses combined was significant (P < 0.05), meaning that the pretest was a major contributor to the increased posttest scores. Although the scenario scale (multi-field vs. single-field conditions) factor was not statistically significant, the students completing the multi-field scenario assignment scored higher on the posttest and also showed a larger increase from pretest to posttest. Overall, these results revealed that the tool had a positive impact on the students' learning, which was evident in their improved pretest-to-posttest scores and in the perceptions they reported in the written evaluation. Most students felt that the project was a good learning

  7. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
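
    The Python sketch below illustrates, in one dimension, the double-counting problem that motivates the package: two overlapping strips of surface panels cover part of the same surface, so naive summation overestimates the integrated load, while down-weighting panels in the overlapped region restores the correct total. The real FOMOCO procedure instead builds a non-overlapping hybrid grid (MIXSUR) before integrating (OVERINT); this toy only conveys the bookkeeping idea, and the pressure field is made up.

      import numpy as np

      def panels(x0, x1, n):
          edges = np.linspace(x0, x1, n + 1)
          return 0.5 * (edges[:-1] + edges[1:]), np.diff(edges)   # panel centres and widths

      pressure = lambda x: 1.0 + 0.5 * np.sin(np.pi * x)           # made-up surface pressure

      xa, da = panels(0.0, 1.0, 50)        # grid A covers [0, 1]
      xb, db = panels(0.8, 1.8, 50)        # grid B covers [0.8, 1.8], overlapping [0.8, 1]

      naive = np.sum(pressure(xa) * da) + np.sum(pressure(xb) * db)

      wa = np.where((xa > 0.8) & (xa < 1.0), 0.5, 1.0)             # halve overlapped panels
      wb = np.where((xb > 0.8) & (xb < 1.0), 0.5, 1.0)
      weighted = np.sum(wa * pressure(xa) * da) + np.sum(wb * pressure(xb) * db)

      exact = 1.8 + 0.5 * (1 - np.cos(np.pi * 1.8)) / np.pi        # analytic integral over [0, 1.8]
      print(f"naive {naive:.4f}  weighted {weighted:.4f}  exact {exact:.4f}")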

  8. Computational tools for the interpretation of electron spin resonance spectra in solution

    NASA Astrophysics Data System (ADS)

    Zerbetto, Mirco; Licari, Daniele; Barone, Vincenzo; Polimeno, Antonino

    2013-10-01

    Spectroscopic observables can be used for monitoring relaxation processes of molecules. In particular, electron spin resonance of stable multi-radicals is sensitive to the details of the rotational and internal dynamics in rigid and flexible molecules. Integration with advanced theoretical/computational methods proves to be particularly effective to acquire direct information on long-range relaxation processes, based on molecular dynamics, multi-scale approaches and coarse-graining treatments. Together, experimental data and computational interpretation provide a way to understand the effect of chemical changes on specific systems. In this paper we review computational tools aimed at the characterisation of dynamical properties of molecules gathered from electron spin resonance measurements. Stochastic models are employed, based on a number of structural parameters that are calculated at atomistic and/or mesoscopic level depending on their nature. Open source software tools built as user-friendly 'virtual spectroscopes' targeted for use by experimentalists are provided as a kind of extension of the laboratory equipment. An overview of their range of applicability is provided.

  9. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  10. Validation of RetroPath, a computer-aided design tool for metabolic pathway engineering.

    PubMed

    Fehér, Tamás; Planson, Anne-Gaëlle; Carbonell, Pablo; Fernández-Castané, Alfred; Grigoras, Ioana; Dariy, Ekaterina; Perret, Alain; Faulon, Jean-Loup

    2014-11-01

    Metabolic engineering has succeeded in biosynthesis of numerous commodity or high value compounds. However, the choice of pathways and enzymes used for production was often made ad hoc, or required expert knowledge of the specific biochemical reactions. In order to rationalize the process of engineering producer strains, we developed the computer-aided design (CAD) tool RetroPath that explores and enumerates metabolic pathways connecting the endogenous metabolites of a chassis cell to the target compound. To experimentally validate our tool, we constructed 12 top-ranked enzyme combinations producing the flavonoid pinocembrin, four of which displayed significant yields. Namely, our tool queried the enzymes found in metabolic databases based on their annotated and predicted activities. Next, it ranked pathways based on the predicted efficiency of the available enzymes, the toxicity of the intermediate metabolites and the calculated maximum product flux. To implement the top-ranking pathway, our procedure narrowed down a list of nine million possible enzyme combinations to 12, a number easily assembled and tested. One round of metabolic network optimization based on RetroPath output further increased pinocembrin titers 17-fold. In total, 12 out of the 13 enzymes tested in this work displayed a relative performance that was in accordance with their predicted scores. These results validate the ranking function of our CAD tool, and open the way to its utilization in the biosynthesis of novel compounds.
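
    The sketch below shows the general shape of such a ranking step: candidate pathways are scored by a weighted combination of predicted enzyme efficiency, intermediate toxicity, and maximum product flux, then sorted. The candidate pathways, values, and weights are invented; RetroPath's actual scoring function is defined in the paper and its supplementary material.

      # Rank candidate pathways by a weighted score (all numbers invented for illustration).
      candidates = [
          # (name, mean enzyme-performance score, worst intermediate toxicity, max flux)
          ("pathway_A", 0.82, 0.10, 1.4),
          ("pathway_B", 0.64, 0.05, 2.1),
          ("pathway_C", 0.91, 0.60, 1.9),
      ]
      w_enzyme, w_tox, w_flux = 1.0, 2.0, 0.5

      def score(p):
          _, enzyme, toxicity, flux = p
          return w_enzyme * enzyme - w_tox * toxicity + w_flux * flux

      for name, *_ in sorted(candidates, key=score, reverse=True):
          print(name)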

  11. Java and Vector Graphics Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, Eric; McMahon, Erin; Hix, Raph; Guidry, Mike; Smith, Michael

    2002-08-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as a stand-alone application or distributed across a network. These tools, which can have broad applications in general scientific visualization, are currently being used to explore and compare various reaction rates, set up and run explosive nucleosynthesis calculations, and visualize these results with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to include new rates and adjust current ones. Setup and initialization of a nucleosynthesis calculation is through an intuitive graphical interface. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic reduction in visualization file sizes while maintaining high visual quality and interactive control. The use of these tools for other applications will also be mentioned.

  12. Validation of space/ground antenna control algorithms using a computer-aided design tool

    NASA Technical Reports Server (NTRS)

    Gantenbein, Rex E.

    1995-01-01

    The validation of the algorithms for controlling the space-to-ground antenna subsystem for Space Station Alpha is an important step in assuring reliable communications. These algorithms have been developed and tested using a simulation environment based on a computer-aided design tool that can provide a time-based execution framework with variable environmental parameters. Our work this summer has involved the exploration of this environment and the documentation of the procedures used to validate these algorithms. We have installed a variety of tools in a laboratory of the Tracking and Communications division for reproducing the simulation experiments carried out on these algorithms to verify that they do meet their requirements for controlling the antenna systems. In this report, we describe the processes used in these simulations and our work in validating the tests used.

  13. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  14. A computer assisted diagnosis tool for the classification of burns by depth of injury.

    PubMed

    Serrano, Carmen; Acha, Begoña; Gómez-Cía, Tomás; Acha, José I; Roa, Laura M

    2005-05-01

    In this paper, a computer assisted diagnosis (CAD) tool for the classification of burns into their depths is proposed. The aim of the system is to separate burn wounds from healthy skin, and to distinguish among the different types of burns (burn depths) by means of digital photographs. It is intended to be used as an aid to diagnosis in local medical centres, where there is a lack of specialists. Another potential use of the system is as an educational tool. The system is based on the analysis of digital photographs. It extracts from those images colour and texture information, as these are the characteristics observed by physicians in order to form a diagnosis. Clinical effectiveness of the method was demonstrated on 35 clinical burn wound images, yielding an average classification success rate of 88% compared to expert classified images.
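
    The toy Python sketch below illustrates the two ingredients named in the abstract, colour and texture features extracted from image patches, feeding a simple classifier. The patches are synthetic random data and the classifier is a nearest-centroid rule; this is not the paper's method, only a numpy-based illustration of feature-based classification of skin and burn regions.

      import numpy as np

      rng = np.random.default_rng(0)

      def features(patch):
          mean_rgb = patch.mean(axis=(0, 1))                 # colour: mean of each channel
          texture = patch.std(axis=(0, 1)).mean()            # texture: average local variability
          return np.append(mean_rgb, texture)

      def synthetic_patch(base_colour, noise):
          return np.clip(base_colour + rng.normal(0, noise, size=(16, 16, 3)), 0, 1)

      classes = {"healthy skin": ([0.8, 0.6, 0.5], 0.02),
                 "superficial burn": ([0.9, 0.3, 0.3], 0.05),
                 "deep burn": ([0.5, 0.2, 0.2], 0.10)}

      # "Train": centroid of features per class; "test": classify new random patches.
      centroids = {name: np.mean([features(synthetic_patch(np.array(c), s)) for _ in range(50)], axis=0)
                   for name, (c, s) in classes.items()}

      correct = 0
      for name, (c, s) in classes.items():
          for _ in range(20):
              f = features(synthetic_patch(np.array(c), s))
              pred = min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))
              correct += (pred == name)
      print(f"accuracy on synthetic patches: {correct / 60:.0%}")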

  15. A computer assisted diagnosis tool for the classification of burns by depth of injury.

    PubMed

    Serrano, Carmen; Acha, Begoña; Gómez-Cía, Tomás; Acha, José I; Roa, Laura M

    2005-05-01

    In this paper, a computer assisted diagnosis (CAD) tool for the classification of burns into their depths is proposed. The aim of the system is to separate burn wounds from healthy skin, and to distinguish among the different types of burns (burn depths) by means of digital photographs. It is intended to be used as an aid to diagnosis in local medical centres, where there is a lack of specialists. Another potential use of the system is as an educational tool. The system is based on the analysis of digital photographs. It extracts from those images colour and texture information, as these are the characteristics observed by physicians in order to form a diagnosis. Clinical effectiveness of the method was demonstrated on 35 clinical burn wound images, yielding an average classification success rate of 88% compared to expert classified images. PMID:15774281

  16. Defining a Standard for Reporting Digital Evidence Items in Computer Forensic Tools

    NASA Astrophysics Data System (ADS)

    Bariki, Hamda; Hashmi, Mariam; Baggili, Ibrahim

    Due to the lack of standards in reporting digital evidence items, investigators are facing difficulties in efficiently presenting their findings. This paper proposes a standard for digital evidence to be used in reports that are generated using computer forensic software tools. The authors focused on developing a standard for digital evidence items by surveying various digital forensic tools while keeping in mind the legal integrity of digital evidence items. Additionally, an online questionnaire was used to gain the opinion of knowledgeable and experienced stakeholders in the digital forensics domain. Based on the findings, the authors propose a standard for digital evidence items that includes data about the case, the evidence source, evidence item, and the chain of custody. Research results enabled the authors to create a defined XML schema for digital evidence items.
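
    The Python sketch below, using the standard library's xml.etree.ElementTree, shows what a record covering the four groups listed above (case, evidence source, evidence item, chain of custody) might look like. The element and attribute names and the sample values are invented for illustration; the paper defines its own schema.

      import xml.etree.ElementTree as ET

      item = ET.Element("evidence_item", id="EI-001")
      ET.SubElement(item, "case", number="2024-017", investigator="J. Doe")
      ET.SubElement(item, "source", device="laptop HDD", serial="WD-123456",
                    acquired="2024-03-01T10:15:00Z")
      ET.SubElement(item, "description").text = "Deleted spreadsheet recovered from unallocated space"
      ET.SubElement(item, "hash", algorithm="SHA-256").text = "9f86d081..."
      custody = ET.SubElement(item, "chain_of_custody")
      ET.SubElement(custody, "transfer",
                    {"date": "2024-03-01", "from": "acquisition lab", "to": "evidence locker"})

      print(ET.tostring(item, encoding="unicode"))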

  17. Architecture Framework for Trapped-Ion Quantum Computer based on Performance Simulation Tool

    NASA Astrophysics Data System (ADS)

    Ahsan, Muhammad

    The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of system components. Detailed investigation of how performance varies with the physics of the components and with the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling a quantum circuit onto a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify the system components that crucially define the overall performance. Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while the success probability is uniformly determined by the fidelity of the physical quantum operations, the execution time is a function of the system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit block execution. When these blocks are used to construct meaningful arithmetic circuits such as quantum adders, the number of ancilla qubits for complicated non-Clifford gates and the entanglement resources needed to establish long-distance communication channels become the major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit comprising the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with problem size describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology

  18. A simple tool for the computation of the stream-aquifer coefficient.

    NASA Astrophysics Data System (ADS)

    Cousquer, Yohann; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    Most groundwater models consider a river network in interaction with aquifers, where the stream-aquifer boundary is usually modeled with a Cauchy-type boundary condition. This condition is parameterized with the so-called "river coefficient", which is a lumped parameter representing the effects of numerous geometric and hydrodynamic controlling factors. The value of the river coefficient is essential for the quantification of stream-aquifer flow but is challenging to determine. In recent years, many formulations for the river coefficient have been proposed from analytical and numerical approaches. However, these methods are either too simple to be realistic or too complex to be easily implemented by groundwater modelers. We propose a simple tool to infer the value of the river coefficient from a fine-grid numerical model. This tool allows the simple and fast computation of the river coefficient with various stream geometries and hydraulic parameters. A Python-based pre- and post-processor has been developed, which reduces the contribution of the operator to the definition of the model parameters: river geometry and aquifer properties. The numerical model is implemented with the USGS SUTRA finite element model and considers an aquifer in interaction with a stream in a 2D vertical cross-section. A Dirichlet-type boundary condition is imposed at the stream-aquifer interface. The linearity between the stream-aquifer flow and the head difference between river and the aquifer has been verified. For a given parameter set, the value of river coefficient is estimated by linear regression for different values of head difference between the river and the aquifer. The innovation is that the mesh size of the regional model is also considered for the computation of the river coefficient. This tool has been used to highlight the importance of parameters that were usually neglected for the computation of the river coefficient. The results of this work will be made available to the
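
    The final step described above, estimating the river (stream-aquifer) coefficient C in Q = C * (h_river - h_aquifer) by linear regression over several imposed head differences, takes only a few lines; a minimal sketch follows. The flows below stand in for outputs of the fine-grid SUTRA model and are synthetic, with a true slope of 2.5 plus noise (arbitrary consistent units).

      import numpy as np

      rng = np.random.default_rng(3)
      dh = np.linspace(-1.0, 1.0, 9)                        # head difference, river minus aquifer
      q = 2.5 * dh + rng.normal(0.0, 0.05, dh.size)         # simulated stream-aquifer exchange flow

      C = np.sum(dh * q) / np.sum(dh * dh)                  # least-squares slope through the origin
      print(f"estimated river coefficient: {C:.2f}")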

  19. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License.
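
    A few of the summary statistics listed above can be computed for a toy nucleotide alignment with plain Python, as sketched below. This is not AMAS's code, only the definitions of some of the reported quantities (missing data, GC content, variable and parsimony-informative sites) applied to an invented four-taxon alignment.

      alignment = {"taxon1": "ATGCCGTA-A",
                   "taxon2": "ATGACGTANA",
                   "taxon3": "ATGACGTTTA",
                   "taxon4": "ATGCCGTTTA"}

      seqs = list(alignment.values())
      n_taxa, length = len(seqs), len(seqs[0])
      cells = n_taxa * length
      missing = sum(s.count("-") + s.count("N") + s.count("?") for s in seqs)
      gc = sum(s.count("G") + s.count("C") for s in seqs)
      determined = cells - missing

      variable = parsimony_informative = 0
      for col in zip(*seqs):
          bases = [b for b in col if b in "ACGT"]
          counts = {b: bases.count(b) for b in set(bases)}
          if len(counts) > 1:
              variable += 1
              # Parsimony-informative: at least two states each present in at least two taxa.
              if sum(1 for c in counts.values() if c >= 2) >= 2:
                  parsimony_informative += 1

      print(f"{n_taxa} taxa, length {length}, {missing / cells:.1%} missing, "
            f"GC {gc / determined:.1%}, {variable} variable sites, "
            f"{parsimony_informative} parsimony-informative sites")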

  20. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  1. AMAS: a fast tool for alignment manipulation and computing of summary statistics

    PubMed Central

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python’s core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  2. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  3. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

    SciTech Connect

    Sottille, Matthew

    2013-09-12

    This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

  4. A semi-automated computer tool for the analysis of retinal vessel diameter dynamics.

    PubMed

    Euvrard, Guillaume; Genevois, Olivier; Rivals, Isabelle; Massin, Pascale; Collet, Amélie; Sahel, José-Alain; Paques, Michel

    2013-06-01

    Retinal vessels are directly accessible to clinical observation. This has numerous potential interests for medical investigations. Using the Retinal Vessel Analyzer, a dedicated eye fundus camera enabling dynamic, video-rate recording of micrometric changes of the diameter of retinal vessels, we developed a semi-automated computer tool that extracts the heart beat rate and pulse amplitude values from the records. The extracted data enabled us to show that there is a decreasing relationship between heart beat rate and pulse amplitude of arteries and veins. Such an approach will facilitate the modeling of hemodynamic interactions in small vessels. PMID:23566397
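
    One simple way to extract a heart beat rate and pulse amplitude from a vessel-diameter time series is through a discrete Fourier transform, as in the Python sketch below. The trace is synthetic (a 1.2 Hz pulsation plus noise on a 120 µm vessel), and the actual semi-automated tool's algorithm is not described here and may differ.

      import numpy as np

      fs, duration = 25.0, 60.0                       # sampling rate (Hz) and record length (s)
      t = np.arange(0.0, duration, 1.0 / fs)
      rng = np.random.default_rng(7)
      diameter = 120.0 + 2.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0.0, 0.8, t.size)

      spectrum = np.fft.rfft(diameter - diameter.mean())
      freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
      band = (freqs > 0.7) & (freqs < 3.0)            # plausible cardiac band, 42-180 beats/min
      peak = np.argmax(np.abs(spectrum) * band)       # strongest component inside the band
      heart_rate = freqs[peak] * 60.0                 # beats per minute
      pulse_amplitude = 2.0 * np.abs(spectrum[peak]) / t.size   # amplitude of the dominant pulsation (µm)

      print(f"heart rate ≈ {heart_rate:.0f} bpm, pulse amplitude ≈ {pulse_amplitude:.1f} µm")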

  5. Unicursal random maze tool path for computer-controlled optical surfacing.

    PubMed

    Wang, Chunjin; Wang, Zhenzhong; Xu, Qiao

    2015-12-01

    A novel unicursal random maze tool path is proposed in this paper, which not only achieves uniform coverage of the polished surface but also possesses randomness and multidirectionality. Simulation experiments along with practical polishing experiments are conducted to compare three kinds of paths: the maze path, the raster path, and the Hilbert path. The experimental results validate that the maze path ensures uniform polishing and avoids the appearance of periodic structures in the polished surface. It is also more effective than the Hilbert path in restraining the mid-spatial-frequency error in the computer-controlled optical surfacing process.
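
    For context, the sketch below generates point sequences for two of the benchmark paths compared in the study, a raster path and a Hilbert path, on the unit square; the proposed random maze path itself is not reproduced here.

```python
# Illustrative sketch only: two of the benchmark tool paths (raster and Hilbert).
def raster_path(n):
    """Boustrophedon raster over an n x n grid of dwell points."""
    pts = []
    for row in range(n):
        cols = range(n) if row % 2 == 0 else range(n - 1, -1, -1)
        pts.extend((col / (n - 1), row / (n - 1)) for col in cols)
    return pts

def hilbert_path(order):
    """Hilbert curve of the given order mapped to the unit square."""
    n = 1 << order
    pts = []
    for d in range(n * n):
        x = y = 0
        t, s = d, 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                      # rotate the quadrant
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        pts.append((x / (n - 1), y / (n - 1)))
    return pts

print(raster_path(4)[:5])
print(hilbert_path(3)[:5])
```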

  7. Tools for grid deployment of CDF offline and SAM data handling systems for summer 2004 computing

    SciTech Connect

    Kreymer, A.; Baranovski, A.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Loebel-Carpenter, L.; Lyon, A.; Merritt, W.; Stonjek, S.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; Bartsch, V.; Leslie, M.; Belforte, S.; Burgon-Lyon, M.; St. Denis, R.; Kerzel, U.; Ratnikov, F.; /Rutgers U., Piscataway /Texas Tech.

    2004-12-01

    The Fermilab CDF Run-II experiment is now providing official support for remote computing, which has provided approximately 35% of the total CDF computing capacity during the summer of 2004. We face the challenge of unreliable networks, time differences, and remote managers having little experience with this particular software. The approach we have taken has been to separate the data handling components from the main CDF offline code releases by means of shared libraries, permitting live upgrades to otherwise frozen code. We now use a special "development lite" release to ensure that all sites have the latest tools available. We have put substantial effort into revision control, so that essentially all active CDF sites are running exactly the same SAM code.

  8. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  9. Fast and Accurate Data Extraction for Near Real-Time Registration of 3-D Ultrasound and Computed Tomography in Orthopedic Surgery.

    PubMed

    Brounstein, Anna; Hacihaliloglu, Ilker; Guy, Pierre; Hodgson, Antony; Abugharbieh, Rafeef

    2015-12-01

    Automatic, accurate and real-time registration is an important step in providing effective guidance and successful anatomic restoration in ultrasound (US)-based computer assisted orthopedic surgery. We propose a method in which local phase-based bone surfaces, extracted from intra-operative US data, are registered to pre-operatively segmented computed tomography data. Extracted bone surfaces are downsampled and reinforced with high curvature features. A novel hierarchical simplification algorithm is used to further optimize the point clouds. The final point clouds are represented as Gaussian mixture models and iteratively matched by minimizing the dissimilarity between them using an L2 metric. For 44 clinical data sets from 25 pelvic fracture patients and 49 phantom data sets, we report mean surface registration accuracies of 0.31 and 0.77 mm, respectively, with an average registration time of 1.41 s. Our results suggest the viability and potential of the chosen method for real-time intra-operative registration in orthopedic surgery.
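
    The sketch below is not the paper's implementation; it only illustrates the matching criterion described, representing two toy point clouds as isotropic Gaussian mixtures and minimizing their closed-form L2 dissimilarity over a translation with SciPy (the real method also handles rotation and uses hierarchically simplified, curvature-reinforced clouds).

```python
# Hedged sketch: L2 dissimilarity between two isotropic Gaussian mixtures,
# minimized over a rigid translation only.
import numpy as np
from scipy.optimize import minimize

def gauss_cross_term(A, B, sigma2):
    """Average over all pairs of the Gaussian product integral (equal isotropic sigma^2)."""
    d = A.shape[1]
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    norm = (2.0 * np.pi * 2.0 * sigma2) ** (-d / 2.0)
    return norm * np.exp(-sq / (4.0 * sigma2)).sum() / (A.shape[0] * B.shape[0])

def l2_dissimilarity(A, B, sigma=1.0):
    s2 = sigma ** 2
    return gauss_cross_term(A, A, s2) - 2.0 * gauss_cross_term(A, B, s2) + gauss_cross_term(B, B, s2)

rng = np.random.default_rng(0)
model = rng.normal(size=(60, 3))                  # toy "CT" surface points
scene = model + np.array([0.8, -0.5, 0.3])        # toy "US" points, translated

res = minimize(lambda t: l2_dissimilarity(model + t, scene), x0=np.zeros(3), method="Nelder-Mead")
print("recovered translation:", res.x)            # should approach (0.8, -0.5, 0.3)
```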

  11. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  12. Analysis and computer tools for separation processes involving nonideal mixtures. Progress report, December 1, 1992--November 30, 1993

    SciTech Connect

    Lucia, A.

    1993-07-12

    This research is concerned with developing mathematical analysis, numerical analysis, and computer tools for separation processes involving nonideal, homogeneous, and heterogeneous multi-component mixtures. Progress, organized in terms of mathematical analysis, numerical analysis, and algorithmic development, is summarized.

  13. Morpheus Spectral Counter: A computational tool for label-free quantitative mass spectrometry using the Morpheus search engine.

    PubMed

    Gemperline, David C; Scalf, Mark; Smith, Lloyd M; Vierstra, Richard D

    2016-03-01

    Label-free quantitative MS based on the Normalized Spectral Abundance Factor (NSAF) has emerged as a straightforward and robust method to determine the relative abundance of individual proteins within complex mixtures. Here, we present Morpheus Spectral Counter (MSpC) as the first computational tool that directly calculates NSAF values from output obtained from Morpheus, a fast, open-source, peptide-MS/MS matching engine compatible with high-resolution accurate-mass instruments. NSAF has distinct advantages over other MS-based quantification methods, including a greater dynamic range as compared to isobaric tags, no requirement to align and re-extract MS1 peaks, and increased speed. MSpC features an easy-to-use graphic user interface that additionally calculates both distributed and unique NSAF values to permit analyses of both protein families and isoforms/proteoforms. MSpC determinations of protein concentration were linear over several orders of magnitude based on the analysis of several high-mass accuracy datasets either obtained from PRIDE or generated with total cell extracts spiked with purified Arabidopsis 20S proteasomes. The MSpC software was developed in C# and is open sourced under a permissive license with the code made available at http://dcgemperline.github.io/Morpheus_SpC/.
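
    The NSAF quantity itself is simple to state; the sketch below computes it for toy spectral counts and protein lengths. It illustrates the calculation MSpC automates, not the MSpC software or its input formats.

```python
# Minimal sketch: NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j).
def nsaf(spectral_counts, lengths):
    """spectral_counts, lengths: dicts keyed by protein ID."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

counts = {"protA": 120, "protB": 45, "protC": 9}      # toy spectral counts
lengths = {"protA": 410, "protB": 230, "protC": 615}  # protein lengths (residues)
print(nsaf(counts, lengths))
```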

  14. Analysis and computer tools for separation processes involving nonideal mixtures. Progress report, December 1, 1989--November 30, 1992

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  15. Online object oriented Monte Carlo computational tool for the needs of biomedical optics

    PubMed Central

    Doronin, Alexander; Meglinski, Igor

    2011-01-01

    Conceptual engineering design and optimization of laser-based imaging techniques and optical diagnostic systems used in the field of biomedical optics requires a clear understanding of the light-tissue interaction and peculiarities of localization of the detected optical radiation within the medium. The description of photon migration within turbid tissue-like media is based on the concept of radiative transfer that forms the basis of Monte Carlo (MC) modeling. The ability to directly simulate the influence of structural variations of biological tissues on the probing light makes MC a primary tool for biomedical optics and optical engineering. Due to the diversity of optical modalities utilizing different properties of light and mechanisms of light-tissue interaction, a new MC code typically has to be developed for each particular diagnostic application. In the current paper, introducing an object-oriented concept of MC modeling and utilizing modern web applications, we present a generalized online computational tool suitable for the major applications in biophotonics. The computation is supported by an NVIDIA CUDA graphics processing unit, providing acceleration of the modeling by up to 340 times. PMID:21991540
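
    As a greatly simplified, CPU-only illustration of the Monte Carlo photon-migration idea (isotropic scattering, semi-infinite homogeneous medium, no refractive-index mismatch), the sketch below estimates diffuse reflectance; the online tool described above is far more general and GPU-accelerated.

```python
# Hedged, heavily simplified Monte Carlo photon-migration sketch.
import numpy as np

def diffuse_reflectance(n_photons=5000, mu_a=1.0, mu_s=10.0, seed=1):
    """Fraction of launched photon weight escaping back through z = 0 (coefficients in 1/cm)."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])      # launched straight into the medium
        weight = 1.0
        while weight > 1e-3:
            step = -np.log(rng.random()) / mu_t     # exponential free-path sampling
            pos = pos + step * direction
            if pos[2] < 0.0:                        # crossed the surface: count as reflected
                reflected += weight
                break
            weight *= albedo                        # deposit the absorbed fraction of the weight
            v = rng.normal(size=3)                  # isotropic re-scattering direction
            direction = v / np.linalg.norm(v)
    return reflected / n_photons

print(f"diffuse reflectance ~ {diffuse_reflectance():.3f}")
```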

  16. Smartphone qualification & linux-based tools for CubeSat computing payloads

    NASA Astrophysics Data System (ADS)

    Bridges, C. P.; Yeomans, B.; Iacopino, C.; Frame, T. E.; Schofield, A.; Kenyon, S.; Sweeting, M. N.

    Modern computers are now far in advance of satellite systems and leveraging these technologies for space applications could lead to cheaper and more capable spacecraft. Together with NASA AMES's PhoneSat, the STRaND-1 nanosatellite team has been developing and designing new ways to include smart-phone technologies in the popular CubeSat platform whilst mitigating numerous risks. Surrey Space Centre (SSC) and Surrey Satellite Technology Ltd. (SSTL) have led in qualifying state-of-the-art COTS technologies and capabilities - contributing to numerous low-cost satellite missions. The focus of this paper is to answer 1) whether modern smart-phone software is compatible with the fast and low-cost development required by CubeSats, and 2) whether the components utilised are robust to the space environment. The STRaND-1 smart-phone payload software explored in this paper is united using various open-source Linux tools and generic interfaces found in terrestrial systems. A major result from our developments is that many existing software and hardware processes are more than sufficient to provide autonomous and operational payload object-to-object and file-based management solutions. The paper will provide methodologies on the software chains and tools used for the STRaND-1 smartphone computing platform, the hardware built with space qualification results (thermal, thermal vacuum, and TID radiation), and how they can be implemented in future missions.

  17. Gmat. A software tool for the computation of the rovibrational G matrix

    NASA Astrophysics Data System (ADS)

    Castro, M. E.; Niño, A.; Muñoz-Caro, C.

    2009-07-01

    Gmat is a C++ program able to compute the rovibrational G matrix in molecules of arbitrary size. This allows the building of arbitrary rovibrational Hamiltonians. In particular, the program is designed to work with the structural results of potential energy hypersurface mappings computed in computer clusters or computational Grid environments. In the present version, 1.0, the program uses internal coordinates as vibrational coordinates, with the principal axes of inertia as body-fixed system. The main design implements a complete separation of the interface and functional parts of the program. The interface part permits the automatic reading of the molecular structures from the output files of different electronic structure codes. At present, Gamess and Gaussian output files are allowed. To such an end, use is made of the object orientation polymorphism characteristic. The functional part computes numerically the derivatives of the nuclear positions with respect to the vibrational coordinates. Very accurate derivatives are obtained by using central differences embedded in a nine-level Richardson extrapolation procedure. Program summary: Program title: Gmat Catalogue identifier: AECZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 17 023 No. of bytes in distributed program, including test data, etc.: 274 714 Distribution format: tar.gz Programming language: Standard C++ Computer: All running Linux/Windows Operating system: Linux, Windows Classification: 16.2 Nature of problem: Computation of the rovibrational G matrix in molecules of any size. This allows the building of arbitrary rovibrational Hamiltonians. It must be possible to obtain the input data from the output files of standard electronic structure codes
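
    The numerical-differentiation strategy mentioned above (central differences refined by Richardson extrapolation) can be sketched in a few lines; the code below illustrates that strategy and is not an excerpt from the Gmat source.

```python
# Illustrative sketch: central differences refined by a Richardson table.
import math

def richardson_derivative(f, x, h=0.1, levels=9):
    """First derivative of f at x via central differences and Richardson extrapolation."""
    table = [[(f(x + h / 2**i) - f(x - h / 2**i)) / (2 * h / 2**i)] for i in range(levels)]
    for j in range(1, levels):
        for i in range(levels - j):
            factor = 4.0 ** j
            # combine a finer and a coarser estimate to cancel the h^(2j) error term
            table[i].append((factor * table[i + 1][j - 1] - table[i][j - 1]) / (factor - 1.0))
    return table[0][-1]

print(richardson_derivative(math.sin, 1.0), math.cos(1.0))   # the two values should agree closely
```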

  18. Automated metastatic brain lesion detection: a computer aided diagnostic and clinical research tool

    NASA Astrophysics Data System (ADS)

    Devine, Jeremy; Sahgal, Arjun; Karam, Irene; Martel, Anne L.

    2016-03-01

    The accurate localization of brain metastases in magnetic resonance (MR) images is crucial for patients undergoing stereotactic radiosurgery (SRS) to ensure that all neoplastic foci are targeted. Computer automated tumor localization and analysis can improve both of these tasks by eliminating inter and intra-observer variations during the MR image reading process. Lesion localization is accomplished using adaptive thresholding to extract enhancing objects. Each enhancing object is represented as a vector of features which includes information on object size, symmetry, position, shape, and context. These vectors are then used to train a random forest classifier. We trained and tested the image analysis pipeline on 3D axial contrast-enhanced MR images with the intention of localizing the brain metastases. In our cross validation study and at the most effective algorithm operating point, we were able to identify 90% of the lesions at a precision rate of 60%.
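
    The final classification stage described above can be sketched as follows: candidate enhancing objects become feature vectors and are scored by a random forest. Feature extraction itself is omitted and the arrays are synthetic stand-ins, so this is an illustration of the approach rather than the authors' pipeline.

```python
# Hedged sketch: scoring candidate objects with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 5))          # 5 per-object features (toy data)
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] > 0.8).astype(int)   # 1 = metastasis (synthetic labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

X_new = rng.normal(size=(10, 5))             # candidates from a new scan
scores = clf.predict_proba(X_new)[:, 1]      # probability each candidate is a lesion
print(np.round(scores, 2))
```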

  19. Computer assisted 3D pre-operative planning tool for femur fracture orthopedic surgery

    NASA Astrophysics Data System (ADS)

    Gamage, Pavan; Xie, Sheng Quan; Delmas, Patrice; Xu, Wei Liang

    2010-02-01

    Femur shaft fractures are caused by high impact injuries and can affect gait functionality if not treated correctly. Until recently, the pre-operative planning for femur fractures has relied on two-dimensional (2D) radiographs, light boxes, tracing paper, and transparent bone templates. The recent availability of digital radiographic equipment has to some extent improved the workflow for preoperative planning. Nevertheless, imaging is still in 2D X-rays and planning/simulation tools to support fragment manipulation and implant selection are still not available. Direct three-dimensional (3D) imaging modalities such as Computed Tomography (CT) are also still restricted to a minority of complex orthopedic procedures. This paper proposes a software tool which allows orthopedic surgeons to visualize, diagnose, plan and simulate femur shaft fracture reduction procedures in 3D. The tool utilizes frontal and lateral 2D radiographs to model the fracture surface, separate a generic bone into the two fractured fragments, identify the pose of each fragment, and automatically customize the shape of the bone. The use of 3D imaging allows full spatial inspection of the fracture providing different views through the manipulation of the interactively reconstructed 3D model, and ultimately better pre-operative planning.

  20. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We define and put at the disposal of the community SOAP, Spot Oscillation And Planet, a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests with previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several extensions of its capabilities that could be implemented to study the next challenges in the exoplanetary field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap
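
    The sketch below is a heavily simplified, hedged illustration of the physics SOAP models, not SOAP's code: a rotating, limb-darkened star with one dark equatorial spot, sampled over a rotation to obtain the spot-induced flux dip and radial-velocity signal. All parameter values are arbitrary.

```python
# Hedged toy model: flux and flux-weighted RV of a star with one dark equatorial spot.
import numpy as np

def spot_signatures(vsini=3.0, spot_lon_deg=0.0, spot_radius=0.1,
                    contrast=0.7, limb_u=0.6, n_grid=201, n_phases=50):
    """Returns rotation phases, relative flux, and spot-induced RV (km/s)."""
    x, y = np.meshgrid(np.linspace(-1, 1, n_grid), np.linspace(-1, 1, n_grid))
    on_disk = x**2 + y**2 <= 1.0
    mu = np.sqrt(np.clip(1.0 - x**2 - y**2, 0.0, None))
    intensity = np.where(on_disk, 1.0 - limb_u * (1.0 - mu), 0.0)   # linear limb darkening
    vel = vsini * x                                                 # solid-body rotation, i = 90 deg
    phases = np.linspace(0.0, 1.0, n_phases)
    flux, rv = [], []
    for phase in phases:
        lon = np.deg2rad(spot_lon_deg) + 2.0 * np.pi * phase
        xs, zs = np.sin(lon), np.cos(lon)            # spot centre on the equator
        spotted = intensity.copy()
        if zs > 0.0:                                 # spot on the visible hemisphere
            in_spot = (x - xs)**2 + y**2 <= spot_radius**2   # crude projected spot shape
            spotted[in_spot & on_disk] *= (1.0 - contrast)
        flux.append(spotted.sum())
        rv.append((spotted * vel).sum() / spotted.sum())
    flux = np.array(flux)
    return phases, flux / flux.max(), np.array(rv)

phases, flux, rv = spot_signatures()
print(f"photometric dip ~ {100 * (1 - flux.min()):.2f}%, "
      f"RV semi-amplitude ~ {1000 * (rv.max() - rv.min()) / 2:.1f} m/s")
```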

  1. An APEL Tool Based CPU Usage Accounting Infrastructure for Large Scale Computing Grids

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Novales, Cristina Del Cano; Mathieu, Gilles; Casson, John; Rogers, William; Gordon, John

    The APEL (Accounting Processor for Event Logs) is the fundamental tool for the CPU usage accounting infrastructure deployed within the WLCG and EGEE Grids. In these Grids, jobs are submitted by users to computing resources via a Grid Resource Broker (e.g. gLite Workload Management System). As a log processing tool, APEL interprets Grid gatekeeper logs (e.g. globus) and batch system logs (e.g. PBS, LSF, SGE and Condor) to produce CPU job accounting records identified with Grid identities. These records provide a complete description of the usage of computing resources by users' jobs. APEL publishes accounting records into an accounting record repository at a Grid Operations Centre (GOC) for access from a GUI web tool. The functions of log file parsing, record generation and publication are implemented by the APEL Parser, APEL Core, and APEL Publisher components respectively. Within the distributed accounting infrastructure, accounting records are transported from APEL Publishers at Grid sites to either a regionalised accounting system or the central one by choice via a common ActiveMQ message broker network. This provides an open transport layer for other accounting systems to publish relevant accounting data to a central accounting repository via a unified interface provided by an APEL Publisher, and will also give regional/National Grid Initiative (NGI) Grids flexibility in their choice of accounting system. The robust and secure delivery of accounting record messages at an NGI level and between NGI accounting instances and the central one is achieved by using configurable APEL Publishers and an ActiveMQ message broker network.

  2. N2A: a computational tool for modeling from neurons to algorithms.

    PubMed

    Rothganger, Fredrick; Warrender, Christina E; Trumbo, Derek; Aimone, James B

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing ("Moore's law") to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation.

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development (JAD) group in preparing a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. Computational fluid dynamics as a design tool for the hot gas manifold of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Ziebarth, J. P.; Barson, S.; Rosen, R.

    1986-01-01

    The paper discusses the application of computational fluid dynamics as a design tool for the Hot Gas Manifold of the Space Shuttle Main Engine. An improved Hot Gas Manifold configuration was arrived at computationally. This configuration was then built and air flow tested. Testing verified this configuration to be a substantial improvement over existing flight designs.

  5. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogenous distributed systems, including local, batch farms, opportunistic clusters and

  6. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  7. A tool for computing time-dependent permeability reduction of fractured volcanic conduit margins.

    NASA Astrophysics Data System (ADS)

    Farquharson, Jamie; Wadsworth, Fabian; Heap, Michael; Baud, Patrick

    2016-04-01

    Laterally-oriented fractures within volcanic conduit margins are thought to play an important role in tempering eruption explosivity by allowing magmatic volatiles to outgas. The permeability of a fractured conduit margin - the equivalent permeability - can be modelled as the sum of permeability contributions of the edifice host rock and the fracture(s) within it. We present here a flexible MATLAB® tool which computes the time-dependent equivalent permeability of a volcanic conduit margin containing ash-filled fractures. The tool is designed so that the end-user can define a wide range of input parameters to yield equivalent permeability estimates for their application. The time-dependence of the equivalent permeability is incorporated by considering permeability decrease as a function of porosity loss in the ash-filled fractures due to viscous sintering (after Russell and Quane, 2005), which is in turn dependent on the depth and temperature of each fracture and the crystal-content of the magma (all user-defined variables). The initial viscosity of the granular material filling the fracture is dependent on the water content (Hess and Dingwell, 1996), which is computed assuming equilibrium depth-dependent water content (Liu et al., 2005). Crystallinity is subsequently accounted for by employing the particle-suspension rheological model of Mueller et al. (2010). The user then defines the number of fractures, their widths, and their depths, and the lengthscale of interest (e.g. the length of the conduit). Using these data, the combined influence of transient fractures on the equivalent permeability of the conduit margin is then calculated by adapting a parallel-plate flow model (developed by Baud et al., 2012 for porous sandstones), for host rock permeabilities from 10-11 to 10-22 m2. The calculated values of porosity and equivalent permeability with time for each host rock permeability is then output in text and worksheet file formats. We introduce two dimensionless
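
    The equivalent-permeability idea described above can be sketched as an area-weighted combination of host-rock permeability and parallel-plate ("cubic law") fracture permeability. The code below is such a sketch, not the MATLAB tool itself, and reduces sintering-driven fracture closure to an assumed exponential decay purely for illustration.

```python
# Hedged sketch: equivalent permeability of a fractured conduit margin.
import numpy as np

def equivalent_permeability(k_host, widths, length):
    """Area-weighted permeability (m^2) of a margin slice of given length (m)
    containing parallel-plate fractures of the given apertures (m)."""
    widths = np.asarray(widths)
    k_fracture = widths**2 / 12.0               # cubic law for an open slot
    host_fraction = (length - widths.sum()) / length
    return host_fraction * k_host + (widths / length * k_fracture).sum()

# Fracture apertures shrink as the ash fill sinters (toy exponential closure law)
initial_widths = np.array([1e-3, 5e-4])         # m
for t_hours in np.linspace(0.0, 10.0, 6):
    w = initial_widths * np.exp(-t_hours / 4.0) # assumed closure timescale of 4 h
    print(f"t = {t_hours:4.1f} h  k_eq = {equivalent_permeability(1e-15, w, 10.0):.3e} m^2")
```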

  8. A computer modeling methodology and tool for assessing design concepts for the Space Station Data Management System

    NASA Technical Reports Server (NTRS)

    Jones, W. R.

    1986-01-01

    A computer modeling tool is being developed to assess candidate designs for the Space Station Data Management System (DMS). The DMS is to be a complex distributed computer system including the processor, storage devices, local area networks, and software that will support all processing functions onboard the Space Station. The modeling tool will allow a candidate design for the DMS, or for other subsystems that use the DMS, to be evaluated in terms of parameters. The tool and its associated modeling methodology are intended for use by DMS and subsystem designers to perform tradeoff analyses between design concepts using varied architectures and technologies.

  9. Computational Fluid Dynamics (CFD) as surgical planning tool: a pilot study on middle turbinate resection

    PubMed Central

    Zhao, Kai; Malhotra, Prashant; Rosen, David; Dalton, Pamela; Pribitkin, Edmund A

    2014-01-01

    Controversies exist regarding the resection or preservation of the middle turbinate (MT) during functional endoscopic sinus surgery (FESS). Any MT resection will perturb nasal airflow and may affect the mucociliary dynamics of the osteomeatal complex. Neither rhinometry nor computed tomography (CT) can adequately quantify nasal airflow pattern changes following surgery. This study explores the feasibility of assessing changes in nasal airflow dynamics following partial MT resection using computational fluid dynamics (CFD) techniques. We retrospectively converted the pre- and post-operative CT scans of a patient who underwent isolated partial MT concha bullosa resection into anatomically accurate three-dimensional numerical nasal models. Pre- and post-surgery nasal airflow simulations showed that the partial MT resection resulted in a shift of regional airflow towards the area of MT removal with a resultant decreased airflow velocity, decreased wall shear stress and increased local air pressure. However, the resection did not strongly affect the overall nasal airflow patterns, flow distributions in other areas of the nose, or the odorant uptake rate to the olfactory cleft mucosa. Moreover, CFD predicted the patient's failure to perceive an improvement in his unilateral nasal obstruction following surgery. Accordingly, CFD techniques can be used to predict changes in nasal airflow dynamics following partial MT resection. However, the functional implications of this analysis await further clinical studies. Nevertheless, such techniques may potentially provide a quantitative evaluation of surgical effectiveness and may prove useful in preoperatively modeling the effects of surgical interventions. PMID:25312372

  10. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MOD-FLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM)REFERENCES [1]. Herrera Ismael and George F. Pinder, Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial, Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  11. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts change after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To guarantee a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.

  12. Cone beam computed tomography imaging as a primary diagnostic tool for computer-guided surgery and CAD-CAM interim removable and fixed dental prostheses.

    PubMed

    Charette, Jyme R; Goldberg, Jack; Harris, Bryan T; Morton, Dean; Llop, Daniel R; Lin, Wei-Shao

    2016-08-01

    This article describes a digital workflow using cone beam computed tomography imaging as the primary diagnostic tool in the virtual planning of the computer-guided surgery and fabrication of a maxillary interim complete removable dental prosthesis and mandibular interim implant-supported complete fixed dental prosthesis with computer-aided design and computer-aided manufacturing technology. Diagnostic impressions (conventional or digital) and casts are unnecessary in this proposed digital workflow, providing clinicians with an alternative treatment in the indicated clinical scenario. PMID:27086108

  13. Feasibility study for application of the compressed-sensing framework to interior computed tomography (ICT) for low-dose, high-accurate dental x-ray imaging

    NASA Astrophysics Data System (ADS)

    Je, U. K.; Cho, H. M.; Cho, H. S.; Park, Y. O.; Park, C. K.; Lim, H. W.; Kim, K. S.; Kim, G. A.; Park, S. Y.; Woo, T. H.; Choi, S. I.

    2016-02-01

    In this paper, we propose a new/next-generation type of CT examinations, the so-called Interior Computed Tomography (ICT), which may presumably lead to dose reduction to the patient outside the target region-of-interest (ROI), in dental x-ray imaging. Here an x-ray beam from each projection position covers only a relatively small ROI containing a target of diagnosis from the examined structure, leading to imaging benefits such as decreasing scatters and system cost as well as reducing imaging dose. We considered the compressed-sensing (CS) framework, rather than common filtered-backprojection (FBP)-based algorithms, for more accurate ICT reconstruction. We implemented a CS-based ICT algorithm and performed a systematic simulation to investigate the imaging characteristics. Simulation conditions of two ROI ratios of 0.28 and 0.14 between the target and the whole phantom sizes and four projection numbers of 360, 180, 90, and 45 were tested. We successfully reconstructed ICT images of substantially high image quality by using the CS framework even with few-view projection data, still preserving sharp edges in the images.
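
    As a generic, hedged illustration of the compressed-sensing ingredient referred to above, the sketch below recovers a sparse vector from few linear measurements with ISTA (iterative soft thresholding). It is not the authors' ICT reconstruction, which operates on CT projection data with a physical system model.

```python
# Hedged sketch: sparse recovery from few measurements via ISTA.
import numpy as np

def ista(A, b, lam=0.05, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                          # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
b = A @ x_true

x_hat = ista(A, b)
print("largest recovered entries:", sorted(np.flatnonzero(np.abs(x_hat) > 0.1)))
print("true support:             ", sorted(np.flatnonzero(x_true)))
```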

  14. A binary image reconstruction technique for accurate determination of the shape and location of metal objects in x-ray computed tomography.

    PubMed

    Wang, Jing; Xing, Lei

    2010-01-01

    The presence of metals in patients causes streaking artifacts in X-ray CT and has been recognized as a problem that limits various applications of CT imaging. Accurate localization of metals in CT images is a critical step for metal artifacts reduction in CT imaging and many practical applications of CT images. The purpose of this work is to develop a method of auto-determination of the shape and location of metallic object(s) in the image space. The proposed method is based on the fact that when a metal object is present in a patient, a CT image can be divided into two prominent components: high density metal and low density normal tissues. This prior knowledge is incorporated into an objective function as the regularization term whose role is to encourage the solution to take a form of two intensity levels. A computer simulation study and four experimental studies are performed to evaluate the proposed approach. Both simulation and experimental studies show that the presented algorithm works well even in the presence of complicated shaped metal objects. For a hexagonally shaped metal embedded in a water phantom, for example, it is found that the accuracy of metal reconstruction is within sub-millimeter.
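
    The paper's method solves a regularized reconstruction problem; the sketch below only illustrates the underlying two-intensity-level prior with a much cruder surrogate, a two-class (metal versus tissue) intensity split on a toy image.

```python
# Illustrative surrogate only: two-level intensity split (1-D k-means, k = 2).
import numpy as np

def two_level_mask(image, n_iter=50):
    """Split pixels into low/high intensity classes and return the high (metal) mask."""
    lo, hi = float(image.min()), float(image.max())
    for _ in range(n_iter):
        threshold = 0.5 * (lo + hi)
        lo = image[image <= threshold].mean()
        hi = image[image > threshold].mean()
    return image > threshold

rng = np.random.default_rng(0)
img = rng.normal(0.02, 0.005, size=(64, 64))                 # toy soft-tissue attenuation
img[20:28, 30:38] = rng.normal(0.5, 0.02, size=(8, 8))       # toy metal block
mask = two_level_mask(img)
print("metal pixels found:", int(mask.sum()), "(true: 64)")
```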

  15. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    NASA Astrophysics Data System (ADS)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

    This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation with the GPU architecture, hence solutions are generated at orders of magnitude faster than the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  16. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    PubMed

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article.

  17. Computational Fluid Dynamics-Icing: a Predictive Tool for In-Flight Icing Risk Management

    NASA Astrophysics Data System (ADS)

    Zeppetelli, Danial

    In-flight icing is a hazard that continues to afflict the aviation industry, despite all the research and efforts to mitigate the risks. The recurrence of these types of accidents has given renewed impetus to the development of advanced analytical predictive tools to study both the accretion of ice on aircraft components in flight, and the aerodynamic consequences of such ice accumulations. In this work, an in-depth analysis of the occurrence of in-flight icing accidents and incidents was conducted to identify high-risk flight conditions. To investigate these conditions more thoroughly, a computational fluid dynamics model of a representative airfoil was developed to recreate experiments from the icing wind tunnel that occurred in controlled flight conditions. The ice accumulations and resulting aerodynamic performance degradations of the airfoil were computed for a range of pitch angles and flight speeds. These simulations revealed substantial performance losses such as reduced maximum lift and decreased stall angle. From these results, an icing hazard analysis tool was developed, using risk management principles, to evaluate the dangers of in-flight icing for a specific aircraft based on the atmospheric conditions it is expected to encounter, as well as the effectiveness of aircraft certification procedures. This method is then demonstrated through the simulation of in-flight icing scenarios based on real flight data from accidents and incidents. The risk management methodology is applied to the results of the simulations and the predicted performance degradation is compared to recorded aircraft performance characteristics at the time of the occurrence. The aircraft performance predictions and resulting risk assessment are found to correspond strongly to the pilot's comments as well as to the severity of the incident.

  18. SOAP: A Tool for the Fast Computation of Photometry and Radial Velocity Induced by Stellar Spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.; Figueira, P.

    2013-04-01

    Dark spots and bright plages are present on the surface of dwarf stars from spectral types F to M, even in their low-activity phase (like the Sun). Their appearance and disappearance on the stellar photosphere, combined with the stellar rotation, may lead to errors and uncertainties in the characterization of planets both in radial velocity (RV) and photometry. Spot Oscillation and Planet (SOAP) is a tool offered to the community that makes it possible to simulate spots and plages on rotating stars and computes their impact on RV and photometric measurements. This tool will help to understand the challenges related to the knowledge of stellar activity for the next decade: detecting telluric planets in the habitable zone of their stars (from G to M dwarfs), understanding the activity at the low-mass end of the M dwarfs (on which future projects, like SPIRou or CARMENES, will focus), limitations to the characterization of exoplanetary atmospheres (from the ground or with Spitzer, JWST), and the search for planets around young stars. These cases can be simulated with SOAP in order to search for indices and corrections for the effect of activity.

  19. A computational tool for preoperative breast augmentation planning in aesthetic plastic surgery.

    PubMed

    Georgii, Joachim; Eder, Maximilian; Bürger, Kai; Klotz, Sebastian; Ferstl, Florian; Kovacs, Laszlo; Westermann, Rüdiger

    2014-05-01

    Breast augmentation was the most commonly performed cosmetic surgery procedure in 2011 in the United States. Although aesthetically pleasing surgical results can only be achieved if the correct breast implant is selected from a large variety of different prosthesis sizes and shapes available on the market, surgeons still rely on visual assessment and other subjective approaches for operative planning because of lacking objective evaluation tools. In this paper, we present the development of a software prototype for augmentation mammaplasty simulation solely based on 3-D surface scans, from which patient-specific finite-element models are generated in a semiautomatic process. The finite-element model is used to preoperatively simulate the expected breast shapes using physical soft-tissue mechanics. Our approach uses a novel mechanism based on so-called displacement templates, which, for a specific implant shape and position, describe the respective internal body forces. Due to a highly efficient numerical solver we can provide immediate visual feedback of the simulation results, and thus, the software prototype can be integrated smoothly into the medical workflow. The clinical value of the developed 3-D computational tool for aesthetic breast augmentation surgery planning is demonstrated in patient-specific use cases. PMID:24132029

  20. Using Brain–Computer Interfaces and Brain-State Dependent Stimulation as Tools in Cognitive Neuroscience

    PubMed Central

    Jensen, Ole; Bahramisharif, Ali; Oostenveld, Robert; Klanke, Stefan; Hadjipapas, Avgis; Okazaki, Yuka O.; van Gerven, Marcel A. J.

    2011-01-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain–computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds the promise as a tool for aiding the disabled and for augmenting human performance. While technical developments obviously are important, we will here argue that new insight gained from cognitive neuroscience can be used to identify signatures of neural activation which reliably can be modulated by the subject at will. This review will focus mainly on oscillatory activity in the alpha band which is strongly modulated by changes in covert attention. Besides developing BCIs for their traditional purpose, they might also be used as a research tool for cognitive neuroscience. There is currently a strong interest in how brain-state fluctuations impact cognition. These state fluctuations are partly reflected by ongoing oscillatory activity. The functional role of the brain state can be investigated by introducing stimuli in real-time to subjects depending on the actual state of the brain. This principle of brain-state dependent stimulation may also be used as a practical tool for augmenting human behavior. In conclusion, new approaches based on online analysis of ongoing brain activity are currently in rapid development. These approaches are amongst others informed by new insight gained from electroencephalography/magnetoencephalography studies in cognitive neuroscience and hold the promise of providing new ways for investigating the brain at work. PMID:21687463
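
    As a hedged sketch of the kind of control signal such BCIs use, the code below estimates alpha-band (8-12 Hz) power from a short synthetic EEG segment with Welch's method and turns it into a binary decision; real systems work on calibrated, multichannel MEG/EEG streams, and the threshold here is an arbitrary assumption.

```python
# Hedged sketch: relative alpha-band power as a simple brain-state signal.
import numpy as np
from scipy.signal import welch

fs = 250.0                                      # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
eeg = 4e-6 * np.sin(2 * np.pi * 10 * t) + 1e-6 * np.random.randn(t.size)   # toy single channel

freqs, psd = welch(eeg, fs=fs, nperseg=256)
band = (freqs >= 8) & (freqs <= 12)
relative_alpha = psd[band].sum() / psd.sum()    # fraction of power in the alpha band

state_flag = relative_alpha > 0.5               # arbitrary, uncalibrated threshold
print(f"relative alpha power = {relative_alpha:.2f}, state flag = {state_flag}")
```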

  1. Use of whole-genus genome sequence data to develop a multilocus sequence typing tool that accurately identifies Yersinia isolates to the species and subspecies levels.

    PubMed

    Hall, Miquette; Chattaway, Marie A; Reuter, Sandra; Savin, Cyril; Strauch, Eckhard; Carniel, Elisabeth; Connor, Thomas; Van Damme, Inge; Rajakaruna, Lakshani; Rajendram, Dunstan; Jenkins, Claire; Thomson, Nicholas R; McNally, Alan

    2015-01-01

    The genus Yersinia is a large and diverse bacterial genus consisting of human-pathogenic species, a fish-pathogenic species, and a large number of environmental species. Recently, the phylogenetic and population structure of the entire genus was elucidated through the genome sequence data of 241 strains encompassing every known species in the genus. Here we report the mining of this enormous data set to create a multilocus sequence typing-based scheme that can identify Yersinia strains to the species level to a level of resolution equal to that for whole-genome sequencing. Our assay is designed to be able to accurately subtype the important human-pathogenic species Yersinia enterocolitica to whole-genome resolution levels. We also report the validation of the scheme on 386 strains from reference laboratory collections across Europe. We propose that the scheme is an important molecular typing system to allow accurate and reproducible identification of Yersinia isolates to the species level, a process often inconsistent in nonspecialist laboratories. Additionally, our assay is the most phylogenetically informative typing scheme available for Y. enterocolitica.
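
    The general mechanics of a multilocus sequence typing lookup (allele sequences mapped to allele numbers, and the allele combination mapped to a sequence type) can be sketched as below. The loci, alleles, and profile table are invented for illustration and are not the published Yersinia scheme.

```python
# Hypothetical, minimal MLST lookup sketch (not the published scheme).
ALLELE_DB = {                                   # locus -> {sequence: allele number}
    "locusA": {"ATGGCT": 1, "ATGGCA": 2},
    "locusB": {"TTGACC": 1, "TTGACT": 2},
    "locusC": {"CAGGAA": 1, "CAGGAG": 2},
}
PROFILES = {                                    # allele combination -> sequence type
    (1, 1, 1): "ST-1",
    (2, 2, 1): "ST-2",
}

def sequence_type(isolate_seqs):
    """isolate_seqs: dict mapping locus -> allele sequence for one isolate."""
    alleles = tuple(ALLELE_DB[locus][isolate_seqs[locus]] for locus in sorted(ALLELE_DB))
    return alleles, PROFILES.get(alleles, "novel sequence type")

isolate = {"locusA": "ATGGCA", "locusB": "TTGACT", "locusC": "CAGGAA"}
print(sequence_type(isolate))                   # -> ((2, 2, 1), 'ST-2')
```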

  2. Use of Whole-Genus Genome Sequence Data To Develop a Multilocus Sequence Typing Tool That Accurately Identifies Yersinia Isolates to the Species and Subspecies Levels

    PubMed Central

    Hall, Miquette; Chattaway, Marie A.; Reuter, Sandra; Savin, Cyril; Strauch, Eckhard; Carniel, Elisabeth; Connor, Thomas; Van Damme, Inge; Rajakaruna, Lakshani; Rajendram, Dunstan; Jenkins, Claire; Thomson, Nicholas R.

    2014-01-01

    The genus Yersinia is a large and diverse bacterial genus consisting of human-pathogenic species, a fish-pathogenic species, and a large number of environmental species. Recently, the phylogenetic and population structure of the entire genus was elucidated through the genome sequence data of 241 strains encompassing every known species in the genus. Here we report the mining of this enormous data set to create a multilocus sequence typing-based scheme that can identify Yersinia strains to the species level to a level of resolution equal to that for whole-genome sequencing. Our assay is designed to be able to accurately subtype the important human-pathogenic species Yersinia enterocolitica to whole-genome resolution levels. We also report the validation of the scheme on 386 strains from reference laboratory collections across Europe. We propose that the scheme is an important molecular typing system to allow accurate and reproducible identification of Yersinia isolates to the species level, a process often inconsistent in nonspecialist laboratories. Additionally, our assay is the most phylogenetically informative typing scheme available for Y. enterocolitica. PMID:25339391

  3. Stimulated dual-band infrared computed tomography: A tool to inspect the aging infrastructure

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-06-27

    The authors have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. The system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. They conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. The dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness and type. The authors quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, they conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. They determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. They demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally-heated bridge decks.

  4. Stimulated dual-band infrared computed tomography: a tool to inspect the aging infrastructure

    NASA Astrophysics Data System (ADS)

    DelGrande, Nancy; Durbin, Philip F.

    1995-09-01

    We have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. Our system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. We conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. Our dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness, and type. We quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, we conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. We determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. We demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally heated bridge decks.

  5. A least-squares computational "tool kit". Nuclear data and measurements series

    SciTech Connect

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first, and these are applied to the development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed, and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision, and an approach that can be pursued in developing data analysis packages directed toward special applications.
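
    The LSIOD and GLSIOD codes themselves are not reproduced here; the short sketch below only illustrates the generalized least-squares solution the report describes, for a linear model y = A p with a measurement covariance matrix V. The design matrix, data values, and covariance are invented for the example.

```python
import numpy as np

def generalized_least_squares(A, y, V):
    """Estimate p in the linear model y = A @ p given the covariance
    matrix V of the measurements y; returns estimates and their covariance."""
    Vinv = np.linalg.inv(V)
    cov_p = np.linalg.inv(A.T @ Vinv @ A)   # parameter covariance matrix
    p_hat = cov_p @ A.T @ Vinv @ y          # generalized least-squares estimates
    return p_hat, cov_p

# Toy example (invented numbers): fit a straight line to correlated data.
x = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
y = np.array([1.1, 2.9, 5.2, 6.8])
V = 0.10 * np.eye(4) + 0.02                 # correlated measurement covariance
p_hat, cov_p = generalized_least_squares(A, y, V)
print("estimates:", p_hat, "uncertainties:", np.sqrt(np.diag(cov_p)))
```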

  6. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
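
    As a minimal illustration of components 1 and 2 above, the sketch below multiplies two local factors (a prior and a Gaussian likelihood) over a discretized single unknown and normalizes the product to obtain a posterior. The variable, the observation, and all numerical values are invented; real hydrological graphs involve many coupled variables and the distributed inference algorithms of component 3.

```python
import numpy as np

# Discretize a single unknown (e.g., a runoff coefficient) on a grid.
theta = np.linspace(0.0, 1.0, 201)
d_theta = theta[1] - theta[0]

# Factor 1: prior belief about theta (a local probabilistic relation).
prior = np.exp(-0.5 * ((theta - 0.4) / 0.2) ** 2)

# Factor 2: likelihood of an observed runoff ratio given theta,
# assuming Gaussian measurement noise (hypothetical numbers).
obs, sigma = 0.55, 0.05
likelihood = np.exp(-0.5 * ((obs - theta) / sigma) ** 2)

# The joint distribution is the product of the factors; normalizing over
# the grid gives the posterior that expresses theta given the observation.
posterior = prior * likelihood
posterior /= posterior.sum() * d_theta

mean = (theta * posterior).sum() * d_theta
print(f"posterior mean of theta: {mean:.3f}")
```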

  7. New Geodesy MATLAB Tools To Compute Earth Tides And Analyze Strain Data

    NASA Astrophysics Data System (ADS)

    Sievers, C.; Hodgkinson, K. M.; Mencin, D.

    2012-12-01

    UNAVCO is developing two new geodesy MATLAB tools for the community: one is a translation of SPOTL [Agnew, 2012] for tidal predictions, the other processes and analyzes borehole strainmeter data. Processing borehole strainmeter data from raw data to a useful time series involves numerous steps and meticulous record keeping: counts need to be converted to strain, trends such as load tides, atmospheric response, and long time-scale instrument response have to be accounted for, and spurious data points and offsets need to be removed. We have created a MATLAB GUI (graphical user interface) tool that seamlessly accomplishes all these tasks. We employ CleanStrain+ [Langbein, 2010], a FORTRAN program, to estimate the offsets in the data. Although solved via a least-squares technique, CleanStrain+ factors in the temporally correlated nature of strain data. When the amplitude and phase of the main tidal constituents are known, the tidal signal can be removed using the MATLAB version of SPOTL. The user has the option of applying offsets, choosing tidal models and borehole trends provided as Level 2 EarthScope Data Products. All this, including loading and saving the edits, is done through a single GUI. SPOTL is a FORTRAN code suite used to predict ocean load and solid Earth body tides at a location and compute tidal time series over a user-specified time span and sample interval. We converted both the code and the tidal models to MATLAB to make the package more portable and easier to use. While the code is primarily designed to be run from the command line, we have built a front-end GUI that can compute most tides and time series and helps visualize the results. The user can specify a region of interest, load station coordinates, specify global and regional ocean load models, and select specific tides using pull-down menus and input boxes. The interactive nature and the visualization aspect of this GUI could make it useful as a teaching tool for understanding tides.
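
    Neither SPOTL nor CleanStrain+ is reproduced here; the sketch below only illustrates the kind of processing step described above: fitting a linear trend plus the M2 and O1 tidal constituents to a strain series by least squares and removing them. The synthetic series, amplitudes, and noise level are invented for the example.

```python
import numpy as np

# Hypothetical hourly strain series: tidal signal plus drift and noise.
t = np.arange(0, 30 * 24.0)                      # hours over 30 days
f_m2, f_o1 = 1 / 12.4206, 1 / 25.8193            # M2 and O1 frequencies (1/hr)
rng = np.random.default_rng(0)
strain = (5.0 * np.cos(2 * np.pi * f_m2 * t + 0.3)
          + 2.0 * np.cos(2 * np.pi * f_o1 * t - 1.1)
          + 0.01 * t + rng.normal(0, 0.5, t.size))

# Design matrix: offset, linear trend, and a cosine/sine pair per constituent.
cols = [np.ones_like(t), t]
for f in (f_m2, f_o1):
    cols += [np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, strain, rcond=None)
amp_m2 = np.hypot(coef[2], coef[3])              # recovered M2 amplitude
residual = strain - A @ coef                     # detided, detrended series
print(f"estimated M2 amplitude: {amp_m2:.2f}")
```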

  8. An Interactive Tool for Outdoor Computer Controlled Cultivation of Microalgae in a Tubular Photobioreactor System

    PubMed Central

    Dormido, Raquel; Sánchez, José; Duro, Natividad; Dormido-Canto, Sebastián; Guinaldo, María; Dormido, Sebastián

    2014-01-01

    This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize the biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations. PMID:24662450
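
    The Almería PBR model itself is not shown here; the sketch below is only a minimal illustration of the kind of growth dynamics such a simulator integrates, namely a Monod-type, light-limited biomass balance driven by an idealized diurnal irradiance profile. All parameter values and the irradiance function are invented for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative light-limited microalgae growth model (not the PBR model
# used in the virtual laboratory; all parameters are invented).
mu_max, K_I, k_d = 0.06, 120.0, 0.005      # 1/h, uE/m2/s, 1/h

def irradiance(t_hours):
    """Idealized diurnal solar irradiance profile (uE/m2/s)."""
    return max(0.0, 1000.0 * np.sin(np.pi * (t_hours % 24.0) / 12.0))

def growth(t, y):
    X = y[0]                               # biomass concentration (g/L)
    I = irradiance(t)
    mu = mu_max * I / (K_I + I)            # Monod-type light limitation
    return [(mu - k_d) * X]

sol = solve_ivp(growth, (0, 24 * 7), [0.2], max_step=0.25)
print(f"biomass after one simulated week: {sol.y[0, -1]:.2f} g/L")
```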

  9. The use of computer tools by the elderly of a center of reference and citizenship for the elderly.

    PubMed

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Paranhos, Wana Yeda; Leite, Maria Madalena Januário; Prado, Cláudia; Kurcgant, Paulina; Tronchin, Daisy Maria Rizatto; Melleiro, Marta Maria

    2011-12-01

    The objective of this exploratory, descriptive study was to identify the use of computer tools by a group of elderly users of a Center of Reference and Citizenship for the Elderly in the city of São Paulo. Among the 55 subjects, it was found that 33 (60.0%) have a computer at home, 42 (76.4%) reported having taken a computer course; 22 (58.2%) have been using the computer for less than two years, and 40 (85.5%) use the tool for up to two hours a day. The most used communication tools were: e-mail (41; 75.0%), instant messaging (25; 45.0%), and dating websites (17; 31.0%). The reported purposes for using technology tools were: to update and obtain information, for research, for fun, and to talk to relatives and friends. In conclusion, nurses should be aware of this technological profile that is being outlined among the elderly population and search for ways to include computer tools in the care provided to this group.

  10. Computational Protein Design: Validation and Possible Relevance as a Tool for Homology Searching and Fold Recognition

    PubMed Central

    Schmidt am Busch, Marcel; Sedano, Audrey; Simonson, Thomas

    2010-01-01

    Background Protein fold recognition usually relies on a statistical model of each fold; each model is constructed from an ensemble of natural sequences belonging to that fold. A complementary strategy may be to employ sequence ensembles produced by computational protein design. Designed sequences can be more diverse than natural sequences, possibly avoiding some limitations of experimental databases. Methodology/Principal Findings We explore this strategy for four SCOP families: Small Kunitz-type inhibitors (SKIs), Interleukin-8 chemokines, PDZ domains, and large Caspase catalytic subunits, represented by 43 structures. An automated procedure is used to redesign the 43 proteins. We use the experimental backbones as fixed templates in the folded state and a molecular mechanics model to compute the interaction energies between sidechain and backbone groups. Calculations are done with the Proteins@Home volunteer computing platform. A heuristic algorithm is used to scan the sequence and conformational space, yielding 200,000–300,000 sequences per backbone template. The results confirm and generalize our earlier study of SH2 and SH3 domains. The designed sequences resemble moderately distant, natural homologues of the initial templates; e.g., the SUPERFAMILY profile hidden Markov model library recognizes 85% of the low-energy sequences as native-like. Conversely, Position Specific Scoring Matrices derived from the sequences can be used to detect natural homologues within the SwissProt database: 60% of known PDZ domains are detected and around 90% of known SKIs and chemokines. Energy components and inter-residue correlations are analyzed and ways to improve the method are discussed. Conclusions/Significance For some families, designed sequences can be a useful complement to experimental ones for homologue searching. However, improved tools are needed to extract more information from the designed profiles before the method can be of general use. PMID:20463972
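
    As a minimal illustration of the scoring-matrix step described above (not the authors' pipeline), the sketch below builds a log-odds position-specific scoring matrix from a toy set of equal-length "designed" sequences and uses it to score candidate sequences. The alignment, background frequency, and pseudocount are invented.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {a: i for i, a in enumerate(AA)}

def build_pssm(aligned_seqs, background=0.05, pseudocount=1.0):
    """Log-odds PSSM (positions x 20 amino acids) from equal-length sequences."""
    L = len(aligned_seqs[0])
    counts = np.full((L, len(AA)), pseudocount)
    for seq in aligned_seqs:
        for pos, aa in enumerate(seq):
            counts[pos, IDX[aa]] += 1
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs / background)

def score(seq, pssm):
    """Sum of per-position log-odds scores for a candidate sequence."""
    return sum(pssm[pos, IDX[aa]] for pos, aa in enumerate(seq))

# Toy "designed" alignment (illustrative sequences only).
designed = ["MKVLA", "MRVLA", "MKILG", "MKVIA"]
pssm = build_pssm(designed)
print(score("MKVLA", pssm), score("GGGGG", pssm))
```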

  11. The DOE Accelerated Strategic Computing Initiative: Enabling the tools for predictive materials modeling and simulation

    NASA Astrophysics Data System (ADS)

    Mailhiot, Christian

    1997-08-01

    The objective of the DOE Science-Based Stockpile Stewardship (SBSS) program is to ensure confidence in the performance, safety, and reliability of the U.S. nuclear stockpile on the basis of a vigorous science-based approach without nuclear testing, in compliance with the comprehensive test-ban treaty. A critical element of this approach is the development of predictive, first-principles, full-physics computer simulation tools. In support of the SBSS program, the DOE has launched the Accelerated Strategic Computing Initiative (ASCI) to enable these computational developments and to promptly shift from an empirical test-based methodology to a predictive simulation-based approach. In particular, the development of advanced materials simulation capabilities to predict the effects of materials properties -- as these properties change as a result of aging and/or re-manufacturing -- on stockpile performance has explicitly been identified as one of the most critical components of the SBSS program. Consequently, the emerging SBSS program at the national laboratories presents unprecedented opportunities and challenges for solving important materials physics problems of significance to national security. A key element in the development of predictive materials simulation capabilities is the establishment of rigorous theoretical links between ab initio quantum-based descriptions at the electronic and atomic levels and engineering continuum-based treatments at the macroscopic scale. These links can be established through the identification of the appropriate degrees of freedom which determine the materials response. Applications that illustrate the use of advanced materials simulation methods for predicting the thermodynamic and mechanical properties of materials, and that help bridge the length-scale gap between different levels of description, will be presented.

  12. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
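
    The paper's algorithm families are not reproduced here; the sketch below shows a standard single-step explicit scheme of the same general class (Lax-Wendroff, second order in both space and time) applied to 1D linear advection on a periodic grid, simply to fix ideas. Grid size, pulse shape, and CFL number are arbitrary choices.

```python
import numpy as np

def lax_wendroff_step(u, c, dt, dx):
    """One single-step explicit update of u_t + c u_x = 0 on a periodic grid
    (second order in both space and time)."""
    nu = c * dt / dx                      # CFL number, must satisfy |nu| <= 1
    up = np.roll(u, -1)                   # u_{j+1}
    um = np.roll(u, +1)                   # u_{j-1}
    return u - 0.5 * nu * (up - um) + 0.5 * nu**2 * (up - 2 * u + um)

# Advect a Gaussian pulse once around a periodic unit domain.
N, c = 200, 1.0
x = np.linspace(0, 1, N, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx / c
u0 = np.exp(-200 * (x - 0.5) ** 2)
u = u0.copy()
for _ in range(int(round(1.0 / (c * dt)))):   # exactly one period
    u = lax_wendroff_step(u, c, dt, dx)
print("max error after one period:", np.abs(u - u0).max())
```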

  13. Computer-Based Tools for Inquiry in Undergraduate Classrooms: Results from the VGEE

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Bramer, D. J.; Elliott, D.; Hay, K. E.; Mallaiahgari, L.; Marlino, M. R.; Middleton, D.; Ramamurhty, M. K.; Scheitlin, T.; Weingroff, M.; Wilhelmson, R.; Yoder, J.

    2002-05-01

    The Visual Geophysical Exploration Environment (VGEE) is a suite of computer-based tools designed to help learners connect observable, large-scale geophysical phenomena to underlying physical principles. Technologically, this connection is mediated by Java-based interactive tools: a multi-dimensional visualization environment, authentic scientific data-sets, concept models that illustrate fundamental physical principles, and an interactive web-based work management system for archiving and evaluating learners' progress. Our preliminary investigations showed, however, that the tools alone are not sufficient to empower undergraduate learners; learners have trouble in organizing inquiry and using the visualization tools effectively. To address these issues, the VGEE includes an inquiry strategy and scaffolding activities that are similar to strategies used successfully in K-12 classrooms. The strategy is organized around the steps: identify, relate, explain, and integrate. In the first step, students construct visualizations from data to try to identify salient features of a particular phenomenon. They compare their previous conceptions of a phenomenon to the data in order to examine their current knowledge and motivate investigation. Next, students use the multivariable functionality of the visualization environment to relate the different features they identified. Explain moves the learner temporarily outside the visualization to the concept models, where they explore fundamental physical principles. Finally, in integrate, learners use these fundamental principles within the visualization environment by literally placing the concept model within the visualization environment as a probe and watching it respond to larger-scale patterns. This capability, unique to the VGEE, addresses the disconnect that novice learners often experience between fundamental physics and observable phenomena. It also allows learners the opportunity to reflect on and refine their knowledge as well as

  14. A Computational Tool for the Microstructure Optimization of a Polymeric Heart Valve Prosthesis.

    PubMed

    Serrani, M; Brubert, J; Stasiak, J; De Gaetano, F; Zaffora, A; Costantino, M L; Moggridge, G D

    2016-06-01

    Styrene-based block copolymers are promising materials for the development of a polymeric heart valve prosthesis (PHV), and the mechanical properties of these polymers can be tuned via the manufacturing process, orienting the cylindrical domains to achieve material anisotropy. The aim of this work is the development of a computational tool for the optimization of the material microstructure in a new PHV intended for aortic valve replacement to enhance the mechanical performance of the device. An iterative procedure was implemented to orient the cylinders along the maximum principal stress direction of the leaflet. A numerical model of the leaflet was developed, and the polymer mechanical behavior was described by a hyperelastic anisotropic constitutive law. A custom routine was implemented to align the cylinders with the maximum principal stress direction in the leaflet for each iteration. The study was focused on valve closure, since during this phase the fibrous structure of the leaflets must bear the greatest load. The optimal microstructure obtained by our procedure is characterized by mainly circumferential orientation of the cylinders within the valve leaflet. An increase in the radial strain and a decrease in the circumferential strain due to the microstructure optimization were observed. Also, a decrease in the maximum value of the strain energy density was found in the case of optimized orientation; since the strain energy density is a widely used criterion to predict elastomer's lifetime, this result suggests a possible increase of the device durability if the polymer microstructure is optimized. The present method represents a valuable tool for the design of a new anisotropic PHV, allowing the investigation of different designs, materials, and loading conditions.
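
    The finite element routine itself is not reproduced; the sketch below only illustrates the core step of the iterative procedure described above: extracting the maximum principal stress direction from an in-plane stress tensor and relaxing the cylinder axis toward it. The stress values and relaxation factor are invented.

```python
import numpy as np

def max_principal_direction(sigma):
    """Unit vector along the maximum principal stress of a symmetric
    2x2 (in-plane) stress tensor."""
    vals, vecs = np.linalg.eigh(sigma)     # eigenvalues in ascending order
    return vecs[:, np.argmax(vals)]

def reorient(current_dir, sigma, relaxation=0.5):
    """One iteration: rotate the cylinder axis part-way toward the
    maximum principal stress direction (relaxation damps oscillations)."""
    target = max_principal_direction(sigma)
    if np.dot(target, current_dir) < 0:    # principal directions have no sign
        target = -target
    new_dir = (1 - relaxation) * current_dir + relaxation * target
    return new_dir / np.linalg.norm(new_dir)

# Toy in-plane stress state at one leaflet point (hypothetical values, kPa).
sigma = np.array([[120.0, 35.0],
                  [35.0, 60.0]])
d = np.array([1.0, 0.0])                   # initial cylinder orientation
for _ in range(10):
    d = reorient(d, sigma)
print("converged cylinder direction:", d)
```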

  15. A Computational Tool for the Microstructure Optimization of a Polymeric Heart Valve Prosthesis.

    PubMed

    Serrani, M; Brubert, J; Stasiak, J; De Gaetano, F; Zaffora, A; Costantino, M L; Moggridge, G D

    2016-06-01

    Styrene-based block copolymers are promising materials for the development of a polymeric heart valve prosthesis (PHV), and the mechanical properties of these polymers can be tuned via the manufacturing process, orienting the cylindrical domains to achieve material anisotropy. The aim of this work is the development of a computational tool for the optimization of the material microstructure in a new PHV intended for aortic valve replacement to enhance the mechanical performance of the device. An iterative procedure was implemented to orient the cylinders along the maximum principal stress direction of the leaflet. A numerical model of the leaflet was developed, and the polymer mechanical behavior was described by a hyperelastic anisotropic constitutive law. A custom routine was implemented to align the cylinders with the maximum principal stress direction in the leaflet for each iteration. The study was focused on valve closure, since during this phase the fibrous structure of the leaflets must bear the greatest load. The optimal microstructure obtained by our procedure is characterized by mainly circumferential orientation of the cylinders within the valve leaflet. An increase in the radial strain and a decrease in the circumferential strain due to the microstructure optimization were observed. Also, a decrease in the maximum value of the strain energy density was found in the case of optimized orientation; since the strain energy density is a widely used criterion to predict elastomer's lifetime, this result suggests a possible increase of the device durability if the polymer microstructure is optimized. The present method represents a valuable tool for the design of a new anisotropic PHV, allowing the investigation of different designs, materials, and loading conditions. PMID:27018454

  16. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of
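
    Rainbow's own scripts are not reproduced here; the sketch below is a generic illustration of one of the listed improvements, splitting a large FASTQ file into fixed-size chunks so that downstream alignment jobs can be distributed across workers. The file name and chunk size are hypothetical.

```python
import gzip
from itertools import islice

def split_fastq(path, reads_per_chunk=1_000_000):
    """Split a (possibly gzipped) FASTQ file into numbered chunk files so
    that downstream alignment jobs can be balanced across workers."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as handle:
        chunk = 0
        while True:
            # Each FASTQ record is exactly 4 lines.
            lines = list(islice(handle, 4 * reads_per_chunk))
            if not lines:
                break
            out = f"{path}.chunk{chunk:04d}.fastq"
            with open(out, "w") as fh:
                fh.writelines(lines)
            chunk += 1
    return chunk

# Hypothetical usage:
# n_chunks = split_fastq("sample_R1.fastq.gz", reads_per_chunk=500_000)
```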

  17. INTRODUCING CAFein, A NEW COMPUTATIONAL TOOL FOR STELLAR PULSATIONS AND DYNAMIC TIDES

    SciTech Connect

    Valsecchi, F.; Farr, W. M.; Willems, B.; Rasio, F. A.; Kalogera, V.

    2013-08-10

    Here we present CAFein, a new computational tool for investigating radiative dissipation of dynamic tides in close binaries and of non-adiabatic, non-radial stellar oscillations in isolated stars in the linear regime. For the latter, CAFein computes the non-adiabatic eigenfrequencies and eigenfunctions of detailed stellar models. The code is based on the so-called Riccati method, a numerical algorithm that has been successfully applied to a variety of stellar pulsators, and which does not suffer from the major drawbacks of commonly used shooting and relaxation schemes. Here we present an extension of the Riccati method to investigate dynamic tides in close binaries. We demonstrate CAFein's capabilities as a stellar pulsation code both in the adiabatic and non-adiabatic regimes, by reproducing previously published eigenfrequencies of a polytrope, and by successfully identifying the unstable modes of a stellar model in the β Cephei/SPB region of the Hertzsprung-Russell diagram. Finally, we verify CAFein's behavior in the dynamic tides regime by investigating the effects of dynamic tides on the eigenfunctions and orbital and spin evolution of massive main sequence stars in eccentric binaries, and of hot Jupiter host stars. The plethora of asteroseismic data provided by NASA's Kepler satellite, some of which include the direct detection of tidally excited stellar oscillations, make CAFein quite timely. Furthermore, the increasing number of observed short-period detached double white dwarfs (WDs) and the observed orbital decay in the tightest of such binaries open up a new possibility of investigating WD interiors through the effects of tides on their orbital evolution.

  18. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications, and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1) that will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  19. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  20. Assessment of computational tools for MRI RF dosimetry by comparison with measurements on a laboratory phantom.

    PubMed

    Bottauscio, O; Cassarà, A M; Hand, J W; Giordano, D; Zilberti, L; Borsero, M; Chiampi, M; Weidemann, G

    2015-07-21

    This paper presents an extended comparison between numerical simulations using the different computational tools employed nowadays in electromagnetic dosimetry and measurements of radiofrequency (RF) electromagnetic field distributions in phantoms with tissue-simulating liquids at 64 MHz, 128 MHz and 300 MHz, adopting a customized experimental setup. The aim is to quantify the overall reliability and accuracy of RF dosimetry approaches at frequencies in use in magnetic resonance imaging transmit coils. Measurements are compared against four common techniques used for electromagnetic simulations, i.e. the finite difference time domain (FDTD), the finite integration technique (FIT), the boundary element method (BEM) and the hybrid finite element method-boundary element method (FEM-BEM) approaches. It is shown that FDTD and FIT produce similar results, which generally are also in good agreement with those of FEM-BEM. On the contrary, BEM seems to perform less well than the other methods and shows numerical convergence problems in the presence of metallic objects. Maximum uncertainties of about 30% (coverage factor k = 2) can be attributed to measurements regarding electric and magnetic field amplitudes. Discrepancies between simulations and experiments are found to be in the range from 10% to 30%. These values confirm other previously published results of experimental validations performed on a limited set of data and define the accuracy of our measurement setup.
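
    To fix ideas about the first of the compared methods, the sketch below is a minimal 1D free-space FDTD (Yee) update with a soft Gaussian source; it is purely illustrative and far simpler than the phantom simulations discussed above. Grid size, cell size, and source parameters are arbitrary.

```python
import numpy as np

# Minimal 1D free-space FDTD (Yee) scheme: Ez and Hy on staggered grids.
eps0, mu0 = 8.8541878128e-12, 4e-7 * np.pi
c0 = 1.0 / np.sqrt(eps0 * mu0)

nz, nsteps = 400, 700
dz = 1e-3                          # 1 mm cells (arbitrary)
dt = 0.5 * dz / c0                 # Courant number 0.5 for stability

Ez = np.zeros(nz)
Hy = np.zeros(nz - 1)

for n in range(nsteps):
    # Leapfrog update: H from the curl of E, then E from the curl of H.
    Hy += dt / (mu0 * dz) * (Ez[1:] - Ez[:-1])
    Ez[1:-1] += dt / (eps0 * dz) * (Hy[1:] - Hy[:-1])
    # Soft Gaussian source near the left boundary (hypothetical pulse).
    Ez[20] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak |Ez| on the grid after propagation:", np.abs(Ez).max())
```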

  1. Unraveling the Web of Viroinformatics: Computational Tools and Databases in Virus Research

    PubMed Central

    Priyadarshini, Pragya; Vrati, Sudhanshu

    2014-01-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain—viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. PMID:25428870

  2. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    NASA Astrophysics Data System (ADS)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by The American Association of Physicists in Medicine, Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  3. Local Perturbation Analysis: A Computational Tool for Biophysical Reaction-Diffusion Models

    PubMed Central

    Holmes, William R.; Mata, May Anne; Edelstein-Keshet, Leah

    2015-01-01

    Diffusion and interaction of molecular regulators in cells is often modeled using reaction-diffusion partial differential equations. Analysis of such models and exploration of their parameter space is challenging, particularly for systems of high dimensionality. Here, we present a relatively simple and straightforward analysis, the local perturbation analysis, that reveals how parameter variations affect model behavior. This computational tool, which greatly aids exploration of the behavior of a model, exploits a structural feature common to many cellular regulatory systems: regulators are typically either bound to a membrane or freely diffusing in the interior of the cell. Using well-documented, readily available bifurcation software, the local perturbation analysis tracks the approximate early evolution of an arbitrarily large perturbation of a homogeneous steady state. In doing so, it provides a bifurcation diagram that concisely describes various regimes of the model’s behavior, reducing the need for exhaustive simulations to explore parameter space. We explain the method and provide detailed step-by-step guides to its use and application. PMID:25606671
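
    As a minimal illustration of the idea (not the authors' software), the sketch below applies the local perturbation reduction to a wave-pinning-type model with a slow, membrane-bound species u and a fast, well-mixed cytosolic species v: the local perturbation u_loc evolves under the global v but is assumed too small to feed back on the global pools. Parameter values and initial conditions are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative LPA reduction of a wave-pinning-type GTPase model:
# u = slow (membrane-bound) form, v = fast (cytosolic) form.
k0, gamma, K, delta = 0.067, 1.0, 1.0, 1.0   # hypothetical parameters

def f(u, v):
    """Net activation rate of the membrane-bound form."""
    return v * (k0 + gamma * u**2 / (K**2 + u**2)) - delta * u

def lpa_rhs(t, y):
    u_loc, u_glob, v_glob = y
    # The local perturbation is too small to change the global pools,
    # while the fast species v is treated as globally well mixed.
    return [f(u_loc, v_glob), f(u_glob, v_glob), -f(u_glob, v_glob)]

sol = solve_ivp(lpa_rhs, (0, 100), [2.0, 0.3, 2.0], rtol=1e-8)
u_loc, u_glob = sol.y[0, -1], sol.y[1, -1]
print(f"local branch: {u_loc:.3f}, global branch: {u_glob:.3f}")
# If the two branches settle at different values, the perturbation has
# grown, which LPA interprets as a potential pattern-forming regime.
```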

  4. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    PubMed

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. PMID:25428870

  5. Cone beam computed tomography (CBCT) as a tool for the analysis of nonhuman skeletal remains in a medico-legal setting.

    PubMed

    Lucena, Joaquin; Mora, Esther; Rodriguez, Lucia; Muñoz, Mariela; Cantin, Mario G; Fonseca, Gabriel M

    2016-09-01

    To confirm the nature and forensic significance of questioned skeletal material submitted to a medico-legal setting is a relatively common procedure, although not without difficulties when the remains are fragmented or burned. Different methodologies have been described for this purpose, many of them invasive, time- and money-consuming, or dependent on the availability of the analytical instrument. We present a case in which skeletal material with unusual conditions of preservation and curious discovery was sent to a medico-legal setting to determine its human/nonhuman origin. A combined strategy of imaging procedures (macroscopic, radiographic and cone beam computed tomography - CBCT - technology) was performed as a non-invasive and rapid method to assess the nonhuman nature of the material, specifically of pig (Sus scrofa) origin. This hypothesis was later confirmed by DNA analysis. CBCT data sets provide accurate three-dimensional reconstructions, which demonstrates the reliable use of CBCT as a forensic tool. PMID:27372746

  6. Cone beam computed tomography (CBCT) as a tool for the analysis of nonhuman skeletal remains in a medico-legal setting.

    PubMed

    Lucena, Joaquin; Mora, Esther; Rodriguez, Lucia; Muñoz, Mariela; Cantin, Mario G; Fonseca, Gabriel M

    2016-09-01

    To confirm the nature and forensic significance of questioned skeletal material submitted to a medico-legal setting is a relatively common procedure, although not without difficulties when the remains are fragmented or burned. Different methodologies have been described for this purpose, many of them invasive, time- and money-consuming, or dependent on the availability of the analytical instrument. We present a case in which skeletal material with unusual conditions of preservation and curious discovery was sent to a medico-legal setting to determine its human/nonhuman origin. A combined strategy of imaging procedures (macroscopic, radiographic and cone beam computed tomography - CBCT - technology) was performed as a non-invasive and rapid method to assess the nonhuman nature of the material, specifically of pig (Sus scrofa) origin. This hypothesis was later confirmed by DNA analysis. CBCT data sets provide accurate three-dimensional reconstructions, which demonstrates the reliable use of CBCT as a forensic tool.

  7. GMXPBSA 2.0: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2014-11-01

    GMXPBSA 2.0 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, to automatically calculate binding free energies for protein-protein or ligand-protein complexes. GMXPBSA 2.0 is flexible and can easily be customized to specific needs. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. Calculations require only protein-protein or protein-ligand MD simulations. GMXPBSA 2.0 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes, and comparison of the results with the wild-type system. Moreover, it compares the binding free energies of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinities to the same receptor utilizing MD simulations of proteins in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including molecular mechanics (MM), electrostatic contribution to solvation (PB) and nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulations software GROMACS and the Poisson-Boltzmann equation solver APBS. All the calculations can be performed in a single or distributed automatic fashion on a cluster facility in order to speed up the calculation by dividing frames across the available processors. The program is freely available under the GPL license. Catalogue identifier: AETQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AETQ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing
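
    The GROMACS and APBS invocations themselves are not shown; the sketch below only illustrates the bookkeeping the suite automates, combining per-frame MM, PB, and SA terms into average free energies and a binding free energy. All energy values are invented, and the closing comment indicates where computational alanine scanning would repeat the same arithmetic on mutant trajectories.

```python
import numpy as np

def mean_free_energy(frames):
    """Average per-frame free energy, each frame given as (MM, PB, SA)."""
    return np.mean([sum(frame) for frame in frames])

def binding_free_energy(complex_frames, receptor_frames, ligand_frames):
    """dG_bind = <G_complex> - <G_receptor> - <G_ligand>."""
    return (mean_free_energy(complex_frames)
            - mean_free_energy(receptor_frames)
            - mean_free_energy(ligand_frames))

# Hypothetical per-frame (MM, PB, SA) energies in kJ/mol for two frames.
complex_frames  = [(-5200.0, 1450.0, -48.0), (-5185.0, 1440.0, -47.5)]
receptor_frames = [(-3900.0, 1100.0, -30.0), (-3890.0, 1095.0, -29.8)]
ligand_frames   = [(-1250.0,  380.0, -12.0), (-1245.0,  382.0, -12.1)]

dG_wt = binding_free_energy(complex_frames, receptor_frames, ligand_frames)
print(f"dG_bind(wild type) = {dG_wt:.1f} kJ/mol")
# Computational alanine scanning repeats the same bookkeeping on the
# mutant complex/receptor frames and reports ddG = dG_mut - dG_wt.
```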

  8. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  9. The Use of Interactive Computer Animations Based on POE as a Presentation Tool in Primary Science Teaching

    ERIC Educational Resources Information Center

    Akpinar, Ercan

    2014-01-01

    This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…

  10. Effects of Online Interaction via Computer-Mediated Communication (CMC) Tools on an E-Mathematics Learning Outcome

    ERIC Educational Resources Information Center

    Okonta, Olomeruom

    2010-01-01

    Recent research studies in open and distance learning have focused on the differences between traditional learning versus online learning, the benefits of computer-mediated communication (CMC) tools in an e-learning environment, and the relationship between online discussion posts and students' achievement. In fact, there is an extant…

  11. A Practitioner Model of the Use of Computer-Based Tools and Resources to Support Mathematics Teaching and Learning.

    ERIC Educational Resources Information Center

    Ruthven, Kenneth; Hennessy, Sara

    2002-01-01

    Analyzes the pedagogical ideas underpinning teachers' accounts of the successful use of computer-based tools and resources to support the teaching and learning of mathematics. Organizes central themes to form a pedagogical model capable of informing the use of such technologies in classroom teaching and generating theoretical conjectures for…

  12. Innovation Configuration Mapping as a Professional Development Tool: The Case of One-to-One Laptop Computing

    ERIC Educational Resources Information Center

    Towndrow, Phillip A.; Fareed, Wan

    2015-01-01

    This article illustrates how findings from a study of teachers' and students' uses of laptop computers in a secondary school in Singapore informed the development of an Innovation Configuration (IC) Map--a tool for identifying and describing alternative ways of implementing innovations based on teachers' unique feelings, preoccupations, thoughts…

  13. Creating Your Own Interactive Computer-Based Algebra Teaching Tools: A No Programming Zone

    ERIC Educational Resources Information Center

    McPherson, Ronald F.; Tyson, Velma

    2006-01-01

    In this article the reader will be able to download four spreadsheet tools that interactively relate symbolic and graphical representations of four different functions and learn how to create tools for other functions. These tools uniquely display the symbolic functional representation exactly as found in textbooks. Five complete lesson activities…

  14. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    NASA Astrophysics Data System (ADS)

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  15. GMXPBSA 2.1: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2015-01-01

    GMXPBSA 2.1 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, to automatically calculate binding free energies for protein-protein or ligand-protein complexes [R.T. Bradshaw et al., Protein Eng. Des. Sel. 24 (2011) 197-207]. GMXPBSA 2.1 is flexible and can easily be customized to specific needs and it is an improvement of the previous GMXPBSA 2.0 [C. Paissoni et al., Comput. Phys. Commun. (2014), 185, 2920-2929]. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. Calculations require only protein-protein or protein-ligand MD simulations. GMXPBSA 2.1 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes and comparison of the results with the wild-type system. Moreover, it compares the binding free energy of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinities to the same receptor utilizing MD simulations of proteins in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including molecular mechanics (MM), electrostatic contribution to solvation (PB) and nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulations software GROMACS [S. Pronk et al., Bioinformatics 29 (2013) 845-854] and the Poisson-Boltzmann equation solver APBS [N.A. Baker et al., Proc. Natl. Acad. Sci. U.S.A 98 (2001) 10037-10041]. All the calculations can be performed in single or distributed automatic fashion on a cluster facility in order to increase the

  16. Development of an innovative spacer grid model utilizing computational fluid dynamics within a subchannel analysis tool

    NASA Astrophysics Data System (ADS)

    Avramova, Maria

    In the past few decades the need for improved nuclear reactor safety analyses has led to a rapid development of advanced methods for multidimensional thermal-hydraulic analyses. These methods have become progressively more complex in order to account for the many physical phenomena anticipated during steady state and transient Light Water Reactor (LWR) conditions. The advanced thermal-hydraulic subchannel code COBRA-TF (Thurgood, M. J. et al., 1983) is used worldwide for best-estimate evaluations of the nuclear reactor safety margins. In the framework of a joint research project between the Pennsylvania State University (PSU) and AREVA NP GmbH, the theoretical models and numerics of COBRA-TF have been improved. Under the name F-COBRA-TF, the code has been subjected to an extensive verification and validation program and has been applied to a variety of LWR steady state and transient simulations. To enable F-COBRA-TF for industrial applications, including safety margins evaluations and design analyses, the code spacer grid models were revised and substantially improved. The state-of-the-art in the modeling of the spacer grid effects on the flow thermal-hydraulic performance in rod bundles employs numerical experiments performed by computational fluid dynamics (CFD) calculations. Because of the involved computational cost, the CFD codes cannot yet be used for full bundle predictions, but their capabilities can be utilized for development of more advanced and sophisticated models for subchannel-level analyses. A subchannel code, equipped with improved physical models, can then be a powerful tool for LWR safety and design evaluations. The unique contributions of this PhD research are seen as the development, implementation, and qualification of an innovative spacer grid model by utilizing CFD results within the framework of a subchannel analysis code. Usually, the spacer grid models are mostly related to modeling of the entrainment and deposition phenomena and the heat

  17. General theoretical/computational tool for interpreting NMR spin relaxation in proteins.

    PubMed

    Zerbetto, Mirco; Polimeno, Antonino; Meirovitch, Eva

    2009-10-15

    In recent years we developed the slowly relaxing local structure (SRLS) approach for analyzing NMR spin relaxation in proteins. SRLS is a two-body coupled rotator model which accounts rigorously for mode-coupling between the global motion of the protein and the local motion of the spin-bearing probe and allows for general properties of the second rank tensors involved. We showed that a general tool of data analysis requires both capabilities. Several important functionalities were missing in our previous implementations of SRLS in data fitting schemes, and in some important cases, the calculations were tedious. Here we present a general implementation which allows for asymmetric local and global diffusion tensors, distinct local ordering and local diffusion frames, and features a rhombic local potential which includes Wigner matrix element terms of ranks 2 and 4. A recently developed hydrodynamics-based approach for calculating global diffusion tensors has been incorporated into the data-fitting scheme. The computational efficiency of the latter has been increased significantly through object-oriented programming within the scope of the C++ programming language, and code parallelization. A convenient graphical user interface is provided. Currently, autocorrelated 15N spin relaxation data can be analyzed effectively. Adaptation to any autocorrelated and cross-correlated relaxation analysis is straightforward. New physical insight is gleaned on largely preserved local structure in solution, even in chain segments which experience slow local motion. Prospects associated with improved dynamic models, and new applications made possible by the current implementation of SRLS, are delineated. PMID:19775101

  18. Computed tomography as a tool for percutaneous coronary intervention of chronic total occlusions.

    PubMed

    Magro, Michael; Schultz, Carl; Simsek, Cihan; Garcia-Garcia, Hector M; Regar, Evelyn; Nieman, Koen; Mollet, Nico; Serruys, Patrick W; van Geuns, Robert-Jan

    2010-05-01

    Chronic total occlusions (CTO) constitute a major challenge in percutaneous coronary revascularisation (PCI). The development of new interventional strategies, the availability of purpose made tools including dedicated catheters and wires, as well as increasing expertise by the operators, have contributed to the modest success rates which today hover around 75%. Case selection is of utmost importance since failure of this high risk procedure with its typically high radiation doses, high contrast doses and increased complication rates is associated with long term adverse events. Imaging of the coronary arteries using the gold standard of invasive coronary angiography allows characterisation of the chronic total occlusion and is often able to predict the probability of successful recanalisation. Multislice computed tomography (MSCT) is increasingly being utilised as a non-invasive diagnostic imaging modality to detect coronary artery disease. Its ability to provide information on the soft tissue (including plaque) surrounding the lumen has been applied to better define the morphological features of CTOs. In fact, the amount of calcification, tortuosity and actual length of the occluded segment which are established predictors of success, are all better characterised by MSCT. Three dimensional reconstruction of the coronary anatomy and its integration with two dimensional fluoroscopy images during the actual CTO-PCI procedure may help to identify the best angiographic projection, offering a directional guide at the angiographically "missing segment". More technological advances are needed to optimise this multi-modality imaging integration. Whether this will result in better success rates for CTO-PCI is still the subject of ongoing research. It is then that we can evaluate the true benefit of the use of MSCT for CTO against the risk from excessive radiation associated with this strategy.

  19. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated based on the specific detail level of process or plant, i.e., 1) plant level, 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free downloads from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water

  20. The Computer as a Teaching Tool: Promising Practices. Conference Report (Cambridge, Massachusetts, July 12-13, 1984). CR85-10.

    ERIC Educational Resources Information Center

    McDonald, Joseph P.; And Others

    This report of a 1984 conference on the computer as a teaching tool provides summaries of presentations on the role of the computer in the teaching of science, mathematics, computer literacy, and language arts. Analyses of themes emerging from the conference are then presented under four headings: (1) The Computer and the Curriculum (the computer…

  1. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    NASA Astrophysics Data System (ADS)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2016-08-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing) that students apply to study how lakes around the globe are experiencing the effects of climate change. In the module, students develop hypotheses about the effects of different climate scenarios on lakes and then test their hypotheses using hundreds of model simulations. We taught the module in a 4-hour workshop and found that participation in the module significantly increased both undergraduate and graduate students' understanding about climate change effects on lakes. Moreover, participation in the module also significantly increased students' perceived experience level in using different software, technologies, and modeling tools. By embedding modeling in an environmental science context, non-computer science students were able to successfully use and master technologies that they had previously never been exposed to. Overall, our findings suggest that modeling is a powerful tool for catalyzing student learning on the effects of climate change.

  2. Computers in School: Socially Isolating or a Tool To Promote Collaboration?

    ERIC Educational Resources Information Center

    Svensson, Ann-Katrin

    2000-01-01

    Presents a study of interactions among eight-year-old students working with computers. Students' communicative interactions in front of the computer were compared with interaction occurring in other activities. Students interacted more when they were using the computer. Most of this interaction was concerned with problem-solving. Students…

  3. Development of Curricula and Materials to Teach Performance Skills Essential to Accurate Computer Assisted Transcription from Machine Shorthand Notes. Final Report.

    ERIC Educational Resources Information Center

    Honsberger, Marion M.

    This project was conducted at Edmonds Community College to develop curriculum and materials for use in teaching hands-on, computer-assisted court reporting. The final product of the project was a course with support materials designed to teach court reporting students performance skills by which each can rapidly create perfect computer-aided…

  4. GMXPBSA 2.0: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2014-11-01

    GMXPBSA 2.0 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, automatically calculating binding free energies for protein-protein or ligand-protein complexes. GMXPBSA 2.0 is flexible and can easily be customized to specific needs. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. The calculations require only protein-protein or protein-ligand MD simulations as input. GMXPBSA 2.0 performs several comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes, and comparison of the results with the wild-type system. Moreover, it compares the binding free energies of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications, or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank the relative affinities of different ligands for the same receptor using MD simulations of the protein in complex with each ligand. In order to dissect the different MM/PBSA energy contributions, including the molecular mechanics term (MM), the electrostatic contribution to solvation (PB), and the nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulation software GROMACS and the Poisson-Boltzmann equation solver APBS. All calculations can be performed in a single or distributed automatic fashion on a cluster facility in order to speed up the calculation by dividing frames across the available processors. The program is freely available under the GPL license. Catalogue identifier: AETQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AETQ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing…
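
    For readers unfamiliar with the bookkeeping that GMXPBSA automates, the following hedged Python sketch shows the basic MM/PBSA arithmetic: each species' energy is the sum of the MM, PB, and SA terms averaged over trajectory frames, and the binding free energy is the complex average minus the receptor and ligand averages. The data structures and numbers are placeholders; the real terms come from GROMACS reruns and the APBS solver, not from this code.

        # Sketch of the MM/PBSA bookkeeping (placeholder data, kJ/mol).
        def frame_energy(terms):
            """Total energy of one species in one frame: MM + PB + SA."""
            return terms["MM"] + terms["PB"] + terms["SA"]

        def binding_free_energy(complex_frames, receptor_frames, ligand_frames):
            """<G_complex> - <G_receptor> - <G_ligand>, averaged over frames."""
            avg = lambda frames: sum(map(frame_energy, frames)) / len(frames)
            return avg(complex_frames) - avg(receptor_frames) - avg(ligand_frames)

        # For computational alanine scanning, the same difference is evaluated
        # for each mutant complex and compared with the wild type:
        # ddG = dG_bind(mutant) - dG_bind(wild type).
        complex_frames  = [{"MM": -5300.0, "PB": 1200.0, "SA": -45.0}]   # placeholder
        receptor_frames = [{"MM": -3500.0, "PB": 800.0,  "SA": -30.0}]
        ligand_frames   = [{"MM": -1700.0, "PB": 350.0,  "SA": -10.0}]
        print(binding_free_energy(complex_frames, receptor_frames, ligand_frames))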

  5. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    ERIC Educational Resources Information Center

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  6. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  7. Population Dynamics P System (PDP) Models: A Standardized Protocol for Describing and Applying Novel Bio-Inspired Computing Tools

    PubMed Central

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J.

    2013-01-01

    Today, the volume of data and knowledge of processes calls for more complex models that integrate all available information. This challenge has been met thanks to technological advances in both software and hardware. The computational tools available today have allowed the development of a new family of models, known as computational models. These models are difficult to describe because they cannot be expressed analytically, so it is necessary to create protocols that serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modeling complex problems: they are characterized by the ability to work in parallel (simultaneously interrelating different processes), they are modular, and they have high computational efficiency. However, the difficulty of describing these models requires a protocol to unify their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for population dynamics and ecological process studies, briefly discussing their potential applicability for simulating complex ecosystem dynamics. PMID:23593284
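
    To make the P-system idea above more concrete, the sketch below applies rewriting rules of the form "consumed multiset -> produced multiset" to a single membrane's contents, firing each rule as many times as the remaining objects allow. This is a didactic simplification (rules are tried in a fixed order and there is only one membrane), not the full PDP formalism; the prey/predator rules are invented for illustration.

        # Toy P-system-like evolution step on one membrane's multiset of objects.
        from collections import Counter

        def apply_step(multiset, rules):
            consumed, produced = Counter(), Counter()
            for lhs, rhs in rules:                   # rules tried in order (crude priority)
                # maximum number of times this rule can fire on what is left
                times = min((multiset[o] - consumed[o]) // n for o, n in lhs.items())
                if times > 0:
                    for o, n in lhs.items():
                        consumed[o] += n * times
                    for o, n in rhs.items():
                        produced[o] += n * times
            return multiset - consumed + produced

        # e.g. predator "b" eats prey "a"; remaining prey reproduce
        rules = [(Counter({"a": 1, "b": 1}), Counter({"b": 1})),
                 (Counter({"a": 1}), Counter({"a": 2}))]
        state = Counter({"a": 10, "b": 3})
        for _ in range(3):
            state = apply_step(state, rules)
            print(dict(state))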

  8. Population Dynamics P system (PDP) models: a standardized protocol for describing and applying novel bio-inspired computing tools.

    PubMed

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J

    2013-01-01

    Today, the volume of data and knowledge of processes calls for more complex models that integrate all available information. This challenge has been met thanks to technological advances in both software and hardware. The computational tools available today have allowed the development of a new family of models, known as computational models. These models are difficult to describe because they cannot be expressed analytically, so it is necessary to create protocols that serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modeling complex problems: they are characterized by the ability to work in parallel (simultaneously interrelating different processes), they are modular, and they have high computational efficiency. However, the difficulty of describing these models requires a protocol to unify their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for population dynamics and ecological process studies, briefly discussing their potential applicability for simulating complex ecosystem dynamics.

  9. Atmospheric transmittance of an absorbing gas. 4. OPTRAN: a computationally fast and accurate transmittance model for absorbing gases with fixed and with variable mixing ratios at variable viewing angles

    NASA Astrophysics Data System (ADS)

    McMillin, L. M.; Crone, L. J.; Goldberg, M. D.; Kleespies, T. J.

    1995-09-01

    A fast and accurate method for the generation of atmospheric transmittances, optical path transmittance (OPTRAN), is described. Results from OPTRAN are compared with those produced by other currently used methods. OPTRAN produces transmittances that can be used to generate brightness temperatures that are accurate to better than 0.2 K, well over 10 times as accurate as the current methods. This is significant because it brings the accuracy of transmittance computation to a level at which it will not adversely affect atmospheric retrievals. OPTRAN is the product of an evolution of approaches developed earlier at the National Environmental Satellite, Data, and Information Service. A major feature of OPTRAN that contributes to its accuracy is that transmittance is obtained as a function of the absorber amount rather than the pressure.
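
    The distinguishing idea reported above is that effective layer absorption is parameterized on fixed absorber-amount levels rather than pressure levels. As a loose, hypothetical illustration of that style of fast regression transmittance model (not the actual OPTRAN formulation), the sketch below fits per-level coefficients against temperature predictors offline, using synthetic stand-in "training" data, and then evaluates transmittance cheaply for a new profile.

        # Toy fast regression transmittance model: per-level coefficients fit
        # against predictors on fixed absorber-amount levels (synthetic data).
        import numpy as np

        rng = np.random.default_rng(0)
        levels = np.linspace(0.1, 10.0, 30)          # absorber-amount levels

        # Pretend "training" data from an accurate (e.g. line-by-line) model:
        temps = 220 + 60 * rng.random((200, 30))
        k_true = 0.02 + 1e-4 * (temps - 250)         # effective absorption coeff.
        X = np.stack([np.ones_like(temps), temps, temps ** 2], axis=-1)

        # Fit per-level regression coefficients offline.
        coef = np.stack([np.linalg.lstsq(X[:, i, :], k_true[:, i], rcond=None)[0]
                         for i in range(30)])

        def transmittance(profile_temps):
            """Fast evaluation: accumulate optical depth along the absorber path."""
            Xp = np.stack([np.ones_like(profile_temps), profile_temps,
                           profile_temps ** 2], axis=-1)
            k = np.einsum("ij,ij->i", Xp, coef)
            tau = np.cumsum(k * np.gradient(levels))
            return np.exp(-tau)

        print(transmittance(250 + 10 * rng.standard_normal(30))[:5])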

  10. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote the interest of the public in these areas. These educational tools can get benefit from new advanced computer animation software and 3D technologies, as these allow making these documentaries even more attractive. However, special care must be taken in order to guarantee that the information contained in them is serious and objective. In this sense, an additional value is given when the footage is produced by the own researchers. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been entirely developed by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are exposed here.

  11. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models on high performance computers, and, with the advent of ubiquitous multicore processors, on practically every system, has traditionally been accomplished with basic software tools: command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message-passing libraries such as MPI, and the need for hybrid programming models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared-memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view of improving PTP: we work with a set of scientific applications, each with its own challenges, both to drive improvements to the applications themselves and to expose shortcomings in Eclipse PTP from an application developer's perspective, which in turn shapes the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into…

  12. A computer controlled power tool for the servicing of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Richards, Paul W.; Konkel, Carl; Smith, Chris; Brown, Lee; Wagner, Ken

    1996-01-01

    The Hubble Space Telescope (HST) Pistol Grip Tool (PGT) is a self-contained, microprocessor controlled, battery-powered, 3/8-inch-drive hand-held tool. The PGT is also a non-powered ratchet wrench. This tool will be used by astronauts during Extravehicular Activity (EVA) to apply torque to the HST and HST Servicing Support Equipment mechanical interfaces and fasteners. Numerous torque, speed, and turn or angle limits are programmed into the PGT for use during various missions. Batteries are replaceable during ground operations, Intravehicular Activities, and EVAs.

  13. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices that students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  14. Embodying Computational Thinking: Initial Design of an Emerging Technological Learning Tool

    ERIC Educational Resources Information Center

    Daily, Shaundra B.; Leonard, Alison E.; Jörg, Sophie; Babu, Sabarish; Gundersen, Kara; Parmar, Dhaval

    2015-01-01

    This emerging technology report describes virtual environment interactions, an approach for blending movement and computer programming, as an embodied way to support girls in building computational thinking skills. The authors seek to understand how body syntonicity might enable young learners to bootstrap their intuitive knowledge in order to…

  15. A Quantitative Study of Factors Affecting Learner Acceptance of a Computer-Based Training Support Tool

    ERIC Educational Resources Information Center

    Wagner, G. Dale; Flannery, Daniele D.

    2004-01-01

    This study identifies and empirically tests factors that may influence learners' use of a computer-based training support system (TSS). The areas of research and theory were drawn from human-computer interaction, information and business management, and adult education. The factors suggested in the literature that may affect learners' use of a TSS…

  16. Computational Aero-acoustics As a Tool For Turbo-machinery Noise Reduction

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.

    2003-01-01

    This talk will provide an overview of the field of computational aero-acoustics and its use in fan noise prediction. After a brief history of computational fluid dynamics, some of the recent developments in computational aero-acoustics will be explored. Computational issues concerning sound wave production, propagation, and reflection in practical turbo-machinery applications will be discussed including: (a) High order/High Resolution Numerical Techniques. (b) High Resolution Boundary Conditions. (c) MIMD Parallel Computing. (d) Form of Governing Equations Useful for Simulations. In addition, the basic design of our Broadband Analysis Stator Simulator (BASS) code and its application to a 2-D rotor wake-stator interaction will be shown. An example of the noise produced by the wakes from a rotor impinging upon a stator cascade will be shown.

  17. Self port scanning tool : providing a more secure computing Environment through the use of proactive port scanning

    NASA Technical Reports Server (NTRS)

    Kocher, Joshua E; Gilliam, David P.

    2005-01-01

    Secure computing is a necessity in the hostile environment that the internet has become. Protection from nefarious individuals and organizations requires a solution that is more a methodology than a one-time fix. One aspect of this methodology is having the knowledge of which network ports a computer has open to the world. These network ports are essentially the doorways from the internet into the computer. An assessment method which uses the nmap software to scan ports has been developed to aid System Administrators (SAs) with analysis of open ports on their system(s). Additionally, baselines for several operating systems have been developed so that SAs can compare their open ports to a baseline for a given operating system. Further, the tool is deployed on a website where SAs and users can request a port scan of their computer. The results are then emailed to the requestor. This tool aids users, SAs, and security professionals by providing an overall picture of what services are running, what ports are open, potential Trojan programs or backdoors, and what ports can be closed.
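
    A minimal sketch of the kind of check described above, assuming nmap is installed locally: scan a host, collect the open TCP ports, and report anything not present in a per-operating-system baseline. The baseline contents are placeholders, and results are printed rather than emailed as in the deployed tool.

        # Scan a host with nmap and compare open TCP ports to a baseline.
        import re
        import subprocess

        BASELINES = {"generic-linux-server": {22, 80, 443}}   # hypothetical baseline

        def open_tcp_ports(host, port_range="1-1024"):
            out = subprocess.run(["nmap", "-p", port_range, "--open", host],
                                 capture_output=True, text=True, check=True).stdout
            return {int(m.group(1)) for m in re.finditer(r"^(\d+)/tcp\s+open", out, re.M)}

        def compare_to_baseline(host, baseline_name):
            open_ports = open_tcp_ports(host)
            unexpected = open_ports - BASELINES[baseline_name]
            return open_ports, unexpected

        if __name__ == "__main__":
            ports, extra = compare_to_baseline("127.0.0.1", "generic-linux-server")
            print("open:", sorted(ports), "not in baseline:", sorted(extra))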

  18. The Computational Science Education Reference Desk: A tool for increasing inquiry based learning in the science classroom

    NASA Astrophysics Data System (ADS)

    Joiner, D. A.; Stevenson, D. E.; Panoff, R. M.

    2000-12-01

    The Computational Science Reference Desk is an online tool designed to provide educators in math, physics, astronomy, biology, chemistry, and engineering with information on how to use computational science to enhance inquiry-based learning in the undergraduate and pre-college classroom. The Reference Desk features a showcase of original content exploration activities, including lesson plans and background materials; a catalog of websites which contain models, lesson plans, software, and instructional resources; and a forum to allow educators to communicate their ideas. Many of the recent advances in astronomy rely on the use of computer simulation, and tools are being developed by CSERD to allow students to experiment with some of the models that have guided scientific discovery. One of these models allows students to study how scientists use spectral information to determine the makeup of the interstellar medium by modeling the interstellar extinction curve using spherical grains of silicate, amorphous carbon, or graphite. Students can directly compare their model to the average interstellar extinction curve, and experiment with how small changes in their model alter the shape of the interstellar extinction curve. A simpler model allows students to visualize spatial relationships between the Earth, Moon, and Sun to understand the cause of the phases of the moon. A report on the usefulness of these models in two classes, the Computational Astrophysics workshop at The Shodor Education Foundation and the Conceptual Astronomy class at the University of North Carolina at Greensboro, will be presented.
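
    The phases-of-the-moon activity mentioned above reduces to a single geometric relation: the illuminated fraction of the lunar disk seen from Earth is (1 + cos(phase angle)) / 2, where the phase angle is the Sun-Moon-Earth angle. The short sketch below evaluates that relation; it is a toy companion to the classroom model, not CSERD code.

        # Illuminated fraction of the lunar disk from the Sun-Moon-Earth phase angle
        # (0 deg = full moon, 180 deg = new moon).
        import math

        def illuminated_fraction(phase_angle_deg):
            return (1.0 + math.cos(math.radians(phase_angle_deg))) / 2.0

        for angle in (0, 45, 90, 135, 180):
            print(f"phase angle {angle:3d} deg -> {illuminated_fraction(angle):.2f} of disk lit")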

  19. Computing Accurate Grammatical Feedback in a Virtual Writing Conference for German-Speaking Elementary-School Children: An Approach Based on Natural Language Generation

    ERIC Educational Resources Information Center

    Harbusch, Karin; Itsova, Gergana; Koch, Ulrich; Kuhner, Christine

    2009-01-01

    We built a natural language processing (NLP) system implementing a "virtual writing conference" for elementary-school children, with German as the target language. Currently, state-of-the-art computer support for writing tasks is restricted to multiple-choice questions or quizzes because automatic parsing of the often ambiguous and fragmentary…

  20. Repeatable, accurate, and high speed multi-level programming of memristor 1T1R arrays for power efficient analog computing applications

    NASA Astrophysics Data System (ADS)

    Merced-Grafals, Emmanuelle J.; Dávila, Noraica; Ge, Ning; Williams, R. Stanley; Strachan, John Paul

    2016-09-01

    Beyond use as high density non-volatile memories, memristors have potential as synaptic components of neuromorphic systems. We investigated the suitability of tantalum oxide (TaOx) transistor-memristor (1T1R) arrays for such applications, particularly the ability to accurately, repeatedly, and rapidly reach arbitrary conductance states. Programming is performed by applying an adaptive pulsed algorithm that utilizes the transistor gate voltage to control the SET switching operation and increase programming speed of the 1T1R cells. We show the capability of programming 64 conductance levels with <0.5% average accuracy using 100 ns pulses and studied the trade-offs between programming speed and programming error. The algorithm is also utilized to program 16 conductance levels on a population of cells in the 1T1R array showing robustness to cell-to-cell variability. In general, the proposed algorithm results in approximately 10× improvement in programming speed over standard algorithms that do not use the transistor gate to control memristor switching. In addition, after only two programming pulses (an initialization pulse followed by a programming pulse), the resulting conductance values are within 12% of the target values in all cases. Finally, endurance of more than 10⁶ cycles is shown through open-loop (single pulses) programming across multiple conductance levels using the optimized gate voltage of the transistor. These results are relevant for applications that require high speed, accurate, and repeatable programming of the cells such as in neural networks and analog data processing.
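
    The abstract describes an adaptive program-and-verify scheme in which the transistor gate voltage throttles the SET step. The sketch below shows a generic closed-loop version of that idea with a stand-in device model; the voltages, step sizes, and convergence criterion are placeholders rather than the published algorithm.

        # Generic closed-loop program-and-verify sketch with a stand-in 1T1R cell.
        import random

        class FakeCell:
            """Stand-in for a 1T1R cell: higher gate voltage -> larger SET step."""
            def __init__(self):
                self.g = 50e-6                                  # conductance (S)
            def set_pulse(self, v_gate):
                self.g += 10e-6 * (v_gate - 0.6) * random.uniform(0.5, 1.5)
            def reset_pulse(self):
                self.g -= 1e-6 * random.uniform(0.5, 1.5)
            def read(self):
                return self.g

        def program(cell, target, tol=0.005, max_pulses=200):
            """Pulse until the read conductance is within tol (relative) of target."""
            for pulse in range(1, max_pulses + 1):
                err = (cell.read() - target) / target
                if abs(err) <= tol:
                    return pulse, cell.read()
                if err < 0:
                    # gate voltage sets the SET step: large far from target, small near it
                    v_gate = min(1.2, 0.65 + 0.8 * abs(err))
                    cell.set_pulse(v_gate)
                else:
                    cell.reset_pulse()                          # overshoot: step back
            return max_pulses, cell.read()

        pulses, g = program(FakeCell(), target=120e-6)
        print(f"reached {g * 1e6:.1f} uS in {pulses} pulses")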