Science.gov

Sample records for accurate computational tool

  1. High-performance computing and networking as tools for accurate emission computed tomography reconstruction.

    PubMed

    Passeri, A; Formiconi, A R; De Cristofaro, M T; Pupi, A; Meldolesi, U

    1997-04-01

    It is well known that the quantitative potential of emission computed tomography (ECT) relies on the ability to compensate for resolution, attenuation and scatter effects. Reconstruction algorithms which are able to take these effects into account are highly demanding in terms of computing resources. The reported work aimed to investigate the use of a parallel high-performance computing platform for ECT reconstruction taking into account an accurate model of the acquisition of single-photon emission tomographic (SPET) data. An iterative algorithm with an accurate model of the variable system response was ported to the MIMD (Multiple Instruction Multiple Data) parallel architecture of a 64-node Cray T3D massively parallel computer. The system was organized to make it easily accessible even from low-cost PC-based workstations through standard TCP/IP networking. A complete brain study of 30 (64x64) slices could be reconstructed from a set of 90 (64x64) projections with ten iterations of the conjugate gradients algorithm in 9 s, corresponding to an actual speed-up factor of 135. This work demonstrated the possibility of exploiting remote high-performance computing and networking resources from hospital sites by means of low-cost workstations using standard communication protocols without particular problems for routine use. The achievable speed-up factors allow the assessment of the clinical benefit of advanced reconstruction techniques which require a heavy computational burden for the compensation of effects such as variable spatial resolution, scatter and attenuation. The possibility of using the same software on the same hardware platform with data acquired in different laboratories with various kinds of SPET instrumentation is appealing for software quality control and for the evaluation of the clinical impact of the reconstruction methods. PMID:9096089
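
    As an illustration of the conjugate gradients iteration mentioned in this record, the following is a minimal, generic sketch (not the authors' parallel SPET code), assuming the system model is supplied as a symmetric positive definite matrix A and the measured projections as a vector b:

    import numpy as np

    def conjugate_gradient(A, b, n_iter=10):
        # Minimal conjugate-gradient solver for A x = b (A symmetric positive definite).
        # Illustrative only: the reported reconstruction applies this kind of iteration
        # to a system model that also accounts for variable resolution, attenuation
        # and scatter, distributed over a massively parallel machine.
        x = np.zeros_like(b, dtype=float)
        r = b - A @ x              # residual
        p = r.copy()               # search direction
        rs_old = r @ r
        for _ in range(n_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x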

  2. iTagPlot: an accurate computation and interactive drawing tool for tag density plot

    PubMed Central

    Kim, Sung-Hwan; Ezenwoye, Onyeka; Cho, Hwan-Gue; Robertson, Keith D.; Choi, Jeong-Hyeon

    2015-01-01

    Motivation: Tag density plots are very important to intuitively reveal biological phenomena from capture-based sequencing data by visualizing the normalized read depth in a region. Results: We have developed iTagPlot to compute tag density across functional features in parallel using multicores and a grid engine and to interactively explore it in a graphical user interface. It allows us to stratify features by defining groups based on biological function and measurement, summary statistics and unsupervised clustering. Availability and implementation: http://sourceforge.net/projects/itagplot/. Contact: jechoi@gru.edu and jeochoi@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25792550
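
    As a rough illustration of what a tag density profile is (normalized read depth over a region), a simplified computation might look like the sketch below; the function and argument names are hypothetical and this is not the iTagPlot implementation:

    import numpy as np

    def tag_density(read_starts, read_ends, region_start, region_end, total_reads):
        # Per-base read depth over [region_start, region_end), normalized by library
        # size (reads per million). Assumes 0-based, half-open read coordinates.
        length = region_end - region_start
        depth = np.zeros(length)
        for s, e in zip(read_starts, read_ends):
            lo = max(s, region_start) - region_start
            hi = min(e, region_end) - region_start
            if hi > lo:
                depth[lo:hi] += 1
        return depth * 1e6 / total_reads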

  3. CAFE: A Computer Tool for Accurate Simulation of the Regulatory Pool Fire Environment for Type B Packages

    SciTech Connect

    Gritzo, L.A.; Koski, J.A.; Suo-Anttila, A.J.

    1999-03-16

    The Container Analysis Fire Environment computer code (CAFE) is intended to provide Type B package designers with an enhanced engulfing fire boundary condition when combined with the PATRAN/P-Thermal commercial code. Historically an engulfing fire boundary condition has been modeled as σT⁴, where σ is the Stefan-Boltzmann constant and T is the fire temperature. The CAFE code includes the necessary chemistry, thermal radiation, and fluid mechanics to model an engulfing fire. Effects included are the local cooling of gases that form a protective boundary layer that reduces the incoming radiant heat flux to values lower than expected from a simple σT⁴ model. In addition, the effect of object shape on mixing that may increase the local fire temperature is included. Both high and low temperature regions that depend upon the local availability of oxygen are also calculated. Thus the competing effects that can both increase and decrease the local values of radiant heat flux are included in a manner that is not predictable a priori. The CAFE package consists of a group of computer subroutines that can be linked to workstation-based thermal analysis codes in order to predict package performance during regulatory and other accident fire scenarios.
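
    The simple boundary condition referred to above is the blackbody radiant flux q = σT⁴; a quick numerical check of that baseline (the 800 °C figure below is the commonly cited regulatory pool fire temperature, used here only as an example):

    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def blackbody_flux(T_kelvin):
        # Radiant heat flux q = sigma * T^4 of the simple engulfing-fire model
        # that CAFE refines with chemistry, radiation and fluid mechanics.
        return SIGMA * T_kelvin ** 4

    print(blackbody_flux(1073.15))  # ~7.5e4 W/m^2 for an 800 degC fire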

  4. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  5. Hypercard Another Computer Tool.

    ERIC Educational Resources Information Center

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  6. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
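
    The kind of performance model described can be illustrated with a toy one-dimensional partition cost: the predicted step time is the slowest processor's compute time plus its communication cost. The cost terms below are illustrative assumptions, not the paper's calibrated model:

    def predicted_step_time(work_per_cell, boundaries, comm_cost):
        # Toy model of one synchronous step of a 1-D grid computation.
        # work_per_cell: per-cell computation costs; boundaries: partition points,
        # e.g. [0, 40, 100, 128] for three processors; comm_cost: cost of one
        # ghost-cell exchange with a neighbor. The slowest processor dominates.
        times = []
        for lo, hi in zip(boundaries[:-1], boundaries[1:]):
            compute = sum(work_per_cell[lo:hi])
            neighbors = (lo > boundaries[0]) + (hi < boundaries[-1])
            times.append(compute + neighbors * comm_cost)
        return max(times)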

  7. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  8. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic "machine," is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

  9. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
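
    As a generic illustration of one listed feature, the azimuthally averaged power spectrum of a square convergence map can be estimated with plain numpy as below; this is not the LensTools API, and the normalization convention shown is only one of several in use:

    import numpy as np

    def convergence_power_spectrum(kappa, side_deg, n_bins=15):
        # Flat-sky power spectrum estimate, P(l) = (A / N^4) |kappa_tilde(l)|^2,
        # azimuthally averaged in multipole bins. kappa: square 2-D map,
        # side_deg: side length in degrees.
        n = kappa.shape[0]
        side_rad = np.deg2rad(side_deg)
        area = side_rad ** 2
        power_2d = (area / n ** 4) * np.abs(np.fft.fft2(kappa)) ** 2
        lx = 2 * np.pi * np.fft.fftfreq(n, d=side_rad / n)   # multipole grid
        l_grid = np.sqrt(lx[:, None] ** 2 + lx[None, :] ** 2).ravel()
        power_flat = power_2d.ravel()
        mask = l_grid > 0
        bins = np.linspace(l_grid[mask].min(), l_grid[mask].max(), n_bins + 1)
        idx = np.clip(np.digitize(l_grid[mask], bins), 1, n_bins)
        ell = 0.5 * (bins[1:] + bins[:-1])
        p_ell = np.array([power_flat[mask][idx == i].mean() for i in range(1, n_bins + 1)])
        return ell, p_ell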

  10. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  11. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-06-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing the computing efficiency at present. In order to solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real-time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion while considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
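
    For orientation, the classical second-order Taylor interpolator below shows how a curve parameter increment is chosen so that the tool advances by the commanded feedrate each period; its residual error is the feedrate fluctuation the paper sets out to eliminate, and it is not the paper's quartic formulation:

    import numpy as np

    def next_parameter(u, feedrate, period, c1, c2):
        # Second-order Taylor estimate of the next NURBS parameter value.
        # c1(u), c2(u): first and second derivatives of the curve C(u) as vectors.
        # Desired arc length per interpolation period is s = feedrate * period.
        d1, d2 = c1(u), c2(u)
        s = feedrate * period
        norm1 = np.linalg.norm(d1)
        du = s / norm1 - (s ** 2) * np.dot(d1, d2) / (2.0 * norm1 ** 4)
        return u + du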

  12. Fast, accurate, robust and Open Source Brain Extraction Tool (OSBET)

    NASA Astrophysics Data System (ADS)

    Namias, R.; Donnelly Kehoe, P.; D'Amato, J. P.; Nagel, J.

    2015-12-01

    The removal of non-brain regions in neuroimaging is a critical preprocessing task. The skull-stripping depends on different factors, including the noise level in the image, the anatomy of the subject being scanned and the acquisition sequence. For these and other reasons, an ideal brain extraction method should be fast, accurate, user friendly, open-source and knowledge based (to allow for interaction with the algorithm in case the expected outcome is not being obtained), producing stable results and making it possible to automate the process for large datasets. There are already a large number of validated tools to perform this task but none of them meets the desired characteristics. In this paper we introduce an open-source brain extraction tool (OSBET), composed of four steps that use simple, well-known operations (optimal thresholding, binary morphology, labeling and geometrical analysis) and that aims to assemble all the desired features. We present an experiment comparing OSBET with six other state-of-the-art techniques on a publicly available dataset consisting of 40 T1-weighted 3D scans and their corresponding manually segmented images. OSBET achieved both a short runtime and excellent accuracy, obtaining the best Dice coefficient. Further validation should be performed, for instance, in unhealthy populations, to generalize its usage for clinical purposes.
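
    The four operations named in the abstract map directly onto standard scipy/scikit-image calls; a toy pipeline of that kind is sketched below. It is not the published OSBET code and omits the geometrical-analysis refinement:

    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu

    def rough_brain_mask(volume):
        # Toy skull-stripping pipeline: optimal (Otsu) threshold -> binary
        # morphology -> connected-component labeling -> keep largest component.
        mask = volume > threshold_otsu(volume)
        mask = ndimage.binary_opening(mask, iterations=2)   # break thin skull bridges
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        brain = labels == (np.argmax(sizes) + 1)
        return ndimage.binary_fill_holes(brain)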

  13. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  14. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  15. CgWind: A high-order accurate simulation tool for wind turbines and wind farms

    SciTech Connect

    Chand, K K; Henshaw, W D; Lundquist, K A; Singer, M A

    2010-02-22

    CgWind is a high-fidelity large eddy simulation (LES) tool designed to meet the modeling needs of wind turbine and wind park engineers. This tool combines several advanced computational technologies in order to model accurately the complex and dynamic nature of wind energy applications. The composite grid approach provides high-quality structured grids for the efficient implementation of high-order accurate discretizations of the incompressible Navier-Stokes equations. Composite grids also provide a natural mechanism for modeling bodies in relative motion and complex geometry. Advanced algorithms such as matrix-free multigrid, compact discretizations and approximate factorization will allow CgWind to perform highly resolved calculations efficiently on a wide class of computing resources. Also in development are nonlinear LES subgrid-scale models required to simulate the many interacting scales present in large wind turbine applications. This paper outlines our approach, the current status of CgWind and future development plans.

  16. Scallops skeletons as tools for accurate proxy calibration

    NASA Astrophysics Data System (ADS)

    Lorrain, A.; Paulet, Y.-M.; Chauvaud, L.; Dunbar, R.; Mucciarone, D.; Pécheyran, C.; Amouroux, D.; Fontugne, M.

    2003-04-01

    Bivalve skeletons can yield excellent geochemical proxies, but general calibration of those proxies has relied on an approximate time basis because growth rhythms are poorly understood. In this context, the Great scallop, Pecten maximus, appears to be a powerful tool, as a daily growth deposit has been clearly identified for this species (Chauvaud et al., 1998; Lorrain et al., 2000), allowing accurate environmental calibration. Using this species, a date can be assigned to each growth increment, and as a consequence environmental parameters can be closely compared (at a daily scale) to observed chemical and structural shell variations. This daily record provides an unequivocal basis to calibrate proxies. Isotopic (δ13C and δ15N) and trace element analyses (LA-ICP-MS) have been performed on several individuals and different years depending on the analysed parameter. Seawater parameters measured one meter above the sea bottom were compared to chemical variations in the calcitic shell. The comparison showed that even with a daily basis for data interpretation, calibration is still a challenge. Inter-individual variations are found, and correlations are not always reproducible from one year to the next. The first explanation could be an inaccurate characterization of the animal's immediate environment; notably, the water-sediment interface may better represent the environment of Pecten maximus. Secondly, physiological factors could account for those discrepancies. In particular, calcification takes place in the extrapallial fluid, whose composition might be very different from the external environment. Accurate calibration of chemical proxies should consider biological aspects to gain better insights into the processes controlling the incorporation of those chemical elements. The characterisation of the isotopic and trace element composition of the extrapallial fluid and hemolymph could greatly help our understanding of chemical shell variations.

  17. Slim hole MWD tool accurately measures downhole annular pressure

    SciTech Connect

    Burban, B.; Delahaye, T. )

    1994-02-14

    Measurement-while-drilling of downhole pressure accurately determines annular pressure losses from circulation and drillstring rotation and helps monitor swab and surge pressures during tripping. In early 1993, two slim-hole wells (3.4 in. and 3 in. diameter) were drilled with continuous real-time electromagnetic wave transmission of downhole temperature and annular pressure. The data were obtained during all stages of the drilling operation and proved useful for operations personnel. The use of real-time measurements demonstrated the characteristic hydraulic effects of pressure surges induced by drillstring rotation in the small slim-hole annulus under field conditions. The interest in this information is not restricted to the slim-hole geometry. Monitoring or estimating downhole pressure is a key element for drilling operations. Except in special cases, no real-time measurements of downhole annular pressure during drilling and tripping have been used on an operational basis. The hydraulic effects are significant in conventional-geometry wells (3 1/2-in. drill pipe in a 6-in. hole). This paper describes the tool and the results from the field test.

  18. TACT: The Action Computation Tool

    NASA Astrophysics Data System (ADS)

    Sanders, Jason L.; Binney, James

    2015-12-01

    The Action Computation Tool (TACT) tests methods for estimating actions, angles and frequencies of orbits in both axisymmetric and triaxial potentials, including general spherical potentials, analytic potentials (Isochrone and Harmonic oscillator), axisymmetric Stackel fudge, average generating function from orbit (AvGF), and others. It is written in C++; code is provided to compile the routines into a Python library. TM (ascl:1512.014) and LAPACK are required to access some features.

  19. Foundational Tools for Petascale Computing

    SciTech Connect

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  20. Efficient and accurate computation of generalized singular-value decompositions

    NASA Astrophysics Data System (ADS)

    Drmac, Zlatko

    2001-11-01

    We present a new family of algorithms for accurate floating-point computation of the singular value decomposition (SVD) of various forms of products (quotients) of two or three matrices. The main goal of such an algorithm is to compute all singular values to high relative accuracy. This means that we are seeking a guaranteed number of accurate digits even in the smallest singular values. We also want to achieve computational efficiency, while maintaining high accuracy. To illustrate, consider the SVD of the product A = B^T S C. The new algorithm uses certain preconditioning (based on diagonal scalings, the LU and QR factorizations) to replace A with A' = (B')^T S' C', where A and A' have the same singular values and the matrix A' is computed explicitly. Theoretical analysis and numerical evidence show that, in the case of full rank B, C, S, the accuracy of the new algorithm is unaffected by replacing B, S, C with, respectively, D_1 B, D_2 S D_3, D_4 C, where D_i, i = 1, ..., 4 are arbitrary diagonal matrices. As an application, the paper proposes new accurate algorithms for computing the (H,K)-SVD and (H_1,K)-SVD of S.

  1. Computers: Tools of Oppression, Tools of Liberation.

    ERIC Educational Resources Information Center

    Taylor, Jefferey H.

    This paper contends that students who are learning to use computers can benefit from having an overview of the history and social context of computers. The paper highlights some milestones in the history of computers, from ancient times to ENIAC to Altair to Bill Gates to the Internet. It also suggests some things for students to think about and…

  2. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    SciTech Connect

    Nakhleh, Luay

    2014-03-12

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  3. Neutron supermirrors: an accurate theory for layer thickness computation

    NASA Astrophysics Data System (ADS)

    Bray, Michael

    2001-11-01

    We present a new theory for the computation of Super-Mirror stacks, using accurate formulas derived from the classical optics field. Approximations are introduced into the computation, but at a later stage than existing theories, providing a more rigorous treatment of the problem. The final result is a continuous thickness stack, whose properties can be determined at the outset of the design. We find that the well-known fourth power dependence of number of layers versus maximum angle is (of course) asymptotically correct. We find a formula giving directly the relation between desired reflectance, maximum angle, and number of layers (for a given pair of materials). Note: The author of this article, a classical opticist, has limited knowledge of the Neutron world, and begs forgiveness for any shortcomings, erroneous assumptions and/or misinterpretation of previous authors' work on the subject.

  4. Accurate Computation of Survival Statistics in Genome-Wide Studies

    PubMed Central

    Vandin, Fabio; Papoutsaki, Alexandra; Raphael, Benjamin J.; Upfal, Eli

    2015-01-01

    A key challenge in genomics is to identify genetic variants that distinguish patients with different survival time following diagnosis or treatment. While the log-rank test is widely used for this purpose, nearly all implementations of the log-rank test rely on an asymptotic approximation that is not appropriate in many genomics applications. This is because the two populations determined by a genetic variant may have very different sizes, and the evaluation of many possible variants demands highly accurate computation of very small p-values. We demonstrate this problem for cancer genomics data where the standard log-rank test leads to many false positive associations between somatic mutations and survival time. We develop and analyze a novel algorithm, Exact Log-rank Test (ExaLT), that accurately computes the p-value of the log-rank statistic under an exact distribution that is appropriate for any size populations. We demonstrate the advantages of ExaLT on data from published cancer genomics studies, finding significant differences from the reported p-values. We analyze somatic mutations in six cancer types from The Cancer Genome Atlas (TCGA), finding mutations with known association to survival as well as several novel associations. In contrast, standard implementations of the log-rank test report dozens to hundreds of likely false positive associations as more significant than these known associations. PMID:25950620
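
    For reference, the asymptotic two-group log-rank statistic that ExaLT replaces with an exact computation can be written in a few lines; the sketch below implements the standard chi-squared approximation, not the exact ExaLT algorithm:

    import numpy as np

    def logrank_statistic(time, event, group):
        # Standard two-group log-rank statistic (compare to chi-squared, 1 d.o.f.).
        # time: event/censoring times; event: 1 if observed, 0 if censored;
        # group: 0/1 labels. This is the approximation whose p-values can be
        # unreliable for very unbalanced groups, as discussed in the abstract.
        time, event, group = map(np.asarray, (time, event, group))
        obs_minus_exp, var = 0.0, 0.0
        for t in np.unique(time[event == 1]):
            at_risk = time >= t
            n = at_risk.sum()
            n1 = (at_risk & (group == 1)).sum()
            d = ((time == t) & (event == 1)).sum()
            d1 = ((time == t) & (event == 1) & (group == 1)).sum()
            obs_minus_exp += d1 - d * n1 / n
            if n > 1:
                var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
        return obs_minus_exp ** 2 / var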

  5. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon; Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (next to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  6. An Accurate and Dynamic Computer Graphics Muscle Model

    NASA Technical Reports Server (NTRS)

    Levine, David Asher

    1997-01-01

    A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.

  7. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  8. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  9. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  10. A fast and accurate computational approach to protein ionization

    PubMed Central

    Spassov, Velin Z.; Yan, Lisa

    2008-01-01

    We report a very fast and accurate physics-based method to calculate pH-dependent electrostatic effects in protein molecules and to predict the pK values of individual sites of titration. In addition, a CHARMm-based algorithm is included to construct and refine the spatial coordinates of all hydrogen atoms at a given pH. The present method combines electrostatic energy calculations based on the Generalized Born approximation with an iterative mobile clustering approach to calculate the equilibria of proton binding to multiple titration sites in protein molecules. The use of the GBIM (Generalized Born with Implicit Membrane) CHARMm module makes it possible to model not only water-soluble proteins but membrane proteins as well. The method includes a novel algorithm for preliminary refinement of hydrogen coordinates. Another difference from existing approaches is that, instead of monopeptides, a set of relaxed pentapeptide structures are used as model compounds. Tests on a set of 24 proteins demonstrate the high accuracy of the method. On average, the RMSD between predicted and experimental pK values is close to 0.5 pK units on this data set, and the accuracy is achieved at very low computational cost. The pH-dependent assignment of hydrogen atoms also shows very good agreement with protonation states and hydrogen-bond network observed in neutron-diffraction structures. The method is implemented as a computational protocol in Accelrys Discovery Studio and provides a fast and easy way to study the effect of pH on many important mechanisms such as enzyme catalysis, ligand binding, protein–protein interactions, and protein stability. PMID:18714088
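
    In the single-site limit, the proton-binding equilibrium handled by the method reduces to the Henderson-Hasselbalch relation; the one-liner below is shown only for orientation and says nothing about the coupled multi-site computation itself:

    def protonated_fraction(pH, pKa):
        # Fraction of a single titratable site that is protonated at a given pH.
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))

    # Example: a histidine-like site (pKa ~ 6.5, an illustrative value) at pH 7.4
    print(protonated_fraction(7.4, 6.5))   # ~0.11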

  11. Photoacoustic computed tomography without accurate ultrasonic transducer responses

    NASA Astrophysics Data System (ADS)

    Sheng, Qiwei; Wang, Kun; Xia, Jun; Zhu, Liren; Wang, Lihong V.; Anastasio, Mark A.

    2015-03-01

    Conventional photoacoustic computed tomography (PACT) image reconstruction methods assume that the object and surrounding medium are described by a constant speed-of-sound (SOS) value. In order to accurately recover fine structures, SOS heterogeneities should be quantified and compensated for during PACT reconstruction. To address this problem, several groups have proposed hybrid systems that combine PACT with ultrasound computed tomography (USCT). In such systems, a SOS map is reconstructed first via USCT. Consequently, this SOS map is employed to inform the PACT reconstruction method. Additionally, the SOS map can provide structural information regarding tissue, which is complementary to the functional information from the PACT image. We propose a paradigm shift in the way that images are reconstructed in hybrid PACT-USCT imaging. Inspired by our observation that information about the SOS distribution is encoded in PACT measurements, we propose to jointly reconstruct the absorbed optical energy density and SOS distributions from a combined set of USCT and PACT measurements, thereby reducing the two reconstruction problems into one. This innovative approach has several advantages over conventional approaches in which PACT and USCT images are reconstructed independently: (1) Variations in the SOS will automatically be accounted for, optimizing PACT image quality; (2) The reconstructed PACT and USCT images will possess minimal systematic artifacts because errors in the imaging models will be optimally balanced during the joint reconstruction; (3) Due to the exploitation of information regarding the SOS distribution in the full-view PACT data, our approach will permit high-resolution reconstruction of the SOS distribution from sparse array data.

  12. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES Beta

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; Wang, Zhong

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.

  13. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    SciTech Connect

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; Wang, Zhong

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
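
    One of the two signals MetaBAT combines, the tetranucleotide frequency of a contig, is simple k-mer counting; a minimal illustration (not the MetaBAT code, which also integrates co-abundance across samples) is:

    from collections import Counter
    from itertools import product

    def tetranucleotide_frequency(seq):
        # Normalized 4-mer frequency vector of a contig (256 entries, fixed order).
        seq = seq.upper()
        kmers = (seq[i:i + 4] for i in range(len(seq) - 3))
        counts = Counter(k for k in kmers if set(k) <= set("ACGT"))
        total = sum(counts.values()) or 1
        return [counts["".join(p)] / total for p in product("ACGT", repeat=4)]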

  14. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  15. Groupware: A Tool for Interpersonal Computing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McLellan, Hilary

    Computer networks have provided a foundation for interpersonal computing, and new tools are emerging, the centerpiece of which is called "groupware." Groupware technology is reviewed, and the theoretical framework that will underlie interpersonal collaborative computing is discussed. Groupware can consist of hardware, software, services, and…

  16. Fully computed holographic stereogram based algorithm for computer-generated holograms with accurate depth cues.

    PubMed

    Zhang, Hao; Zhao, Yan; Cao, Liangcai; Jin, Guofan

    2015-02-23

    We propose an algorithm based on fully computed holographic stereogram for calculating full-parallax computer-generated holograms (CGHs) with accurate depth cues. The proposed method integrates the point source algorithm and the holographic stereogram based algorithm to reconstruct three-dimensional (3D) scenes. Precise accommodation cues and occlusion effects can be created, and computer graphics rendering techniques can be employed in the CGH generation to enhance the image fidelity. Optical experiments have been performed using a spatial light modulator (SLM) and a fabricated high-resolution hologram; the results show that our proposed algorithm can perform quality reconstructions of 3D scenes with arbitrary depth information. PMID:25836429
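
    The point-source component of such hybrid algorithms superposes a spherical wave from every scene point onto the hologram plane; the sketch below shows only that superposition, omitting the occlusion handling and stereogram segmentation described in the paper:

    import numpy as np

    def point_source_field(points, amplitudes, x, y, wavelength):
        # Complex field on the hologram plane (z = 0) from 3-D point sources.
        # x, y: 2-D meshgrids of hologram-plane coordinates; points: (N, 3) array,
        # with the third column giving the distance of each point from the plane.
        k = 2 * np.pi / wavelength
        field = np.zeros_like(x, dtype=complex)
        for (px, py, pz), a in zip(points, amplitudes):
            r = np.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
            field += a / r * np.exp(1j * k * r)
        return field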

  17. Accurate real-time depth control for CP-SSOCT distal sensor based handheld microsurgery tools.

    PubMed

    Cheon, Gyeong Woo; Huang, Yong; Cha, Jaepyeng; Gehlbach, Peter L; Kang, Jin U

    2015-05-01

    This paper presents a novel intuitive targeting and tracking scheme that utilizes a common-path swept source optical coherence tomography (CP-SSOCT) distal sensor integrated handheld microsurgical tool. To achieve micron-order precision control, a reliable and accurate OCT distal sensing method is required; simultaneously, a prediction algorithm is necessary to compensate for the system delay associated with the computational, mechanical and electronic latencies. Due to the multi-layered structure of retina, it is necessary to develop effective surface detection methods rather than simple peak detection. To achieve this, a shifted cross-correlation method is applied for surface detection in order to increase robustness and accuracy in distal sensing. A predictor based on Kalman filter was implemented for more precise motion compensation. The performance was first evaluated using an established dry phantom consisting of stacked cellophane tape. This was followed by evaluation in an ex-vivo bovine retina model to assess system accuracy and precision. The results demonstrate highly accurate depth targeting with less than 5 μm RMSE depth locking. PMID:26137393

  18. Accurate real-time depth control for CP-SSOCT distal sensor based handheld microsurgery tools

    PubMed Central

    Cheon, Gyeong Woo; Huang, Yong; Cha, Jaepyeng; Gehlbach, Peter L.; Kang, Jin U.

    2015-01-01

    This paper presents a novel intuitive targeting and tracking scheme that utilizes a common-path swept source optical coherence tomography (CP-SSOCT) distal sensor integrated handheld microsurgical tool. To achieve micron-order precision control, a reliable and accurate OCT distal sensing method is required; simultaneously, a prediction algorithm is necessary to compensate for the system delay associated with the computational, mechanical and electronic latencies. Due to the multi-layered structure of retina, it is necessary to develop effective surface detection methods rather than simple peak detection. To achieve this, a shifted cross-correlation method is applied for surface detection in order to increase robustness and accuracy in distal sensing. A predictor based on Kalman filter was implemented for more precise motion compensation. The performance was first evaluated using an established dry phantom consisting of stacked cellophane tape. This was followed by evaluation in an ex-vivo bovine retina model to assess system accuracy and precision. The results demonstrate highly accurate depth targeting with less than 5 μm RMSE depth locking. PMID:26137393
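
    The shifted cross-correlation idea amounts to finding the lag at which the current depth profile best matches a reference profile; a compact numpy sketch of that step (not the authors' implementation, and without the Kalman-filter predictor) is:

    import numpy as np

    def surface_shift(reference, current):
        # Estimate the axial shift (in samples) between a reference A-scan and the
        # current A-scan from the location of the cross-correlation peak; the depth
        # change is then shift * axial_sample_size.
        ref = (reference - reference.mean()) / (reference.std() + 1e-12)
        cur = (current - current.mean()) / (current.std() + 1e-12)
        corr = np.correlate(cur, ref, mode="full")
        return int(np.argmax(corr)) - (len(ref) - 1)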

  19. An accurate and computationally efficient model for membrane-type circular-symmetric micro-hotplates.

    PubMed

    Khan, Usman; Falconi, Christian

    2014-01-01

    Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation-tools for the optimization. In order to circumvent these issues, here, we propose a model for practical circular-symmetric micro-hot-plates which takes advantage of modified Bessel functions, computationally efficient matrix-approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated to the undesired heating in the electrical contacts, are small (e.g., few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally-easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214

  20. An Accurate and Computationally Efficient Model for Membrane-Type Circular-Symmetric Micro-Hotplates

    PubMed Central

    Khan, Usman; Falconi, Christian

    2014-01-01

    Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation-tools for the optimization. In order to circumvent these issues, here, we propose a model for practical circular-symmetric micro-hot-plates which takes advantage of modified Bessel functions, computationally efficient matrix-approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated to the undesired heating in the electrical contacts, are small (e.g., few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally-easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214
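
    The modified Bessel functions enter through the steady-state heat balance of a circular membrane with uniform lateral losses, whose homogeneous solution combines I0 and K0; the sketch below evaluates such a radial profile with scipy and is a generic fin-type building block, not the paper's full matrix formulation (A, B and the decay length L are illustrative coefficients):

    import numpy as np
    from scipy.special import i0, k0

    def radial_excess_temperature(r, A, B, L):
        # Homogeneous solution T(r) = A*I0(r/L) + B*K0(r/L) of
        # T'' + T'/r - T/L^2 = 0 (circular membrane with uniform lateral losses).
        r = np.asarray(r, dtype=float)
        return A * i0(r / L) + B * k0(np.maximum(r, 1e-12) / L)  # K0 diverges at r = 0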

  1. Measurement of Fracture Geometry for Accurate Computation of Hydraulic Conductivity

    NASA Astrophysics Data System (ADS)

    Chae, B.; Ichikawa, Y.; Kim, Y.

    2003-12-01

    Fluid flow in a rock mass is controlled by the geometry of fractures, which is mainly characterized by roughness, aperture and orientation. Fracture roughness and aperture were observed with a new confocal laser scanning microscope (CLSM; Olympus OLS1100). The laser wavelength is 488 nm, and the laser scanning is managed by a light polarization method using two galvanometer scanner mirrors. The system improves resolution in the light-axis (z) direction because of the confocal optics. The sampling is performed at a spacing of 2.5 μm along the x and y directions. The highest measurement resolution in the z direction is 0.05 μm, which is more accurate than other methods. For the roughness measurements, core specimens of coarse- and fine-grained granites were provided. Measurements were performed along three scan lines on each fracture surface. The measured data were represented as 2-D and 3-D digital images showing detailed features of roughness. Spectral analyses by the fast Fourier transform (FFT) were performed to characterize the roughness data quantitatively and to identify the influential frequencies of roughness. The FFT results showed that low-frequency components were dominant in the fracture roughness. This study also verifies that spectral analysis is a good approach to understanding the complicated characteristics of fracture roughness. For the aperture measurements, digital images of the aperture were acquired under five stages of applied uniaxial normal stress. This method can characterize the response of the aperture directly using the same specimen. Results of the measurements show that the reduction of aperture differs from place to place due to the rough geometry of the fracture walls. Laboratory permeability tests were also conducted to evaluate changes of hydraulic conductivity related to aperture variation under different stress levels. The results showed non-uniform reduction of hydraulic conductivity under increase of the normal stress and different values of
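
    The spectral analysis described reduces to a discrete Fourier transform of each sampled roughness profile; a minimal numpy version, assuming the 2.5 μm sampling interval quoted above, is:

    import numpy as np

    def roughness_spectrum(profile_um, spacing_um=2.5):
        # One-sided amplitude spectrum of a roughness profile sampled along a scan
        # line. Returns spatial frequencies (cycles per micrometre) and amplitudes;
        # dominant low-frequency components correspond to the reported finding.
        z = np.asarray(profile_um, dtype=float)
        z = z - z.mean()                      # remove the mean level
        amp = 2.0 * np.abs(np.fft.rfft(z)) / len(z)
        freq = np.fft.rfftfreq(len(z), d=spacing_um)
        return freq, amp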

  2. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  3. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  4. Computer assisted blast design and assessment tools

    SciTech Connect

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  5. Computing accurate age and distance factors in cosmology

    NASA Astrophysics Data System (ADS)

    Christiansen, Jodi L.; Siver, Andrew

    2012-05-01

    As the universe expands, astronomical observables such as brightness and angular size on the sky change in ways that differ from our simple Cartesian expectation. We show how observed quantities depend on the expansion of space and demonstrate how to calculate such quantities using the Friedmann equations. The Friedmann equations generally require a numerical solution, which is easily coded in any computing language (including Excel). We use these numerical calculations in four projects that help students build their understanding of high-redshift phenomena and cosmology. Instructions for these projects are available as supplementary materials.
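
    A minimal sketch of the kind of numerical calculation the record above refers to: the line-of-sight comoving distance obtained by integrating the Friedmann expansion rate for a flat Lambda-CDM model. The cosmological parameter values are illustrative assumptions, not taken from the article:

      import numpy as np
      from scipy.integrate import quad

      c = 299792.458        # speed of light, km/s
      H0 = 70.0             # Hubble constant, km/s/Mpc (assumed value)
      Om, OL = 0.3, 0.7     # matter and dark-energy densities (assumed, flat universe)

      def E(z):
          # Dimensionless expansion rate H(z)/H0 for flat Lambda-CDM
          return np.sqrt(Om * (1.0 + z) ** 3 + OL)

      def comoving_distance(z):
          # D_C = (c/H0) * integral of dz'/E(z') from 0 to z, in Mpc
          integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
          return (c / H0) * integral

      for z in (0.5, 1.0, 2.0):
          print("z = %.1f  D_C = %.1f Mpc" % (z, comoving_distance(z)))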

  6. Computational tools for the modern andrologist.

    PubMed

    Niederberger, C

    1996-01-01

    With such a wide array of computational tools to solve inference problems, andrologists and their mathematical or statistical collaborators face perhaps bewildering choices. It is tempting to criticize a method with which one is unfamiliar for its apparent complexity. Yet, many methods are quite elegant; neural computation uses nature's own best biological classifier, for example, and genetic algorithms apply rules of natural selection. Computer scientists will likely find no one single best inference engine to solve all classification problems. Rather, the modeler should choose the most appropriate computational tool based on the specific nature of a problem. If the problem can be separated into obvious components, a Markov chain may be useful. If the andrologist would like to encode a well-known clinical algorithm into the computer, the programmer may use an expert system. Once a modeler builds an inference engine, that engine is not truly useful until other andrologists use it to make inferences with their own data. Because a wide variety of computer hardware and software exists, it is a significant endeavor to translate, or "port," software designed and built on one machine to many other different computers. Fortunately, the World Wide Web offers a means by which computational tools may be made directly available to multiple users on many different systems, or "platforms." The World Wide Web refers to a standardization of information traffic on the global computer network, the Internet. The Internet is simply the linkage of many computers worldwide by computer operators who have chosen to allow other users access to their systems. Because many different types of computers exist, until recently only communication in very rudimentary form, such as text, or between select compatible machines, was available. Within the last half-decade, computer scientists and operators began to use standard means of communication between computers. Interpreters of these standard

  7. Towards fast and accurate algorithms for processing fuzzy data: interval computations revisited

    NASA Astrophysics Data System (ADS)

    Xiang, Gang; Kreinovich, Vladik

    2013-02-01

    In many practical applications, we need to process data, e.g. to predict the future values of different quantities based on their current values. Often, the only information that we have about the current values comes from experts, and is described in informal ('fuzzy') terms like 'small'. To process such data, it is natural to use fuzzy techniques, techniques specifically designed by Lotfi Zadeh to handle such informal information. In this survey, we start by revisiting the motivation behind Zadeh's formulae for processing fuzzy data, and explain how the algorithmic problem of processing fuzzy data can be described in terms of interval computations (α-cuts). Many fuzzy practitioners claim 'I tried interval computations, they did not work' - meaning that they got estimates which are much wider than the desired α-cuts. We show that such statements are usually based on a (widely spread) misunderstanding - that interval computations simply mean replacing each arithmetic operation with the corresponding operation with intervals. We show that while such straightforward interval techniques indeed often lead to over-wide estimates, the current advanced interval computations techniques result in estimates which are much more accurate. We overview such advanced interval computations techniques, and show that by using them, we can efficiently and accurately process fuzzy data. We wrote this survey with three audiences in mind. First, we want fuzzy researchers and practitioners to understand the current advanced interval computations techniques and to use them to come up with faster and more accurate algorithms for processing fuzzy data. For this 'fuzzy' audience, we explain these current techniques in detail. Second, we also want interval researchers to better understand this important application area for their techniques. For this 'interval' audience, we want to explain where fuzzy techniques come from, what are possible variants of these techniques, and what are the
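
    The record above contrasts straightforward interval arithmetic with more careful interval techniques. A toy sketch of that contrast (not the authors' algorithms): evaluating f(x) = x - x^2 on [0, 1] by naive interval arithmetic ignores the dependency between the two occurrences of x and yields [-1, 1], while simply subdividing the input interval tightens the enclosure toward the true range [0, 0.25].

      def interval_sub(a, b):
          # [a1, a2] - [b1, b2] by naive interval arithmetic
          return (a[0] - b[1], a[1] - b[0])

      def interval_sqr(a):
          # Square of an interval that may contain zero
          lo, hi = a
          candidates = (lo * lo, hi * hi)
          return (0.0 if lo <= 0.0 <= hi else min(candidates), max(candidates))

      def f_naive(x):
          # f(x) = x - x^2 evaluated with straightforward interval arithmetic
          return interval_sub(x, interval_sqr(x))

      def f_subdivided(x, n=1000):
          # Tighter enclosure: split x into n sub-intervals and take the union
          lo, hi = x
          pieces = [f_naive((lo + i * (hi - lo) / n, lo + (i + 1) * (hi - lo) / n))
                    for i in range(n)]
          return (min(p[0] for p in pieces), max(p[1] for p in pieces))

      print(f_naive((0.0, 1.0)))        # (-1.0, 1.0): far wider than the true range
      print(f_subdivided((0.0, 1.0)))   # close to (0.0, 0.25)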

  8. Computational Tools to Accelerate Commercial Development

    SciTech Connect

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  9. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. PMID:26358925

  10. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections. These include azimuthal, cylindrical, Mercator, Lambert, and sinusoidal projections. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
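
    As an illustration of what a forward map projection of the kind listed above looks like numerically, a minimal sketch of the spherical Mercator projection (a textbook formula, not MAPPER's code; the spherical Earth radius is an assumption):

      import math

      R = 6371.0  # mean Earth radius in km (assumed spherical model)

      def mercator(lon_deg, lat_deg, lon0_deg=0.0):
          # Forward spherical Mercator projection: (lon, lat) in degrees -> (x, y) in km
          lam = math.radians(lon_deg - lon0_deg)
          phi = math.radians(lat_deg)
          x = R * lam
          y = R * math.log(math.tan(math.pi / 4.0 + phi / 2.0))
          return x, y

      print(mercator(-77.0, 38.9))   # e.g. Washington, DC relative to the Greenwich meridian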

  11. Computational tools to investigate genetic cardiac channelopathies

    PubMed Central

    Abriel, Hugues; de Lange, Enno; Kucera, Jan P.; Loussouarn, Gildas; Tarek, Mounir

    2013-01-01

    The aim of this perspective article is to share with the community of ion channel scientists our thoughts and expectations regarding the increasing role that computational tools will play in the future of our field. The opinions and comments detailed here are the result of a 3-day long international exploratory workshop that took place in October 2013 and that was supported by the Swiss National Science Foundation. PMID:24421770

  12. Final Report: Correctness Tools for Petascale Computing

    SciTech Connect

    Mellor-Crummey, John

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  13. Use of Monocrystalline Silicon as Tool Material for Highly Accurate Blanking of Thin Metal Foils

    SciTech Connect

    Hildering, Sven; Engel, Ulf; Merklein, Marion

    2011-05-04

    The trend towards miniaturisation of metallic mass-production components combined with increased component functionality is still unbroken. Manufacturing these components by forming and blanking offers economical and ecological advantages combined with the needed accuracy. The complexity of producing tools with geometries below 50 μm by conventional manufacturing methods becomes disproportionately higher. Expensive serial finishing operations are required to achieve an adequate surface roughness combined with accurate geometry details. A novel approach for producing such tools is the use of advanced etching technologies for monocrystalline silicon that are well established in microsystems technology. High-precision vertical geometries with a width down to 5 μm are possible. The present study shows a novel concept using this potential for the blanking of thin copper foils with monocrystalline silicon as a tool material. A self-contained machine tool with compact outer dimensions was designed to avoid tensile stresses in the brittle silicon punch by an accurate, careful alignment of the punch, die and metal foil. A microscopic analysis of the monocrystalline silicon punch shows appropriate properties regarding flank angle, edge geometry and surface quality for the blanking process. Using a monocrystalline silicon punch with a width of 70 μm, blanking experiments on as-rolled copper foils with a thickness of 20 μm demonstrate the general applicability of this material for micro production processes.

  14. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    PubMed

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry. PMID:26465079

  15. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  16. Equilibrium gas flow computations. I - Accurate and efficient calculation of equilibrium gas properties

    NASA Technical Reports Server (NTRS)

    Liu, Yen; Vinokur, Marcel

    1989-01-01

    This paper treats the accurate and efficient calculation of thermodynamic properties of arbitrary gas mixtures for equilibrium flow computations. New improvements in the Stupochenko-Jaffe model for the calculation of thermodynamic properties of diatomic molecules are presented. A unified formulation of equilibrium calculations for gas mixtures in terms of irreversible entropy is given. Using a highly accurate thermo-chemical data base, a new, efficient and vectorizable search algorithm is used to construct piecewise interpolation procedures which generate the accurate thermodynamic variables and their derivatives required by modern computational algorithms. Results are presented for equilibrium air, and compared with those given by the Srinivasan program.
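
    A minimal sketch of the kind of vectorizable table search and piecewise interpolation the record above refers to, here reduced to piecewise-linear interpolation on a precomputed property table. The table values are a made-up placeholder; the paper's thermochemical data base and higher-order interpolants are not reproduced:

      import numpy as np

      # Placeholder table: a thermodynamic property h(T) tabulated on a temperature grid
      T_table = np.linspace(300.0, 6000.0, 200)         # K
      h_table = 1000.0 * T_table + 0.05 * T_table ** 2  # made-up property values

      def lookup(T):
          # Vectorized piecewise-linear interpolation of h at query temperatures T
          T = np.asarray(T, dtype=float)
          i = np.searchsorted(T_table, T) - 1            # interval index for each query
          i = np.clip(i, 0, T_table.size - 2)
          w = (T - T_table[i]) / (T_table[i + 1] - T_table[i])
          return (1.0 - w) * h_table[i] + w * h_table[i + 1]

      print(lookup([350.0, 1234.5, 5990.0]))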

  17. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684

  18. A Review of Computational Tools in microRNA Discovery

    PubMed Central

    Gomes, Clarissa P. C.; Cho, Ji-Hoon; Hood, Leroy; Franco, Octávio L.; Pereira, Rinaldo W.; Wang, Kai

    2013-01-01

    Since microRNAs (miRNAs) were discovered, their impact on regulating various biological activities has been a surprising and exciting field. Knowing the entire repertoire of these small molecules is the first step to gain a better understanding of their function. High throughput discovery tools such as next-generation sequencing significantly increased the number of known miRNAs in different organisms in recent years. However, the process of being able to accurately identify miRNAs is still a complex and difficult task, requiring the integration of experimental approaches with computational methods. A number of prediction algorithms based on characteristics of miRNA molecules have been developed to identify new miRNA species. Different approaches have certain strengths and weaknesses and in this review, we aim to summarize several commonly used tools in metazoan miRNA discovery. PMID:23720668

  19. Computer-Based Cognitive Tools: Description and Design.

    ERIC Educational Resources Information Center

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  20. Urinary PCR as an increasingly useful tool for an accurate diagnosis of leptospirosis in livestock.

    PubMed

    Hamond, C; Martins, G; Loureiro, A P; Pestana, C; Lawson-Ferreira, R; Medeiros, M A; Lilenbaum, W

    2014-03-01

    The aim of the present study was to consider the wide usage of urinary PCR as an increasingly useful tool for an accurate diagnosis of leptospirosis in livestock. A total of 512 adult animals (300 cattle, 138 horses, 59 goats and 15 pigs), from herds/flocks with reproductive problems in Rio de Janeiro, Brazil, was studied by serology and urinary PCR. From the 512 serum samples tested, 223 (43.5 %) were seroreactive (cattle: 45.6 %, horses: 41.3 %, goats: 34 % and pigs: 60 %). PCR detected leptospiral DNA in 32.4 % (cattle: 21.6 %, horses: 36.2 %, goats: 77.4 % and pigs: 33.3 %). To our knowledge there is no other study including such a large number of samples (512) from different species, providing a comprehensive analysis of the usage of PCR for detecting leptospiral carriers in livestock. Serological and molecular results were discrepant, regardless of the titre, which was an expected outcome. Nevertheless, it is impossible to establish agreement between these tests, since the two methodologies are conducted on different samples (MAT - serum; PCR - urine). Additionally, the MAT is an indirect method and PCR is a direct one. In conclusion, we have demonstrated that urinary PCR should be considered and encouraged as an increasingly useful tool for an accurate diagnosis of leptospirosis in livestock. PMID:24222053

  1. The Clinical Impact of Accurate Cystine Calculi Characterization Using Dual-Energy Computed Tomography

    PubMed Central

    Haley, William E.; Ibrahim, El-Sayed H.; Qu, Mingliang; Cernigliaro, Joseph G.; Goldfarb, David S.; McCollough, Cynthia H.

    2015-01-01

    Dual-energy computed tomography (DECT) has recently been suggested as the imaging modality of choice for kidney stones due to its ability to provide information on stone composition. Standard postprocessing of the dual-energy images accurately identifies uric acid stones, but not other types. Cystine stones can be identified from DECT images when analyzed with advanced postprocessing. This case report describes clinical implications of accurate diagnosis of cystine stones using DECT. PMID:26688770

  2. The Clinical Impact of Accurate Cystine Calculi Characterization Using Dual-Energy Computed Tomography.

    PubMed

    Haley, William E; Ibrahim, El-Sayed H; Qu, Mingliang; Cernigliaro, Joseph G; Goldfarb, David S; McCollough, Cynthia H

    2015-01-01

    Dual-energy computed tomography (DECT) has recently been suggested as the imaging modality of choice for kidney stones due to its ability to provide information on stone composition. Standard postprocessing of the dual-energy images accurately identifies uric acid stones, but not other types. Cystine stones can be identified from DECT images when analyzed with advanced postprocessing. This case report describes clinical implications of accurate diagnosis of cystine stones using DECT. PMID:26688770

  3. VISTA - computational tools for comparative genomics

    SciTech Connect

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes the kinesin family member 3A (KIF3A) protein.

  4. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  5. C-Sibelia: an easy-to-use and highly accurate tool for bacterial genome comparison

    PubMed Central

    Minkin, Ilya; Pham, Hoa; Starostina, Ekaterina; Vyahhi, Nikolay; Pham, Son

    2013-01-01

    We present C-Sibelia, a highly accurate and easy-to-use software tool for comparing two closely related bacterial genomes, which can be presented as either finished sequences or fragmented assemblies. C-Sibelia takes as input two FASTA files and produces: (1) a VCF file containing all identified single nucleotide variations and indels; (2) an XMFA file containing alignment information. The software also produces Circos diagrams visualizing high level genomic architecture for rearrangement analyses. C-Sibelia is a part of the Sibelia comparative genomics suite, which is freely available under the GNU GPL v.2 license at http://sourceforge.net/projects/sibelia-bio. C-Sibelia is compatible with Unix-like operating systems. A web-based version of the software is available at http://etool.me/software/csibelia. PMID:25110578

  6. Computational and Physical Quality Assurance Tools for Radiotherapy

    NASA Astrophysics Data System (ADS)

    Graves, Yan Jiang

    Radiation therapy aims at delivering a prescribed amount of radiation dose to cancerous targets while sparing dose to normal organs. Treatment planning and delivery in modern radiotherapy are highly complex. To ensure the accuracy of the delivered dose to a patient, a quality assurance (QA) procedure is needed before the actual treatment delivery. This dissertation aims at developing computational and physical tools to facilitate the QA process. In Chapter 2, we have developed a fast and accurate computational QA tool using a graphics processing unit-based Monte Carlo (MC) dose engine. This QA tool aims at identifying any errors in the treatment planning stage and machine delivery process by comparing three dose distributions: planned dose computed by a treatment planning system, planned dose and delivered dose reconstructed using the MC method. Within this tool, several modules have been built. (1) A denoising algorithm to smooth the MC calculated dose. We have also investigated the effects of statistical uncertainty in MC simulations on a commonly used dose comparison metric. (2) A linear accelerator source model with a semi-automatic commissioning process. (3) A fluence generation module. With all these modules, a web application for this QA tool with a user-friendly interface has been developed to provide users with easy access to our tool, facilitating its clinical utilization. Even after an initial treatment plan fulfills the QA requirements, a patient may experience inter-fractional anatomy variations, which compromise the optimality of the initial plan. To resolve this issue, adaptive radiotherapy (ART) has been proposed, where the treatment plan is redesigned based on the most recent patient anatomy. In Chapter 3, we have constructed a physical deformable head and neck (HN) phantom with in-vivo dosimetry capability. This phantom resembles HN patient geometry and simulates tumor shrinkage with a high level of realism. The ground truth deformation field can be measured
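
    As a loose illustration of two of the modules mentioned above, the sketch below smooths a noisy Monte Carlo dose distribution and compares it against a planned dose with a simple per-voxel dose-difference pass rate. This is a simplified stand-in under assumed inputs, not the dissertation's denoising algorithm or the clinical comparison metric it studies:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(0)

      # Hypothetical planned dose and a noisy MC recomputation of it (arbitrary units)
      planned = np.fromfunction(lambda i, j, k: np.exp(-((i - 16) ** 2 + (j - 16) ** 2) / 60.0),
                                (32, 32, 16))
      mc_dose = planned * (1.0 + 0.05 * rng.standard_normal(planned.shape))

      # Simple denoising of the MC result by Gaussian smoothing
      mc_smoothed = gaussian_filter(mc_dose, sigma=1.0)

      # Per-voxel dose-difference test: fraction of voxels within 3% of the maximum dose
      tol = 0.03 * planned.max()
      mask = planned > 0.1 * planned.max()          # evaluate only the higher-dose region
      pass_rate = np.mean(np.abs(mc_smoothed - planned)[mask] <= tol)
      print("pass rate: %.1f%%" % (100.0 * pass_rate))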

  7. High-order computational fluid dynamics tools for aircraft design.

    PubMed

    Wang, Z J

    2014-08-13

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  8. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  9. Accurate charge capture and cost allocation: cost justification for bedside computing.

    PubMed Central

    Grewal, R.; Reed, R. L.

    1993-01-01

    This paper shows that cost justification for bedside clinical computing can be made by recouping charges with accurate charge capture. Twelve months' worth of professional charges for a sixteen-bed surgical intensive care unit are computed from charted data in a bedside clinical database and are compared to the professional charges actually billed by the unit. A substantial difference between predicted charges and billed charges was found. This paper also discusses the concept of appropriate cost allocation in the inpatient environment and the feasibility of appropriate allocation as a by-product of bedside computing. PMID:8130444

  10. Stable, accurate and efficient computation of normal modes for horizontal stratified models

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chen, Xiaofei

    2016-08-01

    We propose an adaptive root-determining strategy that is very useful when dealing with trapped modes or Stoneley modes whose energies become very insignificant on the free surface in the presence of low-velocity layers or fluid layers in the model. Loss of modes in these cases or inaccuracy in the calculation of these modes may then be easily avoided. Built upon the generalized reflection/transmission coefficients, the concept of `family of secular functions' that we herein call `adaptive mode observers' is thus naturally introduced to implement this strategy, the underlying idea of which has been distinctly noted for the first time and may be generalized to other applications such as free oscillations or applied to other methods in use when these cases are encountered. Additionally, we have made further improvements upon the generalized reflection/transmission coefficient method; mode observers associated with only the free surface and low-velocity layers (and the fluid/solid interface if the model contains fluid layers) are adequate to guarantee, at the same time, no loss of and high precision for any physically existent modes without excessive calculations. Finally, the conventional definition of the fundamental mode is reconsidered, which is entailed in the cases under study. Some computational aspects are remarked on. With the additional help afforded by our superior root-searching scheme and the possibility of speeding up the calculation using a smaller number of layers, aided by the concept of `turning point', our algorithm is remarkably efficient as well as stable and accurate and can be used as a powerful tool for widely related applications.

  11. Stable, accurate and efficient computation of normal modes for horizontal stratified models

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Chen, Xiaofei

    2016-06-01

    We propose an adaptive root-determining strategy that is very useful when dealing with trapped modes or Stoneley modes whose energies become very insignificant on the free surface in the presence of low-velocity layers or fluid layers in the model. Loss of modes in these cases or inaccuracy in the calculation of these modes may then be easily avoided. Built upon the generalized reflection/transmission coefficients, the concept of "family of secular functions" that we herein call "adaptive mode observers" is thus naturally introduced to implement this strategy, the underlying idea of which has been distinctly noted for the first time and may be generalized to other applications such as free oscillations or applied to other methods in use when these cases are encountered. Additionally, we have made further improvements upon the generalized reflection/transmission coefficient method; mode observers associated with only the free surface and low-velocity layers (and the fluid/solid interface if the model contains fluid layers) are adequate to guarantee, at the same time, no loss of and high precision for any physically existent modes without excessive calculations. Finally, the conventional definition of the fundamental mode is reconsidered, which is entailed in the cases under study. Some computational aspects are remarked on. With the additional help afforded by our superior root-searching scheme and the possibility of speeding up the calculation using a smaller number of layers, aided by the concept of "turning point", our algorithm is remarkably efficient as well as stable and accurate and can be used as a powerful tool for widely related applications.
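
    The two records above (two postings of the same study) revolve around finding the roots of secular functions without missing closely spaced modes. A generic sketch of the bracket-then-refine strategy such root searches build on, with the secular function left as a user-supplied placeholder; this is not the authors' adaptive mode observer:

      import numpy as np
      from scipy.optimize import brentq

      def secular(c):
          # Placeholder secular function of phase velocity c; a real one would come from
          # the reflection/transmission-coefficient machinery of the layered model
          return np.sin(8.0 / c) - 0.2 * c

      def find_modes(c_min, c_max, n_scan=2000):
          # Scan a fine grid for sign changes, then refine each bracket with Brent's method
          c = np.linspace(c_min, c_max, n_scan)
          f = np.array([secular(ci) for ci in c])
          roots = []
          for i in range(n_scan - 1):
              if f[i] == 0.0:
                  roots.append(c[i])
              elif f[i] * f[i + 1] < 0.0:
                  roots.append(brentq(secular, c[i], c[i + 1]))
          return roots

      print(find_modes(0.5, 5.0))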

  12. EIGER: A new generation of computational electromagnetics tools

    SciTech Connect

    Wilton, D.R.; Johnson, W.A.; Jorgenson, R.E.; Sharpe, R.M.; Grant, J.B.

    1996-03-01

    The EIGER project (Electromagnetic Interactions GenERalized) endeavors to bring the next generation of spectral domain electromagnetic analysis tools to maturity and to cast them in a general form which is amenable to a variety of applications. The tools are written in Fortran 90 and with an object oriented philosophy to yield a package that is easily ported to a variety of platforms, simply maintained, and above all efficiently modified to address wide ranging applications. The modular development style and the choice of Fortran 90 is also driven by the desire to run efficiently on existing high performance computer platforms and to remain flexible for new architectures that are anticipated. The electromagnetic tool box consists of extremely accurate physics models for 2D and 3D electromagnetic scattering, radiation, and penetration problems. The models include surface and volume formulations for conductors and complex materials. In addition, realistic excitations and symmetries are incorporated, as well as complex environments through the use of Green's functions.

  13. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  14. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  15. Computer as Research Tools 4.Use Your PC More Effectively

    NASA Astrophysics Data System (ADS)

    Baba, Hajime

    This article describes useful tools for personal computers. Electronic dictionaries, a full-text search system, simple use of a preprint server, and a numeric computation language for applications in engineering and science are introduced.

  16. TOPLHA: an accurate and efficient numerical tool for analysis and design of LH antennas

    NASA Astrophysics Data System (ADS)

    Milanesio, D.; Lancellotti, V.; Meneghini, O.; Maggiora, R.; Vecchi, G.; Bilato, R.

    2007-09-01

    Auxiliary ICRF heating systems in tokamaks often involve large complex antennas, made up of several conducting straps hosted in distinct cavities that open towards the plasma. The same holds especially true in the LH regime, wherein the antennas consist of arrays of many phased waveguides. Upon observing that the various cavities or waveguides couple to each other only through the EM fields existing over the plasma-facing apertures, we self-consistently formulated the EM problem by a convenient set of multiple coupled integral equations. Subsequent application of the Method of Moments yields a highly sparse algebraic system; therefore formal inversion of the system matrix is not particularly memory demanding, even though the number of unknowns may be quite large (typically 10^5 or so). The overall strategy has been implemented in an enhanced version of TOPICA (Torino Polytechnic Ion Cyclotron Antenna) and in a newly developed code named TOPLHA (Torino Polytechnic Lower Hybrid Antenna). Both are simulation and prediction tools for plasma-facing antennas that incorporate commercial-grade 3D graphic interfaces along with an accurate description of the plasma. In this work we present the new proposed formulation along with examples of application to real-life large LH antenna systems.

  17. A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.

    PubMed

    Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D

    2014-02-01

    In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose by studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the use of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of color moment invariants. PMID:24216719

  18. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    ERIC Educational Resources Information Center

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  19. TOPICA: an accurate and efficient numerical tool for analysis and design of ICRF antennas

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; Milanesio, D.; Maggiora, R.; Vecchi, G.; Kyrytsya, V.

    2006-07-01

    The demand for a predictive tool to help in designing ion-cyclotron radio frequency (ICRF) antenna systems for today's fusion experiments has driven the development of codes such as ICANT, RANT3D, and the early development of TOPICA (TOrino Polytechnic Ion Cyclotron Antenna) code. This paper describes the substantive evolution of TOPICA formulation and implementation that presently allow it to handle the actual geometry of ICRF antennas (with curved, solid straps, a general-shape housing, Faraday screen, etc) as well as an accurate plasma description, accounting for density and temperature profiles and finite Larmor radius effects. The antenna is assumed to be housed in a recess-like enclosure. Both goals have been attained by formally separating the problem into two parts: the vacuum region around the antenna and the plasma region inside the toroidal chamber. Field continuity and boundary conditions allow the formulation of a set of two coupled integral equations for the unknown equivalent (current) sources; then the equations are reduced to a linear system by a method of moments solution scheme employing 2D finite elements defined over a 3D non-planar surface triangular-cell mesh. In the vacuum region calculations are done in the spatial (configuration) domain, whereas in the plasma region a spectral (wavenumber) representation of fields and currents is adopted, thus permitting a description of the plasma by a surface impedance matrix. Owing to this approach, any plasma model can be used in principle, and at present the FELICE code has been employed. The natural outcomes of TOPICA are the induced currents on the conductors (antenna, housing, etc) and the electric field in front of the plasma, whence the antenna circuit parameters (impedance/scattering matrices), the radiated power and the fields (at locations other than the chamber aperture) are then obtained. An accurate model of the feeding coaxial lines is also included. The theoretical model and its TOPICA

  20. MicroRNA-200 Family Profile: A Promising Ancillary Tool for Accurate Cancer Diagnosis

    PubMed Central

    Liu, Xiaodong; Zhang, Jianhua; Xie, Botao; Li, Hao; Shen, Jihong; Chen, Jianheng

    2016-01-01

    Cancer is one of the most threatening diseases in the world, and great interest has been paid to discovering accurate and noninvasive methods for cancer diagnosis. The value of the microRNA-200 (miRNA-200, miR-200) family has been revealed in many studies. However, the results from various studies were inconsistent, and thus a meta-analysis was designed and performed to assess the overall value of miRNA-200 in cancer diagnosis. Relevant studies were searched electronically from the following databases: PubMed, Embase, Web of Science, the Cochrane Library, and Chinese National Knowledge Infrastructure. Keywords combining “miR-200,” “cancer,” and “diagnosis” in any field were used to search for relevant studies. Then, the pooled sensitivity, specificity, area under the curve (AUC), and partial AUC were calculated using the random-effects model. Heterogeneity among individual studies was also explored by subgroup analyses. A total of 28 studies from 18 articles with an overall sample size of 3676 subjects (2097 patients and 1579 controls) were included in this meta-analysis. The overall sensitivity and specificity with 95% confidence intervals (95% CIs) are 0.709 (95% CI: 0.657–0.755) and 0.667 (95% CI: 0.617–0.713), respectively. Additionally, the AUC and partial AUC for the pooled data are 0.735 and 0.627, respectively. Subgroup analyses revealed that using the miRNA-200 family for cancer diagnosis is more effective in white than in Asian ethnic groups. In addition, cancer diagnosis by miRNA using circulating specimens is more effective than that using noncirculating specimens. Finally, miRNA is more accurate in diagnosing endometrial cancer than other types of cancer, and some miRNA family members (miR-200b and miR-429) have superior diagnostic accuracy compared with other miR-200 family members. In conclusion, the profiling of the miRNA-200 family is likely to be a valuable tool in cancer detection and diagnosis. PMID:26618619

  1. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h^-1 Gpc box sampled with 2048^3 particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (˜1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO's numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc^-1, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h^-1 Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.
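
    The record above models the small-scale inaccuracy of the Zel'dovich approximation as a Gaussian damping of the halo power spectrum with scale d = 3 h^-1 Mpc at z = 0. A minimal sketch of applying such a factor to a tabulated spectrum; the exact exponent convention and the input spectrum are assumptions for illustration only:

      import numpy as np

      d = 3.0                              # damping scale in Mpc/h at z = 0 (from the abstract)
      k = np.logspace(-2, 0, 50)           # wavenumbers in h/Mpc
      P_halo = 2.0e4 * (k / 0.05) ** -1.5  # placeholder halo power spectrum

      # Gaussian damping factor; the exponent convention here is an assumed form
      P_damped = P_halo * np.exp(-(k * d) ** 2)

      for ki, pi in zip(k[::10], P_damped[::10]):
          print("k = %.3f h/Mpc   P = %.3e" % (ki, pi))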

  2. High resolution DEM from Tandem-X interferometry: an accurate tool to characterize volcanic activity

    NASA Astrophysics Data System (ADS)

    Albino, Fabien; Kervyn, Francois

    2013-04-01

    The Tandem-X mission was launched by the German agency (DLR) in June 2010. It is a new-generation high-resolution SAR sensor mainly dedicated to topographic applications. For the purpose of our research on volcano-tectonic activity in the Kivu Rift area, a set of Tandem-X bistatic radar images was used to produce a high-resolution InSAR DEM of the Virunga Volcanic Province (VVP). The VVP is part of the western branch of the African rift, situated at the boundary between D.R. Congo, Rwanda and Uganda. It has two highly active volcanoes, Nyiragongo and Nyamulagira. A first task concerns the quantitative assessment of the vertical accuracy that can be achieved with these new data. The new DEMs are compared to other spaceborne datasets (SRTM, ASTER) and also to field measurements from differential GPS. Multi-temporal radar acquisitions allow us to produce several DEMs of the same area. This proved very useful in an active volcanic setting where new geomorphological features (faults, fissures, volcanic cones and lava flows) appear continuously through time. For example, since the year 2000, the time of the SRTM acquisition, there has been one eruption at Nyiragongo (2002) and six eruptions at Nyamulagira (2001, 2002, 2004, 2006, 2010 and 2011), all of which induced large changes in the landscape with the emplacement of new lava fields and scoria cones. From our repetitive Tandem-X DEM production, we have a tool to identify and also quantify, in terms of size and volume, all the topographic changes related to this past volcanic activity. These parameters are high-value information for improving the understanding of the Virunga volcanoes; accurate estimation of erupted volumes and knowledge of the structural features associated with past eruptions are key to understanding the volcanic system, improving hazard assessment, and finally contributing to risk mitigation in a densely populated area.

  3. Collected Wisdom: Assessment Tools for Computer Science Programs

    ERIC Educational Resources Information Center

    Sanders, Kathryn E.; McCartney, Robert

    2004-01-01

    In this paper, we investigate the question of what assessment tools are being used in practice by United States computing programs and what the faculty doing the assessment think of the tools they use. After presenting some background with regard to the design, implementation, and use of assessment, with particular attention to assessment tools,…

  4. Accurate computation of Stokes flow driven by an open immersed interface

    NASA Astrophysics Data System (ADS)

    Li, Yi; Layton, Anita T.

    2012-06-01

    We present numerical methods for computing two-dimensional Stokes flow driven by forces singularly supported along an open, immersed interface. Two second-order accurate methods are developed: one for accurately evaluating boundary integral solutions at a point, and another for computing Stokes solution values on a rectangular mesh. We first describe a method for computing singular or nearly singular integrals, such as a double layer potential due to sources on a curve in the plane, evaluated at a point on or near the curve. To improve accuracy of the numerical quadrature, we add corrections for the errors arising from discretization, which are found by asymptotic analysis. When used to solve the Stokes equations with sources on an open, immersed interface, the method generates second-order approximations, for both the pressure and the velocity, and preserves the jumps in the solutions and their derivatives across the boundary. We then combine the method with a mesh-based solver to yield a hybrid method for computing Stokes solutions at N^2 grid points on a rectangular grid. Numerical results are presented which exhibit second-order accuracy. To demonstrate the applicability of the method, we use the method to simulate fluid dynamics induced by the beating motion of a cilium. The method preserves the sharp jumps in the Stokes solution and their derivatives across the immersed boundary. Model results illustrate the distinct hydrodynamic effects generated by the effective stroke and by the recovery stroke of the ciliary beat cycle.

  5. Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method.

    PubMed

    Zhao, Yan; Cao, Liangcai; Zhang, Hao; Kong, Dezhao; Jin, Guofan

    2015-10-01

    Fast calculation and correct depth cue are crucial issues in the calculation of computer-generated hologram (CGH) for high quality three-dimensional (3-D) display. An angular-spectrum based algorithm for layer-oriented CGH is proposed. Angular spectra from each layer are synthesized as a layer-corresponded sub-hologram based on the fast Fourier transform without paraxial approximation. The proposed method can avoid the huge computational cost of the point-oriented method and yield accurate predictions of the whole diffracted field compared with other layer-oriented methods. CGHs of versatile formats of 3-D digital scenes, including computed tomography and 3-D digital models, are demonstrated with precise depth performance and advanced image quality. PMID:26480062
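
    The record above builds layer-corresponded sub-holograms from angular spectra computed without the paraxial approximation. A minimal sketch of the single-layer angular-spectrum propagation step such methods rest on; the wavelength, pixel pitch, distance and object layer are illustrative assumptions, and the paper's full layer-oriented synthesis is not reproduced:

      import numpy as np

      wavelength = 532e-9   # m (assumed)
      pitch = 8e-6          # sampling pitch of the hologram plane, m (assumed)
      z = 0.05              # propagation distance to one depth layer, m (assumed)
      N = 512               # samples per side

      # Complex field of one object layer (placeholder: a small bright square)
      u0 = np.zeros((N, N), dtype=complex)
      u0[240:272, 240:272] = 1.0

      # Angular spectrum of the layer
      fx = np.fft.fftfreq(N, d=pitch)
      FX, FY = np.meshgrid(fx, fx)
      U0 = np.fft.fft2(u0)

      # Free-space transfer function; evanescent components are suppressed
      arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
      kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
      H = np.where(arg > 0.0, np.exp(1j * kz * z), 0.0)

      u1 = np.fft.ifft2(U0 * H)   # field contributed by this layer at the hologram plane
      print(np.abs(u1).max())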

  6. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme. The MacCormack 2-4 scheme is second order accurate in time and fourth order accurate in space, and is stable for CFL's below 2/3. Current research has shown that the method can give accurate results but does exhibit significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson type second, third, and fourth order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research has found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second and third order terms in the damping. Use of second and fourth order terms reduced the magnitude of dispersion errors but not as effectively as the second and third order combination. The program was coded using Thinking Machine's CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
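
    A minimal sketch of the explicit MacCormack 2-4 predictor-corrector applied to linear advection with periodic boundaries, using the one-sided 7/6-1/6 differences commonly quoted for this scheme (taken here as an assumption). The test problem and parameters are illustrative, not the paper's Category 2 benchmarks, and the Jameson-type artificial viscosity terms discussed above are omitted:

      import numpy as np

      N, L, a = 200, 1.0, 1.0
      dx = L / N
      dt = 0.25 * dx / a                         # CFL number 0.25, well below the 2/3 limit
      x = np.arange(N) * dx
      u = np.exp(-200.0 * (x - 0.5) ** 2)        # smooth initial pulse

      def dplus(f):
          # Forward one-sided difference: (-f[i+2] + 8 f[i+1] - 7 f[i]) / (6 dx)
          return (-np.roll(f, -2) + 8.0 * np.roll(f, -1) - 7.0 * f) / (6.0 * dx)

      def dminus(f):
          # Backward one-sided difference: (7 f[i] - 8 f[i-1] + f[i-2]) / (6 dx)
          return (7.0 * f - 8.0 * np.roll(f, 1) + np.roll(f, 2)) / (6.0 * dx)

      for step in range(400):
          u_pred = u - dt * dplus(a * u)                      # predictor
          u = 0.5 * (u + u_pred - dt * dminus(a * u_pred))    # corrector
          # (forward/backward roles are often alternated each step; kept fixed here for brevity)

      print("max value after 400 steps:", u.max())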

  7. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  8. Palm computer demonstrates a fast and accurate means of burn data collection.

    PubMed

    Lal, S O; Smith, F W; Davis, J P; Castro, H Y; Smith, D W; Chinkes, D L; Barrow, R E

    2000-01-01

    Manual biomedical data collection and entry of the data into a personal computer is time-consuming and can be prone to errors. The purpose of this study was to compare data entry into a hand-held computer versus handwritten data followed by entry of the data into a personal computer. A Palm (3Com Palm IIIx, Santa Clara, Calif) computer with a custom menu-driven program was used for the entry and retrieval of burn-related variables. These variables were also used to create an identical sheet that was filled in by hand. Identical data were retrieved twice from 110 charts 48 hours apart and then used to create an Excel (Microsoft, Redmond, Wash) spreadsheet. One time the data were recorded by the Palm entry method, and the other time the data were handwritten. The method of retrieval was alternated between the Palm system and handwritten system every 10 charts. The total time required to log data and to generate an Excel spreadsheet was recorded and used as a study endpoint. The total time for the Palm method of data collection and downloading to a personal computer was 23% faster than hand recording with the personal computer entry method (P < 0.05), and 58% fewer errors were generated with the Palm method. The Palm is a faster and more accurate means of data collection than a handwritten technique. PMID:11194811

  9. The Computer As A Teaching Tool.

    ERIC Educational Resources Information Center

    Caldwell, Robert M.

    1982-01-01

    Elements of computer use in teaching are explored. They include equipment expense; availability of quality software; development of interactive instruction; use of graphics, color, flashing, and sound; learner response; and feedback. (CT)

  10. RapMap: a rapid, sensitive and accurate tool for mapping RNA-seq reads to transcriptomes

    PubMed Central

    Srivastava, Avi; Sarkar, Hirak; Gupta, Nitish; Patro, Rob

    2016-01-01

    Motivation: The alignment of sequencing reads to a transcriptome is a common and important step in many RNA-seq analysis tasks. When aligning RNA-seq reads directly to a transcriptome (as is common in the de novo setting or when a trusted reference annotation is available), care must be taken to report the potentially large number of multi-mapping locations per read. This can pose a substantial computational burden for existing aligners, and can considerably slow downstream analysis. Results: We introduce a novel concept, quasi-mapping, and an efficient algorithm implementing this approach for mapping sequencing reads to a transcriptome. By attempting only to report the potential loci of origin of a sequencing read, and not the base-to-base alignment by which it derives from the reference, RapMap—our tool implementing quasi-mapping—is capable of mapping sequencing reads to a target transcriptome substantially faster than existing alignment tools. The algorithm we use to implement quasi-mapping uses several efficient data structures and takes advantage of the special structure of shared sequence prevalent in transcriptomes to rapidly provide highly-accurate mapping information. We demonstrate how quasi-mapping can be successfully applied to the problems of transcript-level quantification from RNA-seq reads and the clustering of contigs from de novo assembled transcriptomes into biologically meaningful groups. Availability and implementation: RapMap is implemented in C++11 and is available as open-source software, under GPL v3, at https://github.com/COMBINE-lab/RapMap. Contact: rob.patro@cs.stonybrook.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27307617
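
    The core idea of quasi-mapping, reporting candidate transcripts of origin rather than base-to-base alignments, can be illustrated with a toy k-mer index. The sketch below is only conceptual: RapMap's actual implementation uses far more efficient indexing of the transcriptome than this, and the sequences here are made up.

```python
from collections import defaultdict

# Toy illustration of the quasi-mapping idea: report candidate transcripts of origin
# for a read from shared k-mers, without computing a base-to-base alignment.

K = 5
transcripts = {"t1": "ACGTACGTTAGCCGATACGT", "t2": "TTAGCCGATACGTAAAGGCT"}

index = defaultdict(set)                     # k-mer -> set of transcript ids
for tid, seq in transcripts.items():
    for i in range(len(seq) - K + 1):
        index[seq[i:i + K]].add(tid)

def quasi_map(read):
    hits = [index[read[i:i + K]] for i in range(len(read) - K + 1) if read[i:i + K] in index]
    if not hits:
        return set()
    candidates = set.intersection(*hits)     # transcripts consistent with every matched k-mer
    return candidates or set.union(*hits)    # fall back to the union if the intersection is empty

print(quasi_map("TAGCCGATACGT"))             # -> candidates {'t1', 't2'} (order may vary)
```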

  11. Accurate guidance for percutaneous access to a specific target in soft tissues: preclinical study of computer-assisted pericardiocentesis.

    PubMed

    Chavanon, O; Barbe, C; Troccaz, J; Carrat, L; Ribuot, C; Noirclerc, M; Maitrasse, B; Blin, D

    1999-06-01

    In the field of percutaneous access to soft tissues, our project was to improve classical pericardiocentesis by performing accurate guidance to a selected target, according to a model of the pericardial effusion acquired through three-dimensional (3D) data recording. Required hardware is an echocardiographic device and a needle, both linked to a 3D localizer, and a computer. After acquiring echographic data, a modeling procedure allows definition of the optimal puncture strategy, taking into consideration the mobility of the heart, by determining a stable region, whatever the period of the cardiac cycle. A passive guidance system is then used to reach the planned target accurately, generally a site in the middle of the stable region. After validation on a dynamic phantom and a feasibility study in dogs, an accuracy and reliability analysis protocol was carried out on pigs with experimental pericardial effusion. Ten consecutive successful punctures using various trajectories were performed on eight pigs. Nonbloody liquid was collected from pericardial effusions in the stable region (5 to 9 mm wide) within 10 to 15 minutes from echographic acquisition to drainage. Accuracy of at least 2.5 mm was demonstrated. This study demonstrates the feasibility of computer-assisted pericardiocentesis. Beyond the simple improvement of the current technique, this method could be a new way to reach the heart or a new tool for percutaneous access and image-guided puncture of soft tissues. Further investigation will be necessary before routine human application. PMID:10414543

  12. Novel electromagnetic surface integral equations for highly accurate computations of dielectric bodies with arbitrarily low contrasts

    SciTech Connect

    Erguel, Ozguer; Guerel, Levent

    2008-12-01

    We present a novel stabilization procedure for accurate surface formulations of electromagnetic scattering problems involving three-dimensional dielectric objects with arbitrarily low contrasts. Conventional surface integral equations provide inaccurate results for the scattered fields when the contrast of the object is low, i.e., when the electromagnetic material parameters of the scatterer and the host medium are close to each other. We propose a stabilization procedure involving the extraction of nonradiating currents and rearrangement of the right-hand side of the equations using fictitious incident fields. Then, only the radiating currents are solved to calculate the scattered fields accurately. This technique can easily be applied to the existing implementations of conventional formulations, it requires negligible extra computational cost, and it is also appropriate for the solution of large problems with the multilevel fast multipole algorithm. We show that the stabilization leads to robust formulations that are valid even for the solutions of extremely low-contrast objects.

  13. An accurate quadrature technique for the contact boundary in 3D finite element computations

    NASA Astrophysics Data System (ADS)

    Duong, Thang X.; Sauer, Roger A.

    2015-01-01

    This paper presents a new numerical integration technique for 3D contact finite element implementations, focusing on a remedy for the inaccurate integration due to discontinuities at the boundary of contact surfaces. The method is based on the adaptive refinement of the integration domain along the boundary of the contact surface, and is accordingly denoted RBQ for refined boundary quadrature. It can be used for common element types of any order, e.g. Lagrange, NURBS, or T-Spline elements. In terms of both computational speed and accuracy, RBQ exhibits great advantages over a naive increase of the number of quadrature points. Also, the RBQ method is shown to remain accurate for large deformations. Furthermore, since the sharp boundary of the contact surface is determined, it can be used for various purposes like the accurate post-processing of the contact pressure. Several examples are presented to illustrate the new technique.

  14. An accurate Fortran code for computing hydrogenic continuum wave functions at a wide range of parameters

    NASA Astrophysics Data System (ADS)

    Peng, Liang-You; Gong, Qihuang

    2010-12-01

    The accurate computation of hydrogenic continuum wave functions is very important in many branches of physics such as electron-atom collisions, cold atom physics, and atomic ionization in strong laser fields, etc. Although there already exist various algorithms and codes, most of them are only reliable in certain ranges of parameters. In some practical applications, accurate continuum wave functions need to be calculated at extremely low energies, large radial distances and/or large angular momentum number. Here we provide such a code, which can generate accurate hydrogenic continuum wave functions and corresponding Coulomb phase shifts over a wide range of parameters. Without any essential restriction on the angular momentum number, the present code is able to give reliable results over the electron energy range [10,10] eV for radial distances of [10,10] a.u. We also find the present code is very efficient, which should find numerous applications in many fields such as strong field physics. Program summary: Program title: HContinuumGautchi Catalogue identifier: AEHD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHD_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1233 No. of bytes in distributed program, including test data, etc.: 7405 Distribution format: tar.gz Programming language: Fortran90 in fixed format Computer: AMD Processors Operating system: Linux RAM: 20 MBytes Classification: 2.7, 4.5 Nature of problem: The accurate computation of atomic continuum wave functions is very important in many research fields such as strong field physics and cold atom physics. Although various algorithms and codes already exist, most of them are applicable and reliable only in a certain range of parameters. We present here an accurate FORTRAN program for

  15. Dwell time calculation for computer controlled large tool

    NASA Astrophysics Data System (ADS)

    Fan, Bin; Burge, James H.; Martin, Hubert; Zeng, Zhige; Li, Xiaojin; Zhou, Jiabin

    2012-09-01

    Computer-controlled large tools, such as the stressed lap first developed at the Steward Observatory Mirror Lab (SOML) [1] and the computer-controlled active lap developed at the IOE (Institute of Optics and Electronics, Chinese Academy of Sciences), are used to manufacture large optics, especially for grinding with loose abrasive and polishing with slurry. Compared with a fixed orbital lap, a computer-controlled large tool can bend its lap surface in real time to match the local sub-aperture, so it preferentially strikes the high areas. Owing to its large diameter, a computer-controlled large tool offers high removal efficiency and generates fewer middle-frequency and high-frequency errors than small tools such as computer-controlled optical surfacing (CCOS); on the other hand, calculating the dwell time for such large tools is more challenging than for small tools. Based on the mathematical removal equation for the computer-controlled active lap, a non-negative least-squares algorithm is used to calculate the dwell time. After simulation, an optimized algorithm based on non-negative least squares is provided; the dwell time calculated by this optimized algorithm meets the desired removal volume with small residual errors.
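
    The dwell-time step described above can be sketched with an off-the-shelf non-negative least-squares solver. In the snippet below the removal (influence) matrix and the target removal map are random placeholders; in practice the matrix would be built from the active lap's removal function and the chosen tool path.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of a non-negative least-squares dwell-time solve: A[i, j] is the material
# removed at surface point i per unit dwell time at tool position j (the removal /
# influence matrix), b is the desired removal map.  Both are random placeholders.

rng = np.random.default_rng(0)
n_points, n_positions = 400, 120
A = rng.random((n_points, n_positions)) * 0.01      # illustrative removal matrix
b = rng.random(n_points)                            # illustrative target removal

dwell, residual_norm = nnls(A, b)                   # dwell >= 0 enforced by NNLS
residual = b - A @ dwell
print("max dwell:", dwell.max(), "rms residual:", np.sqrt(np.mean(residual**2)))
```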

  16. Special purpose hybrid transfinite elements and unified computational methodology for accurately predicting thermoelastic stress waves

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper represents an attempt to apply extensions of a hybrid transfinite element computational approach for accurately predicting thermoelastic stress waves. The applicability of the present formulations for capturing the thermal stress waves induced by boundary heating for the well known Danilovskaya problems is demonstrated. A unique feature of the proposed formulations for applicability to the Danilovskaya problem of thermal stress waves in elastic solids lies in the hybrid nature of the unified formulations and the development of special purpose transfinite elements in conjunction with the classical Galerkin techniques and transformation concepts. Numerical test cases validate the applicability and superior capability to capture the thermal stress waves induced due to boundary heating.

  17. Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?

    ERIC Educational Resources Information Center

    Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine

    2014-01-01

    Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…

  18. The Computer as an Artistic Tool.

    ERIC Educational Resources Information Center

    Sveinson, Lynn

    1978-01-01

    Presents a justification of the belief that science and art can be successfully combined. The computer's merits are viewed as a potential modelbuilder for the formalization of aesthetic concepts. The rest of the paper details recent and current research on such uses of the machine. (VT)

  19. Computer as a Tool in SAT Preparation.

    ERIC Educational Resources Information Center

    Coffin, Gregory C.

    Two experimental programs, designed to increase Scholastic Aptitude Test (SAT) scores of inner city, low achieving students by using computer-assisted SAT preparation, produced differing results. Forty volunteers from a nearby high school were assigned to two groups of 20 each--one experimental and one control group. The first program provided six…

  20. Computer Grading As an Instructional Tool.

    ERIC Educational Resources Information Center

    Rottmann, Ray M.; Hudson, H. T.

    1983-01-01

    Describes computer grading system providing/storing scores and giving feedback to instructors on how students are performing on a day-to-day basis and how they are handling course concepts. Focuses on the hardware and software of this efficient computerized grading package, which can be used with classes of 250 students (or larger). (Author/JN)

  1. Accurate identification and compensation of geometric errors of 5-axis CNC machine tools using double ball bar

    NASA Astrophysics Data System (ADS)

    Lasemi, Ali; Xue, Deyi; Gu, Peihua

    2016-05-01

    Five-axis CNC machine tools are widely used in manufacturing of parts with free-form surfaces. Geometric errors of machine tools have significant effects on the quality of manufactured parts. This research focuses on development of a new method to accurately identify geometric errors of 5-axis CNC machines, especially the errors due to rotary axes, using the magnetic double ball bar. A theoretical model for identification of geometric errors is provided. In this model, both position-independent errors and position-dependent errors are considered as the error sources. This model is simplified by identification and removal of the correlated and insignificant error sources of the machine. Insignificant error sources are identified using the sensitivity analysis technique. Simulation results reveal that the simplified error identification model can result in more accurate estimations of the error parameters. Experiments on a 5-axis CNC machine tool also demonstrate significant reduction in the volumetric error after error compensation.

  2. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value oriented, deterministic, algorithmic problems, has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  3. Accurate methods for computing inviscid and viscous Kelvin-Helmholtz instability

    NASA Astrophysics Data System (ADS)

    Chen, Michael J.; Forbes, Lawrence K.

    2011-02-01

    The Kelvin-Helmholtz instability is modelled for inviscid and viscous fluids. Here, two bounded fluid layers flow parallel to each other with the interface between them growing in an unstable fashion when subjected to a small perturbation. In the various configurations of this problem, and the related problem of the vortex sheet, there are several phenomena associated with the evolution of the interface; notably the formation of a finite time curvature singularity and the "roll-up" of the interface. Two contrasting computational schemes will be presented. A spectral method is used to follow the evolution of the interface in the inviscid version of the problem. This allows the interface shape to be computed up to the time that a curvature singularity forms, with several computational difficulties overcome to reach that point. A weakly compressible viscous version of the problem is studied using finite difference techniques and a vorticity-streamfunction formulation. The two versions have comparable, but not identical, initial conditions and so the results exhibit some differences in timing. By including a small amount of viscosity the interface may be followed to the point that it rolls up into a classic "cat's-eye" shape. Particular attention was given to computing a consistent initial condition and solving the continuity equation both accurately and efficiently.

  4. Suite of finite element algorithms for accurate computation of soft tissue deformation for surgical simulation

    PubMed Central

    Joldes, Grand Roman; Wittek, Adam; Miller, Karol

    2008-01-01

    Real time computation of soft tissue deformation is important for the use of augmented reality devices and for providing haptic feedback during operation or surgeon training. This requires algorithms that are fast, accurate and can handle material nonlinearities and large deformations. A set of such algorithms is presented in this paper, starting with the finite element formulation and the integration scheme used and addressing common problems such as hourglass control and locking. The computation examples presented prove that by using these algorithms, real time computations become possible without sacrificing the accuracy of the results. For a brain model having more than 7000 degrees of freedom, we computed the reaction forces due to indentation at a frequency of around 1000 Hz using a standard dual core PC. Similarly, we conducted simulation of brain shift using a model with more than 50 000 degrees of freedom in less than a minute. The speed benefits of our models result from combining the Total Lagrangian formulation with explicit time integration and low order finite elements. PMID:19152791
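
    The real-time character of such solvers comes largely from pairing explicit central-difference time stepping with a lumped (diagonal) mass matrix, so no global system is solved at any step. The sketch below shows only that integrator on a 1D chain of nonlinear springs; it is not the authors' Total Lagrangian element formulation, and all material and loading numbers are arbitrary.

```python
import numpy as np

# Explicit central-difference time stepping with a lumped (diagonal) mass matrix.
# A 1D chain of nonlinear springs stands in for the finite element internal forces.

n = 50                      # nodes
m = 1e-3 * np.ones(n)       # lumped nodal masses
k, dt, steps = 100.0, 1e-4, 5000
u_prev = np.zeros(n)
u = np.zeros(n)
f_ext = np.zeros(n)
f_ext[-1] = 0.05            # constant pull on the last node

def spring_forces(u):
    """Nodal forces exerted by the springs for the current displacements."""
    stretch = np.diff(u)                              # relative displacement of neighbours
    f_spring = k * stretch + 10.0 * k * stretch**3    # simple nonlinear spring law
    f = np.zeros_like(u)
    f[:-1] += f_spring
    f[1:] -= f_spring
    return f

for _ in range(steps):
    accel = (f_ext + spring_forces(u)) / m            # only a diagonal "solve"
    u_next = 2 * u - u_prev + dt**2 * accel           # central differences
    u_next[0] = 0.0                                   # fixed end
    u_prev, u = u, u_next

print("tip displacement:", u[-1])
```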

  5. A Computer-Based Tool for Introducing Turfgrass Species.

    ERIC Educational Resources Information Center

    Fermanian, T. W.; Wehner, D. J.

    1995-01-01

    Describes a self-contained computer application constructed using the SuperCard development tool which introduces the characteristics of turfgrass species and their optimum environments. Evaluates students' gain in understanding turf species characteristics through this approach. (LZ)

  6. Astronaut's tool for withdrawing/replacing computer cards

    NASA Technical Reports Server (NTRS)

    West, R. L.

    1969-01-01

    Symmetrical tool allows astronauts to withdraw and replace Apollo Telescope Mount control computer cards. It is easily manipulated by a gloved hand, provides positive locking of a withdrawn card, and has a visible locking device.

  7. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  8. An accurate modeling, simulation, and analysis tool for predicting and estimating Raman LIDAR system performance

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Russo, Leonard P.; Barrett, John L.; Odhner, Jefferson E.; Egbert, Paul I.

    2007-09-01

    BAE Systems presents the results of a program to model the performance of Raman LIDAR systems for the remote detection of atmospheric gases, air polluting hydrocarbons, chemical and biological weapons, and other molecular species of interest. Our model, which integrates remote Raman spectroscopy, 2D and 3D LADAR, and USAF atmospheric propagation codes, permits accurate determination of the performance of a Raman LIDAR system. The very high predictive performance accuracy of our model is due to the very accurate calculation of the differential scattering cross section for the species of interest at user selected wavelengths. We show excellent correlation of our calculated cross section data, used in our model, with experimental data obtained from both laboratory measurements and the published literature. In addition, the use of standard USAF atmospheric models provides very accurate determination of the atmospheric extinction at both the excitation and Raman shifted wavelengths.
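
    The kind of performance estimate described above ultimately rests on the single-scatter Raman lidar equation; a minimal evaluation of it is sketched below. Every number (cross section, densities, extinctions, optics) is a made-up placeholder and the sketch has no connection to the BAE Systems code.

```python
import numpy as np

# Single-scatter Raman lidar equation: the return scales as 1/R^2, with the species
# number density and the differential Raman cross section, and is attenuated at both
# the laser and the Raman-shifted wavelengths.  All values below are illustrative.

R = np.linspace(100.0, 2000.0, 96)          # range (m)
E0 = 0.1                                    # laser pulse energy (J)
A_rx = 0.03                                 # receiver aperture area (m^2)
eta = 0.2                                   # overall optical/detector efficiency
dR = 7.5                                    # range-bin length (m)
dsigma = 3e-34                              # differential Raman cross section (m^2/sr), assumed
n = 2.0e25 * np.exp(-R / 8000.0)            # species number density profile (m^-3), assumed
alpha_out, alpha_back = 1.0e-4, 1.2e-4      # extinction at laser / Raman wavelengths (1/m), assumed

T2 = np.exp(-(alpha_out + alpha_back) * R)  # two-way transmission (uniform atmosphere)
E_rx = E0 * eta * (A_rx / R**2) * n * dsigma * dR * T2   # received energy per range bin (J)

photon_energy = 6.626e-34 * 3.0e8 / 607e-9  # assumed Raman return near 607 nm
print("photons per pulse per bin at 1 km: %.1f" % np.interp(1000.0, R, E_rx / photon_energy))
```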

  9. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  10. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  11. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools are intended to develop an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  12. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
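
    A minimal version of the indicator-plus-dose-response step can be sketched as follows. The sampled doses stand in for values extracted along simulated fish trajectories through the CFD solution, and the threshold and logistic dose-response coefficients are invented for illustration; the BioPA method itself uses dose-response relations from laboratory and field studies.

```python
import numpy as np

# Sketch of a BioPA-style calculation: from samples of an injury-mechanism "dose",
# compute the probability of exposure above a threshold (a performance indicator)
# and fold the dose distribution with an assumed dose-response relation.

rng = np.random.default_rng(1)
shear_dose = rng.lognormal(mean=5.0, sigma=0.6, size=20000)   # sampled doses (1/s), placeholder

threshold = 500.0                                             # assumed injurious dose
p_exposure = np.mean(shear_dose > threshold)                  # performance indicator

def injury_probability(dose):                                 # assumed logistic dose-response
    return 1.0 / (1.0 + np.exp(-(dose - 900.0) / 150.0))

expected_injury_rate = np.mean(injury_probability(shear_dose))
print(f"P(exposure > threshold) = {p_exposure:.3f}, expected injury rate = {expected_injury_rate:.3f}")
```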

  13. DeconMSn: A Software Tool for accurate parent ion monoisotopic mass determination for tandem mass spectra

    SciTech Connect

    Mayampurath, Anoop M.; Jaitly, Navdeep; Purvine, Samuel O.; Monroe, Matthew E.; Auberry, Kenneth J.; Adkins, Joshua N.; Smith, Richard D.

    2008-04-01

    We present a new software tool for tandem MS analyses that:
    • accurately calculates the monoisotopic mass and charge of high-resolution parent ions
    • accurately operates regardless of the mass selected for fragmentation
    • performs independent of instrument settings
    • enables optimal selection of search mass tolerance for high mass accuracy experiments
    • is open source and thus can be tailored to individual needs
    • incorporates a SVM-based charge detection algorithm for analyzing low resolution tandem MS spectra
    • creates multiple output data formats (.dta, .MGF)
    • handles .RAW files and .mzXML formats
    • compatible with SEQUEST, MASCOT, X!Tandem

  14. A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation

    SciTech Connect

    Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro

    2012-06-15

    % for the homogeneous and heterogeneous block phantoms, and agreement for the transverse dose profiles was within 6%. Conclusions: The HVL and kVp are sufficient for characterizing a kV x-ray source spectrum for accurate dose computation. As these parameters can be easily and accurately measured, they provide for a clinically feasible approach to characterizing a kV energy spectrum to be used for patient specific x-ray dose computations. Furthermore, these results provide experimental validation of our novel hybrid dose computation algorithm.

  15. Optical computed tomography of radiochromic gels for accurate three-dimensional dosimetry

    NASA Astrophysics Data System (ADS)

    Babic, Steven

    In this thesis, three-dimensional (3-D) radiochromic Ferrous Xylenol-orange (FX) and Leuco Crystal Violet (LCV) micelle gels were imaged by laser and cone-beam (Vista(TM)) optical computed tomography (CT) scanners. The objective was to develop optical CT of radiochromic gels for accurate 3-D dosimetry of intensity-modulated radiation therapy (IMRT) and small field techniques used in modern radiotherapy. First, the cause of a threshold dose response in FX gel dosimeters when scanned with a yellow light source was determined. This effect stems from a spectral sensitivity to multiple chemical complexes that are at different dose levels between ferric ions and xylenol-orange. To negate the threshold dose, an initial concentration of ferric ions is needed in order to shift the chemical equilibrium so that additional dose results in a linear production of a coloured complex that preferentially absorbs at longer wavelengths. Second, a low diffusion leuco-based radiochromic gel consisting of Triton X-100 micelles was developed. The diffusion coefficient of the LCV micelle gel was found to be minimal (0.036 ± 0.001 mm² hr⁻¹). Although a dosimetric characterization revealed a reduced sensitivity to radiation, this was offset by a lower auto-oxidation rate and base optical density, higher melting point and no spectral sensitivity. Third, the Radiological Physics Centre (RPC) head-and-neck IMRT protocol was extended to 3-D dose verification using laser and cone-beam (Vista(TM)) optical CT scans of FX gels. Both optical systems yielded comparable measured dose distributions in high-dose regions and low gradients. The FX gel dosimetry results were cross-checked against independent thermoluminescent dosimeter and GAFChromic EBT film measurements made by the RPC. It was shown that optical CT scanned FX gels can be used for accurate IMRT dose verification in 3-D. Finally, corrections for FX gel diffusion and scattered stray light in the Vista(TM) scanner were developed to

  16. The Learning Computer: Low Bandwidth Tool that Bridges Digital Divide

    ERIC Educational Resources Information Center

    Johnson, Russell; Kemp, Elizabeth; Kemp, Ray; Blakey, Peter

    2007-01-01

    This article reports on a project that explores strategies for narrowing the digital divide by providing a practicable e-learning option for the millions living outside the ambit of high performance computing and communication technology. The concept is introduced of a "learning computer," a low bandwidth tool that provides a simplified,…

  17. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  18. The Use of Computer Tools to Support Meaningful Learning

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…

  19. Scratch as a computational modelling tool for teaching physics

    NASA Astrophysics Data System (ADS)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.

  20. Accurate 3-D finite difference computation of traveltimes in strongly heterogeneous media

    NASA Astrophysics Data System (ADS)

    Noble, M.; Gesret, A.; Belayouni, N.

    2014-12-01

    Seismic traveltimes and their spatial derivatives are the basis of many imaging methods such as pre-stack depth migration and tomography. A common approach to compute these quantities is to solve the eikonal equation with a finite-difference scheme. Although many recently published algorithms for solving the eikonal equation now yield fairly accurate traveltimes for most applications, the spatial derivatives of traveltimes remain very approximate. To address this accuracy issue, we develop a new hybrid eikonal solver that combines a spherical approximation when close to the source and a plane wave approximation when far away. This algorithm properly reproduces the spherical behaviour of wave fronts in the vicinity of the source. We implement a combination of 16 local operators that enables us to handle velocity models with sharp vertical and horizontal velocity contrasts. We combine these local operators with a global fast sweeping method to take into account all possible directions of wave propagation. Our formulation allows us to introduce a variable grid spacing in all three directions of space. We demonstrate the efficiency of this algorithm in terms of computational time and the gain in accuracy of the computed traveltimes and their derivatives on several numerical examples.
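
    For context, the snippet below implements a plain first-order Godunov fast-sweeping solver on a uniform grid, i.e. only the global sweeping ingredient mentioned above. It contains neither the hybrid spherical/plane-wave local operators nor the variable grid spacing that give the paper its accuracy gains; the grid size and velocity are arbitrary.

```python
import numpy as np

# First-order fast-sweeping solver for the eikonal equation |grad T| = s on a
# uniform 2D grid (Godunov upwind update, four alternating sweep orderings).

n, h = 101, 10.0                        # grid size and spacing (m)
s = np.full((n, n), 1.0 / 2000.0)       # slowness (s/m): homogeneous 2 km/s example
T = np.full((n, n), np.inf)
T[n // 2, n // 2] = 0.0                 # point source at the centre of the grid

def sweep(irange, jrange):
    for i in irange:
        for j in jrange:
            a = min(T[i - 1, j] if i > 0 else np.inf,
                    T[i + 1, j] if i < n - 1 else np.inf)
            b = min(T[i, j - 1] if j > 0 else np.inf,
                    T[i, j + 1] if j < n - 1 else np.inf)
            if a > b:
                a, b = b, a             # ensure a <= b
            if np.isinf(a):
                continue                # no upwind neighbour available yet
            sh = s[i, j] * h
            t_new = a + sh if b - a >= sh else 0.5 * (a + b + np.sqrt(2 * sh * sh - (a - b) ** 2))
            if t_new < T[i, j]:
                T[i, j] = t_new

for _ in range(3):                      # a few passes over the four sweep directions
    sweep(range(n), range(n))
    sweep(range(n - 1, -1, -1), range(n))
    sweep(range(n - 1, -1, -1), range(n - 1, -1, -1))
    sweep(range(n), range(n - 1, -1, -1))

print("traveltime to a corner: %.4f s" % T[0, 0])   # ~ straight-ray time plus grid error
```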

  1. Computationally efficient and accurate enantioselectivity modeling by clusters of molecular dynamics simulations.

    PubMed

    Wijma, Hein J; Marrink, Siewert J; Janssen, Dick B

    2014-07-28

    Computational approaches could decrease the need for the laborious high-throughput experimental screening that is often required to improve enzymes by mutagenesis. Here, we report that using multiple short molecular dynamics (MD) simulations makes it possible to accurately model enantioselectivity for large numbers of enzyme-substrate combinations at low computational costs. We chose four different haloalkane dehalogenases as model systems because of the availability of a large set of experimental data on the enantioselective conversion of 45 different substrates. To model the enantioselectivity, we quantified the frequency of occurrence of catalytically productive conformations (near attack conformations) for pairs of enantiomers during MD simulations. We found that the angle of nucleophilic attack that leads to carbon-halogen bond cleavage was a critical variable that limited the occurrence of productive conformations; enantiomers for which this angle reached values close to 180° were preferentially converted. A cluster of 20-40 very short (10 ps) MD simulations allowed adequate conformational sampling and resulted in much better agreement with experimental enantioselectivities than single long MD simulations (22 ns), while the computational costs were 50-100-fold lower. With single long MD simulations, the dynamics of enzyme-substrate complexes remained confined to a conformational subspace that rarely changed significantly, whereas with multiple short MD simulations a larger diversity of conformations of enzyme-substrate complexes was observed. PMID:24916632
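
    The counting step at the heart of this approach can be sketched as below: pool per-frame geometric measurements from a cluster of short runs, score the near-attack-conformation (NAC) fraction for each enantiomer, and take the ratio as a crude selectivity proxy. The angle and distance cutoffs and the synthetic data are placeholders, not the criteria or results of the paper.

```python
import numpy as np

# NAC-counting sketch: score the fraction of MD frames that satisfy geometric
# criteria for catalysis, separately for each enantiomer, and compare.

rng = np.random.default_rng(2)

def nac_fraction(attack_angle_deg, attack_distance_A,
                 angle_cutoff=160.0, distance_cutoff=3.5):   # illustrative cutoffs
    ok = (attack_angle_deg >= angle_cutoff) & (attack_distance_A <= distance_cutoff)
    return np.mean(ok)

# placeholder data: (angle, distance) per frame, pooled over ~30 x 10 ps runs
frames_R = (rng.normal(165, 10, 3000), rng.normal(3.3, 0.3, 3000))
frames_S = (rng.normal(150, 10, 3000), rng.normal(3.6, 0.3, 3000))

f_R, f_S = nac_fraction(*frames_R), nac_fraction(*frames_S)
E_apparent = f_R / max(f_S, 1e-6)          # crude NAC-frequency ratio as selectivity proxy
print(f"NAC fraction R: {f_R:.3f}, S: {f_S:.3f}, apparent E: {E_apparent:.1f}")
```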

  2. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists that is caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of the computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models must have. In this setting, users and developers are dependent on an entire distribution, possibly with multiple compilers and special instructions depending on the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open source scientific software distribution.

  3. CoMOGrad and PHOG: From Computer Vision to Fast and Accurate Protein Tertiary Structure Retrieval

    PubMed Central

    Karim, Rezaul; Aziz, Mohd. Momin Al; Shatabda, Swakkhar; Rahman, M. Sohel; Mia, Md. Abul Kashem; Zaman, Farhana; Rakin, Salman

    2015-01-01

    The number of entries in a structural database of proteins is increasing day by day. Methods for retrieving protein tertiary structures from such a large database have turned out to be the key to comparative analysis of structures, which plays an important role in understanding proteins and their functions. In this paper, we present fast and accurate methods for the retrieval of proteins having tertiary structures similar to a query protein from a large database. Our proposed methods borrow ideas from the field of computer vision. The speed and accuracy of our methods come from two newly introduced features, the co-occurrence matrix of oriented gradients and the pyramid histogram of oriented gradients, and from the use of Euclidean distance as the distance measure. Experimental results clearly indicate the superiority of our approach in both running time and accuracy. Our method is readily available for use from this website: http://research.buet.ac.bd:8080/Comograd/. PMID:26293226

  5. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications have been procured or are under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  6. Towards the computations of accurate spectroscopic parameters and vibrational spectra for organic compounds

    NASA Astrophysics Data System (ADS)

    Hochlaf, M.; Puzzarini, C.; Senent, M. L.

    2015-07-01

    We present multi-component computations for rotational constants, vibrational and torsional levels of medium-sized molecules. Through the treatment of two organic sulphur molecules, ethyl mercaptan and dimethyl sulphide, which are relevant for atmospheric and astrophysical media, we point out the outstanding capabilities of the explicitly correlated coupled-cluster (CCSD(T)-F12) method in conjunction with the cc-pVTZ-F12 basis set for the accurate predictions of such quantities. Indeed, we show that the CCSD(T)-F12/cc-pVTZ-F12 equilibrium rotational constants are in good agreement with those obtained by means of a composite scheme based on CCSD(T) calculations that accounts for the extrapolation to the complete basis set (CBS) limit and core-correlation effects [CCSD(T)/CBS+CV], thus leading to values of ground-state rotational constants rather close to the corresponding experimental data. For vibrational and torsional levels, our analysis reveals that the anharmonic frequencies derived from CCSD(T)-F12/cc-pVTZ-F12 harmonic frequencies and anharmonic corrections (Δν = ω - ν) at the CCSD/cc-pVTZ level closely agree with experimental results. The pattern of the torsional transitions and the shape of the potential energy surfaces along the torsional modes are also well reproduced using the CCSD(T)-F12/cc-pVTZ-F12 energies. Interestingly, this good accuracy is accompanied by a strong reduction in computational cost. This makes the procedures proposed here the schemes of choice for effective and accurate prediction of spectroscopic properties of organic compounds. Finally, popular density functional approaches are compared with the coupled cluster (CC) methodologies in torsional studies. The long-range CAM-B3LYP functional of Handy and co-workers is recommended for large systems.
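
    The hybrid recipe described above (harmonic frequencies from CCSD(T)-F12/cc-pVTZ-F12, anharmonic corrections Δν = ω - ν from CCSD/cc-pVTZ) reduces to simple arithmetic once the two sets of calculations are in hand; a sketch with placeholder numbers follows, not values from the paper.

```python
import numpy as np

# Hybrid anharmonic estimate: take harmonic frequencies omega from the high level,
# anharmonic corrections delta = omega - nu from the cheaper level, and combine.
# All frequencies below are placeholders in cm^-1.

omega_f12   = np.array([3010.0, 1480.0, 720.0])   # high-level harmonic frequencies
omega_cheap = np.array([3025.0, 1492.0, 726.0])   # low-level harmonic frequencies
nu_cheap    = np.array([2880.0, 1455.0, 715.0])   # low-level anharmonic frequencies

delta = omega_cheap - nu_cheap                    # anharmonic corrections
nu_predicted = omega_f12 - delta                  # hybrid anharmonic estimate
print(nu_predicted)
```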

  7. Accurate Time-Dependent Traveling-Wave Tube Model Developed for Computational Bit-Error-Rate Testing

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    2001-01-01

    The phenomenal growth of the satellite communications industry has created a large demand for traveling-wave tubes (TWT's) operating with unprecedented specifications requiring the design and production of many novel devices in record time. To achieve this, the TWT industry heavily relies on computational modeling. However, the TWT industry's computational modeling capabilities need to be improved because there are often discrepancies between measured TWT data and that predicted by conventional two-dimensional helical TWT interaction codes. This limits the analysis and design of novel devices or TWT's with parameters differing from what is conventionally manufactured. In addition, the inaccuracy of current computational tools limits achievable TWT performance because optimized designs require highly accurate models. To address these concerns, a fully three-dimensional, time-dependent, helical TWT interaction model was developed using the electromagnetic particle-in-cell code MAFIA (Solution of MAxwell's equations by the Finite-Integration-Algorithm). The model includes a short section of helical slow-wave circuit with excitation fed by radiofrequency input/output couplers, and an electron beam contained by periodic permanent magnet focusing. A cutaway view of several turns of the three-dimensional helical slow-wave circuit with input/output couplers is shown. This has been shown to be more accurate than conventionally used two-dimensional models. The growth of the communications industry has also imposed a demand for increased data rates for the transmission of large volumes of data. To achieve increased data rates, complex modulation and multiple access techniques are employed requiring minimum distortion of the signal as it is passed through the TWT. Thus, intersymbol interference (ISI) becomes a major consideration, as well as suspected causes such as reflections within the TWT. To experimentally investigate effects of the physical TWT on ISI would be

  8. Enabling high grayscale resolution displays and accurate response time measurements on conventional computers.

    PubMed

    Li, Xiangrui; Lu, Zhong-Lin

    2012-01-01

    Display systems based on conventional computer graphics cards are capable of generating images with 8-bit gray level resolution. However, most experiments in vision research require displays with more than 12 bits of luminance resolution. Several solutions are available. Bit++ (1) and DataPixx (2) use the Digital Visual Interface (DVI) output from graphics cards and high resolution (14 or 16-bit) digital-to-analog converters to drive analog display devices. The VideoSwitcher (3) described here combines analog video signals from the red and blue channels of graphics cards with different weights using a passive resistor network (4) and an active circuit to deliver identical video signals to the three channels of color monitors. The method provides an inexpensive way to enable high-resolution monochromatic displays using conventional graphics cards and analog monitors. It can also provide trigger signals that can be used to mark stimulus onsets, making it easy to synchronize visual displays with physiological recordings or response time measurements. Although computer keyboards and mice are frequently used in measuring response times (RT), the accuracy of these measurements is quite low. The RTbox is a specialized hardware and software solution for accurate RT measurements. Connected to the host computer through a USB connection, the driver of the RTbox is compatible with all conventional operating systems. It uses a microprocessor and high-resolution clock to record the identities and timing of button events, which are buffered until the host computer retrieves them. The recorded button events are not affected by potential timing uncertainties or biases associated with data transmission and processing in the host computer. The asynchronous storage greatly simplifies the design of user programs. Several methods are available to synchronize the clocks of the RTbox and the host computer. The RTbox can also receive external triggers and be used to measure RT with respect
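
    The grayscale trick rests on a weighted sum of two 8-bit channels: the blue channel sets coarse luminance steps and the attenuated red channel fills in fine steps. The sketch below is only schematic; the weight is a made-up example and the real device requires photometric calibration of the combined output.

```python
import numpy as np

# Schematic of a weighted two-channel combination: an 8-bit card addresses many more
# monochrome levels when one channel is attenuated and added to the other.
# WEIGHT is an assumed example value, not the VideoSwitcher's calibrated ratio.

WEIGHT = 128.0

def encode(level):
    """Map a high-resolution gray level (0 .. 255*WEIGHT + 255) to (red, blue) bytes."""
    blue, red = divmod(int(round(level)), int(WEIGHT))
    blue = min(blue, 255)
    return red, blue

def combined_luminance(red, blue):
    """Relative luminance delivered to the monitor, in units of one fine (red) step."""
    return WEIGHT * blue + red

levels = np.arange(0, 5)                        # five finest steps near black
print([combined_luminance(*encode(v)) for v in levels])   # -> [0.0, 1.0, 2.0, 3.0, 4.0]
```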

  9. Numerical Computation of a Continuous-thrust State Transition Matrix Incorporating Accurate Hardware and Ephemeris Models

    NASA Technical Reports Server (NTRS)

    Ellison, Donald; Conway, Bruce; Englander, Jacob

    2015-01-01

    A significant body of work exists showing that providing a nonlinear programming (NLP) solver with expressions for the problem constraint gradient substantially increases the speed of program execution and can also improve the robustness of convergence, especially for local optimizers. Calculation of these derivatives is often accomplished through the computation of the spacecraft's state transition matrix (STM). If the two-body gravitational model is employed, as is often done in the context of preliminary design, closed form expressions for these derivatives may be provided. If a high-fidelity dynamics model that might include perturbing forces, such as the gravitational effect of multiple third bodies and solar radiation pressure, is used, then these STMs must be computed numerically. We present a method for the power hardware model and a full ephemeris model. An adaptive-step embedded eighth-order Dormand-Prince numerical integrator is discussed and a method for the computation of the time of flight derivatives in this framework is presented. The use of these numerically calculated derivatives offers a substantial improvement over finite differencing in the context of a global optimizer. Specifically, the inclusion of these STMs into the low-thrust mission design tool chain in use at NASA Goddard Space Flight Center allows for an increased preliminary mission design cadence.
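
    Numerically propagating an STM alongside the trajectory amounts to integrating the variational equations dΦ/dt = A(t)Φ together with the state; the sketch below does this for two-body gravity only, using SciPy's eighth-order Dormand-Prince integrator (DOP853). The perturbations, hardware and ephemeris models discussed above are omitted, and the initial orbit is illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Propagate the state and the 6x6 state transition matrix (STM) together by
# integrating the variational equations for two-body gravity.

mu = 398600.4418          # km^3/s^2 (Earth)

def dynamics(t, y):
    r, v = y[:3], y[3:6]
    phi = y[6:].reshape(6, 6)
    rn = np.linalg.norm(r)
    a = -mu * r / rn**3
    # Jacobian of the two-body acceleration with respect to position
    G = mu * (3.0 * np.outer(r, r) / rn**5 - np.eye(3) / rn**3)
    A = np.zeros((6, 6))
    A[:3, 3:] = np.eye(3)
    A[3:, :3] = G
    dphi = A @ phi
    return np.concatenate([v, a, dphi.ravel()])

y0 = np.concatenate([[7000.0, 0.0, 0.0, 0.0, 7.546, 0.0], np.eye(6).ravel()])
sol = solve_ivp(dynamics, (0.0, 3600.0), y0, method="DOP853", rtol=1e-10, atol=1e-12)
STM = sol.y[6:, -1].reshape(6, 6)       # d(state at t)/d(state at t0)
print(np.round(STM, 3))
```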

  10. Laser-induced accurate frontal cortex damage: a new tool for brain study

    NASA Astrophysics Data System (ADS)

    Flores, Gonzalo; Khotiaintsev, Sergei N.; Sanchez-Huerta, Maria L.; Ibanes, Osvaldo; Hernandez, Adan; Silva, Adriana B.; Calderon, Rafael; Ayala, Griselda; Marroquin, Javier; Svirid, Vladimir; Khotiaintsev, Yuri V.

    1999-01-01

    A new laser-based technique for anatomical-functional study of the medial prefrontal cortex (MPFC) of the brain of experimental animals (rats) is presented. The technique is based on making accurate, well-controlled lesions to small regions of the MPFC and subsequently observing behavioral alterations in the lesioned animals relative to control ones. The laser produces smaller and more accurate lesions in comparison to those obtained by traditional methods, such as mechanical action, chemical means, and electrical currents. For producing the brain lesions, a 10 W CO2 CW laser is employed because of its sufficiently high power combined with a relatively low cost-per-watt ratio. In our experience, such a power rating is sufficient for making MPFC lesions. The laser radiation is applied in the form of a pulse series via a hollow circular metallic waveguide made of stainless steel. The waveguide has an inner diameter of 1.3 mm and is 95 mm long. The anesthetized animals are placed in a stereotaxic instrument. Via perforations made in the skull bone, the MPFC is exposed to the laser radiation. Several weeks later (after animal recuperation), standard behavioral tests are performed. They reveal behavioral changes, which point to damage of some small regions of the MPFC. These results correlate with the histological data, which reveal the existence of small and accurate MPFC lesions. The present technique has good prospects for use in anatomical-functional studies of the brain by area. In addition, this technique appears to have considerable promise as a treatment method for some pathologies, e.g., Parkinson's disease.

  11. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  12. Parallel Higher-order Finite Element Method for Accurate Field Computations in Wakefield and PIC Simulations

    SciTech Connect

    Candel, A.; Kabel, A.; Lee, L.; Li, Z.; Limborg, C.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Ko, K.; /SLAC

    2009-06-19

    Over the past years, SLAC's Advanced Computations Department (ACD), under SciDAC sponsorship, has developed a suite of 3D (2D) parallel higher-order finite element (FE) codes, T3P (T2P) and Pic3P (Pic2P), aimed at accurate, large-scale simulation of wakefields and particle-field interactions in radio-frequency (RF) cavities of complex shape. The codes are built on the FE infrastructure that supports SLAC's frequency domain codes, Omega3P and S3P, to utilize conformal tetrahedral (triangular) meshes, higher-order basis functions and quadratic geometry approximation. For time integration, they adopt an unconditionally stable implicit scheme. Pic3P (Pic2P) extends T3P (T2P) to treat charged-particle dynamics self-consistently using the PIC (particle-in-cell) approach, the first such implementation on a conformal, unstructured grid using Whitney basis functions. Examples from applications to the International Linear Collider (ILC), Positron Electron Project-II (PEP-II), Linac Coherent Light Source (LCLS) and other accelerators will be presented to compare the accuracy and computational efficiency of these codes versus their counterparts using structured grids.

  13. Development of highly accurate approximate scheme for computing the charge transfer integral.

    PubMed

    Pershin, Anton; Szalay, Péter G

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the "exact" scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the "exact" calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature. PMID:26298117

  15. Virtual Cell: computational tools for modeling in cell biology

    PubMed Central

    Resasco, Diana C.; Gao, Fei; Morgan, Frank; Novak, Igor L.; Schaff, James C.; Slepchenko, Boris M.

    2011-01-01

    The Virtual Cell (VCell) is a general computational framework for modeling physico-chemical and electrophysiological processes in living cells. Developed by the National Resource for Cell Analysis and Modeling at the University of Connecticut Health Center, it provides automated tools for simulating a wide range of cellular phenomena in space and time, both deterministically and stochastically. These computational tools allow one to couple electrophysiology and reaction kinetics with transport mechanisms, such as diffusion and directed transport, and map them onto spatial domains of various shapes, including irregular three-dimensional geometries derived from experimental images. In this article, we review new robust computational tools recently deployed in VCell for treating spatially resolved models. PMID:22139996

  16. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
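
    SPARSKIT itself is a Fortran tool kit; purely as a hedged illustration of the kinds of operations the abstract lists (format conversion, simple statistics, basic linear algebra), the sketch below performs analogous steps with Python's scipy.sparse rather than SPARSKIT's own routines:

    ```python
    import numpy as np
    from scipy import sparse

    # Build a small sparse matrix in COO (triplet) format.
    rows = np.array([0, 1, 2, 2, 3])
    cols = np.array([0, 1, 0, 2, 3])
    vals = np.array([4.0, 5.0, 1.0, 3.0, 2.0])
    A = sparse.coo_matrix((vals, (rows, cols)), shape=(4, 4))

    # Format conversions, analogous to SPARSKIT's conversion programs.
    A_csr = A.tocsr()   # compressed sparse row
    A_csc = A.tocsc()   # compressed sparse column

    # Simple statistics on the matrix.
    print("nonzeros:", A_csr.nnz)
    print("nonzeros per row:", np.diff(A_csr.indptr))
    print("symmetric:", (A_csr != A_csr.T).nnz == 0)

    # A basic linear algebra operation with the sparse matrix.
    x = np.ones(4)
    print("A @ x =", A_csr @ x)
    ```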

  17. Tool Use and Performance: Relationships between Tool- and Learner-Related Characteristics in a Computer-Based Learning Environment

    ERIC Educational Resources Information Center

    Juarez-Collazo, Norma A.; Elen, Jan; Clarebout, Geraldine

    2013-01-01

    It remains unclear which tool and learner characteristics influence tool use, and how, and consequently performance in computer-based learning environments (CBLEs). This study examines the relationships between tool-related characteristics (tool presentation: non-/embedded tool and instructional cues: non-/explained tool functionality) and…

  18. Accurate computation and interpretation of spin-dependent properties in metalloproteins

    NASA Astrophysics Data System (ADS)

    Rodriguez, Jorge

    2006-03-01

    Nature uses the properties of open-shell transition metal ions to carry out a variety of functions associated with vital life processes. Mononuclear and binuclear iron centers, in particular, are intriguing structural motifs present in many heme and non-heme proteins. Hemerythrin and methane monooxygenase, for example, are members of the latter class whose diiron active sites display magnetic ordering. We have developed a computational protocol based on spin density functional theory (SDFT) to accurately predict physico-chemical parameters of metal sites in proteins and bioinorganic complexes which traditionally had only been determined from experiment. We have used this new methodology to perform a comprehensive study of the electronic structure and magnetic properties of heme and non-heme iron proteins and related model compounds. We have been able to predict with a high degree of accuracy spectroscopic (Mössbauer, EPR, UV-vis, Raman) and magnetization parameters of iron proteins and, at the same time, gained unprecedented microscopic understanding of their physico-chemical properties. Our results have allowed us to establish important correlations between the electronic structure, geometry, spectroscopic data, and biochemical function of heme and non-heme iron proteins.

  19. Aeroacoustic Flow Phenomena Accurately Captured by New Computational Fluid Dynamics Method

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.

    2002-01-01

    One of the challenges in the computational fluid dynamics area is the accurate calculation of aeroacoustic phenomena, especially in the presence of shock waves. One such phenomenon is "transonic resonance," where an unsteady shock wave at the throat of a convergent-divergent nozzle results in the emission of acoustic tones. The space-time Conservation-Element and Solution-Element (CE/SE) method developed at the NASA Glenn Research Center can faithfully capture the shock waves, their unsteady motion, and the generated acoustic tones. The CE/SE method is a revolutionary new approach to the numerical modeling of physical phenomena where features with steep gradients (e.g., shock waves, phase transition, etc.) must coexist with those having weaker variations. The CE/SE method does not require the complex interpolation procedures (that allow for the possibility of a shock between grid cells) used by many other methods to transfer information between grid cells. These interpolation procedures can add too much numerical dissipation to the solution process. Thus, while shocks are resolved, weaker waves, such as acoustic waves, are washed out.

  20. Fast and accurate computation of two-dimensional non-separable quadratic-phase integrals.

    PubMed

    Koç, Aykut; Ozaktas, Haldun M; Hesselink, Lambertus

    2010-06-01

    We report a fast and accurate algorithm for numerical computation of two-dimensional non-separable linear canonical transforms (2D-NS-LCTs). Also known as quadratic-phase integrals, this class of integral transforms represents a broad class of optical systems including Fresnel propagation in free space, propagation in graded-index media, passage through thin lenses, and arbitrary concatenations of any number of these, including anamorphic/astigmatic/non-orthogonal cases. The general two-dimensional non-separable case poses several challenges which do not exist in the one-dimensional case and the separable two-dimensional case. The algorithm takes approximately N log N time, where N is the two-dimensional space-bandwidth product of the signal. Our method properly tracks and controls the space-bandwidth products in two dimensions, in order to achieve information theoretically sufficient, but not wastefully redundant, sampling required for the reconstruction of the underlying continuous functions at any stage of the algorithm. Additionally, we provide an alternative definition of general 2D-NS-LCTs that shows its kernel explicitly in terms of its ten parameters, and relate these parameters bidirectionally to conventional ABCD matrix parameters. PMID:20508697
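
    For orientation, the familiar one-dimensional, separable special case can be written down explicitly. The kernel below is the standard 1D quadratic-phase (LCT) kernel for an ABCD matrix with ad - bc = 1 and b ≠ 0; conventions for the prefactor and signs vary in the literature, and this is not the ten-parameter 2D kernel derived in the paper:

    $$ K_{(a,b,c,d)}(u,x) \;=\; \frac{1}{\sqrt{i\,b}}\;\exp\!\left[\frac{i\pi}{b}\left(a\,x^{2} \;-\; 2\,u\,x \;+\; d\,u^{2}\right)\right] . $$

    Fresnel free-space propagation and the fractional Fourier transform are recovered by particular choices of (a, b, c, d).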

  1. Accurate computation of surface stresses and forces with immersed boundary methods

    NASA Astrophysics Data System (ADS)

    Goza, Andres; Liska, Sebastian; Morley, Benjamin; Colonius, Tim

    2016-09-01

    Many immersed boundary methods solve for surface stresses that impose the velocity boundary conditions on an immersed body. These surface stresses may contain spurious oscillations that make them ill-suited for representing the physical surface stresses on the body. Moreover, these inaccurate stresses often lead to unphysical oscillations in the history of integrated surface forces such as the coefficient of lift. While the errors in the surface stresses and forces do not necessarily affect the convergence of the velocity field, it is desirable, especially in fluid-structure interaction problems, to obtain smooth and convergent stress distributions on the surface. To this end, we show that the equation for the surface stresses is an integral equation of the first kind whose ill-posedness is the source of spurious oscillations in the stresses. We also demonstrate that for sufficiently smooth delta functions, the oscillations may be filtered out to obtain physically accurate surface stresses. The filtering is applied as a post-processing procedure, so that the convergence of the velocity field is unaffected. We demonstrate the efficacy of the method by computing stresses and forces that converge to the physical stresses and forces for several test problems.
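
    As a hedged sketch of the post-processing idea only (smooth the raw discrete surface stresses after the flow solve), and not the authors' specific smoothed-delta-function filter, one might convolve the stress distribution on the Lagrangian surface points with a compact kernel; the Hann-window stand-in below imitates that qualitatively:

    ```python
    import numpy as np

    def filter_surface_stress(raw_stress, kernel_halfwidth=2):
        """Smooth a 1D array of surface stresses defined on Lagrangian
        boundary points of a closed body (periodic surface). Generic
        stand-in, not the paper's smoothed-delta-function filter."""
        w = np.hanning(2 * kernel_halfwidth + 3)
        w /= w.sum()
        # Periodic (wrap-around) convolution for a closed surface.
        padded = np.concatenate([raw_stress[-kernel_halfwidth - 1:],
                                 raw_stress,
                                 raw_stress[:kernel_halfwidth + 1]])
        smoothed = np.convolve(padded, w, mode="same")
        return smoothed[kernel_halfwidth + 1:-(kernel_halfwidth + 1)]

    # Example: a smooth stress profile contaminated with grid-scale noise.
    theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    raw = np.sin(theta) + 0.2 * (-1.0) ** np.arange(theta.size)
    print(filter_surface_stress(raw)[:5])
    ```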

  2. Facilitating the selection and creation of accurate interatomic potentials with robust tools and characterization

    NASA Astrophysics Data System (ADS)

    Trautt, Zachary T.; Tavazza, Francesca; Becker, Chandler A.

    2015-10-01

    The Materials Genome Initiative seeks to significantly decrease the cost and time of development and integration of new materials. Within the domain of atomistic simulations, several roadblocks stand in the way of reaching this goal. While the NIST Interatomic Potentials Repository hosts numerous interatomic potentials (force fields), researchers cannot immediately determine the best choice(s) for their use case. Researchers developing new potentials, specifically those in restricted environments, lack a comprehensive portfolio of efficient tools capable of calculating and archiving the properties of their potentials. This paper elucidates one solution to these problems, which uses Python-based scripts that are suitable for rapid property evaluation and human knowledge transfer. Calculation results are visible on the repository website, which reduces the time required to select an interatomic potential for a specific use case. Furthermore, property evaluation scripts are being integrated with modern platforms to improve discoverability and access of materials property data. To demonstrate these scripts and features, we will discuss the automation of stacking fault energy calculations and their application to additional elements. While the calculation methodology was developed previously, we are using it here as a case study in simulation automation and property calculations. We demonstrate how the use of Python scripts allows for rapid calculation in a more easily managed way where the calculations can be modified, and the results presented in user-friendly and concise ways. Additionally, the methods can be incorporated into other efforts, such as openKIM.
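
    A hedged sketch of the kind of Python-driven automation described here; the potential names, the evaluate_stacking_fault_energy helper, and the output file are hypothetical placeholders, not the NIST repository scripts themselves:

    ```python
    import json

    # Hypothetical list of interatomic potentials to characterize; in a
    # real workflow these would come from the repository rather than a
    # hard-coded list.
    POTENTIALS = ["Al_eam_v1", "Al_eam_v2", "Cu_meam_v3"]

    def evaluate_stacking_fault_energy(potential_name):
        """Hypothetical placeholder: run an atomistic calculation (e.g. via
        LAMMPS or ASE) for the given potential and return the intrinsic
        stacking fault energy in mJ/m^2."""
        raise NotImplementedError("hook up an atomistic calculator here")

    def characterize_all(potentials):
        """Loop over potentials, collect properties, and archive them as
        JSON so results can be published alongside the repository entry."""
        results = {}
        for name in potentials:
            try:
                value = evaluate_stacking_fault_energy(name)
            except NotImplementedError:
                value = None
            results[name] = {"stacking_fault_energy_mJ_m2": value}
        with open("property_report.json", "w") as fh:
            json.dump(results, fh, indent=2)
        return results

    if __name__ == "__main__":
        print(characterize_all(POTENTIALS))
    ```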

  3. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm{sup 3}) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm{sup 3}, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm{sup 3}, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0

  4. Computer Mathematical Tools: Practical Experience of Learning to Use Them

    ERIC Educational Resources Information Center

    Semenikhina, Elena; Drushlyak, Marina

    2014-01-01

    The article contains general information about the use of specialized mathematics software in the preparation of math teachers. The authors indicate the reasons to study the mathematics software. In particular, they analyze the possibility of presenting basic mathematical courses using mathematical computer tools from both a teacher and a student,…

  5. The Computer and Language Learning: Productivity Tools in the Classroom.

    ERIC Educational Resources Information Center

    Thrush, Emily A.

    Early programs for computer-assisted language learning were limited in size and power by the capabilities of the first generation of microcomputers. As these capabilities have increased, it has become possible for language teachers to take advantage of tools originally intended for use in the business world, such as word processors, spreadsheets,…

  6. Integrating Computer-Assisted Translation Tools into Language Learning

    ERIC Educational Resources Information Center

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  7. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  8. Cartoons beyond Clipart: A Computer Tool for Storyboarding and Storywriting

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2009-01-01

    This paper describes the motivation, proposal, and early prototype testing of a computer tool for story visualisation. An analysis of current software for making various types of visual story is made; this identifies a gap between software which emphasises preset banks of artwork, and software which emphasises low-level construction and/or…

  9. A MATLAB-based tool for accurate detection of perfect overlapping and nested inverted repeats in DNA sequences

    PubMed Central

    Sreeskandarajan, Sutharzan; Flowers, Michelle M.; Karro, John E.; Liang, Chun

    2014-01-01

    Summary: Palindromic sequences, or inverted repeats (IRs), in DNA sequences are involved in important biological processes such as DNA–protein binding, DNA replication and DNA transposition. Development of bioinformatics tools that are capable of accurately detecting perfect IRs can enable genome-wide studies of IR patterns in both prokaryotes and eukaryotes. Different from conventional string-comparison approaches, we propose a novel algorithm that uses a cumulative score system based on a prime number representation of nucleotide bases. We then implemented this algorithm as a MATLAB-based program for perfect IR detection. In comparison with other existing tools, our program demonstrates a high accuracy in detecting nested and overlapping IRs. Availability and implementation: The source code is freely available at http://bioinfolab.miamioh.edu/bioinfolab/palindrome.php Contact: liangc@miamioh.edu or karroje@miamioh.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24215021
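
    The prime-number scoring system itself is not spelled out in the abstract; as a hedged illustration of what detecting a "perfect IR" means operationally, the sketch below uses a plain center-expansion search rather than the authors' cumulative-score algorithm:

    ```python
    COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def perfect_inverted_repeats(seq, min_arm=3):
        """Yield (start, end, arm_length) for perfect IRs in seq, i.e.
        segments whose left arm is the reverse complement of the right arm.
        Center-expansion approach; spacer-free (hairpin-style) IRs only."""
        seq = seq.upper()
        n = len(seq)
        for center in range(1, n):          # boundary between the two arms
            arm = 0
            while (center - arm - 1 >= 0 and center + arm < n and
                   COMPLEMENT.get(seq[center - arm - 1]) == seq[center + arm]):
                arm += 1
            if arm >= min_arm:
                yield (center - arm, center + arm, arm)

    # Example: TTGAATTCAA is a 10-bp perfect inverted repeat (its right
    # half, TTCAA, is the reverse complement of its left half, TTGAA).
    for start, end, arm in perfect_inverted_repeats("TTGAATTCAA", min_arm=3):
        print(start, end, arm)
    ```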

  10. Raman Spectroscopy Provides a Powerful Diagnostic Tool for Accurate Determination of Albumin Glycation

    PubMed Central

    Dingari, Narahara Chari; Horowitz, Gary L.; Kang, Jeon Woong; Dasari, Ramachandra R.; Barman, Ishan

    2012-01-01

    We present the first demonstration of glycated albumin detection and quantification using Raman spectroscopy without the addition of reagents. Glycated albumin is an important marker for monitoring the long-term glycemic history of diabetics, especially as its concentrations, in contrast to glycated hemoglobin levels, are unaffected by changes in erythrocyte life times. Clinically, glycated albumin concentrations show a strong correlation with the development of serious diabetes complications including nephropathy and retinopathy. In this article, we propose and evaluate the efficacy of Raman spectroscopy for determination of this important analyte. By utilizing the pre-concentration obtained through drop-coating deposition, we show that glycation of albumin leads to subtle, but consistent, changes in vibrational features, which with the help of multivariate classification techniques can be used to discriminate glycated albumin from the unglycated variant with 100% accuracy. Moreover, we demonstrate that the calibration model developed on the glycated albumin spectral dataset shows high predictive power, even at substantially lower concentrations than those typically encountered in clinical practice. In fact, the limit of detection for glycated albumin measurements is calculated to be approximately four times lower than its minimum physiological concentration. Importantly, in relation to the existing detection methods for glycated albumin, the proposed method is also completely reagent-free, requires barely any sample preparation and has the potential for simultaneous determination of glycated hemoglobin levels as well. Given these key advantages, we believe that the proposed approach can provide a uniquely powerful tool for quantification of glycation status of proteins in biopharmaceutical development as well as for glycemic marker determination in routine clinical diagnostics in the future. PMID:22393405

  11. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows. PMID:25764342

  12. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  13. The role of customized computational tools in product development.

    SciTech Connect

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  14. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  15. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    PubMed Central

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  16. [Affective computing--a mysterious tool to explore human emotions].

    PubMed

    Li, Xin; Li, Honghong; Dou, Yi; Hou, Yongjie; Li, Changwu

    2013-12-01

    Perception, affection and consciousness are basic psychological functions of human beings. Affection is the subjective reflection of different kinds of objects. These three basic functions form the foundation of human thinking. Affective computing is an effective tool for revealing human affective states in order to understand the world. Our research on affective computing focused on the relations among different affections, how they are generated, and the factors that influence them. In this paper, the affective mechanism, the basic theory of affective computing, is studied; the methods of acquiring and recognizing affective information are discussed; and the applications of affective computing are summarized, in order to attract more researchers into this area. PMID:24645628

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  18. A tangible programming tool for children to cultivate computational thinking.

    PubMed

    Wang, Danli; Wang, Tingting; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5-9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  19. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem-solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  20. Procedure for computer-controlled milling of accurate surfaces of revolution for millimeter and far-infrared mirrors

    NASA Technical Reports Server (NTRS)

    Emmons, Louisa; De Zafra, Robert

    1991-01-01

    A simple method for milling accurate off-axis parabolic mirrors with a computer-controlled milling machine is discussed. For machines with a built-in circle-cutting routine, an exact paraboloid can be milled with few computer commands and without the use of the spherical or linear approximations. The proposed method can be adapted easily to cut off-axis sections of elliptical or spherical mirrors.
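
    A hedged worked example of the geometry involved (on-axis case only; the off-axis cut described in the abstract additionally offsets the circle centers): for a paraboloid of focal length f, z = r^2 / (4 f), so the circular cross-section at depth z has radius r = sqrt(4 f z), which gives the successive circle-cutting passes directly. The helper below is a hypothetical illustration, not the authors' machine-specific procedure:

    ```python
    import math

    def milling_passes(focal_length_mm, depth_mm, step_mm=0.5):
        """Radii of successive circular passes for an on-axis paraboloid
        z = r**2 / (4*f), milled in constant-depth steps."""
        passes = []
        z = step_mm
        while z <= depth_mm + 1e-9:
            passes.append((z, math.sqrt(4.0 * focal_length_mm * z)))
            z += step_mm
        return passes

    for depth, radius in milling_passes(focal_length_mm=200.0, depth_mm=2.0):
        print(f"depth {depth:4.1f} mm -> circle radius {radius:6.2f} mm")
    ```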

  1. Final Report for Foundational Tools for Petascale Computing

    SciTech Connect

    Hollingsworth, Jeff

    2015-02-12

    This project concentrated on various aspects of creating tool infrastructure to make it easier to program large-scale parallel computers. This project was collaborative with the University of Wisconsin and closely related to the project DE-SC0002606 (“Tools for the Development of High Performance Energy Applications and Systems”) . The research conducted during this project is summarized in this report. The complete details of the work are available in the ten publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (at www.dyninst.org). It also supported the Ph.D. studies of three students and one research staff member.

  2. Accurate technique for complete geometric calibration of cone-beam computed tomography systems.

    PubMed

    Cho, Youngbin; Moseley, Douglas J; Siewerdsen, Jeffrey H; Jaffray, David A

    2005-04-01

    Cone-beam computed tomography systems have been developed to provide in situ imaging for the purpose of guiding radiation therapy. Clinical systems have been constructed using this approach on both a clinical linear accelerator (Elekta Synergy RP) and an iso-centric C-arm. Geometric calibration involves the estimation of a set of parameters that describes the geometry of such systems, and is essential for accurate image reconstruction. We have developed a general analytic algorithm and corresponding calibration phantom for estimating these geometric parameters in cone-beam computed tomography (CT) systems. The performance of the calibration algorithm is evaluated and its application is discussed. The algorithm makes use of a calibration phantom to estimate the geometric parameters of the system. The phantom consists of 24 steel ball bearings (BBs) in a known geometry. Twelve BBs are spaced evenly at 30 deg in two plane-parallel circles separated by a given distance along the tube axis. The detector (e.g., a flat panel detector) is assumed to have no spatial distortion. The method estimates geometric parameters including the position of the x-ray source, the position and rotation of the detector, and the gantry angle, and can describe complex source-detector trajectories. The accuracy and sensitivity of the calibration algorithm were analyzed. The calibration algorithm estimates geometric parameters with a level of accuracy high enough that the quality of CT reconstruction is not degraded by the estimation error. Sensitivity analysis shows uncertainty of 0.01 degrees (around beam direction) to 0.3 degrees (normal to the beam direction) in rotation, and 0.2 mm (orthogonal to the beam direction) to 4.9 mm (beam direction) in position for the medical linear accelerator geometry. Experimental measurements using a laboratory bench Cone-beam CT system of known geometry demonstrate the sensitivity of the method in detecting small changes in the imaging geometry with an uncertainty of 0
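
    A hedged sketch of the underlying estimation problem, not the paper's analytic algorithm: project the known 3D BB positions through a parameterized source-detector geometry and fit the parameters to the measured BB centroids. The phantom layout, parameterization, and numbers below are simplified placeholders, and a least-squares fit stands in for the analytic solution:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Simplified calibration phantom (mm): two circles of 12 BBs each,
    # separated along the rotation (z) axis. All numbers are invented.
    angles = np.radians(np.arange(0, 360, 30))
    bbs = np.vstack([np.column_stack([50.0 * np.cos(angles),
                                      50.0 * np.sin(angles),
                                      np.full(angles.size, z)])
                     for z in (-40.0, 40.0)])

    def project(params, gantry_deg, points):
        """Pinhole projection of 3D points onto a flat detector at one
        gantry angle. params = (sad, sdd, u0, v0): source-to-axis distance,
        source-to-detector distance, and detector offsets."""
        sad, sdd, u0, v0 = params
        g = np.radians(gantry_deg)
        # Rotate the phantom by the (known) gantry angle about z.
        R = np.array([[np.cos(g), -np.sin(g), 0.0],
                      [np.sin(g),  np.cos(g), 0.0],
                      [0.0,        0.0,       1.0]])
        p = points @ R.T
        # Source sits on the -y axis at distance sad from the isocenter.
        scale = sdd / (sad + p[:, 1])
        return np.column_stack([p[:, 0] * scale + u0, p[:, 2] * scale + v0])

    def residuals(params, gantry_list, measured):
        return np.concatenate([(project(params, g, bbs) - m).ravel()
                               for g, m in zip(gantry_list, measured)])

    # Synthetic "measurements" from a ground-truth geometry, then the fit.
    rng = np.random.default_rng(0)
    true = np.array([1000.0, 1536.0, 2.0, -3.0])
    gantry_list = np.arange(0.0, 360.0, 10.0)
    measured = [project(true, g, bbs) + rng.normal(0.0, 0.05, (bbs.shape[0], 2))
                for g in gantry_list]
    fit = least_squares(residuals, x0=[900.0, 1400.0, 0.0, 0.0],
                        args=(gantry_list, measured))
    print("estimated (sad, sdd, u0, v0):", np.round(fit.x, 2))
    ```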

  3. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
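
    A hedged sketch of the access pattern described (a REST endpoint queried from Python analysis tools); the endpoint URL, parameters, and JSON layout are invented placeholders, not the actual Ice2Ocean API:

    ```python
    import pandas as pd
    import requests

    # Hypothetical REST endpoint; the real service and its parameters
    # are not specified in the abstract.
    API_URL = "https://example.org/ice2ocean/api/runoff"

    def fetch_runoff(basin_id, start, end):
        """Query a (hypothetical) runoff endpoint and return a DataFrame
        indexed by date."""
        resp = requests.get(API_URL,
                            params={"basin": basin_id, "start": start, "end": end},
                            timeout=30)
        resp.raise_for_status()
        df = pd.DataFrame(resp.json()["records"])
        df["date"] = pd.to_datetime(df["date"])
        return df.set_index("date").sort_index()

    if __name__ == "__main__":
        runoff = fetch_runoff("gulf_of_alaska_01", "2014-05-01", "2014-09-30")
        print(runoff["discharge_m3s"].resample("M").mean())
    ```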

  4. Accelerating Battery Design Using Computer-Aided Engineering Tools: Preprint

    SciTech Connect

    Pesaran, A.; Heon, G. H.; Smith, K.

    2011-01-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  5. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
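
    As a hedged sketch of the "enrichment analysis" half of the abstract (which pathways are over-represented among the altered metabolites), the standard one-sided hypergeometric test can be written directly; the metabolite identifiers and pathway membership below are made-up toy data:

    ```python
    from scipy.stats import hypergeom

    def pathway_enrichment(altered, pathway, background_size):
        """One-sided hypergeometric p-value that the pathway contains at
        least as many altered metabolites as observed by chance."""
        k = len(altered & pathway)            # altered metabolites in pathway
        K = len(pathway)                      # pathway size
        n = len(altered)                      # number of altered metabolites
        # P(X >= k) when drawing n metabolites from a background of
        # background_size metabolites containing K pathway members.
        return hypergeom.sf(k - 1, background_size, K, n)

    # Toy example with invented metabolite identifiers.
    altered = {"glucose", "lactate", "pyruvate", "alanine"}
    glycolysis = {"glucose", "g6p", "f6p", "pyruvate", "lactate"}
    print("glycolysis p =", pathway_enrichment(altered, glycolysis,
                                               background_size=200))
    ```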

  6. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  7. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    PubMed Central

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
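
    A hedged sketch of the "first generation" singular objective mentioned above, codon usage optimization; the usage table is a small invented excerpt, not an organism's real frequencies, and real tools balance many more objectives:

    ```python
    # Invented codon-usage excerpt: amino acid -> {codon: relative frequency}.
    CODON_USAGE = {
        "M": {"ATG": 1.00},
        "K": {"AAA": 0.74, "AAG": 0.26},
        "F": {"TTT": 0.58, "TTC": 0.42},
        "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
    }

    def optimize(protein):
        """Back-translate a protein sequence choosing, for each residue,
        the most frequent codon in the usage table ('one amino acid, one
        codon' optimization)."""
        codons = []
        for aa in protein:
            table = CODON_USAGE[aa]
            codons.append(max(table, key=table.get))
        return "".join(codons)

    print(optimize("MKF*"))   # -> ATGAAATTTTAA
    ```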

  8. Computational tools and algorithms for designing customized synthetic genes.

    PubMed

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050

  9. An accurate and efficient computation method of the hydration free energy of a large, complex molecule

    NASA Astrophysics Data System (ADS)

    Yoshidome, Takashi; Ekimoto, Toru; Matubayasi, Nobuyuki; Harano, Yuichi; Kinoshita, Masahiro; Ikeguchi, Mitsunori

    2015-05-01

    The hydration free energy (HFE) is a crucially important physical quantity to discuss various chemical processes in aqueous solutions. Although an explicit-solvent computation with molecular dynamics (MD) simulations is a preferable treatment of the HFE, huge computational load has been inevitable for large, complex solutes like proteins. In the present paper, we propose an efficient computation method for the HFE. In our method, the HFE is computed as a sum of ⟨u⟩/2 (⟨u⟩ is the ensemble average of the sum of pair interaction energy between solute and water molecule) and the water reorganization term mainly reflecting the excluded volume effect. Since ⟨u⟩ can readily be computed through a MD of the system composed of solute and water, an efficient computation of the latter term leads to a reduction of computational load. We demonstrate that the water reorganization term can quantitatively be calculated using the morphometric approach (MA) which expresses the term as the linear combinations of the four geometric measures of a solute and the corresponding coefficients determined with the energy representation (ER) method. Since the MA enables us to finish the computation of the solvent reorganization term in less than 0.1 s once the coefficients are determined, the use of the MA enables us to provide an efficient computation of the HFE even for large, complex solutes. Through the applications, we find that our method has almost the same quantitative performance as the ER method with substantial reduction of the computational load.
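
    The morphometric approach mentioned above has a compact generic form; schematically (generic symbols, with the four coefficients obtained from the energy-representation fit described in the abstract), the water reorganization term is written as a linear combination of four geometric measures of the solute:

    $$ \mu_{\mathrm{reorg}} \;\approx\; c_{1}\,V_{\mathrm{ex}} \;+\; c_{2}\,A \;+\; c_{3}\,C \;+\; c_{4}\,X , $$

    where V_ex is the excluded volume, A the water-accessible surface area, and C and X the integrated mean and Gaussian curvatures of the accessible surface. Once the coefficients are fixed, evaluating this expression requires only the solute geometry, which is why the term can be obtained in the sub-second time quoted above.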

  10. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  11. Limited rotational and rovibrational line lists computed with highly accurate quartic force fields and ab initio dipole surfaces.

    PubMed

    Fortenberry, Ryan C; Huang, Xinchuan; Schwenke, David W; Lee, Timothy J

    2014-02-01

    In this work, computational procedures are employed to compute the rotational and rovibrational spectra and line lists for H2O, CO2, and SO2. Building on the established use of quartic force fields, MP2 and CCSD(T) Dipole Moment Surfaces (DMSs) are computed for each system of study in order to produce line intensities as well as the transition energies. The computed results exhibit a clear correlation to reference data available in the HITRAN database. Additionally, even though CCSD(T) DMSs produce more accurate intensities as compared to experiment, the use of MP2 DMSs results in reliable line lists that are still comparable to experiment. The use of the less computationally costly MP2 method is beneficial in the study of larger systems where use of CCSD(T) would be more costly. PMID:23692860

  12. Symmetry-Based Computational Tools for Magnetic Crystallography

    NASA Astrophysics Data System (ADS)

    Perez-Mato, J. M.; Gallego, S. V.; Tasci, E. S.; Elcoro, L.; de la Flor, G.; Aroyo, M. I.

    2015-07-01

    In recent years, two important advances have opened new doors for the characterization and determination of magnetic structures. Firstly, researchers have produced computer-readable listings of the magnetic or Shubnikov space groups. Secondly, they have extended and applied the superspace formalism, which is presently the standard approach for the description of nonmagnetic incommensurate structures and their symmetry, to magnetic structures. These breakthroughs have been the basis for the subsequent development of a series of computer tools that allow a more efficient and comprehensive application of magnetic symmetry, both commensurate and incommensurate. Here we briefly review the capabilities of these computation instruments and present the fundamental concepts on which they are based, providing various examples. We show how these tools facilitate the use of symmetry arguments expressed as either a magnetic space group or a magnetic superspace group and allow the exploration of the possible magnetic orderings associated with one or more propagation vectors in a form that complements and goes beyond the traditional representation method. Special focus is placed on the programs available online at the Bilbao Crystallographic Server ( http://www.cryst.ehu.es ).

  13. Applying computer simulation models as learning tools in fishery management

    USGS Publications Warehouse

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  14. Computers and the Internet: Tools for Youth Empowerment

    PubMed Central

    2005-01-01

    Background Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. Objective This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Methods Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth’s and adults’ perceptions and use of the technologies. Constant comparison method and between-method triangulation were used in the analysis to satisfy the existence of themes. Results Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Conclusions Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives. PMID:16403715

  15. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tool CAPTools. The steps involved in parallelizing this code and the requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.
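
    As a quick worked check of the performance figure quoted above, the corresponding parallel efficiency is

    $$ E \;=\; \frac{S}{p} \;=\; \frac{30}{36} \;\approx\; 0.83 , $$

    i.e. roughly 83% efficiency on 36 Cray T3E processors, which is consistent with the report's characterization of the performance as reasonably good after the serial-code modifications.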

  16. Combining Theory and Experiment to Compute Highly Accurate Line Lists for Stable Molecules, and Purely AB Initio Theory to Compute Accurate Rotational and Rovibrational Line Lists for Transient Molecules

    NASA Astrophysics Data System (ADS)

    Lee, Timothy J.; Huang, Xinchuan; Fortenberry, Ryan C.; Schwenke, David W.

    2013-06-01

    Theoretical chemists have been computing vibrational and rovibrational spectra of small molecules for more than 40 years, but over the last decade the interest in this application has grown significantly. The increased interest in computing accurate rotational and rovibrational spectra for small molecules could not come at a better time, as NASA and ESA have begun to acquire a mountain of high-resolution spectra from the Herschel mission, and soon will from the SOFIA and JWST missions. In addition, the ground-based telescope, ALMA, has begun to acquire high-resolution spectra in the same time frame. Hence the need for highly accurate line lists for many small molecules, including their minor isotopologues, will only continue to increase. I will present the latest developments from our group on using the "Best Theory + High-Resolution Experimental Data" strategy to compute highly accurate rotational and rovibrational spectra for small molecules, including NH3, CO2, and SO2. I will also present the latest work from our group in producing purely ab initio line lists and spectroscopic constants for small molecules thought to exist in various astrophysical environments, but for which there is either limited or no high-resolution experimental data available. These more limited line lists include purely rotational transitions as well as rovibrational transitions for bands up through a few combination/overtones.

  17. A fourth order accurate finite difference scheme for the computation of elastic waves

    NASA Technical Reports Server (NTRS)

    Bayliss, A.; Jordan, K. E.; Lemesurier, B. J.; Turkel, E.

    1986-01-01

    A finite difference scheme for elastic waves is introduced. The model is based on the first order system of equations for the velocities and stresses. The differencing is fourth order accurate in the spatial derivatives and second order accurate in time. The model is tested on a series of examples including the Lamb problem, scattering from plane interfaces and scattering from a fluid-elastic interface. The scheme is shown to be effective for these problems. The accuracy and stability are insensitive to the Poisson ratio. For the class of problems considered here it is found that the fourth order scheme requires from two-thirds to one-half the resolution of a typical second order scheme to give comparable accuracy.
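
    A hedged sketch of the kind of spatial operator involved: the standard fourth-order central first-derivative stencil, which a velocity-stress formulation applies to each field. This is a generic illustration, not the authors' full scheme:

    ```python
    import numpy as np

    def d1_fourth_order(f, h):
        """Fourth-order central difference of f with grid spacing h,
        interior points only:
        (f[i-2] - 8 f[i-1] + 8 f[i+1] - f[i+2]) / (12 h)."""
        return (f[:-4] - 8.0 * f[1:-3] + 8.0 * f[3:-1] - f[4:]) / (12.0 * h)

    # Verify the order of accuracy on a smooth test function: the maximum
    # error should drop by roughly a factor of 16 per grid doubling.
    for n in (50, 100, 200):
        x = np.linspace(0.0, 2.0 * np.pi, n)
        h = x[1] - x[0]
        err = np.max(np.abs(d1_fourth_order(np.sin(x), h) - np.cos(x[2:-2])))
        print(f"n={n:4d}  max error = {err:.3e}")
    ```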

  18. A computer-based tool for generation of progress notes.

    PubMed Central

    Campbell, K. E.; Wieckert, K.; Fagan, L. M.; Musen, M. A.

    1993-01-01

    IVORY, a computer-based tool that uses clinical findings as the basic unit for composing progress notes, generates progress notes more efficiently than does a character-based word processor. IVORY's clinical findings are contained within a structured vocabulary that we developed to support generation of both prose progress notes and SNOMED III codes. Observational studies of physician participation in the development of IVORY's structured vocabulary have helped us to identify areas where changes are required before IVORY will be acceptable for routine clinical use. PMID:8130479

  19. A computer aided engineering tool for ECLS systems

    NASA Technical Reports Server (NTRS)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analysis interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  20. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  1. Tools for 3D scientific visualization in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as descriptions of other hardware for digital video and film recording.

  2. Computer vision as a tool to study plant development.

    PubMed

    Spalding, Edgar P

    2009-01-01

    Morphological phenotypes due to mutations frequently provide key information about the biological function of the affected genes. This has long been true of the plant Arabidopsis thaliana, though phenotypes are known for only a minority of this model organism's approximately 25,000 genes. One common explanation for lack of phenotype in a given mutant is that a genetic redundancy masks the effect of the missing gene. Another possibility is that a phenotype escaped detection or manifests itself only in a certain unexamined condition. Addressing this potentially nettlesome alternative requires the development of more sophisticated tools for studying morphological development. Computer vision is a technical field that holds much promise in this regard. This chapter explains in general terms how computer algorithms can extract quantitative information from images of plant structures undergoing development. Automation is a central feature of a successful computer vision application as it enables more conditions and more dependencies to be characterized. This in turn expands the concept of phenotype into a point set in multidimensional condition space. New ways of measuring and thinking about phenotypes, and therefore the functions of genes, are expected to result from expanding the role of computer vision in plant biology. PMID:19588113

  3. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  4. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  5. A Scalable and Accurate Targeted Gene Assembly Tool (SAT-Assembler) for Next-Generation Sequencing Data

    PubMed Central

    Zhang, Yuan; Sun, Yanni; Cole, James R.

    2014-01-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209
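
    The homology-guided overlap graph at the heart of the method can be illustrated with a deliberately simple sketch (brute-force and generic, not SAT-Assembler's implementation): reads that survive the family-specific homology filter become nodes, and an edge records the longest suffix-prefix match above a minimum length.

    ```python
    def suffix_prefix_overlap(a, b, min_len=20):
        """Length of the longest suffix of read a that equals a prefix of read b."""
        max_len = min(len(a), len(b))
        for k in range(max_len, min_len - 1, -1):
            if a[-k:] == b[:k]:
                return k
        return 0

    def build_overlap_graph(reads, min_len=20):
        """Return edges (i, j, overlap_length) for every ordered pair of distinct reads."""
        edges = []
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    k = suffix_prefix_overlap(a, b, min_len)
                    if k:
                        edges.append((i, j, k))
        return edges

    if __name__ == "__main__":
        reads = ["ACGTACGTGGA", "CGTGGATTCA", "GATTCAACGT"]  # toy reads, not real data
        print(build_overlap_graph(reads, min_len=5))
    ```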

  6. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material. PMID:25122209

  7. Lensfree Computational Microscopy Tools and their Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has enabled researchers to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they could not reach beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow creating imaging systems that better fulfill the specific needs of clinics or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms are built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated by using computational techniques. These computational methods are utilized for various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g. improving the spatial resolution, undoing the light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer pattern, and aliasing artifacts. Another method, iterative phase retrieval, is utilized to compensate for the lack of lenses by undoing the diffraction and removing the twin image noise of in-line holograms.
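
    As a loose illustration of the iterative phase retrieval idea mentioned above, the sketch below implements a textbook Gerchberg-Saxton loop between two Fourier-related planes; it is a generic two-plane amplitude-constraint iteration, not the thesis' specific lensfree in-line holography reconstruction.

    ```python
    import numpy as np

    def gerchberg_saxton(amp_obj, amp_fourier, n_iter=200):
        """Recover a phase consistent with measured amplitudes in the object plane
        and in the Fourier plane (generic alternating-projection loop)."""
        field = amp_obj * np.exp(1j * 2 * np.pi * np.random.rand(*amp_obj.shape))
        for _ in range(n_iter):
            F = np.fft.fft2(field)
            F = amp_fourier * np.exp(1j * np.angle(F))      # enforce Fourier-plane amplitude
            field = np.fft.ifft2(F)
            field = amp_obj * np.exp(1j * np.angle(field))  # enforce object-plane amplitude
        return np.angle(field)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_field = np.exp(1j * rng.uniform(0, 2 * np.pi, (64, 64)))  # synthetic test object
        amp_obj = np.abs(true_field)
        amp_fourier = np.abs(np.fft.fft2(true_field))
        phase = gerchberg_saxton(amp_obj, amp_fourier)
        print(phase.shape)
    ```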

  8. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  9. Covariance approximation for fast and accurate computation of channelized Hotelling observer statistics

    SciTech Connect

    Bonetto, Paola; Qi, Jinyi; Leahy, Richard M.

    1999-10-01

    We describe a method for computing linear observer statistics for maximum a posteriori (MAP) reconstructions of PET images. The method is based on a theoretical approximation for the mean and covariance of MAP reconstructions. In particular, we derive here a closed form for the channelized Hotelling observer (CHO) statistic applied to 2D MAP images. We show reasonably good correspondence between these theoretical results and Monte Carlo studies. The accuracy and low computational cost of the approximation allow us to analyze the observer performance over a wide range of operating conditions and parameter settings for the MAP reconstruction algorithm.
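
    For context, the channelized Hotelling observer reduces each image to a small vector of channel outputs and applies the linear template w = S^-1 dv, where S is the channel-output covariance and dv the mean difference between signal-present and signal-absent outputs; its detectability is SNR^2 = dv^T S^-1 dv. The sketch below is a generic numerical illustration of that statistic on synthetic channel outputs, not the closed-form MAP approximation derived in the paper.

    ```python
    import numpy as np

    def cho_snr(channels_signal, channels_noise):
        """Channelized Hotelling observer SNR from two sets of channel-output
        vectors (rows = image realizations, columns = channels)."""
        dv = channels_signal.mean(axis=0) - channels_noise.mean(axis=0)
        S = 0.5 * (np.cov(channels_signal, rowvar=False)
                   + np.cov(channels_noise, rowvar=False))
        w = np.linalg.solve(S, dv)      # Hotelling template in channel space
        return np.sqrt(dv @ w)          # SNR^2 = dv^T S^-1 dv

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        noise = rng.normal(size=(500, 6))                      # signal-absent outputs
        shift = rng.normal(0.3, 0.05, size=6)                  # assumed mean signal per channel
        signal = rng.normal(size=(500, 6)) + shift             # signal-present outputs
        print("CHO SNR ~", cho_snr(signal, noise))
    ```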

  10. Time-Accurate Computations of Isolated Circular Synthetic Jets in Crossflow

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Schaeffler, N. W.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2007-01-01

    Results from unsteady Reynolds-averaged Navier-Stokes computations are described for two different synthetic jet flows issuing into a turbulent boundary layer crossflow through a circular orifice. In one case the jet effect is mostly contained within the boundary layer, while in the other case the jet effect extends beyond the boundary layer edge. Both cases have momentum flux ratios less than 2. Several numerical parameters are investigated, and some lessons learned regarding the CFD methods for computing these types of flow fields are summarized. Results in both cases are compared to experiment.

  11. Time-Accurate Computations of Isolated Circular Synthetic Jets in Crossflow

    NASA Technical Reports Server (NTRS)

    Rumsey, Christoper L.; Schaeffler, Norman W.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2005-01-01

    Results from unsteady Reynolds-averaged Navier-Stokes computations are described for two different synthetic jet flows issuing into a turbulent boundary layer crossflow through a circular orifice. In one case the jet effect is mostly contained within the boundary layer, while in the other case the jet effect extends beyond the boundary layer edge. Both cases have momentum flux ratios less than 2. Several numerical parameters are investigated, and some lessons learned regarding the CFD methods for computing these types of flow fields are outlined. Results in both cases are compared to experiment.

  12. Java Analysis Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.

    2002-12-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  13. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  14. Computer subroutine ISUDS accurately solves large system of simultaneous linear algebraic equations

    NASA Technical Reports Server (NTRS)

    Collier, G.

    1967-01-01

    Computer program, an Iterative Scheme Using a Direct Solution, obtains double precision accuracy using a single-precision coefficient matrix. ISUDS solves a system of equations written in matrix form as AX equals B, where A is a square non-singular coefficient matrix, X is a vector, and B is a vector.
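
    ISUDS is an early instance of classical iterative refinement: solve with a lower-precision factorization, compute the residual in higher precision, and solve again for a correction. The NumPy sketch below is a generic illustration of that idea, not the original 1967 code; a production version would reuse a single LU factorization rather than calling a fresh solve each iteration.

    ```python
    import numpy as np

    def iterative_refinement(A, b, n_iter=5):
        """Solve A x = b to near double precision using single-precision solves,
        in the spirit of an Iterative Scheme Using a Direct Solution."""
        A32 = A.astype(np.float32)
        x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
        for _ in range(n_iter):
            r = b - A @ x                                   # residual in double precision
            dx = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
            x += dx                                          # correct the current solution
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((200, 200)) + 200 * np.eye(200)  # well-conditioned test matrix
        b = rng.standard_normal(200)
        x = iterative_refinement(A, b)
        print("residual norm:", np.linalg.norm(b - A @ x))
    ```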

  15. Time-Accurate Computation of Viscous Flow Around Deforming Bodies Using Overset Grids

    SciTech Connect

    Fast, P; Henshaw, W D

    2001-04-02

    Dynamically evolving boundaries and deforming bodies interacting with a flow are commonly encountered in fluid dynamics. However, the numerical simulation of flows with dynamic boundaries is difficult with current methods. We propose a new method for studying such problems. The key idea is to use the overset grid method with a thin, body-fitted grid near the deforming boundary, while using fixed Cartesian grids to cover most of the computational domain. Our approach combines the strengths of earlier moving overset grid methods for rigid body motion, and unstructured grid methods for flow-structure interactions. Large scale deformation of the flow boundaries can be handled without a global regridding, and in a computationally efficient way. In terms of computational cost, even a full overset grid regridding is significantly cheaper than a full regridding of an unstructured grid for the same domain, especially in three dimensions. Numerical studies are used to verify accuracy and convergence of our flow solver. As a computational example, we consider two-dimensional incompressible flow past a flexible filament with prescribed dynamics.

  16. Virtual temporal bone: creation and application of a new computer-based teaching tool.

    PubMed

    Mason, T P; Applebaum, E L; Rasmussen, M; Millman, A; Evenhouse, R; Panko, W

    2000-02-01

    The human temporal bone is a 3-dimensionally complex anatomic region with many unique qualities that make anatomic teaching and learning difficult. Current teaching tools have proved only partially adequate for the needs of the aspiring otologic surgeon in learning this anatomy. We used a variety of computerized image processing and reconstruction techniques to reconstruct an anatomically accurate 3-dimensional computer model of the human temporal bone from serial histologic sections. The model is viewed with a specialized visualization system that allows it to be manipulated easily in a stereoscopic virtual environment. The model may then be interactively studied from any viewpoint, greatly simplifying the task of conceptualizing and learning this anatomy. The system also provides for simultaneous computer networking that can bring distant participants into a single shared virtual teaching environment. Future directions of the project are discussed. PMID:10652385

  17. A Microanalysis of Pair Problem Solving With and Without a Computer Tool.

    ERIC Educational Resources Information Center

    Derry, Sharon; And Others

    The social interactions that occur during pair problem solving were studied both with and without the use of a computer tool. The computer tool (the TAPS system) is an instructional system that presents complex word problems and provides a graphics user interface with tools for constructing problem trees (network structures showing…

  18. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    NASA Astrophysics Data System (ADS)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during the LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for the real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During the LHC Run I a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements and the re-usability of the visualization bits across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, the possibility of automating actions under well defined conditions correlating multiple data sources has become feasible. In this contribution we also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  19. Accurate Analysis and Computer Aided Design of Microstrip Dual Mode Resonators and Filters.

    NASA Astrophysics Data System (ADS)

    Grounds, Preston Whitfield, III

    1995-01-01

    Microstrip structures are of interest due to their many applications in microwave circuit design. Their small size and ease of connection to both passive and active components make them well suited for use in systems where size and space are at a premium. These include satellite communication systems, radar systems, satellite navigation systems, cellular phones and many others. In general, space is always at a premium for any mobile system. Microstrip resonators find particular application in oscillators and filters. In typical filters each microstrip patch corresponds to one resonator. However, when dual mode patches are employed, each patch acts as two resonators and therefore reduces the amount of space required to build the filter. This dissertation focuses on the accurate electromagnetic analysis of the components of planar dual mode filters. Highly accurate analyses are required so that the resonator-to-resonator coupling and the resonator-to-input/output coupling can be predicted with precision. Hence, filters can be built with a minimum of design iterations and tuning. The analysis used herein is an integral equation formulation in the spectral domain. The analysis is done in the spectral domain since the Green's function can be derived in closed form, and the spatial domain convolution becomes a simple product. The resulting set of equations is solved using the Method of Moments with Galerkin's procedure. The electromagnetic analysis is applied to a range of problems including unloaded dual mode patches, dual mode patches coupled to microstrip feedlines, and complete filter structures. At each step calculated results are compared to measured results and good agreement is found. The calculated results are also compared to results from the circuit analysis program HP EESOF™ and again good agreement is found. A dual mode elliptic filter is built and good performance is obtained.

  20. Enabling fast, stable and accurate peridynamic computations using multi-time-step integration

    DOE PAGESBeta

    Lindsay, P.; Parks, M. L.; Prakash, A.

    2016-04-13

    Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.

  1. Matrix-vector multiplication using digital partitioning for more accurate optical computing

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable with or greater than electronic computers as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
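
    The digital partitioning idea can be mimicked numerically: split each (integer) vector entry into low-radix digits, let the low-precision analog engine form the digit-level matrix-vector products, and recombine the partial results with digital shifts and adds. The sketch below is an illustrative emulation with an arbitrarily chosen radix and digit count, not the optical architecture itself.

    ```python
    import numpy as np

    def matvec_digital_partitioning(A, x, base=16, n_digits=4):
        """Emulate an analog matrix-vector engine of limited precision by
        partitioning the non-negative integer vector x into base-`base` digits."""
        y = np.zeros(A.shape[0], dtype=np.int64)
        remainder = x.astype(np.int64).copy()
        for k in range(n_digits):
            digit = remainder % base        # low-precision slice handed to the "analog" engine
            partial = A @ digit             # digit-level product on small numbers
            y += partial * (base ** k)      # digital recombination (shift and add)
            remainder //= base
        return y

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.integers(0, 10, size=(4, 5))
        x = rng.integers(0, 16 ** 4, size=5)
        assert np.array_equal(matvec_digital_partitioning(A, x), A @ x)
        print(matvec_digital_partitioning(A, x))
    ```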

  2. Iofetamine I 123 single photon emission computed tomography is accurate in the diagnosis of Alzheimer's disease

    SciTech Connect

    Johnson, K.A.; Holman, B.L.; Rosen, T.J.; Nagel, J.S.; English, R.J.; Growdon, J.H. )

    1990-04-01

    To determine the diagnostic accuracy of iofetamine hydrochloride I 123 (IMP) with single photon emission computed tomography in Alzheimer's disease, we studied 58 patients with AD and 15 age-matched healthy control subjects. We used a qualitative method to assess regional IMP uptake in the entire brain and to rate image data sets as normal or abnormal without knowledge of subjects' clinical classification. The sensitivity and specificity of IMP with single photon emission computed tomography in AD were 88% and 87%, respectively. In 15 patients with mild cognitive deficits (Blessed Dementia Scale score, less than or equal to 10), sensitivity was 80%. With the use of a semiquantitative measure of regional cortical IMP uptake, the parietal lobes were the most functionally impaired in AD and the most strongly associated with the patients' Blessed Dementia Scale scores. These results indicated that IMP with single photon emission computed tomography may be a useful adjunct in the clinical diagnosis of AD in early, mild disease.

  3. Accurate and Scalable O(N) Algorithm for First-Principles Molecular-Dynamics Computations on Large Parallel Computers

    NASA Astrophysics Data System (ADS)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    We present the first truly scalable first-principles molecular dynamics algorithm with O(N) complexity and controllable accuracy, capable of simulating systems with finite band gaps of sizes that were previously impossible with this degree of accuracy. By avoiding global communications, we provide a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wave functions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 101 952 atoms on 23 328 processors, with a wall-clock time of the order of 1 min per molecular dynamics time step and numerical error on the forces of less than 7×10⁻⁴ Ha/Bohr.

  4. Accurate and Scalable O(N) Algorithm for First-Principles Molecular-Dynamics Computations on Large Parallel Computers

    SciTech Connect

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    We present the first truly scalable first-principles molecular dynamics algorithm with O(N) complexity and controllable accuracy, capable of simulating systems with finite band gaps of sizes that were previously impossible with this degree of accuracy. By avoiding global communications, we provide a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wave functions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 101 952 atoms on 23 328 processors, with a wall-clock time of the order of 1 min per molecular dynamics time step and numerical error on the forces of less than 7×10⁻⁴ Ha/Bohr.

  5. Computational tools and resources for prediction and analysis of gene regulatory regions in the chick genome

    PubMed Central

    Khan, Mohsin A. F.; Soto-Jimenez, Luz Mayela; Howe, Timothy; Streit, Andrea; Sosinsky, Alona; Stern, Claudio D.

    2013-01-01

    The discovery of cis-regulatory elements is a challenging problem in bioinformatics, owing to distal locations and context-specific roles of these elements in controlling gene regulation. Here we review the current bioinformatics methodologies and resources available for systematic discovery of cis-acting regulatory elements and conserved transcription factor binding sites in the chick genome. In addition, we propose and make available a novel workflow using computational tools that integrates CTCF analysis to predict putative insulator elements, enhancer prediction and TFBS analysis. To demonstrate the usefulness of this computational workflow, we then use it to analyze the locus of the gene Sox2, whose developmental expression is known to be controlled by a complex array of cis-acting regulatory elements. The workflow accurately predicts most of the experimentally verified elements along with some that have not yet been discovered. A web version of the CTCF tool, together with instructions for using the workflow, can be accessed from http://toolshed.g2.bx.psu.edu/view/mkhan1980/ctcf_analysis. For local installation of the tool, relevant Perl scripts and instructions are provided in the directory named “code” in the supplementary materials. PMID:23355428

  6. Ecoupling server: A tool to compute and analyze electronic couplings.

    PubMed

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-01

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional user-friendly tools to compute and analyze electronic coupling from external wave functions are of high value. The first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and it is accessible at http://ecouplingserver.bsc.es. Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc. PMID:27157013
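
    For orientation, the generalized Mulliken-Hush (GMH) approximation mentioned above estimates the coupling from adiabatic two-state quantities; the commonly quoted two-state expression (a textbook form, not taken from the Ecoupling server paper itself) is shown below, where ΔE_12 is the vertical excitation energy, μ_12 the transition dipole moment, and Δμ_12 the difference of the adiabatic state dipole moments along the charge-transfer direction.

    ```latex
    % Two-state generalized Mulliken-Hush estimate of the electronic coupling
    H_{ab} \;=\; \frac{\lvert \mu_{12}\rvert \,\Delta E_{12}}
                      {\sqrt{\left(\Delta\mu_{12}\right)^{2} + 4\,\mu_{12}^{2}}}
    ```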

  7. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields. PMID:22404249

  8. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology (MST) was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink jet nozzles, and so on, and the market is just about to take off. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economic constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  9. Materials by numbers: Computations as tools of discovery

    PubMed Central

    Landman, Uzi

    2005-01-01

    Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold that, in the bulk form, is notorious for being chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210

  10. Accurate Experiment to Computation Coupling for Understanding QH-mode physics using NIMROD

    NASA Astrophysics Data System (ADS)

    King, J. R.; Burrell, K. H.; Garofalo, A. M.; Groebner, R. J.; Hanson, J. D.; Hebert, J. D.; Hudson, S. R.; Pankin, A. Y.; Kruger, S. E.; Snyder, P. B.

    2015-11-01

    It is desirable to have an ITER H-mode regime that is quiescent to edge-localized modes (ELMs). The quiescent H-mode (QH-mode) with edge harmonic oscillations (EHO) is one such regime. High quality equilibria are essential for accurate EHO simulations with initial-value codes such as NIMROD. We include profiles outside the LCFS which generate associated currents when we solve the Grad-Shafranov equation with open-flux regions using the NIMEQ solver. The new solution is an equilibrium that closely resembles the original reconstruction (which does not contain open-flux currents). This regenerated equilibrium is consistent with the profiles that are measured by the high quality diagnostics on DIII-D. Results from nonlinear NIMROD simulations of the EHO are presented. The full measured rotation profiles are included in the simulation. The simulation develops into a saturated state. The saturation mechanism of the EHO is explored and simulation is compared to magnetic-coil measurements. This work is currently supported in part by the US DOE Office of Science under awards DE-FC02-04ER54698, DE-AC02-09CH11466 and the SciDAC Center for Extended MHD Modeling.

  11. Gravitational Focusing and the Computation of an Accurate Moon/Mars Cratering Ratio

    NASA Technical Reports Server (NTRS)

    Matney, Mark J.

    2006-01-01

    There have been a number of attempts to use asteroid populations to simultaneously compute cratering rates on the Moon and bodies elsewhere in the Solar System to establish the cratering ratio (e.g., [1],[2]). These works use current asteroid orbit population databases combined with collision rate calculations based on orbit intersections alone. As recent work on meteoroid fluxes [3] has highlighted, however, collision rates alone are insufficient to describe the cratering rates on planetary surfaces - especially planets with stronger gravitational fields than the Moon, such as Earth and Mars. Such calculations also need to include the effects of gravitational focusing, whereby the spatial density of the slower-moving impactors is preferentially "focused" by the gravity of the body. This leads overall to higher fluxes and cratering rates, and is highly dependent on the detailed velocity distributions of the impactors. In this paper, a comprehensive gravitational focusing algorithm originally developed to describe fluxes of interplanetary meteoroids [3] is applied to the collision rates and cratering rates of populations of asteroids and long-period comets to compute better cratering ratios for terrestrial bodies in the Solar System. These results are compared to the calculations of other researchers.
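
    The gravitational focusing described above is often summarized, in the two-body approximation, by the flux enhancement factor below, where v_esc is the escape velocity of the target body and v_inf the hyperbolic excess speed of the impactor; a comprehensive treatment like the one in the paper in essence averages such a factor over the full impactor velocity distribution.

    ```latex
    % Two-body gravitational focusing factor: enhancement of the impactor flux
    % on a body of mass M and radius R relative to the unfocused flux
    \frac{\Phi}{\Phi_{\infty}} \;=\; 1 + \frac{v_{\mathrm{esc}}^{2}}{v_{\infty}^{2}},
    \qquad
    v_{\mathrm{esc}} \;=\; \sqrt{\frac{2GM}{R}}
    ```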

  12. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
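
    For reference, the Green-Kubo formalism referred to above obtains the lattice thermal conductivity from the equilibrium heat-flux autocorrelation function; in its standard isotropically averaged form (V is the cell volume, T the temperature, k_B Boltzmann's constant, and J the total heat flux vector of the MD cell) it reads:

    ```latex
    % Green-Kubo expression for the (isotropically averaged) thermal conductivity
    \kappa \;=\; \frac{1}{3\,V\,k_{\mathrm{B}}\,T^{2}}
                 \int_{0}^{\infty} \bigl\langle \mathbf{J}(t)\cdot\mathbf{J}(0) \bigr\rangle \, \mathrm{d}t
    ```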

  13. Two-component density functional theory within the projector augmented-wave approach: Accurate and self-consistent computations of positron lifetimes and momentum distributions

    NASA Astrophysics Data System (ADS)

    Wiktor, Julia; Jomard, Gérald; Torrent, Marc

    2015-09-01

    Many techniques have been developed in the past in order to compute positron lifetimes in materials from first principles. However, there is still a lack of a fast and accurate self-consistent scheme that could accurately handle the forces acting on the ions induced by the presence of the positron. We will show in this paper that we have reached this goal by developing the two-component density functional theory within the projector augmented-wave (PAW) method in the open-source code abinit. This tool offers the accuracy of the all-electron methods with the computational efficiency of the plane-wave ones. We can thus deal with supercells that contain a few hundred to thousands of atoms to study point defects as well as more extended defect clusters. Moreover, using the PAW basis set allows us to use techniques able to, for instance, treat strongly correlated systems or spin-orbit coupling, which are necessary to study heavy elements, such as the actinides or their compounds.

  14. TRAC, a collaborative computer tool for tracer-test interpretation

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Klinka, T.; Thiéry, D.; Buscarlet, E.; Binet, S.; Jozja, N.; Défarge, C.; Leclerc, B.; Fécamp, C.; Ahumada, Y.; Elsass, J.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
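
    As an example of the kind of analytical solution such a tool assembles (a generic textbook solution, not necessarily one of TRAC's built-in formulas), the 1D advection-dispersion response to an instantaneous injection of tracer mass M into a flow of cross-sectional area A, pore velocity v and dispersion coefficient D is C(x,t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - v*t)^2 / (4*D*t)). A short sketch that evaluates a breakthrough curve with illustrative parameter values:

    ```python
    import numpy as np

    def breakthrough_1d(t, x, mass, area, v, D):
        """Concentration at distance x and times t for an instantaneous injection
        of `mass` into a 1D flow with pore velocity v and dispersion coefficient D
        (classic advection-dispersion solution)."""
        t = np.asarray(t, dtype=float)
        return (mass / (area * np.sqrt(4.0 * np.pi * D * t))
                * np.exp(-(x - v * t) ** 2 / (4.0 * D * t)))

    if __name__ == "__main__":
        t = np.linspace(0.1, 200.0, 400)   # hours (illustrative values only)
        c = breakthrough_1d(t, x=50.0, mass=1.0e3, area=10.0, v=0.5, D=2.0)
        print("peak concentration:", c.max(), "at t =", t[c.argmax()])
    ```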

  15. Computational tool for modeling and simulation of mechanically ventilated patients.

    PubMed

    Serna, Leidy Y; Hernandez, Alher M; Mananas, Miguel A

    2010-01-01

    Choosing mechanical ventilator settings for patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) during episodes of acute respiratory failure (ARF) is not a simple task, and in most cases success rests on the experience of the physician. This paper describes an interactive tool, based on mathematical models, developed to make it easier to study the interaction between a mechanical ventilator and a patient. It describes all stages of system development, including the simulated ventilatory modes, the pathologies of interest and the interaction between the user and the system through a graphical interface developed in Matlab and Simulink. The developed computational tool allows the study of the most widely used ventilatory modes and their advantages in the treatment of different kinds of patients. The graphical interface displays all variables and parameters in the same way that last-generation mechanical ventilators do and is fully interactive, making its use by clinical personnel possible while hiding the complexity of the implemented mathematical models from the user. The evaluation in different simulated clinical scenarios agrees well with recent findings in the mechanical ventilation scientific literature. PMID:21096101
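
    A minimal example of the kind of model such a simulator can be built on (with illustrative parameter values, not the paper's actual model library) is the single-compartment equation of motion of the respiratory system, P_aw(t) = R*Q(t) + V(t)/C + PEEP, here driven by a constant inspiratory flow as in volume-controlled ventilation:

    ```python
    import numpy as np

    def volume_controlled_breath(R=10.0, C=0.05, peep=5.0, flow=0.5,
                                 ti=1.0, ttot=3.0, dt=0.01):
        """Airway pressure over one breath from the single-compartment equation
        of motion P = R*Q + V/C + PEEP (R in cmH2O/(L/s), C in L/cmH2O).
        Expiration is not modeled; flow simply stops after the inspiratory time."""
        t = np.arange(0.0, ttot, dt)
        q = np.where(t < ti, flow, 0.0)     # constant inspiratory flow, then pause
        v = np.cumsum(q) * dt               # delivered volume
        p = R * q + v / C + peep
        return t, p

    if __name__ == "__main__":
        t, p = volume_controlled_breath()
        print("peak airway pressure ~", p.max(), "cmH2O")
    ```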

  16. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  17. Time-Accurate Computational Fluid Dynamics Simulation of a Pair of Moving Solid Rocket Boosters

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Williams, Brandon R.

    2011-01-01

    Since the Columbia accident, the threat to the Shuttle launch vehicle from debris during the liftoff timeframe has been assessed by the Liftoff Debris Team at NASA/MSFC. In addition to engineering methods of analysis, CFD-generated flow fields during the liftoff timeframe have been used in conjunction with 3-DOF debris transport methods to predict the motion of liftoff debris. Early models made use of a quasi-steady flow field approximation with the vehicle positioned at a fixed location relative to the ground; however, a moving overset mesh capability has recently been developed for the Loci/CHEM CFD software which enables higher-fidelity simulation of the Shuttle transient plume startup and liftoff environment. The present work details the simulation of the launch pad and mobile launch platform (MLP) with truncated solid rocket boosters (SRBs) moving in a prescribed liftoff trajectory derived from Shuttle flight measurements. Using Loci/CHEM, time-accurate RANS and hybrid RANS/LES simulations were performed for the timeframe T0+0 to T0+3.5 seconds, which consists of SRB startup to a vehicle altitude of approximately 90 feet above the MLP. Analysis of the transient flowfield focuses on the evolution of the SRB plumes in the MLP plume holes and the flame trench, impingement on the flame deflector, and especially impingement on the MLP deck, resulting in upward flow, which is a transport mechanism for debris. The results show excellent qualitative agreement with the visual record from past Shuttle flights, and comparisons to pressure measurements in the flame trench and on the MLP provide confidence in these simulation capabilities.

  18. Efficiency and Accuracy of Time-Accurate Turbulent Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Sanetrik, Mark D.; Biedron, Robert T.; Melson, N. Duane; Parlette, Edward B.

    1995-01-01

    The accuracy and efficiency of two types of subiterations in both explicit and implicit Navier-Stokes codes are explored for unsteady laminar circular-cylinder flow and unsteady turbulent flow over an 18-percent-thick circular-arc (biconvex) airfoil. Grid and time-step studies are used to assess the numerical accuracy of the methods. Nonsubiterative time-stepping schemes and schemes with physical time subiterations are subject to time-step limitations in practice that are removed by pseudo time sub-iterations. Computations for the circular-arc airfoil indicate that a one-equation turbulence model predicts the unsteady separated flow better than an algebraic turbulence model; also, the hysteresis with Mach number of the self-excited unsteadiness due to shock and boundary-layer separation is well predicted.

  19. A model for the accurate computation of the lateral scattering of protons in water.

    PubMed

    Bellinzona, E V; Ciocca, M; Embriaco, A; Ferrari, A; Fontana, A; Mairani, A; Parodi, K; Rotondi, A; Sala, P; Tessonnier, T

    2016-02-21

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy as the MC codes based on Molière theory, with a much shorter computing time. PMID:26808380

  20. A model for the accurate computation of the lateral scattering of protons in water

    NASA Astrophysics Data System (ADS)

    Bellinzona, E. V.; Ciocca, M.; Embriaco, A.; Ferrari, A.; Fontana, A.; Mairani, A.; Parodi, K.; Rotondi, A.; Sala, P.; Tessonnier, T.

    2016-02-01

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy as the MC codes based on Molière theory, with a much shorter computing time.

  1. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    NASA Technical Reports Server (NTRS)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  2. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour

    PubMed Central

    Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  3. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.

    PubMed

    Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  4. Computational Chemical Imaging for Cardiovascular Pathology: Chemical Microscopic Imaging Accurately Determines Cardiac Transplant Rejection

    PubMed Central

    Tiwari, Saumya; Reddy, Vijaya B.; Bhargava, Rohit; Raman, Jaishankar

    2015-01-01

    Rejection is a common problem after cardiac transplants leading to significant number of adverse events and deaths, particularly in the first year of transplantation. The gold standard to identify rejection is endomyocardial biopsy. This technique is complex, cumbersome and requires a lot of expertise in the correct interpretation of stained biopsy sections. Traditional histopathology cannot be used actively or quickly during cardiac interventions or surgery. Our objective was to develop a stain-less approach using an emerging technology, Fourier transform infrared (FT-IR) spectroscopic imaging to identify different components of cardiac tissue by their chemical and molecular basis aided by computer recognition, rather than by visual examination using optical microscopy. We studied this technique in assessment of cardiac transplant rejection to evaluate efficacy in an example of complex cardiovascular pathology. We recorded data from human cardiac transplant patients’ biopsies, used a Bayesian classification protocol and developed a visualization scheme to observe chemical differences without the need of stains or human supervision. Using receiver operating characteristic curves, we observed probabilities of detection greater than 95% for four out of five histological classes at 10% probability of false alarm at the cellular level while correctly identifying samples with the hallmarks of the immune response in all cases. The efficacy of manual examination can be significantly increased by observing the inherent biochemical changes in tissues, which enables us to achieve greater diagnostic confidence in an automated, label-free manner. We developed a computational pathology system that gives high contrast images and seems superior to traditional staining procedures. This study is a prelude to the development of real time in situ imaging systems, which can assist interventionists and surgeons actively during procedures. PMID:25932912

  5. An accurate and scalable O(N) algorithm for First-Principles Molecular Dynamics computations on petascale computers and beyond

    NASA Astrophysics Data System (ADS)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-03-01

    We present a truly scalable First-Principles Molecular Dynamics algorithm with O(N) complexity and fully controllable accuracy, capable of simulating systems of sizes that were previously impossible with this degree of accuracy. By avoiding global communication, we have extended W. Kohn's condensed matter ``nearsightedness'' principle to a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wavefunctions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 100,000 atoms on 100,000 processors, with a wall-clock time of the order of one minute per molecular dynamics time step. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  6. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions. PMID:22141464
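
    The record reports its calibrations only through R(2) and RMSECV values and does not specify the regression method here, so the following is a hedged sketch of how such a cross-validated figure of merit could be computed, assuming a partial least squares (PLS) model and purely synthetic CT-derived features (the arrays X and y are placeholders, not the study's data).

    ```python
    # Hypothetical sketch: cross-validated calibration of a salt-content model
    # from CT-derived features. PLS regression is an assumption; the record only
    # reports R^2 and RMSECV for its predictive models.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 8))                                # e.g. mean CT values in several ROIs per ham
    y = 2.0 + 0.8 * X[:, 0] + rng.normal(scale=0.3, size=60)    # synthetic salt content (%)

    model = PLSRegression(n_components=3)
    y_cv = cross_val_predict(model, X, y, cv=10)                # 10-fold cross-validated predictions

    rmsecv = float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))
    r2 = 1.0 - np.sum((y - y_cv.ravel()) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"RMSECV = {rmsecv:.3f}, cross-validated R^2 = {r2:.3f}")
    ```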

  7. Accurate computation of the radiation from simple antennas using the finite-difference time-domain method

    NASA Astrophysics Data System (ADS)

    Maloney, James G.; Smith, Glenn S.; Scott, Waymond R., Jr.

    1990-07-01

    Two antennas are considered, a cylindrical monopole and a conical monopole. Both are driven through an image plane from a coaxial transmission line. Each of these antennas corresponds to a well-posed theoretical electromagnetic boundary value problem and a realizable experimental model. These antennas are analyzed by a straightforward application of the time-domain finite-difference method. The computed results for these antennas are shown to be in excellent agreement with accurate experimental measurements for both the time domain and the frequency domain. The graphical displays presented for the transient near-zone and far-zone radiation from these antennas provide physical insight into the radiation process.

  8. Improved targeting device and computer navigation for accurate placement of brachytherapy needles

    SciTech Connect

    Pappas, Ion P.I.; Ryan, Paul; Cossmann, Peter; Kowal, Jens; Borgeson, Blake; Caversaccio, Marco

    2005-06-15

    Successful treatment of skull base tumors with interstitial brachytherapy requires high targeting accuracy for the brachytherapy needles to avoid harming vital anatomical structures. To enable safe placement of the needles in this area, we developed an image-based planning and navigation system for brachytherapy, which includes a custom-made mechanical positioning arm that allows rough and fine adjustment of the needle position. The fine-adjustment mechanism consists of an XYZ microstage at the base of the arm and a needle holder with two fine-adjustable inclinations. The rotation axes of the inclinations cross at the tip of the needle so that the inclinational adjustments do not interfere with the translational adjustments. A vacuum cushion and a noninvasive fixation frame are used for the head immobilization. To avoid mechanical bending of the needles due to the weight of attached tracking markers, which would be detrimental for targeting accuracy, only a single LED marker on the tail of the needle is used. An experimental phantom-based targeting study with this setup demonstrated that a positioning accuracy of 1.4 mm (rms) can be achieved. The study showed that the proposed setup allows brachytherapy needles to be easily aligned and inserted with high targeting accuracy according to a preliminary plan. The achievable accuracy is higher than if the needles are inserted manually. The proposed system can be linked to a standard afterloader and standard dosimetry planning module. The associated additional effort is reasonable for the clinical practice and therefore the proposed procedure provides a promising tool for the safe treatment of tumors in the skull base area.

  9. Highly Accurate Frequency Calculations of Crab Cavities Using the VORPAL Computational Framework

    SciTech Connect

    Austin, T.M.; Cary, J.R.; Bellantoni, L.; /Argonne

    2009-05-01

    We have applied the Werner-Cary method [J. Comp. Phys. 227, 5200-5214 (2008)] for extracting modes and mode frequencies from time-domain simulations of crab cavities, as are needed for the ILC and the beam delivery system of the LHC. This method for frequency extraction relies on a small number of simulations and on post-processing using the SVD algorithm with Tikhonov regularization. The time-domain simulations were carried out using the VORPAL computational framework, which is based on the eminently scalable finite-difference time-domain algorithm. A validation study was performed on an aluminum model of the 3.9 GHz RF separators built originally at Fermi National Accelerator Laboratory in the US. Comparisons with measurements of the A15 cavity show that this method can provide accuracy to within 0.01% of experimental results after accounting for manufacturing imperfections. To capture the near degeneracies, two simulations, requiring in total a few hours on 600 processors, were employed. This method has applications across many areas, including obtaining MHD spectra from time-domain simulations.
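
    The abstract names SVD-based post-processing with Tikhonov regularization as the key step; the sketch below is a generic Tikhonov-regularized least-squares solve via the SVD, not the Werner-Cary algorithm itself, and the matrix A and data b are synthetic placeholders rather than VORPAL output.

    ```python
    # Generic sketch of a Tikhonov-regularized least-squares solve via the SVD,
    # the kind of post-processing step used for mode/frequency extraction.
    import numpy as np

    def tikhonov_svd(A, b, lam):
        """Solve min ||A x - b||^2 + lam^2 ||x||^2 using the SVD of A."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        # Filter factors s / (s^2 + lam^2) damp poorly conditioned directions.
        filt = s / (s**2 + lam**2)
        return Vt.T @ (filt * (U.T @ b))

    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 20))
    x_true = rng.normal(size=20)
    b = A @ x_true + 1e-3 * rng.normal(size=200)

    x_reg = tikhonov_svd(A, b, lam=1e-2)
    print("relative error:", np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true))
    ```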

  10. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    PubMed Central

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational. PMID:25615870

  11. Sampling strategies for accurate computational inferences of gametic phase across highly polymorphic major histocompatibility complex loci

    PubMed Central

    2011-01-01

    Background Genes of the Major Histocompatibility Complex (MHC) are very popular genetic markers among evolutionary biologists because of their potential role in pathogen confrontation and sexual selection. However, MHC genotyping still remains challenging and time-consuming in spite of substantial methodological advances. Although computational haplotype inference has brought into focus interesting alternatives, high heterozygosity, extensive genetic variation and population admixture are known to cause inaccuracies. We have investigated the role of sample size, genetic polymorphism and genetic structuring on the performance of the popular Bayesian PHASE algorithm. To this aim, we took advantage of a large database of known genotypes (using traditional laboratory-based techniques) at single MHC class I (N = 56 individuals and 50 alleles) and MHC class II B (N = 103 individuals and 62 alleles) loci in the lesser kestrel Falco naumanni. Findings Analyses carried out on real MHC genotypes showed that the accuracy of gametic phase reconstruction improved with sample size as a result of the reduction in the allele to individual ratio. We then simulated different data sets introducing variations in this parameter to define an optimal ratio. Conclusions Our results demonstrate a critical influence of the allele to individual ratio on PHASE performance. We found that a minimum allele to individual ratio (1:2) yielded 100% accuracy for both MHC loci. Sampling effort is therefore a crucial step to obtain reliable MHC haplotype reconstructions and must be planned according to the degree of MHC polymorphism. We expect our findings to provide a foothold for the design of straightforward and cost-effective genotyping strategies for those MHC loci from which locus-specific primers are available. PMID:21615903

  12. Towards an accurate and computationally-efficient modelling of Fe(II)-based spin crossover materials.

    PubMed

    Vela, Sergi; Fumanal, Maria; Ribas-Arino, Jordi; Robert, Vincent

    2015-07-01

    The DFT + U methodology is regarded as one of the most-promising strategies to treat the solid state of molecular materials, as it may provide good energetic accuracy at a moderate computational cost. However, a careful parametrization of the U-term is mandatory since the results may be dramatically affected by the selected value. Herein, we benchmarked the Hubbard-like U-term for seven Fe(ii)N6-based pseudo-octahedral spin crossover (SCO) compounds, using as a reference an estimation of the electronic enthalpy difference (ΔHelec) extracted from experimental data (T1/2, ΔS and ΔH). The parametrized U-value obtained for each of those seven compounds ranges from 2.37 eV to 2.97 eV, with an average value of U = 2.65 eV. Interestingly, we have found that this average value can be taken as a good starting point since it leads to an unprecedented mean absolute error (MAE) of only 4.3 kJ mol(-1) in the evaluation of ΔHelec for the studied compounds. Moreover, by comparing our results on the solid state and the gas phase of the materials, we quantify the influence of the intermolecular interactions on the relative stability of the HS and LS states, with an average effect of ca. 5 kJ mol(-1), whose sign cannot be generalized. Overall, the findings reported in this manuscript pave the way for future studies devoted to understand the crystalline phase of SCO compounds, or the adsorption of individual molecules on organic or metallic surfaces, in which the rational incorporation of the U-term within DFT + U yields the required energetic accuracy that is dramatically missing when using bare-DFT functionals. PMID:26040609

  13. Accurate micro-computed tomography imaging of pore spaces in collagen-based scaffold.

    PubMed

    Zidek, Jan; Vojtova, Lucy; Abdel-Mohsen, A M; Chmelik, Jiri; Zikmund, Tomas; Brtnikova, Jana; Jakubicek, Roman; Zubal, Lukas; Jan, Jiri; Kaiser, Jozef

    2016-06-01

    In this work we have used X-ray micro-computed tomography (μCT) as a method to observe the morphology of 3D porous pure collagen and collagen-composite scaffolds useful in tissue engineering. Two aspects of visualization were taken into consideration: improvement of the scan and investigation of its sensitivity to the scan parameters. Due to the low material density, some parts of the collagen scaffolds are invisible in a μCT scan. Therefore, here we present different contrast agents, which increase the contrast of the scanned biopolymeric sample for μCT visualization. The contrast of the collagenous scaffolds was increased with ceramic hydroxyapatite microparticles (HAp), silver ions (Ag(+)) and silver nanoparticles (Ag-NPs). Since a relatively small change in imaging parameters (e.g. in 3D volume rendering, threshold value and μCT acquisition conditions) leads to a completely different visualized pattern, we have optimized these parameters to obtain the most realistic picture for visual and qualitative evaluation of the biopolymeric scaffold. Moreover, scaffold images were stereoscopically visualized in order to better see the 3D biopolymer composite scaffold morphology. However, the optimized visualization has some discontinuities in zoomed views, which can be problematic for further analysis of interconnected pores by commonly used numerical methods. Therefore, we applied a locally adaptive method to solve the discontinuity issue. The combination of contrast agents and imaging techniques presented in this paper helps us to better understand the structure and morphology of the biopolymeric scaffold, which is crucial in the design of new biomaterials useful in tissue engineering. PMID:27153826

  14. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    NASA Astrophysics Data System (ADS)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expansive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, which moreover offer a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flood, requires modelling the behavior of different objects in the scene in order to associate them to flood or no flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
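
    The tool itself relies on full Bayesian Networks over multiple data sources; as a much simpler illustration of the underlying idea of merging heterogeneous evidence, the sketch below performs a per-pixel Bayes-rule fusion of a SAR-derived flood likelihood with an elevation-based prior. All function names and values are hypothetical and not taken from the paper.

    ```python
    # Illustrative per-pixel Bayes-rule fusion of a SAR-derived flood likelihood
    # with an ancillary (elevation-based) prior. The actual tool uses Bayesian
    # Networks; this sketch only conveys the idea of merging evidence sources.
    import numpy as np

    def flood_posterior(p_sar_given_flood, p_sar_given_dry, prior_flood):
        """P(flood | SAR) from pixel-wise likelihoods and an ancillary prior."""
        num = p_sar_given_flood * prior_flood
        den = num + p_sar_given_dry * (1.0 - prior_flood)
        return num / den

    # Synthetic example: low backscatter favours flood; low elevation raises the prior.
    p_sar_flood = np.array([0.9, 0.6, 0.2])   # likelihood of observed backscatter if flooded
    p_sar_dry   = np.array([0.1, 0.5, 0.8])   # likelihood if dry
    prior       = np.array([0.7, 0.4, 0.1])   # prior from elevation / distance to river

    print(flood_posterior(p_sar_flood, p_sar_dry, prior))
    ```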

  15. Teaching chemistry using guided discovery and an interactive computer tool

    NASA Astrophysics Data System (ADS)

    Khan, Samia A.

    An initial test of scientific inquiry skills revealed that students enrolled in a computer-enhanced introductory college chemistry class using a guided discovery approach produced significantly larger gains after class instruction compared with two other introductory chemistry classes at the same institution and three introductory science classes at two other college institutions. The purpose of this study was to analyze the instructional strategy in this class to understand how it may have contributed to gains in inquiry skills. Classroom observations of the computer-enhanced guided discovery class and two other lecture-based chemistry classes uncovered a pattern of instruction in the guided discovery case that was markedly different from that of the other two classes, yet more similar to the model construction processes of scientists. The central pattern of instruction in the primary case was referred to as the guided discovery approach and was characterized by instructional strategies designed to trigger generate, evaluate, and modify (GEM) cycles, other teacher guidance strategies, and the integration of an interactive computer tool. Analysis of classroom observation data and student surveys confirmed a higher frequency of students' generating ideas about chemistry, constructing explanations, and quantitative problem solving in the guided discovery case than in the lecture-based classes and a higher rate of teacher requests for students to engage in several of these processes. Small group observations revealed students' reasoning processes as they interacted with their teacher and the computer during instruction. Overall, compared with more traditional forms of chemistry instruction, the evidence suggests that the instructional strategies in the guided discovery case were successful in sustaining student engagement with several fundamental processes of scientific inquiry and may have led to the development of important inquiry skills. The guided discovery case used

  16. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-lasting issue to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly outline the formulations and developments of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends in method development and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304) and the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
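
    As a concrete point of reference for the simplest of the three method families discussed above, the sketch below evaluates the pairwise Coulomb energy of a small set of point charges with a shifted-potential cutoff (a basic truncation-type scheme, in Gaussian units, without periodic boundary conditions). It is purely illustrative and not drawn from the review; Ewald-type methods replace this with a split into real-space and reciprocal-space sums.

    ```python
    # Minimal sketch of a truncation-type treatment of electrostatics:
    # a shifted-potential cutoff for point charges in vacuum (no PBC).
    import numpy as np

    def coulomb_shifted_cutoff(positions, charges, rcut):
        """Pairwise Coulomb energy (Gaussian units) with a shifted-potential cutoff."""
        energy = 0.0
        shift = 1.0 / rcut
        n = len(charges)
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(positions[i] - positions[j])
                if r < rcut:
                    energy += charges[i] * charges[j] * (1.0 / r - shift)
        return energy

    rng = np.random.default_rng(2)
    pos = rng.uniform(0.0, 10.0, size=(50, 3))
    q = rng.choice([-1.0, 1.0], size=50)
    print("truncated Coulomb energy:", coulomb_shifted_cutoff(pos, q, rcut=6.0))
    ```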

  17. A computationally efficient and accurate numerical representation of thermodynamic properties of steam and water for computations of non-equilibrium condensing steam flow in steam turbines

    NASA Astrophysics Data System (ADS)

    Hrubý, Jan

    2012-04-01

    Mathematical modeling of the non-equilibrium condensing transonic steam flow in the complex 3D geometry of a steam turbine is a demanding problem both concerning the physical concepts and the required computational power. The available accurate formulations of steam properties, IAPWS-95 and IAPWS-IF97, require much computation time. For this reason, the modelers often accept the unrealistic ideal-gas behavior. Here we present a computation scheme based on a piecewise, thermodynamically consistent representation of the IAPWS-95 formulation. Density and internal energy are chosen as independent variables to avoid variable transformations and iterations. In contrast to the previous Tabular Taylor Series Expansion Method, the pressure and temperature are continuous functions of the independent variables, which is a desirable property for the solution of the differential equations of the mass, energy, and momentum conservation for both phases.
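
    The key design point above is that choosing density and internal energy as independent variables lets the flow solver evaluate pressure and temperature by direct lookup, with no iterative inversion of an equation of state. The toy sketch below illustrates that idea with bilinear interpolation on precomputed tables; the tables are filled with a fictitious ideal-gas-like relation and are in no way the paper's piecewise IAPWS-95 representation.

    ```python
    # Toy illustration of evaluating p(rho, e) and T(rho, e) from precomputed
    # tables by bilinear interpolation, so no iteration is needed in the solver.
    # The table contents are placeholders, not IAPWS-95.
    import numpy as np

    rho_grid = np.linspace(0.1, 100.0, 64)      # kg/m^3
    e_grid = np.linspace(2.0e5, 3.0e6, 64)      # J/kg
    R, E = np.meshgrid(rho_grid, e_grid, indexing="ij")
    p_table = 0.4 * R * E                        # placeholder: p = (gamma - 1) * rho * e
    T_table = E / 1500.0                         # placeholder: e = cv * T

    def bilinear(table, x_grid, y_grid, x, y):
        i = int(np.clip(np.searchsorted(x_grid, x) - 1, 0, len(x_grid) - 2))
        j = int(np.clip(np.searchsorted(y_grid, y) - 1, 0, len(y_grid) - 2))
        tx = (x - x_grid[i]) / (x_grid[i + 1] - x_grid[i])
        ty = (y - y_grid[j]) / (y_grid[j + 1] - y_grid[j])
        return ((1 - tx) * (1 - ty) * table[i, j] + tx * (1 - ty) * table[i + 1, j]
                + (1 - tx) * ty * table[i, j + 1] + tx * ty * table[i + 1, j + 1])

    rho, e = 1.2, 2.6e6
    print("p =", bilinear(p_table, rho_grid, e_grid, rho, e),
          "T =", bilinear(T_table, rho_grid, e_grid, rho, e))
    ```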

  18. A More Accurate and Efficient Technique Developed for Using Computational Methods to Obtain Helical Traveling-Wave Tube Interaction Impedance

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    1999-01-01

    The phenomenal growth of commercial communications has created a great demand for traveling-wave tube (TWT) amplifiers. Although the helix slow-wave circuit remains the mainstay of the TWT industry because of its exceptionally wide bandwidth, until recently it has been impossible to accurately analyze a helical TWT using its exact dimensions because of the complexity of its geometrical structure. For the first time, an accurate three-dimensional helical model was developed that allows accurate prediction of TWT cold-test characteristics including operating frequency, interaction impedance, and attenuation. This computational model, which was developed at the NASA Lewis Research Center, allows TWT designers to obtain a more accurate value of interaction impedance than is possible using experimental methods. Obtaining helical slow-wave circuit interaction impedance is an important part of the design process for a TWT because it is related to the gain and efficiency of the tube. This impedance cannot be measured directly; thus, conventional methods involve perturbing a helical circuit with a cylindrical dielectric rod placed on the central axis of the circuit and obtaining the difference in resonant frequency between the perturbed and unperturbed circuits. A mathematical relationship has been derived between this frequency difference and the interaction impedance (ref. 1). However, because of the complex configuration of the helical circuit, deriving this relationship involves several approximations. In addition, this experimental procedure is time-consuming and expensive, but until recently it was widely accepted as the most accurate means of determining interaction impedance. The advent of an accurate three-dimensional helical circuit model (ref. 2) made it possible for Lewis researchers to fully investigate standard approximations made in deriving the relationship between measured perturbation data and interaction impedance. The most prominent approximations made

  19. IGS-global ionospheric maps for accurate computation of GPS single- frequency ionospheric delay-simulation study

    NASA Astrophysics Data System (ADS)

    Farah, A.

    The Ionospheric delay is still one of the largest sources of error affecting the positioning accuracy of any satellite positioning system. Owing to the dispersive nature of the Ionosphere, this problem can be solved by combining simultaneous measurements of signals at two different frequencies, but it remains for single-frequency users. Much effort has been made in establishing models for single-frequency users to make this effect as small as possible. These models vary in accuracy, input data and computational complexity, so the choice between the different models depends on the individual circumstances of the user. From the simulation point of view, the model needed should be accurate, with global coverage and a good description of the Ionosphere's variable nature in both time and location. The author reviews some of these established models, starting with the BENT model, the Klobuchar model and the IRI (International Reference Ionosphere) model. For quite a long time the Klobuchar model has been considered the most widely used model in this field, due to its simplicity and low computational cost. Any GPS user can find the Klobuchar model's coefficients in the broadcast navigation message. CODE, the Centre for Orbit Determination in Europe, provides a new set of coefficients for the Klobuchar model, which gives more accurate results for the Ionospheric delay computation. The IGS (International GPS Service) services include providing the GPS community with global Ionospheric maps in IONEX format (IONosphere Map EXchange format), which enable the computation of the Ionospheric delay at the desired location and time. The study was undertaken from the GPS-data simulation point of view. The aim was to select a model for the simulation of GPS data that gives a good description of the Ionosphere's nature and a high degree of accuracy in computing the Ionospheric delay, and thus yields better-simulated data. A new model was developed by the author based on the IGS global Ionospheric maps. A comparison
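
    The conversion from a map-derived total electron content (TEC) value to a single-frequency group delay uses the standard first-order relation, delay ≈ 40.3 · TEC / f². The sketch below applies that relation; the tiny hard-coded 2×2 vertical-TEC grid and its bilinear interpolation are only a stand-in for reading a real IONEX map, and the numbers are illustrative.

    ```python
    # First-order ionospheric group delay from TEC: delay = 40.3 * TEC / f^2
    # (TEC in electrons/m^2, f in Hz, delay in metres). The small TEC grid below
    # is a placeholder for an interpolated IONEX map value at the pierce point.
    import numpy as np

    F_L1 = 1575.42e6  # GPS L1 frequency, Hz

    def iono_delay_m(tec_tecu, freq_hz=F_L1):
        """First-order ionospheric group delay in metres for a given slant TEC."""
        tec_el_per_m2 = tec_tecu * 1.0e16   # 1 TECU = 1e16 electrons / m^2
        return 40.3 * tec_el_per_m2 / freq_hz**2

    lat_grid = np.array([40.0, 42.5])
    lon_grid = np.array([10.0, 15.0])
    vtec = np.array([[12.0, 14.0],          # hypothetical vertical TEC (TECU)
                     [11.0, 13.5]])

    def bilinear_tec(lat, lon):
        t = (lat - lat_grid[0]) / (lat_grid[1] - lat_grid[0])
        u = (lon - lon_grid[0]) / (lon_grid[1] - lon_grid[0])
        return ((1 - t) * (1 - u) * vtec[0, 0] + (1 - t) * u * vtec[0, 1]
                + t * (1 - u) * vtec[1, 0] + t * u * vtec[1, 1])

    tec = bilinear_tec(41.2, 12.3)
    print(f"VTEC = {tec:.1f} TECU -> L1 delay = {iono_delay_m(tec):.2f} m")
    ```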

  20. Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools

    ERIC Educational Resources Information Center

    Jeon, Moongee

    2014-01-01

    This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…

  1. Development of a computer tool to detect and classify nodules in ultrasound breast images

    NASA Astrophysics Data System (ADS)

    Marcomini, Karem D.; Carneiro, Antonio O.; Schiabel, Homero

    2014-03-01

    Due to the high incidence rate of breast cancer in women, many procedures have been developed to assist diagnosis and early detection. Currently, ultrasonography has proved to be a useful tool in distinguishing benign and malignant masses. In this context, computer-aided diagnosis schemes provide the specialist with a more accurate and reliable second opinion, minimizing the visual subjectivity between observers. Thus, we propose the application of an automatic detection method based on the active contour technique in order to show the contour of the lesion precisely and provide a better understanding of its morphology. For this, a total of 144 phantom images were segmented and submitted to morphological opening and closing operations to smooth the edges. Morphological features were then extracted and selected as input parameters for a Multilayer Perceptron neural classifier, which achieved 95.34% correct classification and an Az of 0.96.
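
    A hedged sketch of the post-segmentation steps described above (morphological smoothing, shape-feature extraction, and MLP classification) is given below, assuming scikit-image and scikit-learn are available. The masks and labels are synthetic ellipses, not the 144 phantom images, and the feature set is an illustrative guess rather than the paper's.

    ```python
    # Hypothetical pipeline sketch: smooth a binary lesion mask with morphological
    # opening/closing, extract simple shape features, and train an MLP classifier.
    import numpy as np
    from skimage.morphology import binary_opening, binary_closing, disk
    from skimage.measure import label, regionprops
    from sklearn.neural_network import MLPClassifier

    def shape_features(mask):
        smoothed = binary_closing(binary_opening(mask, disk(2)), disk(2))
        props = regionprops(label(smoothed))[0]
        return [props.area, props.eccentricity, props.solidity,
                props.perimeter / np.sqrt(props.area)]

    masks, labels = [], []
    for k in range(40):
        mask = np.zeros((64, 64), dtype=bool)
        rr, cc = np.ogrid[:64, :64]
        a, b = (12, 12) if k % 2 == 0 else (16, 7)       # round vs. elongated shape
        mask[((rr - 32) / a) ** 2 + ((cc - 32) / b) ** 2 <= 1.0] = True
        masks.append(mask)
        labels.append(k % 2)                              # 0 = "benign-like", 1 = "malignant-like"

    X = np.array([shape_features(m) for m in masks])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
    print("training accuracy:", clf.score(X, labels))
    ```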

  2. Physics education through computational tools: the case of geometrical and physical optics

    NASA Astrophysics Data System (ADS)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-09-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we show the pedagogic strategies implemented for the geometrical and physical optics courses for students of optometry. Thus, the use of the GeoGebra software for the geometrical optics class and the employment of new in-house software for the physical optics class, created using the high-level programming language Python, are shown with the corresponding activities developed for each of these applets.
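
    The record does not include the in-house applets themselves; as an illustration of the kind of physical-optics computation such Python teaching tools typically expose, the sketch below plots the textbook Fraunhofer single-slit diffraction pattern. The wavelength and slit width are arbitrary example values.

    ```python
    # Illustrative physical-optics example: Fraunhofer single-slit diffraction,
    # I(theta) ∝ sinc^2(a sin(theta) / lambda), with sinc(x) = sin(pi x)/(pi x).
    import numpy as np
    import matplotlib.pyplot as plt

    wavelength = 633e-9        # He-Ne laser (m), example value
    slit_width = 50e-6         # slit width a (m), example value

    theta = np.linspace(-0.05, 0.05, 2000)                 # rad
    beta = slit_width * np.sin(theta) / wavelength
    intensity = np.sinc(beta) ** 2                          # np.sinc includes the factor pi

    plt.plot(np.degrees(theta), intensity)
    plt.xlabel("angle (degrees)")
    plt.ylabel("relative intensity")
    plt.title("Fraunhofer diffraction from a single slit")
    plt.show()
    ```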

  3. Navigating Traditional Chinese Medicine Network Pharmacology and Computational Tools

    PubMed Central

    Chen, Jia-Lei; Xu, Li-Wen

    2013-01-01

    The concept of “network target” has ushered in a new era in the field of traditional Chinese medicine (TCM). As a new research approach, network pharmacology is based on the analysis of network models and systems biology. Taking advantage of advancements in systems biology, a highly integrated data analysis strategy and interpretable visualization provide deeper insights into the underlying mechanisms of TCM theories, including the principles of herb combination, biological foundations of herb or herbal formulae action, and molecular basis of TCM syndromes. In this study, we review several recent developments in TCM network pharmacology research and discuss their potential for bridging the gap between traditional and modern medicine. We briefly summarize the two main functional applications of TCM network models: understanding/uncovering and predicting/discovering. In particular, we focus on how TCM network pharmacology research is conducted and highlight different computational tools, such as network-based and machine learning algorithms, and sources that have been proposed and applied to the different steps involved in the research process. To make network pharmacology research commonplace, some basic network definitions and analysis methods are presented. PMID:23983798

  4. Computer Instrumentation and the New Tools of Science.

    ERIC Educational Resources Information Center

    Snyder, H. David

    1990-01-01

    The impact and uses of new technologies in science teaching are discussed. Included are computers, software, sensors, integrated circuits, computer signal access, and computer interfaces. Uses and advantages of these new technologies are suggested. (CW)

  5. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high performance parallel computers is always a challenging task. It is time-consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g. MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can produce erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome the deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO is aimed at taking advantage of detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of

  6. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980's at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990's, when it was first distributed as part of SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been mainly written by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPM). We will describe some of the improvements made for the newest version of the code: CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM (attenuation tables, built-in models and generalized attenuation models). In the case of built-in models, there is, by default, a set ready to use in CRISIS, but additional custom GMPMs
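
    For readers unfamiliar with what a PSHA code evaluates, the sketch below shows the textbook Poissonian hazard calculation on a toy discretized source and a toy lognormal ground-motion model: the annual exceedance rate is summed over scenarios and converted to a probability of exceedance over an exposure time. This is a generic illustration under assumed numbers, not CRISIS or any of its built-in GMPMs.

    ```python
    # Textbook-style Poissonian hazard sketch:
    #   lambda(PGA > a) = sum_i nu_i * P(PGA > a | m_i, r_i)
    #   P(exceedance in t years) = 1 - exp(-lambda * t)
    import math

    # Hypothetical discretized source: (annual rate, magnitude, distance in km)
    scenarios = [(0.05, 5.5, 20.0), (0.02, 6.0, 25.0), (0.005, 6.5, 30.0)]

    def gmpe_ln_pga(m, r_km):
        """Toy ground-motion model: mean ln(PGA in g) and sigma (illustrative only)."""
        return -3.5 + 0.8 * m - 1.2 * math.log(r_km + 10.0), 0.6

    def annual_exceedance_rate(a_g):
        lam = 0.0
        for rate, m, r in scenarios:
            mu, sigma = gmpe_ln_pga(m, r)
            z = (math.log(a_g) - mu) / sigma
            p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(ln PGA > ln a)
            lam += rate * p_exceed
        return lam

    for a in (0.05, 0.1, 0.2, 0.4):
        lam = annual_exceedance_rate(a)
        p50 = 1.0 - math.exp(-lam * 50.0)
        print(f"PGA > {a:.2f} g: annual rate = {lam:.4e}, P(50 yr) = {p50:.3f}")
    ```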

  7. Ratsnake: A Versatile Image Annotation Tool with Application to Computer-Aided Diagnosis

    PubMed Central

    Iakovidis, D. K.; Goudas, T.; Smailis, C.; Maglogiannis, I.

    2014-01-01

    Image segmentation and annotation are key components of image-based medical computer-aided diagnosis (CAD) systems. In this paper we present Ratsnake, a publicly available generic image annotation tool providing annotation efficiency, semantic awareness, versatility, and extensibility, features that can be exploited to transform it into an effective CAD system. In order to demonstrate this unique capability, we present its novel application for the evaluation and quantification of salient objects and structures of interest in kidney biopsy images. Accurate annotation identifying and quantifying such structures in microscopy images can provide an estimation of pathogenesis in obstructive nephropathy, which is a rather common disease with severe implications in children and infants. However, a tool for detecting and quantifying the disease is not yet available. A machine learning-based approach, which utilizes prior domain knowledge and textural image features, is considered for the generation of an image force field, customizing the presented tool for the automatic evaluation of kidney biopsy images. The experimental evaluation of the proposed application of Ratsnake demonstrates its efficiency and effectiveness and promises its wide applicability across a variety of medical imaging domains. PMID:24616617

  8. A procedure for computing accurate ab initio quartic force fields: Application to HO2+ and H2O

    NASA Astrophysics Data System (ADS)

    Huang, Xinchuan; Lee, Timothy J.

    2008-07-01

    A procedure for the calculation of molecular quartic force fields (QFFs) is proposed and investigated. The goal is to generate highly accurate ab initio QFFs that include many of the so-called ``small'' effects that are necessary to achieve high accuracy. The small effects investigated in the present study include correlation of the core electrons (core correlation), extrapolation to the one-particle basis set limit, correction for scalar relativistic contributions, correction for higher-order correlation effects, and inclusion of diffuse functions in the one-particle basis set. The procedure is flexible enough to allow for some effects to be computed directly, while others may be added as corrections. A single grid of points is used and is centered about an initial reference geometry that is designed to be as close as possible to the final ab initio equilibrium structure (with all effects included). It is shown that the least-squares fit of the QFF is not compromised by the added corrections, and the balance between elimination of contamination from higher-order force constants while retaining energy differences large enough to yield meaningful quartic force constants is essentially unchanged from the standard procedures we have used for many years. The initial QFF determined from the least-squares fit is transformed to the exact minimum in order to eliminate gradient terms and allow for the use of second-order perturbation theory for evaluation of spectroscopic constants. It is shown that this step has essentially no effect on the quality of the QFF largely because the initial reference structure is, by design, very close to the final ab initio equilibrium structure. The procedure is used to compute an accurate, purely ab initio QFF for the H2O molecule, which is used as a benchmark test case. The procedure is then applied to the ground and first excited electronic states of the HO2+ molecular cation. Fundamental vibrational frequencies and spectroscopic

  9. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    NASA Astrophysics Data System (ADS)

    Mehmani, Yashar; Oostrom, Mart; Balhoff, Matthew T.

    2014-03-01

    Several approaches have been developed in the literature for solving flow and transport at the pore scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect-mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and validated against micromodel experiments; excellent matches were obtained across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3-D disordered granular media.

  10. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    SciTech Connect

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.

  11. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    SciTech Connect

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  12. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  13. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, masscons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to
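
    The single-run idea described above rests on propagating the covariance of the error state rather than simulating individual trajectories. The two-state toy below illustrates that basic operation and how error-ellipse semi-axes are read off the propagated covariance; it is in no way the 6-DOF, 120+ state square-root formulation used by G-CAT, and all noise values are assumed.

    ```python
    # Toy linear covariance propagation, P_{k+1} = F P F^T + Q, followed by
    # extraction of 1-sigma error-ellipse semi-axes from the result.
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt],        # simple 1-D position/velocity transition
                  [0.0, 1.0]])
    Q = np.diag([1e-4, 1e-5])       # assumed process noise

    P = np.diag([1.0, 0.01])        # initial covariance: 1 m^2, 0.01 (m/s)^2
    for _ in range(100):            # propagate 100 steps without measurements
        P = F @ P @ F.T + Q

    eigvals, eigvecs = np.linalg.eigh(P)
    semi_axes = np.sqrt(eigvals)    # 1-sigma semi-axes along the eigenvector directions
    print("propagated covariance:\n", P)
    print("1-sigma semi-axes:", semi_axes)
    ```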

  14. The Computer in Composition Instruction: A Writer's Tool.

    ERIC Educational Resources Information Center

    Wresch, William, Ed.

    This compilation of articles on computer applications in writing instruction deals with the areas of prewriting, editing and grammar, word processing research and applications, and programs for the writing process. It contains the following papers: "Recollections of First-Generation Computer-Assisted Prewriting," by Hugh Burns; "Computer-Based…

  15. A robust and accurate approach to computing compressible multiphase flow: Stratified flow model and AUSM{sup +}-up scheme

    SciTech Connect

    Chang, Chih-Hao . E-mail: chchang@engineering.ucsb.edu; Liou, Meng-Sing . E-mail: meng-sing.liou@grc.nasa.gov

    2007-07-01

    In this paper, we propose a new approach to compute compressible multifluid equations. Firstly, a single-pressure compressible multifluid model based on the stratified flow model is proposed. The stratified flow model, which defines different fluids in separated regions, is shown to be amenable to the finite volume method. We can apply the conservation law to each subregion and obtain a set of balance equations. Secondly, the AUSM{sup +} scheme, which was originally designed for compressible gas flow, is extended to solve compressible liquid flows. By introducing additional dissipation terms into the numerical flux, the new scheme, called AUSM{sup +}-up, can be applied to both liquid and gas flows. Thirdly, the contribution to the numerical flux due to interactions between different phases is taken into account and solved by the exact Riemann solver. We will show that the proposed approach yields an accurate and robust method for computing compressible multiphase flows involving discontinuities, such as shock waves and fluid interfaces. Several one-dimensional test problems are used to demonstrate the capability of our method, including the Ransom's water faucet problem and the air-water shock tube problem. Finally, several two-dimensional problems show the capability to capture enormous detail and complicated wave patterns in flows having large disparities in fluid density and velocities, such as interactions between a water shock wave and an air bubble, between an air shock wave and water column(s), and underwater explosion.

  16. A fast and accurate method for computing the Sunyaev-Zel'dovich signal of hot galaxy clusters

    NASA Astrophysics Data System (ADS)

    Chluba, Jens; Nagai, Daisuke; Sazonov, Sergey; Nelson, Kaylea

    2012-10-01

    New-generation ground- and space-based cosmic microwave background experiments have ushered in discoveries of massive galaxy clusters via the Sunyaev-Zel'dovich (SZ) effect, providing a new window for studying cluster astrophysics and cosmology. Many of the newly discovered, SZ-selected clusters contain hot intracluster plasma (kTe ≳ 10 keV) and exhibit disturbed morphology, indicative of frequent mergers with large peculiar velocity (v ≳ 1000 km s-1). It is well known that for the interpretation of the SZ signal from hot, moving galaxy clusters, relativistic corrections must be taken into account, and in this work, we present a fast and accurate method for computing these effects. Our approach is based on an alternative derivation of the Boltzmann collision term which provides new physical insight into the sources of different kinematic corrections in the scattering problem. In contrast to previous works, this allows us to obtain a clean separation of kinematic and scattering terms. We also briefly mention additional complications connected with kinematic effects that should be considered when interpreting future SZ data for individual clusters. One of the main outcomes of this work is SZPACK, a numerical library which allows very fast and precise (≲0.001 per cent at frequencies hν ≲ 20kTγ) computation of the SZ signals up to high electron temperature (kTe ≃ 25 keV) and large peculiar velocity (v/c ≃ 0.01). The accuracy is well beyond the current and future precision of SZ observations and practically eliminates uncertainties which are usually overcome with more expensive numerical evaluation of the Boltzmann collision term. Our new approach should therefore be useful for analysing future high-resolution, multifrequency SZ observations as well as computing the predicted SZ effect signals from numerical simulations.
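
    For orientation, the classical non-relativistic thermal SZ distortion that the relativistic corrections computed by SZPACK refine is ΔT/T_CMB = y · (x coth(x/2) − 4) with x = hν/(kT_CMB). The sketch below evaluates this limit at a few common observing frequencies; the Compton-y value is illustrative, and the constants are rounded.

    ```python
    # Non-relativistic thermal SZ distortion: Delta T / T_CMB = y * (x*coth(x/2) - 4),
    # x = h*nu / (k*T_CMB). This is only the classical limit, not SZPACK's
    # relativistic treatment for hot, moving clusters.
    import numpy as np

    H_PLANCK = 6.626e-34   # J s (rounded)
    K_BOLTZ = 1.381e-23    # J / K (rounded)
    T_CMB = 2.725          # K

    def thermal_sz_dT_over_T(nu_ghz, y):
        x = H_PLANCK * nu_ghz * 1e9 / (K_BOLTZ * T_CMB)
        return y * (x / np.tanh(x / 2.0) - 4.0)

    y = 1e-4  # illustrative Compton-y parameter of a massive cluster
    for nu in (90.0, 150.0, 217.0, 353.0):
        print(f"{nu:6.1f} GHz: dT/T = {thermal_sz_dT_over_T(nu, y):+.3e}")
    ```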

  17. Professors' and students' perceptions and experiences of computational simulations as learning tools

    NASA Astrophysics Data System (ADS)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  18. A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example

    ERIC Educational Resources Information Center

    Elnagar, Ashraf; Lulu, Leena

    2007-01-01

    We introduce an effective computer-aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or to implement practical systems in different areas of computer science such as graphics, computational…

  19. Accurate 3d Textured Models of Vessels for the Improvement of the Educational Tools of a Museum

    NASA Astrophysics Data System (ADS)

    Soile, S.; Adam, K.; Ioannidis, C.; Georgopoulos, A.

    2013-02-01

    Besides the demonstration of the findings, modern museums organize educational programs which aim at experience and knowledge sharing combined with entertainment rather than pure learning. Toward that effort, 2D and 3D digital representations are gradually replacing the traditional recording of the findings through photos or drawings. The present paper refers to a project that aims to create 3D textured models of two lekythoi that are exhibited in the National Archaeological Museum of Athens in Greece; on the surfaces of these lekythoi scenes of the adventures of Odysseus are depicted. The project is expected to support the production of an educational movie and some other relevant interactive educational programs for the museum. The creation of accurate developments of the paintings and of accurate 3D models is the basis for the visualization of the adventures of the mythical hero. Data collection was carried out using a structured light scanner consisting of two machine vision cameras that are used for the determination of the geometry of the object, a high resolution camera for the recording of the texture, and a DLP projector. The creation of the final accurate 3D textured model is a complicated and tiring procedure which includes the collection of geometric data, the creation of the surface, the noise filtering, the merging of individual surfaces, the creation of a c-mesh, the creation of the UV map, the provision of the texture and, finally, the general processing of the 3D textured object. For a better result, a combination of commercial and in-house software, made to automate various steps of the procedure, was used. The results derived from the above procedure were especially satisfactory in terms of accuracy and quality of the model. However, the procedure proved to be time-consuming, while the use of various software packages presumes the services of a specialist.

  20. Fine structure in proton radioactivity: An accurate tool to ascertain the breaking of axial symmetry in {sup 145}Tm

    SciTech Connect

    Arumugam, P.; Ferreira, L. S.; Maglione, E.

    2008-10-15

    With a proper formalism for proton emission from triaxially deformed nuclei, we perform exact calculations of decay widths for the decays to the ground and first excited 2{sup +} states in the daughter nucleus. Our results for the rotational spectrum, decay width and fine structure in the case of the nucleus {sup 145}Tm lead, for the first time, to an accurate identification of triaxial deformation using proton emission. This work also highlights the advantage of proton emission over conventional probes for studying nuclear structure at the proton drip-line.

  1. In Vivo Computed Tomography as a Research Tool to Investigate Asthma and COPD: Where Do We Stand?

    PubMed Central

    Dournes, Gaël; Montaudon, Michel; Berger, Patrick; Laurent, François

    2012-01-01

    Computed tomography (CT) is a clinical tool widely used to assess and follow up asthma and chronic obstructive pulmonary disease (COPD) in humans. Strong efforts have been made over the last decade to improve this technique as a quantitative research tool. Using semiautomatic software, quantification of airway wall thickness, lumen area, and bronchial wall density is available from large to intermediate conductive airways. Skeletonization of the bronchial tree can be built to assess its three-dimensional geometry. Lung parenchyma density can be analysed as a surrogate of small airway disease and emphysema. Since resident cells involve airway wall and lung parenchyma abnormalities, CT provides an accurate and reliable research tool to assess their role in vivo. This literature review highlights the most recent advances made to assess asthma and COPD with CT, as well as their drawbacks and the place of CT in clarifying the complex physiopathology of both diseases. PMID:22287977

  2. The Utility of Computer Tracking Tools for User-Centered Design.

    ERIC Educational Resources Information Center

    Gay, Geri; Mazur, Joan

    1993-01-01

    Describes tracking tools used by designers and users to evaluate the efficacy of hypermedia systems. Highlights include human-computer interaction research; tracking tools and user-centered design; and three examples from the Interactive Multimedia Group at Cornell University that illustrate uses of various tracking tools. (27 references) (LRW)

  3. WASTE REDUCTION USING COMPUTER-AIDED DESIGN TOOLS

    EPA Science Inventory

    Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized.
    Process simulators can be effective tools i...

  4. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  5. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  6. Information and Communicative Technology--Computers as Research Tools

    ERIC Educational Resources Information Center

    Sarsani, Mahender Reddy

    2007-01-01

    The emergence of "the electronic age,/electronic cottages/the electronic world" has affected the whole world; particularly the emergence of computers has penetrated everyone's life to a remarkable degree. They are being used in various fields including education. Recent advances, especially in the area of computer technology have…

  7. Computer Art--A New Tool in Advertising Graphics.

    ERIC Educational Resources Information Center

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  8. Using Computers as Reading Instructional Tools: Applications and Implications.

    ERIC Educational Resources Information Center

    Singhal, Meena

    A review of the literature investigated how computers have been used in relation to the teaching of reading over the last 20 years, how effective those endeavors and research studies (mainly conducted at the college level) have been, and what computer instructional programs in the area of reading need to address. Various efforts have been made to…

  9. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    SciTech Connect

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification processes and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
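
    The 'Photon Monte Carlo' idea the report builds on can be illustrated with a few lines of code. The sketch below is not the IHT implementation; it is a generic, simplified version (gray medium, no scattering, no walls) in which each hot particle emits a fixed number of energy bundles that travel a Beer-Lambert-sampled distance in a random direction and are tallied on whichever particle they land in. All names and parameters are illustrative assumptions.

      import math
      import random

      SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

      def photon_mc_absorption(particles, kappa, n_bundles=10000):
          """particles: list of dicts with keys 'pos' (x, y, z in m), 'r' (m), 'T' (K).
          kappa: gray absorption coefficient of the carrier gas, 1/m.
          Returns the energy (W) absorbed by each particle from the others."""
          absorbed = [0.0 for _ in particles]
          for i, p in enumerate(particles):
              area = 4.0 * math.pi * p["r"] ** 2
              e_bundle = SIGMA * p["T"] ** 4 * area / n_bundles   # energy carried by one bundle
              for _ in range(n_bundles):
                  # isotropic emission direction
                  mu = 2.0 * random.random() - 1.0
                  phi = 2.0 * math.pi * random.random()
                  s = math.sqrt(1.0 - mu * mu)
                  d = (s * math.cos(phi), s * math.sin(phi), mu)
                  # free path sampled from the Beer-Lambert law
                  ell = -math.log(random.random()) / kappa
                  x = tuple(p["pos"][k] + ell * d[k] for k in range(3))
                  # deposit the bundle on the first particle whose sphere contains the end point
                  for j, q in enumerate(particles):
                      if j != i and sum((x[k] - q["pos"][k]) ** 2 for k in range(3)) < q["r"] ** 2:
                          absorbed[j] += e_bundle
                          break
          return absorbed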

  10. The Binary Abacus: A Useful Tool for Explaining Computer Operations.

    ERIC Educational Resources Information Center

    Good, Robert C., Jr.

    1985-01-01

    A pair of abacuses is used to illustrate addition, subtraction (by adding complements), multiplication, and division, with simplified examples given to illustrate the various features of each operation. Also indicates how this information can help students understand computer operations. (JN)
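
    The "subtraction by adding complements" trick mentioned above is the same one binary hardware uses, and it is easy to demonstrate in code. The sketch below is an illustrative aside, not taken from the article.

      def subtract_by_complement(a, b, bits=8):
          """Subtract b from a by adding the two's complement of b and
          discarding the carry out of the fixed-width word."""
          mask = (1 << bits) - 1
          twos_complement_b = ((~b) & mask) + 1   # invert every bit, then add one
          return (a + twos_complement_b) & mask   # dropping the carry == arithmetic mod 2**bits

      # example: 13 - 5 = 8 within an 8-bit word
      assert subtract_by_complement(0b1101, 0b0101) == 0b1000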

  11. Computer Databases as an Educational Tool in the Basic Sciences.

    ERIC Educational Resources Information Center

    Friedman, Charles P.; And Others

    1990-01-01

    The University of North Carolina School of Medicine developed a computer database, INQUIRER, containing scientific information in bacteriology, and then integrated the database into routine educational activities for first-year medical students in their microbiology course. (Author/MLW)

  12. Computer Simulations as a Teaching Tool in Community Colleges

    ERIC Educational Resources Information Center

    Grimm, Floyd M., III

    1978-01-01

    Describes the implementation of a computer assisted instruction program at Harford Community College. Eight different biology simulation programs are used covering topics in ecology, genetics, biochemistry, and sociobiology. (MA)

  13. Further Uses of the Analog Computer as a Teaching Tool

    ERIC Educational Resources Information Center

    Shonle, John I.

    1976-01-01

    Discusses the use of an analog computer oscilloscope to illustrate the transition from underdamped to overdamped for the simple harmonic oscillator, the maximum range for a projectile, and the behavior of charged particles in crossed electric and magnetic fields. (MLH)

  14. EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning

    ERIC Educational Resources Information Center

    Kitchakarn, Orachorn

    2015-01-01

    The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regard, some variables which might be potential antecedents of attitudes toward computers, including gender, experience of using computers and perceived abilities in using programs, were examined.…

  15. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  16. COPS: a sensitive and accurate tool for detecting somatic Copy Number Alterations using short-read sequence data from paired samples.

    PubMed

    Krishnan, Neeraja M; Gaur, Prakhar; Chaudhary, Rakshit; Rao, Arjun A; Panda, Binay

    2012-01-01

    Copy Number Alterations (CNAs), such as deletions and duplications, compose a larger percentage of genetic variations than single nucleotide polymorphisms or other structural variations in cancer genomes that undergo major chromosomal re-arrangements. It is, therefore, imperative to identify cancer-specific somatic copy number alterations (SCNAs), with respect to matched normal tissue, in order to understand their association with the disease. We have devised an accurate, sensitive, and easy-to-use tool, COPS, COpy number using Paired Samples, for detecting SCNAs. We rigorously tested the performance of COPS using short sequence simulated reads at various sizes and coverage of SCNAs, read depths, read lengths and also with real tumor:normal paired samples. We found COPS to perform better in comparison to other known SCNA detection tools for all evaluated parameters, namely, sensitivity (detection of true positives), specificity (detection of false positives) and size accuracy. COPS performed well for sequencing reads of all lengths when used with most upstream read alignment tools. Additionally, by incorporating a downstream boundary segmentation detection tool, the accuracy of SCNA boundaries was further improved. Here, we report an accurate, sensitive and easy-to-use tool for detecting cancer-specific SCNAs using short-read sequence data. In addition to cancer, COPS can be used for any disease as long as sequence reads from both disease and normal samples from the same individual are available. An added boundary segmentation detection module makes COPS-detected SCNA boundaries more specific for the samples studied. COPS is available at ftp://115.119.160.213 with username "cops" and password "cops". PMID:23110103
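
    COPS's own algorithm is not spelled out in the abstract, but the core paired-sample idea behind SCNA detection can be sketched as follows: bin the per-base read depths of tumor and matched normal into windows and inspect the log2 ratio. Everything here (function name, window size, inputs) is an illustrative assumption, not the published method.

      import math

      def log2_depth_ratios(tumor_depth, normal_depth, window=1000):
          """tumor_depth, normal_depth: equal-length lists of per-base read depths.
          Returns one log2(tumor/normal) value per window; values well above 0
          suggest duplications, well below 0 suggest deletions."""
          ratios = []
          for start in range(0, len(tumor_depth), window):
              t = sum(tumor_depth[start:start + window])
              n = sum(normal_depth[start:start + window])
              ratios.append(math.log2(t / n) if t > 0 and n > 0 else float("nan"))
          return ratios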

  17. Measurement Model for Division as a Tool in Computing Applications

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Strock, Tracy

    2002-01-01

    The paper describes the use of a spreadsheet in a mathematics teacher education course. It shows how the tool can serve as a link between seemingly disconnected mathematical concepts. The didactical triad of using a spreadsheet as an agent, consumer, and amplifier of mathematical activities allows for an extended investigation of simple yet…

  18. Building a Better Bibliography: Computer-Aided Research Tools.

    ERIC Educational Resources Information Center

    Bloomfield, Elizabeth

    1989-01-01

    Describes a project at the University of Guelph (Ontario) that combined both bibliographical and archival references in one large machine readable database to facilitate local history research. The description covers research tool creation, planning activities, system design, the database management system used, material selection, record…

  19. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  20. Computational tool for simulation of power and refrigeration cycles

    NASA Astrophysics Data System (ADS)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modelling the optimal changes for best performance. There is also a big boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because sources of heat from cogeneration are very different and each case would require a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles, the selection of the type of fluid (thermodynamic properties are calculated through the CoolProp library), the calculation of plant efficiency, the identification of the flow fractions in each branch and, finally, the generation of an educational report in PDF format via the LaTeX tool.
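
    As a flavor of the kind of calculation such a simulator performs, the sketch below evaluates the thermal efficiency of an ideal Rankine cycle using CoolProp's Python bindings (the paper's tool is written in C++, so this is an assumed, minimal re-creation, not the authors' code). The chosen pressures are arbitrary example values.

      from CoolProp.CoolProp import PropsSI  # requires the CoolProp package

      def ideal_rankine_efficiency(fluid="Water", p_boiler=8e6, p_cond=10e3):
          """Thermal efficiency of an ideal Rankine cycle; pressures in Pa.
          Pump work is approximated as v*dP on the saturated liquid."""
          h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)          # condenser exit, saturated liquid
          v1 = 1.0 / PropsSI("D", "P", p_cond, "Q", 0, fluid)
          h2 = h1 + v1 * (p_boiler - p_cond)                     # after isentropic pumping
          h3 = PropsSI("H", "P", p_boiler, "Q", 1, fluid)        # boiler exit, saturated vapor
          s3 = PropsSI("S", "P", p_boiler, "Q", 1, fluid)
          h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)         # isentropic turbine expansion
          w_net = (h3 - h4) - (h2 - h1)
          q_in = h3 - h2
          return w_net / q_in

      print(f"eta = {ideal_rankine_efficiency():.3f}")  # roughly 0.36 for these pressures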

  1. Distributed design tools: Mapping targeted design tools onto a Web-based distributed architecture for high-performance computing

    SciTech Connect

    Holmes, V.P.; Linebarger, J.M.; Miller, D.J.; Poore, C.A.

    1999-11-30

    Design Tools use a Web-based Java interface to guide a product designer through the design-to-analysis cycle for a specific, well-constrained design problem. When these Design Tools are mapped onto a Web-based distributed architecture for high-performance computing, the result is a family of Distributed Design Tools (DDTs). The software components that enable this mapping consist of a Task Sequencer, a generic Script Execution Service, and the storage of both data and metadata in an active, object-oriented database called the Product Database Operator (PDO). The benefits of DDTs include improved security, reliability, scalability (in both problem size and computing hardware), robustness, and reusability. In addition, access to the PDO unlocks its wide range of services for distributed components, such as lookup and launch capability, persistent shared memory for communication between cooperating services, state management, event notification, and archival of design-to-analysis session data.

  2. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  3. RepurposeVS: A Drug Repurposing-Focused Computational Method for Accurate Drug-Target Signature Predictions.

    PubMed

    Issa, Naiem T; Peters, Oakland J; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2015-01-01

    We describe here RepurposeVS for the reliable prediction of drug-target signatures using X-ray protein crystal structures. RepurposeVS is a virtual screening method that incorporates docking, drug-centric and protein-centric 2D/3D fingerprints with a rigorous mathematical normalization procedure to account for the variability in units and provide high-resolution contextual information for drug-target binding. Validity was confirmed by the following: (1) providing the greatest enrichment of known drug binders for multiple protein targets in virtual screening experiments, (2) determining that similarly shaped protein target pockets are predicted to bind drugs of similar 3D shapes when RepurposeVS is applied to 2,335 human protein targets, and (3) determining true biological associations in vitro for mebendazole (MBZ) across many predicted kinase targets for potential cancer repurposing. Since RepurposeVS is a drug repurposing-focused method, benchmarking was conducted on a set of 3,671 FDA approved and experimental drugs rather than the Database of Useful Decoys (DUDE) so as to streamline downstream repurposing experiments. We further apply RepurposeVS to explore the overall potential drug repurposing space for currently approved drugs. RepurposeVS is not computationally intensive and increases performance accuracy, thus serving as an efficient and powerful in silico tool to predict drug-target associations in drug repurposing. PMID:26234515
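
    The abstract emphasizes a normalization step that puts docking scores and 2D/3D fingerprint similarities, which live on very different scales, onto a common footing before they are combined. The exact RepurposeVS procedure is not given here; the sketch below only illustrates the general idea with a simple per-descriptor z-score followed by an unweighted sum, under assumed inputs.

      import statistics

      def combine_scores(score_table):
          """score_table: dict mapping descriptor name -> list of raw scores (one per drug).
          Returns one combined score per drug after z-scoring each descriptor."""
          normalized = {}
          for name, values in score_table.items():
              mu = statistics.mean(values)
              sd = statistics.pstdev(values) or 1.0    # guard against a constant column
              normalized[name] = [(v - mu) / sd for v in values]
          n_drugs = len(next(iter(score_table.values())))
          return [sum(col[i] for col in normalized.values()) for i in range(n_drugs)]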

  4. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms with support to migrate them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain, and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and for migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  5. Coordinated Computer-Supported Collaborative Learning: Awareness and Awareness Tools

    ERIC Educational Resources Information Center

    Janssen, Jeroen; Bodemer, Daniel

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members' activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to cognitive group awareness (e.g., information…

  6. DEVELOPMENT OF COMPUTATIONAL TOOLS FOR OPTIMAL IDENTIFICATION OF BIOLOGICAL NETWORKS

    EPA Science Inventory

    Following the theoretical analysis and computer simulations, the next step for the development of SNIP will be a proof-of-principle laboratory application. Specifically, we have obtained a synthetic transcriptional cascade (harbored in Escherichia coli...

  7. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    2013-09-17

    IHT is a toolkit for computing radiative heat exchange between particles. The algorithm is based on the 'Photon Monte Carlo' approach described by Wang and Modest and implemented as a library that can be interfaced with a variety of CFD codes to analyze radiative heat transfer in particle-laden flows.

  8. Integrated computational materials engineering: Tools, simulations and new applications

    DOE PAGESBeta

    Madison, Jonathan D.

    2016-03-30

    Here, Integrated Computational Materials Engineering (ICME) is a relatively new methodology full of tremendous potential to revolutionize how science, engineering and manufacturing work together. ICME was motivated by the desire to derive greater understanding throughout each portion of the development life cycle of materials, while simultaneously reducing the time from discovery to implementation [1,2].

  9. Students' Attitudes towards Animated Demonstrations as Computer Learning Tools

    ERIC Educational Resources Information Center

    Despotakis, Theofanis C.; Palaigeorgiou, George E.; Tsoukalas, Ioannis A.

    2007-01-01

    Animated demonstrations are increasingly used for presenting the functionality of various computer applications. Nevertheless, our understanding of whether and how students integrate this technology into their learning strategies remains limited. Although, several studies have examined animated demonstrations' learning efficiency, this study aims…

  10. Computer Vision Tools for Finding Images and Video Sequences.

    ERIC Educational Resources Information Center

    Forsyth, D. A.

    1999-01-01

    Computer vision offers a variety of techniques for searching for pictures in large collections of images. Appearance methods compare images based on the overall content of the image using certain criteria. Finding methods concentrate on matching subparts of images, defined in a variety of ways, in hope of finding particular objects. These ideas…

  11. Computer Assisted Reading Instruction: New Tools for New Experiences.

    ERIC Educational Resources Information Center

    Sponder, Barry

    A Language Experience Approach (LEA) to reading is based on the premise that a child's thinking naturally leads to talking, writing, and eventually reading. Information technologies offer powerful support for learning, but teachers and parents must learn to use these technologies effectively. Three types of computer applications that are…

  12. A Web Browsing Tool for a Shared Computer Environment

    ERIC Educational Resources Information Center

    Bodnar, George H.

    2007-01-01

    This paper provides a Microsoft .NET framework application that makes browsing the Internet in a shared computer environment convenient and secure. One simply opens the program, then points and clicks to both open Internet Explorer and have it move directly to the selected address. Addresses do not need to be manually entered or copied and pasted…

  13. Toward accurate volumetry of brain aneurysms: combination of an algorithm for automatic thresholding with a 3D eraser tool.

    PubMed

    Costalat, Vincent; Maldonado, Igor Lima; Strauss, Olivier; Bonafé, Alain

    2011-06-15

    The present study describes a new approach for aneurysm volume quantification on three-dimensional angiograms, which focuses on solving three common technical problems: the variability associated with the use of manual thresholds, the irregular morphology of some aneurysms, and the imprecision of the limits between the parent artery and the aneurysm sac. The method consists of combining an algorithm for automatic threshold determination with a spherical eraser tool that allows the user to separate the image of the aneurysm from the parent artery. The accuracy of volumetry after automatic thresholding was verified with an in vitro experiment in which 57 measurements were performed using four artificial aneurysms of known volume. The reliability of the method was compared to that obtained with the technique of ellipsoid approximation in a clinical setting of 15 real angiograms and 150 measurements performed by five different users. The mean error in the measurement of the artificial aneurysms was 7.23%. The reliability of the new approach was significantly higher than that of the ellipsoid approximation. Limits of agreement between two measurements were determined with Bland-Altman plots and ranged from -14 to 13% for complex and from -10.8 to 11.03% for simple-shaped sacs. The reproducibility was lower (>20% of variation) for small aneurysms (<70 mm³) and for those presenting a very wide neck (dome-to-neck ratio<1). The method is potentially useful in clinical practice, since it provides relatively precise, reproducible volume quantification. A safety coiling volume can be established in order to perform sufficient but not excessive filling of the aneurysm pouch. PMID:21540054
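
    The measurement itself reduces to the two ingredients the abstract describes: an automatically chosen intensity threshold and a user-driven "eraser" that removes the parent artery before voxels are counted. The sketch below is a minimal stand-in (Otsu's threshold instead of the authors' algorithm, a precomputed boolean mask in place of the interactive spherical eraser) and assumes scikit-image and NumPy are available.

      import numpy as np
      from skimage.filters import threshold_otsu

      def aneurysm_volume_mm3(volume, voxel_size_mm, keep_mask=None):
          """volume: 3D ndarray of angiographic intensities.
          voxel_size_mm: (dx, dy, dz) voxel dimensions in mm.
          keep_mask: optional boolean ndarray marking voxels not erased by the user."""
          segmented = volume > threshold_otsu(volume)    # automatic threshold
          if keep_mask is not None:
              segmented &= keep_mask                     # drop the parent artery
          return segmented.sum() * float(np.prod(voxel_size_mm))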

  14. Establishing Magnetic Resonance Imaging as an Accurate and Reliable Tool to Diagnose and Monitor Esophageal Cancer in a Rat Model

    PubMed Central

    Kosovec, Juliann E.; Zaidi, Ali H.; Komatsu, Yoshihiro; Kasi, Pashtoon M.; Cothron, Kyle; Thompson, Diane V.; Lynch, Edward; Jobe, Blair A.

    2014-01-01

    Objective To assess the reliability of magnetic resonance imaging (MRI) for detection of esophageal cancer in the Levrat model of end-to-side esophagojejunostomy. Background The Levrat model has proven utility in terms of its ability to replicate Barrett’s carcinogenesis by inducing gastroduodenoesophageal reflux (GDER). Due to lack of data on the utility of non-invasive methods for detection of esophageal cancer, treatment efficacy studies have been limited, as adenocarcinoma histology has only been validated post-mortem. It would therefore be of great value if the validity and reliability of MRI could be established in this setting. Methods Chronic GDER reflux was induced in 19 male Sprague-Dawley rats using the modified Levrat model. At 40 weeks post-surgery, all animals underwent endoscopy, MRI scanning, and post-mortem histological analysis of the esophagus and anastomosis. With post-mortem histology serving as the gold standard, assessment of presence of esophageal cancer was made by five esophageal specialists and five radiologists on endoscopy and MRI, respectively. Results The accuracy of MRI and endoscopic analysis to correctly identify cancer vs. no cancer was 85.3% and 50.5%, respectively. ROC curves demonstrated that MRI rating had an AUC of 0.966 (p<0.001) and endoscopy rating had an AUC of 0.534 (p = 0.804). The sensitivity and specificity of MRI for identifying cancer vs. no-cancer was 89.1% and 80% respectively, as compared to 45.5% and 57.5% for endoscopy. False positive rates of MRI and endoscopy were 20% and 42.5%, respectively. Conclusions MRI is a more reliable diagnostic method than endoscopy in the Levrat model. The non-invasiveness of the tool and its potential to volumetrically quantify the size and number of tumors likely makes it even more useful in evaluating novel agents and their efficacy in treatment studies of esophageal cancer. PMID:24705451
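
    The figures quoted above (accuracy, sensitivity, specificity, false positive rate) all come from comparing rater calls against post-mortem histology as the gold standard. For clarity, this is how such metrics are computed from a confusion matrix; the function is a generic illustration, not part of the study.

      def diagnostic_metrics(calls, truth):
          """calls, truth: equal-length lists of booleans (True = cancer present)."""
          tp = sum(c and t for c, t in zip(calls, truth))
          tn = sum((not c) and (not t) for c, t in zip(calls, truth))
          fp = sum(c and (not t) for c, t in zip(calls, truth))
          fn = sum((not c) and t for c, t in zip(calls, truth))
          return {
              "accuracy": (tp + tn) / len(truth),
              "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
              "specificity": tn / (tn + fp) if tn + fp else float("nan"),
              "false_positive_rate": fp / (fp + tn) if fp + tn else float("nan"),
          }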

  15. Lilith: A scalable secure tool for massively parallel distributed computing

    SciTech Connect

    Armstrong, R.C.; Camp, L.J.; Evensky, D.A.; Gentile, A.C.

    1997-06-01

    Changes in high performance computing have necessitated the ability to utilize and interrogate potentially many thousands of processors. The ASCI (Advanced Strategic Computing Initiative) program conducted by the United States Department of Energy, for example, envisions thousands of distinct operating systems connected by low-latency gigabit-per-second networks. In addition multiple systems of this kind will be linked via high-capacity networks with latencies as low as the speed of light will allow. Code which spans systems of this sort must be scalable; yet constructing such code whether for applications, debugging, or maintenance is an unsolved problem. Lilith is a research software platform that attempts to answer these questions with an end toward meeting these needs. Presently, Lilith exists as a test-bed, written in Java, for various spanning algorithms and security schemes. The test-bed software has, and enforces, hooks allowing implementation and testing of various security schemes.

  16. Computer aided systems human engineering: A hypermedia tool

    NASA Technical Reports Server (NTRS)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  17. Present status of computational tools for maglev development

    SciTech Connect

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  18. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    PubMed Central

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches. PMID:25566532

  19. Brain–computer interface technology as a tool to augment plasticity and outcomes for neurological rehabilitation

    PubMed Central

    Dobkin, Bruce H

    2007-01-01

    Brain–computer interfaces (BCIs) are a rehabilitation tool for tetraplegic patients that aim to improve quality of life by augmenting communication, control of the environment, and self-care. The neurobiology of both rehabilitation and BCI control depends upon learning to modify the efficacy of spared neural ensembles that represent movement, sensation and cognition through progressive practice with feedback and reward. To serve patients, BCI systems must become safe, reliable, cosmetically acceptable, quickly mastered with minimal ongoing technical support, and highly accurate even in the face of mental distractions and the uncontrolled environment beyond a laboratory. BCI technologies may raise ethical concerns if their availability affects the decisions of patients who become locked-in with brain stem stroke or amyotrophic lateral sclerosis to be sustained with ventilator support. If BCI technology becomes flexible and affordable, volitional control of cortical signals could be employed for the rehabilitation of motor and cognitive impairments in hemiplegic or paraplegic patients by offering on-line feedback about cortical activity associated with mental practice, motor intention, and other neural recruitment strategies during progressive task-oriented practice. Clinical trials with measures of quality of life will be necessary to demonstrate the value of near-term and future BCI applications. PMID:17095557

  20. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of a hand-held tool is for the purpose of controlling the speed of a fastener interface mechanism and the torque applied to fasteners by the fastener interface mechanism of the hand-held tool, and for monitoring the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.

  1. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  2. Real-time compensation for tool form errors in turning using computer vision

    SciTech Connect

    Nobel, G.; Donmez, M.A.; Burton, R.

    1990-01-01

    Deviations from the circular shape of the cutting edge of a single-point turning tool cause form errors in the workpiece during contour cutting. One can compensate for these tool-form errors by determining the size of the effective deviation at a particular instant during cutting, and then adjusting the position of the cutting tool accordingly. An algorithm for the compensation of tool-nose-radius errors in real time has been developed and implemented on a CNC turning center. A previously developed computer-vision-based tool-inspection system is used to determine the size of the deviations. Information from this system is fed to the error compensation computer which modifies the tool path in real time. Workpieces were cut utilizing the compensation system and were inspected on a coordinate measuring machine. Significant improvements in workpiece form were obtained.

  4. Real-time compensation for tool form errors in turning using computer vision

    NASA Astrophysics Data System (ADS)

    Nobel, Gary; Donmez, M. Alkan; Burton, Richard

    1990-11-01

    Deviations from the circular shape of the cutting edge of a single-point turning tool cause form errors in the workpiece during contour cutting. One can compensate for these tool-form errors by determining the size of the effective deviation at a particular instant during cutting and then adjusting the position of the cutting tool accordingly. An algorithm for the compensation of tool-nose-radius errors in real time has been developed and implemented on a CNC turning center. A previously developed computer-vision-based tool-inspection system is used to determine the size of the deviations. Information from this system is fed to the error compensation computer which modifies the tool path in real time. Workpieces were cut utilizing the compensation system and were inspected on a coordinate measuring machine. Significant improvements in workpiece form were obtained.

  5. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycle and costs. This goal has three elements: reactor scale model, feature level model, and database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk would aim at the description of various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to reactor model and computational surface science. This coupling poses challenging issues of orders of magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such

  6. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  7. Plans and resources required for a computer numerically controlled machine tool tester

    SciTech Connect

    Newton, L.E.; Burleson, R.R.; McCue, H.K.; Pomernacki, C.L.; Mansfield, A.R.; Childs, J.J.

    1982-07-19

    Precision computer numerically controlled (CNC) machine tools present unique and especially difficult problems in the areas of qualification and fault isolation. In this report, we examine and classify these problems, discuss methods to resolve them effectively, and present estimates of the resources needed to design and build a CNC/machine tool tester.

  8. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    ERIC Educational Resources Information Center

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  9. Which Way Will the Wind Blow? Networked Computer Tools for Studying the Weather.

    ERIC Educational Resources Information Center

    Fishman, Barry J.; D'Amico, Laura M.

    A suite of networked computer tools within a pedagogical framework was designed to enhance earth science education at the high school level. These tools give students access to live satellite images, weather maps, and other scientific data dealing with the weather, and make it easy for students to make their own weather forecasts by creating…

  10. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning of science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  11. An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.

    ERIC Educational Resources Information Center

    Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.

    1999-01-01

    Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)

  12. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  13. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    EPA Science Inventory

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  14. Computational Tools for Interpreting Ion Channel pH-Dependence

    PubMed Central

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) – Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903

  15. Computational Tools for Interpreting Ion Channel pH-Dependence.

    PubMed

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) - Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903
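
    The Debye-Hückel ingredient of the hybrid model described above has a simple closed form: a Coulomb interaction attenuated by exp(-kappa*r), where kappa is the inverse Debye screening length set by the ionic strength. The snippet below evaluates just that screened pair interaction; it is an illustrative fragment, not the FDPB solver or the published hybrid code.

      import math

      E_CHARGE = 1.602176634e-19   # C
      EPS0 = 8.8541878128e-12      # F/m
      KB = 1.380649e-23            # J/K
      N_A = 6.02214076e23          # 1/mol

      def debye_kappa(ionic_strength_molar, eps_r=78.5, T=298.15):
          """Inverse Debye screening length (1/m)."""
          I = ionic_strength_molar * 1000.0   # mol/L -> mol/m^3
          return math.sqrt(2.0 * N_A * E_CHARGE ** 2 * I / (EPS0 * eps_r * KB * T))

      def screened_coulomb_energy(q1_e, q2_e, r_nm, ionic_strength_molar=0.15, eps_r=78.5, T=298.15):
          """Interaction energy (J) of two point charges, given in units of the
          elementary charge, separated by r_nm nanometres in a screened solvent."""
          r = r_nm * 1e-9
          kappa = debye_kappa(ionic_strength_molar, eps_r, T)
          return (q1_e * q2_e * E_CHARGE ** 2) * math.exp(-kappa * r) / (4.0 * math.pi * EPS0 * eps_r * r)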

  16. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc Flash Protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.
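
    The simplest of the arc flash boundary formulas such programs embed is the Ralph Lee relation, in which the boundary distance grows with the square root of bolted-fault MVA times clearing time. The sketch below is a hedged illustration of that single formula only; commercial tools also apply the IEEE 1584 empirical equations and pull clearing times from device coordination curves rather than taking them as inputs.

      import math

      def lee_flash_boundary_ft(kv_line_to_line, bolted_fault_ka, clearing_time_s):
          """Arc flash protection boundary in feet by the Lee method
          (distance at which incident energy falls to about 1.2 cal/cm^2)."""
          mva_bf = math.sqrt(3.0) * kv_line_to_line * bolted_fault_ka   # bolted-fault MVA
          return math.sqrt(2.65 * mva_bf * clearing_time_s)

      # example: 13.8 kV bus, 20 kA bolted fault current, 0.5 s clearing time
      print(f"{lee_flash_boundary_ft(13.8, 20.0, 0.5):.1f} ft")   # about 25 ft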

  17. Computational databases, pathway and cheminformatics tools for tuberculosis drug discovery

    PubMed Central

    Ekins, Sean; Freundlich, Joel S.; Choi, Inhee; Sarker, Malabika; Talcott, Carolyn

    2010-01-01

    We are witnessing the growing menace of both increasing cases of drug-sensitive and drug-resistant Mycobacterium tuberculosis strains and the challenge to produce the first new tuberculosis (TB) drug in well over 40 years. The TB community, having invested in extensive high-throughput screening efforts, is faced with the question of how to optimally leverage this data in order to move from a hit to a lead to a clinical candidate and potentially a new drug. Complementing this approach, yet conducted on a much smaller scale, cheminformatic techniques have been leveraged and are herein reviewed. We suggest these computational approaches should be more optimally integrated in a workflow with experimental approaches to accelerate TB drug discovery. PMID:21129975

  18. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  19. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    ERIC Educational Resources Information Center

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  20. An Instructor's Guide to Collaborative Writing with CECE Talk: A Computer Network Tool.

    ERIC Educational Resources Information Center

    Neuwirth, Christine M.; And Others

    Describing a computer network communication tool which allows users to communicate concurrently across networked, advanced-function workstations, this guide presents information on how to use the Center for Educational Computing in English (CECE) Talk in the writing classroom. The guide focuses on three topics: (1) introducing CECE Talk to…

  1. Using Artificial Intelligence in Education: Computer-Based Tools for Instructional Development.

    ERIC Educational Resources Information Center

    Perez, Ray S.; Seidel, Robert J.

    1990-01-01

    Discussion of the use of artificial intelligence in computer-based instruction focuses on training development for the U.S. Army. Topics discussed include the Systems Approach to Training (SAT); knowledge acquisition; domain expertise; intelligent computer-assisted instruction; software tools and decision aids; and expert systems. (10 references)…

  2. A New Accurate 3D Measurement Tool to Assess the Range of Motion of the Tongue in Oral Cancer Patients: A Standardized Model.

    PubMed

    van Dijk, Simone; van Alphen, Maarten J A; Jacobi, Irene; Smeele, Ludwig E; van der Heijden, Ferdinand; Balm, Alfons J M

    2016-02-01

    In oral cancer treatment, function loss such as speech and swallowing deterioration can be severe, mostly due to reduced lingual mobility. Until now, there has been no standardized measurement tool for tongue mobility, and pre-operative prediction of function loss is based on expert opinion instead of evidence-based insight. The purpose of this study was to assess the reliability of a triple-camera setup for the measurement of tongue range of motion (ROM) in healthy adults and its feasibility in patients with partial glossectomy. A triple-camera setup was used, and 3D coordinates of the tongue in five standardized tongue positions were obtained in 15 healthy volunteers. Maximum distances between the tip of the tongue and the maxillary midline were calculated. Each participant was recorded twice, and each movie was analysed three times by two separate raters. Intrarater, interrater and test-retest reliability were the main outcome measures. Secondly, feasibility of the method was tested in ten patients treated for oral tongue carcinoma. Intrarater, interrater and test-retest reliability all showed high correlation coefficients of >0.9 in both study groups. All healthy subjects showed perfectly symmetrical tongue ROM. In patients, significant differences in lateral tongue movements were found, due to restricted tongue mobility after surgery. This triple-camera setup is a reliable measurement tool for assessing three-dimensional information on tongue ROM. It constitutes an accurate tool for objective grading of reduced tongue mobility after partial glossectomy. PMID:26516075
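
    The range-of-motion measures described above are ultimately Euclidean distances between reconstructed 3D points, so the computation itself is brief. The snippet below is a hypothetical illustration of that step; the coordinate values and reference point are invented for the example.

      import math

      def excursion(tongue_tip_xyz, midline_reference_xyz):
          """Distance between the reconstructed tongue tip and a point on the
          maxillary midline, in whatever units the 3D coordinates are given."""
          return math.dist(tongue_tip_xyz, midline_reference_xyz)

      # hypothetical left vs. right lateral excursion for one recording session
      left = excursion((32.4, 5.1, -12.0), (0.0, 0.0, 0.0))
      right = excursion((-30.9, 4.8, -11.6), (0.0, 0.0, 0.0))
      asymmetry = abs(left - right)   # near zero in the healthy volunteers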

  3. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code need focus primarily on the tool functionality, thus, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  4. Tool or Science? The History of Computing at the Norwegian University of Science and Technology

    NASA Astrophysics Data System (ADS)

    Nordal, Ola

    One may characterize the history of computing at the Norwegian University of Science and Technology by a tension between the computer as a tool in other disciplines and computer science as a discipline in itself. This tension has been latent from the pioneering period of the 1950s until today. This paper shows how the tension was expressed in the early attempts to take up computing at the University, and how it gave the Division of Computer Science a fairly rough start when it opened in 1972.

  5. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  6. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. The OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through standard programs of MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states) and then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations of each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing
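
    The binarization step described above is easy to make concrete. The following Python sketch is not the published OmicsTract code; the file names, the assumed "normal" value range and the PHYLIP-style column layout are assumptions for illustration of the described transformation into a 0/1 character matrix such as MIX reads.

    ```python
    # Illustrative sketch (not the published OmicsTract code): binarize a
    # quantitative omics table and write a PHYLIP-style 0/1 matrix.
    import csv

    def binarize(value, lower, upper):
        """Return '0' for values inside the assumed normal range, '1' otherwise."""
        return '0' if lower <= value <= upper else '1'

    def convert(csv_path, out_path, lower=-2.0, upper=2.0):
        with open(csv_path, newline='') as fh:
            rows = list(csv.reader(fh))
        header, data = rows[0], rows[1:]          # first column assumed to be sample ID
        matrix = []
        for row in data:
            states = ''.join(binarize(float(v), lower, upper) for v in row[1:])
            matrix.append((row[0], states))
        with open(out_path, 'w') as out:
            out.write(f" {len(matrix)} {len(header) - 1}\n")
            for name, states in matrix:
                out.write(f"{name[:10]:<10}{states}\n")   # PHYLIP-style 10-character names

    convert("expression.csv", "infile")   # hypothetical file names
    ```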

  7. Computational Tools for the Interactive Exploration of Proteomic and Structural Data*

    PubMed Central

    Morris, John H.; Meng, Elaine C.; Ferrin, Thomas E.

    2010-01-01

    Linking proteomics and structural data is critical to our understanding of cellular processes, and interactive exploration of these complementary data sets can be extremely valuable for developing or confirming hypotheses in silico. However, few computational tools facilitate linking these types of data interactively. In addition, the tools that do exist are neither well understood nor widely used by the proteomics or structural biology communities. We briefly describe several relevant tools, and then, using three scenarios, we present in depth two tools for the integrated exploration of proteomics and structural data. PMID:20525940

  8. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS collection promotes code portability, reusability, reduction of duplicate efforts, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  9. System-level tools and reconfigurable computing for next-generation HWIL systems

    NASA Astrophysics Data System (ADS)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct their final system. The paper will present the work in the area of integration of system level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 x 1024 resolution at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of Tera (10^12) operations per second.

  10. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools as well. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  11. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ . PMID:24564708
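
    As a rough illustration of the bookkeeping such a tool performs when a user adjusts a portfolio, the sketch below recomputes total emissions and cost from technology shares. The technology list, emission factors and costs are hypothetical placeholders, not values from the study or the actual tool.

    ```python
    # Minimal sketch of portfolio bookkeeping; factors are illustrative placeholders.
    FACTORS = {                # (emissions tCO2/MWh, cost $/MWh) -- hypothetical values
        "coal_ccs":    (0.10, 95.0),
        "natural_gas": (0.40, 60.0),
        "nuclear":     (0.00, 90.0),
        "wind":        (0.00, 70.0),
        "efficiency":  (0.00, 35.0),
    }

    def evaluate(portfolio):
        """portfolio: {technology: share of demand}; shares must sum to 1."""
        if abs(sum(portfolio.values()) - 1.0) > 1e-6:
            raise ValueError("shares must sum to 1")
        emissions = sum(share * FACTORS[t][0] for t, share in portfolio.items())
        cost = sum(share * FACTORS[t][1] for t, share in portfolio.items())
        return emissions, cost   # per MWh of demand served

    print(evaluate({"wind": 0.3, "nuclear": 0.3, "natural_gas": 0.2,
                    "coal_ccs": 0.1, "efficiency": 0.1}))
    ```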

  12. Development of generalized mapping tools to improve implementation of data driven computer simulations (04-ERD-083)

    SciTech Connect

    Ramirez, A; Pasyanos, M; Franz, G A

    2004-09-17

    The Stochastic Engine (SE) is a data driven computer simulation tool for predicting the characteristics of complex systems. The SE integrates accurate simulators with the Monte Carlo Markov Chain (MCMC) approach (a stochastic inverse technique) to identify alternative models that are consistent with available data and ranks these alternatives according to their probabilities. Implementation of the SE is currently cumbersome owing to the need to customize the pre-processing and processing steps that are required for a specific application. This project widens the applicability of the Stochastic Engine by generalizing some aspects of the method (i.e. model-to-data transformation types, configuration, model representation). We have generalized several of the transformations that are necessary to match the observations to proposed models. These transformations are sufficiently general that they are not tied to any single application. This approach provides a framework that increases the efficiency of the SE implementation. The overall goal is to reduce response time and make the approach as "plug-and-play" as possible, and will result in the rapid accumulation of new data types for a host of both earth science and non-earth science problems. When adapting the SE approach to a specific application, there are various pre-processing and processing steps that are typically needed to run a specific problem. Many of these steps are common to a wide variety of specific applications. Here we list and describe several data transformations that are common to a variety of subsurface inverse problems. A subset of these steps has been developed in a generalized form such that they could be used with little or no modification in a wide variety of specific applications. This work was funded by the LDRD Program (tracking number 04-ERD-083).
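
    The inverse step the SE builds on is a standard Metropolis-type MCMC loop: propose a model, run the forward simulator, and accept or reject based on the likelihood of the data. The sketch below illustrates only that generic loop; the toy forward model, observations and noise level are placeholders and are not part of the SE itself.

    ```python
    # Generic Metropolis sampler of the kind the Stochastic Engine builds on.
    import random, math

    data = [1.9, 4.1, 6.2]                    # hypothetical observations
    xs = [1.0, 2.0, 3.0]                      # corresponding inputs
    sigma = 0.2                               # assumed observation noise

    def forward(slope):                       # stand-in for an "accurate simulator"
        return [slope * x for x in xs]

    def log_likelihood(slope):
        return -sum((d - p) ** 2 for d, p in zip(data, forward(slope))) / (2 * sigma**2)

    def metropolis(n_steps=10000, step=0.1):
        current = 1.0
        ll = log_likelihood(current)
        samples = []
        for _ in range(n_steps):
            proposal = current + random.gauss(0.0, step)
            ll_new = log_likelihood(proposal)
            if math.log(random.random()) < ll_new - ll:   # accept with prob min(1, ratio)
                current, ll = proposal, ll_new
            samples.append(current)
        return samples                        # samples rank alternative models by probability

    samples = metropolis()
    print(sum(samples) / len(samples))        # posterior mean of the model parameter
    ```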

  13. Time-Accurate, Unstructured-Mesh Navier-Stokes Computations with the Space-Time CESE Method

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2006-01-01

    Application of the newly emerged space-time conservation element solution element (CESE) method to compressible Navier-Stokes equations is studied. In contrast to Euler equations solvers, several issues such as boundary conditions, numerical dissipation, and grid stiffness warrant systematic investigations and validations. Non-reflecting boundary conditions applied at the truncated boundary are also investigated from the standpoint of acoustic wave propagation. Validations of the numerical solutions are performed by comparing with exact solutions for steady-state as well as time-accurate viscous flow problems. The test cases cover a broad speed regime for problems ranging from acoustic wave propagation to 3D hypersonic configurations. Model problems pertinent to hypersonic configurations demonstrate the effectiveness of the CESE method in treating flows with shocks, unsteady waves, and separations. Good agreement with exact solutions suggests that the space-time CESE method provides a viable alternative for time-accurate Navier-Stokes calculations of a broad range of problems.

  14. Performance of computational tools in evaluating the functional impact of laboratory-induced amino acid mutations.

    PubMed

    Gray, Vanessa E; Kukurba, Kimberly R; Kumar, Sudhir

    2012-08-15

    Site-directed mutagenesis is frequently used by scientists to investigate the functional impact of amino acid mutations in the laboratory. Over 10,000 such laboratory-induced mutations have been reported in the UniProt database along with the outcomes of functional assays. Here, we explore the performance of state-of-the-art computational tools (Condel, PolyPhen-2 and SIFT) in correctly annotating the function-altering potential of 10,913 laboratory-induced mutations from 2372 proteins. We find that computational tools are very successful in diagnosing laboratory-induced mutations that elicit significant functional change in the laboratory (up to 92% accuracy). But, these tools consistently fail in correctly annotating laboratory-induced mutations that show no functional impact in the laboratory assays. Therefore, the overall accuracy of computational tools for laboratory-induced mutations is much lower than that observed for the naturally occurring human variants. We tested and rejected the possibilities that the preponderance of changes to alanine and the presence of multiple base-pair mutations in the laboratory were the reasons for the observed discordance between the performance of computational tools for natural and laboratory mutations. Instead, we discover that the laboratory-induced mutations occur predominately at the highly conserved positions in proteins, where the computational tools have the lowest accuracy of correct prediction for variants that do not impact function (neutral). Therefore, the comparisons of experimental-profiling results with those from computational predictions need to be sensitive to the evolutionary conservation of the positions harboring the amino acid change. PMID:22685075
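
    The headline numbers in this comparison come from class-wise accuracies rather than overall accuracy alone. A minimal sketch of that bookkeeping, using hypothetical confusion-matrix counts rather than the paper's data, is shown below; it illustrates how overall accuracy can stay high while the neutral class is predicted poorly.

    ```python
    # Class-wise evaluation sketch; counts are illustrative, not the paper's data.
    def class_accuracies(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)          # function-altering mutations called correctly
        specificity = tn / (tn + fp)          # neutral mutations called correctly
        accuracy = (tp + tn) / (tp + fn + tn + fp)
        return sensitivity, specificity, accuracy

    # e.g. a hypothetical split of 9000 damaging and 1900 neutral lab mutations
    print(class_accuracies(tp=8300, fn=700, tn=600, fp=1300))
    ```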

  15. Scale up tools in reactive extrusion and compounding processes. Could 1D-computer modeling be helpful?

    NASA Astrophysics Data System (ADS)

    Pradel, J.-L.; David, C.; Quinebèche, S.; Blondel, P.

    2014-05-01

    Industrial scale-up (or scale-down) in compounding and reactive extrusion processes is one of the most critical R&D challenges. Indeed, most high-performance polymers are obtained through reactive compounding involving chemistry: free radical grafting, in situ compatibilization, rheology control... but also side reactions: oxidation, branching, chain scission... As described by basic Arrhenius and kinetics laws, the competition between all chemical reactions depends on residence time distribution and temperature. Then, to ensure the best possible scale-up methodology, we need tools to match the thermal history of the formulation along the screws from a lab-scale twin screw extruder to an industrial one. This paper proposes a comparison between standard scale-up laws and the use of computer modeling software such as Ludovic® applied and compared to experimental data. Scaling data from one compounding line to another, applying general rules (for example at constant specific mechanical energy), shows differences between experimental and computed data, and the error depends on the screw speed range. For more accurate prediction, 1D computer modeling could be used to optimize the process conditions to ensure the best scale-up product, especially in temperature-sensitive reactive extrusion processes. When the product temperature along the screws is the key, Ludovic® software could help to compute the temperature profile along the screws and extrapolate conditions, even screw profile, on industrial extruders.
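
    The underlying reasoning is that reaction extent depends on the full time-temperature history through the Arrhenius law k(T) = A·exp(-Ea/RT). The sketch below compares first-order conversion for two illustrative residence-time/temperature profiles; the kinetic parameters and the profiles are placeholders for illustration, not data for any real formulation and not output from Ludovic®.

    ```python
    # Why matching thermal history matters: first-order conversion along the screws
    # depends on the integrated Arrhenius rate over the residence-time profile.
    import math

    R = 8.314                 # J/(mol K)
    A, Ea = 1.0e7, 8.0e4      # hypothetical pre-exponential factor (1/s) and activation energy (J/mol)

    def conversion(profile):
        """profile: list of (residence_time_s, temperature_K) segments."""
        integral = sum(t * A * math.exp(-Ea / (R * T)) for t, T in profile)
        return 1.0 - math.exp(-integral)      # first-order conversion

    lab_scale = [(10, 473), (20, 503), (10, 493)]        # short, hotter illustrative profile
    plant_scale = [(20, 468), (40, 498), (20, 488)]      # longer, slightly cooler profile

    print(conversion(lab_scale), conversion(plant_scale))  # different extents of reaction
    ```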

  16. Prostate cancer nodal oligometastasis accurately assessed using prostate-specific membrane antigen positron emission tomography-computed tomography and confirmed histologically following robotic-assisted lymph node dissection

    PubMed Central

    O’Kane, Dermot B.; Lawrentschuk, Nathan; Bolton, Damien M.

    2016-01-01

    We herein present a case of a 76-year-old gentleman, where prostate-specific membrane antigen positron emission tomography-computed tomography (PSMA PET-CT) was used to accurately detect prostate cancer (PCa) pelvic lymph node (LN) metastasis in the setting of biochemical recurrence following definitive treatment for PCa. The positive PSMA PET-CT result was confirmed with histological examination of the involved pelvic LNs following pelvic LN dissection. PMID:27141207

  17. Computer-based tools for decision support at the Hanford Site

    SciTech Connect

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions are largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  19. The -mdoc macro package: A software tool to support computer documentation standards

    SciTech Connect

    Sanders, C.E.

    1987-09-16

    At Los Alamos National Laboratory a small staff of writers and word processors in the Computer Documentation Group is responsible for producing computer documentation for the over 8000 users of the Laboratory's computer network. The -mdoc macro package was developed as a software tool to support that effort. The -mdoc macro package is used with the NROFF/TROFF document preparation system on the UNIX operating system. The -mdoc macro package incorporates the standards for computer documentation at Los Alamos that were established by the writers. Use of the -mdoc macro package has freed the staff from programming format details, allowing writers to concentrate on the content of documents and word processors to produce documents in a timely manner. It is an easy-to-use software tool that adapts to changing skills, needs, and technology. 5 refs.

  20. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  1. Accurate prediction of the toxicity of benzoic acid compounds in mice via oral without using any computer codes.

    PubMed

    Keshavarz, Mohammad Hossein; Gharagheizi, Farhad; Shokrolahi, Arash; Zakinejad, Sajjad

    2012-10-30

    Most benzoic acid derivatives are toxic, which may cause serious public health and environmental problems. Two novel, simple and reliable models are introduced for desk calculation of the oral LD(50) toxicity of benzoic acid compounds in mice, with as much reliance on their answers as one could attach to more complex outputs. They require only elemental composition and molecular fragments, without using any computer codes. The first model is based on only the number of carbon and hydrogen atoms, which can be improved by several molecular fragments in the second model. For 57 benzoic compounds, for which the computed results of quantitative structure-toxicity relationship (QSTR) models were recently reported, the predicted results of the two simple models of the present method are more reliable than the QSTR computations. The present simple method is also tested with a further 324 benzoic acid compounds, including complex molecular structures, which confirms the good forecasting ability of the second model. PMID:22959133
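
    For orientation, the first model described above amounts to a desk correlation in the carbon and hydrogen counts alone. The sketch below writes such a correlation in a generic linear form; the functional form and the coefficients are assumptions for illustration only and are not the fitted values reported in the paper.

    ```python
    # Element-count correlation of the general type described (first model).
    # The linear form and coefficients a, b, c are hypothetical placeholders.
    def predict_ld50(n_carbon, n_hydrogen, a=1000.0, b=50.0, c=-20.0):
        """Hypothetical desk calculation: oral LD50 (mg/kg) from atom counts."""
        return a + b * n_carbon + c * n_hydrogen

    # benzoic acid is C7H6O2, so n_C = 7 and n_H = 6
    print(predict_ld50(7, 6))   # placeholder output, not a reported toxicity value
    ```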

  2. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  3. An ONIOM study of the Bergman reaction: a computationally efficient and accurate method for modeling the enediyne anticancer antibiotics

    NASA Astrophysics Data System (ADS)

    Feldgus, Steven; Shields, George C.

    2001-10-01

    The Bergman cyclization of large polycyclic enediyne systems that mimic the cores of the enediyne anticancer antibiotics was studied using the ONIOM hybrid method. Tests on small enediynes show that ONIOM can accurately match experimental data. The effect of the triggering reaction in the natural products is investigated, and we support the argument that it is strain effects that lower the cyclization barrier. The barrier for the triggered molecule is very low, leading to a reasonable half-life at biological temperatures. No evidence is found that would suggest a concerted cyclization/H-atom abstraction mechanism is necessary for DNA cleavage.
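
    For reference, the two-layer ONIOM extrapolation used in studies of this kind combines a high-level calculation on the reactive core (the "model" system) with a low-level calculation on the full molecule (the "real" system); which atoms go into each layer for these enediynes is the authors' choice and is not reproduced here.

    ```latex
    E_{\mathrm{ONIOM}} = E_{\mathrm{high}}(\mathrm{model}) + E_{\mathrm{low}}(\mathrm{real}) - E_{\mathrm{low}}(\mathrm{model})
    ```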

  4. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  5. Development of a computational high-throughput tool for the quantitative examination of dose-dependent histological features.

    PubMed

    Nault, Rance; Colbry, Dirk; Brandenberger, Christina; Harkema, Jack R; Zacharewski, Timothy R

    2015-04-01

    High-resolution digitalizing of histology slides facilitates the development of computational alternatives to manual quantitation of features of interest. We developed a MATLAB-based quantitative histological analysis tool (QuHAnT) for the high-throughput assessment of distinguishable histological features. QuHAnT validation was demonstrated by comparison with manual quantitation using liver sections from mice orally gavaged with sesame oil vehicle or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD; 0.001-30 μg/kg) every 4 days for 28 days, which elicits hepatic steatosis with mild fibrosis. A quality control module of QuHAnT reduced the number of quantifiable Oil Red O (ORO)-stained images from 3,123 to 2,756. Increased ORO staining was measured at 10 and 30 μg/kg TCDD with a high correlation between manual and computational volume densities (Vv ), although the dynamic range of QuHAnT was 10-fold greater. Additionally, QuHAnT determined the size of each ORO vacuole, which could not be accurately quantitated by visual examination or manual point counting. PicroSirius Red quantitation demonstrated superior collagen deposition detection due to the ability to consider all images within each section. QuHAnT dramatically reduced analysis time and facilitated the comprehensive assessment of features improving accuracy and sensitivity and represents a complementary tool for tissue/cellular features that are difficult and tedious to assess via subjective or semiquantitative methods. PMID:25274660

  6. Development of a Computational High-Throughput Tool for the Quantitative Examination of Dose-Dependent Histological Features

    PubMed Central

    Nault, Rance; Colbry, Dirk; Brandenberger, Christina; Harkema, Jack R.; Zacharewski, Timothy R.

    2015-01-01

    High-resolution digitalizing of histology slides facilitates the development of computational alternatives to manual quantitation of features of interest. We developed a MATLAB-based quantitative histological analysis tool (QuHAnT) for the high-throughput assessment of distinguishable histological features. QuHAnT validation was demonstrated by comparison with manual quantitation using liver sections from mice orally gavaged with sesame oil vehicle or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD; 0.001–30 µg/kg) every 4 days for 28 days, which elicits hepatic steatosis with mild fibrosis. A quality control module of QuHAnT reduced the number of quantifiable Oil Red O (ORO)-stained images from 3,123 to 2,756. Increased ORO staining was measured at 10 and 30 µg/kg TCDD with a high correlation between manual and computational volume densities (Vv), although the dynamic range of QuHAnT was 10-fold greater. Additionally, QuHAnT determined the size of each ORO vacuole, which could not be accurately quantitated by visual examination or manual point counting. PicroSirius Red quantitation demonstrated superior collagen deposition detection due to the ability to consider all images within each section. QuHAnT dramatically reduced analysis time and facilitated the comprehensive assessment of features improving accuracy and sensitivity and represents a complementary tool for tissue/cellular features that are difficult and tedious to assess via subjective or semiquantitative methods. PMID:25274660
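
    The two measurements described above, a stained-area fraction (a proxy for the volume density Vv) and per-vacuole sizes, can be sketched as follows. This is an illustrative Python/NumPy version, not the MATLAB QuHAnT code, and the red-dominance threshold used to call a pixel ORO-positive is an assumption.

    ```python
    # Illustrative sketch: area fraction and object sizes from a thresholded image.
    import numpy as np
    from scipy import ndimage

    def oro_metrics(rgb, red_excess=40):
        """rgb: HxWx3 uint8 image. Returns (stained-area fraction, object sizes in pixels)."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        mask = (r - (g + b) // 2) > red_excess       # crude "ORO-positive" mask (assumed rule)
        area_fraction = mask.mean()                  # stained pixels / total pixels
        labels, n_objects = ndimage.label(mask)      # connected components = vacuoles
        sizes = np.bincount(labels.ravel())[1:]      # drop background label 0
        return area_fraction, sizes

    # usage (hypothetical file): fraction, vacuole_sizes = oro_metrics(imageio.imread("tile.png"))
    ```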

  7. A visualization tool for parallel and distributed computing using the Lilith framework

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Wyckoff, P.

    1998-05-01

    The authors present a visualization tool for the monitoring and debugging of codes run in a parallel and distributed computing environment, called Lilith Lights. This tool can be used both for debugging parallel codes and for resource management of clusters. It was developed under Lilith, a framework for creating scalable software tools for distributed computing. The use of Lilith provides scalable, non-invasive debugging, as opposed to other commonly used software debugging and visualization tools. Furthermore, by implementing the visualization tool in software rather than in hardware (as available on some MPPs), Lilith Lights is easily transferable to other machines, and well adapted for use on distributed clusters of machines. The information provided in a clustered environment can further be used for resource management of the cluster. In this paper, they introduce Lilith Lights, discussing its use on the Computational Plant cluster at Sandia National Laboratories, show its design and development under the Lilith framework, and present metrics for resource use and performance.

  8. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  9. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  10. Examining the effects of computational tools on students' understanding of thermodynamics of material concepts and representations

    NASA Astrophysics Data System (ADS)

    Ogunwuyi, Oluwatosin

    Technology is becoming a more critical agent for supporting learning as well as research in science and engineering. In particular, technology-based tools in the form of simulations and virtual environments support learning using mathematical models and computational methods. The purpose of this research is to: (a) measure the value added in conveying thermodynamics of materials concepts in a blended learning environment that combines computational simulation tools with lectures; and (b) characterize students' use of representational forms to convey their conceptual understanding of core concepts within a learning environment that blended the Gibbs computational resource and traditional lectures. A mixed-methods approach was implemented that included the use of statistical analysis to compare student test performance as a result of interacting with the Gibbs tool and the use of Grounded Theory inductive analysis to explore students' use of representational forms to express their understanding of thermodynamics of materials concepts. Results for the quantitative study revealed positive gains in students' conceptual understanding from before to after interacting with the Gibbs tool for the majority of the concepts tested. In addition, insight gained from the qualitative analysis helped provide understanding about how students utilized representational forms in communicating their understanding of thermodynamics of materials concepts. Knowledge of how novice students construct meaning in this context will provide insight for engineering education instructors and researchers in understanding students' learning processes in the context of educational environments that integrate expert simulation tools as part of their instructional resources for foundational domain knowledge.

  11. Computers as Pedagogical Tools in Brazil: A Pseudo-Panel Analysis

    ERIC Educational Resources Information Center

    Sprietsma, Maresa

    2012-01-01

    The number of schools that have access to computers and the Internet has increased rapidly since the beginning of the 1990s. However, evidence of their effectiveness as pedagogical tools to acquire reading and math skills is still the object of debate. We use repeated cross-section data from Brazil to evaluate the effect of the availability of a…

  12. Computer-Based Cognitive Tools in Teacher Training: The COG-TECH Projects

    ERIC Educational Resources Information Center

    Orhun, Emrah

    2003-01-01

    The COG-TECH (Cognitive Technologies for Problem Solving and Learning) Network conducted three international projects between 1994 and 2001 under the auspices of the European Commission. The main purpose of these projects was to train teacher educators in the Mediterranean countries to use computers as effective pedagogical tools. The summer…

  13. Making Waves: A Simulation and Modeling Computer-Tool for Studying Wave Phenomena.

    ERIC Educational Resources Information Center

    Snir, Joseph

    1989-01-01

    Examines the use of a computer simulation program as a tool to help in the understanding of wave phenomena. After analyzing some of the main difficulties and common misconceptions about waves, features of the "Making Waves" software package are described. Figures showing a typical monitor display are presented. (YP)

  14. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-Aided Process Engineering has become established in industry as a design tool. With the establishment of the CAPE-OPEN software specifications for process simulation environments, CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  15. An Evaluation of the Webquest as a Computer-Based Learning Tool

    ERIC Educational Resources Information Center

    Hassanien, Ahmed

    2006-01-01

    This paper explores the preparation and use of an internet activity for undergraduate learners in higher education (HE). It evaluates the effectiveness of using webquest as a computer-based learning (CBL) tool to support students to learn in HE. The evaluation undertaken offers insights into learner perceptions concerning the ease of use of the…

  16. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  17. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    ERIC Educational Resources Information Center

    Kees, Martin; Schor, Clifton

    1981-01-01

    An inexpensive computer graphics system (Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating various anomalies of binocular vision with graphical and statistical techniques and for treating them with lenses, prisms, and orthoptics. (MLW)

  18. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY: ASSAYS, MODELS AND TOOLS FOR NEXTGEN SAFETY ASSESSMENTS

    EPA Science Inventory

    The Center will develop new methods and tools, and will continue to collaborate closely with EPA, Tox21 and other environmental scientists. New in vitro population-based assays and computer-based models that fill critical gaps in risk assessment will be developed and deliver...

  19. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a video base intended to support the development of tools for the genuine social inclusion of the deaf.

  20. Computer-Mediated Communication as a Teaching Tool: A Case Study.

    ERIC Educational Resources Information Center

    Everett, Donna R.; Ahern, Terence C.

    1994-01-01

    Discussion of emerging educational technologies focuses on a case study of college students that was conducted to observe the effects of using computer-mediated communication and appropriate groupware as a teaching tool. Highlights include effects on the students, the structure of the classroom, and interpersonal interactions. (Contains 29…

  1. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  2. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMIT) have been identified as a way to effectively and economically present instructional material. MMITs are commonly used in introductory computer applications courses, as MMITs should be effective in increasing student knowledge and positively impact motivation and learning strategies, without increasing costs. This…

  3. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    ERIC Educational Resources Information Center

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…

  4. High order accurate and low dissipation method for unsteady compressible viscous flow computation on helicopter rotor in forward flight

    NASA Astrophysics Data System (ADS)

    Xu, Li; Weng, Peifen

    2014-02-01

    An improved fifth-order weighted essentially non-oscillatory (WENO-Z) scheme combined with the moving overset grid technique has been developed to compute unsteady compressible viscous flows on the helicopter rotor in forward flight. In order to enforce periodic rotation and pitching of the rotor and relative motion between rotor blades, the moving overset grid technique is extended, where a special judgement standard is presented near the odd surface of the blade grid during the search for donor cells using the Inverse Map method. The WENO-Z scheme is adopted for reconstructing left and right state values, with the Roe Riemann solver updating the inviscid fluxes, and is compared with the monotone upwind scheme for scalar conservation laws (MUSCL) and the classical WENO scheme. Since the WENO schemes require a six-point stencil to build the fifth-order flux, a method of three layers of fringes for hole boundaries and artificial external boundaries is proposed to carry out flow information exchange between chimera grids. The time advance of the unsteady solution is performed by the fully implicit dual time stepping method with Newton-type LU-SGS subiteration, where the solutions of a pseudo-steady computation are used as the initial fields of the unsteady flow computation. Numerical results on a non-variable-pitch rotor and a periodically varying-pitch rotor in forward flight reveal that the approach can effectively capture the vortex wake with low dissipation and reach periodic solutions quickly.
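
    The building block of the scheme is the fifth-order WENO-Z reconstruction of a cell-face value, which differs from classical WENO only in how the nonlinear weights are formed from the smoothness indicators. A minimal sketch for a scalar on the left-biased five-cell stencil is given below; the small parameter eps and the power p = 2 follow common practice and are not necessarily the exact choices of the paper.

    ```python
    # Fifth-order WENO-Z reconstruction of a scalar at the face i+1/2 from cells {i-2,...,i+2}.
    def weno_z(fm2, fm1, f0, fp1, fp2, eps=1e-40, p=2):
        # candidate third-order reconstructions on the three sub-stencils
        q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
        q1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
        q2 = (2*f0 + 5*fp1 - fp2) / 6.0
        # Jiang-Shu smoothness indicators
        b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
        b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
        b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
        tau5 = abs(b0 - b2)                      # the global indicator that defines WENO-Z
        d = (0.1, 0.6, 0.3)                      # ideal (linear) weights
        a = [dk * (1.0 + (tau5 / (bk + eps))**p) for dk, bk in zip(d, (b0, b1, b2))]
        s = sum(a)
        w = [ak / s for ak in a]
        return w[0]*q0 + w[1]*q1 + w[2]*q2       # left-biased face value at i+1/2

    # on smooth (linear) data the weights reduce to the ideal ones and the face value is exact
    print(weno_z(1.0, 2.0, 3.0, 4.0, 5.0))       # 3.5
    ```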

  5. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  6. Predicting suitable optoelectronic properties of monoclinic VON semiconductor crystals for photovoltaics using accurate first-principles computations.

    PubMed

    Harb, Moussab

    2015-10-14

    Using accurate first-principles quantum calculations based on DFT (including the DFPT) with the range-separated hybrid HSE06 exchange-correlation functional, we can predict the essential fundamental properties (such as bandgap, optical absorption coefficient, dielectric constant, charge carrier effective masses and exciton binding energy) of two stable monoclinic vanadium oxynitride (VON) semiconductor crystals for solar energy conversion applications. In addition to the predicted band gaps in the optimal range for making single-junction solar cells, both polymorphs exhibit a relatively high absorption efficiency in the visible range, a high dielectric constant, high charge carrier mobility and a much lower exciton binding energy than the thermal energy at room temperature. Moreover, their optical absorption, dielectric and exciton dissociation properties were found to be better than those obtained for semiconductors frequently utilized in photovoltaic devices such as Si, CdTe and GaAs. These novel results offer a great opportunity for this stoichiometric VON material to be properly synthesized and considered as a good new candidate for photovoltaic applications. PMID:26351755
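
    The comparison with the thermal energy can be made explicit with the standard hydrogenic (Wannier-Mott) estimate, which uses the computed reduced effective mass and dielectric constant (the VON-specific values are in the paper and are not reproduced here).

    ```latex
    E_b \;=\; \frac{\mu e^{4}}{2\hbar^{2}\,(4\pi\varepsilon_{0}\varepsilon_{r})^{2}}
        \;=\; 13.6\,\mathrm{eV}\times\frac{\mu/m_{0}}{\varepsilon_{r}^{2}},
    \qquad
    k_{B}T \approx 25.7\,\mathrm{meV}\ \text{at}\ T = 298\,\mathrm{K}
    ```

    Here μ is the reduced electron-hole effective mass (1/μ = 1/m_e* + 1/m_h*); a binding energy well below about 26 meV implies that excitons dissociate readily at room temperature.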

  7. InteractoMIX: a suite of computational tools to exploit interactomes in biological and clinical research.

    PubMed

    Poglayen, Daniel; Marín-López, Manuel Alejandro; Bonet, Jaume; Fornes, Oriol; Garcia-Garcia, Javier; Planas-Iglesias, Joan; Segura, Joan; Oliva, Baldo; Fernandez-Fuentes, Narcis

    2016-06-15

    Virtually all the biological processes that occur inside or outside cells are mediated by protein-protein interactions (PPIs). Hence, the charting and description of the PPI network, initially in organisms, the interactome, but more recently in specific tissues, is essential to fully understand cellular processes both in health and disease. The study of PPIs is also at the heart of renewed efforts in the medical and biotechnological arena in the quest for new therapeutic targets and drugs. Here, we present a mini review of 11 computational tools and resources developed by us to address different aspects of PPIs: from the interactome level to atomic 3D structural details. We provide details on each resource's aims and purpose and compare it with equivalent tools in the literature. All the tools are presented in a centralized, one-stop website: InteractoMIX (http://interactomix.com). PMID:27284060

  8. Extending peripersonal space representation without tool-use: evidence from a combined behavioral-computational approach

    PubMed Central

    Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa

    2015-01-01

    Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biological-plausible model to explain plasticity in PPS representation after tool-use, which is

  9. Extending peripersonal space representation without tool-use: evidence from a combined behavioral-computational approach.

    PubMed

    Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa

    2015-01-01

    Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biological-plausible model to explain plasticity in PPS representation after tool-use, which is

  10. TepiTool: A Pipeline for Computational Prediction of T Cell Epitope Candidates.

    PubMed

    Paul, Sinu; Sidney, John; Sette, Alessandro; Peters, Bjoern

    2016-01-01

    Computational prediction of T cell epitope candidates is currently being used in several applications including vaccine discovery studies, development of diagnostics, and removal of unwanted immune responses against protein therapeutics. There have been continuous improvements in the performance of MHC binding prediction tools, but their general adoption by immunologists has been slow due to the lack of user-friendly interfaces and guidelines. Current tools only provide minimal advice on what alleles to include, what lengths to consider, how to deal with homologous peptides, and what cutoffs should be considered relevant. This protocol provides step-by-step instructions with necessary recommendations for prediction of the best T cell epitope candidates with the newly developed online tool called TepiTool. TepiTool, which is part of the Immune Epitope Database (IEDB), provides some of the top MHC binding prediction algorithms for a number of species including humans, chimpanzees, bovines, gorillas, macaques, mice, and pigs. TepiTool is freely accessible at http://tools.iedb.org/tepitool/. © 2016 by John Wiley & Sons, Inc. PMID:27479659

  11. Blast-induced biomechanical loading of the rat: an experimental and anatomically accurate computational blast injury model.

    PubMed

    Sundaramurthy, Aravind; Alai, Aaron; Ganpule, Shailesh; Holmberg, Aaron; Plougonven, Erwan; Chandra, Namas

    2012-09-01

    Blast waves generated by improvised explosive devices (IEDs) cause traumatic brain injury (TBI) in soldiers and civilians. In vivo animal models that use shock tubes are extensively used in laboratories to simulate field conditions, to identify mechanisms of injury, and to develop injury thresholds. In this article, we place rats in different locations along the length of the shock tube (i.e., inside, outside, and near the exit), to examine the role of animal placement location (APL) in the biomechanical load experienced by the animal. We found that the biomechanical load on the brain and internal organs in the thoracic cavity (lungs and heart) varied significantly depending on the APL. When the specimen is positioned outside, organs in the thoracic cavity experience a higher pressure for a longer duration, in contrast to APL inside the shock tube. This in turn will possibly alter the injury type, severity, and lethality. We found that the optimal APL is where the Friedlander waveform is first formed inside the shock tube. Once the optimal APL was determined, the effect of the incident blast intensity on the surface and intracranial pressure was measured and analyzed. Noticeably, surface and intracranial pressure increases linearly with the incident peak overpressures, though surface pressures are significantly higher than the other two. Further, we developed and validated an anatomically accurate finite element model of the rat head. With this model, we determined that the main pathway of pressure transmission to the brain was through the skull and not through the snout; however, the snout plays a secondary role in diffracting the incoming blast wave towards the skull. PMID:22620716

  12. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Secure Socket Layer-Virtual Private Network) technology for access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security. Also, we set a fine-grained access control policy for shared tools and data and used a shared-key encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. By using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system in the grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied in trial use during the first period of GNEP.

  13. Lilith: A Java framework for the development of scalable tools for high performance distributed computing platforms

    SciTech Connect

    Evensky, D.A.; Gentile, A.C.; Armstrong, R.C.

    1998-03-19

    Increasingly, high performance computing constitutes the use of very large heterogeneous clusters of machines. The use and maintenance of such clusters are subject to complexities of communication between the machines in a time-efficient and secure manner. Lilith is a general purpose tool that provides a highly scalable, secure, and easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. Lilith is written in Java, taking advantage of Java's unique features of loading and distributing code dynamically, its platform independence, its thread support, and its provision of graphical components to facilitate easy-to-use resultant tools. The authors describe the use of Lilith in a tool developed for the maintenance of the large distributed cluster at their institution and present details of the Lilith architecture and user API for the general user development of scalable tools.

  14. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. There are several currently available tools that can be used directly, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool has the capability to analyze the most critical path and to quantify the probability of system effectiveness as a performance measure.
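
    As a rough illustration of the network approach described above, the short Python sketch below finds the most critical adversary path through a toy facility graph; the node names, edge detection probabilities, and layout are invented for the example and are not taken from the paper, which also models delay and response.

      # Minimal sketch (not the authors' tool): most critical adversary path
      # through a hypothetical facility graph, where each edge carries an
      # illustrative detection probability p_d.
      import math
      import networkx as nx

      G = nx.DiGraph()
      # (from, to, p_d) -- all values are made up for illustration
      edges = [("offsite", "fence", 0.10), ("fence", "door", 0.50),
               ("fence", "wall", 0.20), ("door", "vault", 0.90),
               ("wall", "vault", 0.60)]
      for u, v, p_d in edges:
          # minimizing sum(-log(1 - p_d)) maximizes the probability of
          # traversing the whole path undetected
          G.add_edge(u, v, p_d=p_d, w=-math.log(1.0 - p_d))

      path = nx.shortest_path(G, "offsite", "vault", weight="w")
      p_miss = math.prod(1.0 - G[u][v]["p_d"] for u, v in zip(path, path[1:]))
      print("most critical path:", " -> ".join(path))
      print("cumulative detection probability along it: %.2f" % (1.0 - p_miss))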

  15. A computational dosimetry tool for the study of tumor doses and skin toxicities in BNCT.

    PubMed

    Gossio, Sebastián; Carando, Daniel G; González, Sara J

    2009-07-01

    A Matlab-based computational tool, named SPHERE, was developed that helps determining tumor and skin doses in BNCT treatments. It was especially designed for cutaneous melanoma treatments and, among its features, it provides a guide for the location and delineation of tumors and a visual representation of superficial dose distributions (for both tumor and normal tissues). It also generates cumulative dose-volume histograms for different volumes of interest and dose-area histograms for skin. A description of the tool is presented, as well as examples of its application. PMID:19386508

  16. Video analysis of projectile motion using tablet computers as experimental tools

    NASA Astrophysics Data System (ADS)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.

  17. On the utility of spectroscopic imaging as a tool for generating geometrically accurate MR images and parameter maps in the presence of field inhomogeneities and chemical shift effects.

    PubMed

    Bakker, Chris J G; de Leeuw, Hendrik; van de Maat, Gerrit H; van Gorp, Jetse S; Bouwman, Job G; Seevinck, Peter R

    2013-01-01

    Lack of spatial accuracy is a recognized problem in magnetic resonance imaging (MRI) which severely detracts from its value as a stand-alone modality for applications that put high demands on geometric fidelity, such as radiotherapy treatment planning and stereotactic neurosurgery. In this paper, we illustrate the potential and discuss the limitations of spectroscopic imaging as a tool for generating purely phase-encoded MR images and parameter maps that preserve the geometry of an object and allow localization of object features in world coordinates. Experiments were done on a clinical system with standard facilities for imaging and spectroscopy. Images were acquired with a regular spin echo sequence and a corresponding spectroscopic imaging sequence. In the latter, successive samples of the acquired echo were used for the reconstruction of a series of evenly spaced images in the time and frequency domain. Experiments were done with a spatial linearity phantom and a series of test objects representing a wide range of susceptibility- and chemical-shift-induced off-resonance conditions. In contrast to regular spin echo imaging, spectroscopic imaging was shown to be immune to off-resonance effects, such as those caused by field inhomogeneity, susceptibility, chemical shift, f(0) offset and field drift, and to yield geometrically accurate images and parameter maps that allowed object structures to be localized in world coordinates. From these illustrative examples and a discussion of the limitations of purely phase-encoded imaging techniques, it is concluded that spectroscopic imaging offers a fundamental solution to the geometric deficiencies of MRI which may evolve toward a practical solution when full advantage will be taken of current developments with regard to scan time reduction. This perspective is backed up by a demonstration of the significant scan time reduction that may be achieved by the use of compressed sensing for a simple phantom. PMID:22898694

  18. Effects of a More Accurate Polarizable Hamiltonian on Polymorph Free Energies Computed Efficiently by Reweighting Point-Charge Potentials.

    PubMed

    Dybeck, Eric C; Schieber, Natalie P; Shirts, Michael R

    2016-08-01

    We examine the free energies of three benzene polymorphs as a function of temperature in the point-charge OPLS-AA and GROMOS54A7 potentials as well as the polarizable AMOEBA09 potential. For this system, using a polarizable Hamiltonian instead of the cheaper point-charge potentials is shown to have a significantly smaller effect on the stability at 250 K than on the lattice energy at 0 K. The benzene I polymorph is found to be the most stable crystal structure in all three potentials examined and at all temperatures examined. For each potential, we report the free energies over a range of temperatures and discuss the added value of using full free energy methods over the minimized lattice energy to determine the relative crystal stability at finite temperatures. The free energies in the polarizable Hamiltonian are efficiently calculated using samples collected in a cheaper point-charge potential. The polarizable free energies are estimated from the point-charge trajectories using Boltzmann reweighting with MBAR. The high configuration-space overlap necessary for efficient Boltzmann reweighting is achieved by designing point-charge potentials with intramolecular parameters matching those in the expensive polarizable Hamiltonian. Finally, we compare the computational cost of this indirect reweighted free energy estimate to the cost of simulating directly in the expensive polarizable Hamiltonian. PMID:27341280
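
    As a hedged sketch of the reweighting idea described above, the snippet below applies single-state exponential (Zwanzig) reweighting from a cheap to an expensive Hamiltonian; the paper uses MBAR, which generalizes this to multiple states, and the energy arrays here are synthetic placeholders rather than simulation output.

      # Minimal sketch of Boltzmann reweighting from a cheap point-charge
      # ensemble to an expensive polarizable Hamiltonian.  Single-state
      # exponential (Zwanzig) reweighting is used for brevity; MBAR
      # generalizes this to many states.  Energies are placeholders.
      import numpy as np

      kB = 0.0019872041        # kcal/(mol K), assuming energies in kcal/mol
      T = 250.0
      beta = 1.0 / (kB * T)

      rng = np.random.default_rng(0)
      u_cheap = rng.normal(0.0, 1.0, size=5000)           # U in point-charge potential
      u_expensive = u_cheap + rng.normal(0.5, 0.3, 5000)  # U re-evaluated with polarizable model

      du = u_expensive - u_cheap
      w = np.exp(-beta * (du - du.min()))                 # shift for numerical stability
      w /= w.sum()

      # free energy difference between the two Hamiltonians for this polymorph
      dA = -kB * T * np.log(np.mean(np.exp(-beta * (du - du.min())))) + du.min()
      # effective sample size: a quick check of configuration-space overlap
      n_eff = 1.0 / np.sum(w ** 2)
      print("dA (cheap -> polarizable): %.3f kcal/mol, effective samples: %.0f" % (dA, n_eff))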

  19. Staging of osteonecrosis of the jaw requires computed tomography for accurate definition of the extent of bony disease.

    PubMed

    Bedogni, Alberto; Fedele, Stefano; Bedogni, Giorgio; Scoletta, Matteo; Favia, Gianfranco; Colella, Giuseppe; Agrillo, Alessandro; Bettini, Giordana; Di Fede, Olga; Oteri, Giacomo; Fusco, Vittorio; Gabriele, Mario; Ottolenghi, Livia; Valsecchi, Stefano; Porter, Stephen; Petruzzi, Massimo; Arduino, Paolo; D'Amato, Salvatore; Ungari, Claudio; Fung Polly, Pok-Lam; Saia, Giorgia; Campisi, Giuseppina

    2014-09-01

    Management of osteonecrosis of the jaw associated with antiresorptive agents is challenging, and outcomes are unpredictable. The severity of disease is the main guide to management, and can help to predict prognosis. Most available staging systems for osteonecrosis, including the widely-used American Association of Oral and Maxillofacial Surgeons (AAOMS) system, classify severity on the basis of clinical and radiographic findings. However, clinical inspection and radiography are limited in their ability to identify the extent of necrotic bone disease compared with computed tomography (CT). We have organised a large multicentre retrospective study (known as MISSION) to investigate the agreement between the AAOMS staging system and the extent of osteonecrosis of the jaw (focal compared with diffuse involvement of bone) as detected on CT. We studied 799 patients with detailed clinical phenotyping who had CT images taken. Features of diffuse bone disease were identified on CT within all AAOMS stages (20%, 8%, 48%, and 24% of patients in stages 0, 1, 2, and 3, respectively). Of the patients classified as stage 0, 110/192 (57%) had diffuse disease on CT, and about 1 in 3 with CT evidence of diffuse bone disease was misclassified by the AAOMS system as having stages 0 and 1 osteonecrosis. In addition, more than a third of patients with AAOMS stage 2 (142/405, 35%) had focal bone disease on CT. We conclude that the AAOMS staging system does not correctly identify the extent of bony disease in patients with osteonecrosis of the jaw. PMID:24856927

  20. An efficient and accurate technique to compute the absorption, emission, and transmission of radiation by the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Lindner, Bernhard Lee; Ackerman, Thomas P.; Pollack, James B.

    1990-01-01

    CO2 comprises 95% of the Martian atmosphere. However, the Martian atmosphere also has a high aerosol content. Dust particles vary from less than 0.2 to greater than 3.0. CO2 is an active absorber and emitter at near-IR and IR wavelengths; the near-IR absorption bands of CO2 provide significant heating of the atmosphere, and the 15-micron band provides rapid cooling. Including both CO2 and aerosol radiative transfer simultaneously in a model is difficult. Aerosol radiative transfer requires a multiple-scattering code, while CO2 radiative transfer must deal with complex wavelength structure. As an alternative to the pure-atmosphere treatment used in most models, which causes inaccuracies, a treatment called the exponential-sum or k-distribution approximation was developed. The chief advantage of the exponential-sum approach is that the integration of f(k) over k space can be computed more quickly than the integration of k_ν over frequency. The exponential-sum approach is superior to the photon path distribution and emissivity techniques for dusty conditions. This study was the first application of the exponential-sum approach to Martian conditions.
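
    For readers unfamiliar with the method, the exponential-sum (k-distribution) idea can be written schematically as follows; this is the textbook form of the approximation, not necessarily the exact formulation used in the study.

      \bar{T}(u) = \frac{1}{\Delta\nu}\int_{\Delta\nu} e^{-k_\nu u}\, d\nu
                 = \int_0^\infty f(k)\, e^{-k u}\, dk
                 \approx \sum_{i=1}^{N} w_i\, e^{-k_i u}

    Here u is the absorber amount, f(k) is the probability density of absorption coefficients within the band, and the weights w_i sum to one; the rapidly varying integral over frequency is replaced by a short sum over a smooth distribution in k, which is what makes the method fast.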

  1. Assessment of the extended Koopmans' theorem for the chemical reactivity: Accurate computations of chemical potentials, chemical hardnesses, and electrophilicity indices.

    PubMed

    Yildiz, Dilan; Bozkaya, Uğur

    2016-01-30

    The extended Koopmans' theorem (EKT) provides a straightforward way to compute ionization potentials and electron affinities from any level of theory. Although it is widely applied to ionization potentials, the EKT approach has not been applied to evaluation of the chemical reactivity. We present the first benchmarking study to investigate the performance of the EKT methods for predictions of chemical potentials (μ) (hence electronegativities), chemical hardnesses (η), and electrophilicity indices (ω). We assess the performance of the EKT approaches for post-Hartree-Fock methods, such as Møller-Plesset perturbation theory, the coupled-electron pair theory, and their orbital-optimized counterparts for the evaluation of the chemical reactivity. Especially, results of the orbital-optimized coupled-electron pair theory method (with the aug-cc-pVQZ basis set) for predictions of the chemical reactivity are very promising; the corresponding mean absolute errors are 0.16, 0.28, and 0.09 eV for μ, η, and ω, respectively. PMID:26458329
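
    For reference, the conceptual-DFT working definitions behind the benchmarked quantities can be written in terms of the ionization potential IP and electron affinity EA, in one common finite-difference convention (sign and factor-of-two conventions vary in the literature):

      \mu \approx -\frac{IP + EA}{2} = -\chi, \qquad
      \eta \approx IP - EA, \qquad
      \omega = \frac{\mu^{2}}{2\eta}

    With IP and EA supplied by the EKT at a given level of theory, these expressions immediately yield the electronegativity, hardness, and electrophilicity compared in the benchmark.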

  2. Computational Study of the Reactions of Methanol with the Hydroperoxyl and Methyl Radicals. Part I: Accurate Thermochemistry and Barrier Heights

    SciTech Connect

    Alecu, I. M.; Truhlar, D. G.

    2011-04-07

    The reactions of CH3OH with the HO2 and CH3 radicals are important in the combustion of methanol and are prototypes for reactions of heavier alcohols in biofuels. The reaction energies and barrier heights for these reaction systems are computed with CCSD(T) theory extrapolated to the complete basis set limit using correlation-consistent basis sets, both augmented and unaugmented, and further refined by including a fully coupled treatment of the connected triple excitations, a second-order perturbative treatment of quadruple excitations (by CCSDT(2)Q), core–valence corrections, and scalar relativistic effects. It is shown that the M08-HX and M08-SO hybrid meta-GGA density functionals can achieve sub-kcal mol-1 agreement with the high-level ab initio results, identifying these functionals as important potential candidates for direct dynamics studies on the rates of these and homologous reaction systems.
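
    As a hedged illustration of the kind of complete-basis-set extrapolation referred to above (the precise scheme used in the paper may differ), a widely used two-point formula for correlation energies obtained with correlation-consistent basis sets of cardinal numbers X-1 and X is:

      E_{\mathrm{corr}}^{\mathrm{CBS}} \approx
      \frac{X^{3} E_{\mathrm{corr}}^{X} - (X-1)^{3} E_{\mathrm{corr}}^{X-1}}
           {X^{3} - (X-1)^{3}}

    For example, X = 4 combines triple- and quadruple-zeta results; the Hartree-Fock component converges faster with basis-set size and is usually extrapolated separately or simply taken from the largest basis.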

  3. CG3TOOL: an interactive computer program to process Scintrex CG-3/3M gravity data for high-resolution applications

    NASA Astrophysics Data System (ADS)

    Gabalda, G.; Bonvalot, S.; Hipkin, R.

    2003-03-01

    A newly developed interactive computer program, CG3TOOL, has been dedicated to the processing of the gravity data acquired by the Scintrex CG-3/3M automated gravity meter. The aim of CG3TOOL is twofold: to allow for an objective evaluation of Scintrex data and to provide a higher resolution in data reductions than those computed in real time by the microprocessor-controlled instrument. The program reads the gravity data acquired in either field or cycle mode (field surveys and continuous recordings, respectively) and then downloaded from the meter to a PC. The processing tasks are divided into two successive levels. Level 1 is dedicated to the reduction of the daily data files by applying standard or accurate corrections (earth tide, instrumental drift, atmospheric pressure). The precise corrections are performed to the microGal (μGal) level, in accordance with the specifications of high-resolution surveys. Level 2 contains a series of processing tools (including network adjustment, anomaly computation, and gravity meter calibration) that will precisely compute and adjust the gravity values with error estimates. The interactive procedures and the program output (plot and text files) have been designed to ease data handling and archiving as well as to provide useful information for future purposes of data interpretation or modeling. CG3TOOL was developed in standard C for Unix Sun workstations and uses the standard graphical and mathematical Generic Mapping Tools (GMT) free library, available on the web. The objectives and principles of the computer program are presented along with corresponding examples of the main processing tasks applied to observed data.

  4. Highly Accurate Infrared Line Lists of SO2 Isotopologues Computed for Atmospheric Modeling on Venus and Exoplanets

    NASA Astrophysics Data System (ADS)

    Huang, X.; Schwenke, D.; Lee, T. J.

    2014-12-01

    Last year we reported a semi-empirical 32S16O2 spectroscopic line list (denoted Ames-296K) for atmospheric characterization of Venus and other exoplanetary environments. To facilitate the determination of sulfur isotopic ratios and sulfur chemistry models, we now present Ames-296K line lists for both 626 (upgraded) and 4 other symmetric isotopologues: 636, 646, 666 and 828. The line lists are computed on an ab initio potential energy surface refined with the most reliable high-resolution experimental data, using a high-quality CCSD(T)/aug-cc-pV(Q+d)Z dipole moment surface. The most valuable part of our approach is to provide "truly reliable" predictions (and alternatives) for spectra that are unknown or hard to measure or analyze. This strategy ensures that the lists are the best available alternative for the wide spectral regions missing from spectroscopic databases such as HITRAN and GEISA, where only very limited data exist for 626/646 and no infrared data at all for 636/666 or other minor isotopologues. Our general line position accuracy up to 5000 cm-1 is 0.01 - 0.02 cm-1 or better. Most transition intensity deviations are less than 5% compared to experimentally measured quantities. Note that we have solved a convergence issue and further improved the quality and completeness of the main isotopologue 626 list at 296 K. We will compare the lists to available models in CDMS/JPL/HITRAN and discuss future mutually beneficial interactions between theoretical and experimental efforts.

  5. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  6. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
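
    A minimal sketch of the kind of discrete-event batch-queue simulation described above is shown below; it is not the NCCS tool, and the workload (arrival times, CPU counts, runtimes) and machine size are synthetic.

      # Minimal sketch of a discrete-event simulation of a FIFO batch queue on a
      # parallel machine (illustrative only; not the locally developed NCCS tool).
      import heapq
      import random

      random.seed(1)
      TOTAL_CPUS = 512
      # (arrival_time_h, cpus_requested, runtime_h) -- synthetic workload
      jobs = sorted((random.uniform(0, 48), random.choice([16, 64, 128, 256]),
                     random.uniform(1, 24)) for _ in range(200))

      running = []                 # heap of (finish_time, cpus) for executing jobs
      free = TOTAL_CPUS
      clock = 0.0
      waits = []

      for arrival, cpus, runtime in jobs:          # FIFO dispatch in arrival order
          clock = max(clock, arrival)
          # lazily release CPUs from finishing jobs until this job fits
          while free < cpus:
              finish, c = heapq.heappop(running)
              clock = max(clock, finish)
              free += c
          waits.append(clock - arrival)
          free -= cpus
          heapq.heappush(running, (clock + runtime, cpus))

      print("mean queue wait: %.1f h, max wait: %.1f h"
            % (sum(waits) / len(waits), max(waits)))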

  7. IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  8. Computational Tools for Predictive Modeling of Properties in Complex Actinide Systems

    SciTech Connect

    Autschbach, Jochen; Govind, Niranjan; Atta Fynn, Raymond; Bylaska, Eric J.; Weare, John H.; de Jong, Wibe A.

    2015-03-30

    In this chapter we focus on methodological and computational aspects that are key to accurately modeling the spectroscopic and thermodynamic properties of molecular systems containing actinides within the density functional theory (DFT) framework. Our focus is on properties that require either an accurate relativistic all-electron description or an accurate description of the dynamical behavior of actinide species in an environment at finite temperature, or both. The implementation of the methods and the calculations discussed in this chapter were done with the NWChem software suite (Valiev et al. 2010). In the first two sections we discuss two methods that account for relativistic effects, the ZORA and the X2C Hamiltonian. Section 1.2.1 discusses the implementation of the approximate relativistic ZORA Hamiltonian and its extension to magnetic properties. Section 1.3 focuses on the exact X2C Hamiltonian and the application of this methodology to obtain accurate molecular properties. In Section 1.4 we examine the role of a dynamical environment at finite temperature as well as the presence of other ions on the thermodynamics of hydrolysis and exchange reaction mechanisms. Finally, Section 1.5 discusses the modeling of XAS (EXAFS, XANES) properties in realistic environments accounting for both the dynamics of the system and (for XANES) the relativistic effects.

  9. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    ERIC Educational Resources Information Center

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  10. Computational thermodynamics, Gaussian processes and genetic algorithms: combined tools to design new alloys

    NASA Astrophysics Data System (ADS)

    Tancret, F.

    2013-06-01

    A new alloy design procedure is proposed, combining in a single computational tool several modelling and predictive techniques that have already been used and assessed in the field of materials science and alloy design: a genetic algorithm is used to optimize the alloy composition for target properties and performance on the basis of the prediction of mechanical properties (estimated by Gaussian process regression of data on existing alloys) and of microstructural constitution, stability and processability (evaluated by computational thermodynamics). These tools are integrated in a unique Matlab programme. An example is given in the case of the design of a new nickel-base superalloy for future power plant applications (such as the ultra-supercritical (USC) coal-fired plant or the high-temperature gas-cooled nuclear reactor (HTGCR or HTGR)), where the selection criteria include cost, oxidation and creep resistance around 750 °C, long-term stability at service temperature, forgeability, weldability, etc.
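
    The sketch below illustrates the coupling described above with a toy two-variable composition space: a Gaussian process trained on synthetic "existing alloy" data serves as the property model, and a small mutation-only genetic algorithm searches for the best predicted composition. The real tool is written in Matlab and adds computational thermodynamics constraints, which are omitted here.

      # Minimal sketch (illustrative, not the authors' Matlab tool): a Gaussian
      # process surrogate drives a tiny genetic algorithm over a toy
      # two-variable composition space with a synthetic property dataset.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(0)
      # toy data: 2 composition variables, 1 synthetic "creep strength" property
      X = rng.uniform(0.0, 1.0, size=(60, 2))
      y = 100.0 * np.exp(-((X[:, 0] - 0.3) ** 2 + (X[:, 1] - 0.7) ** 2) / 0.05)

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                    normalize_y=True).fit(X, y)

      pop = rng.uniform(0.0, 1.0, size=(40, 2))          # initial random population
      for generation in range(50):
          fitness = gp.predict(pop)                      # surrogate evaluation
          parents = pop[np.argsort(fitness)[-20:]]       # selection: keep best half
          children = parents[rng.integers(0, 20, 20)] \
                     + rng.normal(0, 0.05, (20, 2))      # mutation only, for brevity
          pop = np.clip(np.vstack([parents, children]), 0.0, 1.0)

      best = pop[np.argmax(gp.predict(pop))]
      print("best composition found:", best)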

  11. Computer-based knowledge extraction tool: a step in the development of a cognitive skills tutor

    SciTech Connect

    Stoddard, M.L.; Kern, R.P.; Emerson, J.D.

    1986-01-01

    Los Alamos National Laboratory, under the sponsorship of the Army Research Institute, is developing an experimental computer-tutor to be used as part of the Armor Officer's Basic Course at Fort Knox, Kentucky. The tutor's objective is to train students to apply the types of cognitive processing strategies needed to more effectively organize their knowledge for application in planning and conducting tactical operations. The tutor is being developed through an iterative process with the first phase being a knowledge extraction computer-based tool. Student knowledge organization in this domain will be obtained through collection of online and offline performance data. The tool is designed to obtain the knowledge organization through a motivating, realistic tactical operations exercise. 18 refs.

  12. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Technical Reports Server (NTRS)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  13. Mobile computing device as tools for college student education: a case on flashcards application

    NASA Astrophysics Data System (ADS)

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to memorize large amounts of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology have enabled flashcards to be viewed on computers, for example as slides in PowerPoint, serving as channels of drill and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices. For example, they use their mobile phones much more intensively every day. The trend of using mobile phones as educational tools is analyzed and an educational technology initiative is proposed, which uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flash cards.

  14. Distributed computing as a virtual supercomputer: Tools to run and manage large-scale BOINC simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Harvey, M. J.; de Fabritiis, Gianni

    2010-08-01

    Distributed computing (DC) projects tackle large computational problems by exploiting the donated processing power of thousands of volunteered computers, connected through the Internet. To efficiently employ the computational resources of one of the world's largest DC efforts, GPUGRID, the project scientists require tools that handle hundreds of thousands of tasks which run asynchronously and generate gigabytes of data every day. We describe RBoinc, an interface that allows computational scientists to embed the DC methodology into the daily work-flow of high-throughput experiments. By extending the Berkeley Open Infrastructure for Network Computing (BOINC), the leading open-source middleware for current DC projects, with mechanisms to submit and manage large-scale distributed computations from individual workstations, RBoinc turns distributed grids into cost-effective virtual resources that can be employed by researchers in work-flows similar to conventional supercomputers. The GPUGRID project is currently using RBoinc for all of its in silico experiments based on molecular dynamics methods, including the determination of binding free energies and free energy profiles in all-atom models of biomolecules.

  15. Computer tools in the discovery of HIV-I integrase inhibitors

    PubMed Central

    Liao, Chenzhong; Nicklaus, Marc C

    2010-01-01

    Computer-aided drug design (CADD) methodologies have made great advances and contributed significantly to the discovery and/or optimization of many clinically used drugs in recent years. CADD tools have likewise been applied to the discovery of inhibitors of HIV-I integrase, a difficult and worthwhile target for the development of efficient anti-HIV drugs. This article reviews the application of CADD tools, including pharmacophore search, quantitative structure–activity relationships, model building of integrase complexed with viral DNA and quantum-chemical studies in the discovery of HIV-I integrase inhibitors. Different structurally diverse integrase inhibitors have been identified by, or with significant help from, various CADD tools. PMID:21426160

  16. On the Development of a Computer Based Diagnostic Assessment Tool to Help in Teaching and Learning Process

    ERIC Educational Resources Information Center

    Ahmad, Afaq; Al-Mashari, Ahmed; Al-Lawati, Ali

    2010-01-01

    This paper presents a computer based diagnostic tool developed to facilitate the learning process. The developed tool is capable of generating possible error syndromes associated with the answers received. The developed tool simulates the error pattern of the test results and then accordingly models the action plan to help in children's learning…

  17. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  18. Accelerating Design of Batteries Using Computer-Aided Engineering Tools (Presentation)

    SciTech Connect

    Pesaran, A.; Kim, G. H.; Smith, K.

    2010-11-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  19. Fecal Calprotectin is an Accurate Tool and Correlated to Seo Index in Prediction of Relapse in Iranian Patients With Ulcerative Colitis

    PubMed Central

    Hosseini, Seyed Vahid; Jafari, Peyman; Taghavi, Seyed Alireza; Safarpour, Ali Reza; Rezaianzadeh, Abbas; Moini, Maryam; Mehrabi, Manoosh

    2015-01-01

    Background: The natural clinical course of Ulcerative Colitis (UC) is characterized by episodes of relapse and remission. Fecal Calprotectin (FC) is a relatively new marker of intestinal inflammation and is an available, inexpensive tool for predicting relapse of quiescent UC. The Seo colitis activity index is a clinical index for assessment of the severity of UC. Objectives: The present study aimed to evaluate the accuracy of FC and the Seo colitis activity index and their correlation in prediction of UC exacerbation. Patients and Methods: In this prospective cohort study, 157 patients with clinical and endoscopic diagnosis of UC, selected randomly from 1273 registered patients in Fars province’s IBD registry center in Shiraz, Iran, were followed from October 2012 to October 2013 for 12 months, or until relapse if it occurred earlier. Two patients left the study before completion and one patient had a relapse because of discontinuation of drugs. The participants' clinical and serum factors were evaluated every three months. Furthermore, stool samples were collected at the beginning of the study and every three months, and FC concentration (commercially available enzyme linked immunoassay) and the Seo Index were assessed. Univariate analysis, multivariable logistic regression, receiver operating characteristic (ROC) curve analysis, and Pearson’s correlation test (r) were then used for statistical analysis of the data. Results: According to the results, 74 patients (48.1%) relapsed during the follow-up (33 men and 41 women). Mean ± SD of FC was 862.82 ± 655.97 μg/g and 163.19 ± 215.85 μg/g in relapsing and non-relapsing patients, respectively (P < 0.001). Multiple logistic regression analysis revealed that age, number of previous relapses, FC and the Seo index were significant predictors of relapse. ROC curve analysis of FC level and Seo activity index for prediction of relapse demonstrated areas under the curve of 0.882 (P < 0.001) and 0.921 (P < 0.001), respectively
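
    The ROC analysis reported above can be reproduced in outline with a few lines of Python; the fecal calprotectin values below are synthetic stand-ins drawn from lognormal distributions, not the study data, and cutoff selection by Youden's index is one common choice rather than necessarily the authors' method.

      # Minimal sketch of the ROC analysis described above, with synthetic
      # fecal calprotectin (FC) values in μg/g.
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(42)
      fc_relapse = rng.lognormal(mean=6.5, sigma=0.8, size=74)     # synthetic relapsing group
      fc_remission = rng.lognormal(mean=4.8, sigma=0.9, size=80)   # synthetic non-relapsing group

      y_true = np.r_[np.ones_like(fc_relapse), np.zeros_like(fc_remission)]
      scores = np.r_[fc_relapse, fc_remission]

      auc = roc_auc_score(y_true, scores)
      fpr, tpr, thresholds = roc_curve(y_true, scores)
      # pick the cutoff maximizing Youden's J = sensitivity + specificity - 1
      best = np.argmax(tpr - fpr)
      print("AUC = %.3f, cutoff = %.0f μg/g, sensitivity = %.2f, specificity = %.2f"
            % (auc, thresholds[best], tpr[best], 1 - fpr[best]))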

  20. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool to teach parallel programming. In this paper, we will focus on some fundamental features of aCe C.

  1. General purpose computational tools for simulation and analysis of medium-energy backscattering spectra

    NASA Astrophysics Data System (ADS)

    Weller, Robert A.

    1999-06-01

    This paper describes a suite of computational tools for general-purpose ion-solid calculations, which has been implemented in the platform-independent computational environment Mathematica®. Although originally developed for medium energy work (beam energies < 300 keV), they are suitable for general, classical, non-relativistic calculations. Routines are available for stopping power, Rutherford and Lenz-Jensen (screened) cross sections, sputtering yields, small-angle multiple scattering, and back-scattering-spectrum simulation and analysis. Also included are a full range of supporting functions, as well as easily accessible atomic mass and other data on all the stable isotopes in the periodic table. The functions use common calling protocols, recognize elements and isotopes by symbolic names and, wherever possible, return symbolic results for symbolic inputs, thereby facilitating further computation. A new paradigm for the representation of backscattering spectra is introduced.
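
    As an illustration of the kind of routine the suite provides (written here in Python rather than Mathematica, and not taken from the package itself), the unscreened Rutherford differential cross-section in the center-of-mass frame can be computed as follows; at medium energies the screened Lenz-Jensen form mentioned above becomes important, especially at small angles.

      # Minimal sketch (not Weller's Mathematica suite): unscreened Rutherford
      # differential cross-section in the center-of-mass frame, the kind of
      # routine the toolkit provides alongside screened (Lenz-Jensen) forms.
      import math

      E2_EV_NM = 1.439964   # e^2/(4*pi*eps0) in eV*nm

      def rutherford_cm(z1, z2, energy_ev, theta_deg):
          """d(sigma)/d(Omega) in nm^2/sr for projectile Z1 on target Z2 at the
          given center-of-mass energy (eV) and scattering angle (degrees)."""
          theta = math.radians(theta_deg)
          a = z1 * z2 * E2_EV_NM / (4.0 * energy_ev)   # distance-like factor, nm
          return (a / math.sin(theta / 2.0) ** 2) ** 2

      # example: 150 keV He (Z=2) backscattered from Au (Z=79) at 160 degrees
      sigma = rutherford_cm(2, 79, 150e3, 160.0)
      print("dsigma/dOmega = %.3e nm^2/sr (= %.1f barn/sr)" % (sigma, sigma * 1e10))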

  2. Investigation of computational aeroacoustic tools for noise predictions of wind turbine aerofoils

    NASA Astrophysics Data System (ADS)

    Humpf, A.; Ferrer, E.; Munduate, X.

    2007-07-01

    In this work, trailing edge noise levels of a research aerofoil have been computed and compared to aeroacoustic measurements using two different approaches. On the one hand, the semi-empirical noise prediction tool NAFNoise [Moriarty P 2005 NAFNoise User's Guide. Golden, Colorado, July. http://wind.nrel.gov/designcodes/simulators/NAFNoise] was used to predict trailing edge noise directly, taking into consideration the nature of the experiments. On the other hand, aerodynamic and aeroacoustic calculations were performed with the full Navier-Stokes CFD code Fluent [Fluent Inc 2005 Fluent 6.2 Users Guide, Lebanon, NH, USA] on the basis of a steady RANS simulation. Aerodynamic characteristics were computed with the aid of various turbulence models, and the implemented broadband noise source models were used in combination to isolate and determine the trailing edge noise level. In this way, two methods of different computational cost have been tested and quantitative and qualitative results obtained.

  3. Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings.

    PubMed

    Adde, Lars; Helbostad, Jorunn; Jensenius, Alexander R; Langaas, Mette; Støen, Ragnhild

    2013-08-01

    This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used. Recordings were analyzed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and areas under the curve were estimated for the first and second recording, or a mean of both. FMs were classified based on the Prechtl approach of general movement assessment. CP status was reported at 2 years. Nine children developed CP, and all of their recordings showed absent FMs. The mean variability of the centroid of motion (CSD) from two recordings was more accurate than using only one recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies. PMID:23343036
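
    In the spirit of the frame-difference movement variables described above, the sketch below computes a quantity of motion and the variability of the centroid of motion from a placeholder grayscale video array; it is a simplified illustration, not the clinical software used in the study.

      # Minimal sketch of frame-difference motion analysis (not the actual
      # clinical software).  "frames" stands in for a grayscale video as a
      # (T, H, W) uint8 array.
      import numpy as np

      rng = np.random.default_rng(0)
      frames = rng.integers(0, 255, size=(300, 120, 160), dtype=np.uint8)  # placeholder video

      diffs = np.abs(frames[1:].astype(np.int16) - frames[:-1].astype(np.int16))
      motion = diffs > 30                       # threshold out sensor noise

      ys, xs = np.mgrid[0:frames.shape[1], 0:frames.shape[2]]
      centroids = []
      for m in motion:
          if m.any():                           # centroid of the moving pixels per frame
              centroids.append((xs[m].mean(), ys[m].mean()))

      quantity_of_motion = motion.mean(axis=(1, 2))     # fraction of pixels moving
      centroids = np.array(centroids)
      csd = centroids.std(axis=0)               # variability of the centroid of motion

      print("mean quantity of motion: %.3f" % quantity_of_motion.mean())
      print("centroid-of-motion SD (x, y): %.1f, %.1f pixels" % (csd[0], csd[1]))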

  4. Ring polymer molecular dynamics fast computation of rate coefficients on accurate potential energy surfaces in local configuration space: Application to the abstraction of hydrogen from methane

    NASA Astrophysics Data System (ADS)

    Meng, Qingyong; Chen, Jun; Zhang, Dong H.

    2016-04-01

    To compute rate coefficients of the H/D + CH4 → H2/HD + CH3 reactions quickly and accurately, we propose a segmented strategy for fitting a suitable potential energy surface (PES), on which ring-polymer molecular dynamics (RPMD) simulations are performed. On the basis of the recently developed permutation invariant polynomial neural-network approach [J. Li et al., J. Chem. Phys. 142, 204302 (2015)], PESs in local configuration spaces are constructed. In this strategy, the global PES is divided into three parts along the reaction coordinate: asymptotic, intermediate, and interaction parts. Since fewer fitting parameters are involved in the local PESs, the computational efficiency of evaluating the PES routine is enhanced by a factor of ~20 compared with the global PES. On the interaction part, the RPMD computational time for the transmission coefficient can be further reduced by cutting off the redundant part of the child trajectories. For H + CH4, good agreement is found among the present RPMD rates, those from previous simulations, and experimental results. For D + CH4, on the other hand, qualitative agreement between the present RPMD and experimental results is predicted.

  5. Investigation of the "Convince Me" Computer Environment as a Tool for Critical Argumentation about Public Policy Issues

    ERIC Educational Resources Information Center

    Adams, Stephen T.

    2003-01-01

    The "Convince Me" computer environment supports critical thinking by allowing users to create and evaluate computer-based representations of arguments. This study investigates theoretical and design considerations pertinent to using "Convince Me" as an educational tool to support reasoning about public policy issues. Among computer environments…

  6. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  7. Complexities of learning with computer-based tools: A case of inquiry about sound and music in elementary school

    NASA Astrophysics Data System (ADS)

    Magnusson, Shirley J.

    1996-12-01

    Computer-based technology is increasingly becoming available for students at all grade levels in schools, and its promise and power as a learning tool are being extolled by many. From a constructivist perspective, if individuals actively construct meaning from their experiences, then simply having particular tools to work with via a computer doesn't ensure that desired learning will result. Thus, it is important to examine how students construct meaning while using such tools. This study examined what fourth grade students learned from the use of two computer-based tools intended to help them understand sound and music: software that emulated an oscilloscope and allowed students to view sound waves from audio input; and software that turned the computer into an electronic keyboard, which provided students with standard pitches for comparison purposes. Principles of selective attention and prior knowledge and experiences—foundational ideas of a constructivist epistemology—were useful in understanding learning outcomes from inquiry with these tools. Our findings provide critical information for future instruction with the goal of supporting learning about sound and music from such tools. They also indicate the need for more studies examining learning from computer-based tools in specific contexts, to advance our understanding of how teachers can mediate student activity with computer-based tools to support the development of conceptual understanding.

  8. Assessing students' learning and decision-making skills using high performance web-based computational tools

    NASA Astrophysics Data System (ADS)

    Martin, Akilah

    Using web-based computational tools in classrooms in conjunction with advanced computing models provides the opportunity for students to learn about large-scale processes, such as state, regional, and global environmental issues that are difficult to incorporate into student learning exercises with present basic models. These tools aided in bridging the gap between multi-field scale models and enhanced student learning. The expectations were that students would improve their decision-making skills by solving realistic and large scale (multi-field conditions) environmental issues that were made possible through faster computation time, larger datasets, larger scale (multi-field), and predictions over longer time periods using the Century soil organic carbon model. The Century Model was linked to a web-based series of functional pages through which students could run the model. In this project, 239 undergraduate students' learning and decision-making skills using high performance classroom computing tools were assessed. Among the many Century Model parameters, the students were able to alter four variables (climate, crop, tillage, and soil texture). Students were able to simulate several scenarios simultaneously. The results of the study revealed that the pretest for the four courses combined was significant (P < 0.05), meaning that the pretest was a major contributor to their increased posttest score. Although the scenario scale (multi-field conditions vs. single field conditions) factor was not statistically significant, the students completing the multi-field scenario assignment scored higher on the posttest and also had a higher increase in points from pretest to posttest. Overall, these results revealed that the tool provided had a positive impact on the students' learning, which was evident in their enhanced pretest to posttest score and also their perceptions from the written evaluation they provided. Most students felt that the project was a good learning

  9. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
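
    The bookkeeping problem FOMOCO addresses can be illustrated with a toy example: if overlapping surface panels are integrated naively, the overlapped area is counted more than once, so each panel needs a weight that sums to one across grids. The numbers below are invented, and FOMOCO's MIXSUR module actually builds a hybrid non-overlapping surface grid rather than applying simple weights.

      # Minimal sketch of overlap-aware force integration over surface panels.
      # Panel data are made up; this is not FOMOCO's algorithm, only the
      # underlying accounting idea.
      import numpy as np

      # For each surface panel: area, unit normal, pressure coefficient, and a
      # weight in [0, 1] that is 1 for panels owned by a single grid and split
      # (e.g., 0.5/0.5) where two grids overlap, so overlapped area counts once.
      area   = np.array([1.0, 1.0, 0.8, 0.8, 1.2])
      normal = np.array([[0, 0, 1], [0, 0, 1], [0, 1, 0], [0, 1, 0], [0, 0, 1]], float)
      cp     = np.array([-0.8, -0.6, 0.2, 0.3, -0.5])
      weight = np.array([1.0, 0.5, 1.0, 0.5, 1.0])   # overlapped panels down-weighted

      # non-dimensional force coefficient vector: sum of -cp * n * dA, weighted
      force = -(weight * cp * area)[:, None] * normal
      C_total = force.sum(axis=0)
      print("integrated force coefficient vector:", C_total)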

  10. Integrated high performance computational tools for simulations of transport and diffusion of contaminants in urban areas

    NASA Astrophysics Data System (ADS)

    Aliabadi, S.; Tu, S.; Watts, M.; Ji, A.; Johnson, A.

    2006-05-01

    Rapid analysis of transport and diffusion of chemical and biological aerosols and contaminants in an urban environment is a critical capability for any homeland security response team. High performance computing (HPC) is a valuable technique for such analysis. Creating fully developed, complex 3D city terrain models to support such dispersion simulations under tight time constraints requires converting agency data to the format needed by the simulation platform. Numerous data sets have been employed in the development of complex 3D city models. Such data include multi-layer building morphology data, geographic information system (GIS) based shapefiles and digital elevation models (DEM), and remote sensing data such as Light Detection and Ranging (LIDAR). The constructed geometry models are used to generate large-scale computational domains on a platform that supports our HPC tools. These tools include fully automated unstructured mesh generation, parallel and scalable flow solvers based on stabilized finite element formulations, and a remote client-server environment for large-scale flow visualization. The stabilized finite element formulations, which are based on the SUPG and PSPG techniques, are parallelized and vectorized on the Cray X1. The 3D validation problem involves transient simulation of flow past a building with a source point releasing traces. A 3D application problem is presented to demonstrate the capability of the integrated HPC tools.
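
    The underlying transport-and-diffusion problem can be summarized by the standard advection-diffusion equation for a contaminant concentration c, given here as a generic statement rather than the stabilized finite-element formulation itself:

      \frac{\partial c}{\partial t} + \mathbf{u} \cdot \nabla c
      = \nabla \cdot ( D \, \nabla c ) + S

    Here u is the computed wind field, D an effective (turbulent) diffusivity, and S the source term for the point release; the SUPG/PSPG stabilization mentioned above addresses the numerical difficulties of the advection-dominated case on unstructured meshes.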

  11. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  12. PipTools: a computational toolkit to annotate and analyze pairwise comparisons of genomic sequences.

    PubMed

    Elnitski, Laura; Riemer, Cathy; Petrykowska, Hanna; Florea, Liliana; Schwartz, Scott; Miller, Webb; Hardison, Ross

    2002-12-01

    Sequence conservation between species is useful both for locating coding regions of genes and for identifying functional noncoding segments. Hence interspecies alignment of genomic sequences is an important computational technique. However, its utility is limited without extensive annotation. We describe a suite of software tools, PipTools, and related programs that facilitate the annotation of genes and putative regulatory elements in pairwise alignments. The alignment server PipMaker uses the output of these tools to display detailed information needed to interpret alignments. These programs are provided in a portable format for use on common desktop computers and both the toolkit and the PipMaker server can be found at our Web site (http://bio.cse.psu.edu/). We illustrate the utility of the toolkit using annotation of a pairwise comparison of the mouse MHC class II and class III regions with orthologous human sequences and subsequently identify conserved, noncoding sequences that are DNase I hypersensitive sites in chromatin of mouse cells. PMID:12504859

  13. Computational tools for the interpretation of electron spin resonance spectra in solution

    NASA Astrophysics Data System (ADS)

    Zerbetto, Mirco; Licari, Daniele; Barone, Vincenzo; Polimeno, Antonino

    2013-10-01

    Spectroscopic observables can be used for monitoring relaxation processes of molecules. In particular, electron spin resonance of stable multi-radicals is sensitive to the details of the rotational and internal dynamics in rigid and flexible molecules. Integration with advanced theoretical/computational methods proves to be particularly effective to acquire direct information on long-range relaxation processes, based on molecular dynamics, multi-scale approaches and coarse-graining treatments. Together, experimental data and computational interpretation provide a way to understand the effect of chemical changes on specific systems. In this paper we review computational tools aimed at the characterisation of dynamical properties of molecules gathered from electron spin resonance measurements. Stochastic models are employed, based on a number of structural parameters that are calculated at atomistic and/or mesoscopic level depending on their nature. Open source software tools built as user-friendly 'virtual spectroscopes' targeted for use by experimentalists are provided as a kind of extension of the laboratory equipment. An overview of their range of applicability is provided.

  14. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  15. Java and Vector Graphics Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, Eric; McMahon, Erin; Hix, Raph; Guidry, Mike; Smith, Michael

    2002-08-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as a stand-alone application or distributed across a network. These tools, which can have broad applications in general scientific visualization, are currently being used to explore and compare various reaction rates, set up and run explosive nucleosynthesis calculations, and visualize these results with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to include new rates and adjust current ones. Setup and initialization of a nucleosynthesis calculation is through an intuitive graphical interface. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic reduction in visualization file sizes while maintaining high visual quality and interactive control. The use of these tools for other applications will also be mentioned.

  16. Validation of space/ground antenna control algorithms using a computer-aided design tool

    NASA Technical Reports Server (NTRS)

    Gantenbein, Rex E.

    1995-01-01

    The validation of the algorithms for controlling the space-to-ground antenna subsystem for Space Station Alpha is an important step in assuring reliable communications. These algorithms have been developed and tested using a simulation environment based on a computer-aided design tool that can provide a time-based execution framework with variable environmental parameters. Our work this summer has involved the exploration of this environment and the documentation of the procedures used to validate these algorithms. We have installed a variety of tools in a laboratory of the Tracking and Communications division for reproducing the simulation experiments carried out on these algorithms to verify that they do meet their requirements for controlling the antenna systems. In this report, we describe the processes used in these simulations and our work in validating the tests used.

  17. Angular Determination of Toolmarks Using a Computer-Generated Virtual Tool.

    PubMed

    Spotts, Ryan; Chumbley, L Scott; Ekstrand, Laura; Zhang, Song; Kreiser, James

    2015-07-01

    A blind study to determine whether virtual toolmarks created using a computer could be used to identify and characterize the angle of incidence of physical toolmarks was conducted. Six sequentially manufactured screwdriver tips and one random screwdriver were used to create toolmarks at various angles. An apparatus controlled the tool angle. Resultant toolmarks were randomly coded and sent to the researchers, who scanned both tips and toolmarks using an optical profilometer to obtain 3D topography data. Developed software was used to create virtual marks based on the tool topography data. Virtual marks generated at angles from 30 to 85° (5° increments) were compared to physical toolmarks using a statistical algorithm. Twenty of twenty toolmarks were correctly identified by the algorithm. On average, the algorithm misidentified the correct angle of incidence by -6.12°. This study presents the results and their significance, and offers reasons for the average angular misidentification. PMID:25929523
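
    The core comparison idea — score a physical mark against virtual marks generated at candidate angles and report the best match — can be sketched as follows. The normalized cross-correlation used here is a stand-in for the study's own statistical algorithm, and the 1-D depth-profile format is a hypothetical simplification of the 3D topography data.

```python
import numpy as np

def best_angle(physical_profile, virtual_profiles):
    """Return the candidate angle whose virtual mark correlates best with the
    physical mark, plus all scores.

    physical_profile: 1-D depth profile of the physical toolmark
    virtual_profiles: dict mapping angle (degrees) -> simulated 1-D profile
    """
    scores = {}
    for angle, virtual in virtual_profiles.items():
        n = min(len(physical_profile), len(virtual))
        a = np.asarray(physical_profile[:n], float)
        b = np.asarray(virtual[:n], float)
        a -= a.mean()
        b -= b.mean()
        # normalized cross-correlation at zero lag
        scores[angle] = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(scores, key=scores.get), scores
```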

  18. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  19. Defining a Standard for Reporting Digital Evidence Items in Computer Forensic Tools

    NASA Astrophysics Data System (ADS)

    Bariki, Hamda; Hashmi, Mariam; Baggili, Ibrahim

    Due to the lack of standards in reporting digital evidence items, investigators are facing difficulties in efficiently presenting their findings. This paper proposes a standard for digital evidence to be used in reports that are generated using computer forensic software tools. The authors focused on developing a standard for digital evidence items by surveying various digital forensic tools while keeping in mind the legal integrity of digital evidence. Additionally, an online questionnaire was used to gain the opinion of knowledgeable and experienced stakeholders in the digital forensics domain. Based on the findings, the authors propose a standard for digital evidence items that includes data about the case, the evidence source, the evidence item, and the chain of custody. Research results enabled the authors to create a defined XML schema for digital evidence items.

  20. A simple tool for the computation of the stream-aquifer coefficient.

    NASA Astrophysics Data System (ADS)

    Cousquer, Yohann; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    Most groundwater models consider a river network in interaction with aquifers, where the stream-aquifer boundary is usually modeled with a Cauchy-type boundary condition. This condition is parameterized with the so-called "river coefficient", which is a lumped parameter representing the effects of numerous geometric and hydrodynamic controlling factors. The value of the river coefficient is essential for the quantification of stream-aquifer flow but is challenging to determine. In recent years, many formulations for the river coefficient have been proposed from analytical and numerical approaches. However, these methods are either too simple to be realistic or too complex to be easily implemented by groundwater modelers. We propose a simple tool to infer the value of the river coefficient from a fine-grid numerical model. This tool allows the simple and fast computation of the river coefficient with various stream geometries and hydraulic parameters. A Python-based pre- and post-processor has been developed, which reduces the contribution of the operator to the definition of the model parameters: river geometry and aquifer properties. The numerical model is implemented with the USGS SUTRA finite element model and considers an aquifer in interaction with a stream in a 2D vertical cross-section. A Dirichlet-type boundary condition is imposed at the stream-aquifer interface. The linearity between the stream-aquifer flow and the head difference between the river and the aquifer has been verified. For a given parameter set, the value of the river coefficient is estimated by linear regression for different values of head difference between the river and the aquifer. The innovation is that the mesh size of the regional model is also considered for the computation of the river coefficient. This tool has been used to highlight the importance of parameters that were usually neglected for the computation of the river coefficient. The results of this work will be made available to the
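
    The regression step described above reduces to fitting the slope of the simulated stream-aquifer flow against the imposed head difference, since the Cauchy condition takes the form Q = C (h_river - h_aquifer). A minimal sketch, assuming the fine-grid model has already produced (head difference, flow) pairs; the function and variable names are illustrative, not part of the actual tool.

```python
import numpy as np

def river_coefficient(head_difference, stream_aquifer_flow):
    """Estimate the lumped river coefficient C in Q = C * (h_river - h_aquifer)
    by least-squares regression through the origin.

    head_difference:     imposed head differences (m)
    stream_aquifer_flow: corresponding flows from the fine-grid model
                         (m^3/s per unit stream length)
    """
    dh = np.asarray(head_difference, dtype=float)
    q = np.asarray(stream_aquifer_flow, dtype=float)
    # slope of the best-fit line constrained to pass through the origin
    return float(np.dot(dh, q) / np.dot(dh, dh))

# example with synthetic, perfectly linear data
dh = np.array([0.1, 0.2, 0.5, 1.0])
q = 2.5e-4 * dh
print(river_coefficient(dh, q))   # ~2.5e-4
```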

  1. AMAS: a fast tool for alignment manipulation and computing of summary statistics

    PubMed Central

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python’s core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189
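
    A few of the statistics listed above are simple to compute directly. The sketch below is not AMAS itself, only an illustration of the kind of per-alignment summary it reports, for an alignment held as a taxon-to-sequence dictionary; the set of characters treated as missing is an assumption.

```python
def alignment_summary(seqs):
    """Report a handful of alignment statistics for seqs = {taxon: aligned_sequence}."""
    length = len(next(iter(seqs.values())))
    cells = len(seqs) * length
    # characters treated as missing/undetermined (an assumed convention)
    missing_chars = set("-?NX")
    missing = sum(1 for s in seqs.values() for c in s.upper() if c in missing_chars)
    variable = sum(
        1 for col in zip(*seqs.values())
        if len({c for c in col if c.upper() not in missing_chars}) > 1
    )
    return {
        "taxa": len(seqs),
        "alignment_length": length,
        "matrix_cells": cells,
        "percent_missing": 100.0 * missing / cells,
        "variable_sites": variable,
    }

aln = {"t1": "ATGC-A", "t2": "ATGCTA", "t3": "ATGATA"}
print(alignment_summary(aln))
```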

  2. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  3. Architecture Framework for Trapped-Ion Quantum Computer based on Performance Simulation Tool

    NASA Astrophysics Data System (ADS)

    Ahsan, Muhammad

    The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of system components. The detailed investigation of performance variation with the physics of the components and the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling the quantum circuit on a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify system components which crucially define the overall performance. Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while success probability is uniformly determined by the fidelity of physical quantum operations, the execution time is a function of the system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit block execution. When these blocks are used to construct meaningful arithmetic circuits such as quantum adders, the number of ancilla qubits for complicated non-Clifford gates and the entanglement resources needed to establish long-distance communication channels become major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit comprising the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with the size of the problem describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology

  4. Performance evaluation of computer aided diagnostic tool (CAD) for detection of ultrasonic based liver disease.

    PubMed

    Sriraam, N; Roopa, J; Saranya, M; Dhanalakshmi, M

    2009-08-01

    Recent advances in digital imaging technology have greatly enhanced the interpretation of critical/pathology conditions from 2-dimensional medical images. This has become realistic due to the existence of the computer aided diagnostic tool. A computer aided diagnostic (CAD) tool generally possesses components like preprocessing, identification/selection of a region of interest, extraction of typical features and finally an efficient classification system. This paper describes the development of a CAD tool for classification of chronic liver disease from 2-D images acquired with an ultrasonic device. Characterization of tissue through quantitative treatment leads to the detection of abnormalities that are not discernible through qualitative visual inspection by the radiologist. Common liver diseases are indicators of changes in tissue elasticity. Normal, fatty, or malignant conditions can be detected with the CAD tool, so that further investigation by the radiologist can be avoided. The proposed work involves optimal block analysis (64 x 64) of the liver image of actual size 256 x 256, incorporating the Gabor wavelet transform for automated texture classification. Statistical features such as the gray-level mean and variance are estimated after this preprocessing step. A non-linear back-propagation neural network (BPNN) is applied to classify normal vs. fatty and normal vs. malignant liver, yielding a classification accuracy of 96.8%. Multi-class classification is also performed, with a classification accuracy of 94%. It can be concluded that the proposed CAD tool can be used as an expert system to aid the automated diagnosis of liver diseases. PMID:19697693
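
    The block-wise texture-feature step can be sketched as follows: filter the image with a Gabor kernel, then take the mean and variance of the response in each 64 x 64 block. The kernel parameters and the single orientation/frequency are illustrative assumptions, not the paper's tuned filter bank, and the classifier stage is omitted.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(frequency, theta, sigma=4.0, size=21):
    """Real part of a Gabor kernel with spatial frequency in cycles/pixel
    and orientation theta in radians."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * frequency * xr)

def block_features(image, block=64, frequency=0.1, theta=0.0):
    """Gabor-filter the image and return (mean, variance) of the response
    for each non-overlapping block x block tile."""
    response = convolve2d(image, gabor_kernel(frequency, theta), mode="same")
    features = []
    for i in range(0, image.shape[0] - block + 1, block):
        for j in range(0, image.shape[1] - block + 1, block):
            tile = response[i:i + block, j:j + block]
            features.append((float(tile.mean()), float(tile.var())))
    return features

# a 256 x 256 ultrasound image yields 16 blocks of 64 x 64
img = np.random.rand(256, 256)
print(len(block_features(img)))   # 16
```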

  5. A semi-automated computer tool for the analysis of retinal vessel diameter dynamics.

    PubMed

    Euvrard, Guillaume; Genevois, Olivier; Rivals, Isabelle; Massin, Pascale; Collet, Amélie; Sahel, José-Alain; Paques, Michel

    2013-06-01

    Retinal vessels are directly accessible to clinical observation. This has numerous potential interests for medical investigations. Using the Retinal Vessel Analyzer, a dedicated eye fundus camera enabling dynamic, video-rate recording of micrometric changes of the diameter of retinal vessels, we developed a semi-automated computer tool that extracts the heart beat rate and pulse amplitude values from the records. The extracted data enabled us to show that there is a decreasing relationship between heart beat rate and pulse amplitude of arteries and veins. Such an approach will facilitate the modeling of hemodynamic interactions in small vessels. PMID:23566397

  6. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  7. A review of diffusion tensor magnetic resonance imaging computational methods and software tools.

    PubMed

    Hasan, Khader M; Walimuni, Indika S; Abid, Humaira; Hahn, Klaus R

    2011-12-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  8. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

    SciTech Connect

    Sottille, Matthew

    2013-09-12

    This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

  9. Unicursal random maze tool path for computer-controlled optical surfacing.

    PubMed

    Wang, Chunjin; Wang, Zhenzhong; Xu, Qiao

    2015-12-01

    A novel unicursal random maze tool path is proposed in this paper, which not only provides uniform coverage of the polishing surface but also possesses randomness and multidirectionality. Simulation experiments along with practical polishing experiments are conducted to compare three kinds of paths: the maze path, the raster path, and the Hilbert path. The experimental results confirm that the maze path ensures uniform polishing and avoids the appearance of periodic structures in the polished surface. It is also more effective than the Hilbert path in restraining the mid-spatial-frequency error in the computer-controlled optical surfacing process. PMID:26836670

  10. Computing 1-D atomic densities in macromolecular simulations: The density profile tool for VMD

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni

    2014-01-01

    Molecular dynamics simulations have a prominent role in biophysics and drug discovery due to the atomistic information they provide on the structure, energetics and dynamics of biomolecules. Specialized software packages are required to analyze simulated trajectories, either interactively or via scripts, to derive quantities of interest and provide insight for further experiments. This paper presents the Density Profile Tool, a package that enhances the Visual Molecular Dynamics environment with the ability to interactively compute and visualize 1-D projections of various density functions of molecular models. We describe how the plugin is used to perform computations both via a graphical interface and programmatically. Results are presented for realistic examples, all-atom bilayer models, showing how mass and electron densities readily provide measurements such as membrane thickness, location of structural elements, and how they compare to X-ray diffraction experiments.
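
    The underlying computation of a 1-D profile is a weighted histogram along the projection axis, normalized by the slab volume. The sketch below is a NumPy-only illustration of that idea, not the VMD plugin itself; an electron density profile would simply use electron counts instead of masses as weights.

```python
import numpy as np

def mass_density_profile(z_coords, masses, area, bins=100, z_range=None):
    """1-D mass density along z for a set of atoms.

    z_coords: z coordinates of the atoms (Angstrom)
    masses:   atomic masses (amu)
    area:     cross-sectional box area in the xy plane (Angstrom^2)
    Returns bin centers and densities in amu / Angstrom^3.
    """
    hist, edges = np.histogram(z_coords, bins=bins, range=z_range, weights=masses)
    dz = edges[1] - edges[0]
    density = hist / (area * dz)              # mass per slab volume
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density
```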

  11. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  12. Fast and Accurate Data Extraction for Near Real-Time Registration of 3-D Ultrasound and Computed Tomography in Orthopedic Surgery.

    PubMed

    Brounstein, Anna; Hacihaliloglu, Ilker; Guy, Pierre; Hodgson, Antony; Abugharbieh, Rafeef

    2015-12-01

    Automatic, accurate and real-time registration is an important step in providing effective guidance and successful anatomic restoration in ultrasound (US)-based computer assisted orthopedic surgery. We propose a method in which local phase-based bone surfaces, extracted from intra-operative US data, are registered to pre-operatively segmented computed tomography data. Extracted bone surfaces are downsampled and reinforced with high curvature features. A novel hierarchical simplification algorithm is used to further optimize the point clouds. The final point clouds are represented as Gaussian mixture models and iteratively matched by minimizing the dissimilarity between them using an L2 metric. For 44 clinical data sets from 25 pelvic fracture patients and 49 phantom data sets, we report mean surface registration accuracies of 0.31 and 0.77 mm, respectively, with an average registration time of 1.41 s. Our results suggest the viability and potential of the chosen method for real-time intra-operative registration in orthopedic surgery. PMID:26365924
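
    The dissimilarity being minimized has a closed form when both point clouds are represented as equal-weight isotropic Gaussian mixtures. The sketch below evaluates that (squared) L2 measure for two fixed clouds; the actual method additionally downsamples, reinforces high-curvature features, and searches over rigid transformations, and the bandwidth sigma here is an assumed parameter.

```python
import numpy as np

def gaussian_overlap(A, B, sigma):
    """Sum over all pairs of int N(x; a, s^2 I) N(x; b, s^2 I) dx
    = (4 pi s^2)^(-d/2) exp(-|a - b|^2 / (4 s^2))."""
    d = A.shape[1]
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return ((4.0 * np.pi * sigma**2) ** (-d / 2.0) *
            np.exp(-sq_dists / (4.0 * sigma**2))).sum()

def gmm_l2_dissimilarity(A, B, sigma=1.0):
    """Squared L2 norm of the difference between two equal-weight isotropic
    Gaussian mixtures built on point clouds A (m x d) and B (n x d)."""
    m, n = len(A), len(B)
    return (gaussian_overlap(A, A, sigma) / m**2
            - 2.0 * gaussian_overlap(A, B, sigma) / (m * n)
            + gaussian_overlap(B, B, sigma) / n**2)

# identical clouds give ~0; displacing one increases the dissimilarity
A = np.random.rand(50, 3)
print(gmm_l2_dissimilarity(A, A))         # ~0
print(gmm_l2_dissimilarity(A, A + 0.5))   # > 0
```

    In a registration loop, this scalar would be minimized over the rotation and translation applied to one of the clouds.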

  13. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  14. Analysis and computer tools for separation processes involving nonideal mixtures. Progress report, December 1, 1989--November 30, 1992

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  15. PepBind: a comprehensive database and computational tool for analysis of protein-peptide interactions.

    PubMed

    Das, Arindam Atanu; Sharma, Om Prakash; Kumar, Muthuvel Suresh; Krishna, Ramadas; Mathur, Premendu P

    2013-08-01

    Protein-peptide interactions, where one partner is a globular protein (domain) and the other is a flexible linear peptide, are key components of cellular processes predominantly in signaling and regulatory networks, hence are prime targets for drug design. To derive the details of the protein-peptide interaction mechanism is often a cumbersome task, though it can be made easier with the availability of specific databases and tools. The Peptide Binding Protein Database (PepBind) is a curated and searchable repository of the structures, sequences and experimental observations of 3100 protein-peptide complexes. The web interface contains a computational tool, protein inter-chain interaction (PICI), for computing several types of weak or strong interactions at the protein-peptide interaction interface and visualizing the identified interactions between residues in Jmol viewer. This initial database release focuses on providing protein-peptide interface information along with structure and sequence information for protein-peptide complexes deposited in the Protein Data Bank (PDB). Structures in PepBind are classified based on their cellular activity. More than 40% of the structures in the database are found to be involved in different regulatory pathways and nearly 20% in the immune system. These data indicate the importance of protein-peptide complexes in the regulation of cellular processes. PMID:23896518

  16. Smartphone qualification & linux-based tools for CubeSat computing payloads

    NASA Astrophysics Data System (ADS)

    Bridges, C. P.; Yeomans, B.; Iacopino, C.; Frame, T. E.; Schofield, A.; Kenyon, S.; Sweeting, M. N.

    Modern computers are now far in advance of satellite systems, and leveraging these technologies for space applications could lead to cheaper and more capable spacecraft. Together with NASA Ames's PhoneSat, the STRaND-1 nanosatellite team has been developing and designing new ways to bring smart-phone technologies to the popular CubeSat platform whilst mitigating numerous risks. Surrey Space Centre (SSC) and Surrey Satellite Technology Ltd. (SSTL) have led in qualifying state-of-the-art COTS technologies and capabilities - contributing to numerous low-cost satellite missions. The focus of this paper is to answer whether 1) modern smart-phone software is suitable for the fast and low-cost development required by CubeSats, and 2) the components utilised are robust to the space environment. The STRaND-1 smart-phone payload software explored in this paper is integrated using various open-source Linux tools and generic interfaces found in terrestrial systems. A major result from our developments is that many existing software and hardware processes are more than sufficient to provide autonomous and operational payload object-to-object and file-based management solutions. The paper will provide the methodologies and software chains and tools used for the STRaND-1 smartphone computing platform, the hardware built with space qualification results (thermal, thermal vacuum, and TID radiation), and how they can be implemented in future missions.

  17. Online object oriented Monte Carlo computational tool for the needs of biomedical optics

    PubMed Central

    Doronin, Alexander; Meglinski, Igor

    2011-01-01

    Conceptual engineering design and optimization of laser-based imaging techniques and optical diagnostic systems used in the field of biomedical optics requires a clear understanding of the light-tissue interaction and the peculiarities of localization of the detected optical radiation within the medium. The description of photon migration within turbid tissue-like media is based on the concept of radiative transfer, which forms the basis of Monte Carlo (MC) modeling. The opportunity for direct simulation of the influence of structural variations of biological tissues on the probing light makes MC a primary tool for biomedical optics and optical engineering. Due to the diversity of optical modalities utilizing different properties of light and mechanisms of light-tissue interaction, a new MC code typically has to be developed for each particular diagnostic application. In the current paper, introducing an object-oriented concept of MC modeling and utilizing modern web applications, we present a generalized online computational tool suitable for the major applications in biophotonics. The computation is supported by an NVIDIA CUDA Graphics Processing Unit, providing acceleration of the modeling by up to 340 times. PMID:21991540
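
    A minimal flavor of the underlying Monte Carlo photon migration: isotropic scattering, exponentially distributed step lengths and weighted absorption in a semi-infinite medium, scoring photons that escape back through the surface. This is a crude CPU sketch for illustration only; the tool described above is object oriented, GPU accelerated, and handles refractive-index mismatch, anisotropic phase functions and detector geometries.

```python
import numpy as np

def mc_diffuse_reflectance(n_photons=20000, mu_a=0.5, mu_s=10.0, seed=0):
    """Crude Monte Carlo estimate of diffuse reflectance from a semi-infinite
    turbid medium (surface at z = 0, tissue at z > 0) with isotropic
    scattering. mu_a, mu_s are absorption/scattering coefficients (1/mm)."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])    # launched straight into the tissue
        weight = 1.0
        while weight > 1e-3:                     # crude termination, no Russian roulette
            step = -np.log(rng.random()) / mu_t  # free path length
            pos = pos + step * direction
            if pos[2] < 0.0:                     # photon escaped back through the surface
                reflected += weight
                break
            weight *= albedo                     # implicit (weighted) absorption
            cos_t = 2.0 * rng.random() - 1.0     # new direction, uniform on the sphere
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(1.0 - cos_t**2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return reflected / n_photons

print(mc_diffuse_reflectance())
```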

  18. Validation of Three Early Ejaculation Diagnostic Tools: A Composite Measure Is Accurate and More Adequate for Diagnosis by Updated Diagnostic Criteria

    PubMed Central

    Jern, Patrick; Piha, Juhana; Santtila, Pekka

    2013-01-01

    Purpose To validate three early ejaculation diagnostic tools, and propose a new tool for diagnosis in line with proposed changes to diagnostic criteria. Significant changes to diagnostic criteria are expected in the near future. Available screening tools do not necessarily reflect proposed changes. Materials and Methods Data from 148 diagnosed early ejaculation patients (Mage = 42.8) and 892 controls (Mage = 33.1 years) from a population-based sample were used. Participants responded to three different questionnaires (Premature Ejaculation Profile; Premature Ejaculation Diagnostic Tool; Multiple Indicators of Premature Ejaculation). Stopwatch measured ejaculation latency times were collected from a subsample of early ejaculation patients. We used two types of responses to the questionnaires depending on the treatment status of the patients 1) responses regarding the situation before starting pharmacological treatment and 2) responses regarding current situation. Logistic regressions and Receiver Operating Characteristics were used to assess ability of both the instruments and individual items to differentiate between patients and controls. Results All instruments had very good precision (Areas under the Curve ranging from .93-.98). A new five-item instrument (named CHecklist for Early Ejaculation Symptoms – CHEES) consisting of high-performance variables selected from the three instruments had validity (Nagelkerke R2 range .51-.79 for backwards/forwards logistic regression) equal to or slightly better than any individual instrument (i.e., had slightly higher validity statistics, but these differences did not achieve statistical significance). Importantly, however, this instrument was more in line with proposed changes to diagnostic criteria. Conclusions All three screening tools had good validity. A new 5-item diagnostic tool (CHEES) based on the three instruments had equal or somewhat more favorable validity statistics compared to the other three tools, but is

  19. Gmat. A software tool for the computation of the rovibrational G matrix

    NASA Astrophysics Data System (ADS)

    Castro, M. E.; Niño, A.; Muñoz-Caro, C.

    2009-07-01

    Gmat is a C++ program able to compute the rovibrational G matrix in molecules of arbitrary size. This allows the building of arbitrary rovibrational Hamiltonians. In particular, the program is designed to work with the structural results of potential energy hypersurface mappings computed in computer clusters or computational Grid environments. In the present version, 1.0, the program uses internal coordinates as vibrational coordinates, with the principal axes of inertia as the body-fixed system. The main design implements a complete separation of the interface and functional parts of the program. The interface part permits the automatic reading of the molecular structures from the output files of different electronic structure codes. At present, Gamess and Gaussian output files are allowed. To this end, use is made of the polymorphism characteristic of object orientation. The functional part computes numerically the derivatives of the nuclear positions with respect to the vibrational coordinates. Very accurate derivatives are obtained by using central differences embedded in a nine-level Richardson extrapolation procedure. Program summary: Program title: Gmat Catalogue identifier: AECZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 17 023 No. of bytes in distributed program, including test data, etc.: 274 714 Distribution format: tar.gz Programming language: Standard C++ Computer: All running Linux/Windows Operating system: Linux, Windows Classification: 16.2 Nature of problem: Computation of the rovibrational G matrix in molecules of any size. This allows the building of arbitrary rovibrational Hamiltonians. It must be possible to obtain the input data from the output files of standard electronic structure codes
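
    The derivative scheme mentioned above (central differences embedded in a Richardson extrapolation) is easy to illustrate in a few lines. The sketch below shows the numerical idea for a scalar function only, not Gmat's C++ implementation, which differentiates nuclear Cartesian positions with respect to internal coordinates.

```python
import numpy as np

def richardson_derivative(f, x, h=0.1, levels=9):
    """First derivative of f at x from central differences refined by a
    Richardson extrapolation tableau (step halved at each level)."""
    tableau = np.zeros((levels, levels))
    for i in range(levels):
        hi = h / 2.0**i
        tableau[i, 0] = (f(x + hi) - f(x - hi)) / (2.0 * hi)
        for k in range(1, i + 1):
            # central differences have only even-order error terms: h^2, h^4, ...
            factor = 4.0**k
            tableau[i, k] = (factor * tableau[i, k - 1] - tableau[i - 1, k - 1]) / (factor - 1.0)
    return tableau[levels - 1, levels - 1]

print(richardson_derivative(np.sin, 1.0))   # ~cos(1) = 0.5403...
```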

  20. A remote sensing computer-assisted learning tool developed using the unified modeling language

    NASA Astrophysics Data System (ADS)

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be a very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.

  1. Computer assisted 3D pre-operative planning tool for femur fracture orthopedic surgery

    NASA Astrophysics Data System (ADS)

    Gamage, Pavan; Xie, Sheng Quan; Delmas, Patrice; Xu, Wei Liang

    2010-02-01

    Femur shaft fractures are caused by high impact injuries and can affect gait functionality if not treated correctly. Until recently, the pre-operative planning for femur fractures has relied on two-dimensional (2D) radiographs, light boxes, tracing paper, and transparent bone templates. The recent availability of digital radiographic equipment has to some extent improved the workflow for preoperative planning. Nevertheless, imaging is still in 2D X-rays and planning/simulation tools to support fragment manipulation and implant selection are still not available. Direct three-dimensional (3D) imaging modalities such as Computed Tomography (CT) are also still restricted to a minority of complex orthopedic procedures. This paper proposes a software tool which allows orthopedic surgeons to visualize, diagnose, plan and simulate femur shaft fracture reduction procedures in 3D. The tool utilizes frontal and lateral 2D radiographs to model the fracture surface, separate a generic bone into the two fractured fragments, identify the pose of each fragment, and automatically customize the shape of the bone. The use of 3D imaging allows full spatial inspection of the fracture providing different views through the manipulation of the interactively reconstructed 3D model, and ultimately better pre-operative planning.

  2. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features or use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. N2A: a computational tool for modeling from neurons to algorithms

    PubMed Central

    Rothganger, Fredrick; Warrender, Christina E.; Trumbo, Derek; Aimone, James B.

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing (“Moore's law”) to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation. PMID:24478635

  5. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  6. A tool for computing time-dependent permeability reduction of fractured volcanic conduit margins.

    NASA Astrophysics Data System (ADS)

    Farquharson, Jamie; Wadsworth, Fabian; Heap, Michael; Baud, Patrick

    2016-04-01

    Laterally-oriented fractures within volcanic conduit margins are thought to play an important role in tempering eruption explosivity by allowing magmatic volatiles to outgas. The permeability of a fractured conduit margin - the equivalent permeability - can be modelled as the sum of permeability contributions of the edifice host rock and the fracture(s) within it. We present here a flexible MATLAB® tool which computes the time-dependent equivalent permeability of a volcanic conduit margin containing ash-filled fractures. The tool is designed so that the end-user can define a wide range of input parameters to yield equivalent permeability estimates for their application. The time-dependence of the equivalent permeability is incorporated by considering permeability decrease as a function of porosity loss in the ash-filled fractures due to viscous sintering (after Russell and Quane, 2005), which is in turn dependent on the depth and temperature of each fracture and the crystal content of the magma (all user-defined variables). The initial viscosity of the granular material filling the fracture is dependent on the water content (Hess and Dingwell, 1996), which is computed assuming equilibrium depth-dependent water content (Liu et al., 2005). Crystallinity is subsequently accounted for by employing the particle-suspension rheological model of Mueller et al. (2010). The user then defines the number of fractures, their widths, and their depths, and the lengthscale of interest (e.g. the length of the conduit). Using these data, the combined influence of transient fractures on the equivalent permeability of the conduit margin is then calculated by adapting a parallel-plate flow model (developed by Baud et al., 2012 for porous sandstones), for host rock permeabilities from 10^-11 to 10^-22 m^2. The calculated values of porosity and equivalent permeability with time for each host rock permeability are then output in text and worksheet file formats. We introduce two
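
    The equivalent-permeability combination itself can be sketched with a simple area-weighted sum, treating each fracture either as an open parallel plate (k = w^2/12) or as filled with material of known permeability. This is a static illustration under those assumptions, not the published tool; the tool adds the time dependence through sintering-driven porosity (and hence fill-permeability) loss.

```python
def equivalent_permeability(k_host, fracture_apertures, length, k_fill=None):
    """Equivalent permeability (m^2) of a conduit-margin section of total
    width `length` (m) containing lateral fractures, assuming flow parallel
    to the fractures so contributions add as an area-weighted arithmetic mean.

    fracture_apertures: list of fracture widths (m)
    k_fill: permeabilities of the material filling each fracture; if None,
            fractures are treated as open parallel plates with k = w^2 / 12.
    """
    total_aperture = sum(fracture_apertures)
    if k_fill is None:
        k_fill = [w**2 / 12.0 for w in fracture_apertures]
    fracture_term = sum(w * k for w, k in zip(fracture_apertures, k_fill))
    return (k_host * (length - total_aperture) + fracture_term) / length

# one 5 mm open fracture in a 10 m wide section of host rock with k = 1e-15 m^2
print(equivalent_permeability(1e-15, [0.005], 10.0))
```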

  7. Generator program for computer-assisted instruction: MACGEN. A software tool for generating computer-assisted instructional texts.

    PubMed

    Utsch, M J; Ingram, D

    1983-01-01

    This publication describes MACGEN, an interactive development tool to assist teachers to create, modify and extend case simulations, tutorial exercises and multiple-choice question tests designed for computer-aided instruction. The menu-driven software provides full authoring facilities for text files in MACAID format by means of interactive editing. Authors are prompted for items which they might want to change whereas all user-independent items are provided automatically. Optional default values and explanatory messages are available with every prompt. Errors are corrected automatically or commented upon. Thus the program eliminates the need to become familiar with a new language or with the details of the text file structure. The options for modification of existing text files include display, renumbering of frames and a line-oriented editor. The resulting text files can be interpreted by the MACAID driver without further changes. The text file is held as ASCII records and as such is also accessible with many standard word-processing systems if desired. PMID:6362978

  8. A computer modeling methodology and tool for assessing design concepts for the Space Station Data Management System

    NASA Technical Reports Server (NTRS)

    Jones, W. R.

    1986-01-01

    A computer modeling tool is being developed to assess candidate designs for the Space Station Data Management System (DMS). The DMS is to be a complex distributed computer system including the processor, storage devices, local area networks, and software that will support all processing functions onboard the Space Station. The modeling tool will allow a candidate design for the DMS, or for other subsystems that use the DMS, to be evaluated in terms of parameters. The tool and its associated modeling methodology are intended for use by DMS and subsystem designers to perform tradeoff analyses between design concepts using varied architectures and technologies.

  9. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogeneous distributed systems, including local, batch farms, opportunistic clusters and

  10. Computational Fluid Dynamics (CFD) as surgical planning tool: a pilot study on middle turbinate resection

    PubMed Central

    Zhao, Kai; Malhotra, Prashant; Rosen, David; Dalton, Pamela; Pribitkin, Edmund A

    2014-01-01

    Controversies exist regarding the resection or preservation of the middle turbinate (MT) during functional endoscopic sinus surgery (FESS). Any MT resection will perturb nasal airflow and may affect the mucociliary dynamics of the osteomeatal complex. Neither rhinometry nor computed tomography (CT) can adequately quantify nasal airflow pattern changes following surgery. This study explores the feasibility of assessing changes in nasal airflow dynamics following partial MT resection using computational fluid dynamics (CFD) techniques. We retrospectively converted the pre- and post-operative CT scans of a patient who underwent isolated partial MT concha bullosa resection into anatomically accurate three-dimensional numerical nasal models. Pre- and post-surgery nasal airflow simulations showed that the partial MT resection resulted in a shift of regional airflow towards the area of MT removal with a resultant decreased airflow velocity, decreased wall shear stress and increased local air pressure. However, the resection did not strongly affect the overall nasal airflow patterns, flow distributions in other areas of the nose, or the odorant uptake rate to the olfactory cleft mucosa. Moreover, CFD predicted the patient's failure to perceive an improvement in his unilateral nasal obstruction following surgery. Accordingly, CFD techniques can be used to predict changes in nasal airflow dynamics following partial MT resection. However, the functional implications of this analysis await further clinical studies. Nevertheless, such techniques may potentially provide a quantitative evaluation of surgical effectiveness and may prove useful in preoperatively modeling the effects of surgical interventions. PMID:25312372

  11. A fourth-order accurate curvature computation in a level set framework for two-phase flows subjected to surface tension forces

    NASA Astrophysics Data System (ADS)

    Coquerelle, Mathieu; Glockner, Stéphane

    2016-01-01

    We propose an accurate and robust fourth-order curvature extension algorithm in a level set framework for the transport of the interface. The method is based on the Continuum Surface Force approach, and is shown to efficiently calculate surface tension forces for two-phase flows. In this framework, the accuracy of the algorithms mostly relies on the precise computation of the surface curvature which we propose to accomplish using a two-step algorithm: first by computing a reliable fourth-order curvature estimation from the level set function, and second by extending this curvature rigorously in the vicinity of the surface, following the Closest Point principle. The algorithm is easy to implement and to integrate into existing solvers, and can easily be extended to 3D. We propose a detailed analysis of the geometrical and numerical criteria responsible for the appearance of spurious currents, a well known phenomenon observed in various numerical frameworks. We study the effectiveness of this novel numerical method on state-of-the-art test cases showing that the resulting curvature estimate significantly reduces parasitic currents. In addition, the proposed approach converges to fourth-order regarding spatial discretization, which is two orders of magnitude better than algorithms currently available. We also show the necessity for high-order transport methods for the surface by studying the case of the 2D advection of a column at equilibrium thereby proving the robustness of the proposed approach. The algorithm is further validated on more complex test cases such as a rising bubble.
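
    For comparison with the fourth-order scheme described above, the standard second-order curvature of a level set field, kappa = div(grad(phi)/|grad(phi)|), can be computed with central differences as sketched below; this is the baseline such methods improve on, not the paper's fourth-order Closest Point extension algorithm.

```python
import numpy as np

def curvature(phi, h):
    """Second-order finite-difference curvature of a 2-D level set phi on a
    uniform grid of spacing h (x along axis 1, y along axis 0)."""
    phi_x = (np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / (2 * h)
    phi_y = (np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / (2 * h)
    phi_xx = (np.roll(phi, -1, 1) - 2 * phi + np.roll(phi, 1, 1)) / h**2
    phi_yy = (np.roll(phi, -1, 0) - 2 * phi + np.roll(phi, 1, 0)) / h**2
    phi_xy = (np.roll(np.roll(phi, -1, 1), -1, 0) - np.roll(np.roll(phi, -1, 1), 1, 0)
              - np.roll(np.roll(phi, 1, 1), -1, 0) + np.roll(np.roll(phi, 1, 1), 1, 0)) / (4 * h**2)
    grad2 = phi_x**2 + phi_y**2 + 1e-12          # regularized |grad(phi)|^2
    return (phi_xx * phi_y**2 - 2 * phi_x * phi_y * phi_xy + phi_yy * phi_x**2) / grad2**1.5

# a circle of radius 1 has |kappa| = 1 on its interface
x, y = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
phi = np.sqrt(x**2 + y**2) - 1.0
print(curvature(phi, x[0, 1] - x[0, 0])[100, 150])   # ~1 near (x=1, y=0)
```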

  12. Feasibility study for application of the compressed-sensing framework to interior computed tomography (ICT) for low-dose, high-accurate dental x-ray imaging

    NASA Astrophysics Data System (ADS)

    Je, U. K.; Cho, H. M.; Cho, H. S.; Park, Y. O.; Park, C. K.; Lim, H. W.; Kim, K. S.; Kim, G. A.; Park, S. Y.; Woo, T. H.; Choi, S. I.

    2016-02-01

    In this paper, we propose a new/next-generation type of CT examination, the so-called Interior Computed Tomography (ICT), which may reduce the dose delivered to the patient outside the target region-of-interest (ROI) in dental x-ray imaging. Here, the x-ray beam from each projection position covers only a relatively small ROI containing the diagnostic target within the examined structure, leading to imaging benefits such as reduced scatter, lower system cost and lower imaging dose. We considered the compressed-sensing (CS) framework, rather than common filtered-backprojection (FBP)-based algorithms, for more accurate ICT reconstruction. We implemented a CS-based ICT algorithm and performed a systematic simulation to investigate the imaging characteristics. Two ROI-to-phantom size ratios (0.28 and 0.14) and four projection numbers (360, 180, 90 and 45) were tested. Using the CS framework, we successfully reconstructed ICT images of high quality even from few-view projection data, while preserving sharp edges in the images.
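
    The abstract does not spell out the reconstruction algorithm, so the sketch below only illustrates the generic compressed-sensing idea it relies on: recovering a sparse object from far fewer measurements than unknowns by iterating a data-consistency step and a sparsity-promoting step (here plain ISTA with soft thresholding, and a random matrix standing in for the few-view interior projector). All sizes and parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy underdetermined problem standing in for few-view interior CT:
        # x_true is sparse and A plays the role of the projection operator.
        n, m, k = 400, 100, 8
        x_true = np.zeros(n)
        x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        b = A @ x_true

        # ISTA: gradient step on ||Ax - b||^2 / 2 followed by soft thresholding,
        # i.e. minimizing ||Ax - b||^2 / 2 + lam * ||x||_1.
        lam = 0.01
        L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
        x = np.zeros(n)
        for _ in range(2000):
            z = x - A.T @ (A @ x - b) / L
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))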

  13. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose underlying mathematical models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized supercomputers. At present, the most efficient software for them is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively exploiting the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  14. Cone beam computed tomography imaging as a primary diagnostic tool for computer-guided surgery and CAD-CAM interim removable and fixed dental prostheses.

    PubMed

    Charette, Jyme R; Goldberg, Jack; Harris, Bryan T; Morton, Dean; Llop, Daniel R; Lin, Wei-Shao

    2016-08-01

    This article describes a digital workflow using cone beam computed tomography imaging as the primary diagnostic tool in the virtual planning of the computer-guided surgery and fabrication of a maxillary interim complete removable dental prosthesis and mandibular interim implant-supported complete fixed dental prosthesis with computer-aided design and computer-aided manufacturing technology. Diagnostic impressions (conventional or digital) and casts are unnecessary in this proposed digital workflow, providing clinicians with an alternative treatment in the indicated clinical scenario. PMID:27086108

  15. Continuous monitoring as a tool for more accurate assessment of remaining lifetime for rotors and casings of steam turbines in service

    SciTech Connect

    Leyzerovich, A.; Berlyand, V.; Pozhidaev, A.; Yatskevich, S.

    1998-12-31

    The continuous monitoring of steam parameters and metal temperatures allows assessing the individual remaining lifetime for major high-temperature design components of steam turbines in service more accurately. Characteristic metal temperature differences and corresponding maximum thermal stresses and strains are calculated on-line to estimate the metal fatigue damage accumulated during the operation process. This can be one of the diagnostic functions of the power unit's computerized Data Acquisition System (DAS) or special Subsystem of Diagnostic Monitoring (SDM) for the turbine. In doing so, the remaining lifetime is assessed in terms of actual operating conditions and operation quality for the individual unit, and the problem of lifetime extension for each object is solved more accurately. Such an approach is considered as applied to a specific case of a supercritical-pressure steam turbine of 300-MW output. The applied mathematical models were developed on the basis of combined experimental (field) and computational investigations of the metal temperature and strain-stress fields in the high-temperature (HP and IP) rotors and casings under the most characteristic stationary and transient operating conditions. The monitoring results are used for revealing the operating conditions with the extreme thermal stresses and specific metal damage, as well as for making decisions about scheduling the turbine's overhauls and extending the turbine lifetime beyond the limits originally set.
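
    The abstract gives no formulas, so the following sketch only illustrates the general idea of such on-line lifetime monitoring: convert a characteristic metal temperature difference into a thermal stress estimate and accumulate fatigue damage with a Miner-type rule. The material constants, the constrained-plate stress formula and the S-N curve below are placeholders, not values from the paper.

        import numpy as np

        # Placeholder material data (illustrative only).
        E, alpha, nu = 2.0e11, 1.2e-5, 0.3       # Pa, 1/K, dimensionless

        def thermal_stress(delta_T):
            # Quasi-static thermal stress from a characteristic temperature
            # difference, using a fully constrained plate approximation.
            return E * alpha * delta_T / (1.0 - nu)

        def cycles_to_failure(stress_amplitude):
            # Placeholder Basquin-type S-N curve: N = C * sigma^(-b).
            return 1.0e32 * stress_amplitude ** (-3.5)

        # Simulated stream of monitored temperature differences (K): one start-up.
        delta_T = np.concatenate([np.linspace(0.0, 80.0, 50),
                                  np.linspace(80.0, 5.0, 100)])
        sigma = thermal_stress(delta_T)

        # Miner's rule: each identified stress cycle consumes 1/N of the life.
        # A single dominant cycle per start-up is assumed for simplicity.
        sigma_amp = 0.5 * (sigma.max() - sigma.min())
        damage_per_startup = 1.0 / cycles_to_failure(sigma_amp)
        print("damage per start-up:", damage_per_startup)
        print("start-ups to exhaust the design life:", 1.0 / damage_per_startup)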

  16. An open-source computational tool to automatically quantify immunolabeled retinal ganglion cells.

    PubMed

    Dordea, Ana C; Bray, Mark-Anthony; Allen, Kaitlin; Logan, David J; Fei, Fei; Malhotra, Rajeev; Gregory, Meredith S; Carpenter, Anne E; Buys, Emmanuel S

    2016-06-01

    A fully automated and robust method was developed to quantify β-III-tubulin-stained retinal ganglion cells, combining computational recognition of individual cells by CellProfiler and a machine-learning tool to teach phenotypic classification of the retinal ganglion cells by CellProfiler Analyst. In animal models of glaucoma, quantification of immunolabeled retinal ganglion cells is currently performed manually and remains time-consuming. Using this automated method, quantifications of retinal ganglion cell images were accelerated tenfold: 1800 images were counted in 3 h using our automated method, while manual counting of the same images took 72 h. This new method was validated in an established murine model of microbead-induced optic neuropathy. The use of the publicly available software and the method's user-friendly design allows this technique to be easily implemented in any laboratory. PMID:27119563

  17. FILMPAR: A parallel algorithm designed for the efficient and accurate computation of thin film flow on functional surfaces containing micro-structure

    NASA Astrophysics Data System (ADS)

    Lee, Y. C.; Thompson, H. M.; Gaskell, P. H.

    2009-12-01

    , industrial and physical applications. However, despite recent modelling advances, the accurate numerical solution of the equations governing such problems is still at a relatively early stage. Indeed, recent studies employing a simplifying long-wave approximation have shown that highly efficient numerical methods are necessary to solve the resulting lubrication equations in order to achieve the level of grid resolution required to accurately capture the effects of micro- and nano-scale topographical features. Solution method: A portable parallel multigrid algorithm has been developed for the above purpose, for the particular case of flow over submerged topographical features. Within the multigrid framework adopted, a W-cycle is used to accelerate convergence in respect of the time dependent nature of the problem, with relaxation sweeps performed using a fixed number of pre- and post-Red-Black Gauss-Seidel Newton iterations. In addition, the algorithm incorporates automatic adaptive time-stepping to avoid the computational expense associated with repeated time-step failure. Running time: 1.31 minutes using 128 processors on BlueGene/P with a problem size of over 16.7 million mesh points.
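
    To make the smoothing step mentioned above concrete, here is a minimal Red-Black Gauss-Seidel smoother, shown on the linear 2D Poisson model problem rather than on FILMPAR's nonlinear lubrication equations; it is an illustration of the red-black idea (each colour can be updated in a single vectorised sweep because its neighbours all carry the other colour), not of the published code.

        import numpy as np

        def residual(u, f, h):
            # Max norm of the interior residual of -laplace(u) = f.
            r = (4*u[1:-1, 1:-1] - u[:-2, 1:-1] - u[2:, 1:-1]
                 - u[1:-1, :-2] - u[1:-1, 2:]) / h**2 - f[1:-1, 1:-1]
            return np.abs(r).max()

        def red_black_gauss_seidel(u, f, h, sweeps=3):
            # A few smoothing sweeps for -laplace(u) = f with Dirichlet
            # boundary values held fixed in u.
            parity = np.add.outer(np.arange(u.shape[0]), np.arange(u.shape[1])) % 2
            for _ in range(sweeps):
                for colour in (0, 1):
                    mask = (parity == colour)
                    mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = False
                    update = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                                     + np.roll(u, 1, 1) + np.roll(u, -1, 1)
                                     + h*h*f)
                    u[mask] = update[mask]
            return u

        n = 65
        h = 1.0 / (n - 1)
        u = np.random.default_rng(0).random((n, n))
        u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0
        f = np.ones((n, n))
        print("residual before:", residual(u, f, h))
        u = red_black_gauss_seidel(u, f, h, sweeps=10)
        print("residual after :", residual(u, f, h))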

  18. Computational Fluid Dynamics-Icing: a Predictive Tool for In-Flight Icing Risk Management

    NASA Astrophysics Data System (ADS)

    Zeppetelli, Danial

    In-flight icing is a hazard that continues to afflict the aviation industry, despite all the research and efforts to mitigate the risks. The recurrence of these types of accidents has given renewed impetus to the development of advanced analytical predictive tools to study both the accretion of ice on aircraft components in flight, and the aerodynamic consequences of such ice accumulations. In this work, an in-depth analysis of the occurrence of in-flight icing accidents and incidents was conducted to identify high-risk flight conditions. To investigate these conditions more thoroughly, a computational fluid dynamics model of a representative airfoil was developed to recreate experiments from the icing wind tunnel that occurred in controlled flight conditions. The ice accumulations and resulting aerodynamic performance degradations of the airfoil were computed for a range of pitch angles and flight speeds. These simulations revealed substantial performance losses such as reduced maximum lift and decreased stall angle. From these results, an icing hazard analysis tool was developed, using risk management principles, to evaluate the dangers of in-flight icing for a specific aircraft based on the atmospheric conditions it is expected to encounter, as well as the effectiveness of aircraft certification procedures. This method is then demonstrated through the simulation of in-flight icing scenarios based on real flight data from accidents and incidents. The risk management methodology is applied to the results of the simulations and the predicted performance degradation is compared to recorded aircraft performance characteristics at the time of the occurrence. The aircraft performance predictions and resulting risk assessment are found to correspond strongly to the pilot's comments as well as to the severity of the incident.

  19. Monitoring of seismic time-series with advanced parallel computational tools and complex networks

    NASA Astrophysics Data System (ADS)

    Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.

    2012-04-01

    Earthquakes have been a focus of human and research interest for several centuries because of their catastrophic effects on everyday life; they occur almost all over the world and exhibit behaviour that is unpredictable and hard to model. Their monitoring with increasingly sophisticated instruments has been nearly continuous, and several mathematical models have been proposed to describe possible connections and patterns found in the resulting seismological time-series. In Greece, one of the most seismically active territories on Earth, detailed instrumental seismological data have been available since the beginning of the past century, providing researchers with valuable knowledge about seismicity levels across the country. With powerful parallel computational tools such as Cellular Automata, these data can be analysed further and, most importantly, modelled to reveal possible connections between different parameters of the seismic time-series under study. More specifically, Cellular Automata have proven very effective for composing and modelling nonlinear complex systems, leading to several corresponding models proposed as analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time-series with Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology aims to reveal hidden relations in the examined time-series and to distinguish their intrinsic characteristics, transforming the time-series into complex networks and graphically representing their evolution in time and space. Consequently, based on the presented results, the proposed model will eventually serve as a possible efficient flexible computational tool to provide a generic
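
    As a concrete illustration of the kind of Cellular Automaton analogue of fault dynamics referred to above, here is a minimal stress-redistribution model in the spirit of the Olami-Feder-Christensen automaton; the grid size, redistribution fraction and loading rule are generic textbook choices, not the model developed in this work.

        import numpy as np

        rng = np.random.default_rng(0)

        # Each cell carries a "stress"; loading is uniform, and a cell that
        # reaches the threshold topples, passing a fraction alpha to each of
        # its four neighbours (non-conservative for alpha < 0.25).
        N, alpha, threshold = 32, 0.2, 1.0
        stress = rng.uniform(0.0, threshold, size=(N, N))
        event_sizes = []

        for _ in range(5000):
            stress += threshold - stress.max()          # slow uniform loading
            size = 0
            while True:
                unstable = np.argwhere(stress >= threshold)
                if len(unstable) == 0:
                    break
                size += len(unstable)
                for i, j in unstable:
                    s = stress[i, j]
                    stress[i, j] = 0.0
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < N and 0 <= nj < N:
                            stress[ni, nj] += alpha * s
            event_sizes.append(size)

        # The sequence of event sizes is the synthetic "seismic time-series"
        # that could then be transformed into a complex network.
        print("largest synthetic event:", max(event_sizes))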

  20. Computational and molecular tools for scalable rAAV-mediated genome editing.

    PubMed

    Stoimenov, Ivaylo; Ali, Muhammad Akhtar; Pandzic, Tatjana; Sjöblom, Tobias

    2015-03-11

    The rapid discovery of potential driver mutations through large-scale mutational analyses of human cancers generates a need to characterize their cellular phenotypes. Among the techniques for genome editing, recombinant adeno-associated virus (rAAV)-mediated gene targeting is suited for knock-in of single nucleotide substitutions and to a lesser degree for gene knock-outs. However, the generation of gene targeting constructs and the targeting process is time-consuming and labor-intense. To facilitate rAAV-mediated gene targeting, we developed the first software and complementary automation-friendly vector tools to generate optimized targeting constructs for editing human protein encoding genes. By computational approaches, rAAV constructs for editing ~71% of bases in protein-coding exons were designed. Similarly, ~81% of genes were predicted to be targetable by rAAV-mediated knock-out. A Gateway-based cloning system for facile generation of rAAV constructs suitable for robotic automation was developed and used in successful generation of targeting constructs. Together, these tools enable automated rAAV targeting construct design, generation as well as enrichment and expansion of targeted cells with desired integrations. PMID:25488813

  1. A computational tool for preoperative breast augmentation planning in aesthetic plastic surgery.

    PubMed

    Georgii, Joachim; Eder, Maximilian; Bürger, Kai; Klotz, Sebastian; Ferstl, Florian; Kovacs, Laszlo; Westermann, Rüdiger

    2014-05-01

    Breast augmentation was the most commonly performed cosmetic surgery procedure in 2011 in the United States. Although aesthetically pleasing surgical results can only be achieved if the correct breast implant is selected from a large variety of different prosthesis sizes and shapes available on the market, surgeons still rely on visual assessment and other subjective approaches for operative planning because objective evaluation tools are lacking. In this paper, we present the development of a software prototype for augmentation mammaplasty simulation solely based on 3-D surface scans, from which patient-specific finite-element models are generated in a semiautomatic process. The finite-element model is used to preoperatively simulate the expected breast shapes using physical soft-tissue mechanics. Our approach uses a novel mechanism based on so-called displacement templates, which, for a specific implant shape and position, describe the respective internal body forces. Due to a highly efficient numerical solver we can provide immediate visual feedback of the simulation results, and thus, the software prototype can be integrated smoothly into the medical workflow. The clinical value of the developed 3-D computational tool for aesthetic breast augmentation surgery planning is demonstrated in patient-specific use cases. PMID:24132029

  2. MSITE: a new computational tool for comparison of homological proteins in holo form.

    PubMed

    Sicinska, Wanda; Kurcinski, Mateusz

    2010-07-01

    The mechanism by which nuclear receptors respond differentially to structurally distinct agonists is not a well understood process. However, it is now obvious that transcriptional activity of nuclear receptors is a function of their interactions with co-activators. Recently, we released a new computational tool, CCOMP, for comparing side chain conformations in crystal structures of homologous protein complexes. Application of the CCOMP program revealed that 20-epi-1alpha,25-(OH)2D3 changes the side chain conformation of vitamin D receptor amino acids residing mostly far away from the ligand-receptor contacts. This strongly suggests that the ligand-co-activator signaling pathway involves indirect interactions between amino acids lining the binding pocket and outer surface residues that could attract co-activators. To facilitate identification of amino acids transmitting the subtle receptor changes upon ligand/modulator binding we developed another simple tool, MSITE. The program automatically lists the nearest neighbors of a given amino acid (for example neighbors of residues that are in contact with a ligand or reorient their side chains in the presence of a co-factor) in an arbitrary number of compared complexes. Comparison of seven binary vitamin D receptor complexes holding as ligands the analogs of 1alpha,25-(OH)2D3 with inverted configuration at carbon 14 or 20, or with incorporated oxolane ring bridging carbons 20 and 23, is reported. PMID:20399855

  3. Initial development of a computer-aided diagnosis tool for solitary pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Catarious, David M., Jr.; Baydush, Alan H.; Floyd, Carey E., Jr.

    2001-07-01

    This paper describes the development of a computer-aided diagnosis (CAD) tool for solitary pulmonary nodules. This CAD tool is built upon physically meaningful features that were selected because of their relevance to shape and texture. These features included a modified version of the Hotelling statistic (HS), a channelized HS, three measures of fractal properties, two measures of spicularity, and three manually measured shape features. These features were measured from a difficult database consisting of 237 regions of interest (ROIs) extracted from digitized chest radiographs. The center of each 256x256 pixel ROI contained a suspicious lesion which was sent to follow-up by a radiologist and whose nature was later clinically determined. Linear discriminant analysis (LDA) was used to search the feature space via sequential forward search using percentage correct as the performance metric. An optimized feature subset, selected for the highest accuracy, was then fed into a three layer artificial neural network (ANN). The ANN's performance was assessed by receiver operating characteristic (ROC) analysis. A leave-one-out testing/training methodology was employed for the ROC analysis. The performance of this system is competitive with that of three radiologists on the same database.
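
    For orientation, the pipeline described above (sequential forward feature selection driven by a linear discriminant, a three-layer neural network, and leave-one-out ROC evaluation) can be mimicked with scikit-learn as sketched below on synthetic data; the feature count, network size and every parameter are stand-ins, not values from the study.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-in for the per-ROI feature table.
        X, y = make_classification(n_samples=100, n_features=10, n_informative=5,
                                   random_state=0)

        # Sequential forward search with LDA as the selection criterion.
        sfs = SequentialFeatureSelector(LinearDiscriminantAnalysis(),
                                        n_features_to_select=4,
                                        direction='forward',
                                        scoring='accuracy', cv=5)
        X_sel = sfs.fit_transform(X, y)

        # Three-layer ANN (one hidden layer) scored by leave-one-out ROC analysis.
        ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        proba = cross_val_predict(ann, X_sel, y, cv=LeaveOneOut(),
                                  method='predict_proba')[:, 1]
        print("leave-one-out ROC AUC:", roc_auc_score(y, proba))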

  4. Using Brain–Computer Interfaces and Brain-State Dependent Stimulation as Tools in Cognitive Neuroscience

    PubMed Central

    Jensen, Ole; Bahramisharif, Ali; Oostenveld, Robert; Klanke, Stefan; Hadjipapas, Avgis; Okazaki, Yuka O.; van Gerven, Marcel A. J.

    2011-01-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain–computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds the promise as a tool for aiding the disabled and for augmenting human performance. While technical developments obviously are important, we will here argue that new insight gained from cognitive neuroscience can be used to identify signatures of neural activation which reliably can be modulated by the subject at will. This review will focus mainly on oscillatory activity in the alpha band which is strongly modulated by changes in covert attention. Besides developing BCIs for their traditional purpose, they might also be used as a research tool for cognitive neuroscience. There is currently a strong interest in how brain-state fluctuations impact cognition. These state fluctuations are partly reflected by ongoing oscillatory activity. The functional role of the brain state can be investigated by introducing stimuli in real-time to subjects depending on the actual state of the brain. This principle of brain-state dependent stimulation may also be used as a practical tool for augmenting human behavior. In conclusion, new approaches based on online analysis of ongoing brain activity are currently in rapid development. These approaches are amongst others informed by new insight gained from electroencephalography/magnetoencephalography studies in cognitive neuroscience and hold the promise of providing new ways for investigating the brain at work. PMID:21687463

  5. An Interactive Tool for Outdoor Computer Controlled Cultivation of Microalgae in a Tubular Photobioreactor System

    PubMed Central

    Dormido, Raquel; Sánchez, José; Duro, Natividad; Dormido-Canto, Sebastián; Guinaldo, María; Dormido, Sebastián

    2014-01-01

    This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize the biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations. PMID:24662450

  6. An interactive tool for outdoor computer controlled cultivation of microalgae in a tubular photobioreactor system.

    PubMed

    Dormido, Raquel; Sánchez, José; Duro, Natividad; Dormido-Canto, Sebastián; Guinaldo, María; Dormido, Sebastián

    2014-01-01

    This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize the biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations. PMID:24662450

  7. 16S classifier: a tool for fast and accurate taxonomic classification of 16S rRNA hypervariable regions in metagenomic datasets.

    PubMed

    Chaudhary, Nikhil; Sharma, Ashok K; Agarwal, Piyush; Gupta, Ankit; Sharma, Vineet K

    2015-01-01

    The diversity of microbial species in a metagenomic study is commonly assessed using 16S rRNA gene sequencing. With the rapid developments in genome sequencing technologies, the focus has shifted towards the sequencing of hypervariable regions of the 16S rRNA gene instead of full-length gene sequencing. Therefore, 16S Classifier was developed using a machine learning method, Random Forest, for fast and accurate taxonomic classification of short hypervariable regions of the 16S rRNA sequence. It displayed precision values of up to 0.91 on training datasets and precision values of up to 0.98 on the test dataset. On real metagenomic datasets, it showed up to 99.7% accuracy at the phylum level and up to 99.0% accuracy at the genus level. 16S Classifier is available freely at http://metagenomics.iiserb.ac.in/16Sclassifier and http://metabiosys.iiserb.ac.in/16Sclassifier. PMID:25646627
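
    The trained tool itself is available at the URLs above; purely to illustrate the underlying idea (a Random Forest classifying short sequence fragments from simple composition features), the toy sketch below uses 3-mer counts on synthetic sequences. The k-mer encoding and all labels are assumptions for illustration, not a description of 16S Classifier's real feature set.

        import numpy as np
        from itertools import product
        from sklearn.ensemble import RandomForestClassifier

        KMERS = [''.join(p) for p in product('ACGT', repeat=3)]   # 64 possible 3-mers

        def kmer_counts(seq):
            # Represent a short fragment as a vector of 3-mer counts.
            return np.array([sum(seq[i:i+3] == k for i in range(len(seq) - 2))
                             for k in KMERS], dtype=float)

        rng = np.random.default_rng(1)

        def random_seq(base_bias):
            return ''.join(rng.choice(list('ACGT'), size=100, p=base_bias))

        # Two made-up taxa with different base compositions.
        taxa = {'TaxonA': [0.4, 0.1, 0.1, 0.4], 'TaxonB': [0.1, 0.4, 0.4, 0.1]}
        X, y = [], []
        for label, bias in taxa.items():
            for _ in range(50):
                X.append(kmer_counts(random_seq(bias)))
                y.append(label)

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print(clf.predict([kmer_counts(random_seq(taxa['TaxonA']))]))   # ['TaxonA']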

  8. Stimulated dual-band infrared computed tomography: a tool to inspect the aging infrastructure

    NASA Astrophysics Data System (ADS)

    DelGrande, Nancy; Durbin, Philip F.

    1995-09-01

    We have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. Our system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. We conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. Our dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness, and type. We quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, we conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. We determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. We demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally heated bridge decks.

  9. Stimulated dual-band infrared computed tomography: A tool to inspect the aging infrastructure

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-06-27

    The authors have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. The system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. They conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. The dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness and type. The authors quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, they conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. They determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. They demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally-heated bridge decks.

  10. A least-squares computational "tool kit". Nuclear data and measurements series

    SciTech Connect

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications.
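
    The report's own FORTRAN codes (LSIOD and GLSIOD) are not reproduced here; as an independent numerical illustration of the generalized least-squares condition it discusses, the sketch below estimates two parameters of a straight line from data with a correlated error covariance matrix, using the standard solution x = (A^T V^-1 A)^-1 A^T V^-1 b and its parameter covariance (A^T V^-1 A)^-1.

        import numpy as np

        rng = np.random.default_rng(0)

        # Linear model b = A x + noise, with correlated measurement errors.
        t = np.linspace(0.0, 1.0, 20)
        A = np.column_stack([np.ones_like(t), t])
        x_true = np.array([2.0, -1.5])
        lags = np.abs(np.subtract.outer(np.arange(20), np.arange(20)))
        V = 0.01 * 0.5 ** lags                  # AR(1)-like covariance matrix
        b = A @ x_true + rng.multivariate_normal(np.zeros(20), V)

        # Generalized least squares.
        Vi = np.linalg.inv(V)
        cov_x = np.linalg.inv(A.T @ Vi @ A)     # parameter covariance matrix
        x_hat = cov_x @ A.T @ Vi @ b

        print("estimate:", x_hat)
        print("1-sigma uncertainties:", np.sqrt(np.diag(cov_x)))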

  11. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
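
    To make the three components concrete, the toy sketch below builds the smallest possible "graph" for a hydrological quantity: one factor encoding prior knowledge of a recharge-like parameter and one factor linking it to noisy observations; multiplying the factors and normalizing gives the posterior. The variable, the numbers and the Gaussian forms are invented purely for illustration.

        import numpy as np

        # Discretized unknown, e.g. an annual groundwater recharge rate (mm/yr).
        theta = np.linspace(0.0, 400.0, 401)

        # Factor 1: prior over the parameter node.
        prior = np.exp(-0.5 * ((theta - 150.0) / 80.0) ** 2)

        # Factor 2: likelihood linking the parameter node to observation nodes.
        observations = np.array([210.0, 190.0, 230.0])    # mm/yr, invented
        sigma = 30.0
        likelihood = np.ones_like(theta)
        for yobs in observations:
            likelihood *= np.exp(-0.5 * ((yobs - theta) / sigma) ** 2)

        # Product of the factors = unnormalized joint; normalizing gives the posterior.
        posterior = prior * likelihood
        posterior /= posterior.sum()

        mean = (theta * posterior).sum()
        sd = np.sqrt(((theta - mean) ** 2 * posterior).sum())
        print(f"posterior mean ~ {mean:.1f} mm/yr, sd ~ {sd:.1f} mm/yr")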

  12. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size. PMID:26138574

  13. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.

  14. TRAJELIX: a computational tool for the geometric characterization of protein helices during molecular dynamics simulations.

    PubMed

    Mezei, Mihaly; Filizola, Marta

    2006-02-01

    We have developed a computer program with the necessary mathematical formalism for the geometric characterization of distorted conformations of alpha-helices in proteins, such as those that can potentially be sampled during typical molecular dynamics simulations. This formalism has been incorporated into TRAJELIX, a new module within the SIMULAID framework (http://inka.mssm.edu/~mezei/simulaid/) that is capable of monitoring distortions of alpha-helices in terms of their displacement, global and local tilting, rotation around their axes, compression/extension, winding/unwinding, and bending. Accurate evaluation of these global and local structural properties of the helix can help study possible intramolecular and intermolecular changes in the helix packing of alpha-helical membrane proteins, as shown here in an application to the interacting helical domains of rhodopsin dimers. Quantification of the dynamic structural behavior of alpha-helical membrane proteins is critical for our understanding of signal transduction, and may enable structure-based design of more specific and efficient drugs. PMID:16783601

  15. A Computational Tool for the Microstructure Optimization of a Polymeric Heart Valve Prosthesis.

    PubMed

    Serrani, M; Brubert, J; Stasiak, J; De Gaetano, F; Zaffora, A; Costantino, M L; Moggridge, G D

    2016-06-01

    Styrene-based block copolymers are promising materials for the development of a polymeric heart valve prosthesis (PHV), and the mechanical properties of these polymers can be tuned via the manufacturing process, orienting the cylindrical domains to achieve material anisotropy. The aim of this work is the development of a computational tool for the optimization of the material microstructure in a new PHV intended for aortic valve replacement to enhance the mechanical performance of the device. An iterative procedure was implemented to orient the cylinders along the maximum principal stress direction of the leaflet. A numerical model of the leaflet was developed, and the polymer mechanical behavior was described by a hyperelastic anisotropic constitutive law. A custom routine was implemented to align the cylinders with the maximum principal stress direction in the leaflet for each iteration. The study was focused on valve closure, since during this phase the fibrous structure of the leaflets must bear the greatest load. The optimal microstructure obtained by our procedure is characterized by mainly circumferential orientation of the cylinders within the valve leaflet. An increase in the radial strain and a decrease in the circumferential strain due to the microstructure optimization were observed. Also, a decrease in the maximum value of the strain energy density was found in the case of optimized orientation; since the strain energy density is a widely used criterion to predict elastomer's lifetime, this result suggests a possible increase of the device durability if the polymer microstructure is optimized. The present method represents a valuable tool for the design of a new anisotropic PHV, allowing the investigation of different designs, materials, and loading conditions. PMID:27018454
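
    The core of the iterative procedure (re-orienting the material direction toward the local maximum principal stress until the orientation stops changing) can be shown in isolation. In the sketch below a placeholder function stands in for the anisotropic hyperelastic finite-element solve of the leaflet, so the numbers are meaningless; only the fixed-point loop itself reflects the kind of procedure described.

        import numpy as np

        def principal_direction(sigma):
            # Unit eigenvector of the largest principal stress of a 2x2 tensor.
            w, v = np.linalg.eigh(sigma)
            return v[:, np.argmax(w)]

        def fe_solve(fiber_dir):
            # Placeholder for the leaflet FE solve under closing pressure:
            # returns a local stress state that depends on the current
            # cylinder orientation (entirely invented).
            base = np.array([[1.0, 0.3], [0.3, 0.4]])      # MPa, invented
            return base + 0.2 * np.outer(fiber_dir, fiber_dir)

        fiber = np.array([0.0, 1.0])                       # start radially oriented
        for it in range(50):
            sigma = fe_solve(fiber)
            new_fiber = principal_direction(sigma)
            if abs(abs(new_fiber @ fiber) - 1.0) < 1e-10:  # orientation converged
                break
            fiber = new_fiber

        print("iterations:", it + 1, "final cylinder orientation:", fiber)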

  16. A Monte Carlo tool for raster-scanning particle therapy dose computation

    NASA Astrophysics Data System (ADS)

    Jelen, U.; Radon, M.; Santiago, A.; Wittig, A.; Ammazzalorso, F.

    2014-03-01

    The purpose of this work was to implement Monte Carlo (MC) dose computation in realistic patient geometries with raster-scanning, the most advanced ion beam delivery technique, combining magnetic beam deflection with energy variation. FLUKA, a Monte Carlo package well-established in particle therapy applications, was extended to simulate raster-scanning delivery with clinical data, which is unavailable as a built-in feature. A new complex beam source, compatible with the FLUKA public programming interface, was implemented in Fortran to model the specific properties of raster-scanning, i.e. delivery by means of multiple spot sources with variable spatial distributions, energies and numbers of particles. The source was plugged into the MC engine through the user hook system provided by FLUKA. Additionally, routines were provided to populate the beam source with treatment plan data, stored as DICOM RTPlan or TRiP98's RST format, enabling MC recomputation of clinical plans. Finally, facilities were integrated to read computerised tomography (CT) data into FLUKA. The tool was used to recompute two representative carbon ion treatment plans, a skull base and a prostate case, prepared with analytical dose calculation (TRiP98). Selected, clinically relevant issues influencing the dose distributions were investigated: (1) presence of positioning errors, (2) influence of fiducial markers and (3) variations in pencil beam width. Notable differences in modelling of these challenging situations were observed between the analytical and Monte Carlo results. In conclusion, a tool was developed to support particle therapy research and treatment when high-precision MC calculations are required, e.g. in the presence of severe density heterogeneities or in quality assurance procedures.

  17. CarSPred: a computational tool for predicting carbonylation sites of human proteins.

    PubMed

    Lv, Hongqiang; Han, Jiuqiang; Liu, Jun; Zheng, Jiguang; Liu, Ruiling; Zhong, Dexing

    2014-01-01

    Protein carbonylation is one of the most pervasive oxidative stress-induced post-translational modifications (PTMs), which plays a significant role in the etiology and progression of several human diseases. It has been regarded as a biomarker of oxidative stress due to its relatively early formation and stability compared with other oxidative PTMs. Only a subset of proteins is prone to carbonylation and most carbonyl groups are formed from lysine (K), arginine (R), threonine (T) and proline (P) residues. Recent advancements in analysis of the PTM by mass spectrometry provided new insights into the mechanisms of protein carbonylation, such as protein susceptibility and exact modification sites. However, the experimental approaches to identifying carbonylation sites are costly, time-consuming and capable of processing a limited number of proteins, and there is no bioinformatics method or tool devoted to predicting carbonylation sites of human proteins so far. In the paper, a computational method is proposed to identify carbonylation sites of human proteins. The method extracted four kinds of features and combined the minimum Redundancy Maximum Relevance (mRMR) feature selection criterion with weighted support vector machine (WSVM) to achieve total accuracies of 85.72%, 85.95%, 83.92% and 85.72% for K, R, T and P carbonylation site predictions respectively using 10-fold cross-validation. The final optimal feature sets were analysed, the position-specific composition and hydrophobicity environment of flanking residues of modification sites were discussed. In addition, a software tool named CarSPred has been developed to facilitate the application of the method. Datasets and the software involved in the paper are available at https://sourceforge.net/projects/hqlstudio/files/CarSPred-1.0/. PMID:25347395
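
    As a rough sketch of the classifier stage only, the snippet below ranks features with a mutual-information criterion (a simple stand-in for mRMR) and trains a class-weighted SVM evaluated by 10-fold cross-validation on synthetic data; it is not CarSPred's implementation, and the encoding, kernel and class weights are illustrative assumptions.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, mutual_info_classif
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score, StratifiedKFold

        # Synthetic stand-in for encoded sequence windows around candidate sites;
        # carbonylated sites (label 1) are treated as the minority class.
        X, y = make_classification(n_samples=600, n_features=80, n_informative=12,
                                   weights=[0.8, 0.2], random_state=0)

        model = make_pipeline(
            SelectKBest(mutual_info_classif, k=20),       # crude mRMR stand-in
            SVC(kernel='rbf', class_weight='balanced'),   # "weighted" SVM
        )
        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        scores = cross_val_score(model, X, y, cv=cv)
        print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))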

  18. Evaluation of triple stage mass spectrometry as a robust and accurate diagnostic tool for determination of free cordycepin in designer egg.

    PubMed

    Chen, Yi Hsin; Lim, Chee Wei; Chan, Sheot Harn

    2014-05-01

    Direct determination of free cordycepin in designer egg using a highly selective mass spectrometric (MS) technique aided by a rapid and efficient dilute-and-shoot workflow would enhance the application of such techniques as diagnostic tools in food fraud control. Here, triple stage mass spectrometry (MS(3)) demonstrated excellent analyte selectivity even when incomplete chromatographic separation was performed. Method validation was performed at six concentration levels of 100, 200, 400, 800, 1200 and 1600 ng g(-1). Spiking experiments were examined at three concentration levels of 200, 400, and 1200 ng g(-1) in individual egg white and egg yolk, measured over 2 days. MS(3) enabled ion chromatograms with zero-background interference to be obtained from egg extracts. MS(3) eliminated severe over recovery (p<0.05) observed in all fortified samples, a challenge that MRM-transition could not address in a single step. Matrix-matched calibrants were needed to compensate for over recovery observed under MRM-transition mode. PMID:24360442

  19. INTRODUCING CAFein, A NEW COMPUTATIONAL TOOL FOR STELLAR PULSATIONS AND DYNAMIC TIDES

    SciTech Connect

    Valsecchi, F.; Farr, W. M.; Willems, B.; Rasio, F. A.; Kalogera, V.

    2013-08-10

    Here we present CAFein, a new computational tool for investigating radiative dissipation of dynamic tides in close binaries and of non-adiabatic, non-radial stellar oscillations in isolated stars in the linear regime. For the latter, CAFein computes the non-adiabatic eigenfrequencies and eigenfunctions of detailed stellar models. The code is based on the so-called Riccati method, a numerical algorithm that has been successfully applied to a variety of stellar pulsators, and which does not suffer from the major drawbacks of commonly used shooting and relaxation schemes. Here we present an extension of the Riccati method to investigate dynamic tides in close binaries. We demonstrate CAFein's capabilities as a stellar pulsation code both in the adiabatic and non-adiabatic regimes, by reproducing previously published eigenfrequencies of a polytrope, and by successfully identifying the unstable modes of a stellar model in the β Cephei/SPB region of the Hertzsprung-Russell diagram. Finally, we verify CAFein's behavior in the dynamic tides regime by investigating the effects of dynamic tides on the eigenfunctions and orbital and spin evolution of massive main sequence stars in eccentric binaries, and of hot Jupiter host stars. The plethora of asteroseismic data provided by NASA's Kepler satellite, some of which include the direct detection of tidally excited stellar oscillations, makes CAFein quite timely. Furthermore, the increasing number of observed short-period detached double white dwarfs (WDs) and the observed orbital decay in the tightest of such binaries open up a new possibility of investigating WD interiors through the effects of tides on their orbital evolution.

  20. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1) that will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  1. Local Perturbation Analysis: A Computational Tool for Biophysical Reaction-Diffusion Models

    PubMed Central

    Holmes, William R.; Mata, May Anne; Edelstein-Keshet, Leah

    2015-01-01

    Diffusion and interaction of molecular regulators in cells is often modeled using reaction-diffusion partial differential equations. Analysis of such models and exploration of their parameter space is challenging, particularly for systems of high dimensionality. Here, we present a relatively simple and straightforward analysis, the local perturbation analysis, that reveals how parameter variations affect model behavior. This computational tool, which greatly aids exploration of the behavior of a model, exploits a structural feature common to many cellular regulatory systems: regulators are typically either bound to a membrane or freely diffusing in the interior of the cell. Using well-documented, readily available bifurcation software, the local perturbation analysis tracks the approximate early evolution of an arbitrarily large perturbation of a homogeneous steady state. In doing so, it provides a bifurcation diagram that concisely describes various regimes of the model’s behavior, reducing the need for exhaustive simulations to explore parameter space. We explain the method and provide detailed step-by-step guides to its use and application. PMID:25606671

  2. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    NASA Astrophysics Data System (ADS)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by The American Association of Physicists in Medicine, Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  3. Unraveling the Web of Viroinformatics: Computational Tools and Databases in Virus Research

    PubMed Central

    Priyadarshini, Pragya; Vrati, Sudhanshu

    2014-01-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain—viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. PMID:25428870

  4. Assessment of computational tools for MRI RF dosimetry by comparison with measurements on a laboratory phantom.

    PubMed

    Bottauscio, O; Cassarà, A M; Hand, J W; Giordano, D; Zilberti, L; Borsero, M; Chiampi, M; Weidemann, G

    2015-07-21

    This paper presents an extended comparison between numerical simulations using the different computational tools employed nowadays in electromagnetic dosimetry and measurements of radiofrequency (RF) electromagnetic field distributions in phantoms with tissue-simulating liquids at 64 MHz, 128 MHz and 300 MHz, adopting a customized experimental setup. The aim is to quantify the overall reliability and accuracy of RF dosimetry approaches at frequencies in use in magnetic resonance imaging transmit coils. Measurements are compared against four common techniques used for electromagnetic simulations, i.e. the finite difference time domain (FDTD), the finite integration technique (FIT), the boundary element method (BEM) and the hybrid finite element method-boundary element method (FEM-BEM) approaches. It is shown that FDTD and FIT produce similar results, which generally are also in good agreement with those of FEM-BEM. On the contrary, BEM seems to perform less well than the other methods and shows numerical convergence problems in presence of metallic objects. Maximum uncertainties of about 30% (coverage factor k = 2) can be attributed to measurements regarding electric and magnetic field amplitudes. Discrepancies between simulations and experiments are found to be in the range from 10% to 30%. These values confirm other previously published results of experimental validations performed on a limited set of data and define the accuracy of our measurement setup. PMID:26147075

  5. Computational materials science: an increasingly reliable engineering tool (example: defects in HgCdTe alloys)

    NASA Astrophysics Data System (ADS)

    Sher, Arden; van Schilfgaarde, M.; Berding, M. A.

    1998-04-01

    Computational materials science has evolved in recent years into a reliable theory capable of predicting not only idealized materials and device performance properties, but also those that apply to practical engineering developments. The codes run on workstations and even now are fast enough to be useful design tools. A review will be presented of the current status of this rapidly advancing field. As a demonstration of the power of the methods, predictions of the native point and complex defect, and impurity densities for the Hg0.8Cd0.2Te alloy as functions of external processing conditions will be treated. Where measurements have been done, the observed values agree well with the predictions. As an example, we find that As incorporates predominantly on the cation sublattice, if the material is grown from the Te side of the existence curve, whereas it tends to reside on the anion sublattice in Hg-saturated growth. On the cation sublattice As is a donor. It is an acceptor on the Te sublattice. We have devised a post-MBE-growth processing method to encourage the transfer of As from the cation to the anion sublattice. Those aspects of the proposed process that have been tested work.

  6. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  7. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  8. GMXPBSA 2.0: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2014-11-01

    GMXPBSA 2.0 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, to automatically calculate binding free energies for protein-protein or ligand-protein complexes. GMXPBSA 2.0 is flexible and can easily be customized to specific needs. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. Calculations require only protein-protein or protein-ligand MD simulations. GMXPBSA 2.0 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes and comparison of the results with the wild-type system. Moreover, it compares the binding free energies of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinities for the same receptor utilizing MD simulations of proteins in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including molecular mechanics (MM), electrostatic contribution to solvation (PB) and nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulation software GROMACS and the Poisson-Boltzmann equation solver APBS. All the calculations can be performed in a single or distributed automatic fashion on a cluster facility in order to speed up the calculation by dividing frames across the available processors. The program is freely available under the GPL license. Catalogue identifier: AETQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AETQ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing
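
    GMXPBSA itself drives GROMACS and APBS; the bookkeeping it performs on the resulting per-frame energy terms reduces to delta G_bind = G_complex - G_receptor - G_ligand, with each G split into MM, PB and SA contributions and averaged over the ensemble. The sketch below reproduces only that arithmetic on invented per-frame numbers; it is not the GMXPBSA workflow.

        import numpy as np

        rng = np.random.default_rng(0)
        n_frames = 200

        def frame_energies(mm_mean, pb_mean, sa_mean):
            # Invented per-frame MM, PB and SA terms (kJ/mol) for one species.
            return {'MM': mm_mean + 20.0 * rng.standard_normal(n_frames),
                    'PB': pb_mean + 15.0 * rng.standard_normal(n_frames),
                    'SA': sa_mean + 2.0 * rng.standard_normal(n_frames)}

        complex_ = frame_energies(-5200.0, 900.0, 95.0)
        receptor = frame_energies(-4300.0, 760.0, 80.0)
        ligand = frame_energies(-700.0, 180.0, 25.0)

        # Per-frame binding free energy and its MM/PB/SA decomposition.
        terms = ('MM', 'PB', 'SA')
        dg = {k: complex_[k] - receptor[k] - ligand[k] for k in terms}
        dg_total = sum(dg.values())

        for k in terms:
            sem = dg[k].std(ddof=1) / np.sqrt(n_frames)
            print(f"delta {k}: {dg[k].mean():8.1f} +/- {sem:.1f} kJ/mol")
        print(f"delta G_bind: {dg_total.mean():8.1f} kJ/mol")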

  9. Computer-Assisted Mathematics: An Investigation of the Effectiveness of the Computer Used as a Tool to Learn Mathematics.

    ERIC Educational Resources Information Center

    Hatfield, Larry Lee

    Reported are the results of an investigation of the effects of programing a computer in a seventh grade mathematics class. Two treatments were conducted during two successive years. The students in the treatment group used the programing language BASIC to write computer algorithms following supplemental instruction. The mathematical content was…

  10. The Use of Interactive Computer Animations Based on POE as a Presentation Tool in Primary Science Teaching

    ERIC Educational Resources Information Center

    Akpinar, Ercan

    2014-01-01

    This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…

  11. Innovation Configuration Mapping as a Professional Development Tool: The Case of One-to-One Laptop Computing

    ERIC Educational Resources Information Center

    Towndrow, Phillip A.; Fareed, Wan

    2015-01-01

    This article illustrates how findings from a study of teachers' and students' uses of laptop computers in a secondary school in Singapore informed the development of an Innovation Configuration (IC) Map--a tool for identifying and describing alternative ways of implementing innovations based on teachers' unique feelings, preoccupations, thoughts…

  12. Effects of Online Interaction via Computer-Mediated Communication (CMC) Tools on an E-Mathematics Learning Outcome

    ERIC Educational Resources Information Center

    Okonta, Olomeruom

    2010-01-01

    Recent research studies in open and distance learning have focused on the differences between traditional learning versus online learning, the benefits of computer-mediated communication (CMC) tools in an e-learning environment, and the relationship between online discussion posts and students' achievement. In fact, there is an extant…

  13. Scanning to the Beep: A Teacher-Tested Computer-Based Observational Assessment Tool for the Distance Education Classroom.

    ERIC Educational Resources Information Center

    Hausafus, Cheryl O.; Torrie, Margaret

    1995-01-01

    Discusses results of a study that examined preservice and inservice teachers' use of hand-held Computer-Based Observational Assessment Tools (CBOATs) in a distance education environment using the Iowa Communications Network. The CBOAT studied was Learner Profile, which included a small bar-code reader and data management software. (LRW)

  14. Voronota: A fast and reliable tool for computing the vertices of the Voronoi diagram of atomic balls.

    PubMed

    Olechnovič, Kliment; Venclovas, Ceslovas

    2014-03-30

    The Voronoi diagram of balls, corresponding to atoms of van der Waals radii, is particularly well-suited for the analysis of three-dimensional structures of biological macromolecules. However, due to the shortage of practical algorithms and the corresponding software, simpler approaches are often used instead. Here, we present a simple and robust algorithm for computing the vertices of the Voronoi diagram of balls. The vertices of Voronoi cells correspond to the centers of the empty tangent spheres defined by quadruples of balls. The algorithm is implemented as an open-source software tool, Voronota. Large-scale tests show that Voronota is a fast and reliable tool for processing both experimentally determined and computationally modeled macromolecular structures. Voronota can be easily deployed and may be used for the development of various other structure analysis tools that utilize the Voronoi diagram of balls. Voronota is available at: http://www.ibt.lt/bioinformatics/voronota. PMID:24523197
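
    The geometric condition behind each vertex is stated in the abstract: a vertex is the center of an empty sphere tangent to a quadruple of balls. The short Python sketch below solves that tangency condition numerically for one quadruple; it is only an illustration of the condition |p - c_i| = r + r_i (using SciPy's generic root finder and an invented helper name), not Voronota's actual algorithm, which enumerates vertices and checks empty tangent spheres for whole macromolecular structures.

```python
# Illustration of the geometric condition behind a Voronoi-diagram-of-balls
# vertex: the center p and radius r of a sphere tangent to four balls
# (centers c_i, radii r_i) satisfy |p - c_i| = r + r_i for i = 1..4.
# This is NOT Voronota's algorithm; it is a numerical sketch of the condition,
# solved here with SciPy's general-purpose root finder.
import numpy as np
from scipy.optimize import fsolve

def tangent_sphere(centers, radii, guess=None):
    """centers: (4, 3) array, radii: length-4 array. Returns (p, r)."""
    centers = np.asarray(centers, dtype=float)
    radii = np.asarray(radii, dtype=float)

    def residuals(v):
        p, r = v[:3], v[3]
        return np.linalg.norm(p - centers, axis=1) - (r + radii)

    if guess is None:
        guess = np.append(centers.mean(axis=0), 1.0)
    solution = fsolve(residuals, guess)
    return solution[:3], solution[3]

# Four unit balls at the vertices of a regular tetrahedron:
c = [[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]
p, r = tangent_sphere(c, [1.0, 1.0, 1.0, 1.0])
print("tangent-sphere center:", p.round(6), "radius:", round(r, 6))
```

    For this symmetric example the tangent sphere sits at the origin with radius sqrt(3) - 1, which the numerical solution reproduces.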

  15. Development of an innovative spacer grid model utilizing computational fluid dynamics within a subchannel analysis tool

    NASA Astrophysics Data System (ADS)

    Avramova, Maria

    In the past few decades the need for improved nuclear reactor safety analyses has led to a rapid development of advanced methods for multidimensional thermal-hydraulic analyses. These methods have become progressively more complex in order to account for the many physical phenomena anticipated during steady-state and transient Light Water Reactor (LWR) conditions. The advanced thermal-hydraulic subchannel code COBRA-TF (Thurgood, M. J. et al., 1983) is used worldwide for best-estimate evaluations of nuclear reactor safety margins. In the framework of a joint research project between the Pennsylvania State University (PSU) and AREVA NP GmbH, the theoretical models and numerics of COBRA-TF have been improved. Under the name F-COBRA-TF, the code has been subjected to an extensive verification and validation program and has been applied to a variety of LWR steady-state and transient simulations. To enable F-COBRA-TF for industrial applications, including safety-margin evaluations and design analyses, the code's spacer grid models were revised and substantially improved. The state of the art in modeling the effects of spacer grids on the thermal-hydraulic performance of the flow in rod bundles employs numerical experiments performed by computational fluid dynamics (CFD) calculations. Because of the computational cost involved, CFD codes cannot yet be used for full-bundle predictions, but their capabilities can be utilized to develop more advanced and sophisticated models for subchannel-level analyses. A subchannel code equipped with improved physical models can then be a powerful tool for LWR safety and design evaluations. The unique contributions of this PhD research are the development, implementation, and qualification of an innovative spacer grid model that utilizes CFD results within the framework of a subchannel analysis code. Usually, the spacer grid models are mostly related to modeling of the entrainment and deposition phenomena and the heat

  16. GMXPBSA 2.1: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2015-01-01

    GMXPBSA 2.1 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, automatically calculating binding free energies for protein-protein or ligand-protein complexes [R.T. Bradshaw et al., Protein Eng. Des. Sel. 24 (2011) 197-207]. GMXPBSA 2.1 is flexible, can easily be customized to specific needs, and is an improvement of the previous GMXPBSA 2.0 [C. Paissoni et al., Comput. Phys. Commun. 185 (2014) 2920-2929]. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. The calculations require only the protein-protein or protein-ligand MD simulations as input. GMXPBSA 2.1 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes and comparison of the results with the wild-type system. Moreover, it compares the binding free energies of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinities for the same receptor utilizing MD simulations of the protein in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including the molecular mechanics (MM), electrostatic contribution to solvation (PB) and nonpolar contribution to solvation (SA) terms, the tool combines two freely available programs: the MD simulation software GROMACS [S. Pronk et al., Bioinformatics 29 (2013) 845-854] and the Poisson-Boltzmann equation solver APBS [N.A. Baker et al., Proc. Natl. Acad. Sci. U.S.A. 98 (2001) 10037-10041]. All the calculations can be performed in single or distributed automatic fashion on a cluster facility in order to increase the

  17. PolyCTLDesigner: a computational tool for constructing polyepitope T-cell antigens

    PubMed Central

    2013-01-01

    Background Construction of artificial polyepitope antigens is one of the most promising strategies for developing more efficient and safer vaccines that evoke T-cell immune responses. Epitope rearrangements and the utilization of certain spacer sequences have been proven to greatly influence the immunogenicity of polyepitope constructs. However, despite numerous efforts towards constructing and evaluating artificial polyepitope immunogens, and despite the many computational methods elaborated to date for predicting T-cell epitopes, TAP-binding peptides and antigen processing, only a few computational tools have so far been developed for the rational design of polyepitope antigens. Findings Here we present PolyCTLDesigner, a program intended for constructing polyepitope immunogens. Given a set of either known or predicted T-cell epitopes, the program selects N-terminal flanking sequences for each epitope to optimize its binding to TAP (if necessary) and joins the resulting oligopeptides into a polyepitope in a way that provides efficient liberation of the potential epitopes by proteasomal and/or immunoproteasomal processing. It also tries to minimize the number of non-target junctional epitopes resulting from the artificial juxtaposition of target epitopes within the polyepitope. For constructing polyepitopes, PolyCTLDesigner utilizes known amino acid patterns of TAP-binding and proteasomal/immunoproteasomal cleavage specificity together with genetic algorithm and graph theory approaches. The program was implemented in the Python programming language and can be used either interactively or through scripting, which allows users familiar with Python to create custom pipelines. Conclusions The developed software implements a rational approach to designing poly-CTL-epitope antigens and can be used to develop new candidate polyepitope vaccines. The current version of PolyCTLDesigner is integrated with our TEpredict program for predicting T-cell epitopes, and thus it
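
    The junction-minimization idea mentioned above can be illustrated with a toy ordering problem: choose the concatenation order of a handful of epitopes so that the seams introduce as few penalized motifs as possible. The Python sketch below is only that toy illustration; the penalty motifs, the exhaustive search and the example peptides are invented placeholders, whereas PolyCTLDesigner itself uses experimentally derived TAP-binding and proteasomal cleavage patterns together with genetic algorithm and graph theory approaches.

```python
# Toy illustration of the junction-minimization idea behind polyepitope design:
# order a set of epitopes so that the junctions created by concatenation incur
# as few "junctional epitope" penalties as possible. PolyCTLDesigner itself uses
# a genetic algorithm plus proteasomal/TAP specificity patterns; the penalty
# function below is a deliberately crude placeholder.
from itertools import permutations

def junction_penalty(left, right, bad_motifs=("KK", "RR", "KR", "RK")):
    """Count placeholder motifs in the 8-residue seam formed by joining left and right."""
    seam = left[-4:] + right[:4]
    return sum(seam.count(m) for m in bad_motifs)

def best_order(epitopes):
    """Exhaustive search over orderings (fine for a handful of epitopes)."""
    best, best_cost = None, float("inf")
    for order in permutations(epitopes):
        cost = sum(junction_penalty(a, b) for a, b in zip(order, order[1:]))
        if cost < best_cost:
            best, best_cost = order, cost
    return list(best), best_cost

# Example peptide strings (for illustration only):
epitopes = ["SIINFEKL", "RAKFKQLL", "GILGFVFTL", "KLVALGINAV"]
order, cost = best_order(epitopes)
print("-".join(order), "junction penalty:", cost)
```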

  18. SuccinSite: a computational tool for the prediction of protein succinylation sites by exploiting the amino acid patterns and properties.

    PubMed

    Hasan, Md Mehedi; Yang, Shiping; Zhou, Yuan; Mollah, Md Nurul Haque

    2016-03-01

    Lysine succinylation is an emerging protein post-translational modification that plays an important role in regulating cellular processes in both eukaryotic and prokaryotic cells. However, succinylation sites are particularly difficult to detect because the experimental technologies used are often time-consuming and costly. Thus, an accurate computational method for predicting succinylation sites may help researchers design their experiments and understand the molecular mechanism of succinylation. In this study, a novel computational tool termed SuccinSite has been developed to predict protein succinylation sites by incorporating three sequence encodings, i.e., k-spaced amino acid pairs, binary and amino acid index properties. A random forest classifier was then trained with these encodings to build the predictor. The SuccinSite predictor achieves an AUC score of 0.802 in 5-fold cross-validation and performs significantly better than existing predictors on a comprehensive independent test set. Furthermore, informative features and predominant rules (i.e. feature combinations) were extracted from the trained random forest model for an improved interpretation of the predictor. Finally, we also compiled a database covering 4411 experimentally verified succinylated proteins with 12 456 lysine succinylation sites. Taken together, these results suggest that SuccinSite will be a helpful computational resource for succinylation site prediction. The web-server, datasets, source code and database are freely available at http://systbio.cau.edu.cn/SuccinSite/. PMID:26739209
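
    As a rough illustration of the general scheme described above (sequence-window encoding followed by a random forest), the Python sketch below computes k-spaced amino-acid-pair counts for candidate windows and fits scikit-learn's RandomForestClassifier. It covers only one of the three encodings, and the two windows and labels are invented purely to make the example runnable; it is not SuccinSite's implementation.

```python
# Sketch of the general scheme: encode sequence windows around candidate lysines
# with k-spaced amino-acid-pair (CKSAAP) counts and train a random forest.
# Not SuccinSite's code; the toy windows and labels below are invented.
from itertools import product
import numpy as np
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAIRS = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]  # 400 ordered pairs

def cksaap(window, k_max=3):
    """Concatenate pair-count vectors for gaps k = 0..k_max."""
    features = []
    for k in range(k_max + 1):
        counts = dict.fromkeys(PAIRS, 0)
        for i in range(len(window) - k - 1):
            pair = window[i] + window[i + k + 1]
            if pair in counts:
                counts[pair] += 1
        features.extend(counts.values())
    return np.array(features, dtype=float)

# Two made-up 21-residue windows centred on a lysine (label 1 = succinylated).
windows = ["MKTAYIAKQRQISFVKSHFSR", "GASKLVQRRKLEDAKAMKTAY"]
labels = [1, 0]
X = np.vstack([cksaap(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("predicted probability of succinylation:", clf.predict_proba(X)[:, 1])
```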

  19. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail at which the process or plant is described, i.e., 1) plant level, 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases, established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water
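
    As an illustration of the plant-level benchmarking arithmetic described above (comparing a facility's energy and water intensity against a best-available reference case to obtain a savings estimate), the Python sketch below uses invented numbers and function names; it is not BEST-Dairy code and does not reflect its internal data or reference cases.

```python
# Illustration of plant-level benchmarking arithmetic: compare a plant's energy
# and water intensity against a best-practice reference case and turn the gap
# into a savings estimate. All numbers and names are invented for illustration.
def plant_level_savings(energy_use_kwh, water_use_m3, production_tonnes,
                        ref_energy_intensity, ref_water_intensity):
    """Return estimated annual energy (kWh) and water (m3) savings."""
    energy_intensity = energy_use_kwh / production_tonnes   # kWh per tonne
    water_intensity = water_use_m3 / production_tonnes      # m3 per tonne
    energy_savings = max(0.0, energy_intensity - ref_energy_intensity) * production_tonnes
    water_savings = max(0.0, water_intensity - ref_water_intensity) * production_tonnes
    return energy_savings, water_savings

# Hypothetical fluid-milk plant vs. a hypothetical best-practice reference:
e_save, w_save = plant_level_savings(
    energy_use_kwh=2.4e6, water_use_m3=9.0e4, production_tonnes=1.2e4,
    ref_energy_intensity=150.0, ref_water_intensity=5.5)
print(f"potential savings: {e_save:.0f} kWh, {w_save:.0f} m3 per year")
```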

  20. General theoretical/computational tool for interpreting NMR spin relaxation in proteins.

    PubMed

    Zerbetto, Mirco; Polimeno, Antonino; Meirovitch, Eva

    2009-10-15

    We developed in recent years the slowly relaxing local structure (SRLS) approach for analyzing NMR spin relaxation in proteins. SRLS is a two-body coupled rotator model which accounts rigorously for mode-coupling between the global motion of the protein and the local motion of the spin-bearing probe and allows for general properties of the second rank tensors involved. We showed that a general tool of data analysis requires both capabilities. Several important functionalities were missing in our previous implementations of SRLS in data fitting schemes, and in some important cases, the calculations were tedious. Here we present a general implementation which allows for asymmetric local and global diffusion tensors, distinct local ordering and local diffusion frames, and features a rhombic local potential which includes Wigner matrix element terms of ranks 2 and 4. A recently developed hydrodynamics-based approach for calculating global diffusion tensors has been incorporated into the data-fitting scheme. The computational efficiency of the latter has been increased significantly through object-oriented programming within the scope of the C++ programming language, and code parallelization. A convenient graphical user interface is provided. Currently autocorrelated (15)N spin relaxation data can be analyzed effectively. Adaptation to any autocorrelated and cross-correlated relaxation analysis is straightforward. New physical insight is gleaned on largely preserved local structure in solution, even in chain segments which experience slow local motion. Prospects associated with improved dynamic models, and new applications made possible by the current implementation of SRLS, are delineated. PMID:19775101