Science.gov

Sample records for accurate computational tools

  1. High-performance computing and networking as tools for accurate emission computed tomography reconstruction.

    PubMed

    Passeri, A; Formiconi, A R; De Cristofaro, M T; Pupi, A; Meldolesi, U

    1997-04-01

    It is well known that the quantitative potential of emission computed tomography (ECT) relies on the ability to compensate for resolution, attenuation and scatter effects. Reconstruction algorithms which are able to take these effects into account are highly demanding in terms of computing resources. The reported work aimed to investigate the use of a parallel high-performance computing platform for ECT reconstruction taking into account an accurate model of the acquisition of single-photon emission tomographic (SPET) data. An iterative algorithm with an accurate model of the variable system response was ported to the MIMD (Multiple Instruction Multiple Data) parallel architecture of a 64-node Cray T3D massively parallel computer. The system was organized to make it easily accessible even from low-cost PC-based workstations through standard TCP/IP networking. A complete brain study of 30 (64x64) slices could be reconstructed from a set of 90 (64x64) projections with ten iterations of the conjugate gradients algorithm in 9 s, corresponding to an actual speed-up factor of 135. This work demonstrated the possibility of exploiting remote high-performance computing and networking resources from hospital sites by means of low-cost workstations using standard communication protocols without particular problems for routine use. The achievable speed-up factors allow the assessment of the clinical benefit of advanced reconstruction techniques which require a heavy computational burden for the compensation of effects such as variable spatial resolution, scatter and attenuation. The possibility of using the same software on the same hardware platform with data acquired in different laboratories with various kinds of SPET instrumentation is appealing for software quality control and for the evaluation of the clinical impact of the reconstruction methods.
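
    A minimal, self-contained sketch of the kind of computation described above: ten conjugate-gradient iterations applied to the normal equations of a toy projection model. The system matrix, problem sizes and data below are made up for illustration (and smaller than the 64x64 study in the abstract); the paper's accurate SPET system model and its Cray T3D parallelization are not reproduced.

```python
# Toy sketch of iterative ECT reconstruction with conjugate gradients.
# Sizes, system matrix and data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_proj = 32 * 32, 90 * 32                 # one small slice, 90 views x 32 bins
A = rng.random((n_proj, n_pix)) * (rng.random((n_proj, n_pix)) < 0.05)  # sparse-ish system matrix
x_true = rng.random(n_pix)
b = A @ x_true                                   # simulated projection data

def cg_normal_equations(A, b, n_iter=10):
    """Solve A^T A x = A^T b with plain conjugate gradients (no preconditioner)."""
    x = np.zeros(A.shape[1])
    r = A.T @ b - A.T @ (A @ x)
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = A.T @ (A @ p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_rec = cg_normal_equations(A, b, n_iter=10)     # ten CG iterations, as in the abstract
print("relative residual:", np.linalg.norm(A @ x_rec - b) / np.linalg.norm(b))
```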

  2. CAFE: A Computer Tool for Accurate Simulation of the Regulatory Pool Fire Environment for Type B Packages

    SciTech Connect

    Gritzo, L.A.; Koski, J.A.; Suo-Anttila, A.J.

    1999-03-16

    The Container Analysis Fire Environment computer code (CAFE) is intended to provide Type B package designers with an enhanced engulfing fire boundary condition when combined with the PATRAN/P-Thermal commercial code. Historically, an engulfing fire boundary condition has been modeled as σT⁴, where σ is the Stefan-Boltzmann constant and T is the fire temperature. The CAFE code includes the necessary chemistry, thermal radiation, and fluid mechanics to model an engulfing fire. Effects included are the local cooling of gases that form a protective boundary layer that reduces the incoming radiant heat flux to values lower than expected from a simple σT⁴ model. In addition, the effect of object shape on mixing that may increase the local fire temperature is included. Both high and low temperature regions that depend upon the local availability of oxygen are also calculated. Thus the competing effects that can both increase and decrease the local values of radiant heat flux are included in a manner that is not predictable a priori. The CAFE package consists of a group of computer subroutines that can be linked to workstation-based thermal analysis codes in order to predict package performance during regulatory and other accident fire scenarios.
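
    For reference, a tiny sketch of the historical σT⁴ boundary condition that CAFE improves upon; the 800 °C fire temperature is an assumption used only for illustration, and none of CAFE's chemistry, radiation or fluid mechanics is shown.

```python
# Minimal sketch of the simple sigma*T^4 engulfing-fire boundary condition.
# The fire temperature below is an assumed value for illustration only.
SIGMA = 5.670374419e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T_kelvin):
    """Radiant heat flux of an idealized blackbody fire at temperature T."""
    return SIGMA * T_kelvin**4

T_fire = 800.0 + 273.15         # assumed pool-fire temperature (~800 C)
print(f"simple sigma*T^4 incident flux: {blackbody_flux(T_fire) / 1000:.1f} kW/m^2")
```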

  3. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  4. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives highly accurate predictions, with errors below 0.0012 K, for light sources with CCTs ranging from 500 K to 10⁶ K.
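
    The following is a hedged sketch of the Newton iteration the abstract describes, applied to a placeholder objective with a single minimum. The real CIE objective (distance from the test chromaticity to the Planckian locus, built from 1 nm tristimulus summations) and Robertson's initial estimate are not reproduced; the `objective` function and the seed value below are assumptions.

```python
# Sketch of Newton's method for CCT: iterate T <- T - d'(T)/d''(T) on an
# objective d(T). The objective here is a stand-in with a minimum at an
# assumed T_star; derivatives are taken by central finite differences.
import numpy as np

def objective(T, T_star=6504.0):
    # placeholder "distance to the Planckian locus" (assumed form)
    return (1.0 / T - 1.0 / T_star) ** 2

def newton_min(f, T0, h=0.1, tol=1e-6, max_iter=50):
    T = T0
    for _ in range(max_iter):
        d1 = (f(T + h) - f(T - h)) / (2 * h)            # first derivative
        d2 = (f(T + h) - 2 * f(T) + f(T - h)) / h ** 2  # second derivative
        step = d1 / d2
        T -= step
        if abs(step) < tol:
            break
    return T

T0 = 6000.0          # initial CCT estimate, standing in for Robertson's method
print("estimated CCT:", newton_min(objective, T0))
```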

  5. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  6. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic "machine," is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking, but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems, and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems, which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

  7. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
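
    As a rough illustration of one of the measurements listed (the convergence power spectrum), here is a plain-numpy sketch; it is not the LensTools API, and the map size, field of view, binning and normalization conventions are illustrative assumptions.

```python
# Plain-numpy sketch of an azimuthally averaged power spectrum of a square
# convergence map. Normalization conventions are schematic; this is NOT the
# LensTools interface.
import numpy as np

def convergence_power_spectrum(kappa, side_deg, n_bins=20):
    """Return multipole bin centers and binned power spectrum of a square map."""
    n = kappa.shape[0]
    side_rad = np.deg2rad(side_deg)
    kappa_ft = np.fft.fft2(kappa) * (side_rad / n) ** 2      # continuum FT convention
    power_2d = np.abs(kappa_ft) ** 2 / side_rad ** 2
    ell = 2.0 * np.pi * np.fft.fftfreq(n, d=side_rad / n)    # multipoles per axis
    ell_grid = np.sqrt(ell[:, None] ** 2 + ell[None, :] ** 2)
    bins = np.linspace(ell_grid[ell_grid > 0].min(), ell_grid.max(), n_bins + 1)
    which = np.digitize(ell_grid.ravel(), bins)
    p1d = np.array([power_2d.ravel()[which == i].mean() if np.any(which == i) else 0.0
                    for i in range(1, n_bins + 1)])
    return 0.5 * (bins[1:] + bins[:-1]), p1d

kappa = np.random.default_rng(1).normal(size=(256, 256))     # fake noise-only map
ell_centers, p_ell = convergence_power_spectrum(kappa, side_deg=3.5)
print(ell_centers[:3], p_ell[:3])
```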

  8. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  9. Accurate atom-mapping computation for biochemical reactions.

    PubMed

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to take into account the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
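
    A greatly simplified stand-in for the core of atom mapping, shown only to make the problem concrete: assign reactant atoms to product atoms of the same element while minimizing a made-up cost matrix with the Hungarian algorithm. The paper's minimum weighted edit-distance MILP and bond-propensity weights are not reproduced.

```python
# Simplified stand-in for atom mapping as a constrained minimum-cost assignment.
# Costs are made-up numbers; element mismatches are forbidden with a large cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

reactant_elems = ["C", "C", "O", "H", "H"]      # toy reactant atoms
product_elems  = ["C", "O", "C", "H", "H"]      # toy product atoms
rng = np.random.default_rng(0)
cost = rng.random((len(reactant_elems), len(product_elems)))   # hypothetical edit costs

for i, ei in enumerate(reactant_elems):
    for j, ej in enumerate(product_elems):
        if ei != ej:
            cost[i, j] = 1e9                    # forbid cross-element mappings

rows, cols = linear_sum_assignment(cost)        # Hungarian algorithm
print("atom mapping (reactant index -> product index):", dict(zip(rows, cols)))
```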

  10. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can at present efficiently eliminate or reduce it to a satisfactory level without sacrificing computing efficiency. In order to solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real-time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion while considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
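
    A small sketch of the interpolation step described above: in each interpolation period the curve parameter increment is taken as a root of a quartic equation. The quartic coefficients below are placeholders; deriving them from an actual NURBS curve and feedrate command is not shown.

```python
# Sketch: per interpolation period, solve c4*du^4 + c3*du^3 + c2*du^2 + c1*du + c0 = 0
# and take the smallest positive real root as the parameter increment du.
import numpy as np

def next_parameter_increment(quartic_coeffs):
    """Return the smallest positive real root of the quartic, or None."""
    roots = np.roots(quartic_coeffs)
    real_pos = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0]
    return min(real_pos) if real_pos else None

# hypothetical coefficients [c4, c3, c2, c1, c0] for one interpolation period
coeffs = [2.0e3, -1.5e2, 4.0, 1.0, -2.0e-4]
print("parameter increment for this period:", next_parameter_increment(coeffs))
```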

  11. Fast, accurate, robust and Open Source Brain Extraction Tool (OSBET)

    NASA Astrophysics Data System (ADS)

    Namias, R.; Donnelly Kehoe, P.; D'Amato, J. P.; Nagel, J.

    2015-12-01

    The removal of non-brain regions in neuroimaging is a critical preprocessing task. Skull-stripping depends on different factors, including the noise level in the image, the anatomy of the subject being scanned and the acquisition sequence. For these and other reasons, an ideal brain extraction method should be fast, accurate, user friendly, open-source and knowledge based (to allow interaction with the algorithm in case the expected outcome is not obtained), producing stable results and making it possible to automate the process for large datasets. There are already a large number of validated tools to perform this task, but none of them meets all the desired characteristics. In this paper we introduce an open source brain extraction tool (OSBET), composed of four steps that use simple, well-known operations (optimal thresholding, binary morphology, labeling and geometrical analysis) and that aims to assemble all the desired features. We present an experiment comparing OSBET with six other state-of-the-art techniques against a publicly available dataset consisting of 40 T1-weighted 3D scans and their corresponding manually segmented images. OSBET achieved both a short execution time and excellent accuracy, obtaining the best Dice coefficient. Further validation should be performed, for instance in unhealthy populations, to generalize its usage for clinical purposes.
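
    A minimal sketch of the kind of pipeline the abstract lists (optimal thresholding, binary morphology, labeling, geometric selection), applied to a fake volume; it is not the OSBET implementation, and the threshold choice, structuring element and component-selection rule are assumptions.

```python
# Rough skull-stripping-style pipeline on a fake 3D volume: Otsu threshold,
# binary opening, connected-component labeling, keep the largest component.
import numpy as np
from scipy import ndimage

def otsu_threshold(volume, n_bins=256):
    """Classic Otsu threshold from the intensity histogram."""
    hist, edges = np.histogram(volume, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist).astype(float)
    w1 = w0[-1] - w0
    cum_weighted = np.cumsum(hist * centers)
    m0 = cum_weighted / np.maximum(w0, 1e-12)
    m1 = (cum_weighted[-1] - cum_weighted) / np.maximum(w1, 1e-12)
    between = w0[:-1] * w1[:-1] * (m0[:-1] - m1[:-1]) ** 2
    return centers[np.argmax(between)]

def rough_brain_mask(volume):
    mask = volume > otsu_threshold(volume)                 # optimal thresholding
    mask = ndimage.binary_opening(mask, iterations=2)      # binary morphology
    labels, n = ndimage.label(mask)                        # labeling
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)                # keep largest component

fake_t1 = np.random.default_rng(2).normal(100, 20, size=(64, 64, 64))
mask = rough_brain_mask(fake_t1)
print("mask voxels:", int(mask.sum()))
```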

  12. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both ⁴He and ¹²C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  13. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features.

  14. CgWind: A high-order accurate simulation tool for wind turbines and wind farms

    SciTech Connect

    Chand, K K; Henshaw, W D; Lundquist, K A; Singer, M A

    2010-02-22

    CgWind is a high-fidelity large eddy simulation (LES) tool designed to meet the modeling needs of wind turbine and wind park engineers. This tool combines several advanced computational technologies in order to model accurately the complex and dynamic nature of wind energy applications. The composite grid approach provides high-quality structured grids for the efficient implementation of high-order accurate discretizations of the incompressible Navier-Stokes equations. Composite grids also provide a natural mechanism for modeling bodies in relative motion and complex geometry. Advanced algorithms such as matrix-free multigrid, compact discretizations and approximate factorization will allow CgWind to perform highly resolved calculations efficiently on a wide class of computing resources. Also in development are nonlinear LES subgrid-scale models required to simulate the many interacting scales present in large wind turbine applications. This paper outlines our approach, the current status of CgWind and future development plans.

  15. Foundational Tools for Petascale Computing

    SciTech Connect

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  16. Preparing Rapid, Accurate Construction Cost Estimates with a Personal Computer.

    ERIC Educational Resources Information Center

    Gerstel, Sanford M.

    1986-01-01

    An inexpensive and rapid method for preparing accurate cost estimates of construction projects in a university setting, using a personal computer, purchased software, and one estimator, is described. The case against defined estimates, the rapid estimating system, and adjusting standard unit costs are discussed. (MLW)

  17. Efficient and accurate computation of the incomplete Airy functions

    NASA Technical Reports Server (NTRS)

    Constantinides, E. D.; Marhefka, R. J.

    1993-01-01

    The incomplete Airy integrals serve as canonical functions for the uniform ray optical solutions to several high-frequency scattering and diffraction problems that involve a class of integrals characterized by two stationary points that are arbitrarily close to one another or to an integration endpoint. Integrals with such analytical properties describe transition region phenomena associated with composite shadow boundaries. An efficient and accurate method for computing the incomplete Airy functions would make the solutions to such problems useful for engineering purposes. In this paper a convergent series solution for the incomplete Airy functions is derived. Asymptotic expansions involving several terms are also developed and serve as large argument approximations. The combination of the series solution with the asymptotic formulae provides for an efficient and accurate computation of the incomplete Airy functions. Validation of accuracy is accomplished using direct numerical integration data.

  18. Accurate and fast computation of transmission cross coefficients

    NASA Astrophysics Data System (ADS)

    Apostol, Štefan; Hurley, Paul; Ionescu, Radu-Cristian

    2015-03-01

    Precise and fast computation of aerial images is essential. Typical lithographic simulators employ a Köhler illumination system for which aerial imagery is obtained using a large number of Transmission Cross Coefficients (TCCs). These are generally computed by a slow numerical evaluation of a double integral. We review the general framework in which the 2D imagery is solved and then propose a fast and accurate method to obtain the TCCs. We obtain analytical solutions and thus avoid the complexity-accuracy trade-off encountered with numerical integration. Compared to other analytical integration methods, the one presented is faster, more general and more tractable.
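
    To make the baseline concrete, here is a discrete-sum sketch of the double integral that defines a single TCC in the Hopkins formulation; the circular source (partial coherence σ) and ideal circular pupil are assumptions, and the paper's fast analytical method is not reproduced.

```python
# Discrete-sum sketch of one Transmission Cross Coefficient:
#   TCC(f1, f2) ~ sum_f J(f) P(f + f1) conj(P(f + f2)) / sum_f J(f)
# with an assumed circular source J and ideal circular pupil P (NA-normalized).
import numpy as np

def tcc_numeric(f1, f2, sigma=0.7, n=201, extent=2.0):
    fx, fy = np.meshgrid(np.linspace(-extent, extent, n), np.linspace(-extent, extent, n))
    source = (fx**2 + fy**2 <= sigma**2).astype(float)              # Koehler source J(f)
    pupil = lambda ax, ay: ((ax**2 + ay**2) <= 1.0).astype(float)    # ideal circular pupil
    integrand = source * pupil(fx + f1[0], fy + f1[1]) * np.conj(pupil(fx + f2[0], fy + f2[1]))
    return integrand.sum() / source.sum()                            # normalized double "integral"

print("TCC((0,0),(0,0)) =", tcc_numeric((0.0, 0.0), (0.0, 0.0)))
print("TCC((0.3,0),(0.5,0)) =", tcc_numeric((0.3, 0.0), (0.5, 0.0)))
```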

  19. An Accurate and Efficient Method of Computing Differential Seismograms

    NASA Astrophysics Data System (ADS)

    Hu, S.; Zhu, L.

    2013-12-01

    Inversion of seismic waveforms for Earth structure usually requires computing partial derivatives of seismograms with respect to velocity model parameters. We developed an accurate and efficient method to calculate differential seismograms for multi-layered elastic media, based on the Thomson-Haskell propagator matrix technique. We first derived the partial derivatives of the Haskell matrix and its compound matrix with respect to the layer parameters (P wave velocity, shear wave velocity and density). We then derived the partial derivatives of surface displacement kernels in the frequency-wavenumber domain. The differential seismograms are obtained by using the frequency-wavenumber double integration method. The implementation is computationally efficient and the total computing time is proportional to the time of computing the seismogram itself, i.e., independent of the number of layers in the model. We verified the correctness of the results by comparing them with differential seismograms computed using the finite-difference method. Our results are more accurate because of the analytical nature of the derived partial derivatives.
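
    A generic sketch of the finite-difference check mentioned at the end of the abstract; `forward_model` below is a stand-in for a layered-medium seismogram computation, not the authors' propagator-matrix code.

```python
# Central finite-difference derivative of a synthetic "seismogram" with respect
# to one model parameter, the kind of reference used to verify analytic
# differential seismograms. The forward model here is a made-up stand-in.
import numpy as np

def forward_model(vs):
    """Fake seismogram depending nonlinearly on a shear-wave velocity vs."""
    t = np.linspace(0.0, 10.0, 500)
    return np.sin(2.0 * np.pi * t / vs) * np.exp(-t / vs)

def numerical_partial(model_func, param, rel_step=1e-5):
    """Central finite-difference derivative with respect to one parameter."""
    h = rel_step * abs(param)
    return (model_func(param + h) - model_func(param - h)) / (2.0 * h)

d_num = numerical_partial(forward_model, param=3.5)
# An analytic derivative, when available, would be compared against d_num, e.g.:
# assert np.allclose(d_analytic, d_num, rtol=1e-4)
print("max |d seismogram / d vs|:", float(np.max(np.abs(d_num))))
```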

  20. Computational Time-Accurate Body Movement: Methodology, Validation, and Application

    DTIC Science & Technology

    1995-10-01

    The wing used had a leading-edge sweep angle of 45 deg and a NACA 64A010 symmetrical airfoil section, and the pylon had a symmetrical cross section. Validation of the time-accurate store trajectory prediction process included comparisons of the computational results to data for a NACA 0012 airfoil following a predefined pitching motion.

  1. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    SciTech Connect

    Nakhleh, Luay

    2014-03-12

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  2. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of the meteors' velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the computed meteoroid orbits and theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is thus imperative to find ways to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integration using the Borovicka et al. 2007 model. Different optimization techniques have also been investigated in order to pick the most suitable one to solve the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of different velocity computations seems to show that, while the MPF is by far the best method to solve for the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimate errors for noisy
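
    As a toy illustration of multi-parameter fitting in this spirit, the sketch below fits position along the trail versus time with a simple constant-deceleration model; the model, noise level and synthetic data are assumptions, and neither Gural's propagation models nor the CABERNET astrometry are reproduced.

```python
# Toy multi-parameter fit of a meteor's motion: position along the trail vs time,
# constant-deceleration model, nonlinear least squares. All numbers are synthetic.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
t = np.linspace(0.0, 0.6, 40)                          # seconds of observation
s_true = 35.0e3 * t - 0.5 * 4.0e3 * t**2               # 35 km/s entry, 4 km/s^2 deceleration
s_obs = s_true + rng.normal(0.0, 20.0, t.size)          # 20 m measurement noise

def residuals(p, t, s):
    s0, v0, a = p
    return s0 + v0 * t - 0.5 * a * t**2 - s

fit = least_squares(residuals, x0=[0.0, 30.0e3, 1.0e3], args=(t, s_obs))
s0, v0, a = fit.x
print(f"fitted pre-atmospheric velocity: {v0 / 1000:.2f} km/s (true 35.00 km/s)")
```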

  3. IVUSAngio tool: a publicly available software for fast and accurate 3D reconstruction of coronary arteries.

    PubMed

    Doulaverakis, Charalampos; Tsampoulatidis, Ioannis; Antoniadis, Antonios P; Chatzizisis, Yiannis S; Giannopoulos, Andreas; Kompatsiaris, Ioannis; Giannoglou, George D

    2013-11-01

    There is an ongoing research and clinical interest in the development of reliable and easily accessible software for the 3D reconstruction of coronary arteries. In this work, we present the architecture and validation of IVUSAngio Tool, an application which performs fast and accurate 3D reconstruction of the coronary arteries by using intravascular ultrasound (IVUS) and biplane angiography data. The 3D reconstruction is based on the fusion of the detected arterial boundaries in IVUS images with the 3D IVUS catheter path derived from the biplane angiography. The IVUSAngio Tool suite integrates all the intermediate processing and computational steps and provides a user-friendly interface. It also offers additional functionality, such as automatic selection of the end-diastolic IVUS images, semi-automatic and automatic IVUS segmentation, vascular morphometric measurements, graphical visualization of the 3D model and export in a format compatible with other computer-aided design applications. Our software was applied and validated in 31 human coronary arteries yielding quite promising results. Collectively, the use of IVUSAngio Tool significantly reduces the total processing time for 3D coronary reconstruction. IVUSAngio Tool is distributed as free software, publicly available to download and use.

  4. A GPU tool for efficient, accurate, and realistic simulation of cone beam CT projections

    PubMed Central

    Jia, Xun; Yan, Hao; Cerviño, Laura; Folkerts, Michael; Jiang, Steve B.

    2012-01-01

    conducted to calibrate gDRR against a real CBCT scanner. The calculated projections are accurate and realistic, such that beam-hardening artifacts and scatter artifacts can be reproduced using the simulated projections. The noise amplitudes in the CBCT images reconstructed from the simulated projections also agree with those in the measured images at corresponding mAs levels. Conclusions: A GPU computational tool, gDRR, has been developed for the accurate and efficient simulations of x-ray projections of CBCT with realistic configurations. PMID:23231286

  5. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon; Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (next to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.
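
    A minimal sketch of the simplest member of this model family, a fluctuating-charge (electronegativity-equalization) solve with a charge-conservation constraint; it is not the ACKS2 model itself, and the electronegativities and hardness matrix below are made-up numbers.

```python
# Generic fluctuating-charge sketch: minimize E(q) = chi.q + 1/2 q.J.q subject to
# sum(q) = Q_total via one Lagrange-multiplier linear solve. Parameters are
# hypothetical; this is not ACKS2.
import numpy as np

chi = np.array([0.3, -0.2, -0.1])         # hypothetical atomic electronegativities (a.u.)
J = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.1, 0.3],
              [0.1, 0.3, 0.9]])            # hypothetical hardness / interaction matrix
Q_total = 0.0                              # neutral molecule

n = len(chi)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = J
A[:n, n] = 1.0                             # Lagrange multiplier column (charge constraint)
A[n, :n] = 1.0
b = np.concatenate([-chi, [Q_total]])

solution = np.linalg.solve(A, b)
charges = solution[:n]
print("fluctuating charges:", charges, " sum =", charges.sum())
```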

  6. An Accurate and Dynamic Computer Graphics Muscle Model

    NASA Technical Reports Server (NTRS)

    Levine, David Asher

    1997-01-01

    A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.

  7. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  8. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  9. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  10. Photoacoustic computed tomography without accurate ultrasonic transducer responses

    NASA Astrophysics Data System (ADS)

    Sheng, Qiwei; Wang, Kun; Xia, Jun; Zhu, Liren; Wang, Lihong V.; Anastasio, Mark A.

    2015-03-01

    Conventional photoacoustic computed tomography (PACT) image reconstruction methods assume that the object and surrounding medium are described by a constant speed-of-sound (SOS) value. In order to accurately recover fine structures, SOS heterogeneities should be quantified and compensated for during PACT reconstruction. To address this problem, several groups have proposed hybrid systems that combine PACT with ultrasound computed tomography (USCT). In such systems, a SOS map is reconstructed first via USCT. Consequently, this SOS map is employed to inform the PACT reconstruction method. Additionally, the SOS map can provide structural information regarding tissue, which is complementary to the functional information from the PACT image. We propose a paradigm shift in the way that images are reconstructed in hybrid PACT-USCT imaging. Inspired by our observation that information about the SOS distribution is encoded in PACT measurements, we propose to jointly reconstruct the absorbed optical energy density and SOS distributions from a combined set of USCT and PACT measurements, thereby reducing the two reconstruction problems into one. This innovative approach has several advantages over conventional approaches in which PACT and USCT images are reconstructed independently: (1) Variations in the SOS will automatically be accounted for, optimizing PACT image quality; (2) The reconstructed PACT and USCT images will possess minimal systematic artifacts because errors in the imaging models will be optimally balanced during the joint reconstruction; (3) Due to the exploitation of information regarding the SOS distribution in the full-view PACT data, our approach will permit high-resolution reconstruction of the SOS distribution from sparse array data.

  11. Groupware: A Tool for Interpersonal Computing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McLellan, Hilary

    Computer networks have provided a foundation for interpersonal computing, and new tools are emerging, the centerpiece of which is called "groupware." Groupware technology is reviewed, and the theoretical framework that will underlie interpersonal collaborative computing is discussed. Groupware can consist of hardware, software, services,…

  12. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.

    PubMed

    Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-03-30

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving to the user full control on initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, performances of best combinations and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu.
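
    A small sketch of the core computation described above, enumerating marker combinations and computing sensitivity and specificity for each; the synthetic data, per-marker cutoffs and the "positive if any marker exceeds its cutoff" rule are assumptions, not the CombiROC implementation.

```python
# Enumerate marker combinations and compute sensitivity/specificity for each,
# under a simple "any marker above its cutoff" positivity rule. Data and cutoffs
# are synthetic and illustrative only.
import itertools
import numpy as np

rng = np.random.default_rng(4)
n_samples, markers = 100, ["m1", "m2", "m3", "m4"]
y = rng.integers(0, 2, n_samples)                                   # 1 = diseased
X = rng.normal(0, 1, (n_samples, len(markers))) + 0.8 * y[:, None]  # markers shifted in disease
cutoffs = X.mean(axis=0)                                            # naive per-marker cutoffs

results = {}
for k in range(1, len(markers) + 1):
    for combo in itertools.combinations(range(len(markers)), k):
        cols = list(combo)
        called_pos = (X[:, cols] > cutoffs[cols]).any(axis=1)
        sens = (called_pos & (y == 1)).sum() / (y == 1).sum()
        spec = (~called_pos & (y == 0)).sum() / (y == 0).sum()
        results[tuple(markers[i] for i in cols)] = (sens, spec)

best = max(results, key=lambda c: min(results[c]))   # e.g. best worst-case of sens/spec
print("best combination:", best, results[best])
```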

  13. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data

    PubMed Central

    Mazzara, Saveria; Rossi, Riccardo L.; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-01-01

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving to the user full control on initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, performances of best combinations and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu. PMID:28358118

  14. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    SciTech Connect

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; Wang, Zhong

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
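
    A short sketch of one of the two signals MetaBAT combines, the tetranucleotide frequency profile of a contig; reverse-complement canonicalization and MetaBAT's empirical probabilistic distance are omitted.

```python
# Raw tetranucleotide frequency (TNF) vector of a contig: counts of each of the
# 256 possible 4-mers, normalized to frequencies. Simplified for illustration.
from collections import Counter
from itertools import product

def tetranucleotide_frequencies(sequence):
    seq = sequence.upper()
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    kmers = ["".join(k) for k in product("ACGT", repeat=4)]     # 256 possible 4-mers
    total = sum(counts[k] for k in kmers) or 1
    return {k: counts[k] / total for k in kmers}

tnf = tetranucleotide_frequencies("ATGCGTACGTTAGCATGCGTACGTTAGC")
print(round(sum(tnf.values()), 6), tnf["ATGC"])
```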

  15. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; ...

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.

  16. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems is being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  17. Computer Simulations: An Integrating Tool.

    ERIC Educational Resources Information Center

    Bilan, Bohdan J.

    This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…

  18. Methods for Efficiently and Accurately Computing Quantum Mechanical Free Energies for Enzyme Catalysis.

    PubMed

    Kearns, F L; Hudson, P S; Boresch, S; Woodcock, H L

    2016-01-01

    Enzyme activity is inherently linked to free energies of transition states, ligand binding, protonation/deprotonation, etc.; these free energies, and thus enzyme function, can be affected by residue mutations, allosterically induced conformational changes, and much more. Therefore, being able to predict free energies associated with enzymatic processes is critical to understanding and predicting their function. Free energy simulation (FES) has historically been a computational challenge as it requires both the accurate description of inter- and intramolecular interactions and adequate sampling of all relevant conformational degrees of freedom. The hybrid quantum mechanical molecular mechanical (QM/MM) framework is the current tool of choice when accurate computations of macromolecular systems are essential. Unfortunately, robust and efficient approaches that employ the high levels of computational theory needed to accurately describe many reactive processes (ie, ab initio, DFT), while also including explicit solvation effects and accounting for extensive conformational sampling are essentially nonexistent. In this chapter, we will give a brief overview of two recently developed methods that mitigate several major challenges associated with QM/MM FES: the QM non-Boltzmann Bennett's acceptance ratio method and the QM nonequilibrium work method. We will also describe usage of these methods to calculate free energies associated with (1) relative properties and (2) along reaction paths, using simple test cases with relevance to enzymes examples.
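
    A minimal sketch of the exponential work average that underlies nonequilibrium-work free energy estimators (the Jarzynski relation); the work values are synthetic, and the chapter's QM/MM switching protocol and non-Boltzmann Bennett method are not reproduced.

```python
# Jarzynski-style estimate dF = -kT * ln < exp(-W/kT) > from a set of
# nonequilibrium work values, computed in a log-sum-exp form for stability.
# Work values below are synthetic placeholders.
import numpy as np

kT = 0.593                                        # kcal/mol at ~298 K
rng = np.random.default_rng(5)
work = rng.normal(loc=2.0, scale=0.8, size=200)   # fake nonequilibrium work values (kcal/mol)

w_scaled = -work / kT
dF = -kT * (np.log(np.mean(np.exp(w_scaled - w_scaled.max()))) + w_scaled.max())
print(f"Jarzynski estimate of dF: {dF:.2f} kcal/mol")
```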

  19. Accurate real-time depth control for CP-SSOCT distal sensor based handheld microsurgery tools

    PubMed Central

    Cheon, Gyeong Woo; Huang, Yong; Cha, Jaepyeng; Gehlbach, Peter L.; Kang, Jin U.

    2015-01-01

    This paper presents a novel intuitive targeting and tracking scheme that utilizes a common-path swept source optical coherence tomography (CP-SSOCT) distal sensor integrated handheld microsurgical tool. To achieve micron-order precision control, a reliable and accurate OCT distal sensing method is required; simultaneously, a prediction algorithm is necessary to compensate for the system delay associated with the computational, mechanical and electronic latencies. Due to the multi-layered structure of the retina, it is necessary to develop effective surface detection methods rather than simple peak detection. To achieve this, a shifted cross-correlation method is applied for surface detection in order to increase robustness and accuracy in distal sensing. A predictor based on a Kalman filter was implemented for more precise motion compensation. The performance was first evaluated using an established dry phantom consisting of stacked cellophane tape. This was followed by evaluation in an ex-vivo bovine retina model to assess system accuracy and precision. The results demonstrate highly accurate depth targeting with less than 5 μm RMSE depth locking. PMID:26137393
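
    A small sketch of the shifted cross-correlation idea: estimate how far the surface has moved between two consecutive A-scans from the lag that maximizes their cross-correlation; the A-scans are synthetic, and the CP-SSOCT processing chain and Kalman-filter predictor are not reproduced.

```python
# Estimate the depth shift between two consecutive (synthetic) A-scans by
# maximizing their normalized cross-correlation over a range of lags.
import numpy as np

def surface_shift(ascan_ref, ascan_new, max_lag=50):
    """Return the integer pixel shift of ascan_new relative to ascan_ref."""
    ref = (ascan_ref - ascan_ref.mean()) / ascan_ref.std()
    new = (ascan_new - ascan_new.mean()) / ascan_new.std()
    lags = np.arange(-max_lag, max_lag + 1)
    scores = [np.dot(ref[max(0, -l):len(ref) - max(0, l)],
                     new[max(0, l):len(new) - max(0, -l)]) for l in lags]
    return lags[int(np.argmax(scores))]

depth = np.arange(1024)
profile = np.exp(-0.5 * ((depth - 400) / 8.0) ** 2)      # fake surface peak at pixel 400
shifted = np.exp(-0.5 * ((depth - 412) / 8.0) ** 2)      # surface moved 12 pixels deeper
print("estimated surface shift (pixels):", surface_shift(profile, shifted))
```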

  20. Computational Tools for Genomic Studies in Plants.

    PubMed

    Martinez, Manuel

    2016-12-01

    In recent years, the genomic sequences of numerous plant species, including the main crop species, have been determined. Computational tools have been developed to deal with the issue of which plants have been sequenced and where the sequences are hosted. In this mini-review, the databases for genome projects, the databases created to host species/clade projects and the databases developed to perform plant comparative genomics are revised. Because of their importance in modern research, an in-depth analysis of the plant comparative genomics databases has been performed. This comparative analysis is focused on the common and specific computational tools developed to achieve the particular objectives of each database. In addition, emerging high-performance bioinformatics tools specific to plant research are highlighted. Finally, the kinds of computational approaches that should be implemented in coming years to efficiently analyze plant genomes are discussed.

  1. Towards accurate quantum simulations of large systems with small computers

    NASA Astrophysics Data System (ADS)

    Yang, Yonggang

    2017-01-01

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

  2. Towards accurate quantum simulations of large systems with small computers.

    PubMed

    Yang, Yonggang

    2017-01-24

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

  3. Towards accurate quantum simulations of large systems with small computers

    PubMed Central

    Yang, Yonggang

    2017-01-01

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems. PMID:28117366

  4. Graphical arterial blood gas visualization tool supports rapid and accurate data interpretation.

    PubMed

    Doig, Alexa K; Albert, Robert W; Syroid, Noah D; Moon, Shaun; Agutter, Jim A

    2011-04-01

    A visualization tool that integrates numeric information from an arterial blood gas report with novel graphics was designed for the purpose of promoting rapid and accurate interpretation of acid-base data. A study compared data interpretation performance when arterial blood gas results were presented in a traditional numerical list versus the graphical visualization tool. Critical-care nurses (n = 15) and nursing students (n = 15) were significantly more accurate identifying acid-base states and assessing trends in acid-base data when using the graphical visualization tool. Critical-care nurses and nursing students using traditional numerical data had an average accuracy of 69% and 74%, respectively. Using the visualization tool, average accuracy improved to 83% for critical-care nurses and 93% for nursing students. Analysis of response times demonstrated that the visualization tool might help nurses overcome the "speed/accuracy trade-off" during high-stress situations when rapid decisions must be rendered. Perceived mental workload was significantly reduced for nursing students when they used the graphical visualization tool. In this study, the effects of implementing the graphical visualization were greater for nursing students than for critical-care nurses, which may indicate that the experienced nurses needed more training and use of the new technology prior to testing to show similar gains. Results of the objective and subjective evaluations support the integration of this graphical visualization tool into clinical environments that require accurate and timely interpretation of arterial blood gas data.

  5. Mapping methods for computationally efficient and accurate structural reliability

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    Mapping methods are developed to improve the accuracy and efficiency of probabilistic structural analyses with coarse finite element meshes. The mapping methods consist of: (1) deterministic structural analyses with fine (convergent) finite element meshes, (2) probabilistic structural analyses with coarse finite element meshes, (3) the relationship between the probabilistic structural responses from the coarse and fine finite element meshes, and (4) a probabilistic mapping. The results show that the scatter of the probabilistic structural responses and structural reliability can be accurately predicted using a coarse finite element model with proper mapping methods. Therefore, large structures can be analyzed probabilistically using finite element methods.

  6. Computational tools for the modern andrologist.

    PubMed

    Niederberger, C

    1996-01-01

    With such a wide array of computational tools to solve inference problems, andrologists and their mathematical or statistical collaborators face perhaps bewildering choices. It is tempting to criticize a method with which one is unfamiliar for its apparent complexity. Yet, many methods are quite elegant; neural computation uses nature's own best biological classifier, for example, and genetic algorithms apply rules of natural selection. Computer scientists will likely find no one single best inference engine to solve all classification problems. Rather, the modeler should choose the most appropriate computational tool based on the specific nature of a problem. If the problem can be separated into obvious components, a Markov chain may be useful. If the andrologist would like to encode a well-known clinical algorithm into the computer, the programmer may use an expert system. Once a modeler builds an inference engine, that engine is not truly useful until other andrologists use it to make inferences with their own data. Because a wide variety of computer hardware and software exists, it is a significant endeavor to translate, or "port," software designed and built on one machine to many other different computers. Fortunately, the World Wide Web offers a means by which computational tools may be made directly available to multiple users on many different systems, or "platforms." The World Wide Web refers to a standardization of information traffic on the global computer network, the Internet. The Internet is simply the linkage of many computers worldwide by computer operators who have chosen to allow other users access to their systems. Because many different types of computers exist, until recently only communication in very rudimentary form, such as text, or between select compatible machines, was available. Within the last half-decade, computer scientists and operators began to use standard means of communication between computers. Interpreters of these standard

  7. Accurate Computation of Divided Differences of the Exponential Function,

    DTIC Science & Technology

    1983-06-01

    ...differences are not for arbitrary smooth functions f but for well-known analytic functions such as exp, sin, and cos. Thus we can exploit their properties in... have a bad name in practice. However, in a number of applications the functional form of f is known (e.g., exp) and can be exploited to obtain accurate...
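    For orientation, the following is a minimal Python sketch of the textbook Newton divided-difference recurrence applied to exp; it is not the report's specialized algorithm. The direct recurrence loses accuracy to cancellation when nodes nearly coincide, which is the difficulty that methods exploiting the analytic form of exp are designed to avoid.

```python
import math

def divided_differences_exp(nodes):
    """Top row of the Newton divided-difference table for f = exp:
    returns [exp[x0], exp[x0,x1], ..., exp[x0,...,x_{n-1}]].
    Note: for nearly coincident nodes this direct recurrence suffers
    catastrophic cancellation."""
    d = [math.exp(x) for x in nodes]      # order-0 differences exp(x_i)
    top = [d[0]]
    for order in range(1, len(nodes)):
        for i in range(len(nodes) - order):
            d[i] = (d[i + 1] - d[i]) / (nodes[i + order] - nodes[i])
        top.append(d[0])
    return top

print(divided_differences_exp([0.0, 0.1, 0.2]))   # approx [1.0, 1.0517, 0.5533]
```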

  8. Accurate computation of Zernike moments in polar coordinates.

    PubMed

    Xin, Yongqing; Pawlak, Miroslaw; Liao, Simon

    2007-02-01

    An algorithm for high-precision numerical computation of Zernike moments is presented. The algorithm, based on the introduced polar pixel tiling scheme, does not exhibit the geometric error and numerical integration error which are inherent in conventional methods based on Cartesian coordinates. This yields a dramatic improvement in the accuracy of the Zernike moments in terms of their reconstruction and invariance properties. The introduced image tiling requires an interpolation algorithm which turns out to be of secondary importance compared to the discretization error. Various comparisons are made between the accuracy of the proposed method and that of commonly used techniques. The results reveal the great advantage of our approach.
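    As an illustration of the quantity being computed (and not of the paper's polar pixel tiling scheme), the hedged Python sketch below evaluates a Zernike moment of a function defined on the unit disk with a simple midpoint rule in polar coordinates; the function names and the test image are illustrative assumptions.

```python
import math
import cmath

def zernike_radial(n, m, rho):
    """Radial polynomial R_n^|m|(rho); requires n - |m| even and non-negative."""
    m = abs(m)
    total = 0.0
    for k in range((n - m) // 2 + 1):
        coeff = ((-1) ** k * math.factorial(n - k)
                 / (math.factorial(k)
                    * math.factorial((n + m) // 2 - k)
                    * math.factorial((n - m) // 2 - k)))
        total += coeff * rho ** (n - 2 * k)
    return total

def zernike_moment(image_fn, n, m, n_rho=200, n_theta=400):
    """Approximate Z_nm of image_fn(rho, theta) over the unit disk
    using a midpoint rule on a polar grid."""
    d_rho, d_theta = 1.0 / n_rho, 2.0 * math.pi / n_theta
    acc = 0.0 + 0.0j
    for i in range(n_rho):
        rho = (i + 0.5) * d_rho
        radial = zernike_radial(n, m, rho)
        for j in range(n_theta):
            theta = (j + 0.5) * d_theta
            acc += (image_fn(rho, theta) * radial
                    * cmath.exp(-1j * m * theta) * rho * d_rho * d_theta)
    return (n + 1) / math.pi * acc

# Radially symmetric test image f = rho**2: its Z_20 component is 0.5
print(zernike_moment(lambda r, t: r * r, 2, 0).real)
```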

  9. Computational Tools to Accelerate Commercial Development

    SciTech Connect

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  10. Casing shoe depths accurately and quickly selected with computer assistance

    SciTech Connect

    Mattiello, D.; Piantanida, M.; Schenato, A.; Tomada, L. )

    1993-10-04

    A computer-aided support system for casing design and shoe depth selection improves the reliability of solutions, reduces total project time, and helps reduce costs. This system is part of ADIS (Advanced Drilling Information System), an integrated environment developed by three companies of the ENI group (Agip SpA, Enidata, and Saipem). The ADIS project focuses on the on site planning and control of drilling operations. The first version of the computer-aided support for casing design (Cascade) was experimentally introduced by Agip SpA in July 1991. After several modifications, the system was introduced to field operations in December 1991 and is now used in Agip's district locations and headquarters. The results from the validation process and practical uses indicated it has several pluses: the reliability of the casing shoe depths proposed by the system helps reduce the project errors and improve the economic feasibility of the proposed solutions; the system has helped spread the use of the best engineering practices concerning shoe depth selection and casing design; the Cascade system finds numerous solutions rapidly, thereby reducing project time compared to previous methods of casing design; the system finds or verifies solutions efficiently, allowing the engineer to analyze several alternatives simultaneously rather than to concentrate only on the analysis of a single solution; the system is flexible by means of a user-friendly integration with the other software packages in the ADIS project. The paper describes the design methodology, validation cases, shoe depths, casing design, hardware and software, and results.

  11. Macromolecular Entropy Can Be Accurately Computed from Force.

    PubMed

    Hensen, Ulf; Gräter, Frauke; Henchman, Richard H

    2014-11-11

    A method is presented to evaluate a molecule's entropy from the atomic forces calculated in a molecular dynamics simulation. Specifically, diagonalization of the mass-weighted force covariance matrix produces eigenvalues which in the harmonic approximation can be related to vibrational frequencies. The harmonic oscillator entropies of each vibrational mode may be summed to give the total entropy. The results for a series of hydrocarbons, dialanine and a β hairpin are found to agree much better with values derived from thermodynamic integration than results calculated using quasiharmonic analysis. Forces are found to follow a harmonic distribution more closely than coordinate displacements and better capture the underlying potential energy surface. The method's accuracy, simplicity, and computational similarity to quasiharmonic analysis, requiring as input force trajectories instead of coordinate trajectories, makes it readily applicable to a wide range of problems.
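    A rough Python/NumPy sketch of the recipe summarized in this abstract follows, under assumed conventions (SI units, forces stored as a frames-by-3N array, and a simple eigenvalue cutoff to drop null modes); it is not the authors' implementation.

```python
import numpy as np

KB = 1.380649e-23       # Boltzmann constant [J/K]
HBAR = 1.054571817e-34  # reduced Planck constant [J*s]

def force_covariance_entropy(forces, masses, temperature):
    """Vibrational entropy estimate from a force trajectory.

    forces : (n_frames, 3*n_atoms) Cartesian forces [N]
    masses : (n_atoms,) atomic masses [kg]
    Mass-weight the forces, diagonalize their covariance matrix, map
    eigenvalues to harmonic frequencies, and sum quantum
    harmonic-oscillator entropies over the modes.
    """
    m3 = np.repeat(masses, 3)                            # mass per Cartesian DOF
    fw = (forces - forces.mean(axis=0)) / np.sqrt(m3)    # mass-weighted, centered
    cov = fw.T @ fw / fw.shape[0]                        # force covariance matrix
    eigvals = np.linalg.eigvalsh(cov)
    eigvals = eigvals[eigvals > 1e-30]                   # drop null/rigid-body modes
    omega = np.sqrt(eigvals / (KB * temperature))        # harmonic frequencies [rad/s]
    x = HBAR * omega / (KB * temperature)
    s_modes = KB * (x / np.expm1(x) - np.log1p(-np.exp(-x)))
    return s_modes.sum()                                 # entropy [J/K]
```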

  12. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections. These include azimuthal, cylindrical, Mercator, Lambert, and sinusoidal projections. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
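    MAPPER's own source is not part of this record; as a hedged illustration of two of the projections listed, the Python snippet below implements the standard spherical Mercator and sinusoidal forward projections.

```python
import math

def mercator(lon_deg, lat_deg, lon0_deg=0.0, radius=1.0):
    """Classic Mercator projection on a sphere (latitudes away from the poles)."""
    lam = math.radians(lon_deg - lon0_deg)
    phi = math.radians(lat_deg)
    return radius * lam, radius * math.log(math.tan(math.pi / 4 + phi / 2))

def sinusoidal(lon_deg, lat_deg, lon0_deg=0.0, radius=1.0):
    """Sinusoidal (equal-area) projection on a sphere."""
    lam = math.radians(lon_deg - lon0_deg)
    phi = math.radians(lat_deg)
    return radius * lam * math.cos(phi), radius * phi

print(mercator(-75.0, 40.0))     # approx (-1.309, 0.763)
print(sinusoidal(-75.0, 40.0))   # approx (-1.003, 0.698)
```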

  13. Final Report: Correctness Tools for Petascale Computing

    SciTech Connect

    Mellor-Crummey, John

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  14. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    PubMed

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  15. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  16. Use of Monocrystalline Silicon as Tool Material for Highly Accurate Blanking of Thin Metal Foils

    SciTech Connect

    Hildering, Sven; Engel, Ulf; Merklein, Marion

    2011-05-04

    The trend towards miniaturisation of metallic mass production components combined with increased component functionality is still unbroken. Manufacturing these components by forming and blanking offers economical and ecological advantages combined with the needed accuracy. The complexity of producing tools with geometries below 50 µm by conventional manufacturing methods becomes disproportionately higher. Expensive serial finishing operations are required to achieve an adequate surface roughness combined with accurate geometry details. A novel approach for producing such tools is the use of advanced etching technologies for monocrystalline silicon that are well-established in the microsystems technology. High-precision vertical geometries with a width down to 5 µm are possible. The present study shows a novel concept using this potential for the blanking of thin copper foils with monocrystalline silicon as a tool material. A self-contained machine-tool with compact outer dimensions was designed to avoid tensile stresses in the brittle silicon punch by an accurate, careful alignment of the punch, die and metal foil. A microscopic analysis of the monocrystalline silicon punch shows appropriate properties regarding flank angle, edge geometry and surface quality for the blanking process. Using a monocrystalline silicon punch with a width of 70 µm, blanking experiments on as-rolled copper foils with a thickness of 20 µm demonstrate the general applicability of this material for micro production processes.

  17. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  18. Simulation Concept - How to Exploit Tools for Computing Hybrids

    DTIC Science & Technology

    2009-07-01

    ...multiphysics design tools (Simulation of Biological Systems - SIMBIOSYS), provide an open source environment for biological simulation tools (Bio... SCHETCH - Simulation Concept: How to Exploit Tools for Computing Project; SIMBIOSYS - Simulation of Biological Systems Program; SPICE - Simulation...

  19. Computer-Based Cognitive Tools: Description and Design.

    ERIC Educational Resources Information Center

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  20. VISTA - computational tools for comparative genomics

    SciTech Connect

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate the capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes the kinesin family member 3A (KIF3A) protein.

  1. Computational resources and tools for antimicrobial peptides.

    PubMed

    Liu, Shicai; Fan, Linlin; Sun, Jian; Lao, Xingzhen; Zheng, Heng

    2017-01-01

    Antimicrobial peptides (AMPs), as evolutionarily conserved components of the innate immune system, protect against pathogens including bacteria, fungi, viruses, and parasites. In general, AMPs are relatively small peptides (<10 kDa) with cationic nature and amphipathic structure and have modes of action different from traditional antibiotics. Up to now, there are more than 19 000 AMPs that have been reported, including those isolated from natural sources or obtained by synthesis. They have been considered to be promising substitutes for conventional antibiotics in the quest to address the increasing occurrence of antibiotic resistance. However, most AMPs have modest direct antimicrobial activity, and their mechanisms of action, as well as their structure-activity relationships, are still poorly understood. Computational strategies are invaluable assets to provide insight into the activity of AMPs and thus exploit their potential as a new generation of antimicrobials. This article reviews the advances of AMP databases and computational tools for the prediction and design of new active AMPs. Copyright © 2016 European Peptide Society and John Wiley & Sons, Ltd.

  2. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  3. Petascale self-consistent electromagnetic computations using scalable and accurate algorithms for complex structures

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Abell, D.; Amundson, J.; Bruhwiler, D. L.; Busby, R.; Carlsson, J. A.; Dimitrov, D. A.; Kashdan, E.; Messmer, P.; Nieter, C.; Smithe, D. N.; Spentzouris, P.; Stoltz, P.; Trines, R. M.; Wang, H.; Werner, G. R.

    2006-09-01

    As the size and cost of particle accelerators escalate, high-performance computing plays an increasingly important role; optimization through accurate, detailed computer modeling increases performance and reduces costs. But consequently, computer simulations face enormous challenges. Early approximation methods, such as expansions in distance from the design orbit, were unable to supply detailed accurate results, such as in the computation of wake fields in complex cavities. Since the advent of message-passing supercomputers with thousands of processors, earlier approximations are no longer necessary, and it is now possible to compute wake fields, the effects of dampers, and self-consistent dynamics in cavities accurately. In this environment, the focus has shifted towards the development and implementation of algorithms that scale to large numbers of processors. So-called charge-conserving algorithms evolve the electromagnetic fields without the need for any global solves (which are difficult to scale up to many processors). Using cut-cell (or embedded) boundaries, these algorithms can simulate the fields in complex accelerator cavities with curved walls. New implicit algorithms, which are stable for any time-step, conserve charge as well, allowing faster simulation of structures with details small compared to the characteristic wavelength. These algorithmic and computational advances have been implemented in the VORPAL Framework, a flexible, object-oriented, massively parallel computational application that allows run-time assembly of algorithms and objects, thus composing an application on the fly.
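    The abstract's central point, that explicit field updates are purely local and require no global solve, can be illustrated with a generic 1D Yee-style update loop (vacuum, normalized units). This sketch is not VORPAL and does not include the charge-conserving, cut-cell, or implicit algorithms described.

```python
import numpy as np

def fdtd_1d(n_cells=400, n_steps=600, courant=0.5):
    """Minimal explicit 1D FDTD update on a staggered (Yee) grid.
    Each field value is updated from its immediate neighbors only,
    so the scheme parallelizes without any global solve."""
    ez = np.zeros(n_cells)
    hy = np.zeros(n_cells)
    for n in range(n_steps):
        hy[:-1] += courant * (ez[1:] - ez[:-1])              # update H from curl of E
        ez[1:] += courant * (hy[1:] - hy[:-1])               # update E from curl of H
        ez[n_cells // 4] += np.exp(-((n - 40) / 12.0) ** 2)  # soft Gaussian source
    return ez

print(float(np.max(np.abs(fdtd_1d()))))  # peak field amplitude after propagation
```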

  4. Computational and Physical Quality Assurance Tools for Radiotherapy

    NASA Astrophysics Data System (ADS)

    Graves, Yan Jiang

    Radiation therapy aims at delivering a prescribed amount of radiation dose to cancerous targets while sparing dose to normal organs. Treatment planning and delivery in modern radiotherapy are highly complex. To ensure the accuracy of the delivered dose to a patient, a quality assurance (QA) procedure is needed before the actual treatment delivery. This dissertation aims at developing computational and physical tools to facilitate the QA process. In Chapter 2, we have developed a fast and accurate computational QA tool using a graphics processing unit based Monte Carlo (MC) dose engine. This QA tool aims at identifying any errors in the treatment planning stage and machine delivery process by comparing three dose distributions: planned dose computed by a treatment planning system, planned dose and delivered dose reconstructed using the MC method. Within this tool, several modules have been built. (1) A denoising algorithm to smooth the MC calculated dose. We have also investigated the effects of statistical uncertainty in MC simulations on a commonly used dose comparison metric. (2) A linear accelerator source model with a semi-automatic commissioning process. (3) A fluence generation module. With all these modules, a web application for this QA tool with a user-friendly interface has been developed to provide users with easy access to our tool, facilitating its clinical utilization. Even after an initial treatment plan fulfills the QA requirements, a patient may experience inter-fractional anatomy variations, which compromise the initial plan optimality. To resolve this issue, adaptive radiotherapy (ART) has been proposed, where the treatment plan is redesigned based on the most recent patient anatomy. In Chapter 3, we have constructed a physical deformable head and neck (HN) phantom with in-vivo dosimetry capability. This phantom resembles HN patient geometry and simulates tumor shrinkage with a high level of realism. The ground truth deformation field can be measured

  5. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  6. Computer Series, 101: Accurate Equations of State in Computational Chemistry Projects.

    ERIC Educational Resources Information Center

    Albee, David; Jones, Edward

    1989-01-01

    Discusses the use of computers in chemistry courses at the United States Military Academy. Provides two examples of computer projects: (1) equations of state, and (2) solving for molar volume. Presents BASIC and PASCAL listings for the second project. Lists 10 applications for physical chemistry. (MVL)
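    The record does not reproduce the course listings (the originals were in BASIC and PASCAL); as a hedged illustration of the "solving for molar volume" project, here is a short Python sketch that applies Newton's method to the van der Waals equation of state. The choice of van der Waals and the CO2 constants are assumptions for the example.

```python
# Solve the van der Waals equation of state, (P + a/V^2)(V - b) = R*T,
# for the molar volume V using Newton's method.
R = 0.082057  # gas constant [L*atm/(mol*K)]

def molar_volume_vdw(P, T, a, b, tol=1e-10, max_iter=100):
    V = R * T / P                                # ideal-gas initial guess
    for _ in range(max_iter):
        f = (P + a / V**2) * (V - b) - R * T
        df = P - a / V**2 + 2 * a * b / V**3     # derivative of f with respect to V
        step = f / df
        V -= step
        if abs(step) < tol:
            return V
    raise RuntimeError("Newton iteration did not converge")

# CO2: a = 3.592 L^2*atm/mol^2, b = 0.04267 L/mol, at 1 atm and 300 K
print(molar_volume_vdw(1.0, 300.0, 3.592, 0.04267))  # slightly below the ideal-gas 24.6 L/mol
```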

  7. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  8. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  9. UP-TORR: online tool for accurate and Up-to-Date annotation of RNAi Reagents.

    PubMed

    Hu, Yanhui; Roesel, Charles; Flockhart, Ian; Perkins, Lizabeth; Perrimon, Norbert; Mohr, Stephanie E

    2013-09-01

    RNA interference (RNAi) is a widely adopted tool for loss-of-function studies but RNAi results only have biological relevance if the reagents are appropriately mapped to genes. Several groups have designed and generated RNAi reagent libraries for studies in cells or in vivo for Drosophila and other species. At first glance, matching RNAi reagents to genes appears to be a simple problem, as each reagent is typically designed to target a single gene. In practice, however, the reagent-gene relationship is complex. Although the sequences of oligonucleotides used to generate most types of RNAi reagents are static, the reference genome and gene annotations are regularly updated. Thus, at the time a researcher chooses an RNAi reagent or analyzes RNAi data, the most current interpretation of the RNAi reagent-gene relationship, as well as related information regarding specificity (e.g., predicted off-target effects), can be different from the original interpretation. Here, we describe a set of strategies and an accompanying online tool, UP-TORR (for Updated Targets of RNAi Reagents; www.flyrnai.org/up-torr), useful for accurate and up-to-date annotation of cell-based and in vivo RNAi reagents. Importantly, UP-TORR automatically synchronizes with gene annotations daily, retrieving the most current information available, and for Drosophila, also synchronizes with the major reagent collections. Thus, UP-TORR allows users to choose the most appropriate RNAi reagents at the onset of a study, as well as to perform the most appropriate analyses of results of RNAi-based studies.

  10. Computational tools for enzyme improvement: why everyone can - and should - use them.

    PubMed

    Ebert, Maximilian Ccjc; Pelletier, Joelle N

    2017-02-20

    This review presents computational methods that experimentalists can readily use to create smart libraries for enzyme engineering and to obtain insights into protein-substrate complexes. Computational tools have the reputation of being hard to use and inaccurate compared to experimental methods in enzyme engineering, yet they are essential to probe datasets of ever-increasing size and complexity. In recent years, bioinformatics groups have made a huge leap forward in providing user-friendly interfaces and accurate algorithms for experimentalists. These methods guide efficient experimental planning and allow the enzyme engineer to rationalize time and resources. Computational tools nevertheless face challenges in the realm of transient modern technology.

  11. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    ERIC Educational Resources Information Center

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  12. AI in Reverse: Computer Tools That Become Cognitive.

    ERIC Educational Resources Information Center

    Salomon, Gavriel

    The question of whether human thinking can come to simulate computer intelligence--i.e., AI in reverse--is addressed in this paper. Examples are given of three computer tools which perform several functions that constitute an intellectual partnership between student and tool. Such functions include: (1) assuming part of the intellectual burden in…

  13. Are accurate computations of the ¹³C′ shielding feasible at the DFT level of theory?

    PubMed

    Vila, Jorge A; Arnautova, Yelena A; Martin, Osvaldo A; Scheraga, Harold A

    2014-02-05

    The goal of this study is twofold. First, to investigate the relative influence of the main structural factors affecting the computation of the ¹³C′ shielding, namely, the conformation of the residue itself and the next nearest-neighbor effects. Second, to determine whether calculation of the ¹³C′ shielding at the density functional level of theory (DFT), with an accuracy similar to that of the ¹³Cα shielding, is feasible with the existing computational resources. The DFT calculations, carried out for a large number of possible conformations of the tripeptide Ac-GXY-NMe, with different combinations of X and Y residues, enable us to conclude that the accurate computation of the ¹³C′ shielding for a given residue X depends on the: (i) (ϕ,ψ) backbone torsional angles of X; (ii) side-chain conformation of X; (iii) (ϕ,ψ) torsional angles of Y; and (iv) identity of residue Y. Consequently, DFT-based quantum mechanical calculations of the ¹³C′ shielding, with all these factors taken into account, are two orders of magnitude more CPU demanding than the computation, with similar accuracy, of the ¹³Cα shielding. Despite not considering the effect of the possible hydrogen bond interaction of the carbonyl oxygen, this work contributes to our general understanding of the main structural factors affecting the accurate computation of the ¹³C′ shielding in proteins and may spur significant progress in efforts to develop new validation methods for protein structures.

  14. Collected Wisdom: Assessment Tools for Computer Science Programs

    ERIC Educational Resources Information Center

    Sanders, Kathryn E.; McCartney, Robert

    2004-01-01

    In this paper, we investigate the question of what assessment tools are being used in practice by United States computing programs and what the faculty doing the assessment think of the tools they use. After presenting some background with regard to the design, implementation, and use of assessment, with particular attention to assessment tools,…

  15. Computing tools for accelerator design calculations

    SciTech Connect

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular, when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  16. Computers as a Language Learning Tool.

    ERIC Educational Resources Information Center

    Ruschoff, Bernd

    1984-01-01

    Describes a computer-assisted language learning project at the University of Wuppertal (West Germany). It is hoped that teachers can overcome two handicaps of the past--lack of teacher awareness of current audio-visual technical aids and unsophisticated computer hardware--by getting the opportunity to familiarize…

  17. MicroRNA-200 Family Profile: A Promising Ancillary Tool for Accurate Cancer Diagnosis.

    PubMed

    Liu, Xiaodong; Zhang, Jianhua; Xie, Botao; Li, Hao; Shen, Jihong; Chen, Jianheng

    2016-01-01

    Cancer is one of the most threatening diseases in the world and great interest has been devoted to discovering accurate and noninvasive methods for cancer diagnosis. The value of the microRNA-200 (miRNA-200, miR-200) family has been revealed in many studies. However, the results from various studies were inconsistent, and thus a meta-analysis was designed and performed to assess the overall value of miRNA-200 in cancer diagnosis. Relevant studies were searched electronically from the following databases: PubMed, Embase, Web of Science, the Cochrane Library, and Chinese National Knowledge Infrastructure. Keywords combining "miR-200," "cancer," and "diagnosis" in any field were used to search for relevant studies. Then, the pooled sensitivity, specificity, area under the curve (AUC), and partial AUC were calculated using the random-effects model. Heterogeneity among individual studies was also explored by subgroup analyses. A total of 28 studies from 18 articles with an overall sample size of 3676 subjects (2097 patients and 1579 controls) were included in this meta-analysis. The overall sensitivity and specificity with 95% confidence intervals (95% CIs) are 0.709 (95% CI: 0.657-0.755) and 0.667 (95% CI: 0.617-0.713), respectively. Additionally, the AUC and partial AUC for the pooled data are 0.735 and 0.627, respectively. Subgroup analyses revealed that using the miRNA-200 family for cancer diagnosis is more effective in white than in Asian ethnic groups. In addition, cancer diagnosis by miRNA using circulating specimens is more effective than that using noncirculating specimens. Finally, miRNA is more accurate in diagnosing endometrial cancer than other types of cancer, and some miRNA family members (miR-200b and miR-429) have superior diagnostic accuracy compared with other miR-200 family members. In conclusion, the profiling of the miRNA-200 family is likely to be a valuable tool in cancer detection and diagnosis.
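    The abstract does not give the pooling formulas; as an illustration of one common random-effects approach (DerSimonian-Laird pooling on the logit scale, which is not necessarily the exact model the authors used), the Python sketch below pools per-study proportions such as sensitivities. The three-study example data are hypothetical.

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(y):
    return 1.0 / (1.0 + math.exp(-y))

def pooled_proportion_dl(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale.
    events[i]/totals[i] could be, e.g., true positives over diseased subjects
    (sensitivity) in study i; returns the back-transformed pooled proportion."""
    y = [logit(e / n) for e, n in zip(events, totals)]
    v = [1.0 / e + 1.0 / (n - e) for e, n in zip(events, totals)]  # logit-scale variances
    w = [1.0 / vi for vi in v]                                     # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))         # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))       # between-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]                         # random-effects weights
    return inv_logit(sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re))

# Hypothetical example: true positives out of diseased subjects in three studies
print(pooled_proportion_dl([70, 55, 80], [100, 90, 110]))
```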

  18. Assessment tool for nursing student computer competencies.

    PubMed

    Elder, Betty L; Koehn, Mary L

    2009-01-01

    Computer skills have been established as important for nursing students and for graduate nurses. No current research was found on the best method to evaluate the skills of incoming nursing students. The purpose of this descriptive, correlational study was to compare student ratings of their computer competency to their performance of those skills on a computer-graded assessment. A convenience sample of 87 nursing students was used. There was a low, but significant correlation between the scores on the survey and the assessment. The results suggest that students rate themselves higher on their skills than their actual performance of computer skills. Implications for educators are presented, and the value of using a computer-graded assessment is discussed.

  19. An accurate tool for the fast generation of dark matter halo catalogues

    NASA Astrophysics Data System (ADS)

    Monaco, P.; Sefusatti, E.; Borgani, S.; Crocce, M.; Fosalba, P.; Sheth, R. K.; Theuns, T.

    2013-08-01

    We present a new parallel implementation of the PINpointing Orbit Crossing-Collapsed HIerarchical Objects (PINOCCHIO) algorithm, a quick tool, based on Lagrangian Perturbation Theory, for the hierarchical build-up of dark matter (DM) haloes in cosmological volumes. To assess its ability to predict halo correlations on large scales, we compare its results with those of an N-body simulation of a 3 h⁻¹ Gpc box sampled with 2048³ particles taken from the MICE suite, matching the same seeds for the initial conditions. Thanks to the Fastest Fourier Transforms in the West (FFTW) libraries and to the relatively simple design, the code shows very good scaling properties. The CPU time required by PINOCCHIO is a tiny fraction (~1/2000) of that required by the MICE simulation. Varying some of PINOCCHIO numerical parameters allows one to produce a universal mass function that lies in the range allowed by published fits, although it underestimates the MICE mass function of Friends-of-Friends (FoF) haloes in the high-mass tail. We compare the matter-halo and the halo-halo power spectra with those of the MICE simulation and find that these two-point statistics are well recovered on large scales. In particular, when catalogues are matched in number density, agreement within 10 per cent is achieved for the halo power spectrum. At scales k > 0.1 h Mpc⁻¹, the inaccuracy of the Zel'dovich approximation in locating halo positions causes an underestimate of the power spectrum that can be modelled as a Gaussian factor with a damping scale of d = 3 h⁻¹ Mpc at z = 0, decreasing at higher redshift. Finally, a remarkable match is obtained for the reduced halo bispectrum, showing a good description of non-linear halo bias. Our results demonstrate the potential of PINOCCHIO as an accurate and flexible tool for generating large ensembles of mock galaxy surveys, with interesting applications for the analysis of large galaxy redshift surveys.

  20. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  1. Computational Tools for Stem Cell Biology.

    PubMed

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate.

  2. Fast and accurate computation of system matrix for area integral model-based algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Zhang, Shunli; Zhang, Dinghua; Gong, Hao; Ghasemalizadeh, Omid; Wang, Ge; Cao, Guohua

    2014-11-01

    Iterative algorithms, such as the algebraic reconstruction technique (ART), are popular for image reconstruction. For iterative reconstruction, the area integral model (AIM) is more accurate for better reconstruction quality than the line integral model (LIM). However, the computation of the system matrix for AIM is more complex and time-consuming than that for LIM. Here, we propose a fast and accurate method to compute the system matrix for AIM. First, we calculate the intersection of each boundary line of a narrow fan-beam with pixels in a recursive and efficient manner. Then, by grouping the beam-pixel intersection area into six types according to the slopes of the two boundary lines, we analytically compute the intersection area of the narrow fan-beam with the pixels in a simple algebraic fashion. Overall, experimental results show that our method is about three times faster than the Siddon algorithm and about two times faster than the distance-driven model (DDM) in computation of the system matrix. The reconstruction speed of our AIM-based ART is also faster than the LIM-based ART that uses the Siddon algorithm and DDM-based ART, for one iteration. The fast reconstruction speed of our method was accomplished without compromising the image quality.
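    The analytic area-integral (AIM) intersection formulas themselves are not reproduced in the abstract; for reference, the hedged sketch below builds one row of a simple line-integral (LIM) system matrix by dense ray sampling, the kind of baseline the AIM approach improves on. It is an approximation for illustration, not Siddon's exact algorithm and not the authors' method.

```python
import numpy as np

def lim_system_matrix_row(p0, p1, grid_shape, pixel_size, n_samples=2000):
    """Approximate one system-matrix row for the line-integral model by
    densely sampling the ray from p0 to p1 and accumulating the traversed
    path length in each pixel."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    ray = p1 - p0
    ds = np.linalg.norm(ray) / n_samples         # arc length per sample
    row = np.zeros(grid_shape)
    for t in (np.arange(n_samples) + 0.5) / n_samples:
        x, y = p0 + t * ray
        i, j = int(y // pixel_size), int(x // pixel_size)
        if 0 <= i < grid_shape[0] and 0 <= j < grid_shape[1]:
            row[i, j] += ds                      # path length attributed to pixel (i, j)
    return row

# A diagonal ray across a 4x4 grid of unit pixels: ~sqrt(2) per diagonal pixel
print(lim_system_matrix_row((0.0, 0.0), (4.0, 4.0), (4, 4), 1.0).round(2))
```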

  3. High resolution DEM from Tandem-X interferometry: an accurate tool to characterize volcanic activity

    NASA Astrophysics Data System (ADS)

    Albino, Fabien; Kervyn, Francois

    2013-04-01

    The Tandem-X mission was launched by the German agency (DLR) in June 2010. It is a new-generation high resolution SAR sensor mainly dedicated to topographic applications. For the purpose of our research focused on the study of the volcano-tectonic activity in the Kivu Rift area, a set of Tandem-X bistatic radar images was used to produce a high resolution InSAR DEM of the Virunga Volcanic Province (VVP). The VVP is part of the Western branch of the African rift, situated at the boundary between D.R. Congo, Rwanda and Uganda. It has two highly active volcanoes, Nyiragongo and Nyamulagira. A first task concerns the quantitative assessment of the vertical accuracy that can be achieved with these new data. The new DEMs are compared to other spaceborne datasets (SRTM, ASTER) but also to field measurements given by differential GPS. Multi-temporal radar acquisitions allow us to produce several DEMs of the same area. This appeared to be very useful in an active volcanic context where new geomorphological features (faults, fissures, volcanic cones and lava flows) appear continuously through time. For example, since the year 2000, the time of the SRTM acquisition, there has been one eruption at Nyiragongo (2002) and six eruptions at Nyamulagira (2001, 2002, 2004, 2006, 2010 and 2011), all of which induced large changes in the landscape with the emplacement of new lava fields and scoria cones. From our repetitive Tandem-X DEM production, we have a tool to identify and also quantify, in terms of size and volume, all the topographic changes related to this past volcanic activity. These parameters are high-value information for improving the understanding of the Virunga volcanoes; the accurate estimation of erupted volumes and knowledge of structural features associated with past eruptions are key to understanding the volcanic system, improving the hazard assessment, and ultimately contributing to risk mitigation in a densely populated area.

  4. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme. The MacCormack 2-4 scheme is second order accurate in time and fourth order accurate in space, and is stable for CFL numbers below 2/3. Current research has shown that the method can give accurate results but does exhibit significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson-type second, third, and fourth order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research has found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second and third order terms in the damping. Use of second and fourth order terms reduced the magnitude of dispersion errors but not as effectively as the second and third order combination. The program was coded using Thinking Machines' CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
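    For orientation, here is a minimal Python/NumPy sketch of the classical (2-2) MacCormack predictor-corrector for linear advection on a periodic domain. The study used the 2-4 variant (fourth-order one-sided spatial differences) with Jameson-type artificial viscosity, neither of which is implemented in this sketch.

```python
import numpy as np

def maccormack_advection(u, a, dx, dt, n_steps):
    """Classical MacCormack scheme for u_t + a*u_x = 0 on a periodic grid:
    forward-difference predictor followed by a backward-difference corrector."""
    c = a * dt / dx                                      # CFL number
    for _ in range(n_steps):
        u_star = u - c * (np.roll(u, -1) - u)            # predictor (forward difference)
        u = 0.5 * (u + u_star - c * (u_star - np.roll(u_star, 1)))  # corrector (backward)
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.sin(2 * np.pi * x)
dx = x[1] - x[0]
u = maccormack_advection(u0.copy(), a=1.0, dx=dx, dt=0.4 * dx, n_steps=500)
print(float(np.max(np.abs(u - u0))))   # small error after exactly one advection period
```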

  5. Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?

    ERIC Educational Resources Information Center

    Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine

    2014-01-01

    Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…

  6. Novel electromagnetic surface integral equations for highly accurate computations of dielectric bodies with arbitrarily low contrasts

    SciTech Connect

    Erguel, Ozguer; Guerel, Levent

    2008-12-01

    We present a novel stabilization procedure for accurate surface formulations of electromagnetic scattering problems involving three-dimensional dielectric objects with arbitrarily low contrasts. Conventional surface integral equations provide inaccurate results for the scattered fields when the contrast of the object is low, i.e., when the electromagnetic material parameters of the scatterer and the host medium are close to each other. We propose a stabilization procedure involving the extraction of nonradiating currents and rearrangement of the right-hand side of the equations using fictitious incident fields. Then, only the radiating currents are solved to calculate the scattered fields accurately. This technique can easily be applied to the existing implementations of conventional formulations, it requires negligible extra computational cost, and it is also appropriate for the solution of large problems with the multilevel fast multipole algorithm. We show that the stabilization leads to robust formulations that are valid even for the solutions of extremely low-contrast objects.

  7. The Computer as an Artistic Tool.

    ERIC Educational Resources Information Center

    Sveinson, Lynn

    1978-01-01

    Presents a justification of the belief that science and art can be successfully combined. The computer's merits are viewed as a potential modelbuilder for the formalization of aesthetic concepts. The rest of the paper details recent and current research on such uses of the machine. (VT)

  8. Personal computers as a project management tool

    SciTech Connect

    Levers, W.H.

    1985-01-01

    This paper deals with project management experience related to application of business level personal computers to two design and construction projects. Projects include brine support facilities for two 50 MW geothermal power plants in the Imperial Valley of California adjacent to the Mexican border. The installed value of the facilities involved is approximately $40 million.

  9. Special purpose hybrid transfinite elements and unified computational methodology for accurately predicting thermoelastic stress waves

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper represents an attempt to apply extensions of a hybrid transfinite element computational approach for accurately predicting thermoelastic stress waves. The applicability of the present formulations for capturing the thermal stress waves induced by boundary heating for the well known Danilovskaya problems is demonstrated. A unique feature of the proposed formulations for applicability to the Danilovskaya problem of thermal stress waves in elastic solids lies in the hybrid nature of the unified formulations and the development of special purpose transfinite elements in conjunction with the classical Galerkin techniques and transformation concepts. Numerical test cases validate the applicability and superior capability to capture the thermal stress waves induced due to boundary heating.

  10. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value oriented, deterministic, algorithmic problems, has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  11. A Computer-Based Tool for Introducing Turfgrass Species.

    ERIC Educational Resources Information Center

    Fermanian, T. W.; Wehner, D. J.

    1995-01-01

    Describes a self-contained computer application constructed using the SuperCard development tool which introduces the characteristics of turfgrass species and their optimum environments. Evaluates students' gain in understanding turf species characteristics through this approach. (LZ)

  12. Computer Network Attack: An Operational Tool?

    DTIC Science & Technology

    2007-11-02

    Keywords: Spectrum of Conflict, Cyber Warfare, Preemptive Strike, Effects Based Targeting. Abstract: Computer Network Attack (CNA) is defined as... great deal of attention as the world's capabilities in cyber-warfare grow. Although addressing the wide-ranging legal aspects of CNA is beyond the... the notion of cyber-warfare has not yet developed to the point that international norms have been established. These norms will be developed in

  13. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  14. Accurate identification and compensation of geometric errors of 5-axis CNC machine tools using double ball bar

    NASA Astrophysics Data System (ADS)

    Lasemi, Ali; Xue, Deyi; Gu, Peihua

    2016-05-01

    Five-axis CNC machine tools are widely used in manufacturing of parts with free-form surfaces. Geometric errors of machine tools have significant effects on the quality of manufactured parts. This research focuses on development of a new method to accurately identify geometric errors of 5-axis CNC machines, especially the errors due to rotary axes, using the magnetic double ball bar. A theoretical model for identification of geometric errors is provided. In this model, both position-independent errors and position-dependent errors are considered as the error sources. This model is simplified by identification and removal of the correlated and insignificant error sources of the machine. Insignificant error sources are identified using the sensitivity analysis technique. Simulation results reveal that the simplified error identification model can result in more accurate estimations of the error parameters. Experiments on a 5-axis CNC machine tool also demonstrate significant reduction in the volumetric error after error compensation.

  15. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools are intended to support the development of an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  16. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  17. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
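    As a rough illustration of how a dose-response relationship turns an exposure distribution into an injury estimate, the sketch below combines synthetic dose samples with a hypothetical logistic dose-response curve. The distribution, threshold, and curve parameters are invented for the example and are not BioPA values.

```python
import numpy as np

# Synthetic "dose" samples standing in for values extracted along simulated
# fish trajectories in a CFD model (e.g., shear or pressure exposure).
rng = np.random.default_rng(0)
doses = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)

def dose_response(dose, d50=5.0, slope=1.5):
    # Hypothetical logistic dose-response curve: probability of injury at a
    # given dose; d50 and slope would come from laboratory or field studies.
    return 1.0 / (1.0 + np.exp(-slope * (dose - d50)))

# Performance indicator: probability of exposure above a threshold dose.
threshold = 5.0
p_exposure = np.mean(doses > threshold)

# Expected injury probability: dose-response averaged over the exposure distribution.
p_injury = dose_response(doses).mean()

print(f"P(dose > {threshold}): {p_exposure:.3f}")
print(f"expected injury probability: {p_injury:.3f}")
```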

  18. The Use of Computer Tools to Support Meaningful Learning

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…

  19. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  20. The Learning Computer: Low Bandwidth Tool that Bridges Digital Divide

    ERIC Educational Resources Information Center

    Johnson, Russell; Kemp, Elizabeth; Kemp, Ray; Blakey, Peter

    2007-01-01

    This article reports on a project that explores strategies for narrowing the digital divide by providing a practicable e-learning option for the millions living outside the ambit of high performance computing and communication technology. The concept is introduced of a "learning computer," a low bandwidth tool that provides a simplified,…

  1. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  2. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and on research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists that is caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of the computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models must have. In this setting, users and developers are dependent on an entire distribution, possibly depending on multiple compilers and special instructions specific to the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open source scientific software distribution.

  3. CoMOGrad and PHOG: From Computer Vision to Fast and Accurate Protein Tertiary Structure Retrieval

    PubMed Central

    Karim, Rezaul; Aziz, Mohd. Momin Al; Shatabda, Swakkhar; Rahman, M. Sohel; Mia, Md. Abul Kashem; Zaman, Farhana; Rakin, Salman

    2015-01-01

    The number of entries in a structural database of proteins is increasing day by day. Methods for retrieving protein tertiary structures from such a large database have turned out to be the key to comparative analysis of structures, which plays an important role in understanding proteins and their functions. In this paper, we present fast and accurate methods for the retrieval of proteins having tertiary structures similar to a query protein from a large database. Our proposed methods borrow ideas from the field of computer vision. The speed and accuracy of our methods come from two newly introduced features, the co-occurrence matrix of the oriented gradient and the pyramid histogram of oriented gradient, and the use of Euclidean distance as the distance measure. Experimental results clearly indicate the superiority of our approach in both running time and accuracy. Our method is readily available for use from this website: http://research.buet.ac.bd:8080/Comograd/. PMID:26293226
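    To make the retrieval idea concrete, the sketch below computes a much-simplified gradient-orientation histogram for a 2D matrix representation of a structure and ranks database entries by Euclidean distance between feature vectors. It illustrates only the generic HOG-plus-nearest-neighbour pattern, not the paper's CoMOGrad or PHOG features.

```python
import numpy as np

def orientation_histogram(matrix, n_bins=16):
    # Simplified gradient-orientation feature for a 2D matrix representation
    # of a structure (e.g., a residue-residue distance matrix).
    gy, gx = np.gradient(matrix.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)
    hist, _ = np.histogram(orientation, bins=n_bins, range=(-np.pi, np.pi),
                           weights=magnitude)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

def retrieve(query_matrix, database_matrices, top_k=5):
    # Rank database entries by Euclidean distance between feature vectors.
    q = orientation_histogram(query_matrix)
    feats = np.array([orientation_histogram(m) for m in database_matrices])
    dists = np.linalg.norm(feats - q, axis=1)
    order = np.argsort(dists)[:top_k]
    return order, dists[order]

# Toy usage with random stand-ins for structure matrices.
rng = np.random.default_rng(1)
db = [rng.random((64, 64)) for _ in range(100)]
idx, d = retrieve(db[42], db)
print("closest entries:", idx)        # entry 42 ranks first with distance ~0
```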

  4. Accurate Time-Dependent Traveling-Wave Tube Model Developed for Computational Bit-Error-Rate Testing

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    2001-01-01

    The phenomenal growth of the satellite communications industry has created a large demand for traveling-wave tubes (TWT's) operating with unprecedented specifications requiring the design and production of many novel devices in record time. To achieve this, the TWT industry heavily relies on computational modeling. However, the TWT industry's computational modeling capabilities need to be improved because there are often discrepancies between measured TWT data and that predicted by conventional two-dimensional helical TWT interaction codes. This limits the analysis and design of novel devices or TWT's with parameters differing from what is conventionally manufactured. In addition, the inaccuracy of current computational tools limits achievable TWT performance because optimized designs require highly accurate models. To address these concerns, a fully three-dimensional, time-dependent, helical TWT interaction model was developed using the electromagnetic particle-in-cell code MAFIA (Solution of MAxwell's equations by the Finite-Integration-Algorithm). The model includes a short section of helical slow-wave circuit with excitation fed by radiofrequency input/output couplers, and an electron beam contained by periodic permanent magnet focusing. A cutaway view of several turns of the three-dimensional helical slow-wave circuit with input/output couplers is shown. This has been shown to be more accurate than conventionally used two-dimensional models. The growth of the communications industry has also imposed a demand for increased data rates for the transmission of large volumes of data. To achieve increased data rates, complex modulation and multiple access techniques are employed requiring minimum distortion of the signal as it is passed through the TWT. Thus, intersymbol interference (ISI) becomes a major consideration, as well as suspected causes such as reflections within the TWT. To experimentally investigate effects of the physical TWT on ISI would be

  5. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  6. Accurate and efficient computation of nonlocal potentials based on Gaussian-sum approximation

    NASA Astrophysics Data System (ADS)

    Exl, Lukas; Mauser, Norbert J.; Zhang, Yong

    2016-12-01

    We introduce an accurate and efficient method for the numerical evaluation of nonlocal potentials, including the 3D/2D Coulomb, 2D Poisson and 3D dipole-dipole potentials. Our method is based on a Gaussian-sum approximation of the singular convolution kernel combined with a Taylor expansion of the density. Starting from the convolution formulation of the nonlocal potential, for smooth and fast-decaying densities, we make full use of the Fourier pseudospectral (plane wave) approximation of the density and a separable Gaussian-sum approximation of the kernel in an interval where the singularity (the origin) is excluded. The potential is separated into a regular integral and a near-field singular correction integral. The first is computed with the Fourier pseudospectral method, while the latter is well resolved utilizing a low-order Taylor expansion of the density. Both parts are accelerated by fast Fourier transforms (FFT). The method is accurate (14-16 digits), efficient (O(N log N) complexity), low in storage, easily adaptable to other kernels, applicable to anisotropic densities and highly parallelizable.
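    The core ingredient, a separable Gaussian-sum approximation of a singular kernel, can be illustrated for the 3D Coulomb kernel 1/r via the identity 1/r = (2/√π) ∫₀^∞ exp(-r²t²) dt, discretized on a logarithmic quadrature grid. The grid below is an illustrative choice rather than the paper's construction, and the near-field Taylor correction is omitted.

```python
import numpy as np

# Gaussian-sum approximation of 1/r from 1/r = (2/sqrt(pi)) * int_0^inf exp(-r^2 t^2) dt,
# with the substitution t = exp(s) and a trapezoidal rule in s.
def gaussian_sum_kernel(r, s_min=-14.0, s_max=6.0, n=401):
    s = np.linspace(s_min, s_max, n)
    h = s[1] - s[0]
    t = np.exp(s)
    weights = (2.0 / np.sqrt(np.pi)) * h * t      # quadrature weights
    exponents = t ** 2                            # Gaussian exponents
    r = np.atleast_1d(r)[:, None]
    return (weights * np.exp(-exponents * r ** 2)).sum(axis=1)

r = np.logspace(-1, 1, 7)                         # away from the singularity at r = 0
approx = gaussian_sum_kernel(r)
exact = 1.0 / r
print("max relative error on this range:", np.max(np.abs(approx - exact) * r))
```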

  7. Challenges and promises for translating computational tools into clinical practice.

    PubMed

    Ahn, Woo-Young; Busemeyer, Jerome R

    2016-10-01

    Computational modeling and associated methods have greatly advanced our understanding of cognition and neurobiology underlying complex behaviors and psychiatric conditions. Yet, no computational methods have been successfully translated into clinical settings. This review discusses three major methodological and practical challenges (A. precise characterization of latent neurocognitive processes, B. developing optimal assays, C. developing large-scale longitudinal studies and generating predictions from multi-modal data) and potential promises and tools that have been developed in various fields including mathematical psychology, computational neuroscience, computer science, and statistics. We conclude by highlighting a strong need to communicate and collaborate across multiple disciplines.

  8. Enabling high grayscale resolution displays and accurate response time measurements on conventional computers.

    PubMed

    Li, Xiangrui; Lu, Zhong-Lin

    2012-02-29

    Display systems based on conventional computer graphics cards are capable of generating images with 8-bit gray level resolution. However, most experiments in vision research require displays with more than 12 bits of luminance resolution. Several solutions are available. Bit++ (1) and DataPixx (2) use the Digital Visual Interface (DVI) output from graphics cards and high resolution (14 or 16-bit) digital-to-analog converters to drive analog display devices. The VideoSwitcher (3) described here combines analog video signals from the red and blue channels of graphics cards with different weights using a passive resistor network (4) and an active circuit to deliver identical video signals to the three channels of color monitors. The method provides an inexpensive way to enable high-resolution monochromatic displays using conventional graphics cards and analog monitors. It can also provide trigger signals that can be used to mark stimulus onsets, making it easy to synchronize visual displays with physiological recordings or response time measurements. Although computer keyboards and mice are frequently used in measuring response times (RT), the accuracy of these measurements is quite low. The RTbox is a specialized hardware and software solution for accurate RT measurements. Connected to the host computer through a USB connection, the driver of the RTbox is compatible with all conventional operating systems. It uses a microprocessor and high-resolution clock to record the identities and timing of button events, which are buffered until the host computer retrieves them. The recorded button events are not affected by potential timing uncertainties or biases associated with data transmission and processing in the host computer. The asynchronous storage greatly simplifies the design of user programs. Several methods are available to synchronize the clocks of the RTbox and the host computer. The RTbox can also receive external triggers and be used to measure RT with respect
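    The general principle of combining two 8-bit channels with unequal weights to obtain finer luminance steps can be sketched as follows. The 128:1 weight ratio and the encoding scheme are assumptions made for the example, not the actual VideoSwitcher circuit values.

```python
import numpy as np

# Two 8-bit channels combined with unequal weights give finer luminance steps.
RATIO = 128.0                         # assumed blue:red attenuation ratio
FULL_SCALE = 255.0 + 255.0 / RATIO    # maximum combined output

def encode(target):
    # Map a target luminance in [0, 1] to 8-bit (blue, red) channel values.
    total = target * FULL_SCALE
    blue = min(int(total), 255)                          # coarse steps
    red = int(round(min((total - blue) * RATIO, 255)))   # fine steps
    return blue, red

def decode(blue, red):
    # Luminance produced by the weighted combination of the two channels.
    return (blue + red / RATIO) / FULL_SCALE

for t in np.linspace(0.0, 1.0, 6):
    b, r = encode(t)
    print(f"target={t:.4f}  blue={b:3d}  red={r:3d}  produced={decode(b, r):.6f}")
```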

  9. Numerical Computation of a Continuous-thrust State Transition Matrix Incorporating Accurate Hardware and Ephemeris Models

    NASA Technical Reports Server (NTRS)

    Ellison, Donald; Conway, Bruce; Englander, Jacob

    2015-01-01

    A significant body of work exists showing that providing a nonlinear programming (NLP) solver with expressions for the problem constraint gradient substantially increases the speed of program execution and can also improve the robustness of convergence, especially for local optimizers. Calculation of these derivatives is often accomplished through the computation of the spacecraft's state transition matrix (STM). If the two-body gravitational model is employed, as is often done in the context of preliminary design, closed-form expressions for these derivatives may be provided. If a high-fidelity dynamics model that might include perturbing forces, such as the gravitational effect from multiple third bodies and solar radiation pressure, is used, then these STMs must be computed numerically. We present a method for the power hardware model and a full ephemeris model. An adaptive-step embedded eighth-order Dormand-Prince numerical integrator is discussed, and a method for the computation of the time-of-flight derivatives in this framework is presented. The use of these numerically calculated derivatives offers a substantial improvement over finite differencing in the context of a global optimizer. Specifically, the inclusion of these STMs into the low-thrust mission design tool chain in use at NASA Goddard Space Flight Center allows for an increased preliminary mission design cadence.

  10. Majority vote and other problems when using computational tools.

    PubMed

    Vihinen, Mauno

    2014-08-01

    Computational tools are essential for most of our research. To use these tools, one needs to know how they work. Problems in the application of computational methods to variation analysis can appear at several stages and affect, for example, the interpretation of results. Such cases are discussed along with suggestions on how to avoid them. The problems include incomplete reporting of methods, especially about the use of prediction tools; method selection on unscientific grounds and without consulting independent method performance assessments; extending the application area of methods outside their intended purpose; use of the same data several times for obtaining a majority vote; and filtering of datasets so that variants of interest are excluded. All these issues can be avoided by no longer using software tools as black boxes.

  11. Development of highly accurate approximate scheme for computing the charge transfer integral

    NASA Astrophysics Data System (ADS)

    Pershin, Anton; Szalay, Péter G.

    2015-08-01

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the "exact" scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the "exact" calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature.
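    One way to read the "Taylor expansion along the coordinate space" idea is that the transfer integral J is expanded around a reference dimer geometry Q₀, so that its fluctuation along an (asymmetric) coordinate Q requires only derivatives evaluated at Q₀. The expression below is a generic second-order expansion with illustrative symbols, not the paper's working equation.

```latex
J(Q) \approx J(Q_0)
      + \left.\frac{\partial J}{\partial Q}\right|_{Q_0}\,(Q - Q_0)
      + \frac{1}{2}\left.\frac{\partial^{2} J}{\partial Q^{2}}\right|_{Q_0}\,(Q - Q_0)^{2}
```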

  12. Development of highly accurate approximate scheme for computing the charge transfer integral.

    PubMed

    Pershin, Anton; Szalay, Péter G

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the "exact" scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the "exact" calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature.

  13. Development of highly accurate approximate scheme for computing the charge transfer integral

    SciTech Connect

    Pershin, Anton; Szalay, Péter G.

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both energy split in dimer and fragment charge difference methods are equivalent with the exact formulation for symmetrical displacements, they are less efficient when describing transfer integral along the asymmetric alteration coordinate. Since the “exact” scheme was found computationally expensive, we examine the possibility to obtain the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the “exact” calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature.

  14. Laser-induced accurate frontal cortex damage: a new tool for brain study

    NASA Astrophysics Data System (ADS)

    Flores, Gonzalo; Khotiaintsev, Sergei N.; Sanchez-Huerta, Maria L.; Ibanes, Osvaldo; Hernandez, Adan; Silva, Adriana B.; Calderon, Rafael; Ayala, Griselda; Marroquin, Javier; Svirid, Vladimir; Khotiaintsev, Yuri V.

    1999-01-01

    A new laser-based technique for the anatomical-functional study of the medial prefrontal cortex (MPFC) of the brain of experimental animals (rats) is presented. The technique is based on making accurate, well-controlled lesions in small regions of the MPFC and subsequently observing behavioral alterations in the lesioned animals relative to control ones. The laser produces smaller and more accurate lesions than those obtained by traditional methods such as mechanical action, chemical means, and electrical currents. For producing the brain lesions, a 10 W CW CO2 laser is employed because it combines sufficiently high power with a relatively low cost-per-watt ratio. In our experience, such a power rating is sufficient for making MPFC lesions. The laser radiation is applied in a series of pulses via a hollow circular metallic waveguide made of stainless steel. The waveguide has an inner diameter of 1.3 mm and a length of 95 mm. The anesthetized animals are placed in a stereotaxic instrument. Via perforations made in the skull bone, the MPFC is exposed to the laser radiation. Several weeks later (after animal recuperation), standard behavioral tests are performed. They reveal behavioral changes which point to damage of some small regions of the MPFC. These results correlate with the histological data, which reveal the existence of small and accurate MPFC lesions. The present technique has good prospects for use in anatomical-functional studies of the brain by area. In addition, this technique appears to have considerable promise as a treatment method for some pathologies, e.g., Parkinson's disease.

  15. A computer tool to support in design of industrial Ethernet.

    PubMed

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply including the Power over Ethernet (PoE) concept, and wireless) and the occupation rate (the amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool's software presents a detailed view of the network to the user, points out possible problems in the network, and provides an extremely friendly environment.

  16. Accurate molecular structure and spectroscopic properties for nucleobases: A combined computational - microwave investigation of 2-thiouracil as a case study

    PubMed Central

    Puzzarini, Cristina; Biczysko, Malgorzata; Barone, Vincenzo; Peña, Isabel; Cabezas, Carlos; Alonso, José L.

    2015-01-01

    The computational composite scheme purposely set up for accurately describing the electronic structure and spectroscopic properties of small biomolecules has been applied to the first study of the rotational spectrum of 2-thiouracil. The experimental investigation was made possible thanks to the combination of the laser ablation technique with Fourier Transform Microwave spectrometers. The joint experimental-computational study allowed us to determine an accurate molecular structure and spectroscopic properties for the title molecule but, more importantly, it demonstrates a reliable approach for the accurate investigation of isolated small biomolecules. PMID:24002739

  17. Review of parallel computing methods and tools for FPGA technology

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radosław; Linczuk, Maciej; Pozniak, Krzysztof; Romaniuk, Ryszard

    2013-10-01

    Parallel computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated using parallel computing techniques. Specialized parallel computer architectures are used for accelerating specific tasks. Measuring systems in high-energy physics experiments often use FPGAs for fine-grained computation. An FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This paper presents existing methods and tools for fine-grained computation implemented in FPGAs using behavioral descriptions and high-level programming languages.

  18. From plant genomes to protein families: computational tools

    PubMed Central

    Martinez, Manuel

    2013-01-01

    The development of new high-throughput sequencing technologies has dramatically increased the number of successful genomic projects. Thus, draft genomic sequences of more than 60 plant species are currently available. Suitable bioinformatics tools are being developed to assemble, annotate and analyze the enormous number of sequences produced. In this context, specific plant comparative genomic databases are becoming powerful tools for gene family annotation in plant clades. In this mini-review, the current state of the art of genomic projects is surveyed. In addition, the computational tools developed to compare genomic data are compiled. PMID:24688740

  19. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
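    A modern Python analogue of this kind of housekeeping (building a sparse matrix, converting between storage formats, printing simple statistics, performing basic operations) is sketched below with scipy.sparse; it is meant only to illustrate the concepts such a toolkit provides, not to reproduce SPARSKIT's Fortran routines.

```python
import numpy as np
from scipy import sparse

# Build a small sparse matrix in triplet (COO) form, convert it to CSR,
# print simple statistics, and do basic linear algebra.
rows = np.array([0, 0, 1, 2, 2, 3])
cols = np.array([0, 2, 1, 0, 3, 3])
vals = np.array([4.0, 1.0, 3.0, 1.0, 5.0, 2.0])

A_coo = sparse.coo_matrix((vals, (rows, cols)), shape=(4, 4))  # triplet format
A_csr = A_coo.tocsr()                                          # compressed sparse row

# Simple statistics, in the spirit of SPARSKIT's "info" routines.
print("shape:", A_csr.shape, " nonzeros:", A_csr.nnz)
print("nonzeros per row:", np.diff(A_csr.indptr))

# Basic operations: matrix-vector product and transpose.
x = np.ones(4)
print("A @ x =", A_csr @ x)
print("A^T as dense:")
print(A_csr.T.toarray())
```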

  20. Tool Use and Performance: Relationships between Tool- and Learner-Related Characteristics in a Computer-Based Learning Environment

    ERIC Educational Resources Information Center

    Juarez-Collazo, Norma A.; Elen, Jan; Clarebout, Geraldine

    2013-01-01

    It is still unclear which tool and learner characteristics influence tool use and, consequently, performance in computer-based learning environments (CBLEs), and how they do so. This study examines the relationships between tool-related characteristics (tool presentation: non-/embedded tool and instructional cues: non-/explained tool functionality) and…

  1. WEMAP - A computer aided instructional tool for electromagnetics

    SciTech Connect

    Garg, V.K.; Ware, L.E.; Bogden, F.J.; DelVecchio, R.M.; Ashkin, M.; Woodward, W.S.

    1989-05-01

    The recent advances in computer technology and associated software have introduced new concepts and techniques into the traditional classroom environment of engineering education. WEMAP, an electromagnetic analysis computer code, is a part of a new breed of computer aided engineering tools for teaching electromagnetics to power engineering students. WEMAP is a stand-alone interactive graphics system for electromagnetic analysis, which includes electrostatics, magnetostatics, eddy currents (time harmonic as well as transient), and permanent magnet fields. These capabilities are described in this paper and examples are given to illustrate how this program enhances the classroom presentation of some of the electromagnetic phenomena.

  2. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  3. Computer Mathematical Tools: Practical Experience of Learning to Use Them

    ERIC Educational Resources Information Center

    Semenikhina, Elena; Drushlyak, Marina

    2014-01-01

    The article contains general information about the use of specialized mathematics software in the preparation of math teachers. The authors indicate the reasons to study the mathematics software. In particular, they analyze the possibility of presenting basic mathematical courses using mathematical computer tools from both a teacher and a student,…

  4. Integrating Computer-Assisted Translation Tools into Language Learning

    ERIC Educational Resources Information Center

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  5. Cartoons beyond Clipart: A Computer Tool for Storyboarding and Storywriting

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2009-01-01

    This paper describes the motivation, proposal, and early prototype testing of a computer tool for story visualisation. An analysis of current software for making various types of visual story is made; this identifies a gap between software which emphasises preset banks of artwork, and software which emphasises low-level construction and/or…

  6. A Computational Tool for Quantitative Analysis of Vascular Networks

    PubMed Central

    Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

    2011-01-01

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called “branching index” (branch points / unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636

  7. Accurate computation of surface stresses and forces with immersed boundary methods

    NASA Astrophysics Data System (ADS)

    Goza, Andres; Liska, Sebastian; Morley, Benjamin; Colonius, Tim

    2016-09-01

    Many immersed boundary methods solve for surface stresses that impose the velocity boundary conditions on an immersed body. These surface stresses may contain spurious oscillations that make them ill-suited for representing the physical surface stresses on the body. Moreover, these inaccurate stresses often lead to unphysical oscillations in the history of integrated surface forces such as the coefficient of lift. While the errors in the surface stresses and forces do not necessarily affect the convergence of the velocity field, it is desirable, especially in fluid-structure interaction problems, to obtain smooth and convergent stress distributions on the surface. To this end, we show that the equation for the surface stresses is an integral equation of the first kind whose ill-posedness is the source of spurious oscillations in the stresses. We also demonstrate that for sufficiently smooth delta functions, the oscillations may be filtered out to obtain physically accurate surface stresses. The filtering is applied as a post-processing procedure, so that the convergence of the velocity field is unaffected. We demonstrate the efficacy of the method by computing stresses and forces that converge to the physical stresses and forces for several test problems.
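    The post-processing idea, filtering grid-scale oscillations out of a raw surface stress distribution with a smooth kernel, can be illustrated with the toy example below. The cosine kernel, stencil width, and synthetic "raw" stress are illustrative choices only; the paper ties the filter to the regularized delta function of the immersed boundary method itself.

```python
import numpy as np

# Synthetic "raw" surface stress on a closed boundary: a smooth signal plus a
# grid-scale oscillation standing in for the spurious modes.
n = 200
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
true_stress = np.sin(theta) + 0.3 * np.cos(3 * theta)
raw_stress = true_stress + 0.4 * (-1.0) ** np.arange(n)

def smooth_kernel(offsets, width):
    # Smooth, compactly supported weights (raised cosine), normalized to unit sum.
    r = np.asarray(offsets, dtype=float)
    w = np.where(np.abs(r) <= width, 0.5 * (1.0 + np.cos(np.pi * r / width)), 0.0)
    return w / w.sum()

offsets = np.arange(-4, 5)                       # filter stencil in surface points
weights = smooth_kernel(offsets, width=4.0)

# Apply the filter as a periodic convolution along the boundary (post-processing).
filtered = sum(w * np.roll(raw_stress, k) for w, k in zip(weights, offsets))

print("max error before filtering:", np.max(np.abs(raw_stress - true_stress)))
print("max error after filtering: ", np.max(np.abs(filtered - true_stress)))
```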

  8. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.

  9. Optimizing odor identification testing as quick and accurate diagnostic tool for Parkinson's disease

    PubMed Central

    Mahlknecht, Philipp; Pechlaner, Raimund; Boesveldt, Sanne; Volc, Dieter; Pinter, Bernardette; Reiter, Eva; Müller, Christoph; Krismer, Florian; Berendse, Henk W.; van Hilten, Jacobus J.; Wuschitz, Albert; Schimetta, Wolfgang; Högl, Birgit; Djamshidian, Atbin; Nocker, Michael; Göbel, Georg; Gasperi, Arno; Kiechl, Stefan; Willeit, Johann; Poewe, Werner

    2016-01-01

    Introduction: The aim of this study was to evaluate odor identification testing as a quick, cheap, and reliable tool to identify PD. Methods: Odor identification with the 16-item Sniffin' Sticks test (SS-16) was assessed in a total of 646 PD patients and 606 controls from three European centers (A, B, and C), as well as 75 patients with atypical parkinsonism or essential tremor and in a prospective cohort of 24 patients with idiopathic rapid eye movement sleep behavior disorder (center A). Reduced odor sets most discriminative for PD were determined in a discovery cohort derived from a random split of PD patients and controls from center A using L1-regularized logistic regression. Diagnostic accuracy was assessed in the rest of the patients/controls as validation cohorts. Results: Olfactory performance was lower in PD patients compared with controls and non-PD patients in all cohorts (each P < 0.001). Both the full SS-16 and a subscore of the top eight discriminating odors (SS-8) were associated with an excellent discrimination of PD from controls (areas under the curve ≥0.90; sensitivities ≥83.3%; specificities ≥82.0%) and from non-PD patients (areas under the curve ≥0.91; sensitivities ≥84.1%; specificities ≥84.0%) in all cohorts. This remained unchanged when patients with >3 years of disease duration were excluded from analysis. All 8 incident PD cases among patients with idiopathic rapid eye movement sleep behavior disorder were predicted with the SS-16 and the SS-8 (sensitivity, 100%; positive predictive value, 61.5%). Conclusions: Odor identification testing provides excellent diagnostic accuracy in the distinction of PD patients from controls and diagnostic mimics. A reduced set of eight odors could be used as a quick tool in the workup of patients presenting with parkinsonism and for PD risk indication.
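    The feature-selection step can be sketched as follows: an L1-regularized logistic regression is fit on a discovery split, the items with the largest coefficients define a reduced subscore, and discrimination is checked on held-out data via the AUC. The data below are synthetic stand-ins for 16 binary odor-identification items, not the study's SS-16 results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data: PD subjects identify the first eight odors less often,
# the remaining items carry little signal.
rng = np.random.default_rng(0)
n_subj, n_items = 600, 16
is_pd = rng.integers(0, 2, size=n_subj)                  # 0 = control, 1 = PD
p_correct = np.where(is_pd[:, None], 0.55, 0.85)
p_correct[:, 8:] = 0.75
X = rng.binomial(1, p_correct)

X_tr, X_te, y_tr, y_te = train_test_split(X, is_pd, test_size=0.5, random_state=0)

# Discovery step: L1-regularized logistic regression selects discriminative items.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
model.fit(X_tr, y_tr)
top8 = np.argsort(np.abs(model.coef_[0]))[::-1][:8]      # the "SS-8" idea

# Validation step: compare the full score with the reduced 8-item subscore.
full_score = X_te.sum(axis=1)                            # higher = better olfaction
subscore = X_te[:, top8].sum(axis=1)
auc_full = roc_auc_score(y_te, -full_score)              # low scores indicate PD
auc_sub = roc_auc_score(y_te, -subscore)
print("selected items:", np.sort(top8))
print(f"AUC, 16-item score: {auc_full:.2f}   AUC, 8-item subscore: {auc_sub:.2f}")
```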

  10. Facilitating the selection and creation of accurate interatomic potentials with robust tools and characterization

    NASA Astrophysics Data System (ADS)

    Trautt, Zachary T.; Tavazza, Francesca; Becker, Chandler A.

    2015-10-01

    The Materials Genome Initiative seeks to significantly decrease the cost and time of development and integration of new materials. Within the domain of atomistic simulations, several roadblocks stand in the way of reaching this goal. While the NIST Interatomic Potentials Repository hosts numerous interatomic potentials (force fields), researchers cannot immediately determine the best choice(s) for their use case. Researchers developing new potentials, specifically those in restricted environments, lack a comprehensive portfolio of efficient tools capable of calculating and archiving the properties of their potentials. This paper elucidates one solution to these problems, which uses Python-based scripts that are suitable for rapid property evaluation and human knowledge transfer. Calculation results are visible on the repository website, which reduces the time required to select an interatomic potential for a specific use case. Furthermore, property evaluation scripts are being integrated with modern platforms to improve discoverability and access of materials property data. To demonstrate these scripts and features, we will discuss the automation of stacking fault energy calculations and their application to additional elements. While the calculation methodology was developed previously, we are using it here as a case study in simulation automation and property calculations. We demonstrate how the use of Python scripts allows for rapid calculation in a more easily managed way where the calculations can be modified, and the results presented in user-friendly and concise ways. Additionally, the methods can be incorporated into other efforts, such as openKIM.
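    The kind of Python-driven workflow described, looping over potentials, evaluating a property, and archiving machine-readable results, might look schematically like the sketch below. The potential names and the compute_stacking_fault_energy helper are hypothetical placeholders, not part of the NIST repository tooling or any existing package.

```python
import json
from pathlib import Path

# Skeleton of an automated property-evaluation loop with machine-readable output.
def compute_stacking_fault_energy(potential: str, element: str) -> float:
    # Placeholder: a real implementation would build the fault geometry,
    # relax it with the given potential (e.g., by driving a simulation
    # engine), and return the energy per unit area.
    return 0.0

potentials = ["potential_A.eam.alloy", "potential_B.eam.alloy"]   # hypothetical names
records = []
for pot in potentials:
    value = compute_stacking_fault_energy(pot, element="Al")
    records.append({
        "potential": pot,
        "element": "Al",
        "property": "intrinsic-stacking-fault-energy",
        "value_mJ_per_m2": value,
    })

# Archive the results so they can be posted alongside the potential listing.
Path("sfe_results.json").write_text(json.dumps(records, indent=2))
print(json.dumps(records, indent=2))
```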

  11. Poisonous or non-poisonous plants? DNA-based tools and applications for accurate identification.

    PubMed

    Mezzasalma, Valerio; Ganopoulos, Ioannis; Galimberti, Andrea; Cornara, Laura; Ferri, Emanuele; Labra, Massimo

    2017-01-01

    Plant exposures are among the cases most frequently reported to poison control centres worldwide. This is a growing problem due to recent societal trends oriented towards the consumption of wild plants as food, cosmetics, or medicine. At least three general causes of plant poisoning can be identified: plant misidentification, the introduction of new plant-based supplements and medicines with no controls on their safety, and the lack of regulation for the trading of herbal and phytochemical products. Moreover, an efficient screening for the occurrence of plants poisonous to humans is also desirable at the different stages of the food supply chain: from the raw material to the final transformed product. A rapid diagnosis of intoxication cases is necessary in order to provide the most reliable treatment. However, a precise taxonomic characterization of the ingested species is often challenging. In this review, we provide an overview of the emerging DNA-based tools and technologies to address the issue of poisonous plant identification. Specifically, classic DNA barcoding and its applications using High Resolution Melting (Bar-HRM) ensure high universality and rapid response, respectively, whereas High Throughput Sequencing techniques (HTS) provide a complete characterization of plant residues in complex matrices. The pros and cons of each approach have been evaluated with the final aim of proposing a general user's guide to molecular identification directed to different stakeholder categories interested in the diagnostics of poisonous plants.

  12. Satellite identification: object oriented tools for accurate maintenance of the catalog

    NASA Astrophysics Data System (ADS)

    Kamensky, S.

    2001-10-01

    Satellite identification procedures are based on a joint analysis of orbital and non-coordinate data available in the Data Processing Center. This work presents the system of software tools used for this analysis. The first section of the paper presents the general composition of the model of the space situation used by the Data Processing Center, indicating and describing the basic entities, objects, classes, attributes and techniques (with the required databases and archives) used for statistical and logical inference. The place of satellite identification tasks in this system is outlined. Then a set of classifiers used for statistical inference on satellite type (spacecraft, rocket body, fragment; specific type (series) of a spacecraft or rocket body) is described. These classifiers use orbital and non-orbital data (estimations of size, rotation, ballistic characteristics) and evaluation of the evolution of these parameters if needed. However, the actual satellite identification techniques normally involve the data on all the objects of the launch (for analysis of new satellites) and consider the structure of satellite groups and constellations, regarding the place of the analyzed satellites in them and their evolution. Examples of the enhanced efficiency of the system compared to simple cluster-based analysis are presented, as well as illustrations of the structural advantages and "richer" inference capabilities.

  13. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
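    Structurally, the slice-by-slice workflow with manual seeding and contour propagation could be organized as in the sketch below. The hybrid_level_set_2d helper is a hypothetical placeholder standing in for the paper's hybrid level set model; only the propagation logic around it is illustrated.

```python
import numpy as np

# Structural sketch of slice-by-slice tooth segmentation with contour propagation.
def hybrid_level_set_2d(slice_2d, init_mask, n_iter=200):
    # Placeholder: a real implementation would evolve a level set initialized
    # from init_mask on this CT slice and return the final binary tooth mask.
    return init_mask

def segment_tooth(volume, start_slice, seed_rc, seed_radius=3):
    # Segment one tooth through the volume, propagating contours slice to slice.
    n_slices, n_rows, n_cols = volume.shape
    masks = np.zeros(volume.shape, dtype=bool)

    # Manual initialization: a small disk around the user-selected seed point.
    rr, cc = np.ogrid[:n_rows, :n_cols]
    current = (rr - seed_rc[0]) ** 2 + (cc - seed_rc[1]) ** 2 <= seed_radius ** 2

    for z in range(start_slice, n_slices):
        current = hybrid_level_set_2d(volume[z], current)
        if not current.any():          # the tooth is no longer present
            break
        masks[z] = current             # this mask also initializes slice z + 1
    return masks

# Toy usage on a synthetic volume.
volume = np.zeros((20, 64, 64))
tooth = segment_tooth(volume, start_slice=5, seed_rc=(32, 32))
print("segmented voxels:", int(tooth.sum()))
```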

  14. Evaluation of the Astronomy Workshop's Computer Assisted Learning Tools

    NASA Astrophysics Data System (ADS)

    Deming, G. L.; Hamilton, D. P.

    2005-05-01

    The computer assisted learning tools at the Astronomy Workshop web site (http://janus.astro.umd.edu) have been available on the Internet since 1997. The site consists of 25 interactive tools designed primarily for undergraduate non-science majors. The site is popular with more than 87,000 hits to the main page since counting began in January 2000. A Google search for "collisions" lists one of the Astronomy Workshop's tools as its first item. We have begun a study of the impact of three of the tools on undergraduate learning as part of a NASA EPO grant. The first phase of our study involves student interviews, the results of which will be presented. We welcome feedback from the community. This work is funded by NASA EPO 04 410.

  15. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  16. Analysis and accurate reconstruction of incomplete data in X-ray differential phase-contrast computed tomography.

    PubMed

    Fu, Jian; Tan, Renbo; Chen, Liyuan

    2014-01-01

    X-ray differential phase-contrast computed tomography (DPC-CT) is a powerful physical and biochemical analysis tool. In practical applications, there are often challenges for DPC-CT due to insufficient data caused by few-view, bad or missing detector channels, or limited scanning angular range. They occur quite frequently because of experimental constraints from imaging hardware, scanning geometry, and the exposure dose delivered to living specimens. In this work, we analyze the influence of incomplete data on DPC-CT image reconstruction. Then, a reconstruction method is developed and investigated for incomplete data DPC-CT. It is based on an algebraic iteration reconstruction technique, which minimizes the image total variation and permits accurate tomographic imaging with less data. This work comprises a numerical study of the method and its experimental verification using a dataset measured at the W2 beamline of the storage ring DORIS III equipped with a Talbot-Lau interferometer. The numerical and experimental results demonstrate that the presented method can handle incomplete data. It will be of interest for a wide range of DPC-CT applications in medicine, biology, and nondestructive testing.
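    A much-simplified numerical sketch of the reconstruction strategy, alternating algebraic data-consistency updates with gradient-descent steps on the image total variation, is given below. A random underdetermined matrix stands in for the incomplete DPC-CT forward operator, and the step sizes and iteration counts are illustrative.

```python
import numpy as np

# Toy incomplete-data reconstruction: Landweber-type data-consistency updates
# alternated with descent steps on the image total variation (TV).
rng = np.random.default_rng(0)
n = 32
x_true = np.zeros((n, n))
x_true[8:20, 10:24] = 1.0
x_true[12:16, 14:18] = 2.0

m = (n * n) // 3                                  # fewer measurements than unknowns
A = rng.standard_normal((m, n * n)) / np.sqrt(m)  # stand-in for the forward operator
b = A @ x_true.ravel()

def tv_gradient(img, eps=1e-8):
    # Gradient of a smoothed isotropic total variation of img.
    dx = np.diff(img, axis=1, append=img[:, -1:])
    dy = np.diff(img, axis=0, append=img[-1:, :])
    mag = np.sqrt(dx ** 2 + dy ** 2 + eps)
    px, py = dx / mag, dy / mag
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

x = np.zeros(n * n)
step_data, step_tv = 0.2, 0.02
for _ in range(200):
    x = x + step_data * (A.T @ (b - A @ x))       # data-consistency update
    img = x.reshape(n, n)
    for _ in range(3):                            # a few TV descent steps
        img = img - step_tv * tv_gradient(img)
    x = img.ravel()

err = np.linalg.norm(x - x_true.ravel()) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```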

  17. The role of customized computational tools in product development.

    SciTech Connect

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  18. Raman Spectroscopy Provides a Powerful Diagnostic Tool for Accurate Determination of Albumin Glycation

    PubMed Central

    Dingari, Narahara Chari; Horowitz, Gary L.; Kang, Jeon Woong; Dasari, Ramachandra R.; Barman, Ishan

    2012-01-01

    We present the first demonstration of glycated albumin detection and quantification using Raman spectroscopy without the addition of reagents. Glycated albumin is an important marker for monitoring the long-term glycemic history of diabetics, especially as its concentrations, in contrast to glycated hemoglobin levels, are unaffected by changes in erythrocyte life times. Clinically, glycated albumin concentrations show a strong correlation with the development of serious diabetes complications including nephropathy and retinopathy. In this article, we propose and evaluate the efficacy of Raman spectroscopy for determination of this important analyte. By utilizing the pre-concentration obtained through drop-coating deposition, we show that glycation of albumin leads to subtle, but consistent, changes in vibrational features, which with the help of multivariate classification techniques can be used to discriminate glycated albumin from the unglycated variant with 100% accuracy. Moreover, we demonstrate that the calibration model developed on the glycated albumin spectral dataset shows high predictive power, even at substantially lower concentrations than those typically encountered in clinical practice. In fact, the limit of detection for glycated albumin measurements is calculated to be approximately four times lower than its minimum physiological concentration. Importantly, in relation to the existing detection methods for glycated albumin, the proposed method is also completely reagent-free, requires barely any sample preparation and has the potential for simultaneous determination of glycated hemoglobin levels as well. Given these key advantages, we believe that the proposed approach can provide a uniquely powerful tool for quantification of glycation status of proteins in biopharmaceutical development as well as for glycemic marker determination in routine clinical diagnostics in the future. PMID:22393405

  19. A tangible programming tool for children to cultivate computational thinking.

    PubMed

    Wang, Danli; Wang, Tingting; Liu, Zhen

    2014-01-01

    Games and creation are activities with good potential for developing computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5-9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  20. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  1. [Affective computing--a mysterious tool to explore human emotions].

    PubMed

    Li, Xin; Li, Honghong; Dou, Yi; Hou, Yongjie; Li, Changwu

    2013-12-01

    Perception, affection and consciousness are basic psychological functions of human beings. Affection is the subjective reflection of different kinds of objects, and together these three basic functions constitute the foundation of human thinking. Affective computing is an effective tool for revealing human affective states in order to understand the world. Our research on affective computing focuses on the relations among different affections, how they are generated, and the factors that influence them. In this paper, the affective mechanism, the basic theory of affective computing, is studied; methods for acquiring and recognizing affective information are discussed; and applications of affective computing are summarized, in order to attract more researchers into this area.

  2. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  3. The extended Koopmans' theorem for orbital-optimized methods: accurate computation of ionization potentials.

    PubMed

    Bozkaya, Uğur

    2013-10-21

    The extended Koopmans' theorem (EKT) provides a straightforward way to compute ionization potentials (IPs) from any level of theory, in principle. However, for non-variational methods, such as Møller-Plesset perturbation and coupled-cluster theories, the EKT computations can only be performed as by-products of analytic gradients as the relaxed generalized Fock matrix (GFM) and one- and two-particle density matrices (OPDM and TPDM, respectively) are required [J. Cioslowski, P. Piskorz, and G. Liu, J. Chem. Phys. 107, 6804 (1997)]. However, for the orbital-optimized methods both the GFM and OPDM are readily available and symmetric, as opposed to the standard post Hartree-Fock (HF) methods. Further, the orbital optimized methods solve the N-representability problem, which may arise when the relaxed particle density matrices are employed for the standard methods, by disregarding the orbital Z-vector contributions for the OPDM. Moreover, for challenging chemical systems, where spin or spatial symmetry-breaking problems are observed, the abnormal orbital response contributions arising from the numerical instabilities in the HF molecular orbital Hessian can be avoided by the orbital-optimization. Hence, it appears that the orbital-optimized methods are the most natural choice for the study of the EKT. In this research, the EKT for the orbital-optimized methods, such as orbital-optimized second- and third-order Møller-Plesset perturbation [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)] and coupled-electron pair theories [OCEPA(0)] [U. Bozkaya and C. D. Sherrill, J. Chem. Phys. 139, 054104 (2013)], are presented. The presented methods are applied to IPs of the second- and third-row atoms, and closed- and open-shell molecules. Performances of the orbital-optimized methods are compared with those of the counterpart standard methods. Especially, results of the OCEPA(0) method (with the aug-cc-pVTZ basis set) for the lowest IPs of the considered atoms and closed

  4. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment) powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  5. A New Computational Tool for Understanding Light-Matter Interactions

    DTIC Science & Technology

    2016-02-11

    Plasmonic resonance of a metallic nanostructure results from coherent motion of its conduction electrons driven by the incident light. The report examines the resulting magnetic moment of the metallic nanostructure, as well as the role of the polarization of the incident light, and proposes QED as a new computational tool for understanding light-matter interactions.

  6. Final Report for Foundational Tools for Petascale Computing

    SciTech Connect

    Hollingsworth, Jeff

    2015-02-12

    This project concentrated on various aspects of creating tool infrastructure to make it easier to program large-scale parallel computers. This project was collaborative with the University of Wisconsin and closely related to the project DE-SC0002606 (“Tools for the Development of High Performance Energy Applications and Systems”) . The research conducted during this project is summarized in this report. The complete details of the work are available in the ten publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (at www.dyninst.org). It also supported the Ph.D. studies of three students and one research staff member.

  7. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
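
    As a hedged illustration of the kind of lightweight analytics layer described above, the sketch below uses xarray (the successor of the "xray" package named in the abstract) and pandas to subset a gridded climate file and reduce it to a regional time series. The file name, variable name, and coordinate ranges are hypothetical placeholders, not part of the Ice2Ocean system.

        # Hypothetical sketch: subset a gridded climate file and aggregate a field
        # over a region, in the spirit of the backend components described above.
        import pandas as pd
        import xarray as xr  # successor to the "xray" package named in the abstract

        ds = xr.open_dataset("gulf_of_alaska_temps.nc")        # hypothetical file name
        box = ds["t2m"].sel(lat=slice(58, 61), lon=slice(-150, -135))  # hypothetical variable/region
        monthly = box.resample(time="1MS").mean()              # monthly means
        df = monthly.mean(dim=["lat", "lon"]).to_dataframe()   # collapse to a time series
        print(df.head())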

  8. Accelerating Battery Design Using Computer-Aided Engineering Tools: Preprint

    SciTech Connect

    Pesaran, A.; Heon, G. H.; Smith, K.

    2011-01-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used in commercial three-dimensional finite-element analysis and computational fluid dynamics to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  9. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review, the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
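
    The enrichment analysis mentioned above usually reduces to an over-representation test. The sketch below, which is illustrative and not taken from any of the reviewed tools, applies the hypergeometric test to one hypothetical pathway; all counts are made-up placeholders.

        # Illustrative over-representation analysis of a single metabolic pathway.
        from scipy.stats import hypergeom

        total_metabolites = 1200   # metabolites annotated in the reference database (assumed)
        pathway_size = 40          # metabolites assigned to the pathway being tested (assumed)
        measured_altered = 85      # significantly altered metabolites in the experiment (assumed)
        altered_in_pathway = 12    # of those, how many fall in this pathway (assumed)

        # P(X >= altered_in_pathway) under random draws without replacement
        p_value = hypergeom.sf(altered_in_pathway - 1, total_metabolites,
                               pathway_size, measured_altered)
        print(f"pathway enrichment p-value: {p_value:.3g}")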

  10. Procedure for computer-controlled milling of accurate surfaces of revolution for millimeter and far-infrared mirrors

    NASA Technical Reports Server (NTRS)

    Emmons, Louisa; De Zafra, Robert

    1991-01-01

    A simple method for milling accurate off-axis parabolic mirrors with a computer-controlled milling machine is discussed. For machines with a built-in circle-cutting routine, an exact paraboloid can be milled with few computer commands and without the use of the spherical or linear approximations. The proposed method can be adapted easily to cut off-axis sections of elliptical or spherical mirrors.
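
    The on-axis case gives a feel for why a circle-cutting routine suffices: every horizontal section of a paraboloid of revolution z = r^2/(4f) is an exact circle, so the surface can be generated as a stack of circular cuts. The sketch below is only an illustration of that geometry, not the off-axis procedure of the paper, and the focal length and depth step are arbitrary placeholders.

        # Illustrative only: radii of successive circular cuts for an on-axis paraboloid
        # z = r**2 / (4*f); focal length and depth increments are hypothetical numbers.
        import math

        focal_length = 250.0   # mm
        depth_step = 0.5       # mm per pass
        max_depth = 10.0       # mm

        for i in range(1, int(max_depth / depth_step) + 1):
            z = i * depth_step
            r = math.sqrt(4.0 * focal_length * z)   # exact circle radius at this depth
            print(f"pass {i:2d}: depth {z:5.2f} mm -> cut radius {r:7.2f} mm")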

  11. Reliability automation tool (RAT) for fault tolerance computation

    NASA Astrophysics Data System (ADS)

    Singh, N. S. S.; Hamid, N. H.; Asirvadam, V. S.

    2012-09-01

    As CMOS transistors are reduced in size, circuits built using these nano-scale transistors naturally become less reliable. This reduction in reliability, which is a measure of circuit performance, has brought many challenges in designing modern logic integrated circuits. Therefore, reliability modeling is an increasingly important subject to consider in designing modern logic integrated circuits. This drives a need to compute reliability measures for nano-scale circuits. This paper looks into the development of the reliability automation tool (RAT) for circuit reliability computation. The tool is developed using the Matlab programming language based on the reliability evaluation model called the Probabilistic Transfer Matrix (PTM). RAT allows users to significantly speed up the reliability assessment of nano-scale circuits. Users have to provide the circuit's netlist as the input to RAT for its reliability computation. The netlist signifies the circuit's description in terms of a Gate Profile Matrix (GPM), an Adjacency Computation Matrix (ACM) and a Grid Layout Matrix (GLM). GPM, ACM and GLM indicate the types of logic gates, the interconnections between these logic gates and the layout matrix of these logic gates, respectively, in a given circuit design. Here, the reliability assessment by RAT is carried out on a full adder circuit as the benchmark test circuit.
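
    To make the PTM idea concrete, the sketch below builds the probabilistic transfer matrix of a tiny NAND-based circuit and evaluates its reliability against the ideal (error-free) transfer matrix. The gate error probability, the three-gate example circuit, and the uniform input distribution are assumptions for illustration; the sketch is in Python rather than the Matlab implementation described in the abstract.

        # Minimal sketch of the Probabilistic Transfer Matrix (PTM) algebra RAT is built on.
        import numpy as np

        eps = 0.05                              # probability a gate output flips (assumed)

        # Ideal transfer matrix of a 2-input NAND: rows = inputs 00,01,10,11; cols = output 0,1
        itm_nand = np.array([[0, 1],
                             [0, 1],
                             [0, 1],
                             [1, 0]], dtype=float)
        ptm_nand = (1 - eps) * itm_nand + eps * (1 - itm_nand)   # flip output with prob eps

        # Two independent NAND gates side by side: Kronecker product of their PTMs
        ptm_layer1 = np.kron(ptm_nand, ptm_nand)                 # 4 input bits -> 2 output bits

        # Their outputs feed a third NAND gate: serial composition is matrix multiplication
        ptm_circuit = ptm_layer1 @ ptm_nand                      # 4 input bits -> 1 output bit
        itm_circuit = np.kron(itm_nand, itm_nand) @ itm_nand     # ideal (error-free) behaviour

        # Reliability: probability the output matches the ideal one, averaged over a
        # uniform distribution of the four primary inputs
        p_in = np.full(16, 1.0 / 16.0)
        reliability = float(p_in @ (ptm_circuit * itm_circuit).sum(axis=1))
        print(f"estimated circuit reliability: {reliability:.4f}")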

  12. DEM sourcing guidelines for computing 1 Eö accurate terrain corrections for airborne gravity gradiometry

    NASA Astrophysics Data System (ADS)

    Annecchione, Maria; Hatch, David; Hefford, Shane W.

    2017-01-01

    In this paper we investigate digital elevation model (DEM) sourcing requirements to compute gravity gradiometry terrain corrections accurate to 1 Eötvös (Eö) at observation heights of 80 m or more above ground. Such survey heights are typical in fixed-wing airborne surveying for resource exploration where the maximum signal-to-noise ratio is sought. We consider the accuracy of terrain corrections relevant for recent commercial airborne gravity gradiometry systems operating at the 10 Eö noise level and for future systems with a target noise level of 1 Eö. We focus on the requirements for the vertical gradient of the vertical component of gravity (Gdd) because this element of the gradient tensor is most commonly interpreted qualitatively and quantitatively. Terrain correction accuracy depends on the bare-earth DEM accuracy and spatial resolution. The bare-earth DEM accuracy and spatial resolution depend on its source. Two possible sources are considered: airborne LiDAR and the Shuttle Radar Topography Mission (SRTM). The accuracy of an SRTM DEM is affected by vegetation height. The SRTM footprint is also larger and the DEM resolution is thus lower. However, resolution requirements relax as relief decreases. Publicly available LiDAR data and 1 arc-second and 3 arc-second SRTM data were selected over four study areas representing end-member cases of vegetation cover and relief. The four study areas are presented as reference material for processing airborne gravity gradiometry data at the 1 Eö noise level with 50 m spatial resolution. From this investigation we find that, to achieve 1 Eö accuracy in the terrain correction at 80 m height, airborne LiDAR data are required even when terrain relief is a few tens of meters and the vegetation is sparse. However, as satellite ranging technologies progress, bare-earth DEMs of sufficient accuracy and resolution may be sourced at lower cost. We found that a bare-earth DEM of 10 m resolution and 2 m accuracy are sufficient for

  13. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  14. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  15. An accurate and efficient computation method of the hydration free energy of a large, complex molecule.

    PubMed

    Yoshidome, Takashi; Ekimoto, Toru; Matubayasi, Nobuyuki; Harano, Yuichi; Kinoshita, Masahiro; Ikeguchi, Mitsunori

    2015-05-07

    The hydration free energy (HFE) is a crucially important physical quantity for discussing various chemical processes in aqueous solutions. Although an explicit-solvent computation with molecular dynamics (MD) simulations is a preferable treatment of the HFE, a huge computational load has been inevitable for large, complex solutes like proteins. In the present paper, we propose an efficient computation method for the HFE. In our method, the HFE is computed as a sum of ⟨U_UV⟩/2 (where ⟨U_UV⟩ is the ensemble average of the sum of the pair interaction energies between the solute and the water molecules) and the water reorganization term mainly reflecting the excluded volume effect. Since ⟨U_UV⟩ can readily be computed through an MD simulation of the system composed of the solute and water, an efficient computation of the latter term leads to a reduction of computational load. We demonstrate that the water reorganization term can quantitatively be calculated using the morphometric approach (MA), which expresses the term as a linear combination of the four geometric measures of a solute and the corresponding coefficients determined with the energy representation (ER) method. Since the MA enables us to finish the computation of the solvent reorganization term in less than 0.1 s once the coefficients are determined, the use of the MA enables us to provide an efficient computation of the HFE even for large, complex solutes. Through the applications, we find that our method has almost the same quantitative performance as the ER method with a substantial reduction in computational load.
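
    Once the four morphometric coefficients have been fitted (with the ER method, per the abstract), evaluating the reorganization term is just a dot product with the solute's geometric measures. The sketch below shows that bookkeeping only; the coefficient values and the input measures are placeholders, not fitted or measured data.

        # Hedged sketch of the morphometric approach (MA) described above.
        import numpy as np

        # fitted coefficients for [volume, surface area, mean curvature, Gaussian curvature]
        coeffs = np.array([0.25, -0.10, 4.0, -1.5])      # hypothetical values, consistent units

        def reorganization_term(volume, area, mean_curv, gauss_curv):
            """Linear combination of the four morphometric measures of the solute."""
            measures = np.array([volume, area, mean_curv, gauss_curv])
            return float(coeffs @ measures)

        def hydration_free_energy(u_uv_average, volume, area, mean_curv, gauss_curv):
            """HFE ~ <U_UV>/2 + water reorganization term, following the scheme above."""
            return 0.5 * u_uv_average + reorganization_term(volume, area, mean_curv, gauss_curv)

        print(hydration_free_energy(-850.0, 12000.0, 5500.0, 900.0, 120.0))  # placeholder inputs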

  16. Symmetry-Based Computational Tools for Magnetic Crystallography

    NASA Astrophysics Data System (ADS)

    Perez-Mato, J. M.; Gallego, S. V.; Tasci, E. S.; Elcoro, L.; de la Flor, G.; Aroyo, M. I.

    2015-07-01

    In recent years, two important advances have opened new doors for the characterization and determination of magnetic structures. Firstly, researchers have produced computer-readable listings of the magnetic or Shubnikov space groups. Secondly, they have extended and applied the superspace formalism, which is presently the standard approach for the description of nonmagnetic incommensurate structures and their symmetry, to magnetic structures. These breakthroughs have been the basis for the subsequent development of a series of computer tools that allow a more efficient and comprehensive application of magnetic symmetry, both commensurate and incommensurate. Here we briefly review the capabilities of these computation instruments and present the fundamental concepts on which they are based, providing various examples. We show how these tools facilitate the use of symmetry arguments expressed as either a magnetic space group or a magnetic superspace group and allow the exploration of the possible magnetic orderings associated with one or more propagation vectors in a form that complements and goes beyond the traditional representation method. Special focus is placed on the programs available online at the Bilbao Crystallographic Server ( http://www.cryst.ehu.es ).

  17. A 3D assessment tool for accurate volume measurement for monitoring the evolution of cutaneous leishmaniasis wounds.

    PubMed

    Zvietcovich, Fernando; Castañeda, Benjamin; Valencia, Braulio; Llanos-Cuentas, Alejandro

    2012-01-01

    Clinical assessment and outcome metrics are serious weaknesses identified in the systematic reviews of cutaneous leishmaniasis wounds. Methods with high accuracy and low variability are required to standardize study outcomes in clinical trials. This work presents a precise, complete and noncontact 3D assessment tool for monitoring the evolution of cutaneous leishmaniasis (CL) wounds based on a 3D laser scanner and computer vision algorithms. A 3D mesh of the wound is obtained by a commercial 3D laser scanner. Then, a semi-automatic segmentation using active contours is performed to separate the ulcer from the healthy skin. Finally, metrics of volume, area, perimeter and depth are obtained from the mesh. Traditional manual 2D and 3D measurements are obtained as a gold standard. Experiments applied to phantoms and real CL wounds suggest that the proposed 3D assessment tool provides higher accuracy (error <2%) and precision rates (error <4%) than conventional manual methods (precision error <35%). This 3D assessment tool provides high accuracy metrics which deserve more formal prospective study.

  18. Applying computer simulation models as learning tools in fishery management

    USGS Publications Warehouse

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  19. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    PubMed

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which causes a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.
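
    The product form of the model is easy to sketch: removal follows Preston's law, with the nominal pressure scaled by a correcting function near the edge. The sketch below uses an assumed logistic roll-off and assumed constants purely for illustration; it is not the fitted model or the finite-element-derived pressure of the paper.

        # Illustrative product-form edge pressure model: removal = k * p * v (Preston),
        # with p = nominal pressure * correcting function near the edge. All functional
        # forms and constants below are assumptions, not the paper's fitted model.
        import numpy as np

        preston_k = 1e-6          # Preston coefficient (assumed units)
        velocity = 100.0          # relative pad/part speed, mm/s (assumed)
        nominal_pressure = 10.0   # kPa, pressure well inside the part (assumed)

        def correcting_function(distance_to_edge, overhang_ratio):
            """Assumed smooth roll-off of pressure support as the pad overhangs the edge."""
            width = 5.0 * overhang_ratio + 1.0            # mm, transition width (assumed)
            return 1.0 / (1.0 + np.exp(-distance_to_edge / width))

        def removal_rate(distance_to_edge, overhang_ratio):
            pressure = nominal_pressure * correcting_function(distance_to_edge, overhang_ratio)
            return preston_k * pressure * velocity        # Preston: dz/dt = k * p * v

        x = np.linspace(-5.0, 30.0, 8)                    # mm from the workpiece edge
        print(np.round(removal_rate(x, overhang_ratio=0.3), 9))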

  20. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  1. Tools for 3D scientific visualization in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example, visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed as well as descriptions of other hardware for digital video and film recording.

  2. Performance Evaluation Tools for Next Generation Scalable Computing Platforms

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar; Craw, James (Technical Monitor)

    1995-01-01

    The Federal High Performance Computing and Communications (HPCC) Program continues to focus on R&D in a wide range of high performance computing and communications technologies. Using its accomplishments in the past four years as building blocks towards a Global Information Infrastructure (GII), an Implementation Plan that identifies six Strategic Focus Areas for R&D has been proposed. This white paper argues that a new generation of system software and programming tools must be developed to support these focus areas, so that the R&D we invest in today can lead to technology pay-offs a decade from now. The Global Computing Infrastructure (GCI) in the Year 2000 and Beyond would consist of thousands of powerful computing nodes connected via high-speed networks across the globe. Users will be able to obtain computing and information services from the GCI with the ease of plugging a toaster into an electrical outlet on the wall anywhere in the country. Developing and managing the GCI requires performance prediction and monitoring capabilities that do not exist. Various accomplishments in this field today must be integrated and expanded to support this vision.

  3. Lensfree Computational Microscopy Tools and their Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has enabled researchers to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they have not reached beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow the creation of imaging systems that better fulfill the specific needs of clinical or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms have been built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated by computational techniques. These computational methods are utilized for various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g., improving the spatial resolution, undoing the light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer-pattern, and aliasing artifacts. Another method, iterative phase retrieval, is utilized to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms.
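
    The iterative phase retrieval mentioned above can be sketched with a generic Gerchberg-Saxton-style loop: propagate between the sensor and object planes, enforce the measured hologram amplitude at the sensor, and apply a simple transmission constraint at the object. The sketch below is a hedged, textbook-style illustration with made-up wavelength, pixel size, sample distance and object; it is not the dissertation's reconstruction pipeline.

        # Hedged sketch of iterative in-line hologram phase retrieval (all parameters assumed).
        import numpy as np

        def angular_spectrum(field, wavelength, dx, z):
            """Propagate a complex field by distance z with the angular spectrum method."""
            n = field.shape[0]
            fx = np.fft.fftfreq(n, d=dx)
            FX, FY = np.meshgrid(fx, fx)
            arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
            kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent waves dropped
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

        def retrieve(hologram_amplitude, wavelength, dx, z, n_iter=50):
            field = hologram_amplitude.astype(complex)              # start with flat phase
            for _ in range(n_iter):
                obj = angular_spectrum(field, wavelength, dx, -z)   # back-propagate to object
                obj = np.minimum(np.abs(obj), 1.0) * np.exp(1j * np.angle(obj))  # transmission <= 1
                field = angular_spectrum(obj, wavelength, dx, +z)   # forward to sensor plane
                field = hologram_amplitude * np.exp(1j * np.angle(field))        # keep measured amplitude
            return obj

        # synthesize a toy in-line hologram of an absorbing disc, then reconstruct it
        x = (np.arange(256) - 128) * 1.12e-6
        X, Y = np.meshgrid(x, x)
        obj_true = np.where(X ** 2 + Y ** 2 < (20e-6) ** 2, 0.3, 1.0).astype(complex)
        holo = np.abs(angular_spectrum(obj_true, 5.3e-7, 1.12e-6, 1e-3))
        recovered = retrieve(holo, wavelength=5.3e-7, dx=1.12e-6, z=1e-3)
        print(float(np.abs(recovered - obj_true).mean()))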

  4. A fourth order accurate finite difference scheme for the computation of elastic waves

    NASA Technical Reports Server (NTRS)

    Bayliss, A.; Jordan, K. E.; Lemesurier, B. J.; Turkel, E.

    1986-01-01

    A finite difference scheme for elastic waves is introduced. The model is based on the first order system of equations for the velocities and stresses. The differencing is fourth order accurate on the spatial derivatives and second order accurate in time. The model is tested on a series of examples including the Lamb problem, scattering from plane interfaces and scattering from a fluid-elastic interface. The scheme is shown to be effective for these problems. The accuracy and stability are insensitive to the Poisson ratio. For the class of problems considered here it is found that the fourth order scheme requires from two-thirds to one-half the resolution of a typical second order scheme to give comparable accuracy.
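
    A one-dimensional toy version conveys the structure of such schemes: velocities and stresses live on a staggered grid, the spatial derivative uses the standard fourth-order weights 9/8 and -1/24, and time stepping is second-order leapfrog. The sketch below is not the paper's scheme; material and grid values are arbitrary placeholders.

        # Minimal 1D illustration: velocity-stress update, fourth order in space
        # (staggered 9/8, -1/24 stencil), second order in time. Placeholder values.
        import numpy as np

        nx, dx, dt, nt = 400, 1.0, 2e-4, 1000
        rho, mu = 2000.0, 2000.0 * 1500.0 ** 2         # density, shear modulus (placeholders)

        v = np.zeros(nx)        # particle velocity at integer grid points
        s = np.zeros(nx)        # stress at half-integer grid points
        v[nx // 2] = 1.0        # impulsive initial condition

        c1, c2 = 9.0 / 8.0, -1.0 / 24.0                # fourth-order staggered-grid weights

        def d4(f):
            """Fourth-order staggered first derivative of f, interior points only."""
            df = np.zeros_like(f)
            df[2:-2] = (c1 * (f[3:-1] - f[2:-2]) + c2 * (f[4:] - f[1:-3])) / dx
            return df

        for _ in range(nt):                            # leapfrog in time (second order)
            s += dt * mu * d4(v)                       # stress update from velocity gradient
            v += dt / rho * d4(np.roll(s, 1))          # velocity update from stress gradient
        print(f"max |v| after {nt} steps: {np.abs(v).max():.3e}")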

  5. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids, and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  6. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  7. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  8. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review.

    PubMed

    Misra, Sarthak; Ramesh, K T; Okamura, Allison M

    2008-10-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models.

  9. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  10. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    PubMed

    Zhang, Yuan; Sun, Yanni; Cole, James R

    2014-08-01

    Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. Lacking quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material.

  11. Covariance approximation for fast and accurate computation of channelized Hotelling observer statistics

    SciTech Connect

    Bonetto, Paola; Qi, Jinyi; Leahy, Richard M.

    1999-10-01

    We describe a method for computing linear observer statistics for maximum a posteriori (MAP) reconstructions of PET images. The method is based on a theoretical approximation for the mean and covariance of MAP reconstructions. In particular, we derive here a closed form for the channelized Hotelling observer (CHO) statistic applied to 2D MAP images. We show reasonably good correspondence between these theoretical results and Monte Carlo studies. The accuracy and low computational cost of the approximation allow us to analyze the observer performance over a wide range of operating conditions and parameter settings for the MAP reconstruction algorithm.
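
    For context, the sketch below shows the sample-based channelized Hotelling observer statistic that theoretical approximations such as the one above are typically validated against: channelize two image ensembles, form the Hotelling template from the average channel covariance, and report a detectability SNR. The channel templates and image ensembles here are random placeholders, not the MAP reconstructions studied in the paper.

        # Hedged sketch of the sample-based channelized Hotelling observer (CHO) statistic.
        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, n_chan, n_img = 64 * 64, 6, 200
        T = rng.normal(size=(n_chan, n_pix))            # placeholder channel templates

        imgs_signal = rng.normal(loc=0.1, size=(n_img, n_pix))   # placeholder image ensembles
        imgs_noise = rng.normal(loc=0.0, size=(n_img, n_pix))

        v1 = imgs_signal @ T.T                          # channel outputs, signal-present
        v0 = imgs_noise @ T.T                           # channel outputs, signal-absent

        dv = v1.mean(axis=0) - v0.mean(axis=0)          # mean channel-output difference
        S = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
        w = np.linalg.solve(S, dv)                      # Hotelling template in channel space

        t1, t0 = v1 @ w, v0 @ w                         # observer test statistics
        snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var(ddof=1) + t0.var(ddof=1)))
        print(f"CHO detectability (SNR): {snr:.2f}")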

  12. Time-Accurate Computations of Isolated Circular Synthetic Jets in Crossflow

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Schaeffler, N. W.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2007-01-01

    Results from unsteady Reynolds-averaged Navier-Stokes computations are described for two different synthetic jet flows issuing into a turbulent boundary layer crossflow through a circular orifice. In one case the jet effect is mostly contained within the boundary layer, while in the other case the jet effect extends beyond the boundary layer edge. Both cases have momentum flux ratios less than 2. Several numerical parameters are investigated, and some lessons learned regarding the CFD methods for computing these types of flow fields are summarized. Results in both cases are compared to experiment.

  13. MULTICORR: A Computer Program for Fast, Accurate, Small-Sample Testing of Correlational Pattern Hypotheses.

    ERIC Educational Resources Information Center

    Steiger, James H.

    1979-01-01

    The program presented computes a chi-square statistic for testing pattern hypotheses on correlation matrices. The statistic is based on a multivariate generalization of the Fisher r-to-z transformation. This statistic has small sample performance which is superior to an analogous likelihood ratio statistic obtained via the analysis of covariance…
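
    The Fisher r-to-z transformation underlying the statistic is simple to illustrate. The sketch below applies it to the textbook special case of testing whether two correlations from independent samples are equal; this is only an illustration of the transformation, not MULTICORR's multivariate pattern test, and the sample values are made up.

        # Hedged illustration of the Fisher r-to-z transformation: chi-square test of the
        # pattern hypothesis that two independent-sample correlations are equal.
        import math
        from scipy.stats import chi2

        def fisher_z(r):
            return 0.5 * math.log((1.0 + r) / (1.0 - r))

        def test_equal_correlations(r1, n1, r2, n2):
            z1, z2 = fisher_z(r1), fisher_z(r2)
            var = 1.0 / (n1 - 3) + 1.0 / (n2 - 3)        # variance of z1 - z2
            stat = (z1 - z2) ** 2 / var                  # chi-square with 1 df
            return stat, chi2.sf(stat, df=1)

        stat, p = test_equal_correlations(r1=0.62, n1=85, r2=0.45, n2=90)   # made-up data
        print(f"chi-square = {stat:.2f}, p = {p:.3f}")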

  14. Computational methods toward accurate RNA structure prediction using coarse-grained and all-atom models.

    PubMed

    Krokhotin, Andrey; Dokholyan, Nikolay V

    2015-01-01

    Computational methods can provide significant insights into RNA structure and dynamics, bridging the gap in our understanding of the relationship between structure and biological function. Simulations enrich and enhance our understanding of data derived on the bench, as well as provide feasible alternatives to costly or technically challenging experiments. Coarse-grained computational models of RNA are especially important in this regard, as they allow analysis of events occurring in timescales relevant to RNA biological function, which are inaccessible through experimental methods alone. We have developed a three-bead coarse-grained model of RNA for discrete molecular dynamics simulations. This model is efficient in de novo prediction of short RNA tertiary structure, starting from RNA primary sequences of less than 50 nucleotides. To complement this model, we have incorporated additional base-pairing constraints and have developed a bias potential reliant on data obtained from hydroxyl probing experiments that guide RNA folding to its correct state. By introducing experimentally derived constraints to our computer simulations, we are able to make reliable predictions of RNA tertiary structures up to a few hundred nucleotides. Our refined model exemplifies a valuable benefit achieved through integration of computation and experimental methods.

  15. Time-Accurate Computation of Viscous Flow Around Deforming Bodies Using Overset Grids

    SciTech Connect

    Fast, P; Henshaw, W D

    2001-04-02

    Dynamically evolving boundaries and deforming bodies interacting with a flow are commonly encountered in fluid dynamics. However, the numerical simulation of flows with dynamic boundaries is difficult with current methods. We propose a new method for studying such problems. The key idea is to use the overset grid method with a thin, body-fitted grid near the deforming boundary, while using fixed Cartesian grids to cover most of the computational domain. Our approach combines the strengths of earlier moving overset grid methods for rigid body motion, and unstructured grid methods for flow-structure interactions. Large scale deformation of the flow boundaries can be handled without a global regridding, and in a computationally efficient way. In terms of computational cost, even a full overset grid regridding is significantly cheaper than a full regridding of an unstructured grid for the same domain, especially in three dimensions. Numerical studies are used to verify accuracy and convergence of our flow solver. As a computational example, we consider two-dimensional incompressible flow past a flexible filament with prescribed dynamics.

  16. Computing Highly Accurate Spectroscopic Line Lists that Cover a Large Temperature Range for Characterization of Exoplanet Atmospheres

    NASA Astrophysics Data System (ADS)

    Lee, T. J.; Huang, X.; Schwenke, D. W.

    2013-12-01

    Over the last decade, it has become apparent that the most effective approach for determining highly accurate rotational and rovibrational line lists for molecules of interest in planetary atmospheres is through a combination of high-resolution laboratory experiments coupled with state-of-the-art ab initio quantum chemistry methods. The approach involves computing the most accurate potential energy surface (PES) possible using state-of-the-art electronic structure methods, followed by computing rotational and rovibrational energy levels using an exact variational method to solve the nuclear Schrödinger equation. Then, reliable experimental data from high-resolution experiments are used to refine the ab initio PES in order to improve the accuracy of the computed energy levels and transition energies. From the refinement step, we have been able to achieve an accuracy of approximately 0.015 cm-1 for rovibrational transition energies, and even better for purely rotational transitions. This combined 'experiment / theory' approach allows for determination of essentially a complete line list, with hundreds of millions of transitions, and having the transition energies and intensities be highly accurate. Our group has successfully applied this approach to determine highly accurate line lists for NH3 and CO2 (and isotopologues), and very recently for SO2 and isotopologues. Here I will report our latest results for SO2 including all isotopologues. Comparisons to the available data in HITRAN2012 and other available databases will be shown, though we note that our line lists for SO2 are significantly more complete than those in any other database. Since it is important to span a large temperature range in order to model the spectral signature of exoplanets, we will also demonstrate how the spectra change on going from low temperatures (100 K) to higher temperatures (500 K).

  17. Enabling fast, stable and accurate peridynamic computations using multi-time-step integration

    SciTech Connect

    Lindsay, P.; Parks, M. L.; Prakash, A.

    2016-04-13

    Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.

  18. Enabling fast, stable and accurate peridynamic computations using multi-time-step integration

    DOE PAGES

    Lindsay, P.; Parks, M. L.; Prakash, A.

    2016-04-13

    Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.

  19. Matrix-vector multiplication using digital partitioning for more accurate optical computing

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable to or greater than those of electronic computers as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
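
    A software emulation makes the partitioning idea concrete: each vector entry is split into base-B digits, one low-precision (analog-like) matrix-vector product is formed per digit plane, and the partial results are recombined digitally with powers of B. The base, problem size, and integer test data below are assumptions for illustration, not parameters from the paper.

        # Hedged emulation of digital partitioning for matrix-vector multiplication.
        import numpy as np

        rng = np.random.default_rng(1)
        B, n_digits = 4, 4                              # base and digits per entry (assumed)
        A = rng.integers(0, 16, size=(5, 5))            # small non-negative test matrix
        x = rng.integers(0, B ** n_digits, size=5)      # vector representable in n_digits

        # split x into base-B digit planes: x = sum_k digits[k] * B**k
        digits = np.array([(x // B ** k) % B for k in range(n_digits)])

        # one low-accuracy matrix-vector product per digit plane (the "analog" passes)
        partials = np.array([A @ d for d in digits])

        # digital recombination of the partial products
        y = sum(partials[k] * B ** k for k in range(n_digits))
        assert np.array_equal(y, A @ x)                 # matches the full-precision result
        print(y)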

  20. Enabling Computational Technologies for the Accurate Prediction/Description of Molecular Interactions in Condensed Phases

    DTIC Science & Technology

    2014-10-08

    Publications resulting from this effort include: Marenich, Christopher J. Cramer, Donald G. Truhlar, and Chang-Guo Zhan, "Free Energies of Solvation with Surface, Volume, and Local Electrostatic Effects and Atomic Surface Tensions to Represent the First Solvation Shell" (reprint), Journal of Chemical Theory and Computation (2010); and a study of the Gibbs free energy of solvation and dissociation of HCl in water via Monte Carlo simulations and continuum solvation models, Physical Chemistry

  1. An Accurate Method to Compute the Parasitic Electromagnetic Radiations of Real Solar Panels

    NASA Astrophysics Data System (ADS)

    Andreiu, G.; Panh, J.; Reineix, A.; Pelissou, P.; Girard, C.; Delannoy, P.; Romeuf, X.; Schmitt, D.

    2012-05-01

    The methodology [1] for computing the parasitic electromagnetic (EM) radiation of a solar panel is substantially improved in this paper to model real solar panels. Thus, honeycomb composite panels, triple-junction solar cells and series or shunt regulation systems can now be taken into account. After a brief summary of the methodology, the improvements are detailed. Finally, some encouraging frequency- and time-domain results for the magnetic field emitted by a real solar panel are presented.

  2. Computational tools and resources for prediction and analysis of gene regulatory regions in the chick genome.

    PubMed

    Khan, Mohsin A F; Soto-Jimenez, Luz Mayela; Howe, Timothy; Streit, Andrea; Sosinsky, Alona; Stern, Claudio D

    2013-05-01

    The discovery of cis-regulatory elements is a challenging problem in bioinformatics, owing to distal locations and context-specific roles of these elements in controlling gene regulation. Here we review the current bioinformatics methodologies and resources available for systematic discovery of cis-acting regulatory elements and conserved transcription factor binding sites in the chick genome. In addition, we propose and make available a novel workflow using computational tools that integrate CTCF analysis to predict putative insulator elements, enhancer prediction, and TFBS analysis. To demonstrate the usefulness of this computational workflow, we then use it to analyze the locus of the gene Sox2 whose developmental expression is known to be controlled by a complex array of cis-acting regulatory elements. The workflow accurately predicts most of the experimentally verified elements along with some that have not yet been discovered. A web version of the CTCF tool, together with instructions for using the workflow can be accessed from http://toolshed.g2.bx.psu.edu/view/mkhan1980/ctcf_analysis. For local installation of the tool, relevant Perl scripts and instructions are provided in the directory named "code" in the supplementary materials.

  3. Computational tools and resources for prediction and analysis of gene regulatory regions in the chick genome

    PubMed Central

    Khan, Mohsin A. F.; Soto-Jimenez, Luz Mayela; Howe, Timothy; Streit, Andrea; Sosinsky, Alona; Stern, Claudio D.

    2013-01-01

    The discovery of cis-regulatory elements is a challenging problem in bioinformatics, owing to distal locations and context-specific roles of these elements in controlling gene regulation. Here we review the current bioinformatics methodologies and resources available for systematic discovery of cis-acting regulatory elements and conserved transcription factor binding sites in the chick genome. In addition, we propose and make available a novel workflow using computational tools that integrate CTCF analysis to predict putative insulator elements, enhancer prediction and TFBS analysis. To demonstrate the usefulness of this computational workflow, we then use it to analyze the locus of the gene Sox2 whose developmental expression is known to be controlled by a complex array of cis-acting regulatory elements. The workflow accurately predicts most of the experimentally verified elements along with some that have not yet been discovered. A web version of the CTCF tool, together with instructions for using the workflow can be accessed from http://toolshed.g2.bx.psu.edu/view/mkhan1980/ctcf_analysis. For local installation of the tool, relevant Perl scripts and instructions are provided in the directory named “code” in the supplementary materials. PMID:23355428

  4. Monte Carlo tolerancing tool using nonsequential ray tracing on a computer cluster

    NASA Astrophysics Data System (ADS)

    Reimer, Christopher

    2010-08-01

    The development of a flexible tolerancing tool for illumination systems based on Matlab® and Zemax® is described in this paper. Two computationally intensive techniques are combined: Monte Carlo tolerancing and non-sequential ray tracing. Implementation of the tool on a computer cluster allows for relatively rapid tolerancing. This paper explores the tool structure, describing how the task of tolerancing is split between Zemax and Matlab. An equation is derived that determines the number of simulated ray traces needed to accurately resolve illumination uniformity. Two examples of tolerancing illuminators are given. The first is a projection system consisting of a pico-DLP, a light pipe, a TIR prism and the critical illumination relay optics. The second is a wide-band, high-performance Köhler illuminator, which includes a modified molded LED as the light source. As high-performance illumination systems evolve, the practice of applying standard workshop tolerances to these systems may need to be re-examined.
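
    The ray-count estimate and the overall tolerancing loop can be illustrated with a short, self-contained sketch. This is not the Matlab/Zemax tool described above: the merit function, the toy ray model and the tolerance ranges are assumptions made only for illustration, and the rays-per-bin estimate is the standard Poisson-statistics argument rather than the paper's derived equation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rays_per_bin_needed(rel_error):
    """Poisson-statistics estimate: relative error of a bin count ~ 1/sqrt(N)."""
    return int(np.ceil(1.0 / rel_error**2))

def uniformity(irradiance):
    """Simple uniformity metric: min/max over the illuminated bins."""
    return irradiance.min() / irradiance.max()

def toy_illuminator(decenter_mm, tilt_deg, n_bins=32, n_rays=100_000):
    """Stand-in for a non-sequential ray trace: bins ray hits on a 1-D target.
    A real tool would call the ray tracer here instead."""
    x = rng.normal(loc=decenter_mm + 0.05 * tilt_deg, scale=1.0, size=n_rays)
    hits, _ = np.histogram(x, bins=n_bins, range=(-2.0, 2.0))
    return hits / hits.mean()

# Monte Carlo tolerancing loop: draw each toleranced parameter from its range,
# evaluate the merit function, and build up its statistical distribution.
n_trials = 200
merits = []
for _ in range(n_trials):
    decenter = rng.uniform(-0.1, 0.1)   # mm, hypothetical tolerance
    tilt = rng.uniform(-0.5, 0.5)       # deg, hypothetical tolerance
    merits.append(uniformity(toy_illuminator(decenter, tilt)))

merits = np.array(merits)
print(f"rays/bin for 1% resolution: {rays_per_bin_needed(0.01)}")
print(f"uniformity: mean={merits.mean():.3f}, 5th percentile={np.percentile(merits, 5):.3f}")
```

    In the real tool, toy_illuminator would be replaced by a call that drives the non-sequential ray trace on a cluster node and returns the detector irradiance.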

  5. Accurate computation of weights in classical Gauss-Christoffel quadrature rules

    SciTech Connect

    Yakimiw, E.

    1996-12-01

    For many classical Gauss-Christoffel quadrature rules there does not exist a method which guarantees a uniform level of accuracy for the Gaussian quadrature weights at all quadrature nodes unless the nodes are known exactly. More disturbing, some algebraic expressions for these weights exhibit an excessive sensitivity to even the smallest perturbations in the node location. This sensitivity increases rapidly with the order of the quadrature rule. With the advent of more powerful computers, very high order quadratures are now in common use, and the loss of accuracy in the weights has become a problem that must be addressed. A simple but efficient and general method is presented for improving the accuracy of the computed quadrature weights even when the nodes carry a significant error. In addition, a highly efficient root-finding iterative technique with superlinear convergence rates for computing the nodes is developed. It uses solely the quadrature polynomials and their first derivatives. A comparison of this method with the eigenvalue method of Golub and Welsch implemented in most standard software libraries is made. The proposed method outperforms the latter in both accuracy and efficiency. The Legendre, Lobatto, Radau, Hermite, and Laguerre quadrature rules are examined. 22 refs., 7 figs., 5 tabs.
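
    A minimal sketch of the kind of Newton-type node computation the abstract refers to, shown here for the Legendre case: the iteration uses only the quadrature polynomial and its first derivative, and the weights follow from the standard formula w_i = 2/((1 - x_i^2) P_n'(x_i)^2). This is a generic textbook scheme, not the author's improved weight-correction method.

```python
import numpy as np

def legendre_p_and_dp(n, x):
    """Evaluate P_n(x) and P_n'(x) via the three-term recurrence."""
    p_prev, p = np.ones_like(x), x.copy()
    for k in range(2, n + 1):
        p_prev, p = p, ((2 * k - 1) * x * p - (k - 1) * p_prev) / k
    dp = n * (x * p - p_prev) / (x**2 - 1.0)
    return p, dp

def gauss_legendre(n, tol=1e-15, max_iter=100):
    """Nodes by Newton iteration on P_n; weights w_i = 2 / ((1 - x_i^2) P_n'(x_i)^2)."""
    # Standard initial guesses for the roots of P_n
    x = np.cos(np.pi * (np.arange(1, n + 1) - 0.25) / (n + 0.5))
    for _ in range(max_iter):
        p, dp = legendre_p_and_dp(n, x)
        dx = p / dp
        x -= dx
        if np.max(np.abs(dx)) < tol:
            break
    _, dp = legendre_p_and_dp(n, x)
    w = 2.0 / ((1.0 - x**2) * dp**2)
    return x, w

x, w = gauss_legendre(20)
# Sanity checks: weights sum to 2, and the rule integrates x^2 on [-1, 1] exactly.
print(abs(w.sum() - 2.0), abs(np.dot(w, x**2) - 2.0 / 3.0))
```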

  6. Accurate computation and continuation of homoclinic and heteroclinic orbits for singular perturbation problems

    NASA Technical Reports Server (NTRS)

    Vaughan, William W.; Friedman, Mark J.; Monteiro, Anand C.

    1993-01-01

    In earlier papers, Doedel and the authors have developed a numerical method and derived error estimates for the computation of branches of heteroclinic orbits for a system of autonomous ordinary differential equations in R(exp n). The idea of the method is to reduce a boundary value problem on the real line to a boundary value problem on a finite interval by using a local (linear or higher order) approximation of the stable and unstable manifolds. A practical limitation for the computation of homoclinic and heteroclinic orbits has been the difficulty in obtaining starting orbits. Typically these were obtained from a closed form solution or via a homotopy from a known solution. Here we consider extensions of our algorithm which allow us to obtain starting orbits on the continuation branch in a more systematic way as well as make the continuation algorithm more flexible. In applications, we use the continuation software package AUTO in combination with some initial value software. The examples considered include the computation of homoclinic orbits in a singular perturbation problem and in a model of the wall region of a turbulent fluid boundary layer.

  7. Iofetamine I 123 single photon emission computed tomography is accurate in the diagnosis of Alzheimer's disease

    SciTech Connect

    Johnson, K.A.; Holman, B.L.; Rosen, T.J.; Nagel, J.S.; English, R.J.; Growdon, J.H. )

    1990-04-01

    To determine the diagnostic accuracy of iofetamine hydrochloride I 123 (IMP) with single photon emission computed tomography in Alzheimer's disease, we studied 58 patients with AD and 15 age-matched healthy control subjects. We used a qualitative method to assess regional IMP uptake in the entire brain and to rate image data sets as normal or abnormal without knowledge of subjects' clinical classification. The sensitivity and specificity of IMP with single photon emission computed tomography in AD were 88% and 87%, respectively. In 15 patients with mild cognitive deficits (Blessed Dementia Scale score, less than or equal to 10), sensitivity was 80%. With the use of a semiquantitative measure of regional cortical IMP uptake, the parietal lobes were the most functionally impaired in AD and the most strongly associated with the patients' Blessed Dementia Scale scores. These results indicated that IMP with single photon emission computed tomography may be a useful adjunct in the clinical diagnosis of AD in early, mild disease.

  8. Necessary conditions for accurate computations of three-body partial decay widths

    NASA Astrophysics Data System (ADS)

    Garrido, E.; Jensen, A. S.; Fedorov, D. V.

    2008-09-01

    The partial width for decay of a resonance into three fragments is largely determined at distances where the energy is smaller than the effective potential producing the corresponding wave function. At short distances the many-body properties are accounted for by preformation or spectroscopic factors. We use the adiabatic expansion method combined with the WKB approximation to obtain the indispensable cluster model wave functions at intermediate and larger distances. We test the concept by deriving conditions for the minimal basis expressed in terms of partial waves and radial nodes. We compare results for different effective interactions and methods. Agreement is found with experimental values for a sufficiently large basis. We illustrate the ideas with realistic examples from α emission of C12 and two-proton emission of Ne17. Basis requirements for accurate momentum distributions are briefly discussed.

  9. Computational Tools for Modeling and Measuring Chromosome Structure

    NASA Astrophysics Data System (ADS)

    Ross, Brian Christopher

    DNA conformation within cells has many important biological implications, but there are challenges both in modeling DNA due to the need for specialized techniques, and experimentally since tracing out in vivo conformations is currently impossible. This thesis contributes two computational projects to these efforts. The first project is a set of online and offline calculators of conformational statistics using a variety of published and unpublished methods, addressing the current lack of DNA model-building tools intended for general use. The second project is a reconstructive analysis that could enable in vivo mapping of DNA conformation at high resolution with current experimental technology. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  10. Materials by numbers: Computations as tools of discovery

    PubMed Central

    Landman, Uzi

    2005-01-01

    Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding the formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold, which in bulk form is notorious for being chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210

  11. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 70s, when microsystem technology was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink-jet nozzles, etc., and the market is just about to take off. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economical constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  12. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  13. Accurate and scalable O(N) algorithm for first-principles molecular-dynamics computations on large parallel computers.

    PubMed

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-31

    We present the first truly scalable first-principles molecular dynamics algorithm with O(N) complexity and controllable accuracy, capable of simulating systems with finite band gaps of sizes that were previously impossible with this degree of accuracy. By avoiding global communications, we provide a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wave functions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 101,952 atoms on 23,328 processors, with a wall-clock time of the order of 1 min per molecular dynamics time step and numerical error on the forces of less than 7×10(-4)  Ha/Bohr.

  14. Accurate and Scalable O(N) Algorithm for First-Principles Molecular-Dynamics Computations on Large Parallel Computers

    SciTech Connect

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    We present the first truly scalable first-principles molecular dynamics algorithm with O(N) complexity and controllable accuracy, capable of simulating systems with finite band gaps of sizes that were previously impossible with this degree of accuracy. By avoiding global communications, we provide a practical computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic wave functions are confined, and a cutoff beyond which the components of the overlap matrix can be omitted when computing selected elements of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to 101 952 atoms on 23 328 processors, with a wall-clock time of the order of 1 min per molecular dynamics time step and numerical error on the forces of less than 7×10^-4 Ha/Bohr.

  15. Three computational tools for predicting bacterial essential genes.

    PubMed

    Guo, Feng-Biao; Ye, Yuan-Nong; Ning, Lu-Wen; Wei, Wen

    2015-01-01

    Essential genes are those genes indispensable for the survival of any living cell. Bacterial essential genes constitute the cornerstones of synthetic biology and are often attractive targets in the development of antibiotics and vaccines. Because identifying essential genes with wet-lab methods is often expensive and labor-intensive, scientists have turned to computational prediction as an alternative. To help address this issue, our research group (CEFG: group of Computational, Comparative, Evolutionary and Functional Genomics, http://cefg.uestc.edu.cn) has constructed three online services to predict essential genes in bacterial genomes. These freely available tools are applicable to single gene sequences without annotated functions, single genes with definite names, and complete genomes of bacterial strains. To ensure reliable predictions, the investigated species should belong to the same family (for EGP) or phylum (for CEG_Match and Geptop) as one of the reference species. As pilot software for this problem, their prediction accuracies have been assessed and compared with existing algorithms; notably, none of the other published algorithms is available as an online service. We hope these services at CEFG will help scientists and researchers in the field of essential genes.

  16. A distributed computing tool for generating neural simulation databases.

    PubMed

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
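
    The fetch-run-report cycle of such a screen-saver cluster can be sketched generically. The endpoint paths, JSON fields and the parameter-passing convention below are hypothetical; the real NeuronPM client is a Windows screen saver talking to an Apache/PHP/MySQL server, not this script.

```python
import json
import subprocess
import time
import urllib.request

SERVER = "http://example.org/neuronpm"   # hypothetical server URL

def fetch_job():
    """Ask the server for the next parameter set to simulate (hypothetical API)."""
    with urllib.request.urlopen(f"{SERVER}/next_job") as resp:
        return json.loads(resp.read())

def report_result(job_id, stdout_text):
    """Post simulation output back to the server (hypothetical API)."""
    payload = json.dumps({"job_id": job_id, "result": stdout_text}).encode()
    req = urllib.request.Request(f"{SERVER}/report", data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req).read()

def run_simulation(params):
    """Write the assigned parameters to a file and invoke NEURON on a model
    script; 'model.hoc' reading 'params.json' is an assumed convention."""
    with open("params.json", "w") as f:
        json.dump(params, f)
    out = subprocess.run(["nrniv", "model.hoc"], capture_output=True,
                         text=True, timeout=3600)
    return out.stdout

if __name__ == "__main__":
    while True:                          # poll for work during idle time
        job = fetch_job()
        if not job:
            time.sleep(60)
            continue
        report_result(job["job_id"], run_simulation(job["params"]))
```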

  17. SUPIN: A Computational Tool for Supersonic Inlet Design

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2016-01-01

    A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.

  18. Design and highly accurate 3D displacement characterization of monolithic SMA microgripper using computer vision

    NASA Astrophysics Data System (ADS)

    Bellouard, Yves; Sulzmann, Armin; Jacot, Jacques; Clavel, Reymond

    1998-01-01

    In the robotics field, several grippers have been developed using SMA technologies, but, so far, SMA is only used as the actuating part of the mechanical device. However, mechanical devices require assembly, and in some cases this means friction. In the case of micro-grippers, this becomes a major problem due to the small size of the components. In this paper, a new monolithic concept of micro-gripper is presented. This concept is applied to the grasping of sub-millimeter optical elements such as Selfoc lenses and the fastening of optical fibers. Measurements are performed using a newly developed high-precision 3D computer vision tracking system to characterize the spatial positions of the micro-gripper in action. To characterize the relative motion of the micro-gripper, its natural surface texture is used to compute 3D displacement. The microscope CCD image receives high-frequency changes in light intensity from the surface of the gripper. Using high-resolution camera calibration, passive autofocus algorithms and 2D object recognition, the position of the micro-gripper can be characterized in the 3D workspace, allowing it to be guided in future micro-assembly tasks.

  19. Accurate quantification of width and density of bone structures by computed tomography

    SciTech Connect

    Hangartner, Thomas N.; Short, David F.

    2007-10-15

    In computed tomography (CT), the representation of edges between objects of different densities is influenced by the limited spatial resolution of the scanner. This results in the misrepresentation of the density of narrow objects, leading to errors of up to 70% or more. Our interest is in the imaging and measurement of narrow bone structures, and the issues are the same for imaging with clinical CT scanners, peripheral quantitative CT scanners or micro-CT scanners. Mathematical models, phantoms and tests with patient data led to the following procedures: (i) extract density profiles at one-degree increments from the CT images at right angles to the bone boundary; (ii) consider the outer and inner edge of each profile separately due to different adjacent soft tissues; (iii) measure the width of each profile based on a threshold at a fixed percentage of the difference between the soft-tissue value and a first-approximation bone value; (iv) correct the underlying material density of bone for each profile based on the measured width with the help of the density-versus-width curve obtained from computer simulations and phantom measurements. This latter curve is specific to a certain scanner and is not dependent on the densities of the tissues within the range seen in patients. This procedure allows the calculation of the material density of bone. Based on phantom measurements, we estimate the density error to be below 2% relative to the density of normal bone and the bone-width error about one tenth of a pixel size.
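
    Steps (ii)-(iv) can be illustrated on a single synthetic density profile. The blur kernel, the 50% threshold fraction and the density-versus-width calibration table below are stand-ins for the scanner-specific quantities described in the abstract.

```python
import numpy as np

def profile_width(profile, x, soft_value, bone_estimate, frac=0.5):
    """Step (iii): width at a threshold set at a fixed fraction of the
    soft-tissue-to-bone difference (frac=0.5 is an assumed value)."""
    threshold = soft_value + frac * (bone_estimate - soft_value)
    above = np.where(profile >= threshold)[0]
    return x[above[-1]] - x[above[0]] if above.size else 0.0

def corrected_density(measured_density, width, calib_widths, calib_factors):
    """Step (iv): correct the apparent density using a density-versus-width
    curve; a made-up calibration table stands in for the scanner-specific one."""
    return measured_density / np.interp(width, calib_widths, calib_factors)

# Synthetic blurred profile across a 2 mm cortical shell (stand-in for step (i)).
x = np.linspace(-5, 5, 201)                       # mm
true_bone, soft = 1200.0, 50.0                    # nominal density values
ideal = np.where(np.abs(x) < 1.0, true_bone, soft)
kernel = np.exp(-0.5 * (np.linspace(-2, 2, 41) / 0.7) ** 2)
profile = np.convolve(ideal, kernel / kernel.sum(), mode="same")

w = profile_width(profile, x, soft, profile.max())
# Hypothetical calibration: apparent/true density ratio as a function of width.
calib_widths = np.array([0.5, 1.0, 2.0, 3.0, 5.0])     # mm
calib_factors = np.array([0.35, 0.60, 0.85, 0.95, 1.0])
print(f"measured width {w:.2f} mm, corrected density "
      f"{corrected_density(profile.max(), w, calib_widths, calib_factors):.0f}")
```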

  20. A novel class of highly efficient and accurate time-integrators in nonlinear computational mechanics

    NASA Astrophysics Data System (ADS)

    Wang, Xuechuan; Atluri, Satya N.

    2017-01-01

    A new class of time-integrators is presented for strongly nonlinear dynamical systems. These algorithms are far superior to the currently common time integrators in computational efficiency and accuracy. The three algorithms are based on a local variational iteration method applied over a finite interval of time. By using Chebyshev polynomials as trial functions and Dirac delta functions as the test functions over the finite time interval, the three algorithms are developed into three different discrete time-integrators through the collocation method. These time integrators are labeled as Chebyshev local iterative collocation methods. Through examples of the forced Duffing oscillator, the Lorenz system, and multiple coupled Duffing equations (which arise as semi-discrete equations for beams, plates and shells undergoing large deformations), it is shown that the new algorithms are far superior to the 4th-order Runge-Kutta method and MATLAB's ODE45 in predicting the chaotic responses of strongly nonlinear dynamical systems.
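
    The Chebyshev-collocation ingredient of such schemes can be sketched on a linear test problem: build the standard Chebyshev differentiation matrix (after Trefethen's cheb), map the interval to [0, T], and enforce the ODE at the collocation points together with the initial condition. The local variational iteration and Dirac-delta test functions of the actual method are not reproduced here.

```python
import numpy as np

def cheb_diff_matrix(n):
    """Chebyshev points on [-1, 1] and the differentiation matrix
    (after Trefethen, Spectral Methods in MATLAB, program cheb)."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

# Collocation solution of the linear test problem u' = -u, u(0) = 1, on [0, T].
n, T = 16, 5.0
D, x = cheb_diff_matrix(n)
t = (x + 1.0) * T / 2.0                    # map [-1, 1] -> [0, T]; t[n] = 0
A = (2.0 / T) * D + np.eye(n + 1)          # enforce u' + u = 0 at the nodes
A[n, :] = 0.0
A[n, n] = 1.0                              # replace the t = 0 row by the initial condition
b = np.zeros(n + 1)
b[n] = 1.0
u = np.linalg.solve(A, b)
print("max error vs exp(-t):", np.max(np.abs(u - np.exp(-t))))
```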

  1. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    NASA Technical Reports Server (NTRS)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  2. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.

    PubMed

    Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  3. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour

    PubMed Central

    Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  4. High-order accurate solution of the incompressible Navier-Stokes equations on massively parallel computers

    NASA Astrophysics Data System (ADS)

    Henniger, R.; Obrist, D.; Kleiser, L.

    2010-05-01

    The emergence of "petascale" supercomputers requires us to replace today's simulation codes for (incompressible) flows with codes that use numerical schemes and methods better able to exploit the offered computational power. In that spirit, we present a massively parallel high-order Navier-Stokes solver for large incompressible flow problems in three dimensions. The governing equations are discretized with finite differences in space and a semi-implicit time integration scheme. This discretization leads to a large linear system of equations which is solved with a cascade of iterative solvers. The iterative solver for the pressure uses a highly efficient commutation-based preconditioner which is robust with respect to grid stretching. The efficiency of the implementation is further enhanced by carefully setting the (adaptive) termination criteria for the different iterative solvers. The computational work is distributed to different processing units by a geometric data decomposition in all three dimensions. This decomposition scheme ensures a low communication overhead and excellent scaling capabilities. The discretization is thoroughly validated. First, we verify the convergence orders of the spatial and temporal discretizations for a forced channel flow. Second, we analyze the iterative solution technique by investigating the absolute accuracy of the implementation with respect to the different termination criteria. Third, Orr-Sommerfeld and Squire eigenmodes for plane Poiseuille flow are simulated and compared to analytical results. Fourth, the practical applicability of the implementation is tested for transitional and turbulent channel flow. The results are compared to solutions from a pseudospectral solver. Subsequently, the performance of the commutation-based preconditioner for the pressure iteration is demonstrated. Finally, the excellent parallel scalability of the proposed method is demonstrated with a weak and a strong scaling test on up to

  5. Time-Accurate Computational Fluid Dynamics Simulation of a Pair of Moving Solid Rocket Boosters

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Williams, Brandon R.

    2011-01-01

    Since the Columbia accident, the threat to the Shuttle launch vehicle from debris during the liftoff timeframe has been assessed by the Liftoff Debris Team at NASA/MSFC. In addition to engineering methods of analysis, CFD-generated flow fields during the liftoff timeframe have been used in conjunction with 3-DOF debris transport methods to predict the motion of liftoff debris. Early models made use of a quasi-steady flow field approximation with the vehicle positioned at a fixed location relative to the ground; however, a moving overset mesh capability has recently been developed for the Loci/CHEM CFD software which enables higher-fidelity simulation of the Shuttle transient plume startup and liftoff environment. The present work details the simulation of the launch pad and mobile launch platform (MLP) with truncated solid rocket boosters (SRBs) moving in a prescribed liftoff trajectory derived from Shuttle flight measurements. Using Loci/CHEM, time-accurate RANS and hybrid RANS/LES simulations were performed for the timeframe T0+0 to T0+3.5 seconds, which consists of SRB startup to a vehicle altitude of approximately 90 feet above the MLP. Analysis of the transient flowfield focuses on the evolution of the SRB plumes in the MLP plume holes and the flame trench, impingement on the flame deflector, and especially impingement on the MLP deck resulting in upward flow, which is a transport mechanism for debris. The results show excellent qualitative agreement with the visual record from past Shuttle flights, and comparisons to pressure measurements in the flame trench and on the MLP provide confidence in these simulation capabilities.

  6. Accurate Computed Enthalpies of Spin Crossover in Iron and Cobalt Complexes

    NASA Astrophysics Data System (ADS)

    Jensen, Kasper P.; Cirera, Jordi

    2009-08-01

    Despite their importance in many chemical processes, the relative energies of spin states of transition metal complexes have so far been haunted by large computational errors. By the use of six functionals, B3LYP, BP86, TPSS, TPSSh, M06, and M06L, this work studies nine complexes (seven with iron and two with cobalt) for which experimental enthalpies of spin crossover are available. It is shown that such enthalpies can be used as quantitative benchmarks of a functional's ability to balance electron correlation in both the involved states. TPSSh achieves an unprecedented mean absolute error of ˜11 kJ/mol in spin transition energies, with the local functional M06L a distant second (25 kJ/mol). Other tested functionals give mean absolute errors of 40 kJ/mol or more. This work confirms earlier suggestions that 10% exact exchange is near-optimal for describing the electron correlation effects of first-row transition metal systems. Furthermore, it is shown that given an experimental structure of an iron complex, TPSSh can predict the electronic state corresponding to that experimental structure. We recommend this functional as current state-of-the-art for studying spin crossover and relative energies of close-lying electronic configurations in first-row transition metal systems.

  7. Improved targeting device and computer navigation for accurate placement of brachytherapy needles

    SciTech Connect

    Pappas, Ion P.I.; Ryan, Paul; Cossmann, Peter; Kowal, Jens; Borgeson, Blake; Caversaccio, Marco

    2005-06-15

    Successful treatment of skull base tumors with interstitial brachytherapy requires high targeting accuracy for the brachytherapy needles to avoid harming vital anatomical structures. To enable safe placement of the needles in this area, we developed an image-based planning and navigation system for brachytherapy, which includes a custom-made mechanical positioning arm that allows rough and fine adjustment of the needle position. The fine-adjustment mechanism consists of an XYZ microstage at the base of the arm and a needle holder with two fine-adjustable inclinations. The rotation axes of the inclinations cross at the tip of the needle so that the inclinational adjustments do not interfere with the translational adjustments. A vacuum cushion and a noninvasive fixation frame are used for the head immobilization. To avoid mechanical bending of the needles due to the weight of attached tracking markers, which would be detrimental for targeting accuracy, only a single LED marker on the tail of the needle is used. An experimental phantom-based targeting study with this setup demonstrated that a positioning accuracy of 1.4 mm (rms) can be achieved. The study showed that the proposed setup allows brachytherapy needles to be easily aligned and inserted with high targeting accuracy according to a preliminary plan. The achievable accuracy is higher than if the needles are inserted manually. The proposed system can be linked to a standard afterloader and standard dosimetry planning module. The associated additional effort is reasonable for the clinical practice and therefore the proposed procedure provides a promising tool for the safe treatment of tumors in the skull base area.

  8. Preoperative misdiagnosis analysis and accurate distinguish intrathymic cyst from small thymoma on computed tomography

    PubMed Central

    Li, Xin; Han, Xingpeng; Sun, Wei; Wang, Meng; Jing, Guohui

    2016-01-01

    Background: To evaluate the role of computed tomography (CT) in the preoperative diagnosis of intrathymic cyst and small thymoma, and to determine the best CT threshold for distinguishing intrathymic cyst from small thymoma. Methods: We retrospectively reviewed the medical records of 30 patients (17 intrathymic cysts and 13 small thymomas) who had undergone thoracoscopic resection of mediastinal masses (less than 3 cm in diameter) between January 2014 and July 2015 at our hospital. Clinical and CT features were compared and receiver-operating characteristic (ROC) curve analysis was performed. Results: The CT value of small thymoma [39.5 HU (IQR, 33.7–42.2 HU)] was significantly higher than that of intrathymic cyst [25.8 HU (IQR, 22.3–29.3 HU), P=0.004]. A CT value of 31.2 HU could act as a threshold for distinguishing small thymoma from intrathymic cyst (sensitivity 92.3%, specificity 82.4%). The difference (ΔCT) between the enhanced and non-enhanced CT values was significantly different between small thymoma [18.7 HU (IQR, 10.9–19.0 HU)] and intrathymic cyst [4.3 HU (IQR, 3.0–11.7 HU), P=0.04]. The density was more homogeneous, and the contour smoother, in intrathymic cyst than in small thymoma. Conclusions: Preoperative CT scans can help clinicians to distinguish intrathymic cyst from small thymoma, and we recommend 31.2 HU as the best threshold. Contrast-enhanced CT is useful for further differentiation of the two diseases. PMID:27621863
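
    The single-threshold ROC analysis behind the reported 31.2 HU cutoff can be sketched as follows; the CT values generated below are synthetic numbers loosely shaped like the reported medians and IQRs, not the study's data.

```python
import numpy as np

def youden_threshold(values_pos, values_neg):
    """Scan candidate cutoffs and return the one maximizing Youden's J =
    sensitivity + specificity - 1 (positive class = higher values)."""
    cutoffs = np.unique(np.concatenate([values_pos, values_neg]))
    best = (None, -1.0, 0.0, 0.0)
    for c in cutoffs:
        sens = np.mean(values_pos >= c)
        spec = np.mean(values_neg < c)
        j = sens + spec - 1.0
        if j > best[1]:
            best = (c, j, sens, spec)
    return best

# Synthetic CT values (HU), illustrative only.
rng = np.random.default_rng(1)
thymoma = rng.normal(39.5, 6.0, 13)       # small thymoma: higher attenuation
cyst = rng.normal(25.8, 5.0, 17)          # intrathymic cyst: lower attenuation

cutoff, j, sens, spec = youden_threshold(thymoma, cyst)
print(f"best cutoff {cutoff:.1f} HU, sensitivity {sens:.1%}, specificity {spec:.1%}")
```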

  9. Towards an accurate and computationally-efficient modelling of Fe(II)-based spin crossover materials.

    PubMed

    Vela, Sergi; Fumanal, Maria; Ribas-Arino, Jordi; Robert, Vincent

    2015-07-07

    The DFT + U methodology is regarded as one of the most-promising strategies to treat the solid state of molecular materials, as it may provide good energetic accuracy at a moderate computational cost. However, a careful parametrization of the U-term is mandatory since the results may be dramatically affected by the selected value. Herein, we benchmarked the Hubbard-like U-term for seven Fe(ii)N6-based pseudo-octahedral spin crossover (SCO) compounds, using as a reference an estimation of the electronic enthalpy difference (ΔHelec) extracted from experimental data (T1/2, ΔS and ΔH). The parametrized U-value obtained for each of those seven compounds ranges from 2.37 eV to 2.97 eV, with an average value of U = 2.65 eV. Interestingly, we have found that this average value can be taken as a good starting point since it leads to an unprecedented mean absolute error (MAE) of only 4.3 kJ mol(-1) in the evaluation of ΔHelec for the studied compounds. Moreover, by comparing our results on the solid state and the gas phase of the materials, we quantify the influence of the intermolecular interactions on the relative stability of the HS and LS states, with an average effect of ca. 5 kJ mol(-1), whose sign cannot be generalized. Overall, the findings reported in this manuscript pave the way for future studies devoted to understand the crystalline phase of SCO compounds, or the adsorption of individual molecules on organic or metallic surfaces, in which the rational incorporation of the U-term within DFT + U yields the required energetic accuracy that is dramatically missing when using bare-DFT functionals.

  10. Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools

    ERIC Educational Resources Information Center

    Jeon, Moongee

    2014-01-01

    This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…

  11. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…

  12. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    NASA Astrophysics Data System (ADS)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expensive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flooding, requires modelling the behavior of different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
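
    As a minimal stand-in for the Bayesian-network fusion described above, the sketch below combines one SAR feature with two ancillary layers (elevation above the river and distance from the river) under a naive conditional-independence assumption. All class-conditional distributions, their parameters and the prior are illustrative assumptions, not the authors' network.

```python
import numpy as np
from scipy.stats import norm, expon

def flood_posterior(backscatter_db, elev_above_river_m, dist_to_river_m, prior_flood=0.2):
    """Naive-Bayes fusion: P(flood | SAR, DEM, distance), assuming the three
    observations are conditionally independent given the class. All
    distribution parameters below are illustrative assumptions."""
    # SAR: open water typically returns low backscatter.
    l_sar_f = norm.pdf(backscatter_db, loc=-18.0, scale=3.0)
    l_sar_n = norm.pdf(backscatter_db, loc=-8.0, scale=3.0)
    # Flooded pixels tend to sit low and close to the river.
    l_elev_f = expon.pdf(elev_above_river_m, scale=2.0)
    l_elev_n = expon.pdf(elev_above_river_m, scale=15.0)
    l_dist_f = expon.pdf(dist_to_river_m, scale=300.0)
    l_dist_n = expon.pdf(dist_to_river_m, scale=3000.0)

    num = prior_flood * l_sar_f * l_elev_f * l_dist_f
    den = num + (1.0 - prior_flood) * l_sar_n * l_elev_n * l_dist_n
    return num / den

# Vectorized over pixels: dark SAR return, low-lying, near the river -> high posterior.
print(flood_posterior(np.array([-19.0, -7.0]),
                      np.array([1.0, 20.0]),
                      np.array([150.0, 4000.0])))
```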

  13. Development of an accurate EPID-based output measurement and dosimetric verification tool for electron beam therapy

    PubMed Central

    Ding, Aiping; Xing, Lei; Han, Bin

    2015-01-01

    chamber measurements. The average discrepancy between EPID and ion chamber/film measurements was 0.81% ± 0.60% (SD) and 1.34% ± 0.75%, respectively. For the three clinical cases, the difference in output between the EPID- and ion chamber array measured values was found to be 1.13% ± 0.11%, 0.54% ± 0.10%, and 0.74% ± 0.11%, respectively. Furthermore, the γ-index analysis showed an excellent agreement between the EPID- and ion chamber array measured dose distributions: 100% of the pixels passed the criteria of 3%/3 mm. When the γ-index was set to be 2%/2 mm, the pass rate was found to be 99.0% ± 0.07%, 98.2% ± 0.14%, and 100% for the three cases. Conclusions: The EPID dosimetry system developed in this work provides an accurate and reliable tool for routine output measurement and dosimetric verification of electron beam therapy. Coupled with its portability and ease of use, the proposed system promises to replace the current film-based approach for fast and reliable assessment of small and irregular electron field dosimetry. PMID:26133618

  14. A computationally efficient and accurate numerical representation of thermodynamic properties of steam and water for computations of non-equilibrium condensing steam flow in steam turbines

    NASA Astrophysics Data System (ADS)

    Hrubý, Jan

    2012-04-01

    Mathematical modeling of the non-equilibrium condensing transonic steam flow in the complex 3D geometry of a steam turbine is a demanding problem both concerning the physical concepts and the required computational power. Available accurate formulations of steam properties, IAPWS-95 and IAPWS-IF97, require much computation time. For this reason, modelers often accept the unrealistic ideal-gas behavior. Here we present a computation scheme based on a piecewise, thermodynamically consistent representation of the IAPWS-95 formulation. Density and internal energy are chosen as independent variables to avoid variable transformations and iterations. In contrast to the previous Tabular Taylor Series Expansion Method, the pressure and temperature are continuous functions of the independent variables, which is a desirable property for the solution of the differential equations of mass, energy, and momentum conservation for both phases.
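
    The central idea, evaluating properties directly in the (density, internal energy) plane so that no iteration or variable transformation is needed inside the flow solver, can be sketched with a tabulated surrogate. The ideal-gas-like filler values below stand in for IAPWS-95, and simple grid interpolation stands in for the paper's piecewise, thermodynamically consistent polynomial representation.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Grid in the paper's choice of independent variables: density and internal energy.
rho = np.linspace(0.1, 50.0, 200)          # kg/m^3
u = np.linspace(3.0e5, 1.0e6, 200)         # J/kg
R, cv = 461.5, 1.4e3                       # rough gas constant and c_v for steam

# Placeholder property surfaces; a real table would be filled from IAPWS-95.
RHO, U = np.meshgrid(rho, u, indexing="ij")
T_table = U / cv
p_table = RHO * R * T_table

# Continuous interpolants in (rho, u); evaluation needs no iteration.
T_of = RegularGridInterpolator((rho, u), T_table)
p_of = RegularGridInterpolator((rho, u), p_table)

state = np.array([[10.0, 6.0e5]])          # (rho, u) handed over by the flow solver
print(f"T = {T_of(state)[0]:.1f} K, p = {p_of(state)[0] / 1e5:.2f} bar")
```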

  15. Computer Instrumentation and the New Tools of Science.

    ERIC Educational Resources Information Center

    Snyder, H. David

    1990-01-01

    The impact and uses of new technologies in science teaching are discussed. Included are computers, software, sensors, integrated circuits, computer signal access, and computer interfaces. Uses and advantages of these new technologies are suggested. (CW)

  16. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-standing issue to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly outline the formulations and developments of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems, which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends in method development and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304) and the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
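
    For concreteness, below is a compact version of the Ewald-type decomposition mentioned above (real-space, reciprocal-space and self terms) for a small neutral set of point charges in a cubic box, written in Gaussian-like units where the pair energy is q_i q_j / r. The splitting parameter, cutoff and k-space range are illustrative choices, not tuned values.

```python
import numpy as np
from scipy.special import erfc

def ewald_energy(pos, q, L, alpha=None, rcut=None, kmax=5):
    """Ewald sum for point charges in a cubic box of side L (Gaussian-like
    units: pair energy = q_i*q_j/r). Parameters are illustrative, not tuned."""
    n = len(q)
    alpha = 5.0 / L if alpha is None else alpha
    rcut = L / 2.0 if rcut is None else rcut
    # Real-space sum with the minimum-image convention and erfc screening.
    e_real = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            d -= L * np.round(d / L)
            r = np.linalg.norm(d)
            if r < rcut:
                e_real += q[i] * q[j] * erfc(alpha * r) / r
    # Reciprocal-space sum over k = 2*pi*(nx, ny, nz)/L.
    volume = L**3
    e_recip = 0.0
    for nx in range(-kmax, kmax + 1):
        for ny in range(-kmax, kmax + 1):
            for nz in range(-kmax, kmax + 1):
                if nx == ny == nz == 0:
                    continue
                k = 2.0 * np.pi * np.array([nx, ny, nz]) / L
                k2 = k @ k
                s_k = np.sum(q * np.exp(1j * pos @ k))      # structure factor
                e_recip += (2.0 * np.pi / volume) * np.exp(-k2 / (4.0 * alpha**2)) / k2 * abs(s_k) ** 2
    # Self-interaction correction.
    e_self = -alpha / np.sqrt(np.pi) * np.sum(q**2)
    return e_real + e_recip + e_self

# Two opposite unit charges 1.0 apart in a 10 x 10 x 10 box.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
q = np.array([1.0, -1.0])
print(ewald_energy(pos, q, L=10.0))
```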

  17. Methods for Computing Accurate Atomic Spin Moments for Collinear and Noncollinear Magnetism in Periodic and Nonperiodic Materials.

    PubMed

    Manz, Thomas A; Sholl, David S

    2011-12-13

    The partitioning of electron spin density among atoms in a material gives atomic spin moments (ASMs), which are important for understanding magnetic properties. We compare ASMs computed using different population analysis methods and introduce a method for computing density derived electrostatic and chemical (DDEC) ASMs. Bader and DDEC ASMs can be computed for periodic and nonperiodic materials with either collinear or noncollinear magnetism, while natural population analysis (NPA) ASMs can be computed for nonperiodic materials with collinear magnetism. Our results show Bader, DDEC, and (where applicable) NPA methods give similar ASMs, but different net atomic charges. Because they are optimized to reproduce both the magnetic field and the chemical states of atoms in a material, DDEC ASMs are especially suitable for constructing interaction potentials for atomistic simulations. We describe the computation of accurate ASMs for (a) a variety of systems using collinear and noncollinear spin DFT, (b) highly correlated materials (e.g., magnetite) using DFT+U, and (c) various spin states of ozone using coupled cluster expansions. The computed ASMs are in good agreement with available experimental results for a variety of periodic and nonperiodic materials. Examples considered include the antiferromagnetic metal organic framework Cu3(BTC)2, several ozone spin states, mono- and binuclear transition metal complexes, ferri- and ferro-magnetic solids (e.g., Fe3O4, Fe3Si), and simple molecular systems. We briefly discuss the theory of exchange-correlation functionals for studying noncollinear magnetism. A method for finding the ground state of systems with highly noncollinear magnetism is introduced. We use these methods to study the spin-orbit coupling potential energy surface of the single molecule magnet Fe4C40H52N4O12, which has highly noncollinear magnetism, and find that it contains unusual features that give a new interpretation to experimental data.

  18. Navigating traditional chinese medicine network pharmacology and computational tools.

    PubMed

    Yang, Ming; Chen, Jia-Lei; Xu, Li-Wen; Ji, Guang

    2013-01-01

    The concept of "network target" has ushered in a new era in the field of traditional Chinese medicine (TCM). As a new research approach, network pharmacology is based on the analysis of network models and systems biology. Taking advantage of advances in systems biology, a highly integrated data analysis strategy and interpretable visualization provide deeper insights into the underlying mechanisms of TCM theories, including the principles of herb combination, the biological foundations of herb or herbal formulae action, and the molecular basis of TCM syndromes. In this study, we review several recent developments in TCM network pharmacology research and discuss their potential for bridging the gap between traditional and modern medicine. We briefly summarize the two main functional applications of TCM network models: understanding/uncovering and predicting/discovering. In particular, we focus on how TCM network pharmacology research is conducted and highlight different computational tools, such as network-based and machine learning algorithms, and sources that have been proposed and applied to the different steps involved in the research process. To make network pharmacology research commonplace, some basic network definitions and analysis methods are presented.

  19. Navigating Traditional Chinese Medicine Network Pharmacology and Computational Tools

    PubMed Central

    Chen, Jia-Lei; Xu, Li-Wen

    2013-01-01

    The concept of “network target” has ushered in a new era in the field of traditional Chinese medicine (TCM). As a new research approach, network pharmacology is based on the analysis of network models and systems biology. Taking advantage of advances in systems biology, a highly integrated data analysis strategy and interpretable visualization provide deeper insights into the underlying mechanisms of TCM theories, including the principles of herb combination, the biological foundations of herb or herbal formulae action, and the molecular basis of TCM syndromes. In this study, we review several recent developments in TCM network pharmacology research and discuss their potential for bridging the gap between traditional and modern medicine. We briefly summarize the two main functional applications of TCM network models: understanding/uncovering and predicting/discovering. In particular, we focus on how TCM network pharmacology research is conducted and highlight different computational tools, such as network-based and machine learning algorithms, and sources that have been proposed and applied to the different steps involved in the research process. To make network pharmacology research commonplace, some basic network definitions and analysis methods are presented. PMID:23983798

  20. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
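
    The core CPA computation, the empirical probability of an impaired response given that the stressor exceeds a threshold, is small enough to sketch directly. The paired observations generated below are synthetic and only illustrate the calculation; this is plain Python rather than the Excel/R Add-in described above.

```python
import numpy as np

def conditional_probability(stressor, impaired, thresholds):
    """P(impaired response | stressor >= threshold) for each threshold,
    estimated from paired observations."""
    probs = []
    for t in thresholds:
        sel = stressor >= t
        probs.append(impaired[sel].mean() if sel.any() else np.nan)
    return np.array(probs)

# Synthetic paired observations: stressor concentration and a 0/1 indicator of
# impaired ecological condition (illustrative only).
rng = np.random.default_rng(7)
stressor = rng.lognormal(mean=0.0, sigma=1.0, size=500)
impaired = (rng.random(500) < 1.0 / (1.0 + np.exp(-(stressor - 2.0)))).astype(float)

thresholds = np.quantile(stressor, np.linspace(0.05, 0.95, 10))
for t, p in zip(thresholds, conditional_probability(stressor, impaired, thresholds)):
    print(f"P(impaired | stressor >= {t:5.2f}) = {p:.2f}")
```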

  1. Physics education through computational tools: the case of geometrical and physical optics

    NASA Astrophysics Data System (ADS)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-09-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the way in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we show the pedagogic strategies implemented for the geometrical and physical optics courses for students of optometry. The use of the GeoGebra software for the geometrical optics class, and of new in-house software written in the high-level programming language Python for the physical optics class, is shown together with the corresponding activities developed for each of these applets.
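
    In the same spirit as the in-house Python applets mentioned above (but not the authors' software), the kind of relations such an applet exercises, the thin-lens equation and Snell's law, fit in a few lines:

```python
import numpy as np

def thin_lens_image(s_o, f):
    """Thin-lens equation 1/s_o + 1/s_i = 1/f: image distance and lateral magnification."""
    s_i = 1.0 / (1.0 / f - 1.0 / s_o)
    return s_i, -s_i / s_o

def snell(theta_i_deg, n1, n2):
    """Refraction angle from Snell's law; returns None beyond the critical angle."""
    s = n1 * np.sin(np.radians(theta_i_deg)) / n2
    return None if abs(s) > 1.0 else np.degrees(np.arcsin(s))

s_i, m = thin_lens_image(s_o=30.0, f=10.0)          # object 30 cm from a 10 cm lens
print(f"image at {s_i:.1f} cm, magnification {m:.2f}")
print(f"air-to-glass refraction of 30 deg: {snell(30.0, 1.0, 1.5):.1f} deg")
print(f"glass-to-air at 45 deg (beyond critical angle): {snell(45.0, 1.5, 1.0)}")
```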

  2. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high performance parallel computers is always a challenging task. It is time consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g. MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO is aimed at taking advantage of detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of

  3. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990s, when it was first distributed as part of the SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been mainly written by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPM). We will describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM: attenuation tables, built-in models and generalized attenuation models. In the case of built-in models there is, by default, a set ready to use in CRISIS, but additional custom GMPMs
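
    For readers unfamiliar with the basic PSHA recipe that tools like CRISIS automate, the following Python sketch illustrates it for a single Poissonian point source with a toy lognormal attenuation relation; all rates, magnitudes and coefficients are invented for illustration and are not CRISIS inputs or outputs.

```python
import numpy as np
from scipy.stats import norm

# Toy PSHA for one Poissonian point source (hypothetical numbers throughout).
nu = 0.05                                    # annual rate of M >= 6 earthquakes on the source
magnitudes = np.array([6.0, 6.5, 7.0])
mag_weights = np.array([0.6, 0.3, 0.1])      # discretized magnitude distribution
distance_km = 25.0
sigma_ln = 0.6                               # assumed aleatory variability of ln(PGA)

def gmpe_ln_pga(m, r_km):
    """Toy attenuation relation: median ln PGA (g) vs. magnitude and distance."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0)

for a in (0.05, 0.1, 0.2, 0.4):              # PGA levels in g
    # P(PGA > a | m, r) from the lognormal GMPE, combined over the magnitude distribution
    p_exceed = sum(w * (1.0 - norm.cdf((np.log(a) - gmpe_ln_pga(m, distance_km)) / sigma_ln))
                   for m, w in zip(magnitudes, mag_weights))
    rate = nu * p_exceed                     # annual exceedance rate
    p50 = 1.0 - np.exp(-rate * 50.0)         # Poisson probability of exceedance in 50 years
    print(f"PGA > {a:.2f} g: annual rate = {rate:.4f}, 50-yr probability = {p50:.3f}")
```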

  4. A More Accurate and Efficient Technique Developed for Using Computational Methods to Obtain Helical Traveling-Wave Tube Interaction Impedance

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    1999-01-01

    The phenomenal growth of commercial communications has created a great demand for traveling-wave tube (TWT) amplifiers. Although the helix slow-wave circuit remains the mainstay of the TWT industry because of its exceptionally wide bandwidth, until recently it has been impossible to accurately analyze a helical TWT using its exact dimensions because of the complexity of its geometrical structure. For the first time, an accurate three-dimensional helical model was developed that allows accurate prediction of TWT cold-test characteristics including operating frequency, interaction impedance, and attenuation. This computational model, which was developed at the NASA Lewis Research Center, allows TWT designers to obtain a more accurate value of interaction impedance than is possible using experimental methods. Obtaining helical slow-wave circuit interaction impedance is an important part of the design process for a TWT because it is related to the gain and efficiency of the tube. This impedance cannot be measured directly; thus, conventional methods involve perturbing a helical circuit with a cylindrical dielectric rod placed on the central axis of the circuit and obtaining the difference in resonant frequency between the perturbed and unperturbed circuits. A mathematical relationship has been derived between this frequency difference and the interaction impedance (ref. 1). However, because of the complex configuration of the helical circuit, deriving this relationship involves several approximations. In addition, this experimental procedure is time-consuming and expensive, but until recently it was widely accepted as the most accurate means of determining interaction impedance. The advent of an accurate three-dimensional helical circuit model (ref. 2) made it possible for Lewis researchers to fully investigate standard approximations made in deriving the relationship between measured perturbation data and interaction impedance. The most prominent approximations made

  5. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  6. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    SciTech Connect

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.

  7. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, masscons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to
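
    The idea of propagating statistics rather than individual trajectories can be illustrated with a toy linear covariance propagation in Python (a 2-state sketch, not G-CAT's 120-state, 6-DOF square-root LKF); the dynamics and noise values below are assumed.

```python
import numpy as np

# Minimal covariance-propagation sketch: P_{k+1} = F P_k F^T + Q for a
# position/velocity state, with assumed process noise standing in for IMU errors.
dt = 1.0
F = np.array([[1.0, dt],
              [0.0, 1.0]])           # linearized dynamics
Q = np.diag([1e-4, 1e-6])            # assumed process noise per step
P = np.diag([1.0, 0.01])             # initial knowledge covariance

for _ in range(100):                 # propagate 100 steps without measurement updates
    P = F @ P @ F.T + Q

sigma_pos, sigma_vel = np.sqrt(np.diag(P))
print(f"1-sigma position error: {sigma_pos:.3f}, 1-sigma velocity error: {sigma_vel:.4f}")
```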

  8. TestMaker: A Computer-Based Test Development Tool.

    ERIC Educational Resources Information Center

    Gibbs, William J.; Lario-Gibbs, Annette M.

    This paper discusses a computer-based prototype called TestMaker that enables educators to create computer-based tests. Given the functional needs of faculty, the host of research implications computer technology has for assessment, and current educational perspectives such as constructivism and their impact on testing, the purposes for developing…

  9. Tool life modeling and computer simulation of tool wear when nickel-based material turning

    NASA Astrophysics Data System (ADS)

    Zebala, W.

    2016-09-01

    This paper presents tool life investigations concerning the modeling and simulation of tool wear when turning a difficult-to-cut material, a nickel-based sintered powder workpiece. The cutting tool, made of CBN, has a special geometry. The workpiece, in the form of a disc, is an aircraft engine part. The aim of the research is to optimize the cutting data in order to decrease tool wear and improve the machined surface roughness.
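
    As a generic flavour of tool-life modeling (not the authors' CBN/nickel-alloy wear model), the classical Taylor equation V·T^n = C relates cutting speed to tool life; the constants in this Python sketch are assumed purely for illustration.

```python
# Classical Taylor tool-life relation V * T**n = C with assumed constants
# (illustrative only; not fitted to the paper's CBN / nickel-alloy data).
def taylor_tool_life(cutting_speed_m_min, n=0.25, C=180.0):
    """Tool life T in minutes for a given cutting speed V (m/min)."""
    return (C / cutting_speed_m_min) ** (1.0 / n)

for v in (60.0, 90.0, 120.0):
    print(f"V = {v:5.1f} m/min -> T = {taylor_tool_life(v):6.1f} min")
```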

  10. Computers as a Writing Tool: Learning Package for Eighth Grade Students.

    ERIC Educational Resources Information Center

    Burnside, Patricia

    This report describes the design, development, implementation, and evaluation of a learning activity package intended to enable middle school students to view and use the computer as a tool that has application in their everyday lives. Eighth grade students in two keyboarding classes used the computer as a word processing tool in the subject of…

  11. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    ERIC Educational Resources Information Center

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  12. BioBloom tools: fast, accurate and memory-efficient host species sequence screening using bloom filters

    PubMed Central

    Chu, Justin; Sadeghi, Sara; Raymond, Anthony; Jackman, Shaun D.; Nip, Ka Ming; Mar, Richard; Mohamadi, Hamid; Butterfield, Yaron S.; Robertson, A. Gordon; Birol, Inanç

    2014-01-01

    Large datasets can be screened for sequences from a specific organism, quickly and with low memory requirements, by a data structure that supports time- and memory-efficient set membership queries. Bloom filters offer such queries but require that false positives be controlled. We present BioBloom Tools, a Bloom filter-based sequence-screening tool that is faster than BWA, Bowtie 2 (popular alignment algorithms) and FACS (a membership query algorithm). It delivers accuracies comparable with these tools, controls false positives and has low memory requirements. Availability and implementation: www.bcgsc.ca/platform/bioinfo/software/biobloomtools Contact: cjustin@bcgsc.ca or ibirol@bcgsc.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25143290
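
    The screening idea can be sketched with a minimal Bloom filter in Python (a toy illustration of the data structure, not the BioBloom Tools implementation); the bit-array size, hash count and example k-mers are arbitrary.

```python
import hashlib

# Toy Bloom filter: k salted hash positions set/read bits in an m-bit array.
class BloomFilter:
    def __init__(self, m_bits=1 << 20, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha1(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        # May return a false positive, never a false negative
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

bf = BloomFilter()
for kmer in ("ACGTACGTAC", "TTGACCTTGA"):        # hypothetical host-genome k-mers
    bf.add(kmer)
print("ACGTACGTAC" in bf, "GGGGGGGGGG" in bf)     # True, (almost certainly) False
```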

  13. Edge control in a computer controlled optical surfacing process using a heterocercal tool influence function.

    PubMed

    Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun

    2016-11-14

    Edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to balance the consequences of the two following cases: operating CCOS in a large overhang condition affects the accuracy of material removal, while operating in a small overhang condition achieves a more accurate performance but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner area to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the explosive power. The heterocercal TIF was theoretically analyzed, and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.

  14. New computer architectures as tools for ecological thought.

    PubMed

    Villa, F

    1992-06-01

    Recent achievements of computer science provide unrivaled power for the advancement of ecology. This power is not merely computational: parallel computers, having hierarchical organization as their architectural principle, also provide metaphors for understanding complex systems. In this sense they might play for a science of ecological complexity a role like equilibrium-based metaphors had in the development of dynamic systems ecology. Parallel computers provide this opportunity through an informational view of ecological reality and multilevel modelling paradigms. Spatial and individual-oriented models allow application and full understanding of the new metaphors in the ecological context.

  15. The Utility of Computer Tracking Tools for User-Centered Design.

    ERIC Educational Resources Information Center

    Gay, Geri; Mazur, Joan

    1993-01-01

    Describes tracking tools used by designers and users to evaluate the efficacy of hypermedia systems. Highlights include human-computer interaction research; tracking tools and user-centered design; and three examples from the Interactive Multimedia Group at Cornell University that illustrate uses of various tracking tools. (27 references) (LRW)

  16. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  17. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  18. WASTE REDUCTION USING COMPUTER-AIDED DESIGN TOOLS

    EPA Science Inventory

    Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized.
    Process simulators can be effective tools i...

  19. Computer Art--A New Tool in Advertising Graphics.

    ERIC Educational Resources Information Center

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  20. Information and Communicative Technology--Computers as Research Tools

    ERIC Educational Resources Information Center

    Sarsani, Mahender Reddy

    2007-01-01

    The emergence of "the electronic age/electronic cottages/the electronic world" has affected the whole world; particularly the emergence of computers has penetrated everyone's life to a remarkable degree. They are being used in various fields including education. Recent advances, especially in the area of computer technology have…

  1. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  2. The Computer as a Tool for Learning through Reflection.

    DTIC Science & Technology

    1986-03-01


  3. Development of Computer Aided Database Design and Maintenance Tools.

    DTIC Science & Technology

    1984-12-01

    Among the tools to be developed is a user friendly (graphical or voice) interface to point to attributes when defining functional dependencies. Such a tool could make this time-consuming task faster, easier, and more effective, freeing the DBA to tend to his/her other duties.

  4. Accurate 3d Textured Models of Vessels for the Improvement of the Educational Tools of a Museum

    NASA Astrophysics Data System (ADS)

    Soile, S.; Adam, K.; Ioannidis, C.; Georgopoulos, A.

    2013-02-01

    Besides the demonstration of the findings, modern museums organize educational programs which aim at experience and knowledge sharing combined with entertainment rather than pure learning. Toward that end, 2D and 3D digital representations are gradually replacing the traditional recording of the findings through photos or drawings. The present paper refers to a project that aims to create 3D textured models of two lekythoi that are exhibited in the National Archaeological Museum of Athens in Greece; on the surfaces of these lekythoi, scenes of the adventures of Odysseus are depicted. The project is expected to support the production of an educational movie and some other relevant interactive educational programs for the museum. The creation of accurate developments of the paintings and of accurate 3D models is the basis for the visualization of the adventures of the mythical hero. The data collection was made by using a structured light scanner consisting of two machine vision cameras that are used for the determination of the geometry of the object, a high resolution camera for the recording of the texture, and a DLP projector. The creation of the final accurate 3D textured model is a complicated and tiring procedure which includes the collection of geometric data, the creation of the surface, noise filtering, the merging of individual surfaces, the creation of a c-mesh, the creation of the UV map, the provision of the texture and, finally, the general processing of the 3D textured object. For a better result, a combination of commercial and in-house software developed to automate various steps of the procedure was used. The results derived from the above procedure were especially satisfactory in terms of accuracy and quality of the model. However, the procedure proved to be time-consuming, while the use of various software packages presumes the services of a specialist.

  5. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    SciTech Connect

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
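
    A stripped-down flavour of the Photon Monte Carlo idea is sketched below in Python (a toy slab-absorption estimate, not the IHT library or its API); the slab thickness and effective absorption coefficient are assumed.

```python
import numpy as np

# Toy Photon Monte Carlo: photons cross a particle-laden slab and are absorbed if their
# sampled exponential free path is shorter than their geometric path through the slab.
rng = np.random.default_rng(1)
n_photons = 100_000
slab_thickness = 0.5          # m
kappa = 4.2                   # assumed effective absorption coefficient of the suspension, 1/m

mu = rng.uniform(0.05, 1.0, n_photons)            # direction cosines toward the slab (grazing rays skipped)
path_through_slab = slab_thickness / mu           # geometric path length of each photon
free_path = rng.exponential(1.0 / kappa, n_photons)  # Beer-Lambert sampled absorption distance

absorbed_fraction = np.mean(free_path < path_through_slab)
print(f"fraction of incident photons absorbed by the slab: {absorbed_fraction:.3f}")
```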

  6. EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning

    ERIC Educational Resources Information Center

    Kitchakarn, Orachorn

    2015-01-01

    The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regard, some variables that might be potential antecedents of attitudes toward computers, including gender, experience of using computers and perceived ability in using programs, were examined.…

  7. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  8. Computer Databases as an Educational Tool in the Basic Sciences.

    ERIC Educational Resources Information Center

    Friedman, Charles P.; And Others

    1990-01-01

    The University of North Carolina School of Medicine developed a computer database, INQUIRER, containing scientific information in bacteriology, and then integrated the database into routine educational activities for first-year medical students in their microbiology course. (Author/MLW)

  9. Further Uses of the Analog Computer as a Teaching Tool

    ERIC Educational Resources Information Center

    Shonle, John I.

    1976-01-01

    Discusses the use of an analog computer oscilloscope to illustrate the transition from underdamped to overdamped for the simple harmonic oscillator, the maximum range for a projectile, and the behavior of charged particles in crossed electric and magnetic fields. (MLH)

  10. Accurate ab initio potential energy computations for the H sub 4 system: Tests of some analytic potential energy surfaces

    SciTech Connect

    Boothroyd, A.I. ); Dove, J.E.; Keogh, W.J. ); Martin, P.G. ); Peterson, M.R. )

    1991-09-15

    The interaction potential energy surface (PES) of H4 is of great importance for quantum chemistry, as a test case for molecule-molecule interactions. It is also required for a detailed understanding of certain astrophysical processes, namely, collisional excitation and dissociation of H2 in molecular clouds, at densities too low to be accessible experimentally. Accurate ab initio energies were computed for 6046 conformations of H4, using a multiple reference (single and) double excitation configuration interaction (MRD-CI) program. Both systematic and "random" errors were estimated to have an rms size of 0.6 mhartree, for a total rms error of about 0.9 mhartree (or 0.55 kcal/mol) in the final ab initio energy values. It proved possible to include in a self-consistent way ab initio energies calculated by Schwenke, bringing the number of H4 conformations to 6101. Ab initio energies were also computed for 404 conformations of H3; adding ab initio energies calculated by other authors yielded a total of 772 conformations of H3. (The H3 results, and an improved analytic PES for H3, are reported elsewhere.) Ab initio energies are tabulated in this paper only for a sample of H4 conformations; a full list of all 6101 conformations of H4 (and 772 conformations of H3) is available from the Physics Auxiliary Publication Service (PAPS), or from the authors.

  11. Measurement Model for Division as a Tool in Computing Applications

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Strock, Tracy

    2002-01-01

    The paper describes the use of a spreadsheet in a mathematics teacher education course. It shows how the tool can serve as a link between seemingly disconnected mathematical concepts. The didactical triad of using a spreadsheet as an agent, consumer, and amplifier of mathematical activities allows for an extended investigation of simple yet…

  12. Computational tool for simulation of power and refrigeration cycles

    NASA Astrophysics Data System (ADS)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring large cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modeling the optimal changes for best performance. There is also a boom in research on the Organic Rankine Cycle (ORC), which aims to produce electricity at low power through cogeneration and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because cogeneration heat sources vary widely and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed in the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles, selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculation of plant efficiency, identification of the flow fractions in each branch and, finally, generation of a very educational report in PDF format via the LaTeX tool.
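
    As a flavour of the kind of cycle calculation such a tool performs, here is a minimal ideal Rankine cycle efficiency estimate in Python using the CoolProp property library named in the abstract (via its Python bindings rather than the tool's C++ interface); the pressures and turbine inlet temperature are assumed.

```python
from CoolProp.CoolProp import PropsSI

# Ideal Rankine cycle with assumed operating conditions (not the paper's tool).
fluid = "Water"
p_boiler = 8e6                      # Pa
p_cond = 10e3                       # Pa
T_turbine_in = 500.0 + 273.15       # K

# State 1: saturated liquid leaving the condenser
h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)
v1 = 1.0 / PropsSI("D", "P", p_cond, "Q", 0, fluid)

# State 2: after isentropic pump compression (incompressible-liquid approximation)
w_pump = v1 * (p_boiler - p_cond)   # J/kg
h2 = h1 + w_pump

# State 3: superheated steam at the turbine inlet
h3 = PropsSI("H", "P", p_boiler, "T", T_turbine_in, fluid)
s3 = PropsSI("S", "P", p_boiler, "T", T_turbine_in, fluid)

# State 4: isentropic expansion to condenser pressure
h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)

eta = ((h3 - h4) - w_pump) / (h3 - h2)
print(f"Ideal Rankine cycle thermal efficiency: {eta:.3f}")
```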

  13. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  14. Distributed design tools: Mapping targeted design tools onto a Web-based distributed architecture for high-performance computing

    SciTech Connect

    Holmes, V.P.; Linebarger, J.M.; Miller, D.J.; Poore, C.A.

    1999-11-30

    Design Tools use a Web-based Java interface to guide a product designer through the design-to-analysis cycle for a specific, well-constrained design problem. When these Design Tools are mapped onto a Web-based distributed architecture for high-performance computing, the result is a family of Distributed Design Tools (DDTs). The software components that enable this mapping consist of a Task Sequencer, a generic Script Execution Service, and the storage of both data and metadata in an active, object-oriented database called the Product Database Operator (PDO). The benefits of DDTs include improved security, reliability, scalability (in both problem size and computing hardware), robustness, and reusability. In addition, access to the PDO unlocks its wide range of services for distributed components, such as lookup and launch capability, persistent shared memory for communication between cooperating services, state management, event notification, and archival of design-to-analysis session data.

  15. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  16. Non-sequential optimization technique for a computer controlled optical surfacing process using multiple tool influence functions.

    PubMed

    Kim, Dae Wook; Kim, Sug-Whan; Burge, James H

    2009-11-23

    Optical surfaces can be accurately figured by computer controlled optical surfacing (CCOS) that uses well characterized sub-diameter polishing tools driven by numerically controlled (NC) machines. The motion of the polishing tool is optimized to vary the dwell time of the polisher on the workpiece according to the desired removal and the calibrated tool influence function (TIF). Operating CCOS with small and very well characterized TIF achieves excellent performance, but it takes a long time. This overall polishing time can be reduced by performing sequential polishing runs that start with large tools and finish with smaller tools. In this paper we present a variation of this technique that uses a set of different size TIFs, but the optimization is performed globally - i.e. simultaneously optimizing the dwell times and tool shapes for the entire set of polishing runs. So the actual polishing runs will be sequential, but the optimization is comprehensive. As the optimization is modified from the classical method to the comprehensive non-sequential algorithm, the performance improvement is significant. For representative polishing runs we show figuring efficiency improvement from approximately 88% to approximately 98% in terms of residual RMS (root-mean-square) surface error and from approximately 47% to approximately 89% in terms of residual RMS slope error.
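
    The classical (sequential, single-TIF) dwell-time optimization that the paper generalizes can be sketched as a non-negative least-squares problem; the 1-D Python toy below uses an assumed Gaussian TIF and an invented removal map, and is not the authors' non-sequential multi-TIF algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# 1-D toy dwell-time optimization: removal = TIF footprint weighted by dwell time t >= 0.
x = np.linspace(-1.0, 1.0, 81)           # surface coordinates (arbitrary units)
r = 50.0 * np.exp(-x**2 / 0.3)           # hypothetical desired removal map

def tif(dx, peak=5.0, width=0.15):
    """Removal rate of the (assumed Gaussian) tool influence function at offset dx."""
    return peak * np.exp(-dx**2 / (2.0 * width**2))

# Removal matrix: column j is the footprint of the tool dwelling at position x[j]
A = np.array([[tif(xi - xj) for xj in x] for xi in x])

# Non-negative least squares yields physically meaningful (>= 0) dwell times
t, _ = nnls(A, r)
rms_error = np.sqrt(np.mean((A @ t - r)**2))
print(f"residual RMS removal error: {rms_error:.2f}")
```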

  17. Accurate ab initio tight-binding Hamiltonians: Effective tools for electronic transport and optical spectroscopy from first principles

    NASA Astrophysics Data System (ADS)

    D'Amico, Pino; Agapito, Luis; Catellani, Alessandra; Ruini, Alice; Curtarolo, Stefano; Fornari, Marco; Nardelli, Marco Buongiorno; Calzolari, Arrigo

    2016-10-01

    The calculations of electronic transport coefficients and optical properties require a very dense interpolation of the electronic band structure in reciprocal space that is computationally expensive and may have issues with band crossings and degeneracies. Capitalizing on a recently developed pseudoatomic orbital projection technique, we exploit the exact tight-binding representation of the first-principles electronic structure for the purposes of (i) providing an efficient strategy to explore the full band structure En(k), (ii) computing the momentum operator by differentiating the Hamiltonian directly, and (iii) calculating the imaginary part of the dielectric function. This enables us to determine the Boltzmann transport coefficients and the optical properties within the independent particle approximation. In addition, the local nature of the tight-binding representation facilitates the calculation of the ballistic transport within the Landauer theory for systems with hundreds of atoms. In order to validate our approach we study the multivalley band structure of CoSb3 and a large core-shell nanowire using the ACBN0 functional. In CoSb3 we point out the many band minima contributing to the electronic transport that enhance the thermoelectric properties; for the core-shell nanowire we identify possible mechanisms for photo-current generation and justify the presence of protected transport channels in the wire.
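
    To illustrate why a tight-binding representation makes dense band interpolation cheap, here is a minimal one-band, 1-D nearest-neighbour example in Python (a textbook toy, not the pseudoatomic-orbital projection or the ACBN0 calculations of the paper).

```python
import numpy as np

# One orbital per site, onsite energy eps0, hopping t: E(k) = eps0 - 2 t cos(k a).
eps0, t, a = 0.0, 1.0, 1.0
k = np.linspace(-np.pi / a, np.pi / a, 201)   # dense k-sampling is essentially free

E = eps0 - 2.0 * t * np.cos(k * a)

# Band velocity dE/dk, the basic ingredient of Boltzmann transport integrals
v = np.gradient(E, k)
print(f"bandwidth = {E.max() - E.min():.2f} (units of t), max |v| = {np.abs(v).max():.2f}")
```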

  18. FAMBE-pH: a fast and accurate method to compute the total solvation free energies of proteins.

    PubMed

    Vorobjev, Yury N; Vila, Jorge A; Scheraga, Harold A

    2008-09-04

    A fast and accurate method to compute the total solvation free energies of proteins as a function of pH is presented. The method makes use of a combination of approaches, some of which have already appeared in the literature; (i) the Poisson equation is solved with an optimized fast adaptive multigrid boundary element (FAMBE) method; (ii) the electrostatic free energies of the ionizable sites are calculated for their neutral and charged states by using a detailed model of atomic charges; (iii) a set of optimal atomic radii is used to define a precise dielectric surface interface; (iv) a multilevel adaptive tessellation of this dielectric surface interface is achieved by using multisized boundary elements; and (v) 1:1 salt effects are included. The equilibrium proton binding/release is calculated with the Tanford-Schellman integral if the proteins contain more than approximately 20-25 ionizable groups; for a smaller number of ionizable groups, the ionization partition function is calculated directly. The FAMBE method is tested as a function of pH (FAMBE-pH) with three proteins, namely, bovine pancreatic trypsin inhibitor (BPTI), hen egg white lysozyme (HEWL), and bovine pancreatic ribonuclease A (RNaseA). The results are (a) the FAMBE-pH method reproduces the observed pKa's of the ionizable groups of these proteins within an average absolute value of 0.4 pK units and a maximum error of 1.2 pK units and (b) comparison of the calculated total pH-dependent solvation free energy for BPTI, between the exact calculation of the ionization partition function and the Tanford-Schellman integral method, shows agreement within 1.2 kcal/mol. These results indicate that calculation of total solvation free energies with the FAMBE-pH method can provide an accurate prediction of protein conformational stability at a given fixed pH and, if coupled with molecular mechanics or molecular dynamics methods, can also be used for more realistic studies of protein folding, unfolding, and
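
    The direct evaluation of the ionization partition function mentioned for proteins with few titratable sites can be sketched by brute-force enumeration of protonation microstates; the Python toy below uses invented intrinsic pKa's and site-site couplings and is not the FAMBE-pH model.

```python
import itertools
import numpy as np

# Toy direct ionization partition function for a few titratable sites.
# pKa_int are assumed intrinsic pKa's; W[i][j] are assumed pairwise couplings
# (in pK units) acting when both sites i and j are deprotonated (charged acids).
pKa_int = np.array([4.0, 6.5, 10.2])
W = 0.3 * (np.ones((3, 3)) - np.eye(3))

def mean_protonation(pH):
    """Average number of bound protons from explicit enumeration of 2**N microstates."""
    ln10 = np.log(10.0)
    Z, n_sum = 0.0, 0.0
    for state in itertools.product([0, 1], repeat=len(pKa_int)):  # 1 = protonated
        s = np.array(state)
        # microstate free energy in kT relative to the fully deprotonated state
        G = ln10 * np.sum(s * (pH - pKa_int)) + ln10 * 0.5 * (1 - s) @ W @ (1 - s)
        w = np.exp(-G)
        Z += w
        n_sum += w * s.sum()
    return n_sum / Z

for pH in (3, 5, 7, 9, 11):
    print(pH, round(mean_protonation(pH), 2))
```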

  19. Coordinated Computer-Supported Collaborative Learning: Awareness and Awareness Tools

    ERIC Educational Resources Information Center

    Janssen, Jeroen; Bodemer, Daniel

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members' activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to cognitive group awareness (e.g., information…

  20. A Web Browsing Tool for a Shared Computer Environment

    ERIC Educational Resources Information Center

    Bodnar, George H.

    2007-01-01

    This paper provides a Microsoft .NET framework application that makes browsing the Internet in a shared computer environment convenient and secure. One simply opens the program, then points and clicks to both open Internet Explorer and have it move directly to the selected address. Addresses do not need to be manually entered or copied and pasted…

  1. Computer Vision Tools for Finding Images and Video Sequences.

    ERIC Educational Resources Information Center

    Forsyth, D. A.

    1999-01-01

    Computer vision offers a variety of techniques for searching for pictures in large collections of images. Appearance methods compare images based on the overall content of the image using certain criteria. Finding methods concentrate on matching subparts of images, defined in a variety of ways, in hope of finding particular objects. These ideas…

  2. Integrated computational materials engineering: Tools, simulations and new applications

    SciTech Connect

    Madison, Jonathan D.

    2016-03-30

    Here, Integrated Computational Materials Engineering (ICME) is a relatively new methodology full of tremendous potential to revolutionize how science, engineering and manufacturing work together. ICME was motivated by the desire to derive greater understanding throughout each portion of the development life cycle of materials, while simultaneously reducing the time between discovery to implementation [1,2].

  3. Computer Generated Optical Illusions: A Teaching and Research Tool.

    ERIC Educational Resources Information Center

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  4. Algorithmic Tools and Computational Frameworks for Cell Informatics

    DTIC Science & Technology

    2006-04-01

    Simulations of various biological systems, including C. elegans gonad tract cells, are presented. Several experiments on the nematode C. elegans were conducted in cooperation with colleagues in the NYU Department of Biology, in order to test ... proliferation. No animal research was conducted under this project. To this end, a rigorous computational model of C. elegans germ line stem cell growth

  5. Nanobioinformatics: Emerging Computational Tools to Understand Nano-Bio Interaction

    DTIC Science & Technology

    2012-11-16

    silver nitrate, titanium dioxide, and many more. Many studies have been reported; these nanoparticles individually are not potent but in conjugated form ... titanium dioxide, when injected alone, produces little change in enzyme (malondialdehyde, glutathione reductase, superoxide dismutase) concentrations ... continued usage in the medical field for imaging, delivery of drugs and as sunscreen (cosmetic), etc. Development of a prediction tool for

  6. Developing Battery Computer Aided Engineering Tools for Military Vehicles

    DTIC Science & Technology

    2013-12-01

    Exothermic reactions represented include solid electrolyte interphase (SEI) layer decomposition, anode-electrolyte, cathode-electrolyte, and electrolyte decomposition, with onset temperatures of roughly 80, 100, 130, and 180 °C, respectively. ... performance, NREL and the University of Colorado at Boulder coded and linked a solid mechanics model to explore mechanical phenomena in lithium-ion ... electrified military vehicles. Particularly, TARDEC's objective was the development of tools to accelerate comparative analysis of alternative lithium-ion

  7. Lilith: A scalable secure tool for massively parallel distributed computing

    SciTech Connect

    Armstrong, R.C.; Camp, L.J.; Evensky, D.A.; Gentile, A.C.

    1997-06-01

    Changes in high performance computing have necessitated the ability to utilize and interrogate potentially many thousands of processors. The ASCI (Advanced Strategic Computing Initiative) program conducted by the United States Department of Energy, for example, envisions thousands of distinct operating systems connected by low-latency gigabit-per-second networks. In addition, multiple systems of this kind will be linked via high-capacity networks with latencies as low as the speed of light will allow. Code which spans systems of this sort must be scalable; yet constructing such code, whether for applications, debugging, or maintenance, is an unsolved problem. Lilith is a research software platform that attempts to answer these questions with an eye toward meeting these needs. Presently, Lilith exists as a test-bed, written in Java, for various spanning algorithms and security schemes. The test-bed software has, and enforces, hooks allowing implementation and testing of various security schemes.

  8. Computer aided systems human engineering: A hypermedia tool

    NASA Technical Reports Server (NTRS)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  9. Bridging the Gap: Using Interactive Computer Tools To Build Fraction Schemes.

    ERIC Educational Resources Information Center

    Olive, John

    2002-01-01

    Explores ways to help children make connections between whole-number multiplication and their notion of a fraction. Illustrates an approach to constructing fraction concepts that builds on children's whole-number knowledge using specially designed computer tools. (KHR)

  10. Computers as Management Tools: Acceptance by Nursing Personnel

    PubMed Central

    Charters, K. G. J.

    1981-01-01

    A review of the capabilities of “home” computers (e.g.: Apple III or TRS-80 Model III) indicates these inexpensive machines could automate as much as 80% of nursing service paperwork. With software currently available, a complete novice could run such a system after just a few hours of training. Challenges to be met in establishing a nursing information management system include staff reactions of fear, distrust, resentment, disbelief, and skepticism, and occasional feelings of professional inferiority.

  11. Computational tools for the evaluation of laboratory-engineered biocatalysts

    PubMed Central

    Romero-Rivera, Adrian; Garcia-Borràs, Marc

    2017-01-01

    Biocatalysis is based on the application of natural catalysts for new purposes, for which enzymes were not designed. Although the first examples of biocatalysis were reported more than a century ago, biocatalysis was revolutionized after the discovery of an in vitro version of Darwinian evolution called Directed Evolution (DE). Despite the recent advances in the field, major challenges remain to be addressed. Currently, the best experimental approach consists of creating multiple mutations simultaneously while limiting the choices using statistical methods. Still, tens of thousands of variants need to be tested experimentally, and little information is available on how these mutations lead to enhanced enzyme proficiency. This review aims to provide a brief description of the available computational techniques to unveil the molecular basis of improved catalysis achieved by DE. An overview of the strengths and weaknesses of current computational strategies is explored with some recent representative examples. The understanding of how this powerful technique is able to obtain highly active variants is important for the future development of more robust computational methods to predict amino-acid changes needed for activity. PMID:27812570

  12. Present status of computational tools for Maglev development

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Chen, S. S.; Rote, D. M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the Federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report.

  13. Present status of computational tools for maglev development

    SciTech Connect

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  14. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
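
    The flavour of such a Markov-chain reliability analysis can be shown with a small absorbing-chain example in Python; the states and transition probabilities below are invented and do not come from the COSMIC-FFP case study.

```python
import numpy as np

# Toy absorbing Markov chain over tool components, with "success" and "failure"
# absorbing states. Reliability = probability of absorption in "success".
# Transient states: 0 acquire image, 1 parse artifacts, 2 generate report.
Q = np.array([              # transient -> transient transition probabilities (assumed)
    [0.00, 0.96, 0.00],
    [0.00, 0.00, 0.97],
    [0.00, 0.00, 0.00],
])
R = np.array([              # transient -> [success, failure] (assumed)
    [0.00, 0.04],
    [0.00, 0.03],
    [0.99, 0.01],
])

N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix of the absorbing chain
B = N @ R                          # absorption probabilities from each transient state
print(f"tool reliability starting from 'acquire image': {B[0, 0]:.3f}")
```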

  15. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  16. Brain-computer interface technology as a tool to augment plasticity and outcomes for neurological rehabilitation.

    PubMed

    Dobkin, Bruce H

    2007-03-15

    Brain-computer interfaces (BCIs) are a rehabilitation tool for tetraplegic patients that aim to improve quality of life by augmenting communication, control of the environment, and self-care. The neurobiology of both rehabilitation and BCI control depends upon learning to modify the efficacy of spared neural ensembles that represent movement, sensation and cognition through progressive practice with feedback and reward. To serve patients, BCI systems must become safe, reliable, cosmetically acceptable, quickly mastered with minimal ongoing technical support, and highly accurate even in the face of mental distractions and the uncontrolled environment beyond a laboratory. BCI technologies may raise ethical concerns if their availability affects the decisions of patients who become locked-in with brain stem stroke or amyotrophic lateral sclerosis to be sustained with ventilator support. If BCI technology becomes flexible and affordable, volitional control of cortical signals could be employed for the rehabilitation of motor and cognitive impairments in hemiplegic or paraplegic patients by offering on-line feedback about cortical activity associated with mental practice, motor intention, and other neural recruitment strategies during progressive task-oriented practice. Clinical trials with measures of quality of life will be necessary to demonstrate the value of near-term and future BCI applications.

  17. Brain–computer interface technology as a tool to augment plasticity and outcomes for neurological rehabilitation

    PubMed Central

    Dobkin, Bruce H

    2007-01-01

    Brain–computer interfaces (BCIs) are a rehabilitation tool for tetraplegic patients that aim to improve quality of life by augmenting communication, control of the environment, and self-care. The neurobiology of both rehabilitation and BCI control depends upon learning to modify the efficacy of spared neural ensembles that represent movement, sensation and cognition through progressive practice with feedback and reward. To serve patients, BCI systems must become safe, reliable, cosmetically acceptable, quickly mastered with minimal ongoing technical support, and highly accurate even in the face of mental distractions and the uncontrolled environment beyond a laboratory. BCI technologies may raise ethical concerns if their availability affects the decisions of patients who become locked-in with brain stem stroke or amyotrophic lateral sclerosis to be sustained with ventilator support. If BCI technology becomes flexible and affordable, volitional control of cortical signals could be employed for the rehabilitation of motor and cognitive impairments in hemiplegic or paraplegic patients by offering on-line feedback about cortical activity associated with mental practice, motor intention, and other neural recruitment strategies during progressive task-oriented practice. Clinical trials with measures of quality of life will be necessary to demonstrate the value of near-term and future BCI applications. PMID:17095557

  18. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control regulates the speed of the tool's fastener interface mechanism and the torque it applies to fasteners, and monitors the tool's operating parameters. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool directly or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select torque and speed parameters for each fastener.
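
    As a loose, hypothetical illustration of the kind of control loop such in-tool software might run (drive at a set speed, stop at a target torque, and log a performance history), here is a short Python sketch. The sensor/motor interface, names, thresholds, and timing are all assumptions, not the patented method.

    ```python
    # Hypothetical sketch of a torque-limited fastener run: drive at a set speed,
    # stop when the measured torque reaches the setpoint, and record a history.
    # The hardware interface (read_torque/set_speed) and all values are assumptions.
    import time
    from dataclasses import dataclass, field

    @dataclass
    class FastenerRun:
        target_torque_nm: float
        speed_rpm: float
        history: list = field(default_factory=list)  # (elapsed_s, torque_nm) samples

    def run_fastener(read_torque, set_speed, target_torque_nm=12.0, speed_rpm=300.0,
                     timeout_s=5.0, sample_s=0.01):
        """Drive until the measured torque reaches the setpoint or a timeout expires."""
        run = FastenerRun(target_torque_nm, speed_rpm)
        set_speed(speed_rpm)
        t0 = time.monotonic()
        while time.monotonic() - t0 < timeout_s:
            torque = read_torque()
            run.history.append((time.monotonic() - t0, torque))
            if torque >= target_torque_nm:
                break
            time.sleep(sample_s)
        set_speed(0.0)  # always stop the driver
        return run

    if __name__ == "__main__":
        # Bench demo with a fake sensor whose torque ramps linearly (hardware assumed away).
        t0 = time.monotonic()
        run = run_fastener(lambda: 4.0 * (time.monotonic() - t0), lambda rpm: None)
        print(f"samples: {len(run.history)}, final torque: {run.history[-1][1]:.1f} N*m")
    ```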

  19. Developing Tools for Computation of Basin Topographic Parameters in GIS

    NASA Astrophysics Data System (ADS)

    Gökgöz, T.; Yayla, Y.; Yaman, M. B.; Güvenç, H.; Kaya, S.

    2016-10-01

    Although water use has been increasing day by day with rapid population growth, urbanization and industrialization around the world, the potential of usable water resources remains constant. At the same time, the expansion of agricultural activities, industrialization, urbanization, global warming and climate change place great pressure on current water resources. Management of water resources is therefore one of today's most significant problems, and "Integrated Basin Management" has gained importance worldwide as a way to reduce environmental problems by using current water resources more efficiently. Achieving integrated basin management requires determining basin boundaries with sufficient accuracy and precision and encoding them systematically. Various basin-scale analyses also need topographic parameters such as shape factor, bifurcation ratio, drainage frequency, drainage density, length of the main flow path, harmonic slope, average slope, time of concentration, hypsometric curve and maximum elevation difference. Nowadays, basin boundaries are obtained from digital elevation models in geographical information systems, but ready-made tools for these topographic parameters are not available. In this study, programs were written in the Python programming language for the afore-mentioned topographic parameters, and each was turned into a geographical information system tool. This fills a gap in geographical information systems for the topographic parameters needed in almost every hydrological analysis.
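
    The parameters listed above reduce to simple functions of quantities that a DEM workflow already yields (areas, lengths, stream counts, slopes). As a hedged illustration only, not the authors' toolbox, the Python sketch below computes a few of them; the function names, the Kirpich time-of-concentration formula, and the example numbers are assumptions.

    ```python
    # Minimal sketch (not the published GIS toolbox): a few of the listed basin
    # parameters computed from quantities a DEM analysis would already provide.
    # The Kirpich formula and all names/values here are illustrative assumptions.

    def form_factor(basin_area_km2: float, main_flow_length_km: float) -> float:
        """Shape (form) factor: basin area divided by the squared main flow path length."""
        return basin_area_km2 / main_flow_length_km ** 2

    def drainage_density(total_stream_length_km: float, basin_area_km2: float) -> float:
        """Total stream length per unit basin area (km per km^2)."""
        return total_stream_length_km / basin_area_km2

    def bifurcation_ratio(n_streams_order_u: int, n_streams_order_u_plus_1: int) -> float:
        """Number of streams of one order over the number of the next higher order."""
        return n_streams_order_u / n_streams_order_u_plus_1

    def time_of_concentration_kirpich(main_flow_length_m: float, slope_m_per_m: float) -> float:
        """Kirpich estimate of time of concentration in minutes (one common choice)."""
        return 0.0195 * main_flow_length_m ** 0.77 * slope_m_per_m ** -0.385

    if __name__ == "__main__":
        # Hypothetical basin: 120 km^2, 25 km main flow path, 95 km of streams, 2% slope.
        print(round(form_factor(120.0, 25.0), 3))
        print(round(drainage_density(95.0, 120.0), 3))
        print(round(time_of_concentration_kirpich(25_000.0, 0.02), 1))
    ```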

  20. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    PubMed Central

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, analysis, and evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 (Clustered Regularly Interspaced Short Palindromic Repeats) system was shown to have the potential to regulate gene expression at both the transcriptional and post-transcriptional levels in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches. PMID:25566532

  1. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed Central

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido

    2015-01-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, generated a model that overlapped the original image with good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools that enable technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. PMID:26427894

  2. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, generated a model that overlapped the original image with good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools that enable technicians and physicians to reduce working time and reach a more accurate diagnosis is needed.

  3. Technical tips: verification of accurate placement and labeling of 10-10 scalp electrodes and intracranial grid/strip electrodes using documentation tools.

    PubMed

    Feravich, Susan M; Keller, Crystal M

    2012-06-01

    In some instances, evaluation of seizure activity may require the addition of 10-10 scalp electrodes or the placement of intracranial grids and strips. At any given time, different technologists may be responsible for the placement, addition, and care of electrodes for the same patient. The presence of extra surface electrodes or extensive coverage of the brain with intracranial electrodes increases the risk of incorrect placement and labeling, which can cause treatment errors based on inaccurate reading of EEG recordings. Procedures and documentation tools should be put into place to correctly place, label, and hook up extra 10-10 scalp and intracranial electrodes without errors. By using written processes and documentation tools, staff are better able to acquire safe and accurate patient data, which improves patient outcomes. The processes for placement and hook-up of 10-10 scalp electrodes and of intracranial grid and strip electrodes are different and require separate procedures and documentation tools to ensure accuracy. For 10-10 scalp electrode placement, the use of a 10-10 map, labeled tape, and non-duplicating adjacent electrode colors reduces the risk of error. Documentation of intracranial grid/strip electrodes includes a placement map, a list of electrode locations in the amplifier, and a table of cables and corresponding grids/strips with colors. Accurate hook-up is verified by the technologist and the epileptologist and is documented on the recording. With the use of documentation tools and verification procedures, the quality of patient outcomes increases while the potential for recording errors is reduced.

  4. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented, stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace of hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor-scale model, a feature-level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims to describe the various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders-of-magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large-scale computational chemistry efforts are critical, as experiments alone cannot meet database needs due to the difficulties associated with such measurements.

  5. An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.

    ERIC Educational Resources Information Center

    Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.

    1999-01-01

    Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)

  6. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    EPA Science Inventory

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  7. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  8. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    ERIC Educational Resources Information Center

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  9. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  10. The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2008-01-01

    This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…

  11. Which Way Will the Wind Blow? Networked Computer Tools for Studying the Weather.

    ERIC Educational Resources Information Center

    Fishman, Barry J.; D'Amico, Laura M.

    A suite of networked computer tools within a pedagogical framework was designed to enhance earth science education at the high school level. These tools give students access to live satellite images, weather maps, and other scientific data dealing with the weather, and make it easy for students to make their own weather forecasts by creating…

  12. Computational Tools for Interpreting Ion Channel pH-Dependence

    PubMed Central

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) – Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903
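
    A quick way to see what a shifted pKa implies for pH-dependence is the Henderson-Hasselbalch relation. The sketch below is generic acid-base arithmetic, not the FDPB-Debye-Hückel model described above, and the pKa values in the example are assumed for illustration.

    ```python
    # Generic Henderson-Hasselbalch sketch: protonated fraction of a titratable
    # group versus pH, and how an assumed mutation-induced pKa shift moves it.
    # This is illustrative arithmetic, not the continuum electrostatic model.

    def protonated_fraction(ph: float, pka: float) -> float:
        """Fraction of the group carrying the proton at a given pH."""
        return 1.0 / (1.0 + 10.0 ** (ph - pka))

    if __name__ == "__main__":
        for ph in (5.0, 6.0, 6.5, 7.0, 7.4, 8.0):
            wild_type = protonated_fraction(ph, pka=6.5)  # histidine-like group (assumed)
            mutant = protonated_fraction(ph, pka=7.2)     # hypothetical shifted pKa
            print(f"pH {ph:4.1f}  WT {wild_type:.2f}  mutant {mutant:.2f}")
    ```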

  13. Computational tool for morphological analysis of cultured neonatal rat cardiomyocytes.

    PubMed

    Leite, Maria Ruth C R; Cestari, Idágene A; Cestari, Ismar N

    2015-08-01

    This study describes the development and evaluation of a semiautomatic myocyte edge-detector using digital image processing. The algorithm was developed in Matlab 6.0 using the SDC Morphology Toolbox. Its conceptual basis is mathematical morphology theory together with the watershed and Euclidean distance transformations. The algorithm enables the user to select cells within an image for automatic detection of their borders and calculation of their surface areas; these areas are determined by counting the pixels within each myocyte's boundaries. The algorithm was applied to images of cultured ventricular myocytes from neonatal rats. The edge-detector allowed the identification and quantification of morphometric alterations in cultured isolated myocytes induced by 72 hours of exposure to a hypertrophic agent (50 μM phenylephrine). There was a significant increase in the mean surface area of the phenylephrine-treated cells compared with the control cells (p < 0.05), corresponding to cellular hypertrophy of approximately 50%. In conclusion, this edge-detector provides a rapid, repeatable and accurate measurement of cell surface areas in a standardized manner. Other possible applications include morphologic measurement of other types of cultured cells and analysis of time-related morphometric changes in adult cardiac myocytes.
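
    The detector itself is MATLAB-based (SDC Morphology Toolbox); as a hedged Python analogue of the same idea, the sketch below combines an Euclidean distance transform with a watershed to split touching cells and then counts pixels per labeled cell. The thresholding choice, marker spacing, and pixel calibration are assumptions.

    ```python
    # Hedged Python analogue (scikit-image) of the distance-transform/watershed idea;
    # the published tool is MATLAB-based. Threshold and parameters are assumptions.
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import filters, io, measure
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def myocyte_areas(image_path: str, pixel_area_um2: float = 1.0):
        gray = io.imread(image_path, as_gray=True)
        cells = gray < filters.threshold_otsu(gray)      # assume dark cells, light background
        distance = ndi.distance_transform_edt(cells)     # Euclidean distance transform
        peaks = peak_local_max(distance, min_distance=20, labels=cells)
        markers = np.zeros(distance.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-distance, markers, mask=cells)  # split touching cells
        # Surface area of each detected myocyte = pixel count * pixel area
        return [region.area * pixel_area_um2 for region in measure.regionprops(labels)]
    ```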

  14. Computational Tools for Interpreting Ion Channel pH-Dependence.

    PubMed

    Sazanavets, Ivan; Warwicker, Jim

    2015-01-01

    Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) - Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone.

  15. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  16. Computer animation as a tool to study preferences in the cichlid Pelvicachromis taeniatus.

    PubMed

    Baldauf, S A; Kullmann, H; Thünken, T; Winter, S; Bakker, T C M

    2009-08-01

    Four choice experiments were conducted with both sexes of the cichlid Pelvicachromis taeniatus using computer-manipulated stimuli of digital images differing in movement, body shape or colouration. The results show that computer animations can be useful and flexible tools in studying preferences of a cichlid with complex and variable preferences for different visual cues.

  17. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    ERIC Educational Resources Information Center

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  18. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc Flash Protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.
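
    As a hedged illustration of the kind of formula such programs build on, the sketch below uses the classic Ralph Lee arc flash boundary approximation for systems above 600 V, Dc = sqrt(2.65 x MVAbf x t), with the boundary in feet, bolted-fault capacity in MVA, and clearing time in seconds; commercial tools add the IEEE 1584 empirical equations and pull the clearing time from the device time-current curves. The example bus values are assumptions.

    ```python
    # Hedged sketch of the Ralph Lee flash-boundary approximation (systems > 600 V);
    # real tools also apply the IEEE 1584 empirical model and coordination curves.
    import math

    def bolted_fault_mva(kv_line_to_line: float, bolted_fault_ka: float) -> float:
        """Three-phase bolted-fault capacity in MVA."""
        return math.sqrt(3) * kv_line_to_line * bolted_fault_ka

    def lee_flash_boundary_ft(mva_bf: float, clearing_time_s: float) -> float:
        """Arc flash protection boundary in feet: Dc = sqrt(2.65 * MVA_bf * t)."""
        return math.sqrt(2.65 * mva_bf * clearing_time_s)

    if __name__ == "__main__":
        # Hypothetical 13.8 kV bus, 20 kA bolted fault, cleared in 0.1 s.
        mva = bolted_fault_mva(13.8, 20.0)
        print(f"MVA_bf = {mva:.0f}, boundary = {lee_flash_boundary_ft(mva, 0.1):.1f} ft")
    ```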

  19. Industrial Computed Tomography (ICT) System—A Versatile Tool

    NASA Astrophysics Data System (ADS)

    Muralidhar, C.; Rao, G. V. Siva; Kumaran, K.; Subramanian, M. P.; Lukose, Sijo N.; Reddy, M. Venkata

    2008-09-01

    The Industrial Computed Tomography (ICT) system has been developed indigenously for Non Destructive Evaluation (NDE) applications. The ICT system consists of a 450 keV X-ray source, a 256-channel detector array, a 6-axis mechanical manipulator, a data acquisition system and reconstruction software. The system handles objects up to 1000 mm (shell) diameter, 10 m height and 2000 kg weight with a spatial resolution of 1 line pair (lp)/mm. Metallic, nonmetallic and composite hardware, including assemblies of varying densities and sizes, were scanned and analyzed for identification of defects such as delaminations, debonds, cracks, voids and foreign inclusions, and for internal details. Various image-processing techniques were employed for better visualization and interpretation of defects. The defects were analyzed for their location, area, depth and dimensions. The salient features of the ICT system in handling a wide variety of hardware are highlighted.

  20. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    PubMed

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling and (4) matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI) analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase.
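
    The core comparison described above (direct versus inverse 18O-labeling ratios, keeping only peptides whose losses track in parallel) can be illustrated with a heavily hedged sketch; the consistency test, tolerance, and data layout below are assumptions, not the actual DPD algorithm.

    ```python
    # Heavily hedged sketch of the direct/inverse 18O ratio comparison; the
    # tolerance criterion and data layout are illustrative assumptions only.

    def labeling_ratio(intensity_labeled: float, intensity_unlabeled: float) -> float:
        return intensity_labeled / intensity_unlabeled

    def paralleled_peptides(direct: dict, inverse: dict, tolerance: float = 0.2):
        """Keep peptides whose direct ratio matches the reciprocal of their inverse
        ratio within `tolerance`; these become internal standards for quantification."""
        keep = []
        for peptide, (labeled, unlabeled) in direct.items():
            if peptide not in inverse:
                continue
            r_direct = labeling_ratio(labeled, unlabeled)
            r_inverse = labeling_ratio(*inverse[peptide])
            if abs(r_direct - 1.0 / r_inverse) <= tolerance * r_direct:
                keep.append(peptide)
        return keep

    if __name__ == "__main__":
        # Hypothetical peak intensities (labeled, unlabeled) for two peptides.
        direct = {"PEPTIDE_A": (8.0e4, 1.0e5), "PEPTIDE_B": (5.0e4, 9.0e4)}
        inverse = {"PEPTIDE_A": (1.0e5, 8.2e4), "PEPTIDE_B": (2.0e4, 9.0e4)}
        print(paralleled_peptides(direct, inverse))  # -> ['PEPTIDE_A']
    ```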

  1. Sensor design using computer tools II; Proceedings of the Meeting, Arlington, VA, April 11, 12, 1985

    NASA Astrophysics Data System (ADS)

    Jamieson, J. A.

    The present conference discusses topics in the computerized simulation of electronic sensor performance, subsystem design and testing, quality design and verification, and the relationship of optical engineering professionals to their computer design tools. Attention is given to advances in Landsat image processing and mapping, the modeling of linear scan electrooptic sensors, computer simulation-based design of multispectral scanners, laser imager computer simulation, and precise space telescope pointing by means of a quadrant detector. Also discussed are an adaptive telescope design using computer tools, a focal plane products data base, the use of personal computers in optical design, and the computer simulation of focal plane array performance using coupled ray trace and carrier diffusion models.

  2. Tool or Science? The History of Computing at the Norwegian University of Science and Technology

    NASA Astrophysics Data System (ADS)

    Nordal, Ola

    One may characterize the history of computing at the Norwegian University of Science and Technology by a tension between the computer as a tool in other disciplines and computer science as a discipline in itself. This tension has been latent from the pioneering period of the 1950s until today. This paper shows how it has been expressed in the early attempts to take up computing at the University, and how it gave the Division of Computer Science a fairly rough start when it opened in 1972.

  3. Computer-based simulator for radiology: an educational tool.

    PubMed

    Towbin, Alexander J; Paterson, Brian E; Chang, Paul J

    2008-01-01

    In the past decade, radiology has moved from being predominantly film based to predominantly digital. Although in clinical terms the transition has been relatively smooth, the method in which radiology is taught has not kept pace. Simulator programs have proved effective in other specialties as a method for teaching a specific skill set. Because many radiologists already work in the digital environment, a simulator could easily and safely be integrated with a picture archiving and communication system (PACS) and become a powerful tool for radiology education. Thus, a simulator program was designed for the specific purpose of giving residents practice in reading images independently, thereby helping them to prepare more fully for the rigors of being on call. The program is similar to a typical PACS, thus allowing a more interactive learning process, and closely mimics the real-world practice of radiology to help prepare the user for a variety of clinical scenarios. Besides education, other possible uses include certification, testing, and the creation of teaching files.

  4. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

    Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing
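
    The scheduling idea in the abstract (predict each comparison's runtime and submit the longest jobs first so no node idles at the tail) can be sketched as a greedy longest-processing-time assignment; the runtime model, node count, and genome sizes below are assumptions rather than the model actually fitted for Roundup.

    ```python
    # Hedged sketch of runtime-aware job ordering: estimate each genome-to-genome
    # comparison's cost (here, naively, proportional to the product of genome sizes)
    # and assign the longest jobs first to the least-loaded node. All numbers and
    # the cost model are illustrative assumptions.
    import heapq
    from itertools import combinations

    def estimate_runtime_hours(genes_a: int, genes_b: int, k: float = 2.5e-9) -> float:
        return k * genes_a * genes_b  # assumed all-vs-all cost model

    def schedule(genomes: dict, n_nodes: int):
        jobs = sorted(((estimate_runtime_hours(genomes[a], genomes[b]), (a, b))
                       for a, b in combinations(genomes, 2)), reverse=True)
        nodes = [(0.0, i, []) for i in range(n_nodes)]  # (load_hours, node_id, jobs)
        heapq.heapify(nodes)
        for hours, pair in jobs:                        # longest-processing-time greedy
            load, node_id, assigned = heapq.heappop(nodes)
            assigned.append(pair)
            heapq.heappush(nodes, (load + hours, node_id, assigned))
        return nodes                                    # makespan = max load over nodes

    if __name__ == "__main__":
        genomes = {"E_coli": 4_300, "S_cerevisiae": 6_000, "D_melanogaster": 14_000,
                   "H_sapiens": 20_000, "M_musculus": 22_000}
        for load, node_id, assigned in sorted(schedule(genomes, n_nodes=2)):
            print(f"node {node_id}: {load:.2f} h, {len(assigned)} comparisons")
    ```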

  5. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on computational tools and facilities for the analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  6. Value of coronary computed tomography as a prognostic tool.

    PubMed

    Contractor, Tahmeed; Parekh, Maansi; Ahmed, Shameer; Martinez, Matthew W

    2012-08-01

    Coronary computed tomography angiography (CCTA) has become an important part of our armamentarium for noninvasive diagnosis of coronary artery disease (CAD). Emerging technologies have produced lower radiation dose, improved spatial and temporal resolution, as well as information about coronary physiology. Although the prognostic role of coronary artery calcium scoring is known, similar evidence for CCTA has only recently emerged. Initial, small studies in various patient populations have indicated that CCTA-identified CAD may have a prognostic value. These findings were confirmed in a recent analysis of the international, prospective Coronary CT Angiography Evaluation For Clinical Outcomes: An International Multicenter (CONFIRM) registry. An incremental increase in mortality was found with a worse severity of CAD on a per-patient, per-vessel, and per-segment basis. In addition, age-, sex-, and ethnicity-based differences in mortality were also found. Whether changing our management algorithms based on these findings will affect outcomes is unclear. Large prospective studies utilizing targeted management strategies for obstructive and nonobstructive CAD are required to incorporate these recent findings into our daily practice.

  7. A Survey of Computational Tools to Analyze and Interpret Whole Exome Sequencing Data

    PubMed Central

    Robinson, William A.

    2016-01-01

    Whole Exome Sequencing (WES) is the application of next-generation technology to determine the variations in the exome and is becoming a standard approach in studying genetic variants in diseases. Understanding the exomes of individuals at single-base resolution allows the identification of actionable mutations for disease treatment and management. WES technologies have shifted the bottleneck in experimental data production to computationally intensive informatics-based data analysis. Novel computational tools and methods have been developed to analyze and interpret WES data. Here, we review some of the current tools that are being used to analyze WES data. These tools range from the alignment of raw sequencing reads all the way to linking variants to actionable therapeutics. Strengths and weaknesses of each tool are discussed for the purpose of helping researchers make more informed decisions on selecting the best tools to analyze their WES data. PMID:28070503

  8. A Survey of Computational Tools to Analyze and Interpret Whole Exome Sequencing Data.

    PubMed

    Hintzsche, Jennifer D; Robinson, William A; Tan, Aik Choon

    2016-01-01

    Whole Exome Sequencing (WES) is the application of next-generation technology to determine the variations in the exome and is becoming a standard approach in studying genetic variants in diseases. Understanding the exomes of individuals at single-base resolution allows the identification of actionable mutations for disease treatment and management. WES technologies have shifted the bottleneck in experimental data production to computationally intensive informatics-based data analysis. Novel computational tools and methods have been developed to analyze and interpret WES data. Here, we review some of the current tools that are being used to analyze WES data. These tools range from the alignment of raw sequencing reads all the way to linking variants to actionable therapeutics. Strengths and weaknesses of each tool are discussed for the purpose of helping researchers make more informed decisions on selecting the best tools to analyze their WES data.

  9. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high-performance computer simulations to satisfy their demands for large computational resources and short response times. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high-performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly the implementation of numerical algorithms and support for code development, execution and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicated effort, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  10. A New Accurate 3D Measurement Tool to Assess the Range of Motion of the Tongue in Oral Cancer Patients: A Standardized Model.

    PubMed

    van Dijk, Simone; van Alphen, Maarten J A; Jacobi, Irene; Smeele, Ludwig E; van der Heijden, Ferdinand; Balm, Alfons J M

    2016-02-01

    In oral cancer treatment, function loss such as speech and swallowing deterioration can be severe, mostly due to reduced lingual mobility. Until now, there has been no standardized measurement tool for tongue mobility, and pre-operative prediction of function loss is based on expert opinion instead of evidence-based insight. The purpose of this study was to assess the reliability of a triple-camera setup for the measurement of tongue range of motion (ROM) in healthy adults and its feasibility in patients with partial glossectomy. A triple-camera setup was used, and 3D coordinates of the tongue in five standardized tongue positions were obtained in 15 healthy volunteers. Maximum distances between the tip of the tongue and the maxillary midline were calculated. Each participant was recorded twice, and each movie was analysed three times by two separate raters. Intrarater, interrater and test-retest reliability were the main outcome measures. Secondly, feasibility of the method was tested in ten patients treated for oral tongue carcinoma. Intrarater, interrater and test-retest reliability all showed high correlation coefficients of >0.9 in both study groups. All healthy subjects showed perfectly symmetrical tongue ROM. In patients, significant differences in lateral tongue movements were found, due to restricted tongue mobility after surgery. This triple-camera setup is a reliable measurement tool for obtaining three-dimensional information on tongue ROM. It constitutes an accurate tool for objective grading of reduced tongue mobility after partial glossectomy.

  11. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
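
    The bookkeeping behind such a tool (given a mix of generation shares, compute the portfolio's CO2 emissions and cost and check them against a low-carbon target) is straightforward to sketch. The emission factors, levelized costs, shares, and target below are placeholders, not the data behind the actual decision tool.

    ```python
    # Hedged sketch of portfolio bookkeeping; all factors, costs, and the target
    # are placeholder assumptions, not the values used in the published tool.
    TECH = {                      # (emission factor t CO2/MWh, cost $/MWh) -- assumed
        "coal_ccs":    (0.10, 120.0),
        "natural_gas": (0.40,  60.0),
        "nuclear":     (0.00, 100.0),
        "wind":        (0.00,  50.0),
        "efficiency":  (0.00,  30.0),
    }

    def portfolio_summary(shares: dict, demand_twh: float = 100.0):
        if abs(sum(shares.values()) - 1.0) > 1e-6:
            raise ValueError("generation shares must sum to 1")
        mwh = demand_twh * 1e6
        emissions_t = sum(share * mwh * TECH[t][0] for t, share in shares.items())
        cost_usd = sum(share * mwh * TECH[t][1] for t, share in shares.items())
        return emissions_t, cost_usd

    if __name__ == "__main__":
        mix = {"coal_ccs": 0.2, "natural_gas": 0.2, "nuclear": 0.2,
               "wind": 0.3, "efficiency": 0.1}
        co2, cost = portfolio_summary(mix)
        print(f"CO2: {co2 / 1e6:.1f} Mt, cost: ${cost / 1e9:.1f}B, "
              f"meets 20 Mt target: {co2 <= 20e6}")
    ```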

  12. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  13. Development of generalized mapping tools to improve implementation of data driven computer simulations (04-ERD-083)

    SciTech Connect

    Ramirez, A; Pasyanos, M; Franz, G A

    2004-09-17

    The Stochastic Engine (SE) is a data-driven computer simulation tool for predicting the characteristics of complex systems. The SE integrates accurate simulators with the Markov Chain Monte Carlo (MCMC) approach (a stochastic inverse technique) to identify alternative models that are consistent with available data and ranks these alternatives according to their probabilities. Implementation of the SE is currently cumbersome owing to the need to customize the pre-processing and processing steps that are required for a specific application. This project widens the applicability of the Stochastic Engine by generalizing some aspects of the method (i.e., model-to-data transformation types, configuration, model representation). We have generalized several of the transformations that are necessary to match the observations to proposed models. These transformations are sufficiently general that they do not pertain to any single application. This approach provides a framework that increases the efficiency of the SE implementation. The overall goal is to reduce response time and make the approach as "plug-and-play" as possible, and will result in the rapid accumulation of new data types for a host of both earth science and non-earth science problems. When adapting the SE approach to a specific application, various pre-processing and processing steps are typically needed to run a specific problem. Many of these steps are common to a wide variety of specific applications. Here we list and describe several data transformations that are common to a variety of subsurface inverse problems. A subset of these steps has been developed in a generalized form such that they could be used with little or no modification in a wide variety of specific applications. This work was funded by the LDRD Program (tracking number 04-ERD-083).
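
    The coupling the SE relies on (a forward simulator inside a Markov Chain Monte Carlo loop, so that alternative models consistent with the data can be sampled and ranked) is illustrated below with a generic Metropolis sampler over a one-parameter stand-in model; the simulator, noise level, and data are assumptions and not part of the SE code.

    ```python
    # Generic Metropolis MCMC sketch of data-driven inversion: propose parameters,
    # run a stand-in forward simulator, and accept/reject against observed data.
    # The forward model, noise level, and observations are illustrative assumptions.
    import math
    import random

    observed = [2.1, 3.9, 6.2, 7.8]           # hypothetical measurements
    xs = [1.0, 2.0, 3.0, 4.0]

    def simulate(slope: float):
        return [slope * x for x in xs]         # stand-in forward model

    def log_likelihood(slope: float, sigma: float = 0.5) -> float:
        return sum(-0.5 * ((d - m) / sigma) ** 2
                   for d, m in zip(observed, simulate(slope)))

    def metropolis(n_steps: int = 5000, step: float = 0.1, seed: int = 1):
        random.seed(seed)
        slope, ll = 1.0, log_likelihood(1.0)
        samples = []
        for _ in range(n_steps):
            proposal = slope + random.gauss(0.0, step)
            ll_new = log_likelihood(proposal)
            if ll_new >= ll or random.random() < math.exp(ll_new - ll):
                slope, ll = proposal, ll_new   # accept the proposed model
            samples.append(slope)
        return samples

    if __name__ == "__main__":
        samples = metropolis()[1000:]          # discard burn-in
        print(f"posterior mean slope ~ {sum(samples) / len(samples):.2f}")
    ```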

  14. The Vicious Worm: a computer-based Taenia solium education tool.

    PubMed

    Johansen, Maria Vang; Trevisan, Chiara; Braae, Uffe Christian; Magnussen, Pascal; Ertel, Rebekka Lund; Mejer, Helena; Saarnak, Christopher F L

    2014-08-01

    Ignorance is a major obstacle for the effective control of diseases. To provide evidence-based knowledge about prevention and control of Taenia solium cysticercosis, we have developed a computer-based education tool: 'The Vicious Worm'. The tool targets policy makers, professionals, and laypeople, and comprises educational materials including illustrated short stories, videos, and scientific texts designed for the different target groups. We suggest that evidence-based health education is included as a specific control measure in any control programme.

  15. Scale up tools in reactive extrusion and compounding processes. Could 1D-computer modeling be helpful?

    NASA Astrophysics Data System (ADS)

    Pradel, J.-L.; David, C.; Quinebèche, S.; Blondel, P.

    2014-05-01

    Industrial scale-up (or scale-down) in compounding and reactive extrusion processes is one of the most critical R&D challenges. Indeed, most high-performance polymers are obtained through reactive compounding involving chemistry: free-radical grafting, in situ compatibilization, rheology control... but also side reactions: oxidation, branching, chain scission... As described by basic Arrhenius and kinetic laws, the competition between all chemical reactions depends on the residence time distribution and temperature. To ensure the best possible scale-up methodology, we therefore need tools to match the thermal history of the formulation along the screws from a lab-scale twin-screw extruder to an industrial one. This paper proposes a comparison between standard scale-up laws and the use of computer modeling software such as Ludovic® applied and compared to experimental data. Scaling data from one compounding line to another, applying general rules (for example, constant specific mechanical energy), shows differences between experimental and computed data, and the error depends on the screw speed range. For more accurate prediction, 1D computer modeling can be used to optimize the process conditions and ensure the best scale-up product, especially in temperature-sensitive reactive extrusion processes. When the product temperature along the screws is the key, the Ludovic® software can help compute the temperature profile along the screws and extrapolate conditions, or even the screw profile, to industrial extruders.
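
    The central point (that competition between the desired reaction and side reactions is governed by Arrhenius kinetics over the residence-time/temperature history) can be made concrete by integrating first-order conversions along assumed temperature profiles, as in the sketch below; the activation energies, pre-exponential factors, and profiles are illustrative assumptions, not measured extrusion data.

    ```python
    # Hedged sketch of why thermal history matters at scale-up: integrate first-order
    # Arrhenius kinetics for a desired grafting reaction and a degradation side
    # reaction along assumed (temperature, residence time) profiles.
    import math

    R = 8.314  # J/(mol K)

    def k_arrhenius(A: float, Ea_J_mol: float, T_K: float) -> float:
        return A * math.exp(-Ea_J_mol / (R * T_K))

    def conversion(profile, A: float, Ea: float) -> float:
        """First-order conversion X = 1 - exp(-sum(k(T) * dt)) over (T_K, dt_s) segments."""
        kt = sum(k_arrhenius(A, Ea, T) * dt for T, dt in profile)
        return 1.0 - math.exp(-kt)

    if __name__ == "__main__":
        # Assumed (temperature K, residence time s) histories along the screws.
        lab_scale = [(473.0, 10.0), (483.0, 20.0), (478.0, 15.0)]
        industrial = [(473.0, 5.0), (498.0, 15.0), (488.0, 10.0)]  # hotter, shorter
        for name, profile in (("lab", lab_scale), ("industrial", industrial)):
            grafting = conversion(profile, A=1.0e7, Ea=80_000.0)
            degradation = conversion(profile, A=1.0e9, Ea=120_000.0)
            print(f"{name:10s} grafting X = {grafting:.2f}, side reaction X = {degradation:.3f}")
    ```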

  16. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  17. Computer-based tools for decision support at the Hanford Site

    SciTech Connect

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a usable form. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions are largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  19. Time-Accurate, Unstructured-Mesh Navier-Stokes Computations with the Space-Time CESE Method

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2006-01-01

    Application of the newly emerged space-time conservation element solution element (CESE) method to the compressible Navier-Stokes equations is studied. In contrast to Euler equation solvers, several issues such as boundary conditions, numerical dissipation, and grid stiffness warrant systematic investigation and validation. Non-reflecting boundary conditions applied at the truncated boundary are also investigated from the standpoint of acoustic wave propagation. Validations of the numerical solutions are performed by comparing with exact solutions for steady-state as well as time-accurate viscous flow problems. The test cases cover a broad speed regime for problems ranging from acoustic wave propagation to 3D hypersonic configurations. Model problems pertinent to hypersonic configurations demonstrate the effectiveness of the CESE method in treating flows with shocks, unsteady waves, and separations. Good agreement with exact solutions suggests that the space-time CESE method provides a viable alternative for time-accurate Navier-Stokes calculations of a broad range of problems.

  20. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  1. Accurate and computationally efficient prediction of thermochemical properties of biomolecules using the generalized connectivity-based hierarchy.

    PubMed

    Sengupta, Arkajyoti; Ramabhadran, Raghunath O; Raghavachari, Krishnan

    2014-08-14

    In this study we have used the connectivity-based hierarchy (CBH) method to derive accurate heats of formation for a range of biomolecules: 18 amino acids and 10 barbituric acid/uracil derivatives. The hierarchy is based on the connectivity of the different atoms in a large molecule. It results in error-cancellation reaction schemes that are automated, general, and readily usable for a broad range of organic molecules and biomolecules. Herein, we first locate stable conformational and tautomeric forms of these biomolecules using an accurate level of theory (viz. CCSD(T)/6-311++G(3df,2p)). Subsequently, the heats of formation of the amino acids are evaluated using the CBH-1 and CBH-2 schemes and routinely employed density functionals or wave function-based methods. The calculated heats of formation obtained herein using modest levels of theory are in very good agreement with those obtained using the more expensive W1-F12 and W2-F12 methods for the amino acids and with G3 results for the barbituric acid derivatives. Overall, the present study (a) highlights the small effect of including multiple conformers in determining the heats of formation of biomolecules and (b), in concurrence with previous CBH studies, shows that use of the more effective error-cancelling isoatomic scheme (CBH-2) results in more accurate heats of formation with modestly sized basis sets along with common density functionals or wave function-based methods.
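
    The arithmetic behind any such error-cancellation scheme is Hess's law: the computed energy of the balanced CBH reaction is combined with known heats of formation of the reference fragments to back out the target's heat of formation. The sketch below shows that bookkeeping for a generic balanced reaction; the stoichiometry and numbers are placeholders, not a CBH-2 reaction for any specific amino acid.

    ```python
    # Hedged Hess's-law sketch for an error-cancellation scheme:
    #   target + sum(nu_r * reference reactants) -> sum(nu_p * reference products)
    # so  dHf(target) = sum_p nu_p*dHf(p) - sum_r nu_r*dHf(r) - dH_rxn(computed).
    # The example reaction and all numbers are placeholders.

    def heat_of_formation_target(delta_h_rxn_kcal: float, products: dict,
                                 reactant_refs: dict, ref_hof_kcal: dict) -> float:
        h_products = sum(nu * ref_hof_kcal[s] for s, nu in products.items())
        h_reactant_refs = sum(nu * ref_hof_kcal[s] for s, nu in reactant_refs.items())
        return h_products - h_reactant_refs - delta_h_rxn_kcal

    if __name__ == "__main__":
        ref_hof = {"CH4": -17.8, "CH3CH3": -20.0, "CH3NH2": -5.5}  # placeholder kcal/mol
        # Placeholder reaction: target + 2 CH4 -> CH3CH3 + CH3NH2, computed dH_rxn = -3.2 kcal/mol
        print(heat_of_formation_target(-3.2,
                                       products={"CH3CH3": 1, "CH3NH2": 1},
                                       reactant_refs={"CH4": 2},
                                       ref_hof_kcal=ref_hof))
    ```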

  2. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  3. Development of a Computational High-Throughput Tool for the Quantitative Examination of Dose-Dependent Histological Features

    PubMed Central

    Nault, Rance; Colbry, Dirk; Brandenberger, Christina; Harkema, Jack R.; Zacharewski, Timothy R.

    2015-01-01

    High-resolution digitalizing of histology slides facilitates the development of computational alternatives to manual quantitation of features of interest. We developed a MATLAB-based quantitative histological analysis tool (QuHAnT) for the high-throughput assessment of distinguishable histological features. QuHAnT validation was demonstrated by comparison with manual quantitation using liver sections from mice orally gavaged with sesame oil vehicle or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD; 0.001–30 µg/kg) every 4 days for 28 days, which elicits hepatic steatosis with mild fibrosis. A quality control module of QuHAnT reduced the number of quantifiable Oil Red O (ORO)-stained images from 3,123 to 2,756. Increased ORO staining was measured at 10 and 30 µg/kg TCDD with a high correlation between manual and computational volume densities (Vv), although the dynamic range of QuHAnT was 10-fold greater. Additionally, QuHAnT determined the size of each ORO vacuole, which could not be accurately quantitated by visual examination or manual point counting. PicroSirius Red quantitation demonstrated superior collagen deposition detection due to the ability to consider all images within each section. QuHAnT dramatically reduced analysis time and facilitated the comprehensive assessment of features improving accuracy and sensitivity and represents a complementary tool for tissue/cellular features that are difficult and tedious to assess via subjective or semiquantitative methods. PMID:25274660
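
    The two quantities the comparison rests on (the volume density, Vv, of Oil Red O-positive staining and the size of each vacuole) reduce to counting classified pixels and labeled objects. The sketch below is a hedged Python analogue of that bookkeeping (QuHAnT itself is MATLAB-based); the red-dominance threshold and pixel calibration are assumptions.

    ```python
    # Hedged Python analogue of the two measurements described (QuHAnT is MATLAB-based):
    # area fraction of ORO-positive pixels and per-vacuole sizes via labeling.
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import io

    def oro_metrics(image_path: str, pixel_area_um2: float = 0.25, red_margin: int = 40):
        rgb = io.imread(image_path)[..., :3].astype(int)
        # Assume ORO-positive pixels are strongly red-dominant over green and blue.
        stained = ((rgb[..., 0] - rgb[..., 1] > red_margin) &
                   (rgb[..., 0] - rgb[..., 2] > red_margin))
        vv = stained.mean()                      # volume density estimate (area fraction)
        labels, n = ndi.label(stained)           # individual vacuoles
        sizes_um2 = ndi.sum(stained, labels, index=range(1, n + 1)) * pixel_area_um2
        return vv, sizes_um2
    ```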

  4. The Computer as a Tool: Basis for a University-Wide Computer Literacy Course.

    ERIC Educational Resources Information Center

    Helms, Susan

    1985-01-01

    When a core curriculum degree requirement in computer literacy was approved at Hardin-Simmons University (a small, private institution), a committee of faculty members developed the course specifications. They include a weekly one-hour lecture and a two-hour laboratory to achieve eight educational goals. These goals and the course format are…

  5. Accurate computations of the structures and binding energies of the imidazole⋯benzene and pyrrole⋯benzene complexes

    NASA Astrophysics Data System (ADS)

    Ahnen, Sandra; Hehn, Anna-Sophia; Vogiatzis, Konstantinos D.; Trachsel, Maria A.; Leutwyler, Samuel; Klopper, Wim

    2014-09-01

    Using explicitly-correlated coupled-cluster theory with single and double excitations, the intermolecular distances and interaction energies of the T-shaped imidazole⋯benzene and pyrrole⋯benzene complexes have been computed in a large augmented correlation-consistent quadruple-zeta basis set, adding also corrections for connected triple excitations and remaining basis-set-superposition errors. The results of these computations are used to assess other methods such as Møller-Plesset perturbation theory (MP2), spin-component-scaled MP2 theory, dispersion-weighted MP2 theory, interference-corrected explicitly-correlated MP2 theory, dispersion-corrected double-hybrid density-functional theory (DFT), DFT-based symmetry-adapted perturbation theory, the random-phase approximation, explicitly-correlated ring-coupled-cluster-doubles theory, and double-hybrid DFT with a correlation energy computed in the random-phase approximation.

  6. ModelDB in computational neuroscience education - a research tool as interactive educational media

    PubMed Central

    Morse, Thomas M.

    2013-01-01

    ModelDB's mission is to link computational models and publications, supporting the field of computational neuroscience (CNS) by making model source code readily available. It is continually expanding, and currently contains source code for more than 300 models that cover more than 41 topics. Investigators, educators, and students can use it to obtain working models that reproduce published results and can be modified to test for new domains of applicability. Users can browse ModelDB to survey the field of computational neuroscience, or pursue more focused explorations of specific topics. Here we describe tutorials and initial experiences with ModelDB as an interactive educational tool. PMID:25089156

  7. A visualization tool for parallel and distributed computing using the Lilith framework

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Wyckoff, P.

    1998-05-01

    The authors present a visualization tool for the monitoring and debugging of codes run in a parallel and distributed computing environment, called Lilith Lights. This tool can be used both for debugging parallel codes as well as for resource management of clusters. It was developed under Lilith, a framework for creating scalable software tools for distributed computing. The use of Lilith provides scalable, non-invasive debugging, as opposed to other commonly used software debugging and visualization tools. Furthermore, by implementing the visualization tool in software rather than in hardware (as available on some MPPs), Lilith Lights is easily transferable to other machines, and well adapted for use on distributed clusters of machines. The information provided in a clustered environment can further be used for resource management of the cluster. In this paper, they introduce Lilith Lights, discussing its use on the Computational Plant cluster at Sandia National Laboratories, show its design and development under the Lilith framework, and present metrics for resource use and performance.

  8. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  9. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  10. Examining the effects of computational tools on students' understanding of thermodynamics of material concepts and representations

    NASA Astrophysics Data System (ADS)

    Ogunwuyi, Oluwatosin

    Technology is becoming a more critical agent for supporting learning as well as research in science and engineering. In particular, technology-based tools in the form of simulations and virtual environments support learning using mathematical models and computational methods. The purpose of this research is to: (a) measure the value added in conveying Thermodynamics of materials concepts with a blended learning environment using computational simulation tools with lectures; and (b) characterize students' use of representational forms to convey their conceptual understanding of core concepts within a learning environment that blended Gibbs computational resource and traditional lectures. A mix-method approach was implemented that included the use of statistical analysis to compare student test performance as a result of interacting with Gibbs tool and the use of Grounded Theory inductive analysis to explore students' use of representational forms to express their understanding of thermodynamics of material concepts. Results for the quantitative study revealed positive gains in students' conceptual understanding before and after interacting with Gibbs tool for the majority of the concepts tested. In addition, insight gained from the qualitative analysis helped provide understanding about how students utilized representational forms in communicating their understanding of thermodynamics of material concepts. Knowledge of how novice students construct meaning in this context will provide insight for engineering education instructors and researchers in understanding students' learning processes in the context of educational environments that integrate expert simulation tools as part of their instructional resources for foundational domain knowledge.

  11. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses because they should be effective in increasing student knowledge and positively impacting motivation and learning strategies without increasing costs. This…

  12. The Computer as an Experimental Tool in Teaching Mathematics. Dissemination Packet--Summer 1989: Booklet #9.

    ERIC Educational Resources Information Center

    Hastings, Harold M.; And Others

    This booklet is the ninth in a series of nine from the Teacher Training Institute at Hofstra University (New York) and describes the content and the approach of an institute course in which the participants use the personal computer as a personal tool within the mathematical discovery process of making conjectures, testing those conjectures, and…

  13. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  14. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-Aided Process Engineering has become established in industry as a design tool, with the establishment of the CAPE-OPEN software specifications for process simulation environments. CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  15. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  16. Computers as Pedagogical Tools in Brazil: A Pseudo-Panel Analysis

    ERIC Educational Resources Information Center

    Sprietsma, Maresa

    2012-01-01

    The number of schools that have access to computers and the Internet has increased rapidly since the beginning of the 1990s. However, evidence of their effectiveness as pedagogical tools to acquire reading and math skills is still the object of debate. We use repeated cross-section data from Brazil to evaluate the effect of the availability of a…

  17. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  18. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    ERIC Educational Resources Information Center

    Kees, Martin; Schor, Clifton

    1981-01-01

    An inexpensive computer graphics systems (Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating with graphical and statistical techniques and treating with lenses, prisms, and orthoptics various anomalies of binocular vision. (MLW)

  19. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    ERIC Educational Resources Information Center

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…

  20. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a base of videos whose aim is to support the development of tools to support genuine social inclusion of the deaf.

  1. An Efficient Multiplex PCR-Based Assay as a Novel Tool for Accurate Inter-Serovar Discrimination of Salmonella Enteritidis, S. Pullorum/Gallinarum and S. Dublin

    PubMed Central

    Xiong, Dan; Song, Li; Tao, Jing; Zheng, Huijuan; Zhou, Zihao; Geng, Shizhong; Pan, Zhiming; Jiao, Xinan

    2017-01-01

    Salmonella enterica serovars Enteritidis, Pullorum/Gallinarum, and Dublin are infectious pathogens causing serious problems for pig, chicken, and cattle production, respectively. Traditional serotyping for Salmonella is costly and labor-intensive. Here, we established a rapid multiplex PCR method to simultaneously identify the three prevalent Salmonella serovars Enteritidis, Pullorum/Gallinarum, and Dublin individually for the first time. The multiplex PCR-based assay focuses on three genes: tcpS, lygD, and flhB. Gene tcpS exists only in the three Salmonella serovars, and lygD exists only in S. Enteritidis, while a truncated region of the flhB gene is only found in S. Pullorum/Gallinarum. The sensitivity and specificity of the multiplex PCR assay using three pairs of specific primers for these genes were evaluated. The results showed that this multiplex PCR method could accurately identify Salmonella Enteritidis, Pullorum/Gallinarum, and Dublin from eight non-Salmonella species and 27 Salmonella serovars. The lowest detectable concentration of genomic DNA was 58.5 pg/μL, and the lowest detectable cell count was 100 CFU. Subsequently, this developed method was used to analyze clinical Salmonella isolates from one pig farm, one chicken farm, and one cattle farm. The results showed that blinded PCR testing of Salmonella isolates from the three farms was in concordance with the traditional serotyping tests, indicating that the newly developed multiplex PCR system could be used as a novel tool to accurately distinguish the three specific Salmonella serovars individually, which is useful, especially in high-throughput screening. PMID:28360901
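
    Based on the marker logic stated in the abstract (tcpS present in all three serovars, lygD only in S. Enteritidis, and the truncated flhB region only in S. Pullorum/Gallinarum), a hypothetical decision rule for interpreting the three amplicons might look like the sketch below; it is illustrative only and is not the authors' assay software.

    ```python
    # Hypothetical interpretation of a three-amplicon multiplex PCR result, following the
    # marker logic described in the abstract. Band calls are booleans (present/absent).
    def call_serovar(tcpS: bool, lygD: bool, flhB_truncated: bool) -> str:
        if not tcpS:
            return "not S. Enteritidis / Pullorum-Gallinarum / Dublin"
        if lygD:
            return "S. Enteritidis"
        if flhB_truncated:            # amplicon diagnostic of the truncated flhB variant
            return "S. Pullorum/Gallinarum"
        return "S. Dublin"

    print(call_serovar(tcpS=True, lygD=False, flhB_truncated=True))  # S. Pullorum/Gallinarum
    ```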

  2. A Simulation Tool for the Duties of Computer Specialist Non-Commissioned Officers on a Turkish Air Force Base

    DTIC Science & Technology

    2009-09-01

    Thesis prepared at the MOVES Institute, dated September 2009. The stated purpose is to develop a simulation tool based on a prototypical model of the computer system specialist non-commissioned officers' jobs on a Turkish Air Force Base.

  3. Drug Metabolism in Preclinical Drug Development: A Survey of the Discovery Process, Toxicology, and Computational Tools.

    PubMed

    Issa, Naiem T; Wathieu, Henri; Ojo, Abiola; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2017-03-15

    Increased R & D spending and high failure rates exist in drug development, due in part to inadequate prediction of drug metabolism and its consequences in the human body. Hence, there is a need for computational methods to supplement and complement current biological assessment strategies. In this review, we provide an overview of drug metabolism in pharmacology, and discuss the current in vitro and in vivo strategies for assessing drug metabolism in preclinical drug development. We highlight computational tools available to the scientific community for the in silico prediction of drug metabolism, and examine how these tools have been implemented to produce drug-target signatures relevant to metabolic routes. Computational workflows that assess drug metabolism and its toxicological and pharmacokinetic effects, such as by applying the adverse outcome pathway framework for risk assessment, may improve the efficiency and speed of preclinical drug development.

  4. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/.

  5. SHADING MASK: A computer-based teaching tool for sun shading devices

    SciTech Connect

    Setiadarma, E.; Schiler, M.

    1996-10-01

    Sun shading devices affect natural lighting, ventilation, solar gain, and overall building performance. Few architecture students, architects, and designers have applied solar shading as a useful tool to reduce glare, control light intensity and radiation, and minimize the cooling load in their projects. SHADING MASK is a computer-based teaching tool that uses Edward Mazria's rectangular sun path diagram as a basis. The tool explains the basic theory of solar control; generates sun path diagrams; allows the design of overhang, fin, and eggcrate types of shading devices; calculates solar angles and shading masks; and provides case study examples of actual buildings. It is a demonstration of how to integrate theory into a teaching/simulation tool to make important solar control information easily accessible to students, architects, and designers.
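
    As a hedged illustration of the kind of solar-angle calculation such a tool performs (standard textbook formulas, not the SHADING MASK implementation), solar altitude and azimuth can be computed from latitude, declination, and hour angle in a few lines:

    ```python
    # Illustrative solar-position calculation using the common approximate formulas.
    # Angles in degrees; hour angle = 15 * (solar_time - 12). Not the SHADING MASK code.
    import math

    def solar_position(latitude_deg, day_of_year, solar_time_h):
        decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
        lat, dec = math.radians(latitude_deg), math.radians(decl)
        h = math.radians(15.0 * (solar_time_h - 12.0))          # hour angle
        sin_alt = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(h)
        alt = math.asin(sin_alt)
        cos_az = (math.sin(dec) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
        az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))  # measured from north, 0..180
        if solar_time_h > 12.0:                                  # mirror to the west after solar noon
            az = 360.0 - az
        return math.degrees(alt), az

    print(solar_position(latitude_deg=34.0, day_of_year=172, solar_time_h=15.0))
    ```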

  6. Extending peripersonal space representation without tool-use: evidence from a combined behavioral-computational approach

    PubMed Central

    Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa

    2015-01-01

    Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biological-plausible model to explain plasticity in PPS representation after tool-use, which is
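
    A toy numerical illustration of the mechanism described above (synchronous tactile-auditory pairing strengthening far-space auditory synapses onto a multisensory unit, so the response boundary moves outward) is sketched below. It is a minimal Hebbian caricature under assumed parameters, not the authors' neural network model.

    ```python
    # Toy Hebbian sketch of peripersonal-space extension: a multisensory unit receives a
    # tactile input at the hand and auditory inputs arrayed by distance from the hand.
    # Synchronous tactile + far-auditory pairing strengthens the far auditory weights.
    import numpy as np

    distances_cm = np.arange(5, 105, 5)              # auditory stimulus distances from the hand
    w = np.exp(-distances_cm / 20.0)                 # initial weights: strong only near the hand
    eta = 0.05                                       # learning rate (arbitrary)

    def train(w, far_index, n_pairings, synchronous=True):
        w = w.copy()
        for _ in range(n_pairings):
            tactile = 1.0                            # touch on the hand
            auditory = np.zeros_like(w)
            auditory[far_index] = 1.0                # sound delivered in the far space
            if synchronous:                          # Hebbian update only for co-active inputs
                w += eta * tactile * auditory
        return np.clip(w, 0.0, 1.0)

    far = len(distances_cm) - 1
    w_sync = train(w, far, n_pairings=20, synchronous=True)
    w_async = train(w, far, n_pairings=20, synchronous=False)
    print("far-space weight before:", round(w[far], 3),
          "after synchronous pairing:", round(w_sync[far], 3),
          "after asynchronous stimulation:", round(w_async[far], 3))
    ```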

  7. Extending peripersonal space representation without tool-use: evidence from a combined behavioral-computational approach.

    PubMed

    Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa

    2015-01-01

    Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biological-plausible model to explain plasticity in PPS representation after tool-use, which is

  8. TepiTool: A Pipeline for Computational Prediction of T Cell Epitope Candidates.

    PubMed

    Paul, Sinu; Sidney, John; Sette, Alessandro; Peters, Bjoern

    2016-08-01

    Computational prediction of T cell epitope candidates is currently being used in several applications including vaccine discovery studies, development of diagnostics, and removal of unwanted immune responses against protein therapeutics. There have been continuous improvements in the performance of MHC binding prediction tools, but their general adoption by immunologists has been slow due to the lack of user-friendly interfaces and guidelines. Current tools only provide minimal advice on what alleles to include, what lengths to consider, how to deal with homologous peptides, and what cutoffs should be considered relevant. This protocol provides step-by-step instructions with necessary recommendations for prediction of the best T cell epitope candidates with the newly developed online tool called TepiTool. TepiTool, which is part of the Immune Epitope Database (IEDB), provides some of the top MHC binding prediction algorithms for a number of species including humans, chimpanzees, bovines, gorillas, macaques, mice, and pigs. TepiTool is freely accessible at http://tools.iedb.org/tepitool/. © 2016 by John Wiley & Sons, Inc.

  9. Computation of Accurate Activation Barriers for Methyl-Transfer Reactions of Sulfonium and Ammonium Salts in Aqueous Solution.

    PubMed

    Gunaydin, Hakan; Acevedo, Orlando; Jorgensen, William L; Houk, K N

    2007-05-01

    The energetics of methyl-transfer reactions from dimethylammonium, tetramethylammonium, and trimethylsulfonium to dimethylamine were computed with density functional theory, MP2, CBS-QB3, and quantum mechanics/molecular mechanics (QM/MM) Monte Carlo methods. At the CBS-QB3 level, the gas-phase activation enthalpies are computed to be 9.9, 15.3, and 7.9 kcal/mol, respectively. MP2/6-31+G(d,p) activation enthalpies are in best agreement with the CBS-QB3 results. The effects of aqueous solvation on these reactions were studied with polarizable continuum model, generalized Born/surface area (GB/SA), and QM/MM Monte Carlo simulations utilizing free-energy perturbation theory in which the PDDG/PM3 semiempirical Hamiltonian for the QM and explicit TIP4P water molecules in the MM region were used. In the aqueous phase, all of these reactions proceed more slowly when compared to the gas phase, since the charged reactants are stabilized more than the transition structure geometries with delocalized positive charges. In order to obtain the aqueous-phase activation free energies, the gas-phase activation free energies were corrected with the solvation free energies obtained from single-point conductor-like polarizable continuum model and GB/SA calculations for the stationary points along the reaction coordinate.

  10. Combining computer algorithms with experimental approaches permits the rapid and accurate identification of T cell epitopes from defined antigens.

    PubMed

    Schirle, M; Weinschenk, T; Stevanović, S

    2001-11-01

    The identification of T cell epitopes from immunologically relevant antigens remains a critical step in the development of vaccines and methods for monitoring of T cell responses. This review presents an overview of strategies that employ computer algorithms for the selection of candidate peptides from defined proteins and subsequent verification of their in vivo relevance by experimental approaches. Several computer algorithms are currently being used for epitope prediction of various major histocompatibility complex (MHC) class I and II molecules, based either on the analysis of natural MHC ligands or on the binding properties of synthetic peptides. Moreover, the analysis of proteasomal digests of peptides and whole proteins has led to the development of algorithms for the prediction of proteasomal cleavages. In order to verify the generation of the predicted peptides during antigen processing in vivo as well as their immunogenic potential, several experimental approaches have been pursued in the recent past. Mass spectrometry-based bioanalytical approaches have been used specifically to detect predicted peptides among isolated natural ligands. Other strategies employ various methods for the stimulation of primary T cell responses against the predicted peptides and subsequent testing of the recognition pattern towards target cells that express the antigen.

  11. Lilith: A Java framework for the development of scalable tools for high performance distributed computing platforms

    SciTech Connect

    Evensky, D.A.; Gentile, A.C.; Armstrong, R.C.

    1998-03-19

    Increasingly, high performance computing constitutes the use of very large heterogeneous clusters of machines. The use and maintenance of such clusters are subject to complexities of communication between the machines in a time-efficient and secure manner. Lilith is a general purpose tool that provides a highly scalable, secure, and easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. Lilith is written in Java, taking advantage of Java's unique features of loading and distributing code dynamically, its platform independence, its thread support, and its provision of graphical components to facilitate easy-to-use resultant tools. The authors describe the use of Lilith in a tool developed for the maintenance of the large distributed cluster at their institution and present details of the Lilith architecture and user API for the general user development of scalable tools.

  12. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S. and Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues to apply our information process infrastructure, which are accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Security Socket Layer-Virtual Private Network) technology for the access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen the security. Also, we set fine access control policy to shared tools and data and used shared key based encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as user interface and developed Web application to provide functions to support sharing tools and data. By using WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through the Windows-like folder environment. We implemented the prototype system in Grid infrastructure for atomic energy research: AEGIS (Atomic Energy Grid Infrastructure) developed by CCSE/JAEA. The prototype system was applied for the trial use in the first period of GNEP.

  13. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to examine likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network-based methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
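
    A simplified, illustrative path-interruption calculation in the spirit of EASI-style physical protection models is sketched below. It is not the tool described above: the response force time is treated as deterministic, detection is assumed to occur at the end of each path element, and all probabilities and delays are placeholders.

    ```python
    # Simplified adversary-path analysis: each path element has a detection probability
    # and a delay time; interruption requires detection early enough that the remaining
    # delay exceeds the (deterministic) response force time. Illustrative only.
    def prob_interruption(detect_p, delay_s, response_time_s):
        """detect_p[i], delay_s[i]: detection probability and delay of path element i,
        ordered from the adversary's entry point toward the target."""
        remaining = [sum(delay_s[i + 1:]) for i in range(len(delay_s))]  # delay left after element i
        p_not_detected_yet = 1.0
        p_interrupt = 0.0
        for p, rem in zip(detect_p, remaining):
            if rem > response_time_s:          # detection here still leaves time to respond
                p_interrupt += p_not_detected_yet * p
            p_not_detected_yet *= (1.0 - p)
        return p_interrupt

    # The most critical path is the candidate with the lowest interruption probability.
    paths = {
        "fence-door-vault": ([0.5, 0.7, 0.9], [30.0, 60.0, 120.0]),
        "roof-hatch-vault": ([0.2, 0.6, 0.9], [20.0, 40.0, 120.0]),
    }
    scores = {name: prob_interruption(p, d, response_time_s=150.0) for name, (p, d) in paths.items()}
    print(min(scores, key=scores.get), scores)
    ```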

  14. Accurately computing the optical pathlength difference for a michelson interferometer with minimal knowledge of the source spectrum.

    PubMed

    Milman, Mark H

    2005-12-01

    Astrometric measurements using stellar interferometry rely on precise measurement of the central white light fringe to accurately obtain the optical pathlength difference of incoming starlight to the two arms of the interferometer. One standard approach to stellar interferometry uses a channeled spectrum to determine phases at a number of different wavelengths that are then converted to the pathlength delay. When throughput is low these channels are broadened to improve the signal-to-noise ratio. Ultimately the ability to use monochromatic models and algorithms in each of the channels to extract phase becomes problematic and knowledge of the spectrum must be incorporated to achieve the accuracies required of the astrometric measurements. To accomplish this an optimization problem is posed to estimate simultaneously the pathlength delay and spectrum of the source. Moreover, the nature of the parameterization of the spectrum that is introduced circumvents the need to solve directly for these parameters so that the optimization problem reduces to a scalar problem in just the pathlength delay variable. A number of examples are given to show the robustness of the approach.
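
    For context, a minimal sketch of the conventional per-channel monochromatic approach that the paper improves upon: with unwrapped channel phases and channel wavenumbers, the pathlength delay follows from a linear least-squares fit of phase against 2*pi*wavenumber. The assumptions (noise-free unwrapping, narrow channels, synthetic data) are illustrative, and this is not the joint delay/spectrum estimator developed in the paper.

    ```python
    # Illustrative recovery of an optical pathlength difference (OPD) from per-channel
    # fringe phases, assuming the monochromatic model phi = 2*pi*sigma*OPD per channel.
    import numpy as np

    true_opd_um = 3.2
    wavelengths_um = np.linspace(0.5, 0.9, 8)            # channel center wavelengths
    sigma = 1.0 / wavelengths_um                         # wavenumbers (1/um)
    phases = 2 * np.pi * sigma * true_opd_um             # ideal unwrapped channel phases
    phases += np.random.default_rng(0).normal(0, 0.02, sigma.size)  # small phase noise

    # Least-squares slope of phi versus 2*pi*sigma gives the OPD estimate.
    A = (2 * np.pi * sigma)[:, None]
    opd_hat, *_ = np.linalg.lstsq(A, phases, rcond=None)
    print(f"estimated OPD = {opd_hat[0]:.4f} um (true {true_opd_um} um)")
    ```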

  15. MegaMiner: A Tool for Lead Identification Through Text Mining Using Chemoinformatics Tools and Cloud Computing Environment.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu

    2015-01-01

    Virtual screening is an indispensable tool for coping with the massive amounts of data generated by high-throughput omics technologies. With the objective of enhancing the automation of the virtual screening process, a robust portal termed MegaMiner has been built on a cloud computing platform, wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like, and docking scores. Textual representation of chemical structural data is fraught with ambiguity in the absence of a global identifier. We have used a combination of statistical models, a chemical dictionary, and regular expressions to build a disease-specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria has been carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds, which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool not only for identifying hidden relationships between various biological and chemical entities but also for building better corpora and ontologies.

  16. Video analysis of projectile motion using tablet computers as experimental tools

    NASA Astrophysics Data System (ADS)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
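
    The analysis step in such a tablet-based experiment amounts to fitting the vertical position data with a parabola and reading g from the quadratic coefficient. A minimal sketch, using synthetic data in place of real tracker output, is:

    ```python
    # Minimal sketch of extracting g from vertical-position video data: fit y(t) with a
    # parabola y = a*t^2 + b*t + c, then g = -2*a. Synthetic data stand in for tracker output.
    import numpy as np

    g_true = 9.81
    t = np.linspace(0.0, 1.0, 30)                            # frame times (s)
    y = 1.0 + 4.0 * t - 0.5 * g_true * t**2                  # ideal vertical positions (m)
    y += np.random.default_rng(1).normal(0, 0.005, t.size)   # ~5 mm tracking noise

    a, b, c = np.polyfit(t, y, deg=2)
    print(f"estimated g = {-2 * a:.2f} m/s^2")
    ```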

  17. Study on computer controlled polishing machine with small air bag tool

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Ni, Ying; Yu, Jing-chi

    2007-12-01

    Laser and infrared optical technologies have developed rapidly in recent years, and the small aspheric lenses of φ30 to 100 mm commonly used in such optical systems are in large demand. However, computer controlled polishing of small-batch aspheric lenses remains a bottleneck that hinders the development of these technologies. In this article, computer controlled optical surfacing (CCOS) is used to address the polishing of small batches of aspheric lenses. First, material removal by a computer controlled small polishing tool is simulated in detail. Then, according to the simulation results, the polishing correction is completed by adjusting the tool's dwell (resident) time function. Finally, a 70 mm aspheric lens is polished efficiently to an accuracy of 0.45 μm in surface shape and 2.687 nm in roughness on our home-made computer controlled polishing machine, which has three universal driving shafts. The efficiency of batch manufacturing of small aspheric lenses is thereby remarkably improved.
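
    The dwell-time adjustment underlying CCOS can be caricatured as deconvolving the desired removal map with the tool influence function. The toy 1-D multiplicative-update sketch below is illustrative only (assumed Gaussian influence function, arbitrary target, no constraints) and is not the machine's control software.

    ```python
    # Toy 1-D dwell-time iteration for CCOS-style polishing: removal = TIF convolved with
    # the dwell map, and the dwell map is corrected multiplicatively toward the target.
    import numpy as np

    x = np.linspace(-1, 1, 200)
    tif = np.exp(-(np.linspace(-0.2, 0.2, 41) / 0.07) ** 2)   # assumed Gaussian tool influence function
    tif /= tif.sum()
    target = 1.0 + 0.3 * np.cos(3 * np.pi * x)                # desired removal depth (arbitrary units)

    dwell = np.full_like(x, target.mean())                    # initial uniform dwell map
    for _ in range(50):
        achieved = np.convolve(dwell, tif, mode="same")       # predicted removal for this dwell map
        dwell *= target / np.maximum(achieved, 1e-9)          # multiplicative correction (keeps dwell >= 0)

    residual = target - np.convolve(dwell, tif, mode="same")
    print(f"peak-to-valley residual after iteration: {residual.max() - residual.min():.4f}")
    ```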

  18. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  19. IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  20. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
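
    A discrete-event batch-queue simulation of the general kind described can be sketched in a few dozen lines. The version below (strict FCFS, no backfill, priorities, or fair-share, with synthetic jobs) is a hedged illustration, not the NCCS tool.

    ```python
    # Minimal FCFS batch-queue simulator: jobs request a number of nodes and a runtime,
    # and start only when they reach the head of the queue and enough nodes are free.
    import heapq
    from collections import deque

    def simulate(jobs, total_nodes):
        """jobs: list of (arrival_time, nodes_requested, runtime). Returns per-job wait times."""
        events = [(t, 0, i) for i, (t, n, r) in enumerate(jobs)]   # (time, 0=arrival/1=completion, job id)
        heapq.heapify(events)
        free = total_nodes
        queue = deque()
        waits = {}
        while events:
            now, kind, jid = heapq.heappop(events)
            if kind == 0:                       # arrival: join the queue
                queue.append(jid)
            else:                               # completion: release nodes
                free += jobs[jid][1]
            # Strict FCFS: start jobs from the head while they fit; a blocked head blocks everyone.
            while queue and jobs[queue[0]][1] <= free:
                j = queue.popleft()
                arrival, nodes, runtime = jobs[j]
                waits[j] = now - arrival
                free -= nodes
                heapq.heappush(events, (now + runtime, 1, j))
        return [waits[i] for i in range(len(jobs))]

    jobs = [(0.0, 64, 3600.0), (10.0, 128, 7200.0), (20.0, 32, 1800.0)]
    print(simulate(jobs, total_nodes=128))      # wait times in seconds
    ```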

  1. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    ERIC Educational Resources Information Center

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  2. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  3. Noncontrast computed tomography can predict the outcome of shockwave lithotripsy via accurate stone measurement and abdominal fat distribution determination.

    PubMed

    Geng, Jiun-Hung; Tu, Hung-Pin; Shih, Paul Ming-Chen; Shen, Jung-Tsung; Jang, Mei-Yu; Wu, Wen-Jen; Li, Ching-Chia; Chou, Yii-Her; Juan, Yung-Shun

    2015-01-01

    Urolithiasis is a common disease of the urinary system. Extracorporeal shockwave lithotripsy (SWL) has become one of the standard treatments for renal and ureteral stones; however, the success rates range widely and failure of stone disintegration may cause additional outlay, alternative procedures, and even complications. We used the data available from noncontrast abdominal computed tomography (NCCT) to evaluate the impact of stone parameters and abdominal fat distribution on calculus-free rates following SWL. We retrospectively reviewed 328 patients who had urinary stones and had undergone SWL from August 2012 to August 2013. All of them received pre-SWL NCCT; 1 month after SWL, radiography was arranged to evaluate the condition of the fragments. These patients were classified into stone-free group and residual stone group. Unenhanced computed tomography variables, including stone attenuation, abdominal fat area, and skin-to-stone distance (SSD) were analyzed. In all, 197 (60%) were classified as stone-free and 132 (40%) as having residual stone. The mean ages were 49.35 ± 13.22 years and 55.32 ± 13.52 years, respectively. On univariate analysis, age, stone size, stone surface area, stone attenuation, SSD, total fat area (TFA), abdominal circumference, serum creatinine, and the severity of hydronephrosis revealed statistical significance between these two groups. From multivariate logistic regression analysis, the independent parameters impacting SWL outcomes were stone size, stone attenuation, TFA, and serum creatinine. [Adjusted odds ratios and (95% confidence intervals): 9.49 (3.72-24.20), 2.25 (1.22-4.14), 2.20 (1.10-4.40), and 2.89 (1.35-6.21) respectively, all p < 0.05]. In the present study, stone size, stone attenuation, TFA and serum creatinine were four independent predictors for stone-free rates after SWL. These findings suggest that pretreatment NCCT may predict the outcomes after SWL. Consequently, we can use these predictors for selecting

  4. Computational thermodynamics, Gaussian processes and genetic algorithms: combined tools to design new alloys

    NASA Astrophysics Data System (ADS)

    Tancret, F.

    2013-06-01

    A new alloy design procedure is proposed, combining in a single computational tool several modelling and predictive techniques that have already been used and assessed in the field of materials science and alloy design: a genetic algorithm is used to optimize the alloy composition for target properties and performance on the basis of the prediction of mechanical properties (estimated by Gaussian process regression of data on existing alloys) and of microstructural constitution, stability and processability (evaluated by computational thermodynamics). These tools are integrated in a single Matlab programme. An example is given in the case of the design of a new nickel-base superalloy for future power plant applications (such as the ultra-supercritical (USC) coal-fired plant, or the high-temperature gas-cooled nuclear reactor (HTGCR or HTGR)), where the selection criteria include cost, oxidation and creep resistance around 750 °C, long-term stability at service temperature, forgeability, weldability, etc.
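
    A hedged sketch of the design loop described above is given below: a genetic algorithm searches composition space, a Gaussian-process surrogate (trained on a handful of synthetic "measured" alloys) predicts the property to maximize, and a stand-in constraint plays the role of the computational-thermodynamics stability screen. All data, functions, and parameters are hypothetical; the authors' tool is implemented in Matlab.

    ```python
    # Illustrative GA + Gaussian-process surrogate loop for alloy composition design.
    import numpy as np

    rng = np.random.default_rng(0)

    def gp_fit_predict(X, y, Xq, length=0.3, noise=1e-3):
        """Plain RBF-kernel GP regression; returns the posterior mean at query points Xq."""
        def k(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-0.5 * d2 / length**2)
        K = k(X, X) + noise * np.eye(len(X))
        alpha = np.linalg.solve(K, y)
        return k(Xq, X) @ alpha

    # Synthetic "training alloys": 2 composition variables -> one property (e.g. creep strength).
    X_train = rng.random((25, 2))
    y_train = np.sin(3 * X_train[:, 0]) + 0.5 * X_train[:, 1] ** 2   # hypothetical response

    def stable(x):              # stand-in for a CALPHAD-style phase-stability screen
        return x.sum() <= 1.2

    def fitness(pop):
        mean = gp_fit_predict(X_train, y_train, pop)
        penalty = np.array([0.0 if stable(x) else 1e3 for x in pop])
        return mean - penalty

    # Simple genetic algorithm: truncation selection plus Gaussian mutation.
    pop = rng.random((40, 2))
    for gen in range(30):
        f = fitness(pop)
        parents = pop[np.argsort(f)[-20:]]                   # keep the best half
        children = parents + rng.normal(0, 0.05, parents.shape)
        pop = np.clip(np.vstack([parents, children]), 0.0, 1.0)

    best = pop[np.argmax(fitness(pop))]
    print("best composition (normalized variables):", np.round(best, 3))
    ```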

  5. Mobile computing device as tools for college student education: a case on flashcards application

    NASA Astrophysics Data System (ADS)

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to memorize large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology has enabled flashcards to be viewed on computers, for example as slides in PowerPoint, serving as channels for drill and feedback for learners. The current generation of students is more proficient with information technology and mobile computing devices; for example, they use their mobile phones intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flashcards.

  6. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Technical Reports Server (NTRS)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
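
    As a hedged illustration of the kind of link calculation such a capacity model performs, a basic downlink budget can be expressed in a few lines. The EIRP, G/T, frequency, range, data rate, and loss values below are placeholders, not values from the tool described above.

    ```python
    # Illustrative satellite downlink budget: C/N0 = EIRP + G/T - FSPL - losses - k (in dB).
    import math

    def cn0_dbhz(eirp_dbw, gt_dbk, freq_ghz, range_km, misc_losses_db=2.0):
        fspl_db = 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(range_km)  # free-space path loss
        boltzmann_dbw = -228.6                                                    # Boltzmann constant, dBW/(K*Hz)
        return eirp_dbw + gt_dbk - fspl_db - misc_losses_db - boltzmann_dbw

    cn0 = cn0_dbhz(eirp_dbw=34.0, gt_dbk=-12.0, freq_ghz=1.5, range_km=38000.0)
    ebn0 = cn0 - 10 * math.log10(4800.0)        # Eb/N0 for an assumed 4800 bit/s mobile voice channel
    print(f"C/N0 = {cn0:.1f} dB-Hz, Eb/N0 = {ebn0:.1f} dB")
    ```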

  7. Computational Tools for Predictive Modeling of Properties in Complex Actinide Systems

    SciTech Connect

    Autschbach, Jochen; Govind, Niranjan; Atta Fynn, Raymond; Bylaska, Eric J.; Weare, John H.; de Jong, Wibe A.

    2015-03-30

    In this chapter we focus on methodological and computational aspects that are key to accurately modeling the spectroscopic and thermodynamic properties of molecular systems containing actinides within the density functional theory (DFT) framework. Our focus is on properties that require either an accurate relativistic all-electron description or an accurate description of the dynamical behavior of actinide species in an environment at finite temperature, or both. The implementation of the methods and the calculations discussed in this chapter were done with the NWChem software suite (Valiev et al. 2010). In the first two sections we discuss two methods that account for relativistic effects, the ZORA and the X2C Hamiltonian. Section 1.2.1 discusses the implementation of the approximate relativistic ZORA Hamiltonian and its extension to magnetic properties. Section 1.3 focuses on the exact X2C Hamiltonian and the application of this methodology to obtain accurate molecular properties. In Section 1.4 we examine the role of a dynamical environment at finite temperature as well as the presence of other ions on the thermodynamics of hydrolysis and exchange reaction mechanisms. Finally, Section 1.5 discusses the modeling of XAS (EXAFS, XANES) properties in realistic environments accounting for both the dynamics of the system and (for XANES) the relativistic effects.

  8. Computational Study of the Reactions of Methanol with the Hydroperoxyl and Methyl Radicals. Part I: Accurate Thermochemistry and Barrier Heights

    SciTech Connect

    Alecu, I. M.; Truhlar, D. G.

    2011-04-07

    The reactions of CH3OH with the HO2 and CH3 radicals are important in the combustion of methanol and are prototypes for reactions of heavier alcohols in biofuels. The reaction energies and barrier heights for these reaction systems are computed with CCSD(T) theory extrapolated to the complete basis set limit using correlation-consistent basis sets, both augmented and unaugmented, and further refined by including a fully coupled treatment of the connected triple excitations, a second-order perturbative treatment of quadruple excitations (by CCSDT(2)Q), core–valence corrections, and scalar relativistic effects. It is shown that the M08-HX and M08-SO hybrid meta-GGA density functionals can achieve sub-kcal mol-1 agreement with the high-level ab initio results, identifying these functionals as important potential candidates for direct dynamics studies on the rates of these and homologous reaction systems.

  9. An efficient and accurate technique to compute the absorption, emission, and transmission of radiation by the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Lindner, Bernhard Lee; Ackerman, Thomas P.; Pollack, James B.

    1990-01-01

    CO2 comprises 95% of the composition of the Martian atmosphere. However, the Martian atmosphere also has a high aerosol content. Dust particles vary from less than 0.2 to greater than 3.0. CO2 is an active absorber and emitter at near-IR and IR wavelengths; the near-IR absorption bands of CO2 provide significant heating of the atmosphere, and the 15 micron band provides rapid cooling. Including both CO2 and aerosol radiative transfer simultaneously in a model is difficult. Aerosol radiative transfer requires a multiple scattering code, while CO2 radiative transfer must deal with complex wavelength structure. As an alternative to the pure-atmosphere treatment used in most models, which causes inaccuracies, a treatment was developed called the exponential sum or k-distribution approximation. The chief advantage of the exponential sum approach is that the integration over k space of f(k) can be computed more quickly than the integration of k_ν over frequency. The exponential sum approach is superior to the photon path distribution and emissivity techniques for dusty conditions. This study was the first application of the exponential sum approach to Martian conditions.
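
    A numerical illustration of the exponential-sum (k-distribution) idea: the band-averaged transmission computed line by line equals an integral over the cumulative k-distribution, which a handful of exponential terms approximates well. The absorption spectrum below is synthetic and purely illustrative; it is not the model developed in the paper.

    ```python
    # Band-averaged transmission two ways: "line-by-line" average versus a k-distribution
    # quadrature over the cumulative variable g in [0, 1]. Synthetic k values.
    import numpy as np

    rng = np.random.default_rng(2)
    k_nu = rng.lognormal(mean=0.0, sigma=1.5, size=20000)     # synthetic k_nu across a band
    u = 0.5                                                   # absorber amount (arbitrary units)

    T_lbl = np.mean(np.exp(-k_nu * u))                        # line-by-line band average

    g_nodes = np.linspace(0.025, 0.975, 20)                   # midpoints of 20 equal g-bins
    k_nodes = np.quantile(k_nu, g_nodes)                      # k reordered as a function of g
    T_kdist = np.mean(np.exp(-k_nodes * u))                   # 20-term exponential sum

    print(f"line-by-line: {T_lbl:.4f}   exponential sum (20 terms): {T_kdist:.4f}")
    ```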

  10. Assessment of the extended Koopmans' theorem for the chemical reactivity: Accurate computations of chemical potentials, chemical hardnesses, and electrophilicity indices.

    PubMed

    Yildiz, Dilan; Bozkaya, Uğur

    2016-01-30

    The extended Koopmans' theorem (EKT) provides a straightforward way to compute ionization potentials and electron affinities from any level of theory. Although it is widely applied to ionization potentials, the EKT approach has not been applied to evaluation of the chemical reactivity. We present the first benchmarking study to investigate the performance of the EKT methods for predictions of chemical potentials (μ) (hence electronegativities), chemical hardnesses (η), and electrophilicity indices (ω). We assess the performance of the EKT approaches for post-Hartree-Fock methods, such as Møller-Plesset perturbation theory, the coupled-electron pair theory, and their orbital-optimized counterparts for the evaluation of the chemical reactivity. Especially, results of the orbital-optimized coupled-electron pair theory method (with the aug-cc-pVQZ basis set) for predictions of the chemical reactivity are very promising; the corresponding mean absolute errors are 0.16, 0.28, and 0.09 eV for μ, η, and ω, respectively.
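
    For reference, the three reactivity indices discussed above follow directly from an ionization potential and electron affinity under one common finite-difference convention (mu = -(IP + EA)/2, eta = IP - EA, omega = mu^2 / (2 eta)); some authors define the hardness with an extra factor of 1/2. The sketch below uses placeholder IP/EA values, not results from the paper.

    ```python
    # Conceptual-DFT reactivity indices from IP and EA (one common convention).
    def reactivity_indices(ip_ev, ea_ev):
        mu = -(ip_ev + ea_ev) / 2.0          # chemical potential (negative of electronegativity)
        eta = ip_ev - ea_ev                  # chemical hardness
        omega = mu**2 / (2.0 * eta)          # electrophilicity index
        return mu, eta, omega

    mu, eta, omega = reactivity_indices(ip_ev=9.0, ea_ev=1.0)   # hypothetical molecule
    print(f"mu = {mu:.2f} eV, eta = {eta:.2f} eV, omega = {omega:.2f} eV")
    ```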

  11. Staging of osteonecrosis of the jaw requires computed tomography for accurate definition of the extent of bony disease.

    PubMed

    Bedogni, Alberto; Fedele, Stefano; Bedogni, Giorgio; Scoletta, Matteo; Favia, Gianfranco; Colella, Giuseppe; Agrillo, Alessandro; Bettini, Giordana; Di Fede, Olga; Oteri, Giacomo; Fusco, Vittorio; Gabriele, Mario; Ottolenghi, Livia; Valsecchi, Stefano; Porter, Stephen; Petruzzi, Massimo; Arduino, Paolo; D'Amato, Salvatore; Ungari, Claudio; Fung Polly, Pok-Lam; Saia, Giorgia; Campisi, Giuseppina

    2014-09-01

    Management of osteonecrosis of the jaw associated with antiresorptive agents is challenging, and outcomes are unpredictable. The severity of disease is the main guide to management, and can help to predict prognosis. Most available staging systems for osteonecrosis, including the widely-used American Association of Oral and Maxillofacial Surgeons (AAOMS) system, classify severity on the basis of clinical and radiographic findings. However, clinical inspection and radiography are limited in their ability to identify the extent of necrotic bone disease compared with computed tomography (CT). We have organised a large multicentre retrospective study (known as MISSION) to investigate the agreement between the AAOMS staging system and the extent of osteonecrosis of the jaw (focal compared with diffuse involvement of bone) as detected on CT. We studied 799 patients with detailed clinical phenotyping who had CT images taken. Features of diffuse bone disease were identified on CT within all AAOMS stages (20%, 8%, 48%, and 24% of patients in stages 0, 1, 2, and 3, respectively). Of the patients classified as stage 0, 110/192 (57%) had diffuse disease on CT, and about 1 in 3 with CT evidence of diffuse bone disease was misclassified by the AAOMS system as having stages 0 and 1 osteonecrosis. In addition, more than a third of patients with AAOMS stage 2 (142/405, 35%) had focal bone disease on CT. We conclude that the AAOMS staging system does not correctly identify the extent of bony disease in patients with osteonecrosis of the jaw.

  12. F18-fluorodeoxyglucose-positron emission tomography and computed tomography is not accurate in preoperative staging of gastric cancer

    PubMed Central

    Ha, Tae Kyung; Choi, Yun Young; Song, Soon Young

    2011-01-01

    Purpose To investigate the clinical benefits of F18-fluorodeoxyglucose-positron emission tomography and computed tomography (18F-FDG-PET/CT) over multi-detector row CT (MDCT) in preoperative staging of gastric cancer. Methods FDG-PET/CT and MDCT were performed on 78 patients with gastric cancer pathologically diagnosed by endoscopy. The accuracy of radiologic staging was retrospectively compared with the pathologic results after curative resection. Results Primary tumors were detected in 51 (65.4%) patients with 18F-FDG-PET/CT and in 47 (60.3%) patients with MDCT. Regarding detection of lymph node metastasis, the sensitivity of FDG-PET/CT was 51.5% with an accuracy of 71.8%, whereas those of MDCT were 69.7% and 69.2%, respectively. The sensitivity of 18F-FDG-PET/CT for a primary tumor with signet ring cell carcinoma was lower than that of 18F-FDG-PET/CT for a primary tumor with non-signet ring cell carcinoma (35.3% vs. 73.8%, P < 0.01). Conclusion Due to its low sensitivity, 18F-FDG-PET/CT alone shows no definite clinical benefit for prediction of lymph node metastasis in preoperative staging of gastric cancer. PMID:22066108

  13. Summary Report on the Graded Prognostic Assessment: An Accurate and Facile Diagnosis-Specific Tool to Estimate Survival for Patients With Brain Metastases

    PubMed Central

    Sperduto, Paul W.; Kased, Norbert; Roberge, David; Xu, Zhiyuan; Shanley, Ryan; Luo, Xianghua; Sneed, Penny K.; Chao, Samuel T.; Weil, Robert J.; Suh, John; Bhatt, Amit; Jensen, Ashley W.; Brown, Paul D.; Shih, Helen A.; Kirkpatrick, John; Gaspar, Laurie E.; Fiveash, John B.; Chiang, Veronica; Knisely, Jonathan P.S.; Sperduto, Christina Maria; Lin, Nancy; Mehta, Minesh

    2012-01-01

    Purpose Our group has previously published the Graded Prognostic Assessment (GPA), a prognostic index for patients with brain metastases. Updates have been published with refinements to create diagnosis-specific Graded Prognostic Assessment indices. The purpose of this report is to present the updated diagnosis-specific GPA indices in a single, unified, user-friendly report to allow ease of access and use by treating physicians. Methods A multi-institutional retrospective (1985 to 2007) database of 3,940 patients with newly diagnosed brain metastases underwent univariate and multivariate analyses of prognostic factors associated with outcomes by primary site and treatment. Significant prognostic factors were used to define the diagnosis-specific GPA prognostic indices. A GPA of 4.0 correlates with the best prognosis, whereas a GPA of 0.0 corresponds with the worst prognosis. Results Significant prognostic factors varied by diagnosis. For lung cancer, prognostic factors were Karnofsky performance score, age, presence of extracranial metastases, and number of brain metastases, confirming the original Lung-GPA. For melanoma and renal cell cancer, prognostic factors were Karnofsky performance score and the number of brain metastases. For breast cancer, prognostic factors were tumor subtype, Karnofsky performance score, and age. For GI cancer, the only prognostic factor was the Karnofsky performance score. The median survival times by GPA score and diagnosis were determined. Conclusion Prognostic factors for patients with brain metastases vary by diagnosis, and for each diagnosis, a robust separation into different GPA scores was discerned, implying considerable heterogeneity in outcome, even within a single tumor type. In summary, these indices and related worksheet provide an accurate and facile diagnosis-specific tool to estimate survival, potentially select appropriate treatment, and stratify clinical trials for patients with brain metastases. PMID:22203767

  14. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as those found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.
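
    The prediction model described above is a probability-estimation classifier of the multinomial-logistic type over the genotypes of the six SNPs. The sketch below shows the general form of such a classifier with placeholder intercepts and slopes; it does not use the published IrisPlex coefficients, and the assignment of outcome categories is illustrative only.

      import math

      def eye_colour_probabilities(genotypes, beta_blue, beta_intermediate):
          # genotypes: minor-allele counts (0, 1 or 2) for the six SNPs.
          # Each beta vector holds an intercept followed by six slopes (placeholders).
          z_blue = beta_blue[0] + sum(b * g for b, g in zip(beta_blue[1:], genotypes))
          z_int = beta_intermediate[0] + sum(b * g for b, g in zip(beta_intermediate[1:], genotypes))
          denom = 1.0 + math.exp(z_blue) + math.exp(z_int)
          return {"blue": math.exp(z_blue) / denom,
                  "intermediate": math.exp(z_int) / denom,
                  "brown": 1.0 / denom}          # brown taken as the reference category here

      # Hypothetical genotypes and made-up model coefficients.
      genotypes = [2, 1, 0, 2, 1, 0]
      beta_blue = [1.0, 2.0, 0.7, 0.5, 0.4, 0.3, 0.2]
      beta_intermediate = [0.2, 0.9, 0.3, 0.2, 0.2, 0.1, 0.1]
      print(eye_colour_probabilities(genotypes, beta_blue, beta_intermediate))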

  15. Computational Genomics Tools for Copy-Number Fluctuations in Prostate Cancer

    DTIC Science & Technology

    2005-11-01

    “Catastrophes: The Brazilian Food-Poisoning Scenario & Beyond” • Bioinformatics Seminar, Cold Spring Harbor Laboratory, Long Island, NY, September 21, 2005...house Seminar, Cold Spring Harbor Laboratory, Long Island, NY, April 7, 2004. “Valis, Simpathica and NYU MAD: Computational and Systems Biology Tools”...Bioinformatics Seminar, Cold Spring Harbor Laboratory, Long Island, NY, March 10, 2004. “Identifying Differentially Expressed Genes via Multiscale

  16. Computer tools in the discovery of HIV-1 integrase inhibitors

    PubMed Central

    Liao, Chenzhong; Nicklaus, Marc C

    2010-01-01

    Computer-aided drug design (CADD) methodologies have made great advances and contributed significantly to the discovery and/or optimization of many clinically used drugs in recent years. CADD tools have likewise been applied to the discovery of inhibitors of HIV-1 integrase, a difficult and worthwhile target for the development of efficient anti-HIV drugs. This article reviews the application of CADD tools, including pharmacophore search, quantitative structure–activity relationships, model building of integrase complexed with viral DNA and quantum-chemical studies in the discovery of HIV-1 integrase inhibitors. Different structurally diverse integrase inhibitors have been identified by, or with significant help from, various CADD tools. PMID:21426160

  17. Realistic 3D computer model of the gerbil middle ear, featuring accurate morphology of bone and soft tissue structures.

    PubMed

    Buytaert, Jan A N; Salih, Wasil H M; Dierick, Manuel; Jacobs, Patric; Dirckx, Joris J J

    2011-12-01

    In order to improve realism in middle ear (ME) finite-element modeling (FEM), comprehensive and precise morphological data are needed. To date, micro-scale X-ray computed tomography (μCT) recordings have been used as geometric input data for FEM models of the ME ossicles. Previously, attempts were made to obtain these data on ME soft tissue structures as well. However, due to low X-ray absorption of soft tissue, quality of these images is limited. Another popular approach is using histological sections as data for 3D models, delivering high in-plane resolution for the sections, but the technique is destructive in nature and registration of the sections is difficult. We combine data from high-resolution μCT recordings with data from high-resolution orthogonal-plane fluorescence optical-sectioning microscopy (OPFOS), both obtained on the same gerbil specimen. State-of-the-art μCT delivers high-resolution data on the 3D shape of ossicles and other ME bony structures, while the OPFOS setup generates data of unprecedented quality both on bone and soft tissue ME structures. Each of these techniques is tomographic and non-destructive and delivers sets of automatically aligned virtual sections. The datasets coming from different techniques need to be registered with respect to each other. By combining both datasets, we obtain a complete high-resolution morphological model of all functional components in the gerbil ME. The resulting 3D model can be readily imported in FEM software and is made freely available to the research community. In this paper, we discuss the methods used, present the resulting merged model, and discuss the morphological properties of the soft tissue structures, such as muscles and ligaments.

  18. Highly Accurate Infrared Line Lists of SO2 Isotopologues Computed for Atmospheric Modeling on Venus and Exoplanets

    NASA Astrophysics Data System (ADS)

    Huang, X.; Schwenke, D.; Lee, T. J.

    2014-12-01

    Last year we reported a semi-empirical 32S16O2 spectroscopic line list (denoted Ames-296K) for atmospheric characterization of Venus and other exoplanetary environments. To facilitate determination of sulfur isotopic ratios and sulfur chemistry models, we now present Ames-296K line lists for the upgraded main isotopologue 626 and four other symmetric isotopologues: 636, 646, 666, and 828. The line lists are computed on an ab initio potential energy surface refined with the most reliable high-resolution experimental data, using a high-quality CCSD(T)/aug-cc-pV(Q+d)Z dipole moment surface. The most valuable part of our approach is that it provides "truly reliable" predictions (and alternatives) for unknown or hard-to-measure/analyze spectra. This strategy ensures that the lists are the best available alternative for the wide spectral regions missing from spectroscopic databases such as HITRAN and GEISA, where only very limited data exist for 626/646 and no infrared data at all for 636/666 or other minor isotopologues. Our general line-position accuracy up to 5000 cm-1 is 0.01 - 0.02 cm-1 or better, and most transition-intensity deviations are less than 5% compared with experimentally measured quantities. Note that we have solved a convergence issue and further improved the quality and completeness of the main isotopologue 626 list at 296 K. We will compare the lists to available models in CDMS/JPL/HITRAN and discuss future mutually beneficial interactions between theoretical and experimental efforts.

  19. Accelerating Design of Batteries Using Computer-Aided Engineering Tools (Presentation)

    SciTech Connect

    Pesaran, A.; Kim, G. H.; Smith, K.

    2010-11-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics tools to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  20. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    PubMed

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades, computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings.
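
    The a posteriori (Bayesian) adjustment performed by most of the surveyed programs amounts to a maximum a posteriori (MAP) fit of individual pharmacokinetic parameters given population priors and the measured concentrations, after which a dose can be chosen to reach the target range. The sketch below is a deliberately simplified one-compartment, IV-bolus illustration with invented numbers; it is not taken from any of the reviewed tools.

      import numpy as np
      from scipy.optimize import minimize

      def map_estimate(t_obs, c_obs, dose, pop_cl, pop_v, omega_cl, omega_v, sigma):
          # MAP estimate of clearance (CL) and volume (V) for C(t) = (dose/V) * exp(-(CL/V) t),
          # with log-normal parameter priors and a log-normal residual error model.
          def neg_log_posterior(theta):
              log_cl, log_v = theta
              cl, v = np.exp(log_cl), np.exp(log_v)
              c_pred = (dose / v) * np.exp(-(cl / v) * t_obs)
              residual = np.sum((np.log(c_obs) - np.log(c_pred)) ** 2) / (2 * sigma ** 2)
              prior = ((log_cl - np.log(pop_cl)) ** 2 / (2 * omega_cl ** 2)
                       + (log_v - np.log(pop_v)) ** 2 / (2 * omega_v ** 2))
              return residual + prior
          res = minimize(neg_log_posterior, x0=[np.log(pop_cl), np.log(pop_v)])
          return np.exp(res.x)     # individualized CL and V

      # Hypothetical patient: two concentrations measured after a 1000 mg bolus.
      cl, v = map_estimate(t_obs=np.array([2.0, 10.0]), c_obs=np.array([38.0, 12.0]),
                           dose=1000.0, pop_cl=5.0, pop_v=30.0,
                           omega_cl=0.3, omega_v=0.2, sigma=0.15)
      print(cl, v)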

  1. GlycoMinestruct: a new bioinformatics tool for highly accurate mapping of the human N-linked and O-linked glycoproteomes by incorporating structural features

    PubMed Central

    Li, Fuyi; Li, Chen; Revote, Jerico; Zhang, Yang; Webb, Geoffrey I.; Li, Jian; Song, Jiangning; Lithgow, Trevor

    2016-01-01

    Glycosylation plays an important role in cell-cell adhesion, ligand-binding and subcellular recognition. Current approaches for predicting protein glycosylation are primarily based on sequence-derived features, while little work has been done to systematically assess the importance of structural features to glycosylation prediction. Here, we propose a novel bioinformatics method called GlycoMinestruct(http://glycomine.erc.monash.edu/Lab/GlycoMine_Struct/) for improved prediction of human N- and O-linked glycosylation sites by combining sequence and structural features in an integrated computational framework with a two-step feature-selection strategy. Experiments indicated that GlycoMinestruct outperformed NGlycPred, the only predictor that incorporated both sequence and structure features, achieving AUC values of 0.941 and 0.922 for N- and O-linked glycosylation, respectively, on an independent test dataset. We applied GlycoMinestruct to screen the human structural proteome and obtained high-confidence predictions for N- and O-linked glycosylation sites. GlycoMinestruct can be used as a powerful tool to expedite the discovery of glycosylation events and substrates to facilitate hypothesis-driven experimental studies. PMID:27708373
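
    The AUC figures quoted above can be reproduced from a set of predicted scores and true labels with the rank-based (Mann-Whitney) estimator; the sketch below uses made-up scores and labels purely to show the calculation.

      def auc_from_scores(scores, labels):
          # Probability that a randomly chosen positive site scores higher than a
          # randomly chosen negative one (ties count one half).
          pos = [s for s, y in zip(scores, labels) if y == 1]
          neg = [s for s, y in zip(scores, labels) if y == 0]
          wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
          return wins / (len(pos) * len(neg))

      # Tiny hypothetical example: predicted glycosylation-site scores and labels.
      print(auc_from_scores([0.9, 0.8, 0.35, 0.6, 0.2], [1, 1, 0, 1, 0]))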

  2. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool to teach parallel programming. In this paper, we will focus on some fundamental features of aCe C.

  3. Development of computer-based analytical tool for assessing physical protection system

    SciTech Connect

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are currently available and can be used directly; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
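
    A toy sketch of the network approach follows: protection elements become weighted edges, and a shortest-path search exposes the path an adversary is most likely to traverse undetected. The layout, detection probabilities, and delays below are invented, and the metric considers only cumulative detection probability; a full effectiveness calculation would also weigh delay times against the response-force time.

      import math
      import networkx as nx

      # Toy facility graph: edge attributes are the probability of detection P(D)
      # and the delay (s) for crossing that protection element (illustrative values).
      G = nx.DiGraph()
      edges = [("offsite", "fence", 0.3, 10), ("fence", "building", 0.5, 30),
               ("offsite", "gate", 0.6, 20), ("gate", "building", 0.4, 15),
               ("building", "vault", 0.8, 60)]
      for u, v, p_detect, delay in edges:
          # Minimising the sum of -log(1 - P(D)) maximises the chance of crossing
          # every element undetected, i.e. it exposes the weakest path.
          G.add_edge(u, v, w=-math.log(1.0 - p_detect), delay=delay)

      path = nx.shortest_path(G, "offsite", "vault", weight="w")
      p_undetected = math.exp(-nx.shortest_path_length(G, "offsite", "vault", weight="w"))
      print(path, "cumulative detection probability:", 1.0 - p_undetected)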

  4. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  5. An effective computational tool for parametric studies and identification problems in materials mechanics

    NASA Astrophysics Data System (ADS)

    Bolzon, Gabriella; Buljak, Vladimir

    2011-12-01

    Parametric studies and identification problems require repeated analyses in which only a few input parameters are varied among those defining the problem of interest, often in association with complex numerical simulations. In fact, physical phenomena relevant to several practical applications involve coupled material and geometry non-linearities. In these situations, accurate but expensive computations, usually carried out by the finite element method, may be replaced by numerical procedures based on proper orthogonal decomposition combined with radial basis function interpolation. Besides drastically reducing computing times and costs, this approach is capable of retaining the essential features of the considered system responses while filtering out most disturbances. These features are illustrated in this paper with specific reference to some elastic-plastic problems. The presented results can, however, be easily extended to other meaningful engineering situations.
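
    A compact sketch of the reduced-order idea discussed above: full-field snapshots from a few expensive solves are compressed by proper orthogonal decomposition (POD, via the SVD), and the modal coefficients are interpolated over the parameter space with radial basis functions. The synthetic snapshot generator below merely stands in for finite element runs.

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      # Snapshot matrix: each column is a full-field solution for one parameter sample.
      rng = np.random.default_rng(1)
      params = rng.uniform(0.0, 1.0, size=(40, 2))          # 40 samples of 2 parameters
      x = np.linspace(0.0, 3.0, 500)
      snapshots = np.array([np.sin(x * (1 + p[0])) * (1 + p[1]) for p in params]).T

      # POD basis from the SVD; keep a few dominant modes.
      U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
      n_modes = 5
      Phi = U[:, :n_modes]
      amplitudes = Phi.T @ snapshots                        # modal coefficients per sample

      # RBF interpolation of the modal coefficients over the parameter space.
      rbf = RBFInterpolator(params, amplitudes.T)

      # Fast surrogate evaluation at a new parameter point (no expensive solve needed).
      field_new = Phi @ rbf(np.array([[0.35, 0.7]])).T
      print(field_new.shape)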

  6. Evaluation of a Tool to Categorize Patients by Reading Literacy and Computer Skill to Facilitate the Computer-Administered Patient Interview

    PubMed Central

    Lobach, David F.; Hasselblad, Victor; Wildemuth, Barbara M.

    2003-01-01

    Past efforts to collect clinical information directly from patients using computers have had limited utility because these efforts required users to be literate and facile with the computerized information collecting system. In this paper we describe the creation and use of a computer-based tool designed to assess a user’s reading literacy and computer skill for the purpose of adapting the human-computer interface to fit the identified skill levels of the user. The tool is constructed from a regression model based on 4 questions that we identified in a laboratory study to be highly predictive of reading literacy and 2 questions predictive of computer skill. When used in 2 diverse clinical practices the tool categorized low literacy users so that they received appropriate support to enter data through the computer, enabling them to perform as well as high literacy users. Confirmation of the performance of the tool with a validated reading assessment instrument showed a statistically significant difference (p=0.0025) between the two levels of reading literacy defined by the tool. Our assessment tool can be administered through a computer in less than two minutes without requiring any special training or expertise making it useful for rapidly determining users’ aptitudes. PMID:14728201

  7. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  8. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
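
    The double-counting problem described above is usually handled by assigning each surface panel a weight so that overlapped regions contribute exactly once to the integrated loads. The sketch below illustrates that bookkeeping with hypothetical panel data; it is not MIXSUR/OVERINT input or code.

      import numpy as np

      def integrate_pressure_force(panels):
          # Sum weighted pressure forces over panels from several overlapping grids.
          force = np.zeros(3)
          for p in panels:
              force += p["weight"] * p["pressure"] * p["area"] * np.asarray(p["normal"], dtype=float)
          return force

      # Two panels from different grids covering the same patch: their weights sum to 1,
      # so the patch is counted once (values are illustrative).
      panels = [
          {"pressure": 101325.0, "area": 0.01, "normal": [0.0, 0.0, 1.0], "weight": 0.6},
          {"pressure": 101300.0, "area": 0.01, "normal": [0.0, 0.0, 1.0], "weight": 0.4},
      ]
      print(integrate_pressure_force(panels))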

  9. Ring polymer molecular dynamics fast computation of rate coefficients on accurate potential energy surfaces in local configuration space: Application to the abstraction of hydrogen from methane.

    PubMed

    Meng, Qingyong; Chen, Jun; Zhang, Dong H

    2016-04-21

    To compute rate coefficients of the H/D + CH4 → H2/HD + CH3 reactions quickly and accurately, we propose a segmented strategy for fitting a suitable potential energy surface (PES), on which ring-polymer molecular dynamics (RPMD) simulations are performed. On the basis of the recently developed permutation invariant polynomial neural-network approach [J. Li et al., J. Chem. Phys. 142, 204302 (2015)], PESs in local configuration spaces are constructed. In this strategy, the global PES is divided into three parts, including asymptotic, intermediate, and interaction parts, along the reaction coordinate. Since fewer fitting parameters are involved in the local PESs, the computational efficiency of operating the PES routine is enhanced by a factor of ∼20 compared with that of the global PES. On the interaction part, the RPMD computational time for the transmission coefficient can be further reduced by cutting off the redundant part of the child trajectories. For H + CH4, good agreement is found among the present RPMD rates, those from previous simulations, and experimental results. For D + CH4, on the other hand, qualitative agreement between the present RPMD and experimental results is predicted.

  10. Ring polymer molecular dynamics fast computation of rate coefficients on accurate potential energy surfaces in local configuration space: Application to the abstraction of hydrogen from methane

    NASA Astrophysics Data System (ADS)

    Meng, Qingyong; Chen, Jun; Zhang, Dong H.

    2016-04-01

    To compute rate coefficients of the H/D + CH4 → H2/HD + CH3 reactions quickly and accurately, we propose a segmented strategy for fitting a suitable potential energy surface (PES), on which ring-polymer molecular dynamics (RPMD) simulations are performed. On the basis of the recently developed permutation invariant polynomial neural-network approach [J. Li et al., J. Chem. Phys. 142, 204302 (2015)], PESs in local configuration spaces are constructed. In this strategy, the global PES is divided into three parts, including asymptotic, intermediate, and interaction parts, along the reaction coordinate. Since fewer fitting parameters are involved in the local PESs, the computational efficiency of operating the PES routine is enhanced by a factor of ∼20 compared with that of the global PES. On the interaction part, the RPMD computational time for the transmission coefficient can be further reduced by cutting off the redundant part of the child trajectories. For H + CH4, good agreement is found among the present RPMD rates, those from previous simulations, and experimental results. For D + CH4, on the other hand, qualitative agreement between the present RPMD and experimental results is predicted.

  11. Can a numerically stable subgrid-scale model for turbulent flow computation be ideally accurate?: a preliminary theoretical study for the Gaussian filtered Navier-Stokes equations.

    PubMed

    Ida, Masato; Taniguchi, Nobuyuki

    2003-09-01

    This paper introduces a candidate for the origin of the numerical instabilities in large eddy simulation repeatedly observed in academic and practical industrial flow computations. Without resorting to any subgrid-scale modeling, but based on a simple assumption regarding the streamwise component of flow velocity, it is shown theoretically that in a channel-flow computation, the application of the Gaussian filtering to the incompressible Navier-Stokes equations yields a numerically unstable term, a cross-derivative term, which is similar to one appearing in the Gaussian filtered Vlasov equation derived by Klimas [J. Comput. Phys. 68, 202 (1987)] and also to one derived recently by Kobayashi and Shimomura [Phys. Fluids 15, L29 (2003)] from the tensor-diffusivity subgrid-scale term in a dynamic mixed model. The present result predicts that not only the numerical methods and the subgrid-scale models employed but also only the applied filtering process can be a seed of this numerical instability. An investigation concerning the relationship between the turbulent energy scattering and the unstable term shows that the instability of the term does not necessarily represent the backscatter of kinetic energy which has been considered a possible origin of numerical instabilities in large eddy simulation. The present findings raise the question whether a numerically stable subgrid-scale model can be ideally accurate.
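
    For readers unfamiliar with the filtering operation, the following is a minimal sketch in LaTeX assuming the common definition of a Gaussian filter of width Δ; the Δ²/12 expansion shown is the standard leading-order (tensor-diffusivity, Leonard-type) result to which the abstract compares the term in question, not the paper's own derivation.

      % Gaussian filter of width \Delta applied to a field u (one dimension shown):
      \bar{u}(x) = \int_{-\infty}^{\infty}
        \sqrt{\frac{6}{\pi\Delta^{2}}}\,
        \exp\!\left(-\frac{6\,(x-x')^{2}}{\Delta^{2}}\right) u(x')\,\mathrm{d}x' .

      % Leading-order expansion of the filtered nonlinear term; the mixed products of
      % resolved gradients are the kind of cross-derivative contribution discussed above:
      \overline{u_i u_j} \approx \bar{u}_i\,\bar{u}_j
        + \frac{\Delta^{2}}{12}\,
          \frac{\partial \bar{u}_i}{\partial x_k}\,
          \frac{\partial \bar{u}_j}{\partial x_k} .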

  12. Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings.

    PubMed

    Adde, Lars; Helbostad, Jorunn; Jensenius, Alexander R; Langaas, Mette; Støen, Ragnhild

    2013-08-01

    This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used. Recordings were analyzed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and areas under the curve were estimated for the first recording, the second recording, and the mean of both. FMs were classified based on the Prechtl approach of general movement assessment. CP status was reported at 2 years. Nine children developed CP, and all of their recordings showed absent FMs. The mean variability of the centroid of motion (CSD) from two recordings was more accurate than using only one recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies.
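
    A simplified sketch of one frame-differencing motion variable, the variability of the centroid of motion, is given below. It operates on a synthetic grayscale clip with an arbitrary threshold; it is illustrative only and is not the software used in the study.

      import numpy as np

      def centroid_of_motion_variability(frames, threshold=15):
          # frames: array (n_frames, height, width) of grayscale images.
          centroids = []
          for prev, curr in zip(frames[:-1], frames[1:]):
              moving = np.abs(curr.astype(float) - prev.astype(float)) > threshold
              ys, xs = np.nonzero(moving)
              if xs.size:                       # skip frames with no detected motion
                  centroids.append((xs.mean(), ys.mean()))
          centroids = np.array(centroids)
          return centroids.std(axis=0)          # variability of the centroid in x and y

      # Synthetic example: 50 random 64x64 frames.
      rng = np.random.default_rng(2)
      frames = rng.integers(0, 255, size=(50, 64, 64), dtype=np.uint8)
      print(centroid_of_motion_variability(frames))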

  13. Development of simulation tools for numerical investigation and computer-aided design (CAD) of gyrotrons

    NASA Astrophysics Data System (ADS)

    Damyanova, M.; Sabchevski, S.; Zhelyazkov, I.; Vasileva, E.; Balabanova, E.; Dankov, P.; Malinov, P.

    2016-10-01

    As the most powerful CW sources of coherent radiation in the sub-terahertz to terahertz frequency range, gyrotrons have demonstrated a remarkable potential for numerous novel and prospective applications in fundamental physical research and technology. Among them are powerful gyrotrons for electron cyclotron resonance heating (ECRH) and current drive (ECCD) of magnetically confined plasma in various reactors for controlled thermonuclear fusion (e.g., tokamaks and most notably ITER), high-frequency gyrotrons for sub-terahertz spectroscopy (for example NMR-DNP, XDMR, study of the hyperfine structure of positronium, etc.), gyrotrons for thermal processing, and so on. Modelling and simulation are indispensable tools for numerical studies, computer-aided design (CAD) and optimization of such sophisticated vacuum tubes (fast-wave devices) operating on a physical principle known as the electron cyclotron resonance maser (ECRM) instability. In recent years, our research team has been involved in the development of physical models and problem-oriented software packages for numerical analysis and CAD of different gyrotrons in the framework of a broad international collaboration. In this paper we present the current status of our simulation tools (the GYROSIM and GYREOSS packages) and illustrate their functionality with results of numerical experiments carried out recently. Finally, we provide an outlook on the envisaged further development of the computer codes and the computational modules belonging to these packages and specialized to different subsystems of the gyrotrons.

  14. Using molecular visualization to explore protein structure and function and enhance student facility with computational tools.

    PubMed

    Terrell, Cassidy R; Listenberger, Laura L

    2017-02-01

    Recognizing that undergraduate students can benefit from analysis of 3D protein structure and function, we have developed a multiweek, inquiry-based molecular visualization project for Biochemistry I students. This project uses a virtual model of cyclooxygenase-1 (COX-1) to guide students through multiple levels of protein structure analysis. The first assignment explores primary structure by generating and examining a protein sequence alignment. Subsequent assignments introduce 3D visualization software to explore secondary, tertiary, and quaternary structure. Students design an inhibitor, based on scrutiny of the enzyme active site, and evaluate the fit of the molecule using computed binding energies. In the last assignment, students introduce a point mutation to model the active site of the related COX-2 enzyme and analyze the impact of the mutation on inhibitor binding. With this project we aim to increase knowledge about, and confidence in using, online databases and computational tools. Here, we share results of our mixed methods pre- and postsurvey demonstrating student gains in knowledge about, and confidence using, online databases and computational tools. © 2017 by The International Union of Biochemistry and Molecular Biology, 2017.

  15. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  16. Validation of RetroPath, a computer-aided design tool for metabolic pathway engineering.

    PubMed

    Fehér, Tamás; Planson, Anne-Gaëlle; Carbonell, Pablo; Fernández-Castané, Alfred; Grigoras, Ioana; Dariy, Ekaterina; Perret, Alain; Faulon, Jean-Loup

    2014-11-01

    Metabolic engineering has succeeded in the biosynthesis of numerous commodity or high-value compounds. However, the choice of pathways and enzymes used for production has often been made ad hoc, or required expert knowledge of the specific biochemical reactions. In order to rationalize the process of engineering producer strains, we developed the computer-aided design (CAD) tool RetroPath that explores and enumerates metabolic pathways connecting the endogenous metabolites of a chassis cell to the target compound. To experimentally validate our tool, we constructed 12 top-ranked enzyme combinations producing the flavonoid pinocembrin, four of which displayed significant yields. Specifically, our tool queried the enzymes found in metabolic databases based on their annotated and predicted activities. Next, it ranked pathways based on the predicted efficiency of the available enzymes, the toxicity of the intermediate metabolites and the calculated maximum product flux. To implement the top-ranking pathway, our procedure narrowed down a list of nine million possible enzyme combinations to 12, a number easily assembled and tested. One round of metabolic network optimization based on RetroPath output further increased pinocembrin titers 17-fold. In total, 12 out of the 13 enzymes tested in this work displayed a relative performance that was in accordance with their predicted scores. These results validate the ranking function of our CAD tool, and open the way to its utilization in the biosynthesis of novel compounds.

  17. A computer assisted diagnosis tool for the classification of burns by depth of injury.

    PubMed

    Serrano, Carmen; Acha, Begoña; Gómez-Cía, Tomás; Acha, José I; Roa, Laura M

    2005-05-01

    In this paper, a computer assisted diagnosis (CAD) tool for the classification of burns into their depths is proposed. The aim of the system is to separate burn wounds from healthy skin, and to distinguish among the different types of burns (burn depths) by means of digital photographs. It is intended to be used as an aid to diagnosis in local medical centres, where there is a lack of specialists. Another potential use of the system is as an educational tool. The system is based on the analysis of digital photographs. It extracts from those images colour and texture information, as these are the characteristics observed by physicians in order to form a diagnosis. Clinical effectiveness of the method was demonstrated on 35 clinical burn wound images, yielding an average classification success rate of 88% compared to expert classified images.

  18. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  19. Validation of space/ground antenna control algorithms using a computer-aided design tool

    NASA Technical Reports Server (NTRS)

    Gantenbein, Rex E.

    1995-01-01

    The validation of the algorithms for controlling the space-to-ground antenna subsystem for Space Station Alpha is an important step in assuring reliable communications. These algorithms have been developed and tested using a simulation environment based on a computer-aided design tool that can provide a time-based execution framework with variable environmental parameters. Our work this summer has involved the exploration of this environment and the documentation of the procedures used to validate these algorithms. We have installed a variety of tools in a laboratory of the Tracking and Communications division for reproducing the simulation experiments carried out on these algorithms to verify that they do meet their requirements for controlling the antenna systems. In this report, we describe the processes used in these simulations and our work in validating the tests used.

  20. Defining a Standard for Reporting Digital Evidence Items in Computer Forensic Tools

    NASA Astrophysics Data System (ADS)

    Bariki, Hamda; Hashmi, Mariam; Baggili, Ibrahim

    Due to the lack of standards for reporting digital evidence items, investigators face difficulties in efficiently presenting their findings. This paper proposes a standard for digital evidence to be used in reports that are generated using computer forensic software tools. The authors focused on developing a standard set of digital evidence items by surveying various digital forensic tools while keeping in mind the legal integrity of digital evidence items. Additionally, an online questionnaire was used to gain the opinions of knowledgeable and experienced stakeholders in the digital forensics domain. Based on the findings, the authors propose a standard for digital evidence items that includes data about the case, the evidence source, the evidence item, and the chain of custody. The research results enabled the authors to create a defined XML schema for digital evidence items.
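
    As a rough illustration of serializing such an evidence item to XML with the four kinds of data named above, the sketch below uses Python's standard library; all element and attribute names, and the values, are placeholders rather than the schema defined in the paper.

      import xml.etree.ElementTree as ET

      item = ET.Element("EvidenceItem", attrib={"id": "ITEM-001"})
      ET.SubElement(item, "Case", attrib={"number": "2024-17", "investigator": "J. Doe"})
      ET.SubElement(item, "Source", attrib={"device": "laptop HDD", "imageHash": "sha256:placeholder"})
      ET.SubElement(item, "Description").text = "Recovered spreadsheet from unallocated space"
      custody = ET.SubElement(item, "ChainOfCustody")
      ET.SubElement(custody, "Transfer", attrib={"from": "seizing officer", "to": "forensic lab",
                                                 "timestamp": "2024-05-01T09:30:00Z"})
      print(ET.tostring(item, encoding="unicode"))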

  1. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License.
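
    To make the statistics concrete, here is a small independent sketch that computes a few of the summary quantities listed above from a toy alignment; it is not AMAS source code and does not reproduce its command-line interface.

      def alignment_summary(seqs, missing="-?NnXx"):
          # seqs maps taxon names to equal-length aligned sequences.
          taxa = list(seqs)
          length = len(next(iter(seqs.values())))
          cells = len(taxa) * length
          undetermined = sum(seq.count(c) for seq in seqs.values() for c in set(missing))
          variable = sum(
              len({seqs[t][i] for t in taxa if seqs[t][i] not in missing}) > 1
              for i in range(length)
          )
          return {"taxa": len(taxa), "alignment_length": length,
                  "percent_missing": 100.0 * undetermined / cells,
                  "variable_sites": variable}

      # Toy nucleotide alignment.
      print(alignment_summary({"A": "ATGC-A", "B": "ATGCTA", "C": "ACGCTA"}))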

  2. AMAS: a fast tool for alignment manipulation and computing of summary statistics

    PubMed Central

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python’s core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  3. A simple tool for the computation of the stream-aquifer coefficient.

    NASA Astrophysics Data System (ADS)

    Cousquer, Yohann; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    Most groundwater models consider a river network in interaction with aquifers, where the stream-aquifer boundary is usually modeled with a Cauchy-type boundary condition. This condition is parameterized with the so-called "river coefficient", which is a lumped parameter representing the effects of numerous geometric and hydrodynamic controlling factors. The value of the river coefficient is essential for the quantification of stream-aquifer flow but is challenging to determine. In recent years, many formulations for the river coefficient have been proposed from analytical and numerical approaches. However, these methods are either too simple to be realistic or too complex to be easily implemented by groundwater modelers. We propose a simple tool to infer the value of the river coefficient from a fine-grid numerical model. This tool allows the simple and fast computation of the river coefficient with various stream geometries and hydraulic parameters. A Python-based pre- and post-processor has been developed, which reduces the contribution of the operator to the definition of the model parameters: river geometry and aquifer properties. The numerical model is implemented with the USGS SUTRA finite element model and considers an aquifer in interaction with a stream in a 2D vertical cross-section. A Dirichlet-type boundary condition is imposed at the stream-aquifer interface. The linearity between the stream-aquifer flow and the head difference between river and the aquifer has been verified. For a given parameter set, the value of river coefficient is estimated by linear regression for different values of head difference between the river and the aquifer. The innovation is that the mesh size of the regional model is also considered for the computation of the river coefficient. This tool has been used to highlight the importance of parameters that were usually neglected for the computation of the river coefficient. The results of this work will be made available to the
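
    The slope interpretation described above can be reproduced in a few lines: if the exchange flow follows Q = C (h_river - h_aquifer), the river coefficient C is the slope of the regression of simulated Q against the imposed head difference. The numbers below are placeholders standing in for fine-grid model output.

      import numpy as np

      head_difference = np.array([0.1, 0.2, 0.3, 0.4, 0.5])                 # m
      exchange_flow = np.array([2.1e-4, 4.0e-4, 6.2e-4, 8.1e-4, 1.0e-3])    # m^3/s per m of stream

      C, intercept = np.polyfit(head_difference, exchange_flow, 1)
      print("river coefficient:", C, "intercept:", intercept)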

  4. Beam hardening artifacts in micro-computed tomography scanning can be reduced by X-ray beam filtration and the resulting images can be used to accurately measure BMD.

    PubMed

    Meganck, Jeffrey A; Kozloff, Kenneth M; Thornton, Michael M; Broski, Stephen M; Goldstein, Steven A

    2009-12-01

    Bone mineral density (BMD) measurements are critical in many research studies investigating skeletal integrity. For pre-clinical research, micro-computed tomography (microCT) has become an essential tool in these studies. However, the ability to measure the BMD directly from microCT images can be biased by artifacts, such as beam hardening, in the image. This three-part study was designed to understand how the image acquisition process can affect the resulting BMD measurements and to verify that the BMD measurements are accurate. In the first part of this study, the effect of beam hardening-induced cupping artifacts on BMD measurements was examined. In the second part of this study, the number of bones in the X-ray path and the sampling process during scanning was examined. In the third part of this study, microCT-based BMD measurements were compared with ash weights to verify the accuracy of the measurements. The results indicate that beam hardening artifacts of up to 32.6% can occur in sample sizes of interest in studies investigating mineralized tissue and affect mineral density measurements. Beam filtration can be used to minimize these artifacts. The results also indicate that, for murine femora, the scan setup can impact densitometry measurements for both cortical and trabecular bone and morphologic measurements of trabecular bone. Last, when a scan setup that minimized all of these artifacts was used, the microCT-based measurements correlated well with ash weight measurements (R(2)=0.983 when air was excluded), indicating that microCT can be an accurate tool for murine bone densitometry.

  5. A technique for evaluating bone ingrowth into 3D printed, porous Ti6Al4V implants accurately using X-ray micro-computed tomography and histomorphometry.

    PubMed

    Palmquist, Anders; Shah, Furqan A; Emanuelsson, Lena; Omar, Omar; Suska, Felicia

    2017-03-01

    This paper investigates the application of X-ray micro-computed tomography (micro-CT) to accurately evaluate bone formation within 3D printed, porous Ti6Al4V implants manufactured using Electron Beam Melting (EBM), retrieved after six months of healing in sheep femur and tibia. All samples were scanned twice (i.e., before and after resin embedding), using fast, low-resolution scans (Skyscan 1172; Bruker micro-CT, Kontich, Belgium), and were analysed by 2D and 3D morphometry. The main questions posed were: (i) Can low resolution, fast scans provide morphometric data of bone formed inside (and around) metal implants with a complex, open-pore architecture?, (ii) Can micro-CT be used to accurately quantify both the bone area (BA) and bone-implant contact (BIC)?, (iii) What degree of error is introduced in the quantitative data by varying the threshold values?, and (iv) Does resin embedding influence the accuracy of the analysis? To validate the accuracy of micro-CT measurements, each data set was correlated with a corresponding centrally cut histological section. The results show that quantitative histomorphometry corresponds strongly with 3D measurements made by micro-CT, where a high correlation exists between the two techniques for bone area/volume measurements around and inside the porous network. On the contrary, the direct bone-implant contact is challenging to estimate accurately or reproducibly. Large errors may be introduced in micro-CT measurements when segmentation is performed without calibrating the data set against a corresponding histological section. Generally, the bone area measurement is strongly influenced by the lower threshold limit, while the upper threshold limit has little or no effect. Resin embedding does not compromise the accuracy of micro-CT measurements, although there is a change in the contrast distributions and optimisation of the threshold ranges is required.

  6. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766
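
    As a concrete illustration of the DTI modelling step reviewed above, the sketch below performs the standard log-linearised least-squares fit of the diffusion tensor from a handful of diffusion-weighted signals and derives mean diffusivity and fractional anisotropy; the acquisition scheme and signal values are invented.

      import numpy as np

      def fit_diffusion_tensor(signals, s0, bvals, bvecs):
          # Log-linearised Stejskal-Tanner model: ln(S/S0) = -b g^T D g.
          g = np.asarray(bvecs, dtype=float)
          B = np.column_stack([g[:, 0] ** 2, g[:, 1] ** 2, g[:, 2] ** 2,
                               2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2],
                               2 * g[:, 1] * g[:, 2]]) * np.asarray(bvals)[:, None]
          y = -np.log(np.asarray(signals, dtype=float) / s0)
          d, *_ = np.linalg.lstsq(B, y, rcond=None)       # Dxx, Dyy, Dzz, Dxy, Dxz, Dyz
          D = np.array([[d[0], d[3], d[4]],
                        [d[3], d[1], d[5]],
                        [d[4], d[5], d[2]]])
          evals = np.linalg.eigvalsh(D)
          md = evals.mean()                                # mean diffusivity
          fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))
          return D, md, fa

      # Hypothetical single-voxel acquisition: six directions at b = 1000 s/mm^2.
      bvecs = [np.array(v) / np.linalg.norm(v) for v in
               ([1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [1, 0, 1], [0, 1, 1])]
      signals = [610.0, 650.0, 400.0, 580.0, 500.0, 470.0]
      print(fit_diffusion_tensor(signals, s0=1000.0, bvals=[1000.0] * 6, bvecs=bvecs))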

  7. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

    SciTech Connect

    Sottille, Matthew

    2013-09-12

    This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

  8. Tools for grid deployment of CDF offline and SAM data handling systems for summer 2004 computing

    SciTech Connect

    Kreymer, A.; Baranovski, A.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Loebel-Carpenter, L.; Lyon, A.; Merritt, W.; Stonjek, S.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; Bartsch, V.; Leslie, M.; Belforte, S.; Burgon-Lyon, M.; St. Denis, R.; Kerzel, U.; Ratnikov, F.

    2004-12-01

    The Fermilab CDF Run-II experiment is now providing official support for remote computing, which has provided approximately 35% of the total CDF computing capacity during the summer of 2004. We face the challenge of unreliable networks, time differences, and remote managers having little experience with this particular software. The approach we have taken has been to separate the data handling components from the main CDF offline code releases by means of shared libraries, permitting live upgrades to otherwise frozen code. We now use a special ''development lite'' release to ensure that all sites have the latest tools available. We have put substantial effort into revision control, so that essentially all active CDF sites are running exactly the same SAM code.

  9. Isolating blocks as computational tools in the circular restricted three-body problem

    NASA Astrophysics Data System (ADS)

    Anderson, Rodney L.; Easton, Robert W.; Lo, Martin W.

    2017-03-01

    Isolating blocks may be used as computational tools to search for the invariant manifolds of orbits and hyperbolic invariant sets associated with libration points while also giving additional insight into the dynamics of the flow in these regions. We use isolating blocks to investigate the dynamics of objects entering the Earth-Moon system in the circular restricted three-body problem with energies close to the energy of the L2 libration point. Specifically, the stable and unstable manifolds of Lyapunov orbits and the hyperbolic invariant set around the libration points are obtained by numerically computing the way orbits exit from an isolating block in combination with a bisection method. Invariant spheres of solutions in the spatial problem may then be located using the resulting manifolds.

  10. Creation of an idealized nasopharynx geometry for accurate computational fluid dynamics simulations of nasal airflow in patient-specific models lacking the nasopharynx anatomy.

    PubMed

    A T Borojeni, Azadeh; Frank-Ito, Dennis O; Kimbell, Julia S; Rhee, John S; Garcia, Guilherme J M

    2016-08-15

    Virtual surgery planning based on computational fluid dynamics (CFD) simulations has the potential to improve surgical outcomes for nasal airway obstruction patients, but the benefits of virtual surgery planning must outweigh the risks of radiation exposure. Cone beam computed tomography (CT) scans represent an attractive imaging modality for virtual surgery planning due to lower costs and lower radiation exposures compared with conventional CT scans. However, to minimize the radiation exposure, the cone beam CT sinusitis protocol sometimes images only the nasal cavity, excluding the nasopharynx. The goal of this study was to develop an idealized nasopharynx geometry for accurate representation of outlet boundary conditions when the nasopharynx geometry is unavailable. Anatomically accurate models of the nasopharynx created from 30 CT scans were intersected with planes rotated at different angles to obtain an average geometry. Cross sections of the idealized nasopharynx were approximated as ellipses with cross-sectional areas and aspect ratios equal to the averages in the actual patient-specific models. CFD simulations were performed to investigate whether nasal airflow patterns were affected when the CT-based nasopharynx was replaced by the idealized nasopharynx in 10 nasal airway obstruction patients. Despite the simple form of the idealized geometry, all biophysical variables (nasal resistance, airflow rate, and heat fluxes) were very similar in the idealized versus patient-specific models. The results confirmed the expectation that the nasopharynx geometry has a minimal effect on nasal airflow patterns during inspiration. The idealized nasopharynx geometry will be useful in future CFD studies of nasal airflow based on medical images that exclude the nasopharynx.
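
    The elliptical idealization described above amounts to recovering semi-axes from an average cross-sectional area and aspect ratio. The short sketch below (a minimal illustration with placeholder numbers, not data from the study) shows that calculation.

      # Given an average cross-sectional area A and aspect ratio AR = a/b measured from
      # patient-specific models, recover the semi-axes of the idealized elliptical section.
      # The numerical values below are placeholders, not values from the paper.
      import math

      def ellipse_semi_axes(area_mm2, aspect_ratio):
          """Return semi-axes (a, b) of an ellipse with the given area and a/b ratio."""
          # area = pi * a * b  and  aspect_ratio = a / b
          b = math.sqrt(area_mm2 / (math.pi * aspect_ratio))
          a = aspect_ratio * b
          return a, b

      a, b = ellipse_semi_axes(area_mm2=250.0, aspect_ratio=1.6)   # placeholder inputs
      print(f"semi-axes: a = {a:.2f} mm, b = {b:.2f} mm")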

  11. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  12. Analysis and computer tools for separation processes involving nonideal mixtures. Progress report, December 1, 1989--November 30, 1992

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  13. Morpheus Spectral Counter: A computational tool for label-free quantitative mass spectrometry using the Morpheus search engine.

    PubMed

    Gemperline, David C; Scalf, Mark; Smith, Lloyd M; Vierstra, Richard D

    2016-03-01

    Label-free quantitative MS based on the Normalized Spectral Abundance Factor (NSAF) has emerged as a straightforward and robust method to determine the relative abundance of individual proteins within complex mixtures. Here, we present Morpheus Spectral Counter (MSpC) as the first computational tool that directly calculates NSAF values from output obtained from Morpheus, a fast, open-source, peptide-MS/MS matching engine compatible with high-resolution accurate-mass instruments. NSAF has distinct advantages over other MS-based quantification methods, including a greater dynamic range as compared to isobaric tags, no requirement to align and re-extract MS1 peaks, and increased speed. MSpC features an easy-to-use graphical user interface that additionally calculates both distributed and unique NSAF values to permit analyses of both protein families and isoforms/proteoforms. MSpC determinations of protein concentration were linear over several orders of magnitude based on the analysis of several high-mass accuracy datasets either obtained from PRIDE or generated with total cell extracts spiked with purified Arabidopsis 20S proteasomes. The MSpC software was developed in C# and is open sourced under a permissive license with the code made available at http://dcgemperline.github.io/Morpheus_SpC/.
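
    For readers unfamiliar with the metric, the NSAF calculation underlying this kind of spectral-count quantification is compact enough to show directly. The Python sketch below implements the standard definition from the label-free literature (spectral count divided by protein length, normalized across all proteins); it is only an illustration of the formula, not the MSpC C# implementation, and it ignores the distributed versus unique count handling mentioned above.

      # NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j), where SpC is the spectral count
      # and L the protein length.  Illustration of the formula only.
      def nsaf(spectral_counts, lengths):
          """Return a dict of protein -> NSAF from dicts of spectral counts and lengths."""
          saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
          total = sum(saf.values())
          return {p: v / total for p, v in saf.items()}

      counts = {"P1": 120, "P2": 45, "P3": 8}        # hypothetical spectral counts
      lengths = {"P1": 450, "P2": 300, "P3": 210}    # protein lengths in residues
      print(nsaf(counts, lengths))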

  14. Morpheus Spectral Counter: A Computational Tool for Label-Free Quantitative Mass Spectrometry using the Morpheus Search Engine

    PubMed Central

    Gemperline, David C.; Scalf, Mark; Smith, Lloyd M.; Vierstra, Richard D.

    2016-01-01

    Label-free quantitative MS based on the Normalized Spectral Abundance Factor (NSAF) has emerged as a straightforward and robust method to determine the relative abundance of individual proteins within complex mixtures. Here, we present Morpheus Spectral Counter (MSpC) as the first computational tool that directly calculates NSAF values from output obtained from Morpheus, a fast, open-source, peptide-MS/MS matching engine compatible with high-resolution accurate-mass instruments. NSAF has distinct advantages over other MS-based quantification methods, including a higher dynamic range as compared to isobaric tags, no requirement to align and re-extract MS1 peaks, and increased speed. MSpC features an easy-to-use graphical user interface that additionally calculates both distributed and unique NSAF values to permit analyses of both protein families and isoforms/proteoforms. MSpC determinations of protein concentration were linear over several orders of magnitude based on the analysis of several high-mass accuracy datasets either obtained from PRIDE or generated with total cell extracts spiked with purified Arabidopsis 20S proteasomes. The MSpC software was developed in C# and is open sourced under a permissive license with the code made available at http://dcgemperline.github.io/Morpheus_SpC/. PMID:26791624

  15. Fast and Accurate Data Extraction for Near Real-Time Registration of 3-D Ultrasound and Computed Tomography in Orthopedic Surgery.

    PubMed

    Brounstein, Anna; Hacihaliloglu, Ilker; Guy, Pierre; Hodgson, Antony; Abugharbieh, Rafeef

    2015-12-01

    Automatic, accurate and real-time registration is an important step in providing effective guidance and successful anatomic restoration in ultrasound (US)-based computer assisted orthopedic surgery. We propose a method in which local phase-based bone surfaces, extracted from intra-operative US data, are registered to pre-operatively segmented computed tomography data. Extracted bone surfaces are downsampled and reinforced with high curvature features. A novel hierarchical simplification algorithm is used to further optimize the point clouds. The final point clouds are represented as Gaussian mixture models and iteratively matched by minimizing the dissimilarity between them using an L2 metric. For 44 clinical data sets from 25 pelvic fracture patients and 49 phantom data sets, we report mean surface registration accuracies of 0.31 and 0.77 mm, respectively, with an average registration time of 1.41 s. Our results suggest the viability and potential of the chosen method for real-time intra-operative registration in orthopedic surgery.
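
    The L2 dissimilarity between two Gaussian mixture models mentioned above has a convenient closed form when the mixtures are built from the points themselves with equal weights and a shared isotropic bandwidth. The sketch below (a minimal illustration, not the authors' pipeline: no curvature reinforcement, hierarchical simplification, or rigid-body optimizer) computes that dissimilarity for two point clouds.

      # L2 distance between two densities f and g represented as equal-weight isotropic
      # GMMs centred on the points: ||f - g||^2 = <f,f> - 2<f,g> + <g,g>, where each
      # inner product is a sum of Gaussian product integrals with variance 2*sigma^2.
      import numpy as np

      def gauss_cross_term(A, B, sigma):
          """Sum over all pairs of the integral of N(x|a_i, s^2 I) * N(x|b_j, s^2 I) dx."""
          d = A.shape[1]
          sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
          return ((4 * np.pi * sigma**2) ** (-d / 2) * np.exp(-sq / (4 * sigma**2))).sum()

      def gmm_l2_distance(A, B, sigma=1.0):
          m, n = len(A), len(B)
          ff = gauss_cross_term(A, A, sigma) / m**2
          gg = gauss_cross_term(B, B, sigma) / n**2
          fg = gauss_cross_term(A, B, sigma) / (m * n)
          return ff - 2 * fg + gg

      # Toy usage: the dissimilarity shrinks as a shifted copy moves back onto the original.
      rng = np.random.default_rng(0)
      cloud = rng.normal(size=(200, 3))
      print(gmm_l2_distance(cloud, cloud + 2.0), gmm_l2_distance(cloud, cloud + 0.1))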

  16. Online object oriented Monte Carlo computational tool for the needs of biomedical optics

    PubMed Central

    Doronin, Alexander; Meglinski, Igor

    2011-01-01

    Conceptual engineering design and optimization of laser-based imaging techniques and optical diagnostic systems used in the field of biomedical optics requires a clear understanding of the light-tissue interaction and peculiarities of localization of the detected optical radiation within the medium. The description of photon migration within the turbid tissue-like media is based on the concept of radiative transfer that forms a basis of Monte Carlo (MC) modeling. The opportunity of directly simulating the influence of structural variations of biological tissues on the probing light makes MC a primary tool for biomedical optics and optical engineering. Due to the diversity of optical modalities utilizing different properties of light and mechanisms of light-tissue interactions, a new MC code typically needs to be developed for each particular diagnostic application. In the current paper, introducing an object-oriented concept of MC modeling and utilizing modern web applications, we present a generalized online computational tool suitable for the major applications in biophotonics. The computation is supported by an NVIDIA CUDA graphics processing unit, providing acceleration of the modeling by up to 340 times. PMID:21991540

  17. Computational Tools for Simulating Thermal-hydrological-chemical Conditions in the Martian Subsurface

    NASA Astrophysics Data System (ADS)

    Painter, S.; Boice, D.; Browning, L.; Dinwiddie, C.; Pickett, D.

    2002-09-01

    Methods for simulating non-isothermal, multiphase flow and geochemical transport in unsaturated porous media have matured in recent years, and are now used in a range of advanced terrestrial applications. Similar computational tools have a range of potential applications in Mars research. They may be used, for example, to support data analysis, to test hypotheses regarding the evolution and current state of subsurface hydrological systems, and to understand the potential for undesirable perturbations during future drilling or sample collection activities. We describe ongoing efforts to adapt computational hydrology tools to the conditions of the Martian subsurface in a new simulation code MARSFLO. Initial versions of MARSFLO will simulate heat transport, the dynamics of multiple fluid phases (ice, water, water vapor, and CO2), and the evolution of solute concentration in the absence of geochemical reactions. The general modeling strategy is to use equilibrium constraints to reduce the system to four highly non-linear coupled conservation equations, which are then solved using an integral-finite-difference method and fully implicit time stepping. The required constitutive relationships are developed from the theory of freezing terrestrial soils and modified for Martian conditions. Data needs, potential applications, and plans to include multi-component reactive transport are also discussed. This work was funded by the Southwest Research Initiative on Mars (SwIM).

  18. AView: An Image-based Clinical Computational Tool for Intracranial Aneurysm Flow Visualization and Clinical Management.

    PubMed

    Xiang, Jianping; Antiga, Luca; Varble, Nicole; Snyder, Kenneth V; Levy, Elad I; Siddiqui, Adnan H; Meng, Hui

    2016-04-01

    Intracranial aneurysms (IAs) occur in around 3% of the population. IA rupture is responsible for the most devastating type of hemorrhagic stroke, with high fatality and disability rates as well as healthcare costs. With increasing detection of unruptured aneurysms, clinicians are routinely faced with the dilemma of whether to treat IA patients and how best to treat them. Hemodynamic and morphological characteristics are increasingly considered in aneurysm rupture risk assessment and treatment planning, but currently no computational tools allow routine integration of flow visualization and quantitation of these parameters in the clinical workflow. In this paper, we introduce AView, a prototype of a clinician-oriented, integrated computational tool for aneurysm hemodynamics, morphology, and risk and data management to aid in treatment decisions and treatment planning in or near the procedure room. Specifically, we describe how we have designed the AView structure from the end-user's point of view, performed a pilot study and gathered clinical feedback. The positive results demonstrate AView's potential clinical value in enhancing aneurysm treatment decisions and treatment planning.

  19. Smartphone qualification & linux-based tools for CubeSat computing payloads

    NASA Astrophysics Data System (ADS)

    Bridges, C. P.; Yeomans, B.; Iacopino, C.; Frame, T. E.; Schofield, A.; Kenyon, S.; Sweeting, M. N.

    Modern computers are now far in advance of satellite systems, and leveraging these technologies for space applications could lead to cheaper and more capable spacecraft. Together with NASA Ames's PhoneSat, the STRaND-1 nanosatellite team has been developing and designing new ways to include smart-phone technologies in the popular CubeSat platform whilst mitigating numerous risks. Surrey Space Centre (SSC) and Surrey Satellite Technology Ltd. (SSTL) have led the way in qualifying state-of-the-art COTS technologies and capabilities, contributing to numerous low-cost satellite missions. The focus of this paper is to answer whether 1) modern smart-phone software is suitable for the fast and low-cost development required by CubeSats, and 2) the components utilised are robust to the space environment. The STRaND-1 smart-phone payload software explored in this paper is united using various open-source Linux tools and generic interfaces found in terrestrial systems. A major result from our developments is that many existing software and hardware processes are more than sufficient to provide autonomous and operational payload object-to-object and file-based management solutions. The paper describes the software chains and tools used for the STRaND-1 smartphone computing platform, the hardware built together with its space qualification results (thermal, thermal vacuum, and TID radiation), and how they can be implemented in future missions.

  20. Gmat. A software tool for the computation of the rovibrational G matrix

    NASA Astrophysics Data System (ADS)

    Castro, M. E.; Niño, A.; Muñoz-Caro, C.

    2009-07-01

    Gmat is a C++ program able to compute the rovibrational G matrix in molecules of arbitrary size. This allows the building of arbitrary rovibrational Hamiltonians. In particular, the program is designed to work with the structural results of potential energy hypersurface mappings computed in computer clusters or computational Grid environments. In the present version, 1.0, the program uses internal coordinates as vibrational coordinates, with the principal axes of inertia as the body-fixed system. The main design implements a complete separation of the interface and functional parts of the program. The interface part permits the automatic reading of the molecular structures from the output files of different electronic structure codes. At present, Gamess and Gaussian output files are allowed. To such an end, use is made of the polymorphism characteristic of object orientation. The functional part numerically computes the derivatives of the nuclear positions with respect to the vibrational coordinates. Very accurate derivatives are obtained by using central differences embedded in a nine-level Richardson extrapolation procedure.
    Program summary
    Program title: Gmat
    Catalogue identifier: AECZ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECZ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 17 023
    No. of bytes in distributed program, including test data, etc.: 274 714
    Distribution format: tar.gz
    Programming language: Standard C++
    Computer: All running Linux/Windows
    Operating system: Linux, Windows
    Classification: 16.2
    Nature of problem: Computation of the rovibrational G matrix in molecules of any size. This allows the building of arbitrary rovibrational Hamiltonians. It must be possible to obtain the input data from the output files of standard electronic structure codes
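
    The derivative scheme named in this record (central differences embedded in a multi-level Richardson extrapolation) is easy to illustrate in isolation. The sketch below is a generic one-dimensional Python version with a configurable number of levels; Gmat itself is C++ and applies the same idea to the nuclear positions as functions of the internal vibrational coordinates.

      # Central-difference derivative refined by Richardson extrapolation: each column of
      # the table removes the next even power of h from the truncation error.
      import math

      def richardson_derivative(f, x, h=0.1, levels=9):
          """Approximate f'(x) with central differences plus Richardson extrapolation."""
          # First column: central differences with successively halved step sizes.
          tab = [[(f(x + h / 2**i) - f(x - h / 2**i)) / (2 * h / 2**i)]
                 for i in range(levels)]
          for j in range(1, levels):
              for i in range(j, levels):
                  factor = 4.0**j
                  tab[i].append((factor * tab[i][j - 1] - tab[i - 1][j - 1]) / (factor - 1.0))
          return tab[-1][-1]

      print(richardson_derivative(math.sin, 1.0), math.cos(1.0))   # the two should agree closely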

  1. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features or use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…

  2. Automated metastatic brain lesion detection: a computer aided diagnostic and clinical research tool

    NASA Astrophysics Data System (ADS)

    Devine, Jeremy; Sahgal, Arjun; Karam, Irene; Martel, Anne L.

    2016-03-01

    The accurate localization of brain metastases in magnetic resonance (MR) images is crucial for patients undergoing stereotactic radiosurgery (SRS) to ensure that all neoplastic foci are targeted. Computer-automated tumor localization and analysis can improve both of these tasks by eliminating inter- and intra-observer variations during the MR image reading process. Lesion localization is accomplished using adaptive thresholding to extract enhancing objects. Each enhancing object is represented as a vector of features which includes information on object size, symmetry, position, shape, and context. These vectors are then used to train a random forest classifier. We trained and tested the image analysis pipeline on 3D axial contrast-enhanced MR images with the intention of localizing the brain metastases. In our cross-validation study, at the most effective algorithm operating point, we were able to identify 90% of the lesions at a precision rate of 60%.
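
    The classification step described above (each enhancing object summarized as a feature vector and scored by a random forest) can be sketched in a few lines of Python with scikit-learn. The feature names, data and labels below are synthetic placeholders, not data or results from the study.

      # Each candidate object becomes a feature vector (e.g. size, symmetry, position,
      # shape, context) and a random forest scores it as metastasis vs. other enhancement.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(42)
      n_objects, n_features = 500, 5              # placeholder object and feature counts
      X = rng.normal(size=(n_objects, n_features))
      y = rng.integers(0, 2, size=n_objects)      # 1 = metastasis, 0 = other enhancing object

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

      clf.fit(X, y)
      lesion_probability = clf.predict_proba(X)[:, 1]   # thresholding sets the operating point
      detections = lesion_probability > 0.5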

  3. Computer assisted 3D pre-operative planning tool for femur fracture orthopedic surgery

    NASA Astrophysics Data System (ADS)

    Gamage, Pavan; Xie, Sheng Quan; Delmas, Patrice; Xu, Wei Liang

    2010-02-01

    Femur shaft fractures are caused by high impact injuries and can affect gait functionality if not treated correctly. Until recently, the pre-operative planning for femur fractures has relied on two-dimensional (2D) radiographs, light boxes, tracing paper, and transparent bone templates. The recent availability of digital radiographic equipment has to some extent improved the workflow for preoperative planning. Nevertheless, imaging is still in 2D X-rays and planning/simulation tools to support fragment manipulation and implant selection are still not available. Direct three-dimensional (3D) imaging modalities such as Computed Tomography (CT) are also still restricted to a minority of complex orthopedic procedures. This paper proposes a software tool which allows orthopedic surgeons to visualize, diagnose, plan and simulate femur shaft fracture reduction procedures in 3D. The tool utilizes frontal and lateral 2D radiographs to model the fracture surface, separate a generic bone into the two fractured fragments, identify the pose of each fragment, and automatically customize the shape of the bone. The use of 3D imaging allows full spatial inspection of the fracture providing different views through the manipulation of the interactively reconstructed 3D model, and ultimately better pre-operative planning.

  4. An APEL Tool Based CPU Usage Accounting Infrastructure for Large Scale Computing Grids

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Novales, Cristina Del Cano; Mathieu, Gilles; Casson, John; Rogers, William; Gordon, John

    The APEL (Accounting Processor for Event Logs) is the fundamental tool for the CPU usage accounting infrastructure deployed within the WLCG and EGEE Grids. In these Grids, jobs are submitted by users to computing resources via a Grid Resource Broker (e.g. gLite Workload Management System). As a log processing tool, APEL interprets Grid gatekeeper logs (e.g. globus) and batch system logs (e.g. PBS, LSF, SGE and Condor) to produce CPU job accounting records identified with Grid identities. These records provide a complete description of the usage of computing resources by users' jobs. APEL publishes accounting records into an accounting record repository at a Grid Operations Centre (GOC) for access from a GUI web tool. The functions of log file parsing, record generation and publication are implemented by the APEL Parser, APEL Core, and APEL Publisher components respectively. Within the distributed accounting infrastructure, accounting records are transported from APEL Publishers at Grid sites to either a regionalised accounting system or the central one, by choice, via a common ActiveMQ message broker network. This provides an open transport layer for other accounting systems to publish relevant accounting data to a central accounting repository via a unified interface provided by an APEL Publisher, and also gives regional/National Grid Initiative (NGI) Grids flexibility in their choice of accounting system. The robust and secure delivery of accounting record messages at the NGI level, and between NGI accounting instances and the central one, is achieved by using configurable APEL Publishers and an ActiveMQ message broker network.

  5. N2A: a computational tool for modeling from neurons to algorithms

    PubMed Central

    Rothganger, Fredrick; Warrender, Christina E.; Trumbo, Derek; Aimone, James B.

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing (“Moore's law”) to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation. PMID:24478635

  6. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used world wide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  7. Validation of Three Early Ejaculation Diagnostic Tools: A Composite Measure Is Accurate and More Adequate for Diagnosis by Updated Diagnostic Criteria

    PubMed Central

    Jern, Patrick; Piha, Juhana; Santtila, Pekka

    2013-01-01

    Purpose To validate three early ejaculation diagnostic tools, and propose a new tool for diagnosis in line with proposed changes to diagnostic criteria. Significant changes to diagnostic criteria are expected in the near future. Available screening tools do not necessarily reflect proposed changes. Materials and Methods Data from 148 diagnosed early ejaculation patients (mean age = 42.8 years) and 892 controls (mean age = 33.1 years) from a population-based sample were used. Participants responded to three different questionnaires (Premature Ejaculation Profile; Premature Ejaculation Diagnostic Tool; Multiple Indicators of Premature Ejaculation). Stopwatch-measured ejaculation latency times were collected from a subsample of early ejaculation patients. We used two types of responses to the questionnaires depending on the treatment status of the patients: 1) responses regarding the situation before starting pharmacological treatment and 2) responses regarding the current situation. Logistic regressions and Receiver Operating Characteristics were used to assess the ability of both the instruments and individual items to differentiate between patients and controls. Results All instruments had very good precision (Areas under the Curve ranging from .93-.98). A new five-item instrument (named CHecklist for Early Ejaculation Symptoms – CHEES) consisting of high-performance variables selected from the three instruments had validity (Nagelkerke R2 range .51-.79 for backwards/forwards logistic regression) equal to or slightly better than any individual instrument (i.e., had slightly higher validity statistics, but these differences did not achieve statistical significance). Importantly, however, this instrument was more in line with proposed changes to diagnostic criteria. Conclusions All three screening tools had good validity. A new 5-item diagnostic tool (CHEES) based on the three instruments had equal or somewhat more favorable validity statistics compared to the other three tools, but is

  8. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  9. A computer modeling methodology and tool for assessing design concepts for the Space Station Data Management System

    NASA Technical Reports Server (NTRS)

    Jones, W. R.

    1986-01-01

    A computer modeling tool is being developed to assess candidate designs for the Space Station Data Management System (DMS). The DMS is to be a complex distributed computer system including the processor, storage devices, local area networks, and software that will support all processing functions onboard the Space Station. The modeling tool will allow a candidate design for the DMS, or for other subsystems that use the DMS, to be evaluated in terms of parameters. The tool and its associated modeling methodology are intended for use by DMS and subsystem designers to perform tradeoff analyses between design concepts using varied architectures and technologies.

  10. A tool for computing time-dependent permeability reduction of fractured volcanic conduit margins.

    NASA Astrophysics Data System (ADS)

    Farquharson, Jamie; Wadsworth, Fabian; Heap, Michael; Baud, Patrick

    2016-04-01

    Laterally-oriented fractures within volcanic conduit margins are thought to play an important role in tempering eruption explosivity by allowing magmatic volatiles to outgas. The permeability of a fractured conduit margin - the equivalent permeability - can be modelled as the sum of permeability contributions of the edifice host rock and the fracture(s) within it. We present here a flexible MATLAB® tool which computes the time-dependent equivalent permeability of a volcanic conduit margin containing ash-filled fractures. The tool is designed so that the end-user can define a wide range of input parameters to yield equivalent permeability estimates for their application. The time-dependence of the equivalent permeability is incorporated by considering permeability decrease as a function of porosity loss in the ash-filled fractures due to viscous sintering (after Russell and Quane, 2005), which is in turn dependent on the depth and temperature of each fracture and the crystal-content of the magma (all user-defined variables). The initial viscosity of the granular material filling the fracture is dependent on the water content (Hess and Dingwell, 1996), which is computed assuming equilibrium depth-dependent water content (Liu et al., 2005). Crystallinity is subsequently accounted for by employing the particle-suspension rheological model of Mueller et al. (2010). The user then defines the number of fractures, their widths, and their depths, and the lengthscale of interest (e.g. the length of the conduit). Using these data, the combined influence of transient fractures on the equivalent permeability of the conduit margin is then calculated by adapting a parallel-plate flow model (developed by Baud et al., 2012 for porous sandstones), for host rock permeabilities from 10⁻¹¹ to 10⁻²² m². The calculated values of porosity and equivalent permeability with time for each host rock permeability are then output in text and worksheet file formats. We introduce two dimensionless
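
    The "sum of permeability contributions" described above is, in essence, a width-weighted parallel combination of host-rock and fracture permeabilities across the margin. The sketch below shows only that combination step, with the fracture permeabilities supplied directly rather than evolved through the sintering model; it is written in Python for consistency with the other sketches in this document (the tool itself is MATLAB-based), and all numerical values are illustrative.

      # Width-weighted (parallel) combination of host-rock and ash-filled-fracture
      # permeabilities over a conduit margin of lateral extent margin_length.
      def equivalent_permeability(k_host, margin_length, fracture_widths, fracture_perms):
          """Return the equivalent permeability (m^2) of host rock plus parallel fractures."""
          if len(fracture_widths) != len(fracture_perms):
              raise ValueError("need one permeability per fracture")
          total_fracture_width = sum(fracture_widths)
          host_width = margin_length - total_fracture_width
          if host_width < 0:
              raise ValueError("fractures are wider than the margin length")
          fracture_term = sum(w * k for w, k in zip(fracture_widths, fracture_perms))
          return (host_width * k_host + fracture_term) / margin_length

      # Illustrative values only: a 100 m margin containing two ash-filled fractures.
      print(equivalent_permeability(k_host=1e-15, margin_length=100.0,
                                    fracture_widths=[0.2, 0.5],
                                    fracture_perms=[1e-12, 5e-13]))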

  11. Computational fluid dynamics as surgical planning tool: a pilot study on middle turbinate resection.

    PubMed

    Zhao, Kai; Malhotra, Prashant; Rosen, David; Dalton, Pamela; Pribitkin, Edmund A

    2014-11-01

    Controversies exist regarding the resection or preservation of the middle turbinate (MT) during functional endoscopic sinus surgery. Any MT resection will perturb nasal airflow and may affect the mucociliary dynamics of the osteomeatal complex. Neither rhinometry nor computed tomography (CT) can adequately quantify nasal airflow pattern changes following surgery. This study explores the feasibility of assessing changes in nasal airflow dynamics following partial MT resection using computational fluid dynamics (CFD) techniques. We retrospectively converted the pre- and postoperative CT scans of a patient who underwent isolated partial MT concha bullosa resection into anatomically accurate three-dimensional numerical nasal models. Pre- and postsurgery nasal airflow simulations showed that the partial MT resection resulted in a shift of regional airflow towards the area of MT removal with a resultant decreased airflow velocity, decreased wall shear stress and increased local air pressure. However, the resection did not strongly affect the overall nasal airflow patterns, flow distributions in other areas of the nose, nor the odorant uptake rate to the olfactory cleft mucosa. Moreover, CFD predicted the patient's failure to perceive an improvement in his unilateral nasal obstruction following surgery. Accordingly, CFD techniques can be used to predict changes in nasal airflow dynamics following partial MT resection. However, the functional implications of this analysis await further clinical studies. Nevertheless, such techniques may potentially provide a quantitative evaluation of surgical effectiveness and may prove useful in preoperatively modeling the effects of surgical interventions.

  12. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of the behavior of such systems is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For these, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", Numer. Meth. Part. D. E., 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  13. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts change after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To guarantee a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.

  14. Biophysical and biochemical aspects of antifreeze proteins: Using computational tools to extract atomistic information.

    PubMed

    Kar, Rajiv K; Bhunia, Anirban

    2015-11-01

    Antifreeze proteins (AFPs) are the key biomolecules that protect species from extreme climatic conditions. Studies of AFPs, which are based on recognition of ice plane and structural motifs, have provided vital information that point towards the mechanism responsible for executing antifreeze activity. Importantly, the use of experimental techniques has revealed key information for AFPs, but the exact microscopic details are still not well understood, which limits the application and design of novel antifreeze agents. The present review focuses on the importance of computational tools for investigating (i) molecular properties, (ii) structure-function relationships, and (iii) AFP-ice interactions at atomistic levels. In this context, important details pertaining to the methodological approaches used in molecular dynamics studies of AFPs are also discussed. It is hoped that the information presented herein is helpful for enriching our knowledge of antifreeze properties, which can potentially pave the way for the successful design of novel antifreeze biomolecular agents.

  15. New computational tools for H/D determination in macromolecular structures from neutron data.

    PubMed

    Siliqi, Dritan; Caliandro, Rocco; Carrozzini, Benedetta; Cascarano, Giovanni Luca; Mazzone, Annamaria

    2010-11-01

    Two new computational methods dedicated to neutron crystallography, called n-FreeLunch and DNDM-NDM, have been developed and successfully tested. The aim in developing these methods is to determine hydrogen and deuterium positions in macromolecular structures by using information from neutron density maps. Of particular interest is resolving cases in which the geometrically predicted hydrogen or deuterium positions are ambiguous. The methods are an evolution of approaches that are already applied in X-ray crystallography: extrapolation beyond the observed resolution (known as the FreeLunch procedure) and a difference electron-density modification (DEDM) technique combined with the electron-density modification (EDM) tool (known as DEDM-EDM). It is shown that the two methods are complementary to each other and are effective in finding the positions of H and D atoms in neutron density maps.

  16. Feasibility study for application of the compressed-sensing framework to interior computed tomography (ICT) for low-dose, high-accurate dental x-ray imaging

    NASA Astrophysics Data System (ADS)

    Je, U. K.; Cho, H. M.; Cho, H. S.; Park, Y. O.; Park, C. K.; Lim, H. W.; Kim, K. S.; Kim, G. A.; Park, S. Y.; Woo, T. H.; Choi, S. I.

    2016-02-01

    In this paper, we propose a new, next-generation type of CT examination, so-called Interior Computed Tomography (ICT), which may reduce the dose delivered to the patient outside the target region of interest (ROI) in dental x-ray imaging. Here, the x-ray beam from each projection position covers only a relatively small ROI containing the diagnostic target within the examined structure, leading to imaging benefits such as reduced scatter, lower system cost, and reduced imaging dose. We considered the compressed-sensing (CS) framework, rather than common filtered-backprojection (FBP)-based algorithms, for more accurate ICT reconstruction. We implemented a CS-based ICT algorithm and performed a systematic simulation to investigate the imaging characteristics. Two ROI ratios (0.28 and 0.14) between the target and whole-phantom sizes and four projection numbers (360, 180, 90, and 45) were tested as simulation conditions. We successfully reconstructed ICT images of substantially high quality by using the CS framework even with few-view projection data, while still preserving sharp edges in the images.
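
    The structure of a CS-style reconstruction loop (alternating a data-fidelity update with a step that reduces total variation) can be illustrated with a toy problem. In the sketch below a random matrix stands in for a real few-view ROI projector, so it only demonstrates the shape of such an iteration, not the algorithm or results of this paper.

      # Toy compressed-sensing-style loop: gradient step on ||Ax - b||^2 followed by a
      # step down the gradient of a smoothed total-variation term, with non-negativity.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 32                                                  # image is n x n
      x_true = np.zeros((n, n)); x_true[8:24, 8:24] = 1.0     # piecewise-constant phantom
      A = rng.normal(size=(400, n * n)) / n                   # stand-in "projection" operator
      b = A @ x_true.ravel()

      def tv_gradient(img, eps=1e-6):
          """Gradient of a smoothed isotropic total-variation term."""
          gx = np.diff(img, axis=0, append=img[-1:, :])
          gy = np.diff(img, axis=1, append=img[:, -1:])
          mag = np.sqrt(gx**2 + gy**2 + eps)
          px, py = gx / mag, gy / mag
          div = (px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1))
          return -div

      x = np.zeros(n * n)
      for _ in range(500):
          x -= 1e-2 * (A.T @ (A @ x - b))                     # data-fidelity step
          x -= 2e-3 * tv_gradient(x.reshape(n, n)).ravel()    # TV-regularization step
          np.clip(x, 0.0, None, out=x)                        # enforce non-negativity

      print("relative error:", np.linalg.norm(x - x_true.ravel()) / np.linalg.norm(x_true))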

  17. Computational Tools for Allosteric Drug Discovery: Site Identification and Focus Library Design.

    PubMed

    Huang, Wenkang; Nussinov, Ruth; Zhang, Jian

    2017-01-01

    Allostery is an intrinsic phenomenon of biological macromolecules involving regulation and/or signal transduction induced by a ligand binding to an allosteric site distinct from a molecule's active site. Allosteric drugs are currently receiving increased attention in drug discovery because drugs that target allosteric sites can provide important advantages over the corresponding orthosteric drugs, including specific subtype selectivity within receptor families. Consequently, targeting allosteric sites, instead of orthosteric sites, can reduce drug-related side effects and toxicity. On the down side, allosteric drug discovery can be more challenging than traditional orthosteric drug discovery due to difficulties associated with determining the locations of allosteric sites and designing drugs based on these sites, and the need for the allosteric effects to propagate through the structure, reach the ligand binding site and elicit a conformational change. In this study, we present computational tools ranging from the identification of potential allosteric sites to the design of "allosteric-like" modulator libraries. These tools may be particularly useful for allosteric drug discovery.

  18. Computational and molecular tools for scalable rAAV-mediated genome editing

    PubMed Central

    Stoimenov, Ivaylo; Ali, Muhammad Akhtar; Pandzic, Tatjana; Sjöblom, Tobias

    2015-01-01

    The rapid discovery of potential driver mutations through large-scale mutational analyses of human cancers generates a need to characterize their cellular phenotypes. Among the techniques for genome editing, recombinant adeno-associated virus (rAAV)-mediated gene targeting is suited for knock-in of single nucleotide substitutions and, to a lesser degree, for gene knock-outs. However, the generation of gene targeting constructs and the targeting process is time-consuming and labor-intensive. To facilitate rAAV-mediated gene targeting, we developed the first software and complementary automation-friendly vector tools to generate optimized targeting constructs for editing human protein-coding genes. By computational approaches, rAAV constructs for editing ∼71% of bases in protein-coding exons were designed. Similarly, ∼81% of genes were predicted to be targetable by rAAV-mediated knock-out. A Gateway-based cloning system for facile generation of rAAV constructs suitable for robotic automation was developed and used in the successful generation of targeting constructs. Together, these tools enable automated rAAV targeting construct design and generation, as well as enrichment and expansion of targeted cells with desired integrations. PMID:25488813

  19. Using Brain–Computer Interfaces and Brain-State Dependent Stimulation as Tools in Cognitive Neuroscience

    PubMed Central

    Jensen, Ole; Bahramisharif, Ali; Oostenveld, Robert; Klanke, Stefan; Hadjipapas, Avgis; Okazaki, Yuka O.; van Gerven, Marcel A. J.

    2011-01-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain–computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds promise as a tool for aiding the disabled and for augmenting human performance. While technical developments obviously are important, we will here argue that new insight gained from cognitive neuroscience can be used to identify signatures of neural activation which can reliably be modulated by the subject at will. This review will focus mainly on oscillatory activity in the alpha band, which is strongly modulated by changes in covert attention. Besides developing BCIs for their traditional purpose, they might also be used as a research tool for cognitive neuroscience. There is currently a strong interest in how brain-state fluctuations impact cognition. These state fluctuations are partly reflected by ongoing oscillatory activity. The functional role of the brain state can be investigated by introducing stimuli in real time to subjects depending on the actual state of the brain. This principle of brain-state dependent stimulation may also be used as a practical tool for augmenting human behavior. In conclusion, new approaches based on online analysis of ongoing brain activity are currently in rapid development. These approaches are, among others, informed by new insights gained from electroencephalography/magnetoencephalography studies in cognitive neuroscience and hold the promise of providing new ways of investigating the brain at work. PMID:21687463

  20. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size.
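
    The "distinct scaffold extractor" step mentioned above can be approximated in a few lines with RDKit, using Bemis-Murcko scaffolds as a stand-in; ChemScreener itself is Java-based and its exact scaffold definition may differ, and the input SMILES below are purely illustrative.

      # Extract the set of unique Bemis-Murcko scaffolds from a list of SMILES strings.
      from rdkit import Chem
      from rdkit.Chem.Scaffolds import MurckoScaffold

      def distinct_scaffolds(smiles_list):
          """Return the set of unique scaffold SMILES for the parsable input molecules."""
          scaffolds = set()
          for smi in smiles_list:
              mol = Chem.MolFromSmiles(smi)
              if mol is None:                       # skip unparsable structures
                  continue
              core = MurckoScaffold.GetScaffoldForMol(mol)
              if core.GetNumAtoms():                # acyclic molecules have no ring scaffold
                  scaffolds.add(Chem.MolToSmiles(core))
          return scaffolds

      compounds = ["c1ccccc1CCN", "c1ccc2ccccc2c1", "CCO"]    # illustrative inputs
      print(distinct_scaffolds(compounds))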

  1. Stimulated dual-band infrared computed tomography: A tool to inspect the aging infrastructure

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-06-27

    The authors have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. The system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. They conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. The dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness and type. The authors quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, they conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. They determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. They demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally-heated bridge decks.

  2. FILMPAR: A parallel algorithm designed for the efficient and accurate computation of thin film flow on functional surfaces containing micro-structure

    NASA Astrophysics Data System (ADS)

    Lee, Y. C.; Thompson, H. M.; Gaskell, P. H.

    2009-12-01

    , industrial and physical applications. However, despite recent modelling advances, the accurate numerical solution of the equations governing such problems is still at a relatively early stage. Indeed, recent studies employing a simplifying long-wave approximation have shown that highly efficient numerical methods are necessary to solve the resulting lubrication equations in order to achieve the level of grid resolution required to accurately capture the effects of micro- and nano-scale topographical features.
    Solution method: A portable parallel multigrid algorithm has been developed for the above purpose, for the particular case of flow over submerged topographical features. Within the multigrid framework adopted, a W-cycle is used to accelerate convergence in respect of the time-dependent nature of the problem, with relaxation sweeps performed using a fixed number of pre- and post-Red-Black Gauss-Seidel Newton iterations. In addition, the algorithm incorporates automatic adaptive time-stepping to avoid the computational expense associated with repeated time-step failure.
    Running time: 1.31 minutes using 128 processors on BlueGene/P with a problem size of over 16.7 million mesh points.
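
    The Red-Black Gauss-Seidel relaxation named above as the multigrid smoother is easy to show on a model problem. The sketch below applies it to a 2-D Poisson equation on a unit square with Dirichlet boundaries; FILMPAR itself applies Newton-type red-black sweeps to the nonlinear lubrication equations, so this is only the linear analogue.

      # Red-Black Gauss-Seidel sweeps for -laplace(u) = f on a uniform grid: each colour
      # of the checkerboard is updated in a vectorized half-sweep.
      import numpy as np

      def red_black_gauss_seidel(u, f, h, sweeps=1):
          """In-place red-black GS relaxation; u carries the (fixed) boundary values."""
          for _ in range(sweeps):
              for parity in (0, 1):                           # red points, then black points
                  for i in range(1, u.shape[0] - 1):
                      j0 = 1 + (i + parity) % 2
                      u[i, j0:-1:2] = 0.25 * (u[i - 1, j0:-1:2] + u[i + 1, j0:-1:2]
                                              + u[i, j0 - 1:-2:2] + u[i, j0 + 1::2]
                                              + h * h * f[i, j0:-1:2])
          return u

      n = 65
      h = 1.0 / (n - 1)
      u = np.zeros((n, n))                                    # zero Dirichlet boundary data
      f = np.ones((n, n))                                     # unit source term
      red_black_gauss_seidel(u, f, h, sweeps=200)
      print("max of relaxed solution:", u.max())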

  3. An Interactive Tool for Outdoor Computer Controlled Cultivation of Microalgae in a Tubular Photobioreactor System

    PubMed Central

    Dormido, Raquel; Sánchez, José; Duro, Natividad; Dormido-Canto, Sebastián; Guinaldo, María; Dormido, Sebastián

    2014-01-01

    This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth in order to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations. PMID:24662450

  4. An interactive tool for outdoor computer controlled cultivation of microalgae in a tubular photobioreactor system.

    PubMed

    Dormido, Raquel; Sánchez, José; Duro, Natividad; Dormido-Canto, Sebastián; Guinaldo, María; Dormido, Sebastián

    2014-03-06

    This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth in order to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations.

  5. Computer-Based Tools for Inquiry in Undergraduate Classrooms: Results from the VGEE

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Bramer, D. J.; Elliott, D.; Hay, K. E.; Mallaiahgari, L.; Marlino, M. R.; Middleton, D.; Ramamurhty, M. K.; Scheitlin, T.; Weingroff, M.; Wilhelmson, R.; Yoder, J.

    2002-05-01

    The Visual Geophysical Exploration Environment (VGEE) is a suite of computer-based tools designed to help learners connect observable, large-scale geophysical phenomena to underlying physical principles. Technologically, this connection is mediated by Java-based interactive tools: a multi-dimensional visualization environment, authentic scientific data sets, concept models that illustrate fundamental physical principles, and an interactive web-based work management system for archiving and evaluating learners' progress. Our preliminary investigations showed, however, that the tools alone are not sufficient to empower undergraduate learners; learners have trouble organizing inquiry and using the visualization tools effectively. To address these issues, the VGEE includes an inquiry strategy and scaffolding activities that are similar to strategies used successfully in K-12 classrooms. The strategy is organized around the steps: identify, relate, explain, and integrate. In the first step, students construct visualizations from data to try to identify salient features of a particular phenomenon. They compare their previous conceptions of a phenomenon to the data in order to examine their current knowledge and motivate investigation. Next, students use the multivariable functionality of the visualization environment to relate the different features they identified. Explain moves the learner temporarily outside the visualization to the concept models, where they explore fundamental physical principles. Finally, in integrate, learners use these fundamental principles by literally placing the concept model within the visualization environment as a probe and watching it respond to larger-scale patterns. This capability, unique to the VGEE, addresses the disconnect that novice learners often experience between fundamental physics and observable phenomena. It also allows learners the opportunity to reflect on and refine their knowledge as well as

  6. Diagnostic tools in maxillofacial fractures: Is there really a need of three-dimensional computed tomography?

    PubMed Central

    Shah, Sheerin; Uppal, Sanjeev K.; Mittal, Rajinder K.; Garg, Ramneesh; Saggar, Kavita; Dhawan, Rishi

    2016-01-01

    Introduction: Because of the functional and cosmetic importance of the face, facial injuries, especially bony fractures, are clinically very significant. Missed or maltreated fractures may result in malocclusion and disfigurement of the face, making accurate diagnosis of the fracture essential. In earlier times, conventional radiography along with clinical examination played a major role in the diagnosis of maxillofacial fractures. However, the overlapping nature of the bones and the inability to visualise soft tissue swelling and fracture displacement, especially in the face, make radiography less reliable and useful. Computed tomography (CT) has helped to solve this problem. This clinical study compares three-dimensional (3D) CT reconstruction with conventional radiography in evaluating maxillofacial fractures preoperatively and in guiding the surgical management accordingly. Materials and Methods: Fifty patients, with suspected maxillofacial fractures on clinical examination, were subjected to conventional radiography and CT of the face with 3D reconstruction. The number and site of fractures in the zygoma, maxilla, mandible and nose detected by the two methods were enumerated and compared. The final bearing of these additional fractures on the management protocol was analysed. Results: CT proved superior to conventional radiography in diagnosing additional fractures in the zygoma, maxilla, mandible (subcondylar) and nasal bone. Coronal and axial images were significantly more diagnostic at fracture sites such as the zygomaticomaxillary complex, orbital floor, arch, lateral maxillary wall and anterior maxillary wall. Conclusion: 3D images gave an inside-out picture of the actual sites of fractures. They acted as the mind's eye for pre-operative planning and intra-operative execution of surgery. Better surgical treatment could be given to 33% of the cases because of better diagnostic ability of CT

  7. An Integrated Tool to Study MHC Region: Accurate SNV Detection and HLA Genes Typing in Human MHC Region Using Targeted High-Throughput Sequencing

    PubMed Central

    Liu, Xiao; Xu, Yinyin; Liang, Dequan; Gao, Peng; Sun, Yepeng; Gifford, Benjamin; D’Ascenzo, Mark; Liu, Xiaomin; Tellier, Laurent C. A. M.; Yang, Fang; Tong, Xin; Chen, Dan; Zheng, Jing; Li, Weiyang; Richmond, Todd; Xu, Xun; Wang, Jun; Li, Yingrui

    2013-01-01

    The major histocompatibility complex (MHC) is one of the most variable and gene-dense regions of the human genome. Most studies of the MHC, and associated regions, focus on minor variants and HLA typing, many of which have been demonstrated to be associated with human disease susceptibility and metabolic pathways. However, the detection of variants in the MHC region, and diagnostic HLA typing, still lacks a coherent, standardized, cost-effective and high-coverage protocol of clinical quality and reliability. In this paper, we present such a method for the accurate detection of minor variants and HLA types in the human MHC region, using high-throughput, high-coverage sequencing of target regions. A probe set was designed to template upon the 8 annotated human MHC haplotypes, and to encompass the 5 megabases (Mb) of the extended MHC region. We deployed our probes upon three genetically diverse human samples for probe set evaluation, and sequencing data show that ∼97% of the MHC region, and over 99% of the genes in the MHC region, are covered with sufficient depth and good evenness. 98% of genotypes called by this capture sequencing prove consistent with established HapMap genotypes. We have concurrently developed a one-step pipeline for calling any HLA type referenced in the IMGT/HLA database from this target capture sequencing data, which shows over 96% typing accuracy when deployed at 4-digit resolution. This cost-effective and highly accurate approach for variant detection and HLA typing in the MHC region may lend further insight into studies of immune-mediated diseases, and may find clinical utility in transplantation medicine research. This one-step pipeline is released for general evaluation and use by the scientific community. PMID:23894464

  8. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
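
    As a generic illustration of how such high-order explicit differences are built (this sketch is not the specific family of algorithms described in the report), the Python snippet below derives central-difference coefficients of arbitrary even order from a Taylor-series system and checks their accuracy on a sine wave.

      import numpy as np
      from math import factorial

      def central_diff_coeffs(half_width):
          """First-derivative central-difference coefficients on offsets -m..m."""
          s = np.arange(-half_width, half_width + 1)
          # Taylor system: sum_j c_j * s_j**k / k! = 1 if k == 1 else 0
          A = np.array([[sj**k / factorial(k) for sj in s] for k in range(len(s))],
                       dtype=float)
          rhs = np.zeros(len(s))
          rhs[1] = 1.0
          return s, np.linalg.solve(A, rhs)

      h = 0.05
      x = np.arange(0.0, 2 * np.pi, h)
      f = np.sin(x)
      s, c = central_diff_coeffs(4)                     # 8th-order accurate stencil
      d = sum(cj * np.roll(f, -int(sj)) for sj, cj in zip(s, c)) / h
      err = np.max(np.abs(d[5:-5] - np.cos(x)[5:-5]))   # interior points only
      print(f"max interior error of the 8th-order stencil: {err:.2e}")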

  9. A Monte Carlo tool for raster-scanning particle therapy dose computation

    NASA Astrophysics Data System (ADS)

    Jelen, U.; Radon, M.; Santiago, A.; Wittig, A.; Ammazzalorso, F.

    2014-03-01

    The purpose of this work was to implement Monte Carlo (MC) dose computation in realistic patient geometries with raster scanning, the most advanced ion beam delivery technique, which combines magnetic beam deflection with energy variation. FLUKA, a Monte Carlo package well established in particle therapy applications, was extended to simulate raster-scanning delivery from clinical data, which is not available as a built-in feature. A new complex beam source, compatible with the FLUKA public programming interface, was implemented in Fortran to model the specific properties of raster scanning, i.e. delivery by means of multiple spot sources with variable spatial distributions, energies and numbers of particles. The source was plugged into the MC engine through the user hook system provided by FLUKA. Additionally, routines were provided to populate the beam source with treatment plan data, stored as DICOM RTPlan or TRiP98's RST format, enabling MC recomputation of clinical plans. Finally, facilities were integrated to read computerised tomography (CT) data into FLUKA. The tool was used to recompute two representative carbon ion treatment plans, a skull base and a prostate case, prepared with analytical dose calculation (TRiP98). Selected, clinically relevant issues influencing the dose distributions were investigated: (1) presence of positioning errors, (2) influence of fiducial markers and (3) variations in pencil beam width. Notable differences in modelling of these challenging situations were observed between the analytical and Monte Carlo results. In conclusion, a tool was developed to support particle therapy research and treatment when high-precision MC calculations are required, e.g. in the presence of severe density heterogeneities or in quality assurance procedures.
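
    The central idea of such a beam source, namely drawing each primary from a planned spot according to its weight, position, width and energy, can be sketched generically in Python as below. The spot table and the FWHM-to-sigma conversion are illustrative assumptions; the actual implementation is a Fortran user routine plugged into FLUKA that reads the plan from DICOM RTPlan or RST files.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical raster-scan plan, one row per spot:
      # x (mm), y (mm), energy (MeV/u), spot FWHM (mm), planned particle number
      spots = np.array([
          [-10.0, 0.0, 200.0, 8.0, 1.0e6],
          [  0.0, 0.0, 200.0, 8.0, 2.0e6],
          [ 10.0, 5.0, 210.0, 7.5, 1.5e6],
      ])

      def sample_primaries(spots, n):
          """Draw n primaries: pick a spot by its weight, then a Gaussian lateral offset."""
          weights = spots[:, 4] / spots[:, 4].sum()
          idx = rng.choice(len(spots), size=n, p=weights)
          sigma = spots[idx, 3] / 2.3548            # FWHM -> standard deviation
          x = rng.normal(spots[idx, 0], sigma)
          y = rng.normal(spots[idx, 1], sigma)
          return x, y, spots[idx, 2]

      x, y, e = sample_primaries(spots, 100000)
      print(f"mean energy {e.mean():.1f} MeV/u, lateral spread {x.std():.1f} mm")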

  10. A Computational Tool for the Microstructure Optimization of a Polymeric Heart Valve Prosthesis.

    PubMed

    Serrani, M; Brubert, J; Stasiak, J; De Gaetano, F; Zaffora, A; Costantino, M L; Moggridge, G D

    2016-06-01

    Styrene-based block copolymers are promising materials for the development of a polymeric heart valve prosthesis (PHV), and the mechanical properties of these polymers can be tuned via the manufacturing process, orienting the cylindrical domains to achieve material anisotropy. The aim of this work is the development of a computational tool for the optimization of the material microstructure in a new PHV intended for aortic valve replacement to enhance the mechanical performance of the device. An iterative procedure was implemented to orient the cylinders along the maximum principal stress direction of the leaflet. A numerical model of the leaflet was developed, and the polymer mechanical behavior was described by a hyperelastic anisotropic constitutive law. A custom routine was implemented to align the cylinders with the maximum principal stress direction in the leaflet for each iteration. The study was focused on valve closure, since during this phase the fibrous structure of the leaflets must bear the greatest load. The optimal microstructure obtained by our procedure is characterized by mainly circumferential orientation of the cylinders within the valve leaflet. An increase in the radial strain and a decrease in the circumferential strain due to the microstructure optimization were observed. Also, a decrease in the maximum value of the strain energy density was found in the case of optimized orientation; since the strain energy density is a widely used criterion to predict elastomer's lifetime, this result suggests a possible increase of the device durability if the polymer microstructure is optimized. The present method represents a valuable tool for the design of a new anisotropic PHV, allowing the investigation of different designs, materials, and loading conditions.
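
    The alignment step at the core of such an iterative procedure amounts to extracting the eigenvector associated with the largest principal stress and rotating the cylinder direction toward it. A minimal Python sketch of that loop, using a toy in-plane stress state rather than the actual finite-element model of the leaflet, is:

      import numpy as np

      def max_principal_direction(sigma):
          """Unit eigenvector of the largest eigenvalue of a 2x2 in-plane stress tensor."""
          eigvals, eigvecs = np.linalg.eigh(sigma)       # eigenvalues in ascending order
          return eigvecs[:, -1]

      def toy_leaflet_stress(theta):
          """Hypothetical stress state that depends weakly on the fiber angle theta (rad)."""
          base = np.array([[1.0, 0.3], [0.3, 2.0]])      # MPa, circumferential term dominant
          return base + 0.1 * np.array([[np.cos(theta), 0.0], [0.0, np.sin(theta)]])

      theta = 0.0                                        # initial cylinder (fiber) angle
      for iteration in range(50):
          d = max_principal_direction(toy_leaflet_stress(theta))
          theta_new = np.arctan2(d[1], d[0]) % np.pi     # orientation is defined modulo pi
          if abs(theta_new - theta) < 1e-6:
              break
          theta = theta_new

      print(f"converged fiber angle: {np.degrees(theta):.1f} deg after {iteration + 1} iterations")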

  11. Collidoscope: An Improved Tool for Computing Collisional Cross-Sections with the Trajectory Method.

    PubMed

    Ewing, Simon A; Donor, Micah T; Wilson, Jesse W; Prell, James S

    2017-04-01

    Ion mobility-mass spectrometry (IM-MS) can be a powerful tool for determining structural information about ions in the gas phase, from small covalent analytes to large, native-like or denatured proteins and complexes. For large biomolecular ions, which may have a wide variety of possible gas-phase conformations and multiple charge sites, quantitative, physically explicit modeling of collisional cross sections (CCSs) for comparison to IMS data can be challenging and time-consuming. We present a "trajectory method" (TM) based CCS calculator, named "Collidoscope," which utilizes parallel processing and optimized trajectory sampling, and implements both He and N2 as collision gas options. Also included is a charge-placement algorithm for determining probable charge site configurations for protonated protein ions given an input geometry in pdb file format. Results from Collidoscope are compared with those from the current state-of-the-art CCS simulation suite, IMoS. Collidoscope CCSs are within 4% of IMoS values for ions with masses from ~18 Da to ~800 kDa. Collidoscope CCSs using X-ray crystal geometries are typically within a few percent of IM-MS experimental values for ions with mass up to ~3.5 kDa (melittin), and discrepancies for larger ions up to ~800 kDa (GroEL) are attributed in large part to changes in ion structure during and after the electrospray process. Due to its physically explicit modeling of scattering, computational efficiency, and accuracy, Collidoscope can be a valuable tool for IM-MS research, especially for large biomolecular ions.
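
    For orientation, the much simpler projection approximation (PA), a cruder relative of the trajectory method that ignores the scattering physics Collidoscope models explicitly, can be sketched in a few lines of Python. The toy coordinates and radii are arbitrary; the point is only to show how an orientation-averaged cross section is obtained from geometry.

      import numpy as np

      rng = np.random.default_rng(1)

      def projection_ccs(coords, radii, n_orient=200, n_mc=20000):
          """Orientation-averaged projected area (projection approximation).
          coords is (N, 3), radii is (N,); the result has units of coords**2."""
          areas = []
          for _ in range(n_orient):
              q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orientation
              xy = (coords @ q.T)[:, :2]                     # project onto the x-y plane
              lo = xy.min(axis=0) - radii.max()
              hi = xy.max(axis=0) + radii.max()
              pts = rng.uniform(lo, hi, size=(n_mc, 2))      # Monte Carlo points in the box
              d2 = ((pts[:, None, :] - xy[None, :, :]) ** 2).sum(axis=2)
              hit = (d2 <= radii[None, :] ** 2).any(axis=1)
              areas.append(hit.mean() * np.prod(hi - lo))
          return float(np.mean(areas))

      # Toy "molecule": three hard spheres of 1.7 Angstrom radius
      coords = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
      radii = np.full(3, 1.7)
      print(f"PA cross section: {projection_ccs(coords, radii):.1f} square Angstrom")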

  12. A Computational Tool for the Microstructure Optimization of a Polymeric Heart Valve Prosthesis

    PubMed Central

    Serrani, M.; Brubert, J.; Stasiak, J.; De Gaetano, F.; Zaffora, A.; Costantino, M. L.; Moggridge, G. D.

    2016-01-01

    Styrene-based block copolymers are promising materials for the development of a polymeric heart valve prosthesis (PHV), and the mechanical properties of these polymers can be tuned via the manufacturing process, orienting the cylindrical domains to achieve material anisotropy. The aim of this work is the development of a computational tool for the optimization of the material microstructure in a new PHV intended for aortic valve replacement to enhance the mechanical performance of the device. An iterative procedure was implemented to orient the cylinders along the maximum principal stress direction of the leaflet. A numerical model of the leaflet was developed, and the polymer mechanical behavior was described by a hyperelastic anisotropic constitutive law. A custom routine was implemented to align the cylinders with the maximum principal stress direction in the leaflet for each iteration. The study was focused on valve closure, since during this phase the fibrous structure of the leaflets must bear the greatest load. The optimal microstructure obtained by our procedure is characterized by mainly circumferential orientation of the cylinders within the valve leaflet. An increase in the radial strain and a decrease in the circumferential strain due to the microstructure optimization were observed. Also, a decrease in the maximum value of the strain energy density was found in the case of optimized orientation; since the strain energy density is a widely used criterion to predict elastomer’s lifetime, this result suggests a possible increase of the device durability if the polymer microstructure is optimized. The present method represents a valuable tool for the design of a new anisotropic PHV, allowing the investigation of different designs, materials, and loading conditions. PMID:27018454

  13. Collidoscope: An Improved Tool for Computing Collisional Cross-Sections with the Trajectory Method

    NASA Astrophysics Data System (ADS)

    Ewing, Simon A.; Donor, Micah T.; Wilson, Jesse W.; Prell, James S.

    2017-02-01

    Ion mobility-mass spectrometry (IM-MS) can be a powerful tool for determining structural information about ions in the gas phase, from small covalent analytes to large, native-like or denatured proteins and complexes. For large biomolecular ions, which may have a wide variety of possible gas-phase conformations and multiple charge sites, quantitative, physically explicit modeling of collisional cross sections (CCSs) for comparison to IMS data can be challenging and time-consuming. We present a "trajectory method" (TM) based CCS calculator, named "Collidoscope," which utilizes parallel processing and optimized trajectory sampling, and implements both He and N2 as collision gas options. Also included is a charge-placement algorithm for determining probable charge site configurations for protonated protein ions given an input geometry in pdb file format. Results from Collidoscope are compared with those from the current state-of-the-art CCS simulation suite, IMoS. Collidoscope CCSs are within 4% of IMoS values for ions with masses from 18 Da to 800 kDa. Collidoscope CCSs using X-ray crystal geometries are typically within a few percent of IM-MS experimental values for ions with mass up to 3.5 kDa (melittin), and discrepancies for larger ions up to 800 kDa (GroEL) are attributed in large part to changes in ion structure during and after the electrospray process. Due to its physically explicit modeling of scattering, computational efficiency, and accuracy, Collidoscope can be a valuable tool for IM-MS research, especially for large biomolecular ions.

  14. A Computational Tool to Detect and Avoid Redundancy in Selected Reaction Monitoring

    PubMed Central

    Röst, Hannes; Malmström, Lars; Aebersold, Ruedi

    2012-01-01

    Selected reaction monitoring (SRM), also called multiple reaction monitoring, has become an invaluable tool for targeted quantitative proteomic analyses, but its application can be compromised by nonoptimal selection of transitions. In particular, complex backgrounds may cause ambiguities in SRM measurement results because peptides with interfering transitions similar to those of the target peptide may be present in the sample. Here, we developed a computer program, the SRMCollider, that calculates nonredundant theoretical SRM assays, also known as unique ion signatures (UIS), for a given proteomic background. We show theoretically that UIS of three transitions suffice to conclusively identify 90% of all yeast peptides and 85% of all human peptides. Using predicted retention times, the SRMCollider also simulates time-scheduled SRM acquisition, which reduces the number of interferences to consider and leads to fewer transitions necessary to construct an assay. By integrating experimental fragment ion intensities from large scale proteome synthesis efforts (SRMAtlas) with the information content-based UIS, we combine two orthogonal approaches to create high quality SRM assays ready to be deployed. We provide a user friendly, open source implementation of an algorithm to calculate UIS of any order that can be accessed online at http://www.srmcollider.org to find interfering transitions. Finally, our tool can also simulate the specificity of novel data-independent MS acquisition methods in Q1–Q3 space. This allows us to predict parameters for these methods that deliver a specificity comparable with that of SRM. Using SRM interference information in addition to other sources of information can increase the confidence in an SRM measurement. We expect that the consideration of information content will become a standard step in SRM assay design and analysis, facilitated by the SRMCollider. PMID:22535207
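
    The redundancy check behind UIS can be illustrated with a small Python sketch: a transition set is non-redundant if no background peptide that co-isolates in Q1 can produce every one of its Q3 fragments within tolerance. The m/z values, tolerances and background list below are made up; the real SRMCollider works against a full proteomic background and, optionally, predicted retention-time windows.

      # Target assay: precursor (Q1) m/z and a set of fragment (Q3) m/z values
      target_q1 = 523.77
      target_q3 = [625.32, 738.40, 851.49]

      # Hypothetical proteomic background: (peptide, Q1, list of fragment m/z values)
      background = [
          ("PEPTIDEA", 523.79, [625.33, 712.28, 990.51]),
          ("PEPTIDEB", 523.75, [625.31, 738.41, 851.50]),   # interferes with all three
          ("PEPTIDEC", 610.40, [625.32, 738.40, 851.49]),   # outside the Q1 window
      ]

      Q1_TOL = 0.7   # Th, half-width of the assumed precursor isolation window
      Q3_TOL = 0.7   # Th, assumed fragment matching tolerance

      def is_unique_signature(q1, q3_set, background):
          """True if no background peptide co-isolates and matches every transition."""
          for name, bq1, bq3 in background:
              if abs(bq1 - q1) > Q1_TOL:
                  continue                                   # not co-isolated in Q1
              if all(any(abs(f - t) <= Q3_TOL for f in bq3) for t in q3_set):
                  return False, name
          return True, None

      unique, culprit = is_unique_signature(target_q1, target_q3, background)
      print("UIS" if unique else f"redundant (interference from {culprit})")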

  15. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of
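
    One of the improvements listed above, splitting large sequence files for better downstream load balance, is conceptually simple. A minimal Python sketch is shown below; the chunk size and file naming are arbitrary choices, and Rainbow itself handles this step as part of its cloud workflow.

      from itertools import islice

      def split_fastq(path, reads_per_chunk=1_000_000, prefix="chunk"):
          """Write a FASTQ file out as numbered chunks of reads_per_chunk reads each."""
          with open(path) as fh:
              part = 0
              while True:
                  lines = list(islice(fh, 4 * reads_per_chunk))   # 4 lines per read
                  if not lines:
                      break
                  with open(f"{prefix}_{part:04d}.fastq", "w") as out:
                      out.writelines(lines)
                  part += 1
          return part

      # Example (assumes sample.fastq exists in the working directory):
      # n_chunks = split_fastq("sample.fastq", reads_per_chunk=500_000)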

  16. Simple X-ray versus ultrasonography examination in blunt chest trauma: effective tools of accurate diagnosis and considerations for rib fractures

    PubMed Central

    Hwang, Eun Gu; Lee, Yunjung

    2016-01-01

    Simple radiography is the best diagnostic tool for rib fractures caused by chest trauma, but it has some limitations. Thus, other tools are also being used. The aims of this study were to investigate the effectiveness of ultrasonography (US) for identifying rib fractures and to identify factors influencing its effectiveness. Between October 2003 and August 2007, 201 patients with blunt chest trauma were available to undergo chest radiographic and US examinations for diagnosis of rib fractures. The two modalities were compared in terms of effectiveness based on simple radiographic readings and US examination results. We also investigated the factors that influenced the effectiveness of US examination. Rib fractures were detected on radiography in 69 patients (34.3%) but not in 132 patients. Rib fractures were diagnosed by US examination in 160 patients (84.6%). Of the 132 patients who showed no rib fractures on radiography, 92 showed rib fractures on US. Among the 69 patients with rib fractures detected on radiography, 33 had additional rib fractures detected on US. Of the patients, 76 (37.8%) had identical radiographic and US results, and 125 (62.2%) had fractures detected on US that were previously undetected on radiography or additional fractures detected on US. Age, duration until US examination, and fracture location were not significant influencing factors. However, US was significantly more effective in the group without fractures detected on radiography than in the group with fractures detected on radiography (P=0.003). US examination could detect rib fractures that went unnoticed on simple radiography, and it is especially effective in patients without fractures detected on radiography. More attention should therefore be paid to patients with chest trauma who have no fractures detected on radiography. PMID:28119889

  17. Introducing CAFein, a New Computational Tool for Stellar Pulsations and Dynamic Tides

    NASA Astrophysics Data System (ADS)

    Valsecchi, F.; Farr, W. M.; Willems, B.; Rasio, F. A.; Kalogera, V.

    2013-08-01

    Here we present CAFein, a new computational tool for investigating radiative dissipation of dynamic tides in close binaries and of non-adiabatic, non-radial stellar oscillations in isolated stars in the linear regime. For the latter, CAFein computes the non-adiabatic eigenfrequencies and eigenfunctions of detailed stellar models. The code is based on the so-called Riccati method, a numerical algorithm that has been successfully applied to a variety of stellar pulsators, and which does not suffer from the major drawbacks of commonly used shooting and relaxation schemes. Here we present an extension of the Riccati method to investigate dynamic tides in close binaries. We demonstrate CAFein's capabilities as a stellar pulsation code both in the adiabatic and non-adiabatic regimes, by reproducing previously published eigenfrequencies of a polytrope, and by successfully identifying the unstable modes of a stellar model in the β Cephei/SPB region of the Hertzsprung-Russell diagram. Finally, we verify CAFein's behavior in the dynamic tides regime by investigating the effects of dynamic tides on the eigenfunctions and orbital and spin evolution of massive main sequence stars in eccentric binaries, and of hot Jupiter host stars. The plethora of asteroseismic data provided by NASA's Kepler satellite, some of which include the direct detection of tidally excited stellar oscillations, makes CAFein quite timely. Furthermore, the increasing number of observed short-period detached double white dwarfs (WDs) and the observed orbital decay in the tightest of such binaries open up a new possibility of investigating WD interiors through the effects of tides on their orbital evolution.

  18. INTRODUCING CAFein, A NEW COMPUTATIONAL TOOL FOR STELLAR PULSATIONS AND DYNAMIC TIDES

    SciTech Connect

    Valsecchi, F.; Farr, W. M.; Willems, B.; Rasio, F. A.; Kalogera, V.

    2013-08-10

    Here we present CAFein, a new computational tool for investigating radiative dissipation of dynamic tides in close binaries and of non-adiabatic, non-radial stellar oscillations in isolated stars in the linear regime. For the latter, CAFein computes the non-adiabatic eigenfrequencies and eigenfunctions of detailed stellar models. The code is based on the so-called Riccati method, a numerical algorithm that has been successfully applied to a variety of stellar pulsators, and which does not suffer from the major drawbacks of commonly used shooting and relaxation schemes. Here we present an extension of the Riccati method to investigate dynamic tides in close binaries. We demonstrate CAFein's capabilities as a stellar pulsation code both in the adiabatic and non-adiabatic regimes, by reproducing previously published eigenfrequencies of a polytrope, and by successfully identifying the unstable modes of a stellar model in the β Cephei/SPB region of the Hertzsprung-Russell diagram. Finally, we verify CAFein's behavior in the dynamic tides regime by investigating the effects of dynamic tides on the eigenfunctions and orbital and spin evolution of massive main sequence stars in eccentric binaries, and of hot Jupiter host stars. The plethora of asteroseismic data provided by NASA's Kepler satellite, some of which include the direct detection of tidally excited stellar oscillations, makes CAFein quite timely. Furthermore, the increasing number of observed short-period detached double white dwarfs (WDs) and the observed orbital decay in the tightest of such binaries open up a new possibility of investigating WD interiors through the effects of tides on their orbital evolution.

  19. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM technology demonstration-1 (ATD-1) that will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.

  20. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  1. A Potential Tool for Clinicians; Evaluating a Computer-Led Dietary Assessment Method in Overweight and Obese Women during Weight Loss.

    PubMed

    Widaman, Adrianne M; Keim, Nancy L; Burnett, Dustin J; Miller, Beverly; Witbracht, Megan G; Widaman, Keith F; Laugero, Kevin D

    2017-03-01

    Many Americans are attempting to lose weight with the help of healthcare professionals. Clinicians can improve weight loss results by using technology. Accurate dietary assessment is crucial to effective weight loss. The aim of this study was to validate a computer-led dietary assessment method in overweight/obese women. Known dietary intake was compared to Automated Self-Administered 24-h recall (ASA24) reported intake in women (n = 45), 19-50 years, with body mass index of 27-39.9 kg/m². Participants received nutrition education and reduced body weight by 4%-10%. Participants completed one unannounced dietary recall and their responses were compared to actual intake. Accuracy of the recall and characteristics of respondent error were measured using linear and logistic regression. Energy was underreported by 5% with no difference for most nutrients except carbohydrates, vitamin B12, vitamin C, selenium, calcium and vitamin D (p = 0.002, p < 0.0001, p = 0.022, p = 0.010, p = 0.008 and p = 0.001 respectively). Overall, ASA24 is a valid dietary assessment tool in overweight/obese women participating in a weight loss program. The automated features eliminate the need for clinicians to be trained, to administer, or to analyze dietary intake. Computer-led dietary assessment tools should be considered as part of clinician-supervised weight loss programs.
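
    The kind of validity check described here, quantifying under-reporting and regressing reported against known intake, reduces to a few lines of Python. The numbers below are illustrative only and are not the study data.

      import numpy as np

      # Hypothetical known vs. ASA24-reported energy intake (kcal/day)
      known = np.array([1850, 2100, 1700, 1950, 2250, 1800], dtype=float)
      reported = np.array([1740, 2010, 1650, 1830, 2150, 1700], dtype=float)

      underreport_pct = 100 * (known - reported).sum() / known.sum()
      slope, intercept = np.polyfit(known, reported, 1)   # simple linear calibration
      r = np.corrcoef(known, reported)[0, 1]

      print(f"energy under-reported by {underreport_pct:.1f}%")
      print(f"reported = {slope:.2f} * known + {intercept:.0f} kcal (r = {r:.2f})")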

  2. A Potential Tool for Clinicians; Evaluating a Computer-Led Dietary Assessment Method in Overweight and Obese Women during Weight Loss

    PubMed Central

    Widaman, Adrianne M.; Keim, Nancy L.; Burnett, Dustin J.; Miller, Beverly; Witbracht, Megan G.; Widaman, Keith F.; Laugero, Kevin D.

    2017-01-01

    Many Americans are attempting to lose weight with the help of healthcare professionals. Clinicians can improve weight loss results by using technology. Accurate dietary assessment is crucial to effective weight loss. The aim of this study was to validate a computer-led dietary assessment method in overweight/obese women. Known dietary intake was compared to Automated Self-Administered 24-h recall (ASA24) reported intake in women (n = 45), 19–50 years, with body mass index of 27–39.9 kg/m2. Participants received nutrition education and reduced body weight by 4%–10%. Participants completed one unannounced dietary recall and their responses were compared to actual intake. Accuracy of the recall and characteristics of respondent error were measured using linear and logistic regression. Energy was underreported by 5% with no difference for most nutrients except carbohydrates, vitamin B12, vitamin C, selenium, calcium and vitamin D (p = 0.002, p < 0.0001, p = 0.022, p = 0.010, p = 0.008 and p = 0.001 respectively). Overall, ASA24 is a valid dietary assessment tool in overweight/obese women participating in a weight loss program. The automated features eliminate the need for clinicians to be trained, to administer, or to analyze dietary intake. Computer-led dietary assessment tools should be considered as part of clinician-supervised weight loss programs. PMID:28257040

  3. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    NASA Astrophysics Data System (ADS)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine in Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  4. Dynamic 3-D computer graphics for designing a diagnostic tool for patients with schizophrenia.

    PubMed

    Farkas, Attila; Papathomas, Thomas V; Silverstein, Steven M; Kourtev, Hristiyan; Papayanopoulos, John F

    2016-11-01

    We introduce a novel procedure that uses dynamic 3-D computer graphics as a diagnostic tool for assessing disease severity in schizophrenia patients, based on the reduced influence of top-down cognitive processes on their interpretation of bottom-up sensory input. Our procedure uses the hollow-mask illusion, in which the concave side of the mask is misperceived as convex, because familiarity with convex faces dominates sensory cues signaling a concave mask. It is known that schizophrenia patients resist this illusion and that their resistance increases with illness severity. Our method uses virtual masks rendered with two competing textures: (a) realistic features that enhance the illusion; (b) random-dot visual noise that reduces the illusion. We control the relative weights of the two textures to obtain psychometric functions for controls and patients and assess illness severity. The primary novelty is the use of a rotating mask that is easy to implement on a wide variety of portable devices and avoids the elaborate stereoscopic devices that have been used in the past. Thus our method, which can also be used to assess the efficacy of treatments, gives clinicians the advantage of being able to bring the test to the patient's own environment, instead of having to bring patients to the clinic.

  5. Unraveling the Web of Viroinformatics: Computational Tools and Databases in Virus Research

    PubMed Central

    Priyadarshini, Pragya; Vrati, Sudhanshu

    2014-01-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain—viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. PMID:25428870

  6. Cone beam computed tomography (CBCT) as a tool for the analysis of nonhuman skeletal remains in a medico-legal setting.

    PubMed

    Lucena, Joaquin; Mora, Esther; Rodriguez, Lucia; Muñoz, Mariela; Cantin, Mario G; Fonseca, Gabriel M

    2016-09-01

    Confirming the nature and forensic significance of questioned skeletal material submitted to a medico-legal setting is a relatively common procedure, although not without difficulties when the remains are fragmented or burned. Different methodologies have been described for this purpose, many of them invasive, time- and money-consuming, or dependent on the availability of the analytical instrument. We present a case in which skeletal material with unusual conditions of preservation and a curious discovery history was sent to a medico-legal setting to determine its human/nonhuman origin. A combined strategy of imaging procedures (macroscopic, radiographic and cone beam computed tomography (CBCT) technology) was performed as a non-invasive and rapid approach to assess the nonhuman nature of the material, specifically its pig (Sus scrofa) origin. This hypothesis was later confirmed by DNA analysis. CBCT data sets provide accurate three-dimensional reconstructions, which demonstrates their reliable use as a forensic tool.

  7. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  8. Computer-Assisted Mathematics: An Investigation of the Effectiveness of the Computer Used as a Tool to Learn Mathematics.

    ERIC Educational Resources Information Center

    Hatfield, Larry Lee

    Reported are the results of an investigation of the effects of programing a computer in a seventh grade mathematics class. Two treatments were conducted during two successive years. The students in the treatment group used the programing language BASIC to write computer algorithms following supplemental instruction. The mathematical content was…

  9. Innovation Configuration Mapping as a Professional Development Tool: The Case of One-to-One Laptop Computing

    ERIC Educational Resources Information Center

    Towndrow, Phillip A.; Fareed, Wan

    2015-01-01

    This article illustrates how findings from a study of teachers' and students' uses of laptop computers in a secondary school in Singapore informed the development of an Innovation Configuration (IC) Map--a tool for identifying and describing alternative ways of implementing innovations based on teachers' unique feelings, preoccupations, thoughts…

  10. The Use of Interactive Computer Animations Based on POE as a Presentation Tool in Primary Science Teaching

    ERIC Educational Resources Information Center

    Akpinar, Ercan

    2014-01-01

    This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…

  11. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    NASA Astrophysics Data System (ADS)

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  12. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    SciTech Connect

    Habib, Salman; Roser, Robert

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  13. UniDrug-Target: A Computational Tool to Identify Unique Drug Targets in Pathogenic Bacteria

    PubMed Central

    Chanumolu, Sree Krishna; Rout, Chittaranjan; Chauhan, Rajinder S.

    2012-01-01

    Background Targeting conserved proteins of bacteria through antibacterial medications has resulted in both the development of resistant strains and changes to human health by destroying beneficial microbes, which eventually become breeding grounds for the evolution of resistances. Despite the availability of more than 800 genome sequences, 430 pathways, 4743 enzymes, 9257 metabolic reactions and protein three-dimensional (3D) structures in bacteria, no pathogen-specific computational drug target identification tool has been developed. Methods A web server, UniDrug-Target, which combines bacterial biological information and computational methods to stringently identify pathogen-specific proteins as drug targets, has been designed. Besides predicting pathogen-specific proteins' essentiality, chokepoint properties, etc., three new algorithms were developed and implemented using protein sequences, domains, structures, and metabolic reactions for the construction of partial metabolic networks (PMNs), the determination of conservation in critical residues, and variation analysis of residues forming similar cavities in protein sequences. First, PMNs are constructed to determine the extent of disturbance in metabolite production caused by targeting a protein as a drug target. Second, conservation of the pathogen-specific protein's critical residues involved in cavity formation and biological function is determined at the domain level against low-matching sequences. Last, variation analysis of residues forming similar cavities in protein sequences from pathogenic versus non-pathogenic bacteria and humans is performed. Results The server is capable of predicting drug targets for any sequenced pathogenic bacterium with FASTA sequences and annotated information. The utility of the UniDrug-Target server was demonstrated for Mycobacterium tuberculosis (H37Rv). UniDrug-Target identified 265 mycobacteria pathogen-specific proteins, including 17 essential proteins which can be potential drug targets. Conclusions

  14. GMXPBSA 2.1: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2015-01-01

    GMXPBSA 2.1 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, to automatically calculate binding free energies for protein-protein or ligand-protein complexes [R.T. Bradshaw et al., Protein Eng. Des. Sel. 24 (2011) 197-207]. GMXPBSA 2.1 is flexible, can easily be customized to specific needs, and is an improvement over the previous GMXPBSA 2.0 [C. Paissoni et al., Comput. Phys. Commun. (2014), 185, 2920-2929]. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. Calculations require only protein-protein or protein-ligand MD simulations. GMXPBSA 2.1 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes and comparison of the results with the wild-type system. Moreover, it compares the binding free energy of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinity to the same receptor utilizing MD simulations of proteins in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including the molecular mechanics (MM) term, the electrostatic contribution to solvation (PB) and the nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulation software GROMACS [S. Pronk et al., Bioinformatics 29 (2013) 845-854] and the Poisson-Boltzmann equation solver APBS [N.A. Baker et al., Proc. Natl. Acad. Sci. U.S.A 98 (2001) 10037-10041]. All the calculations can be performed in single or distributed automatic fashion on a cluster facility in order to increase the
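
    The bookkeeping behind the MM/PBSA estimate itself, averaging the molecular-mechanics, polar-solvation and nonpolar-solvation terms over trajectory frames and subtracting receptor and ligand from the complex, can be sketched in Python. The per-frame numbers below are invented for illustration; GMXPBSA obtains these terms from GROMACS and APBS.

      import numpy as np

      def mmpbsa_free_energy(e_mm, g_pb, g_sa):
          """Average effective free energy of one species over trajectory frames."""
          return np.mean(np.asarray(e_mm) + np.asarray(g_pb) + np.asarray(g_sa))

      # Hypothetical per-frame energies (kJ/mol) for three frames of each species
      complex_G = mmpbsa_free_energy([-5200, -5185, -5210], [-940, -950, -945], [62, 61, 63])
      receptor_G = mmpbsa_free_energy([-3900, -3890, -3905], [-720, -725, -722], [48, 47, 49])
      ligand_G = mmpbsa_free_energy([-1150, -1145, -1155], [-260, -258, -262], [18, 18, 19])

      dG_bind = complex_G - receptor_G - ligand_G
      print(f"MM/PBSA binding free energy estimate: {dG_bind:.1f} kJ/mol")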

  15. Comparative Analysis of the Testis and Ovary Transcriptomes in Zebrafish by Combining Experimental and Computational Tools

    PubMed Central

    Li, Yang; Chia, Jer Ming; Bartfai, Richard; Christoffels, Alan; Yue, Gen Hua; Ding, Ke; Ho, Mei Yin; Hill, James A.

    2004-01-01

    Studies on the zebrafish model have contributed to our understanding of several important developmental processes, especially those that can be easily studied in the embryo. However, our knowledge on late events such as gonad differentiation in the zebrafish is still limited. Here we provide an analysis on the gene sets expressed in the adult zebrafish testis and ovary in an attempt to identify genes with potential role in (zebra)fish gonad development and function. We produced 10 533 expressed sequence tags (ESTs) from zebrafish testis or ovary and downloaded an additional 23 642 gonad-derived sequences from the zebrafish EST database. We clustered these sequences together with over 13 000 kidney-derived zebrafish ESTs to study partial transcriptomes for these three organs. We searched for genes with gonad-specific expression by screening macroarrays containing at least 2600 unique cDNA inserts with testis-, ovary- and kidney-derived cDNA probes. Clones hybridizing to only one of the two gonad probes were selected, and subsequently screened with computational tools to identify 72 genes with potentially testis-specific and 97 genes with potentially ovary-specific expression, respectively. PCR-amplification confirmed gonad-specificity for 21 of the 45 clones tested (all without known function). Our study, which involves over 47 000 EST sequences and specialized cDNA arrays, is the first analysis of adult organ transcriptomes of zebrafish at such a scale. The study of genes expressed in adult zebrafish testis and ovary will provide useful information on regulation of gene expression in teleost gonads and might also contribute to our understanding of the development and differentiation of reproductive organs in vertebrates. PMID:18629171

  16. Computational tools for calculating alternative muscle force patterns during motion: a comparison of possible solutions.

    PubMed

    Martelli, Saulo; Calvetti, Daniela; Somersalo, Erkki; Viceconti, Marco; Taddei, Fulvia

    2013-08-09

    Comparing the available electromyography (EMG) and the related uncertainties with the space of muscle forces potentially driving the same motion can provide insights into understanding human motion in healthy and pathological neuromotor conditions. However, it is not clear how effective the available computational tools are at completely sampling the possible muscle forces. In this study, we compared the effectiveness of Metabolica and the Null-Space algorithm at generating a comprehensive spectrum of possible muscle forces for a representative motion frame. The hip force peak during a selected walking trial was identified using a lower-limb musculoskeletal model. The joint moments, the muscle lever arms, and the muscle force constraints extracted from the model constituted the indeterminate equilibrium equation at the joints. Two spectra, each containing 200,000 muscle force samples, were calculated using Metabolica and the Null-Space algorithm. The full hip force range was calculated using optimization and compared with the hip force ranges derived from the Metabolica and the Null-Space spectra. The Metabolica spectrum spanned a much larger force range than the NS spectrum, reaching an 811 N difference for the gluteus maximus intermediate bundle. The Metabolica hip force range exhibited a 0.3-0.4 BW error on the upper and lower boundaries of the full hip force range (3.4-11.3 BW), whereas the full range was imposed in the NS spectrum. The results suggest that Metabolica is well suited to exhaustively sampling the spectrum of possible muscle recruitment strategies. Future studies will investigate the muscle force range in healthy and pathological neuromotor conditions.
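
    The indeterminacy that both tools explore arises because the joint-moment equilibrium R f = M has more muscles than equations. A minimal Python sketch of null-space sampling of admissible force vectors is given below; the moment arms, moments and force bounds are toy values, not the lower-limb model used in the study.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy problem: 2 joint moments, 5 muscles
      R = np.array([[0.05, 0.04, 0.00, 0.03, 0.02],    # moment arms (m), joint 1
                    [0.00, 0.02, 0.05, 0.01, 0.04]])   # moment arms (m), joint 2
      M = np.array([60.0, 45.0])                       # required joint moments (N m)
      f_max = np.array([2000.0, 1500.0, 1800.0, 1200.0, 1600.0])   # max muscle forces (N)

      f_particular, *_ = np.linalg.lstsq(R, M, rcond=None)   # minimum-norm exact solution
      _, _, Vt = np.linalg.svd(R)
      null_basis = Vt[R.shape[0]:].T                   # columns span the null space of R

      samples = []
      for _ in range(20000):
          f = f_particular + null_basis @ rng.normal(scale=500.0, size=null_basis.shape[1])
          if np.all(f >= 0.0) and np.all(f <= f_max):  # keep physiologically admissible forces
              samples.append(f)

      samples = np.array(samples)
      print(f"kept {len(samples)} admissible force patterns")
      print("per-muscle force range (N):", samples.min(axis=0).round(), samples.max(axis=0).round())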

  17. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    PubMed Central

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
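
    The filtering-and-ratio step that such scripts automate is easy to picture in Python with pandas. The column names and gating thresholds below are assumptions for illustration; actual COPAS export headers and sensible gates depend on the instrument configuration and the experiment.

      import pandas as pd

      # Assumed columns: TOF (object size), EXT (extinction), GFP (green fluorescence)
      data = pd.DataFrame({
          "TOF": [410, 95, 620, 530, 880, 40],
          "EXT": [150, 20, 260, 210, 350, 10],
          "GFP": [1200, 60, 4100, 2500, 9800, 15],
      })

      # Gate out debris and non-worm objects by size, as COPAquant-style filtering would
      gated = data[(data["TOF"] >= 100) & (data["TOF"] <= 1000)].copy()

      # Normalize fluorescence to object size, a commonly used COPAS-derived ratio
      gated["GFP_per_TOF"] = gated["GFP"] / gated["TOF"]

      print(gated[["TOF", "GFP", "GFP_per_TOF"]].describe().loc[["mean", "std"]])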

  18. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant, i.e., 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water

  19. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    NASA Astrophysics Data System (ADS)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2016-08-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing) that students apply to study how lakes around the globe are experiencing the effects of climate change. In the module, students develop hypotheses about the effects of different climate scenarios on lakes and then test their hypotheses using hundreds of model simulations. We taught the module in a 4-hour workshop and found that participation in the module significantly increased both undergraduate and graduate students' understanding about climate change effects on lakes. Moreover, participation in the module also significantly increased students' perceived experience level in using different software, technologies, and modeling tools. By embedding modeling in an environmental science context, non-computer science students were able to successfully use and master technologies that they had previously never been exposed to. Overall, our findings suggest that modeling is a powerful tool for catalyzing student learning on the effects of climate change.

  1. GMXPBSA 2.0: A GROMACS tool to perform MM/PBSA and computational alanine scanning

    NASA Astrophysics Data System (ADS)

    Paissoni, C.; Spiliotopoulos, D.; Musco, G.; Spitaleri, A.

    2014-11-01

    GMXPBSA 2.0 is a user-friendly suite of Bash/Perl scripts for streamlining MM/PBSA calculations on structural ensembles derived from GROMACS trajectories, to automatically calculate binding free energies for protein-protein or ligand-protein complexes. GMXPBSA 2.0 is flexible and can easily be customized to specific needs. Additionally, it performs computational alanine scanning (CAS) to study the effects of ligand and/or receptor alanine mutations on the free energy of binding. Calculations require only protein-protein or protein-ligand MD simulations. GMXPBSA 2.0 performs different comparative analyses, including a posteriori generation of alanine mutants of the wild-type complex, calculation of the binding free energy values of the mutant complexes, and comparison of the results with the wild-type system. Moreover, it compares the binding free energies of different complex trajectories, allowing the study of the effects of non-alanine mutations, post-translational modifications, or unnatural amino acids on the binding free energy of the system under investigation. Finally, it can calculate and rank relative affinities to the same receptor utilizing MD simulations of the protein in complex with different ligands. In order to dissect the different MM/PBSA energy contributions, including the molecular mechanics (MM) term, the electrostatic contribution to solvation (PB), and the nonpolar contribution to solvation (SA), the tool combines two freely available programs: the MD simulation software GROMACS and the Poisson-Boltzmann equation solver APBS. All the calculations can be performed in a single or distributed automatic fashion on a cluster facility in order to speed up the calculation by dividing frames across the available processors. The program is freely available under the GPL license. Catalogue identifier: AETQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AETQ_v1_0.html. Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland. Licensing
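
    For orientation, the sketch below shows the kind of MM/PBSA bookkeeping such a workflow performs once per-frame energy terms are available: the binding energy of each frame is the complex energy minus the receptor and ligand energies, summed over the MM, PB, and SA contributions, and then averaged over the ensemble. The frame data and function names are hypothetical; GMXPBSA itself obtains these terms by driving GROMACS and APBS.

        # Conceptual MM/PBSA averaging over an ensemble of frames (illustration only;
        # GMXPBSA itself obtains these energy terms by running GROMACS and APBS).
        from statistics import mean

        def frame_dg(complex_e, receptor_e, ligand_e):
            """Binding energy of one frame from per-species MM, PB and SA terms (kJ/mol)."""
            terms = ("MM", "PB", "SA")
            return sum(complex_e[t] - receptor_e[t] - ligand_e[t] for t in terms)

        def ensemble_dg(frames):
            """Average binding free energy over all frames of a trajectory."""
            return mean(frame_dg(c, r, l) for c, r, l in frames)

        # Hypothetical two-frame example: each entry is (complex, receptor, ligand).
        frames = [
            ({"MM": -1500.0, "PB": 900.0, "SA": -20.0},
             {"MM": -1000.0, "PB": 650.0, "SA": -12.0},
             {"MM": -420.0,  "PB": 310.0, "SA": -6.0}),
            ({"MM": -1510.0, "PB": 905.0, "SA": -21.0},
             {"MM": -1005.0, "PB": 648.0, "SA": -12.5},
             {"MM": -418.0,  "PB": 312.0, "SA": -6.2}),
        ]
        print(f"<dG_bind> = {ensemble_dg(frames):.1f} kJ/mol")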

  2. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  3. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    ERIC Educational Resources Information Center

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  4. Population Dynamics P system (PDP) models: a standardized protocol for describing and applying novel bio-inspired computing tools.

    PubMed

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J

    2013-01-01

    Today, the volume of data and knowledge of processes necessitates more complex models that integrate all available information. This challenge has been met thanks to technological advances in both software and hardware. The computational tools available today have allowed the development of a new family of models, known as computational models. The description of these models is difficult because they cannot be expressed analytically, and it is therefore necessary to create protocols that serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modeling complex problems; they are characterized by the ability to work in parallel (simultaneously interrelating different processes), are modular, and have high computational efficiency. However, the difficulty of describing these models requires a protocol to unify their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for population dynamics and ecological process studies, discussing briefly their potential applicability to simulate complex ecosystem dynamics.
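
    As a loose illustration of the rule-based, simultaneous updating that such models formalize, the toy sketch below applies simple survival and reproduction rules to two interacting populations in discrete time. It is not a P system and none of the rates come from the paper; all values are assumptions chosen for readability.

        # Toy synchronous population update loosely inspired by rule-based models
        # such as PDP systems (illustration only; species and rates are hypothetical).

        RULES = {
            # species: (survival rate, offspring per surviving adult)
            "prey":     (0.80, 1.5),
            "predator": (0.90, 0.2),
        }

        def step(pop):
            """Apply all rules simultaneously to produce the next generation."""
            new = {}
            for species, (survival, fecundity) in RULES.items():
                adults = pop[species] * survival
                new[species] = adults * (1.0 + fecundity)
            # Simple coupling: predators remove prey in proportion to their numbers.
            new["prey"] = max(new["prey"] - 0.05 * new["predator"], 0.0)
            return new

        pop = {"prey": 1000.0, "predator": 50.0}
        for year in range(5):
            pop = step(pop)
            print(year + 1, {k: round(v, 1) for k, v in pop.items()})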

  5. Improved mathematical and computational tools for modeling photon propagation in tissue

    NASA Astrophysics Data System (ADS)

    Calabro, Katherine Weaver

    Light interacts with biological tissue through two predominant mechanisms: scattering and absorption, which are sensitive to the size and density of cellular organelles, and to biochemical composition (e.g., hemoglobin), respectively. During the progression of disease, tissues undergo a predictable set of changes in cell morphology and vascularization, which directly affect their scattering and absorption properties. Hence, quantification of these optical property differences can be used to identify physiological biomarkers of disease, with interest often focused on cancer. Diffuse reflectance spectroscopy is a diagnostic tool wherein broadband visible light is transmitted through a fiber optic probe into a turbid medium, and after propagating through the sample, a fraction of the light is collected at the surface as reflectance. The measured reflectance spectrum can be analyzed with appropriate mathematical models to extract the optical properties of the tissue, and from these, a set of physiological properties. A number of models have been developed for this purpose using a variety of approaches, from diffusion theory to computational simulations and empirical observations. However, these models are generally limited to narrow ranges of tissue and probe geometries. In this thesis, reflectance models were developed for a much wider range of measurement parameters, and influences such as the scattering phase function and probe design were investigated rigorously for the first time. The results provide a comprehensive understanding of the factors that influence reflectance, with novel insights that, in some cases, challenge current assumptions in the field. An improved Monte Carlo simulation program, designed to run on a graphics processing unit (GPU), was built to simulate the data used in the development of the reflectance models. Rigorous error analysis was performed to identify how inaccuracies in modeling assumptions can be expected to affect the accuracy
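
    The sketch below illustrates the basic Monte Carlo photon transport idea underlying such simulations: photons take exponentially distributed steps, lose weight to absorption, and change direction at scattering events. The optical coefficients are assumed values, the geometry is reduced to one dimension, and anisotropic phase functions (a focus of the thesis) are omitted; nothing here reflects the GPU code described above.

        # Heavily simplified Monte Carlo photon random walk in a semi-infinite medium
        # (illustration of the technique only; optical coefficients are assumed values).
        import math
        import random

        MU_A, MU_S = 0.1, 10.0          # absorption and scattering coefficients (1/mm), assumed
        MU_T = MU_A + MU_S              # total interaction coefficient
        ALBEDO = MU_S / MU_T            # fraction of photon weight surviving each interaction

        def diffuse_reflectance(n_photons=50_000):
            reflected = 0.0
            for _ in range(n_photons):
                z, uz, weight = 0.0, 1.0, 1.0            # start at the surface, heading inward
                while weight > 1e-3:
                    step = -math.log(1.0 - random.random()) / MU_T   # exponential free path
                    z += uz * step
                    if z < 0.0:                          # photon escaped back through the surface
                        reflected += weight
                        break
                    weight *= ALBEDO                     # deposit part of the weight as absorption
                    uz = random.uniform(-1.0, 1.0)       # isotropic scattering (anisotropy omitted)
            return reflected / n_photons

        print(f"Diffuse reflectance (toy model): {diffuse_reflectance():.3f}")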

  6. MRUniNovo: an efficient tool for de novo peptide sequencing utilizing the Hadoop distributed computing framework.

    PubMed

    Li, Chuang; Chen, Tao; He, Qiang; Zhu, Yunping; Li, Kenli

    2016-12-19

    Tandem mass spectrometry-based de novo peptide sequencing is a complex and time-consuming process. The current algorithms for de novo peptide sequencing cannot rapidly and thoroughly process large mass spectrometry datasets. In this paper, we propose MRUniNovo, a novel tool for parallel de novo peptide sequencing. MRUniNovo parallelizes UniNovo based on the Hadoop compute platform. Our experimental results demonstrate that MRUniNovo significantly reduces the computation time of de novo peptide sequencing without sacrificing the correctness and accuracy of the results, and thus can process very large datasets that UniNovo cannot.
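
    Conceptually, the parallelization follows a split-and-merge pattern: spectra are partitioned across workers, each worker sequences its share independently, and the results are merged. The sketch below mimics that pattern with Python multiprocessing in place of Hadoop; the spectrum format and the placeholder scoring function are assumptions, not MRUniNovo's interfaces.

        # Conceptual map/reduce-style parallelization of per-spectrum work
        # (Python multiprocessing stands in for Hadoop; all data are placeholders).
        from multiprocessing import Pool

        def sequence_spectrum(spectrum):
            """Placeholder for de novo sequencing of one spectrum: return (id, best peptide)."""
            spec_id, peaks = spectrum
            return spec_id, "PEPTIDE"          # a real engine would search peptide candidates here

        def run(spectra, workers=4):
            with Pool(workers) as pool:
                results = pool.map(sequence_spectrum, spectra)   # "map" phase over spectra
            return dict(results)                                 # "reduce": collect results by id

        if __name__ == "__main__":
            fake_spectra = [(i, [(100.1, 5.0), (200.2, 3.0)]) for i in range(8)]
            print(run(fake_spectra))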

  7. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from advanced computer animation software and 3D technologies, which make such documentaries even more attractive. However, special care must be taken to guarantee that the information they contain is serious and objective. In this sense, additional value is gained when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been created entirely by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  8. Web-Based Architecture to Enable Compute-Intensive CAD Tools and Multi-user Synchronization in Teleradiology

    NASA Astrophysics Data System (ADS)

    Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin

    Teleradiology is the electronic transmission of radiological patient images, such as x-rays, CT, or MR, across multiple locations. The goal could be interpretation, consultation, or medical record keeping. Information technology solutions have enabled electronic records, and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces and computer-assisted diagnostic (CAD) tools have yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications, and case studies are presented here.

  9. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processor systems, on practically every system), has been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared-memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP. We are using a set of scientific applications, each with its own challenges, both to drive further improvements to the applications themselves and to understand the shortcomings of Eclipse PTP from an application developer's perspective, which in turn shapes the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher-quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  10. Technical Evaluation Report on the Flight Mechanics Panel Symposium on the Use of Computers as a Design Tool.

    DTIC Science & Technology

    1980-03-01

    Technical evaluation report on the Flight Mechanics Panel Symposium on the Use of Computers as a Design Tool, prepared by Professor Dr.-Ing. Siegfried N. Wagner, Luftfahrttechnik, Hochschule der Bundeswehr München, Werner-Heisenberg-Weg 39, 8014 Neubiberg, Germany, March 1980.

  11. MLP Tools: a PyMOL plugin for using the molecular lipophilicity potential in computer-aided drug design

    NASA Astrophysics Data System (ADS)

    Oberhauser, Nils; Nurisso, Alessandra; Carrupt, Pierre-Alain

    2014-05-01

    The molecular lipophilicity potential (MLP) is a well-established method to calculate and visualize lipophilicity on molecules. We here introduce a new computational tool named MLP Tools, written in the programming language Python and conceived as a free plugin for the popular open source molecular viewer PyMOL. The plugin is divided into several sub-programs which allow the visualization of the MLP on molecular surfaces, as well as in three-dimensional space, in order to analyze the lipophilic properties of binding pockets. The sub-program Log MLP also implements the virtual log P, which allows the prediction of octanol/water partition coefficients for multiple three-dimensional conformations of the same molecule. An implementation of the recently introduced MLP GOLD procedure, which improves GOLD docking performance in hydrophobic pockets, is also part of the plugin. In this article, all functions of MLP Tools are described through a few chosen examples.
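
    For context, a molecular lipophilicity potential is commonly evaluated by summing atomic fragmental lipophilicity contributions weighted by a distance-dependent function. The sketch below uses an exponential weighting and made-up atomic contributions purely for illustration; it does not reproduce the parameterization used by MLP Tools.

        # Generic MLP evaluation at a query point (illustration only; the fragmental
        # contributions and the exponential distance function are assumptions, not
        # the parameterization used by MLP Tools).
        import math

        # Hypothetical atoms: (x, y, z, fragmental lipophilicity contribution f_i)
        atoms = [
            (0.0, 0.0, 0.0, 0.52),   # e.g. an aliphatic carbon
            (1.4, 0.0, 0.0, -0.30),  # e.g. a polar oxygen
            (2.1, 1.1, 0.0, 0.15),
        ]

        def mlp(point, atoms, decay=1.0):
            """MLP(point) = sum_i f_i * exp(-decay * d_i)."""
            total = 0.0
            for x, y, z, f in atoms:
                d = math.dist(point, (x, y, z))
                total += f * math.exp(-decay * d)
            return total

        print(f"MLP at (1.0, 0.5, 0.5): {mlp((1.0, 0.5, 0.5), atoms):+.3f}")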

  12. Integration of computational analysis as a sentinel tool in toxicological assessments.

    PubMed

    Pearl, G M; Livingston-Carr, S; Durham, S K

    2001-09-01

    Computational toxicity modeling can have a significant impact on the drug discovery process, especially when utilized as a sentinel filter for common drug safety liabilities such as mutagenicity, carcinogenicity, and teratogenicity. This review will focus on the strengths and limitations of current computational models for predicting these drug safety liabilities, and on the various strategies for incorporating these predictive models into the drug discovery process.

  13. Computer-Based Language Tools for the Teaching of Language for a Special Purpose.

    ERIC Educational Resources Information Center

    Cooks, Maria; Henstock, Peter

    This paper describes the development and use of a computer aided instruction (CAI) software program for the teaching of Spanish for special purposes at Purdue University in West Lafayette, Indiana. The program is designed to: (1) motivate students to use the computer by making it a non-threatening medium through individualization of the learning…

  14. Embodying Computational Thinking: Initial Design of an Emerging Technological Learning Tool

    ERIC Educational Resources Information Center

    Daily, Shaundra B.; Leonard, Alison E.; Jörg, Sophie; Babu, Sabarish; Gundersen, Kara; Parmar, Dhaval

    2015-01-01

    This emerging technology report describes virtual environment interactions, an approach for blending movement and computer programming as an embodied way to support girls in building computational thinking skills. The authors seek to understand how body syntonicity might enable young learners to bootstrap their intuitive knowledge in order to…

  15. The Application of a Computer Algebra System as a Tool in College Algebra.

    ERIC Educational Resources Information Center

    Mayes, Robert L.

    1995-01-01

    Students (n=61) in an experimental course stressing active student involvement and the use of a computer algebra system scored higher than students (n=76) in a traditional college algebra course on final measures of inductive reasoning, visualization, and problem solving while maintaining equivalent manipulation and computation skills. (Author/MLB)

  16. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices that students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  17. Computational Aero-acoustics As a Tool For Turbo-machinery Noise Reduction

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.

    2003-01-01

    This talk will provide an overview of the field of computational aero-acoustics and its use in fan noise prediction. After a brief history of computational fluid dynamics, some of the recent developments in computational aero-acoustics will be explored. Computational issues concerning sound wave production, propagation, and reflection in practical turbo-machinery applications will be discussed, including: (a) high order/high resolution numerical techniques; (b) high resolution boundary conditions; (c) MIMD parallel computing; and (d) the form of the governing equations useful for simulations. In addition, the basic design of our Broadband Analysis Stator Simulator (BASS) code and its application to a 2-D rotor wake-stator interaction will be shown. An example of the noise produced by the wakes from a rotor impinging upon a stator cascade will be shown.
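
    As a small, generic example of the "high order/high resolution" ingredient listed in (a), the sketch below advects a one-dimensional Gaussian pulse with a fourth-order central difference in space and classical RK4 in time. The grid, CFL number, and pulse are arbitrary choices for illustration and have no connection to the BASS code.

        # Fourth-order central differences in space with classical RK4 in time for the
        # 1-D linear advection equation u_t + c u_x = 0 (a generic high-order scheme
        # shown for illustration; it is not related to the BASS code mentioned above).
        import math

        N, C = 200, 1.0
        dx = 1.0 / N
        dt = 0.5 * dx / C                      # CFL number 0.5, stable for RK4 + central differences

        def rhs(u):
            """Spatial operator -c * du/dx using a 4th-order central stencil, periodic BCs."""
            return [-C * (-u[(i + 2) % N] + 8 * u[(i + 1) % N]
                          - 8 * u[(i - 1) % N] + u[(i - 2) % N]) / (12 * dx)
                    for i in range(N)]

        def rk4_step(u):
            k1 = rhs(u)
            k2 = rhs([u[i] + 0.5 * dt * k1[i] for i in range(N)])
            k3 = rhs([u[i] + 0.5 * dt * k2[i] for i in range(N)])
            k4 = rhs([u[i] + dt * k3[i] for i in range(N)])
            return [u[i] + dt / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]) for i in range(N)]

        u = [math.exp(-300.0 * (i * dx - 0.3) ** 2) for i in range(N)]   # Gaussian acoustic pulse
        for _ in range(200):                   # advect the pulse by 200 * c * dt = 0.5
            u = rk4_step(u)

        peak = max(u)
        print(f"Peak {peak:.3f} near x = {u.index(peak) * dx:.2f} (started at x = 0.30)")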

  18. Self port scanning tool: providing a more secure computing environment through the use of proactive port scanning

    NASA Technical Reports Server (NTRS)

    Kocher, Joshua E; Gilliam, David P.

    2005-01-01

    Secure computing is a necessity in the hostile environment that the internet has become. Protection from nefarious individuals and organizations requires a solution that is more a methodology than a one-time fix. One aspect of this methodology is knowing which network ports a computer has open to the world; these network ports are essentially the doorways from the internet into the computer. An assessment method which uses the nmap software to scan ports has been developed to aid System Administrators (SAs) with analysis of open ports on their system(s). Additionally, baselines for several operating systems have been developed so that SAs can compare their open ports to a baseline for a given operating system. Further, the tool is deployed on a website where SAs and users can request a port scan of their computer; the results are then emailed to the requestor. This tool aids users, SAs, and security professionals by providing an overall picture of what services are running, what ports are open, potential trojan programs or backdoors, and which ports can be closed.
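
    The sketch below illustrates the baseline-comparison idea described above: run a basic nmap scan, parse the open TCP ports, and report deviations from a stored baseline. The baseline contents and target host are placeholders, and the script is not the tool described in the record.

        # Compare a host's open TCP ports (as reported by nmap) against a baseline set.
        # Illustration of the approach only; the baseline values here are placeholders.
        import re
        import subprocess

        BASELINE = {22, 80, 443}               # hypothetical "expected open" ports for this host

        def open_ports(host):
            """Run a basic nmap TCP scan and parse lines such as '22/tcp open ssh'."""
            out = subprocess.run(["nmap", host], capture_output=True, text=True, check=True).stdout
            return {int(m.group(1)) for m in re.finditer(r"^(\d+)/tcp\s+open", out, re.MULTILINE)}

        def report(host):
            found = open_ports(host)
            print("Unexpected open ports:", sorted(found - BASELINE) or "none")
            print("Expected but closed:  ", sorted(BASELINE - found) or "none")

        if __name__ == "__main__":
            report("127.0.0.1")                # scan only hosts you are authorized to scan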

  19. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentic, computer-simulated case studies of biohazard-exposed patients. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  20. Method for Determining Language Objectives and Criteria. Volume II. Methodological Tools: Computer Analysis, Data Collection Instruments.

    DTIC Science & Technology

    1979-05-25

    This volume presents (1) methods for computer and hand analysis of numerical language performance data (with examples), and (2) samples of the interview, observation, and survey instruments used in collecting language data. (Author)