Sample records for accurate computational tools

  1. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background: The replication rate (or fitness) of viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various viral fitness experiments. Results: Based on a mathematical model and several statistical methods (a least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions: Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to estimate relative viral fitness parameters more accurately. A dilution factor is introduced to make the computational tool flexible enough to accommodate various experimental conditions. This Web-based tool is implemented in C# with Microsoft ASP.NET and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  2. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) of viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various viral fitness experiments. Based on a mathematical model and several statistical methods (a least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to estimate relative viral fitness parameters more accurately. A dilution factor is introduced to make the computational tool flexible enough to accommodate various experimental conditions. This Web-based tool is implemented in C# with Microsoft ASP.NET and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
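
    As a rough illustration of the regression idea in the abstract above (not the vFitness implementation itself, whose mathematical model, dilution-factor handling, and measurement-error corrections are more involved), fitting the log ratio of the two variants' abundances against time uses all observed time points and yields a relative-fitness estimate from the slope. The function name and the data below are hypothetical.

```python
import numpy as np

def relative_fitness(times, mutant, wildtype):
    """Estimate relative fitness from a growth competition assay.

    Fits log(mutant/wildtype) against time by least squares; the slope is
    the net growth-rate difference, so exp(slope) acts as a per-unit-time
    relative fitness. A simplified sketch, not the vFitness model.
    """
    y = np.log(np.asarray(mutant, float) / np.asarray(wildtype, float))
    slope, intercept = np.polyfit(np.asarray(times, float), y, 1)
    return slope, np.exp(slope)

# Hypothetical abundances (e.g., copies/mL) at several time points (days).
times = [0, 2, 4, 6, 8]
mutant = [1.0e3, 2.2e3, 4.9e3, 1.1e4, 2.4e4]
wildtype = [1.0e3, 1.8e3, 3.1e3, 5.6e3, 1.0e4]
d, w = relative_fitness(times, mutant, wildtype)
print(f"growth-rate difference = {d:.3f} per day, relative fitness = {w:.3f}")
```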

  3. Simulation of DKIST solar adaptive optics system

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Carlisle, Elizabeth; Schmidt, Dirk

    2016-07-01

    Solar adaptive optics (AO) simulations are a valuable tool to guide the design and optimization process of current and future solar AO and multi-conjugate AO (MCAO) systems. Solar AO and MCAO systems rely on extended object cross-correlating Shack-Hartmann wavefront sensors to measure the wavefront. Accurate solar AO simulations require computationally intensive operations, which have until recently presented a prohibitive computational cost. We present an update on the status of a solar AO and MCAO simulation tool being developed at the National Solar Observatory. The simulation tool is a multi-threaded application written in the C++ language that takes advantage of current large multi-core CPU computer systems and fast Ethernet connections to provide accurate full simulation of solar AO and MCAO systems. It interfaces with KAOS, a state-of-the-art solar AO control software package developed by the Kiepenheuer-Institut fuer Sonnenphysik, which provides reliable AO control. We report on the latest results produced by the solar AO simulation tool.

  4. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    ERIC Educational Resources Information Center

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  5. CFD research, parallel computation and aerodynamic optimization

    NASA Technical Reports Server (NTRS)

    Ryan, James S.

    1995-01-01

    Over five years of research in Computational Fluid Dynamics and its applications is covered in this report. Using CFD as an established tool, aerodynamic optimization on parallel architectures is explored. The objective of this work is to provide better tools to vehicle designers. Submarine design requires accurate force and moment calculations in flow with thick boundary layers and large separated vortices. Low noise production is critical, so flow into the propulsor region must be predicted accurately. The High Speed Civil Transport (HSCT) has been the subject of recent work. This vehicle is to be a passenger aircraft capable of cutting overseas flight times by more than half. A successful design must surpass the performance of comparable planes. Fuel economy, other operational costs, environmental impact, and range must all be improved substantially. For all these reasons, improved design tools are required, and these tools must eventually integrate optimization, external aerodynamics, propulsion, structures, heat transfer and other disciplines.

  6. Development of a computer code for calculating the steady super/hypersonic inviscid flow around real configurations. Volume 1: Computational technique

    NASA Technical Reports Server (NTRS)

    Marconi, F.; Salas, M.; Yaeger, L.

    1976-01-01

    A numerical procedure has been developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second-order accurate finite difference scheme is used to integrate the three-dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.

  7. Improving Fidelity of Launch Vehicle Liftoff Acoustic Simulations

    NASA Technical Reports Server (NTRS)

    Liever, Peter; West, Jeff

    2016-01-01

    Launch vehicles experience high acoustic loads during ignition and liftoff affected by the interaction of rocket plume generated acoustic waves with launch pad structures. Highly parallelized Computational Fluid Dynamics (CFD) analysis tools optimized for the NAS computer systems, such as the Loci/CHEM program, now enable simulation of time-accurate, turbulent, multi-species plume formation and interaction with launch pad geometry, and capture the generation of acoustic noise at the source regions in the plume shear layers and impingement regions. These CFD solvers are robust in capturing the acoustic fluctuations, but they are too dissipative to accurately resolve the propagation of the acoustic waves throughout the launch environment domain along the vehicle. A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed to improve such liftoff acoustic environment predictions. The framework combines the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin (DG) solver, Loci/THRUST, developed in the same computational framework. Loci/THRUST employs a low dissipation, high-order, unstructured DG method to accurately propagate acoustic waves away from the source regions across large distances. The DG solver is currently capable of solving up to 4th order solutions for non-linear, conservative acoustic field propagation. Higher order boundary conditions are implemented to accurately model the reflection and refraction of acoustic waves on launch pad components. The DG solver accepts generalized unstructured meshes, enabling efficient application of common mesh generation tools for CHEM and THRUST simulations. The DG solution is coupled with the CFD solution at interface boundaries placed near the CFD acoustic source regions. Both simulations are executed simultaneously with coordinated boundary condition data exchange.

  8. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  9. An Interactive, Versatile, Three-Dimensional Display, Manipulation and Plotting System for Biomedical Research

    ERIC Educational Resources Information Center

    Feldmann, Richard J.; And Others

    1972-01-01

    Computer graphics provides a valuable tool for the representation and a better understanding of structures, both small and large. Accurate and rapid construction, manipulation, and plotting of structures, such as macromolecules as complex as hemoglobin, are performed by a collection of computer programs and a time-sharing computer. (21 references)…

  10. An implicit higher-order spatially accurate scheme for solving time dependent flows on unstructured meshes

    NASA Astrophysics Data System (ADS)

    Tomaro, Robert F.

    1998-07-01

    The present research is aimed at developing a higher-order, spatially accurate scheme for both steady and unsteady flow simulations using unstructured meshes. The resulting scheme must work on a variety of general problems to ensure the creation of a flexible, reliable and accurate aerodynamic analysis tool. To calculate the flow around complex configurations, unstructured grids and the associated flow solvers have been developed. Efficient simulations require the minimum use of computer memory and computational times. Unstructured flow solvers typically require more computer memory than a structured flow solver due to the indirect addressing of the cells. The approach taken in the present research was to modify an existing three-dimensional unstructured flow solver to first decrease the computational time required for a solution and then to increase the spatial accuracy. The terms required to simulate flow involving non-stationary grids were also implemented. First, an implicit solution algorithm was implemented to replace the existing explicit procedure. Several test cases, including internal and external, inviscid and viscous, two-dimensional, three-dimensional and axisymmetric problems, were simulated for comparison between the explicit and implicit solution procedures. The increased efficiency and robustness of the modified code due to the implicit algorithm were demonstrated. Two unsteady test cases, a plunging airfoil and a wing undergoing bending and torsion, were simulated using the implicit algorithm modified to include the terms required for a moving and/or deforming grid. Second, a higher than second-order spatially accurate scheme was developed and implemented into the baseline code. Third- and fourth-order spatially accurate schemes were implemented and tested. The original dissipation was modified to include higher-order terms and modified near shock waves to limit pre- and post-shock oscillations. The unsteady cases were repeated using the higher-order spatially accurate code. The new solutions were compared with those obtained using the second-order spatially accurate scheme. Finally, the increased efficiency of using an implicit solution algorithm in a production Computational Fluid Dynamics flow solver was demonstrated for steady and unsteady flows. A third- and fourth-order spatially accurate scheme has been implemented, creating a basis for a state-of-the-art aerodynamic analysis tool.

  11. SLUG - stochastically lighting up galaxies - III. A suite of tools for simulated photometry, spectroscopy, and Bayesian inference with stochastic stellar populations

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan

    2015-09-01

    Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly-enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.

  12. Representing nursing guideline with unified modeling language to facilitate development of a computer system: a case study.

    PubMed

    Choi, Jeeyae; Choi, Jeungok E

    2014-01-01

    To provide the best recommendations at the point of care, guidelines have been implemented in computer systems. As a prerequisite, guidelines are translated into a computer-interpretable guideline format. Since there are no specific tools to translate nursing guidelines, only a few nursing guidelines have been translated and implemented in computer systems. The Unified Modeling Language (UML) is a software modeling language known to represent end users' perspectives well and accurately, owing to its expressive characteristics. In order to facilitate the development of computer systems for nurses' use, the UML was used to translate a paper-based nursing guideline, and its ease of use and usefulness were tested through a case study of a genetic counseling guideline. The UML was found to be a useful tool for nurse informaticians and a sufficient tool for modeling a guideline in a computer program.

  13. Development of a computer code for calculating the steady super/hypersonic inviscid flow around real configurations. Volume 2: Code description

    NASA Technical Reports Server (NTRS)

    Marconi, F.; Yaeger, L.

    1976-01-01

    A numerical procedure was developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second-order accurate finite difference scheme is used to integrate the three-dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.

  14. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  15. Computer-aided design/computer-aided manufacturing skull base drill.

    PubMed

    Couldwell, William T; MacDonald, Joel D; Thomas, Charles L; Hansen, Bradley C; Lapalikar, Aniruddha; Thakkar, Bharat; Balaji, Alagar K

    2017-05-01

    The authors have developed a simple device for computer-aided design/computer-aided manufacturing (CAD-CAM) that uses an image-guided system to define a cutting tool path that is shared with a surgical machining system for drilling bone. Information from 2D images (obtained via CT and MRI) is transmitted to a processor that produces a 3D image. The processor generates code defining an optimized cutting tool path, which is sent to a surgical machining system that can drill the desired portion of bone. This tool has applications for bone removal in both cranial and spine neurosurgical approaches. Such applications have the potential to reduce surgical time and associated complications such as infection or blood loss. The device enables rapid removal of bone within 1 mm of vital structures. The validity of such a machining tool is exemplified in the rapid (< 3 minutes machining time) and accurate removal of bone for transtemporal (for example, translabyrinthine) approaches.

  16. Efficient hybrid-symbolic methods for quantum mechanical calculations

    NASA Astrophysics Data System (ADS)

    Scott, T. C.; Zhang, Wenxing

    2015-06-01

    We present hybrid symbolic-numerical tools to generate optimized numerical code for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis set calculation with a variational principle applied to its linear and non-linear parameters.

  17. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material property tailorability. To fully exploit the capability of composites, it is required to know the load carrying capacity of the parts made of them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them a hard problem to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize the behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately. This reduces the cost only to the associated computational expenses, yielding significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: failure progression predicted by the virtual tool must be the same as that observed in experiments. A tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of virtual tools are the savings in time and money, and hence computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions including static, dynamic and fatigue conditions. A good virtual testing tool should be able to make good predictions for all these different loading conditions. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing the simulations against experiments for a selected number of quasi-static loading cases.

  18. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing computing efficiency at present. In order to solve this problem, a highly accurate interpolation method for NURBS tool paths is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real-time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion that considers multiple constraints and scheduling errors via an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfactory computing efficiency.
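
    As a rough numerical illustration of the feedrate-fluctuation problem discussed above (not the authors' quartic formulation), the sketch below compares the classic first-order parameter increment with one obtained by numerically enforcing the commanded chord length per interpolation period on a hypothetical parametric curve standing in for a NURBS segment.

```python
import numpy as np

# Hypothetical planar parametric curve C(u), u in [0, 1], standing in for a
# NURBS segment; the interpolation ideas carry over unchanged.
def C(u):
    return np.array([60.0 * u, 25.0 * np.sin(2.0 * np.pi * u)])

def C_prime(u, h=1e-6):
    return (C(u + h) - C(u - h)) / (2.0 * h)   # central-difference derivative

F, Ts = 100.0, 0.002          # commanded feedrate [mm/s], interpolation period [s]
target = F * Ts               # desired chord length per period [mm]
u = 0.3

# First-order increment du = F*Ts / |C'(u)| (classic Taylor interpolation).
du1 = target / np.linalg.norm(C_prime(u))
fluct1 = (np.linalg.norm(C(u + du1) - C(u)) - target) / target

# Refined increment: solve |C(u + du) - C(u)| = F*Ts by bisection, which
# drives the chord-length (hence feedrate) error toward zero.
lo, hi = 0.0, 2.0 * du1
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if np.linalg.norm(C(u + mid) - C(u)) < target:
        lo = mid
    else:
        hi = mid
du2 = 0.5 * (lo + hi)
fluct2 = (np.linalg.norm(C(u + du2) - C(u)) - target) / target

print(f"first-order: du = {du1:.6e}, feedrate fluctuation = {fluct1:.3e}")
print(f"refined:     du = {du2:.6e}, feedrate fluctuation = {fluct2:.3e}")
```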

  19. SURE reliability analysis: Program and mathematics

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; White, Allan L.

    1988-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  20. An Accurate and Computationally Efficient Model for Membrane-Type Circular-Symmetric Micro-Hotplates

    PubMed Central

    Khan, Usman; Falconi, Christian

    2014-01-01

    Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation tools for the optimization. In order to circumvent these issues, here we propose a model for practical circular-symmetric micro-hotplates which takes advantage of modified Bessel functions, a computationally efficient matrix approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and an external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated with the undesired heating in the electrical contacts, are small (e.g., a few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214
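
    For intuition about the modified-Bessel-function and matrix ingredients mentioned above (a heavily simplified sketch, not the authors' full model with Joule heating, radiation linearization, and external-region segmentation): in an unheated annular membrane region with linearized heat loss to ambient, the steady temperature rise obeys a modified Bessel equation, and the two boundary conditions give a small linear system for the coefficients. All parameter values below are assumed for illustration.

```python
import numpy as np
from scipy.special import i0, k0

# Assumed parameters for an annular membrane region between the heater edge
# r_h and the membrane rim r_m, with linearized losses to ambient.
k_mem, t_mem, h_loss = 30.0, 1.0e-6, 200.0     # W/(m K), m, W/(m^2 K)
r_h, r_m = 100e-6, 500e-6                      # m
T_heater = 500.0                               # temperature rise at heater edge [K]

lam = np.sqrt(k_mem * t_mem / h_loss)          # thermal decay length [m]

# General solution of the homogeneous problem: T(r) = A*I0(r/lam) + B*K0(r/lam).
# Boundary conditions T(r_h) = T_heater and T(r_m) = 0 give a 2x2 linear system.
M = np.array([[i0(r_h / lam), k0(r_h / lam)],
              [i0(r_m / lam), k0(r_m / lam)]])
A, B = np.linalg.solve(M, np.array([T_heater, 0.0]))

for r in np.linspace(r_h, r_m, 5):
    T = A * i0(r / lam) + B * k0(r / lam)
    print(f"r = {r * 1e6:7.1f} um   temperature rise = {T:7.1f} K")
```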

  1. Quokka: a comprehensive tool for rapid and accurate prediction of kinase family-specific phosphorylation sites in the human proteome.

    PubMed

    Li, Fuyi; Li, Chen; Marquez-Lago, Tatiana T; Leier, André; Akutsu, Tatsuya; Purcell, Anthony W; Smith, A Ian; Lithgow, Trevor; Daly, Roger J; Song, Jiangning; Chou, Kuo-Chen

    2018-06-27

    Kinase-regulated phosphorylation is a ubiquitous type of post-translational modification (PTM) in both eukaryotic and prokaryotic cells. Phosphorylation plays fundamental roles in many signalling pathways and biological processes, such as protein degradation and protein-protein interactions. Experimental studies have revealed that signalling defects caused by aberrant phosphorylation are highly associated with a variety of human diseases, especially cancers. In light of this, a number of computational methods aiming to accurately predict protein kinase family-specific or kinase-specific phosphorylation sites have been established, thereby facilitating phosphoproteomic data analysis. In this work, we present Quokka, a novel bioinformatics tool that allows users to rapidly and accurately identify human kinase family-regulated phosphorylation sites. Quokka was developed by using a variety of sequence scoring functions combined with an optimized logistic regression algorithm. We evaluated Quokka based on well-prepared up-to-date benchmark and independent test datasets, curated from the Phospho.ELM and UniProt databases, respectively. The independent test demonstrates that Quokka improves the prediction performance compared with state-of-the-art computational tools for phosphorylation prediction. In summary, our tool provides users with high-quality predicted human phosphorylation sites for hypothesis generation and biological validation. The Quokka webserver and datasets are freely available at http://quokka.erc.monash.edu/. Supplementary data are available at Bioinformatics online.
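
    A minimal sketch of the general approach described above, in which sequence-window features feed a logistic regression classifier; the one-hot encoding, window size, and toy data here are hypothetical placeholders, not Quokka's actual scoring functions or training sets.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot(window):
    """One-hot encode a fixed-length window centred on a candidate S/T/Y residue."""
    x = np.zeros(len(window) * len(AMINO_ACIDS))
    for pos, aa in enumerate(window):
        if aa in AA_INDEX:
            x[pos * len(AMINO_ACIDS) + AA_INDEX[aa]] = 1.0
    return x

# Hypothetical 9-residue training windows with labels (1 = phosphorylated site).
windows = ["RRRSSLREL", "AAAASAAAA", "KRKQSVELD", "GGGGSGGGG",
           "RRPSSPVTV", "LLLLSLLLL", "KKRRTLSSS", "VVVVTVVVV"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]

X = np.array([one_hot(w) for w in windows])
clf = LogisticRegression(max_iter=1000).fit(X, np.array(labels))

query = "KRRASVAGL"   # hypothetical candidate site
proba = clf.predict_proba(one_hot(query).reshape(1, -1))[0, 1]
print(f"predicted phosphorylation probability: {proba:.2f}")
```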

  2. Computational Prediction of miRNA Genes from Small RNA Sequencing Data

    PubMed Central

    Kang, Wenjing; Friedländer, Marc R.

    2015-01-01

    Next-generation sequencing now for the first time allows researchers to gauge the depth and variation of entire transcriptomes. However, now that rare transcripts present in cells at single copies can be detected, more advanced computational tools are needed to accurately annotate and profile them. microRNAs (miRNAs) are 22-nucleotide small RNAs (sRNAs) that post-transcriptionally reduce the output of protein coding genes. They have established roles in numerous biological processes, including cancers and other diseases. During miRNA biogenesis, the sRNAs are sequentially cleaved from precursor molecules that have a characteristic hairpin RNA structure. The vast majority of new miRNA genes that are discovered are mined from small RNA sequencing (sRNA-seq), which can detect more than a billion RNAs in a single run. However, given that many of the detected RNAs are degradation products from all types of transcripts, the accurate identification of miRNAs remains a non-trivial computational problem. Here, we review the tools available to predict animal miRNAs from sRNA sequencing data. We present tools for generalist and specialist use cases, including prediction from massively pooled data or in species without a reference genome. We also present wet-lab methods used to validate predicted miRNAs, and approaches to computationally benchmark prediction accuracy. For each tool, we reference validation experiments and benchmarking efforts. Last, we discuss the future of the field. PMID:25674563

  3. Flowing Hot or Cold: User-Friendly Computational Models of Terrestrial and Planetary Lava Channels and Lakes

    NASA Astrophysics Data System (ADS)

    Sakimoto, S. E. H.

    2016-12-01

    Planetary volcanism has redefined what is considered volcanism. "Magma" now may be considered to be anything from the molten rock familiar at terrestrial volcanoes to cryovolcanic ammonia-water mixes erupted on an outer solar system moon. However, even with unfamiliar compositions and source mechanisms, we find familiar landforms such as volcanic channels, lakes, flows, and domes, and thus a multitude of possibilities for modeling. As on Earth, these landforms lend themselves to analysis for estimating storage, eruption and/or flow rates. This has potential pitfalls, as extension of the simplified analytic models we often use for terrestrial features into unfamiliar parameter space might yield misleading results. Our most commonly used tools for estimating flow and cooling have tended to lag significantly behind the state of the art; the easiest methods to use are neither realistic nor accurate, but the more realistic and accurate computational methods are not simple to use. Since the latter computational tools tend to be expensive and to require a significant learning curve, there is a need for a user-friendly approach that still takes advantage of their accuracy. One method is to use the computational package to generate a server-based tool that allows less computationally inclined users to get accurate results over their range of input parameters for a given problem geometry. A second method is to use the computational package to generate a polynomial empirical solution for each class of flow geometry that can be fairly easily solved by anyone with a spreadsheet. In this study, we demonstrate both approaches for several channel flow and lava lake geometries with terrestrial and extraterrestrial examples and compare their results. Specifically, we model cooling rectangular channel flow with a yield strength material, with applications to Mauna Loa, Kilauea, Venus, and Mars. This approach also shows promise with model applications to lava lakes, magma flow through cracks, and volcanic dome formation.

  4. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost effective software. Therefore, government and industry

  5. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    PubMed

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.

  6. A total variation diminishing finite difference algorithm for sonic boom propagation models

    NASA Technical Reports Server (NTRS)

    Sparrow, Victor W.

    1993-01-01

    It is difficult to accurately model the rise phases of sonic boom waveforms with traditional finite difference algorithms because of finite difference phase dispersion. This paper introduces the concept of a total variation diminishing (TVD) finite difference method as a tool for accurately modeling the rise phases of sonic booms. A standard second order finite difference algorithm and its TVD modified counterpart are both applied to the one-way propagation of a square pulse. The TVD method clearly outperforms the non-TVD method, showing great potential as a new computational tool in the analysis of sonic boom propagation.
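
    To make the comparison concrete, here is a generic sketch of a flux-limited (TVD) scheme versus unlimited Lax-Wendroff for one-way propagation of a square pulse, using simple linear advection and a minmod limiter rather than the paper's sonic-boom propagation model; the grid and CFL settings are arbitrary.

```python
import numpy as np

def advect(u0, a, dx, dt, steps, limiter):
    """One-way linear advection u_t + a u_x = 0 (a > 0) with a flux-limited
    Lax-Wendroff scheme: limiter(r) = 1 recovers plain Lax-Wendroff
    (oscillatory), limiter(r) = clip(r, 0, 1) gives the TVD minmod scheme."""
    u = u0.copy()
    nu = a * dt / dx                       # CFL number, must satisfy nu <= 1
    for _ in range(steps):
        du = np.roll(u, -1) - u            # u_{i+1} - u_i
        du_up = u - np.roll(u, 1)          # u_i - u_{i-1}
        with np.errstate(divide="ignore", invalid="ignore"):
            r = np.where(np.abs(du) > 1e-14, du_up / du, 0.0)
        phi = limiter(r)
        flux = a * u + 0.5 * a * (1.0 - nu) * phi * du   # F_{i+1/2}
        u = u - (dt / dx) * (flux - np.roll(flux, 1))
    return u

n = 400
dx, dt, a = 1.0 / n, 0.5 / n, 1.0
x = np.arange(n) * dx
u0 = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)            # square pulse

lax_wendroff = advect(u0, a, dx, dt, 200, lambda r: np.ones_like(r))
tvd_minmod = advect(u0, a, dx, dt, 200, lambda r: np.clip(r, 0.0, 1.0))

print("overshoot, Lax-Wendroff:", lax_wendroff.max() - 1.0)          # spurious oscillations
print("overshoot, TVD minmod  :", max(tvd_minmod.max() - 1.0, 0.0))  # essentially zero
```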

  7. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  8. The SURE reliability analysis program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  9. The SURE Reliability Analysis Program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  10. An integrated computational tool for precipitation simulation

    NASA Astrophysics Data System (ADS)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer-aided materials design is of increasing interest because the conventional approach, relying solely on experimentation, is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  11. Solar Power Tower Integrated Layout and Optimization Tool

    Science.gov Websites

    methods to reduce the overall computational burden while generating accurate and precise results. These methods have been developed as part of the U.S. Department of Energy (DOE) SunShot Initiative research

  12. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  13. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty for data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once with particles in the atmosphere before reaching the receiver, and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiple scattering returns are clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat them as unwanted "noise" and to use simple multiple scattering correction schemes to remove them. Such multiple scattering treatments waste the multiple scattering signals and may cause orders of magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering computations are done with Monte Carlo simulations. Monte Carlo simulations take minutes to hours, are too slow for interactive satellite data analysis processes, and can only be used to help system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation goes to matrix inversion processes, FFTs and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversion, FFT and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.

  14. Overview of aerothermodynamic loads definition study

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    1989-01-01

    Over the years, NASA has been conducting the Advanced Earth-to-Orbit (AETO) Propulsion Technology Program to provide the knowledge, understanding, and design methodology that will allow the development of advanced Earth-to-orbit propulsion systems with high performance, extended service life, automated operations, and diagnostics for in-flight health monitoring. The objective of the Aerothermodynamic Loads Definition Study is to develop methods to more accurately predict the operating environment in AETO propulsion systems, such as the Space Shuttle Main Engine (SSME) powerhead. The approach taken consists of two parts: to modify, apply, and disseminate existing computational fluid dynamics tools in response to current needs, and to develop new technology that will enable more accurate computation of the time-averaged and unsteady aerothermodynamic loads in the SSME powerhead. The software tools are detailed. Significant progress was made in the area of turbomachinery, where there is an overlap between the AETO efforts and research in the aeronautical gas turbine field.

  15. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes are encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis), CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  16. Compression-based distance (CBD): a simple, rapid, and accurate method for microbiota composition comparison

    PubMed Central

    2013-01-01

    Background Perturbations in intestinal microbiota composition have been associated with a variety of gastrointestinal tract-related diseases. The alleviation of symptoms has been achieved using treatments that alter the gastrointestinal tract microbiota toward that of healthy individuals. Identifying differences in microbiota composition through the use of 16S rRNA gene hypervariable tag sequencing has profound health implications. Current computational methods for comparing microbial communities are usually based on multiple alignments and phylogenetic inference, making them time consuming and requiring exceptional expertise and computational resources. As sequencing data rapidly grows in size, simpler analysis methods are needed to meet the growing computational burdens of microbiota comparisons. Thus, we have developed a simple, rapid, and accurate method, independent of multiple alignments and phylogenetic inference, to support microbiota comparisons. Results We create a metric, called compression-based distance (CBD) for quantifying the degree of similarity between microbial communities. CBD uses the repetitive nature of hypervariable tag datasets and well-established compression algorithms to approximate the total information shared between two datasets. Three published microbiota datasets were used as test cases for CBD as an applicable tool. Our study revealed that CBD recaptured 100% of the statistically significant conclusions reported in the previous studies, while achieving a decrease in computational time required when compared to similar tools without expert user intervention. Conclusion CBD provides a simple, rapid, and accurate method for assessing distances between gastrointestinal tract microbiota 16S hypervariable tag datasets. PMID:23617892

  17. Compression-based distance (CBD): a simple, rapid, and accurate method for microbiota composition comparison.

    PubMed

    Yang, Fang; Chia, Nicholas; White, Bryan A; Schook, Lawrence B

    2013-04-23

    Perturbations in intestinal microbiota composition have been associated with a variety of gastrointestinal tract-related diseases. The alleviation of symptoms has been achieved using treatments that alter the gastrointestinal tract microbiota toward that of healthy individuals. Identifying differences in microbiota composition through the use of 16S rRNA gene hypervariable tag sequencing has profound health implications. Current computational methods for comparing microbial communities are usually based on multiple alignments and phylogenetic inference, making them time consuming and requiring exceptional expertise and computational resources. As sequencing data rapidly grows in size, simpler analysis methods are needed to meet the growing computational burdens of microbiota comparisons. Thus, we have developed a simple, rapid, and accurate method, independent of multiple alignments and phylogenetic inference, to support microbiota comparisons. We create a metric, called compression-based distance (CBD) for quantifying the degree of similarity between microbial communities. CBD uses the repetitive nature of hypervariable tag datasets and well-established compression algorithms to approximate the total information shared between two datasets. Three published microbiota datasets were used as test cases for CBD as an applicable tool. Our study revealed that CBD recaptured 100% of the statistically significant conclusions reported in the previous studies, while achieving a decrease in computational time required when compared to similar tools without expert user intervention. CBD provides a simple, rapid, and accurate method for assessing distances between gastrointestinal tract microbiota 16S hypervariable tag datasets.
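
    A rough sketch of the compression-based idea described above, using gzip and the standard normalized compression distance as a stand-in; the published CBD metric, its compressor, and its normalization may differ in detail, and the toy "tag datasets" are hypothetical.

```python
import gzip

def compressed_size(data: bytes) -> int:
    return len(gzip.compress(data, compresslevel=9))

def compression_distance(a: str, b: str) -> float:
    """Normalized compression distance between two sequence collections
    (e.g., concatenated 16S hypervariable tags); smaller means more shared
    information. A simple stand-in for the published CBD metric."""
    ca, cb = compressed_size(a.encode()), compressed_size(b.encode())
    cab = compressed_size((a + b).encode())
    return (cab - min(ca, cb)) / max(ca, cb)

# Hypothetical toy tag datasets: A and B share most reads, C is distinct.
sample_a = "ACGTACGTGGCCAA" * 200 + "TTGACCAGT" * 50
sample_b = "ACGTACGTGGCCAA" * 180 + "TTGACCAGT" * 80
sample_c = "GGATCCTTAGCAAC" * 200 + "CCCGTATTA" * 50

print("d(A, B) =", round(compression_distance(sample_a, sample_b), 3))
print("d(A, C) =", round(compression_distance(sample_a, sample_c), 3))
```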

  18. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  19. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variety of experimental data sets, such as UH-60A data, DNW test data and HART II test data.

  20. PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier, PredictSNP, resulting in significantly improved prediction performance while returning results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
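
    A minimal sketch of a consensus classifier in the spirit described above, using simple majority voting over individual tool calls; PredictSNP's actual combination weights tools by confidence and was trained on the curated datasets, so the function and the per-tool predictions below are purely illustrative.

```python
from collections import Counter

def consensus_predict(tool_predictions):
    """Majority-vote consensus over individual tool calls for one mutation.

    tool_predictions maps tool name -> 'deleterious' / 'neutral' / None
    (None means the tool returned no result). Returns the consensus label
    and the fraction of voting tools that agreed with it.
    """
    votes = Counter(p for p in tool_predictions.values() if p is not None)
    if not votes:
        return "unknown", 0.0
    label, count = votes.most_common(1)[0]
    return label, count / sum(votes.values())

# Hypothetical per-tool calls for a single amino acid substitution.
predictions = {
    "MAPP": "deleterious",
    "PhD-SNP": "deleterious",
    "PolyPhen-2": "deleterious",
    "SIFT": "neutral",
    "SNAP": "deleterious",
    "PANTHER": None,            # no prediction returned
}
label, agreement = consensus_predict(predictions)
print(f"consensus: {label} (agreement {agreement:.0%})")
```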

  1. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately resolve stochastic problems with limited smoothness, even those containing discontinuities, in high-dimensional spaces.

  2. Examining ion channel properties using free-energy methods.

    PubMed

    Domene, Carmen; Furini, Simone

    2009-01-01

    Recent advances in structural biology have revealed the architecture of a number of transmembrane channels, allowing these complex biological systems to be understood in atomistic detail. Computational simulations are a powerful tool by which the dynamic and energetic properties, and thereby the function, of these protein architectures can be investigated. The experimentally observable properties of a system are often determined more by energetics than by dynamics, and therefore understanding the underlying free energy (FE) of biophysical processes is of crucial importance. Critical to the accurate evaluation of FE values are the problems of obtaining accurate sampling of complex biological energy landscapes, and of obtaining accurate representations of the potential energy of a system, this latter problem having been addressed through the development of molecular force fields. While these challenges are common to all FE methods, depending on the system under study, and the questions being asked of it, one technique for FE calculation may be preferable to another, the choice of method and simulation protocol being crucial to achieve efficiency. Applied in a correct manner, FE calculations represent a predictive and affordable computational tool with which to make relevant contact with experiments. This chapter, therefore, aims to give an overview of the most widely implemented computational methods used to calculate the FE associated with particular biochemical or biophysical events, and to highlight their recent applications to ion channels. Copyright © 2009 Elsevier Inc. All rights reserved.

  3. Physics-based subsurface visualization of human tissue.

    PubMed

    Sharp, Richard; Adams, Jacob; Machiraju, Raghu; Lee, Robert; Crane, Robert

    2007-01-01

    In this paper, we present a framework for simulating light transport in three-dimensional tissue with inhomogeneous scattering properties. Our approach employs a computational model to simulate light scattering in tissue through the finite element solution of the diffusion equation. Although our model handles both visible and nonvisible wavelengths, we especially focus on the interaction of near infrared (NIR) light with tissue. Since most human tissue is permeable to NIR light, tools to noninvasively image tumors, blood vasculature, and monitor blood oxygenation levels are being constructed. We apply this model to a numerical phantom to visually reproduce the images generated by these real-world tools. Therefore, in addition to enabling inverse design of detector instruments, our computational tools produce physically-accurate visualizations of subsurface structures.

  4. An Upgrade of the Aeroheating Software ''MINIVER''

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce

    2013-01-01

    Detailed computational modeling: CFD is often used to create and execute computational domains. Complexity increases when moving from 2D to 3D geometries, and computational time increases as finer grids are used (accuracy). A strong tool, but it takes time to set up and run. MINIVER: uses theoretical and empirical correlations. Orders of magnitude faster to set up and run. Not as accurate as CFD, but gives reasonable estimations. MINIVER's drawbacks: rigid command-line interface; lackluster, unorganized documentation; no central control, so multiple versions exist and have diverged.

  5. Dynamic modelling of an adsorption storage tank using a hybrid approach combining computational fluid dynamics and process simulation

    USGS Publications Warehouse

    Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.

    2004-01-01

    A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and be responsible for data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two. © 2004 Elsevier Ltd. All rights reserved.

  6. Do dichromats see colours in this way? Assessing simulation tools without colorimetric measurements.

    PubMed

    Lillo Jover, Julio A; Álvaro Llorente, Leticia; Moreira Villegas, Humberto; Melnikova, Anna

    2016-11-01

    Simulcheck evaluates Colour Simulation Tools (CSTs), which transform colours to mimic those seen by people with colour vision deficiencies. Two CSTs (Variantor and Coblis) were used to determine whether the standard Simulcheck version (direct measurement based, DMB) can be substituted by another (RGB values based) that does not require sophisticated measurement instruments. Ten normal trichromats performed the two psychophysical tasks included in the Simulcheck method. The Pseudoachromatic Stimuli Identification task provided the h_uv (hue angle) values of the pseudoachromatic stimuli: colours seen as red or green by normal trichromats but as grey by colour-deficient people. The Minimum Achromatic Contrast task was used to compute the L_R (relative luminance) values of the pseudoachromatic stimuli. The Simulcheck DMB version showed that Variantor was accurate in simulating protanopia, but neither Variantor nor Coblis was accurate in simulating deuteranopia. The Simulcheck RGB version provided accurate h_uv values, so this variable can be adequately estimated without a colorimeter (an expensive and unusual apparatus). In contrast, the inaccuracy of the L_R estimations provided by the Simulcheck RGB version makes it advisable to compute this variable from measurements performed with a photometer, a cheap and readily available apparatus.
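
    One way h_uv can be obtained from RGB values alone is to assume a calibrated sRGB display with a D65 white point, convert to CIE 1976 u'v' chromaticities and take the angle around the white point. The sketch below follows that assumption; it is not necessarily the exact procedure used by the Simulcheck RGB version.

        import math

        # Hypothetical h_uv (hue angle) estimate from 8-bit RGB values, assuming an
        # sRGB display and the D65 white point (u'_n, v'_n) = (0.1978, 0.4683).
        def srgb_to_linear(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        def hue_angle_uv(r, g, b):
            R, G, B = (srgb_to_linear(v) for v in (r, g, b))
            X = 0.4124 * R + 0.3576 * G + 0.1805 * B     # sRGB -> CIE XYZ (D65)
            Y = 0.2126 * R + 0.7152 * G + 0.0722 * B
            Z = 0.0193 * R + 0.1192 * G + 0.9505 * B
            d = X + 15.0 * Y + 3.0 * Z
            u, v = 4.0 * X / d, 9.0 * Y / d              # CIE 1976 u', v'
            un, vn = 0.1978, 0.4683                      # D65 white point
            return math.degrees(math.atan2(v - vn, u - un)) % 360.0

        print(round(hue_angle_uv(200, 60, 60), 1))       # a reddish stimulus, small hue angle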

  7. Development of a fourth generation predictive capability maturity model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel

    2013-09-01

    The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal of providing more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.

  8. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is a simple and the most commonly used microscopic method for observing nonstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light enough for efficient analysis of a large number of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. Compared with commonly used segmentation approaches, MSER required negligible preoptimization, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of the collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions, and due to their relatively light computational requirements they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
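
    A minimal sketch of the detection-plus-tracking idea is shown below, using OpenCV's MSER detector and a constant-velocity Kalman filter. The noise covariances, the single-object nearest-neighbour association and the grayscale frame list are assumptions for illustration, not the authors' pipeline.

        import cv2
        import numpy as np

        # Sketch: detect blob-like regions with MSER and track one centroid across
        # frames with a constant-velocity Kalman filter.  'frames' is assumed to be a
        # list of grayscale images; parameters are illustrative only.
        def detect_centroids(gray):
            mser = cv2.MSER_create()
            regions, _ = mser.detectRegions(gray)
            return [pts.mean(axis=0) for pts in regions]          # (x, y) per region

        def make_tracker(x0, y0):
            kf = cv2.KalmanFilter(4, 2)                           # state: x, y, vx, vy
            kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                            [0, 1, 0, 1],
                                            [0, 0, 1, 0],
                                            [0, 0, 0, 1]], np.float32)
            kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                             [0, 1, 0, 0]], np.float32)
            kf.processNoiseCov = 1e-2 * np.eye(4, dtype=np.float32)
            kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
            kf.statePost = np.array([[x0], [y0], [0], [0]], np.float32)
            return kf

        def track(frames):
            first = detect_centroids(frames[0])                   # assumes at least one region
            kf = make_tracker(*first[0])                          # follow the first detected blob
            path = []
            for frame in frames[1:]:
                pred = kf.predict()[:2].ravel()
                cands = detect_centroids(frame)
                if cands:                                         # nearest-neighbour data association
                    meas = min(cands, key=lambda c: np.hypot(*(c - pred)))
                    kf.correct(np.array(meas, np.float32).reshape(2, 1))
                path.append(kf.statePost[:2].ravel().copy())
            return path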

  9. Neural Network Design on the SRC-6 Reconfigurable Computer

    DTIC Science & Technology

    2006-12-01

    fingerprint identification. In this field, automatic identification methods are used to save time, especially for the purpose of fingerprint matching in...grid widths and lengths and therefore was useful in producing an accurate canvas with which to create sample training images. The added benefit of...tools available free of charge and readily accessible on the computer, it was simple to design bitmap data files visually on a canvas and then

  10. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems accounting for biochemical reactions coupled to contaminant transport is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
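
    As a toy example of the analytical end of that spectrum, the sketch below evaluates the steady one-dimensional advection solution with first-order biodegradation. The parameter values are arbitrary and the expression is a simplified textbook solution, not one of the models discussed in the chapter.

        import math

        # Toy steady-state solution of 1-D advection with first-order biodegradation:
        #   v * dC/dx = -k * C   =>   C(x) = C0 * exp(-k * x / v)
        # Parameter values below are arbitrary illustrations, not site data.
        C0 = 10.0      # source concentration, mg/L
        v = 0.1        # groundwater seepage velocity, m/day
        k = 0.005      # first-order decay rate, 1/day

        for x in (0.0, 10.0, 50.0, 100.0):       # distance downgradient, m
            C = C0 * math.exp(-k * x / v)
            print(f"x = {x:6.1f} m   C = {C:7.3f} mg/L")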

  11. Computed Tomography (CT) Imaging of Injuries from Blunt Abdominal Trauma: A Pictorial Essay.

    PubMed

    Hassan, Radhiana; Abd Aziz, Azian

    2010-04-01

    Blunt abdominal trauma can cause multiple internal injuries. However, these injuries are often difficult to accurately evaluate, particularly in the presence of more obvious external injuries. Computed tomography (CT) imaging is currently used to assess clinically stable patients with blunt abdominal trauma. CT can provide a rapid and accurate appraisal of the abdominal viscera, retroperitoneum and abdominal wall, as well as a limited assessment of the lower thoracic region and bony pelvis. This paper presents examples of various injuries in trauma patients depicted in abdominal CT images. We hope these images provide a resource for radiologists, surgeons and medical officers, as well as a learning tool for medical students.

  12. CAD-RADS - a new clinical decision support tool for coronary computed tomography angiography.

    PubMed

    Foldyna, Borek; Szilveszter, Bálint; Scholtz, Jan-Erik; Banerji, Dahlia; Maurovich-Horvat, Pál; Hoffmann, Udo

    2018-04-01

    Coronary computed tomography angiography (CTA) has been established as an accurate method to non-invasively assess coronary artery disease (CAD). The proposed 'Coronary Artery Disease Reporting and Data System' (CAD-RADS) may enable standardised reporting of the broad spectrum of coronary CTA findings related to the presence, extent and composition of coronary atherosclerosis. The CAD-RADS classification is a comprehensive tool for summarising findings on a per-patient-basis dependent on the highest-grade coronary artery lesion, ranging from CAD-RADS 0 (absence of CAD) to CAD-RADS 5 (total occlusion of a coronary artery). In addition, it provides suggestions for clinical management for each classification, including further testing and therapeutic options. Despite some limitations, CAD-RADS may facilitate improved communication between imagers and patient caregivers. As such, CAD-RADS may enable a more efficient use of coronary CTA leading to more accurate utilisation of invasive coronary angiograms. Furthermore, widespread use of CAD-RADS may facilitate registry-based research of diagnostic and prognostic aspects of CTA. • CAD-RADS is a tool for standardising coronary CTA reports. • CAD-RADS includes clinical treatment recommendations based on CTA findings. • CAD-RADS has the potential to reduce variability of CTA reports.

  13. Active Control of Fan Noise: Feasibility Study. Volume 5; Numerical Computation of Acoustic Mode Reflection Coefficients for an Unflanged Cylindrical Duct

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.

    1996-01-01

    A computational method to predict modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated source noise, acoustic propagation, ANC actuator coupling, and control system algorithm simulation. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid computation design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as providing a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used as a preliminary to more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable data to use for comparison is scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.

  14. Past, present and prospect of an Artificial Intelligence (AI) based model for sediment transport prediction

    NASA Astrophysics Data System (ADS)

    Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher

    2016-10-01

    An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing an accurate model. These approaches are considered a powerful tool for solving any non-linear model, as they can deal easily with a large number of data and sophisticated models. This paper is a review of all AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, such complementary models have proven superior to classical modelling approaches.

  15. Proposed method of producing large optical mirrors: Single-point diamond crushing followed by polishing with a small-area tool

    NASA Technical Reports Server (NTRS)

    Wright, G.; Bryan, J. B.

    1986-01-01

    Faster production of large optical mirrors may result from combining single-point diamond crushing of the glass with polishing using a small area tool to smooth the surface and remove the damaged layer. Diamond crushing allows a surface contour accurate to 0.5 microns to be generated, and the small area computer-controlled polishing tool allows the surface roughness to be removed without destroying the initial contour. Final contours with an accuracy of 0.04 microns have been achieved.

  16. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dennis L.

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  17. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  18. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that a decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., smaller integration and roundoff errors) over a Kalman filter.

  19. SSME main combustion chamber and nozzle flowfield analysis

    NASA Technical Reports Server (NTRS)

    Farmer, R. C.; Wang, T. S.; Smith, S. D.; Prozan, R. J.

    1986-01-01

    An investigation is presented of the computational fluid dynamics (CFD) tools which would accurately analyze main combustion chamber and nozzle flow. The importance of combustion phenomena and local variations in mixture ratio is fully appreciated; however, the computational aspects of the gas dynamics involved were the sole issues addressed. The CFD analyses made are first compared with conventional nozzle analyses to determine the accuracy for steady flows, and then transient analyses are discussed.

  20. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
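
    The symmetric rank-one update mentioned above builds a Hessian approximation from successive gradient differences instead of explicit Hessian evaluations. A minimal sketch of the update, with the standard skip safeguard and an illustrative quadratic test, is given below; it is not the article's full RBDO algorithm.

        import numpy as np

        # Symmetric rank-one (SR1) Hessian approximation update.  The skip rule and
        # tolerance are the standard textbook safeguard, shown for illustration only.
        def sr1_update(B, s, y, tol=1e-8):
            """B: current Hessian approximation; s = x_new - x_old; y = g_new - g_old."""
            r = y - B @ s
            denom = r @ s
            if abs(denom) < tol * np.linalg.norm(s) * np.linalg.norm(r):
                return B                          # skip the update to preserve stability
            return B + np.outer(r, r) / denom

        # Example on a quadratic with exact Hessian H: one update injects curvature info.
        H = np.array([[4.0, 1.0], [1.0, 3.0]])
        grad = lambda x: H @ x
        x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
        B = np.eye(2)
        B = sr1_update(B, x1 - x0, grad(x1) - grad(x0))
        print(B)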

  1. Hyperbolic heat conduction problems involving non-Fourier effects - Numerical simulations via explicit Lax-Wendroff/Taylor-Galerkin finite element formulations

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Namburu, Raju R.

    1989-01-01

    Numerical simulations are presented for hyperbolic heat-conduction problems that involve non-Fourier effects, using explicit Lax-Wendroff/Taylor-Galerkin FEM formulations as the principal computational tool. Also employed are smoothing techniques which stabilize the numerical noise and accurately predict the propagating thermal disturbances. The accurate capture of propagating thermal disturbances at characteristic time-step values is achieved; numerical test cases are presented which validate the proposed hyperbolic heat-conduction problem concepts.
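
    For readers unfamiliar with the underlying time marching, the classical finite-difference Lax-Wendroff scheme for the linear advection equation u_t + a u_x = 0 is sketched below; the Taylor-Galerkin finite-element formulations used in the paper generalize this explicit second-order idea. The grid, Courant number and initial pulse are arbitrary.

        import numpy as np

        # Classical Lax-Wendroff update for u_t + a u_x = 0 on a periodic grid:
        #   u_i^{n+1} = u_i - (c/2)(u_{i+1} - u_{i-1}) + (c^2/2)(u_{i+1} - 2 u_i + u_{i-1}),
        # with c = a*dt/dx the Courant number.  Parameters are illustrative only.
        nx, c = 200, 0.8
        x = np.linspace(0.0, 1.0, nx, endpoint=False)
        u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)      # initial square pulse

        for _ in range(100):
            up, um = np.roll(u, -1), np.roll(u, 1)         # u_{i+1}, u_{i-1}
            u = u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2.0 * u + um)

        print(round(u.max(), 3), round(u.min(), 3))        # pulse advected, small dispersive ripples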

  2. Dissecting innate immune responses with the tools of systems biology.

    PubMed

    Smith, Kelly D; Bolouri, Hamid

    2005-02-01

    Systems biology strives to derive accurate predictive descriptions of complex systems such as innate immunity. The innate immune system is essential for host defense, yet the resulting inflammatory response must be tightly regulated. Current understanding indicates that this system is controlled by complex regulatory networks, which maintain homoeostasis while accurately distinguishing pathogenic infections from harmless exposures. Recent studies have used high throughput technologies and computational techniques that presage predictive models and will be the foundation of a systems level understanding of innate immunity.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattsson, Ann E.

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  4. A Computational and Experimental Study of Slit Resonators

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W.; Ju, H.; Jones, M. G.; Watson, W. R.; Parrott, T. L.

    2003-01-01

    Computational and experimental studies are carried out to offer validation of the results obtained from direct numerical simulation (DNS) of the flow and acoustic fields of slit resonators. The test cases include slits with 90-degree corners and slits with 45-degree bevel angle housed inside an acoustic impedance tube. Three slit widths are used. Six frequencies from 0.5 to 3.0 kHz are chosen. Good agreement is found between computed and measured reflection factors. In addition, incident sound waves having white noise spectrum and a prescribed pseudo-random noise spectrum are used in subsequent series of tests. The computed broadband results are again found to agree well with experimental data. It is believed the present results provide strong support that DNS can eventually be a useful and accurate prediction tool for liner aeroacoustics. The usage of DNS as a design tool is discussed and illustrated by a simple example.

  5. A SCILAB Program for Computing General-Relativistic Models of Rotating Neutron Stars by Implementing Hartle's Perturbation Method

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We apply Hartle's perturbation method to the computation of relativistic rigidly rotating neutron star models. The program has been written in SCILAB (© INRIA ENPC), a matrix-oriented high-level programming language. The numerical method is described in great detail and is applied to many models in slow or fast rotation. We show that, although the method is perturbative, it gives accurate results for all practical purposes and it should prove an efficient tool for computing rapidly rotating pulsars.

  6. Adaptable Interactive CBL Design Tools for Education.

    ERIC Educational Resources Information Center

    Chandra, Peter

    The design team approach to the development of computer based learning (CBL) courseware relies heavily on the effective communication between different members of the team, including up-to-date paperwork and documentation. This is important for the accurate and efficient overall coordination of the courseware design, and for future maintenance of…

  7. Microwave Workshop for Windows.

    ERIC Educational Resources Information Center

    White, Colin

    1998-01-01

    "Microwave Workshop for Windows" consists of three programs that act as teaching aid and provide a circuit design utility within the field of microwave engineering. The first program is a computer representation of a graphical design tool; the second is an accurate visual and analytical representation of a microwave test bench; the third…

  8. Introduction to Forensics and the Use of the Helix Free Forensic Tool

    DTIC Science & Technology

    2012-01-01

    computer system belongs to and his personal activities, interests, and hobbies. An example presented in the paper was that pedophiles might keep...digital records like pictures or video of their delinquent activities. As we mentioned before, we must keep an accurate record of our investigation

  9. Methods for Efficiently and Accurately Computing Quantum Mechanical Free Energies for Enzyme Catalysis.

    PubMed

    Kearns, F L; Hudson, P S; Boresch, S; Woodcock, H L

    2016-01-01

    Enzyme activity is inherently linked to free energies of transition states, ligand binding, protonation/deprotonation, etc.; these free energies, and thus enzyme function, can be affected by residue mutations, allosterically induced conformational changes, and much more. Therefore, being able to predict free energies associated with enzymatic processes is critical to understanding and predicting their function. Free energy simulation (FES) has historically been a computational challenge as it requires both the accurate description of inter- and intramolecular interactions and adequate sampling of all relevant conformational degrees of freedom. The hybrid quantum mechanical/molecular mechanical (QM/MM) framework is the current tool of choice when accurate computations of macromolecular systems are essential. Unfortunately, robust and efficient approaches that employ the high levels of computational theory needed to accurately describe many reactive processes (i.e., ab initio, DFT), while also including explicit solvation effects and accounting for extensive conformational sampling, are essentially nonexistent. In this chapter, we will give a brief overview of two recently developed methods that mitigate several major challenges associated with QM/MM FES: the QM non-Boltzmann Bennett's acceptance ratio method and the QM nonequilibrium work method. We will also describe usage of these methods to calculate free energies associated with (1) relative properties and (2) reaction paths, using simple test cases with relevance to enzymes. © 2016 Elsevier Inc. All rights reserved.

  10. Towards a C2 Poly-Visualization Tool: Leveraging the Power of Social-Network Analysis and GIS

    DTIC Science & Technology

    2011-06-01

    from Magsino.14 AutoMap, a product of CASOS at Carnegie Mellon University, is a text-mining tool that enables the extraction of network data from...enables community leaders to prepare for biological attacks using computational models. BioWar is a CASOS package that combines many factors into a...models, demographically accurate agent modes, wind dispersion models, and an error-diagnostic model. Construct, also developed by CASOS , is a

  11. An optimization approach for fitting canonical tensor decompositions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunlavy, Daniel M.; Acar, Evrim; Kolda, Tamara Gibson

    Tensor decompositions are higher-order analogues of matrix decompositions and have proven to be powerful tools for data analysis. In particular, we are interested in the canonical tensor decomposition, otherwise known as the CANDECOMP/PARAFAC decomposition (CPD), which expresses a tensor as the sum of component rank-one tensors and is used in a multitude of applications such as chemometrics, signal processing, neuroscience, and web analysis. The task of computing the CPD, however, can be difficult. The typical approach is based on alternating least squares (ALS) optimization, which can be remarkably fast but is not very accurate. Previously, nonlinear least squares (NLS) methods have also been recommended; existing NLS methods are accurate but slow. In this paper, we propose the use of gradient-based optimization methods. We discuss the mathematical calculation of the derivatives and further show that they can be computed efficiently, at the same cost as one iteration of ALS. Computational experiments demonstrate that the gradient-based optimization methods are much more accurate than ALS and orders of magnitude faster than NLS.
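
    The gradients are straightforward to state for a third-order tensor: with factor matrices A, B, C and residual R equal to the CP reconstruction minus the data, the gradient with respect to A is the contraction of R with B and C (and similarly for the other modes). The sketch below fits a small synthetic rank-2 tensor by plain gradient descent; the problem size, step size and iteration count are illustrative, and the paper's solvers use more sophisticated updates.

        import numpy as np

        # Plain gradient descent on the CP (CANDECOMP/PARAFAC) factor matrices of a
        # small synthetic 3-way tensor.  Sizes, rank, step size and iteration count
        # are illustrative only.
        rng = np.random.default_rng(0)
        I, J, K, R = 6, 5, 4, 2
        A0 = rng.standard_normal((I, R))
        B0 = rng.standard_normal((J, R))
        C0 = rng.standard_normal((K, R))
        X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)          # noise-free rank-R tensor

        A = rng.standard_normal((I, R))
        B = rng.standard_normal((J, R))
        C = rng.standard_normal((K, R))
        step = 0.02
        for _ in range(3000):
            resid = np.einsum('ir,jr,kr->ijk', A, B, C) - X
            gA = np.einsum('ijk,jr,kr->ir', resid, B, C)    # gradient of 0.5*||resid||^2 w.r.t. A
            gB = np.einsum('ijk,ir,kr->jr', resid, A, C)
            gC = np.einsum('ijk,ir,jr->kr', resid, A, B)
            A, B, C = A - step * gA, B - step * gB, C - step * gC

        resid = np.einsum('ir,jr,kr->ijk', A, B, C) - X
        print("relative fit error:", np.linalg.norm(resid) / np.linalg.norm(X))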

  12. Rich Language Analysis for Counterterrorism

    NASA Astrophysics Data System (ADS)

    Guidère, Mathieu; Howard, Newton; Argamon, Shlomo

    Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.

  13. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being carried out to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that have resulted from this work. A review of computational aeroacoustics has recently been given by Lele.

  14. Maxillary first molar with three buccal roots evaluated with cone-beam computed tomography: a rare case report.

    PubMed

    Kottoor, Jojo; Nandini, Suresh; Velmurugan, Natanasabapathy

    2012-01-01

    This case report describes the nonsurgical endodontic management of a maxillary first molar with the unusual morphology of three separate buccal roots. An accurate assessment of this morphology was made with the help of cone-beam computed tomography (CBCT). This report also describes the varied root morphology associated with maxillary first molars and the role of CBCT as a diagnostic tool for managing these complex cases successfully.

  15. Use of Transition Modeling to Enable the Computation of Losses for Variable-Speed Power Turbine

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2012-01-01

    To investigate the penalties associated with using a variable speed power turbine (VSPT) in a rotorcraft capable of vertical takeoff and landing, various analysis tools are required. Such analysis tools must be able to model the flow accurately within the operating envelope of VSPT. For power turbines low Reynolds numbers and a wide range of the incidence angles, positive and negative, due to the variation in the shaft speed at relatively fixed corrected flows, characterize this envelope. The flow in the turbine passage is expected to be transitional and separated at high incidence. The turbulence model of Walters and Leylek was implemented in the NASA Glenn-HT code to enable a more accurate analysis of such flows. Two-dimensional heat transfer predictions of flat plate flow and two-dimensional and three-dimensional heat transfer predictions on a turbine blade were performed and reported herein. Heat transfer computations were performed because it is a good marker for transition. The final goal is to be able to compute the aerodynamic losses. Armed with the new transition model, total pressure losses for three-dimensional flow of an Energy Efficient Engine (E3) tip section cascade for a range of incidence angles were computed in anticipation of the experimental data. The results obtained form a loss bucket for the chosen blade.

  16. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low speed flow. This paper is intended to review some of the successes made possible by advances in computational technologies during the same period, and discuss some of the current challenges.

  17. Space shuttle atmospheric revitalization subsystem/active thermal control subsystem computer program (users manual)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A shuttle atmosphere revitalization subsystem (ARS)/active thermal control subsystem (ATCS) performance routine was developed. This computer program is adapted from the Shuttle EC/LSS Design Computer Program. The program was upgraded in three noteworthy areas: (1) The functional ARS/ATCS schematic has been revised to accurately synthesize the shuttle baseline system definition. (2) The program logic has been improved to provide a more accurate prediction of the integrated ARS/ATCS system performance. Additionally, the logic has been expanded to model all components and thermal loads in the ARS/ATCS system. (3) The program is designed to be used on the NASA JSC crew system division's programmable calculator system. As written, the new computer routine has an average running time of five minutes. The use of desktop-type calculation equipment and the rapid response of the program provide NASA with an analytical tool for trade studies to refine the system definition, and for test support of the RSECS or integrated Shuttle ARS/ATCS test programs.

  18. A computational continuum model of poroelastic beds

    PubMed Central

    Zampogna, G. A.

    2017-01-01

    Despite the ubiquity of fluid flows interacting with porous and elastic materials, we lack a validated non-empirical macroscale method for characterizing the flow over and through a poroelastic medium. We propose a computational tool to describe such configurations by deriving and validating a continuum model for the poroelastic bed and its interface with the free fluid above. We show that, using a stress continuity condition and a slip velocity condition at the interface, the effective model captures the effects of small changes in the microstructure anisotropy correctly and predicts the overall behaviour in a physically consistent and controllable manner. Moreover, we show that the effective model is accurate by validating it against fully resolved microscopic simulations. The proposed computational tool can be used in investigations in a wide range of fields, including mechanical engineering, bio-engineering and geophysics. PMID:28413355

  19. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.

    PubMed

    Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-03-30

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu.
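
    The combinatorial screening idea can be illustrated in a few lines: for every subset of markers, call a sample positive when any marker in the subset exceeds its threshold, and record the resulting sensitivity and specificity. The toy data, thresholds and the "any marker above threshold" rule below are assumptions for illustration, not CombiROC's actual scoring.

        from itertools import combinations
        import numpy as np

        # Toy combinatorial screen of marker panels: a sample is called positive when
        # any marker in the panel exceeds its threshold.  Data, thresholds and the
        # combination rule are illustrative, not CombiROC's algorithm.
        rng = np.random.default_rng(1)
        markers = ["M1", "M2", "M3"]
        pos = rng.normal(1.0, 1.0, size=(50, 3))     # diseased samples (shifted mean)
        neg = rng.normal(0.0, 1.0, size=(50, 3))     # healthy samples
        thresholds = np.full(3, 0.5)

        def sens_spec(combo):
            idx = list(combo)
            call_pos = (pos[:, idx] > thresholds[idx]).any(axis=1)
            call_neg = (neg[:, idx] > thresholds[idx]).any(axis=1)
            return call_pos.mean(), 1.0 - call_neg.mean()

        for r in (1, 2, 3):
            for combo in combinations(range(3), r):
                se, sp = sens_spec(combo)
                print([markers[i] for i in combo], f"sensitivity={se:.2f} specificity={sp:.2f}")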

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vang, Leng; Prescott, Steven R; Smith, Curtis

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared set of information documents and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, given that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of a Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  1. 3D liver volume reconstructed for palpation training.

    PubMed

    Tibamoso, Gerardo; Perez-Gutierrez, Byron; Uribe-Quevedo, Alvaro

    2013-01-01

    Virtual Reality systems for medical procedures, such as the palpation of different organs, require fast, robust, accurate and reliable computational methods to provide realism during interaction with the 3D biological models. This paper presents the segmentation, reconstruction and palpation simulation of a healthy liver volume as a tool for training. The chosen method considers the mechanical characteristics and liver properties to correctly simulate palpation interactions, which makes it appropriate as a complementary tool for familiarizing medical students with the liver anatomy.

  2. Non-conforming finite-element formulation for cardiac electrophysiology: an effective approach to reduce the computation time of heart simulations without compromising accuracy

    NASA Astrophysics Data System (ADS)

    Hurtado, Daniel E.; Rojas, Guillermo

    2018-04-01

    Computer simulations constitute a powerful tool for studying the electrical activity of the human heart, but computational effort remains prohibitively high. In order to recover accurate conduction velocities and wavefront shapes, the mesh size in linear element (Q1) formulations cannot exceed 0.1 mm. Here we propose a novel non-conforming finite-element formulation for the non-linear cardiac electrophysiology problem that results in accurate wavefront shapes and lower mesh-dependence in the conduction velocity, while retaining the same number of global degrees of freedom as Q1 formulations. As a result, coarser discretizations of cardiac domains can be employed in simulations without significant loss of accuracy, thus reducing the overall computational effort. We demonstrate the applicability of our formulation in biventricular simulations using a coarse mesh size of ~1 mm, and show that the activation wave pattern closely follows that obtained in fine-mesh simulations at a fraction of the computation time, thus improving the accuracy-efficiency trade-off of cardiac simulations.

  3. VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs

    USDA-ARS?s Scientific Manuscript database

    Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...

  4. Support for Debugging Automatically Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2001-01-01

    This viewgraph presentation provides information on support sources available for the automatic parallelization of computer programs. CAPTools, a support tool developed at the University of Greenwich, transforms, with user guidance, existing sequential Fortran code into parallel message passing code. Comparison routines are then run for debugging purposes, in essence, ensuring that the code transformation was accurate.

  5. Calibration Experiments for a Computer Vision Oyster Volume Estimation System

    ERIC Educational Resources Information Center

    Chang, G. Andy; Kerns, G. Jay; Lee, D. J.; Stanek, Gary L.

    2009-01-01

    Calibration is a technique commonly used in science and engineering research to adjust measurement tools so that they yield more accurate measurements. It is an important technique in various industries. In many situations, calibration is an application of linear regression, and is a good topic to be included when explaining and…
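
    A minimal example of calibration as linear regression is given below: instrument readings are regressed against reference standards, and the fit is inverted to correct new readings. The numbers are made up for illustration.

        import numpy as np

        # Calibration as linear regression: fit instrument readings against reference
        # standards, then invert the fit to correct new readings.  Data are illustrative.
        reference = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # known standard values
        reading   = np.array([0.4, 5.9, 11.1, 21.8, 42.9])      # instrument responses

        slope, intercept = np.polyfit(reference, reading, 1)    # reading ~ slope*ref + intercept

        def calibrated(raw_reading):
            """Map a raw instrument reading back to the reference scale."""
            return (raw_reading - intercept) / slope

        print(round(calibrated(30.0), 2))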

  6. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    PubMed

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes easy executions of RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays and allows the users to download the analyses results. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .

  7. Computational modeling of human oral bioavailability: what will be next?

    PubMed

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches, together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability, has opened new avenues for developing promising tools for oral bioavailability prediction.

  8. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    NASA Astrophysics Data System (ADS)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    Mobile computing devices have many limitations, such as a relatively small user interface and slow computing speed. Augmented reality often requires face pose estimation, which can be used as an HCI and entertainment tool. For a real-time implementation of head pose estimation on relatively resource-limited mobile platforms, various constraints must be met while retaining sufficient face pose estimation accuracy. The proposed face pose estimation method met this objective. Experiments on a test Android mobile device delivered satisfactory results in real time and with good accuracy.

  9. Subsonic Wing Optimization for Handling Qualities Using ACSYNT

    NASA Technical Reports Server (NTRS)

    Soban, Danielle Suzanne

    1996-01-01

    The capability to accurately and rapidly predict aircraft stability derivatives using one comprehensive analysis tool has been created. The PREDAVOR tool has the following capabilities: rapid estimation of stability derivatives using a vortex lattice method, calculation of a longitudinal handling qualities metric, and inherent methodology to optimize a given aircraft configuration for longitudinal handling qualities, including an intuitive graphical interface. The PREDAVOR tool may be applied to both subsonic and supersonic designs, as well as conventional and unconventional, symmetric and asymmetric configurations. The workstation-based tool uses a three-dimensional model of the configuration generated with a computer-aided design (CAD) package. The PREDAVOR tool was applied to a Lear Jet Model 23 and the North American XB-70 Valkyrie.

  10. PhytoCRISP-Ex: a web-based and stand-alone application to find specific target sequences for CRISPR/CAS editing.

    PubMed

    Rastogi, Achal; Murik, Omer; Bowler, Chris; Tirichine, Leila

    2016-07-01

    With the emerging interest in phytoplankton research, the need to establish genetic tools for the functional characterization of genes is indispensable. The CRISPR/Cas9 system is now well recognized as an efficient and accurate reverse genetic tool for genome editing. Several computational tools have been published allowing researchers to find candidate target sequences for the engineering of CRISPR vectors, while searching possible off-targets for the predicted candidates. These tools provide built-in genome databases of common model organisms that are used for CRISPR target prediction. Although their predictions are highly sensitive, their design is inadequate for non-model genomes, most notably protists. This motivated us to design a new CRISPR target finding tool, PhytoCRISP-Ex. Our software offers CRISPR target predictions using an extended list of phytoplankton genomes and also delivers a user-friendly standalone application that can be used for any genome. The software attempts to integrate, for the first time, most available phytoplankton genome information and provide a web-based platform for Cas9 target prediction within them with high sensitivity. By offering a standalone version, PhytoCRISP-Ex remains independent of the organism and widens its applicability in high-throughput pipelines. PhytoCRISP-Ex surpasses all the existing tools by computing the availability of restriction sites over the most probable Cas9 cleavage sites, which can be ideal for mutant screens. PhytoCRISP-Ex is a simple, fast and accurate web interface with 13 pre-indexed and presently updated phytoplankton genomes. The software was also designed as a UNIX-based standalone application that allows the user to search for target sequences in the genomes of a variety of other species.
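
    At its core, finding candidate Cas9 targets amounts to scanning both strands of a sequence for 20-nt protospacers followed by an NGG PAM. The sketch below shows only that core scan on a made-up sequence; it does not perform the genome-wide specificity or restriction-site analysis that PhytoCRISP-Ex provides.

        import re

        # Minimal Cas9 target scan: report 20-nt protospacers followed by an NGG PAM
        # on either strand of a toy sequence.  This is the core search only.
        def revcomp(seq):
            return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

        def find_targets(seq):
            seq = seq.upper()
            hits = []
            for strand, s in (("+", seq), ("-", revcomp(seq))):
                # Lookahead allows overlapping matches; group 1 = protospacer, group 2 = PAM.
                for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", s):
                    hits.append((strand, m.start(1), m.group(1), m.group(2)))
            return hits

        demo = "ATGCTGACCTTGGACGTTACAGGTCCATGAAACCGTTGCAATGGCTAGCTAGG"
        for strand, pos, protospacer, pam in find_targets(demo):
            print(strand, pos, protospacer, pam)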

  11. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.

  12. Computer Controlled Optical Surfacing With Orbital Tool Motion

    NASA Astrophysics Data System (ADS)

    Jones, Robert A.

    1985-10-01

    Asymmetric aspheric optical surfaces are very difficult to fabricate using classical techniques and laps the same size as the workpiece. Opticians can produce such surfaces by grinding and polishing, using small laps with orbital tool motion. However, hand correction is a time consuming process unsuitable for large optical elements. Itek has developed Computer Controlled Optical Surfacing (CCOS) for fabricating such aspheric optics. Automated equipment moves a nonrotating orbiting tool slowly over the workpiece surface. The process corrects low frequency surface errors by figuring. The velocity of the tool assembly over the workpiece surface is purposely varied. Since the amount of material removal is proportional to the polishing or grinding time, accurate control over material removal is achieved. The removal of middle and high frequency surface errors is accomplished by pad smoothing. For a soft pad material, the pad will compress to fit the workpiece surface producing greater pressure and more removal at the surface high areas. A harder pad will ride on only the high regions resulting in removal only for those locations.

  13. Potential pitfalls of strain rate imaging: angle dependency

    NASA Technical Reports Server (NTRS)

    Castro, P. L.; Greenberg, N. L.; Drinko, J.; Garcia, M. J.; Thomas, J. D.

    2000-01-01

    Strain Rate Imaging (SRI) is a new echocardiographic technique that allows for the real-time determination of myocardial SR, which may be used for the early and accurate detection of coronary artery disease. We sought to study whether SR is affected by scan line alignment in a computer simulation and an in vivo experiment. Through the computer simulation and the in vivo experiment we generated and validated safe scanning sectors within the ultrasound scan sector and showed that, while SRI will be an extremely valuable tool in detecting coronary artery disease, there are potential pitfalls for the unwary clinician. Only after accounting for these effects of angle dependency can clinicians utilize SRI's potential as a valuable tool in detecting coronary artery disease.

  14. EMHP: an accurate automated hole masking algorithm for single-particle cryo-EM image processing.

    PubMed

    Berndsen, Zachary; Bowman, Charles; Jang, Haerin; Ward, Andrew B

    2017-12-01

    The Electron Microscopy Hole Punch (EMHP) is a streamlined suite of tools for quick assessment, sorting, and hole masking of electron micrographs. With recent advances in single-particle electron cryo-microscopy (cryo-EM) data processing allowing for the rapid determination of protein structures using a smaller computational footprint, we saw the need for a fast and simple tool for data pre-processing that could run independently of existing high-performance computing (HPC) infrastructures. EMHP provides a data preprocessing platform in a small package that requires minimal Python dependencies to function. Availability: https://www.bitbucket.org/chazbot/emhp (Apache 2.0 License). Contact: bowman@scripps.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  15. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

    A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength, and were linked to a structural synthesis module, which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case, along with an explanation of program applications and input preparation, is presented.

  16. The Development of a Novel High Throughput Computational Tool for Studying Individual and Collective Cellular Migration

    PubMed Central

    Chapnick, Douglas A.; Jacobsen, Jeremy; Liu, Xuedong

    2013-01-01

    Understanding how cells migrate individually and collectively during development and cancer metastasis can be significantly aided by a computational tool that accurately measures not only cellular migration speed but also migration direction and changes in migration direction in a temporal and spatial manner. We have developed such a tool for cell migration researchers, named Pathfinder, which is capable of simultaneously measuring the migration speed, migration direction, and changes in migration direction of thousands of cells, both instantaneously and over long periods of time, from fluorescence microscopy data. Additionally, we demonstrate how the Pathfinder software can be used to quantify collective cell migration. The novel capability of the Pathfinder software to measure the changes in migration direction of large populations of cells in a spatiotemporal manner will aid cellular migration research by providing a robust method for determining the mechanisms of cellular guidance during individual and collective cell migration. PMID:24386097

  17. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has moved beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  18. 3D printing in chemical engineering and catalytic technology: structured catalysts, mixers and reactors.

    PubMed

    Parra-Cabrera, Cesar; Achille, Clement; Kuhn, Simon; Ameloot, Rob

    2018-01-02

    Computer-aided fabrication technologies combined with simulation and data processing approaches are changing our way of manufacturing and designing functional objects. In the field of catalytic technology and chemical engineering as well, the impact of additive manufacturing, also referred to as 3D printing, is steadily increasing thanks to a rapidly decreasing equipment threshold. Although still in an early stage, the rapid and seamless transition between digital data and physical objects enabled by these fabrication tools will benefit both the research and the manufacture of reactors and structured catalysts. Additive manufacturing closes the gap between theory and experiment by enabling accurate fabrication of geometries optimized through computational fluid dynamics and the experimental evaluation of their properties. This review highlights research using 3D printing and computational modeling as digital tools for the design and fabrication of reactors and structured catalysts. The goal of this contribution is to stimulate interactions at the crossroads of chemistry and materials science on the one hand and digital fabrication and computational modeling on the other.

  19. Clinical nursing informatics. Developing tools for knowledge workers.

    PubMed

    Ozbolt, J G; Graves, J R

    1993-06-01

    Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.

  20. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious, and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools, and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  1. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.

  2. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  3. Models of protein–ligand crystal structures: trust, but verify

    PubMed Central

    Deller, Marc C.

    2015-01-01

    X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575

  4. Models of protein-ligand crystal structures: trust, but verify.

    PubMed

    Deller, Marc C; Rupp, Bernhard

    2015-09-01

    X-ray crystallography provides the most accurate models of protein-ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein-ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein-ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein-ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein-ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein-ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein-ligand models for their computational and biological studies, and we provide an overview of how this can be achieved.

  5. extrap: Software to assist the selection of extrapolation methods for moving-boat ADCP streamflow measurements

    NASA Astrophysics Data System (ADS)

    Mueller, David S.

    2013-04-01

    Selection of the appropriate extrapolation methods for computing the discharge in the unmeasured top and bottom parts of a moving-boat acoustic Doppler current profiler (ADCP) streamflow measurement is critical to the total discharge computation. The software tool, extrap, combines normalized velocity profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers' software.
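
    As a hedged illustration of the extrapolation idea described above (a sketch with invented names and a synthetic profile, not the extrap source code), the power velocity distribution law can be fitted in log space to the normalized measured profile and then integrated over the unmeasured top and bottom fractions of the water column:

        # Illustrative power-law extrapolation of unmeasured discharge (not extrap itself).
        import numpy as np

        def power_fit(z_norm, q_norm):
            """Least-squares fit of q = a * z**b in log space; returns (a, b)."""
            b, log_a = np.polyfit(np.log(z_norm), np.log(q_norm), 1)
            return np.exp(log_a), b

        def unmeasured(a, b, z_lo, z_hi):
            """Integral of a * z**b between two normalized depths (0 = bed, 1 = surface)."""
            return a / (b + 1.0) * (z_hi ** (b + 1.0) - z_lo ** (b + 1.0))

        z = np.linspace(0.1, 0.9, 20)                # measured between 10% and 90% of depth
        q = z ** (1.0 / 6.0)                         # synthetic profile (classic 1/6 power law)
        a, b = power_fit(z, q)
        print(b, unmeasured(a, b, 0.0, 0.1), unmeasured(a, b, 0.9, 1.0))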

  6. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  8. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
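
    A minimal sketch may help fix the SROM idea; it is illustrative only and not the authors' framework. A handful of samples with optimized weights stands in for the full input distribution, so an output statistic becomes a short weighted sum of deterministic model evaluations instead of a Monte Carlo loop. The sample set, the moment-matching objective, and the toy model below are assumptions made for the example.

        # Illustrative SROM-style estimate (not the authors' framework).
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        target = rng.normal(1.0, 0.2, 100_000)       # uncertain input, e.g. a load factor
        samples = np.linspace(0.5, 1.5, 7)           # small fixed SROM sample set

        def moment_error(w):
            """Mismatch of the first two moments against the target distribution."""
            w = np.abs(w) / np.abs(w).sum()
            return ((w @ samples - target.mean()) ** 2
                    + (w @ samples**2 - (target**2).mean()) ** 2)

        w = minimize(moment_error, np.ones(7) / 7).x
        w = np.abs(w) / np.abs(w).sum()

        def model(x):                                # stand-in for a deterministic solver call
            return x**3 / 3.0

        print(w @ model(samples))                    # SROM estimate: 7 solver calls
        print(model(target).mean())                  # Monte Carlo estimate: 100,000 calls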

  9. Framework Development Supporting the Safety Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng

    2015-07-01

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere, regardless of computing platform, provided that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.

  10. Automatic temperature computation for realistic IR simulation

    NASA Astrophysics Data System (ADS)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package that accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, the incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes such as conductivity, absorption, spectral emissivity, density, specific heat, thickness, and convection coefficients are associated with several layers and taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation materials (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over the 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The main originality of the approach concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately accounts for the masking (hidden surfaces) between objects. This library also supplies other thermal modules, such as a thermal shadow computation tool.
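
    To make the one-dimensional thermal computation concrete, the following toy sketch (an assumption-laden stand-in, not MURET) advances a layered wall with explicit finite differences, driving the outer face with an absorbed flux plus convection and the inner face with convection alone. All material properties, node counts, and coefficients are invented for the example.

        # Illustrative 1D wall thermal step with an absorbed-flux boundary (not MURET).
        import numpy as np

        def step(T, dt, dx, k, rho, cp, q_abs, h_out, h_in, T_air_out, T_air_in):
            """One explicit step; T[0] is the outer (sun-facing) node, T[-1] the inner node."""
            alpha = k / (rho * cp)
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            # outer face balance: absorbed flux + convection + conduction into the wall
            Tn[0] = T[0] + dt / (rho * cp * dx) * (
                q_abs + h_out * (T_air_out - T[0]) + k * (T[1] - T[0]) / dx)
            Tn[-1] = T[-1] + dt / (rho * cp * dx) * (
                h_in * (T_air_in - T[-1]) + k * (T[-2] - T[-1]) / dx)
            return Tn

        T = np.full(20, 290.0)                       # 20 nodes through a 2 cm slab
        for _ in range(3600):                        # one hour at 1 s time steps
            T = step(T, 1.0, 0.001, k=0.5, rho=1800.0, cp=900.0,
                     q_abs=400.0, h_out=10.0, h_in=5.0,
                     T_air_out=300.0, T_air_in=295.0)
        print(T[0], T[-1])                           # outer and inner face temperatures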

  11. SIMS: A Hybrid Method for Rapid Conformational Analysis

    PubMed Central

    Gipson, Bryant; Moll, Mark; Kavraki, Lydia E.

    2013-01-01

    Proteins are at the root of many biological functions, often performing complex tasks as the result of large changes in their structure. Describing the exact details of these conformational changes, however, remains a central challenge for computational biology due to the enormous computational requirements of the problem. This has engendered the development of a rich variety of useful methods designed to answer specific questions at different levels of spatial, temporal, and energetic resolution. These methods fall largely into two classes: physically accurate but computationally demanding methods, and fast, approximate methods. We introduce here a new hybrid modeling tool, the Structured Intuitive Move Selector (sims), designed to bridge the divide between these two classes, while allowing the benefits of both to be seamlessly integrated into a single framework. This is achieved by applying a modern motion planning algorithm, borrowed from the field of robotics, in tandem with a well-established protein modeling library. sims can combine precise energy calculations with approximate or specialized conformational sampling routines to produce rapid, yet accurate, analysis of the large-scale conformational variability of protein systems. Several key advancements are shown, including the abstract use of generically defined moves (conformational sampling methods) and an expansive probabilistic conformational exploration. We present three example problems that sims is applied to and demonstrate a rapid solution for each. These include the automatic determination of “active” residues for the hinge-based system Cyanovirin-N, exploring conformational changes involving long-range coordinated motion between non-sequential residues in Ribose-Binding Protein, and the rapid discovery of a transient conformational state of Maltose-Binding Protein, previously only determined by Molecular Dynamics. For all cases we provide energetic validations using well-established energy fields, demonstrating this framework as a fast and accurate tool for the analysis of a wide range of protein flexibility problems. PMID:23935893

  12. A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model

    NASA Technical Reports Server (NTRS)

    Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.

    2006-01-01

    The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements in ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermo Luminescent Detector (TLD) area monitors, demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6 degree of freedom (DOF) description of ISS trajectory and orientation.

  13. Space Object Collision Probability via Monte Carlo on the Graphics Processing Unit

    NASA Astrophysics Data System (ADS)

    Vittaldev, Vivek; Russell, Ryan P.

    2017-09-01

    Fast and accurate collision probability computations are essential for protecting space assets. Monte Carlo (MC) simulation is the most accurate but computationally intensive method. A Graphics Processing Unit (GPU) is used to parallelize the computation and reduce the overall runtime. Using MC techniques to compute the collision probability is common in the literature as the benchmark. An optimized implementation on the GPU, however, is a challenging problem and is the main focus of the current work. The MC simulation takes samples from the uncertainty distributions of the Resident Space Objects (RSOs) at any time during a time window of interest and outputs the separations at closest approach. Therefore, any uncertainty propagation method may be used, and the collision probability is automatically computed as a function of RSO collision radii. Integration using a fixed time step and a quartic interpolation after every Runge-Kutta step ensures that no close approaches are missed. Two orders of magnitude speedups over a serial CPU implementation are shown, and speedups improve moderately with higher fidelity dynamics. The tool makes the MC approach tractable on a single workstation, and can be used as a final product, or for verifying surrogate and analytical collision probability methods.
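
    The core Monte Carlo idea is compact enough to sketch. The toy below is a CPU illustration rather than the paper's GPU tool: it assumes straight-line relative motion instead of full dynamics with quartic interpolation, samples both objects' relative state, records each sample's closest approach, and reports the fraction of misses below a given combined collision radius. All numbers are invented.

        # Illustrative CPU Monte Carlo collision probability with straight-line motion.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000
        r = np.array([1.0, 0.2, -0.3]) + rng.normal(0.0, 0.3, (n, 3))     # km
        v = np.array([-7.5, 0.1, 0.05]) + rng.normal(0.0, 0.001, (n, 3))  # km/s

        # closest approach of straight-line relative motion, clipped to a [0, 1] s window
        t_star = np.clip(-np.einsum("ij,ij->i", r, v) / np.einsum("ij,ij->i", v, v),
                         0.0, 1.0)
        d_min = np.linalg.norm(r + v * t_star[:, None], axis=1)

        for radius in (0.05, 0.1, 0.2):              # combined collision radius, km
            print(radius, np.mean(d_min < radius))   # estimated collision probability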

  14. Setting the Scope of Concept Inventories for Introductory Computing Subjects

    ERIC Educational Resources Information Center

    Goldman, Ken; Gross, Paul; Heeren, Cinda; Herman, Geoffrey L.; Kaczmarczyk, Lisa; Loui, Michael C.; Zilles, Craig

    2010-01-01

    A concept inventory is a standardized assessment tool intended to evaluate a student's understanding of the core concepts of a topic. In order to create a concept inventory it is necessary to accurately identify these core concepts. A Delphi process is a structured multi-step process that uses a group of experts to achieve a consensus opinion. We…

  15. A Framework for Automating Cost Estimates in Assembly Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success and lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.

  16. Comparison of Artificial Immune System and Particle Swarm Optimization Techniques for Error Optimization of Machine Vision Based Tool Movements

    NASA Astrophysics Data System (ADS)

    Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod

    2015-10-01

    In the conventional tool positioning technique, sensors embedded in the motion stages provide accurate tool position information. In this paper, a machine vision based system and image processing technique for motion measurement of a lathe tool from two-dimensional sequential images, captured using a charge coupled device (CCD) camera with a resolution of 250 microns, are described. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the value of the distance traversed by the tool calculated from these images. Optimization of the errors due to the machine vision system, calibration, environmental factors, etc., in lathe tool movement was carried out using two soft computing techniques, namely artificial immune system (AIS) and particle swarm optimization (PSO). The results show a better capability of AIS over PSO.
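
    As a hedged illustration of the optimization step (not the paper's implementation, and with an invented error model), a basic particle swarm can tune a scale and offset correction so that vision-measured distances match the reference distances:

        # Illustrative particle swarm fit of a scale/offset correction.
        import numpy as np

        rng = np.random.default_rng(2)
        measured = np.array([10.2, 20.5, 30.9, 41.1])    # mm, from image processing
        reference = np.array([10.0, 20.0, 30.0, 40.0])   # mm, from the motion stage

        def error(p):                                    # p = (scale, offset)
            return np.sum((p[0] * measured + p[1] - reference) ** 2)

        n, dims = 30, 2
        x = rng.uniform(-1.0, 2.0, (n, dims))
        v = np.zeros((n, dims))
        pbest, pbest_val = x.copy(), np.array([error(p) for p in x])
        gbest = pbest[pbest_val.argmin()]

        for _ in range(200):
            r1, r2 = rng.random((2, n, dims))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = x + v
            vals = np.array([error(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            gbest = pbest[pbest_val.argmin()]

        print(gbest, error(gbest))                       # fitted correction and residual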

  17. Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.

    PubMed

    Loschen, Christoph; Klamt, Andreas

    2015-06-01

    The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation and many other thermodynamics properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. For all property predictions COSMO-RS has been used. The basic concept of COSMO-RS consists of using the screening charge density as computed from first principles calculations in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations using a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties, which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant costly quantum chemical calculations are avoided resulting in a significant speed-up and ease-of-use. © 2015 Royal Pharmaceutical Society.

  18. A Machine Learning Method for the Prediction of Receptor Activation in the Simulation of Synapses

    PubMed Central

    Montes, Jesus; Gomez, Elena; Merchán-Pérez, Angel; DeFelipe, Javier; Peña, Jose-Maria

    2013-01-01

    Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of the synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, with considerably lower computational cost compared with the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and it is general enough to predict synapse behavior under experimental conditions that are different to the ones it has been trained on. Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for the simulation of high numbers of synapses and it is therefore an excellent tool for multi-scale simulations. PMID:23894367
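
    The surrogate idea can be sketched compactly. The example below is illustrative only: it substitutes synthetic data for the corpus of Monte Carlo simulations and uses scikit-learn as a stand-in for the authors' machine-learning stage, learning the open-receptor fraction as a function of time and two assumed synapse parameters, with cross-validation standing in for the validation stage.

        # Illustrative surrogate for Monte Carlo synapse output (synthetic data).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n = 5000
        t = rng.uniform(0.0, 5.0, n)                 # ms since neurotransmitter release
        n_receptors = rng.uniform(20, 200, n)        # assumed synapse parameter
        cleft_width = rng.uniform(15, 30, n)         # nm, assumed synapse parameter

        # stand-in for Monte Carlo output: noisy rise-and-decay open fraction
        open_frac = (t / 0.5) * np.exp(1.0 - t / 0.5) * (20.0 / cleft_width)
        open_frac += rng.normal(0.0, 0.02, n)

        X = np.column_stack([t, n_receptors, cleft_width])
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        print(cross_val_score(model, X, open_frac, cv=5).mean())   # validation stage
        model.fit(X, open_frac)
        print(model.predict([[1.0, 80.0, 20.0]]))    # fast surrogate prediction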

  19. Automatic and accurate reconstruction of distal humerus contours through B-Spline fitting based on control polygon deformation.

    PubMed

    Mostafavi, Kamal; Tutunea-Fatan, O Remus; Bordatchev, Evgueni V; Johnson, James A

    2014-12-01

    The strong advent of computer-assisted technologies in modern orthopedic surgery prompts the expansion of computationally efficient techniques built on the broad base of computer-aided engineering tools that are readily available. However, one of the common challenges faced during the current developmental phase continues to be the lack of reliable frameworks that allow a fast and precise conversion of the anatomical information acquired through computed tomography to a format acceptable to computer-aided engineering software. To address this, this study proposes an integrated and automatic framework capable of extracting and then postprocessing the original imaging data into a common planar and closed B-Spline representation. The core of the developed platform relies on the approximation of the discrete computed tomography data by means of an original two-step B-Spline fitting technique based on successive deformations of the control polygon. In addition to its rapidity and robustness, the developed fitting technique was validated to produce accurate representations that do not deviate by more than 0.2 mm with respect to alternate representations of the bone geometry obtained through different contact-based data acquisition or data processing methods. © IMechE 2014.
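
    The end product described above, a closed planar B-Spline recovered from sampled contour points, can be illustrated with standard library routines. The sketch is not the authors' two-step control-polygon algorithm; the synthetic elliptical contour, noise level, and smoothing factor are assumptions made for the example.

        # Illustrative closed B-Spline fit to a synthetic contour (not the authors' method).
        import numpy as np
        from scipy.interpolate import splprep, splev

        rng = np.random.default_rng(6)
        theta = np.linspace(0.0, 2.0 * np.pi, 80, endpoint=False)
        x = 20.0 * np.cos(theta) + rng.normal(0.0, 0.1, 80)     # mm, noisy contour points
        y = 12.0 * np.sin(theta) + rng.normal(0.0, 0.1, 80)     # mm
        x, y = np.append(x, x[0]), np.append(y, y[0])           # close the contour

        tck, _ = splprep([x, y], s=1.0, per=True)    # closed (periodic) cubic B-Spline fit
        xs, ys = splev(np.linspace(0.0, 1.0, 500), tck)

        # worst-case distance from any sampled point to the fitted curve (mm)
        d = np.hypot(xs[:, None] - x, ys[:, None] - y).min(axis=0)
        print(d.max())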

  20. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    NASA Astrophysics Data System (ADS)

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method based on recently developed energy-discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominant interaction between the x-rays and the sample is incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy-resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a CdTe single PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where incoherent scattering interactions become prevalent (>50 keV).

  1. BLESS 2: accurate, memory-efficient and fast error correction method.

    PubMed

    Heo, Yun; Ramachandran, Anand; Hwu, Wen-Mei; Ma, Jian; Chen, Deming

    2016-08-01

    The most important features of error correction tools for sequencing data are accuracy, memory efficiency and fast runtime. The previous version of BLESS was highly memory-efficient and accurate, but it was too slow to handle reads from large genomes. We have developed a new version of BLESS to improve runtime and accuracy while maintaining a small memory usage. The new version, called BLESS 2, has an error correction algorithm that is more accurate than BLESS, and the algorithm has been parallelized using hybrid MPI and OpenMP programming. BLESS 2 was compared with five top-performing tools, and it was found to be the fastest when it was executed on two computing nodes using MPI, with each node containing twelve cores. Also, BLESS 2 showed at least 11% higher gain while retaining the memory efficiency of the previous version for large genomes. Freely available at https://sourceforge.net/projects/bless-ec dchen@illinois.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; ...

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high-quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
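
    The two signals MetaBAT combines can be sketched in a few lines. The toy below is not the MetaBAT implementation: it builds a normalized tetranucleotide frequency (TNF) vector per contig, adds a simple per-sample coverage term, and reports a combined distance between two contigs; the distance form and all inputs are invented for the example.

        # Illustrative TNF + coverage distance between two contigs (not MetaBAT itself).
        import itertools
        import numpy as np

        KMERS = ["".join(p) for p in itertools.product("ACGT", repeat=4)]
        INDEX = {k: i for i, k in enumerate(KMERS)}

        def tnf(seq):
            """Normalized tetranucleotide frequency vector of a contig."""
            v = np.zeros(len(KMERS))
            for i in range(len(seq) - 3):
                k = seq[i:i + 4]
                if k in INDEX:
                    v[INDEX[k]] += 1.0
            return v / max(v.sum(), 1.0)

        def distance(seq_a, cov_a, seq_b, cov_b):
            """Toy combined distance: TNF Euclidean distance plus log-coverage difference."""
            d_tnf = np.linalg.norm(tnf(seq_a) - tnf(seq_b))
            d_cov = np.abs(np.log1p(cov_a) - np.log1p(cov_b)).mean()
            return d_tnf + d_cov

        contig_a, cov_a = "ACGT" * 500, np.array([10.0, 12.0])   # coverage per sample
        contig_b, cov_b = "AAGTCC" * 300, np.array([55.0, 60.0])
        print(distance(contig_a, cov_a, contig_b, cov_b))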

  3. Systems Biology for Organotypic Cell Cultures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grego, Sonia; Dougherty, Edward R.; Alexander, Francis J.

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data. This consensus report summarizes the discussions held.

  4. Workshop Report: Systems Biology for Organotypic Cell Cultures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.

  5. Workshop Report: Systems Biology for Organotypic Cell Cultures

    DOE PAGES

    Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph; ...

    2016-11-14

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.

  6. Systems biology for organotypic cell cultures.

    PubMed

    Grego, Sonia; Dougherty, Edward R; Alexander, Francis J; Auerbach, Scott S; Berridge, Brian R; Bittner, Michael L; Casey, Warren; Cooley, Philip C; Dash, Ajit; Ferguson, Stephen S; Fennell, Timothy R; Hawkins, Brian T; Hickey, Anthony J; Kleensang, Andre; Liebman, Michael N J; Martin, Florian; Maull, Elizabeth A; Paragas, Jason; Qiao, Guilin Gary; Ramaiahgari, Sreenivasa; Sumner, Susan J; Yoon, Miyoung

    2017-01-01

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, "organotypic" cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.

  7. G-Anchor: a novel approach for whole-genome comparative mapping utilizing evolutionary conserved DNA sequences.

    PubMed

    Lenis, Vasileios Panagiotis E; Swain, Martin; Larkin, Denis M

    2018-05-01

    Cross-species whole-genome sequence alignment is a critical first step for genome comparative analyses, ranging from the detection of sequence variants to studies of chromosome evolution. Animal genomes are large and complex, and whole-genome alignment is a computationally intense process, requiring expensive high-performance computing systems due to the need to explore extensive local alignments. With hundreds of sequenced animal genomes available from multiple projects, there is an increasing demand for genome comparative analyses. Here, we introduce G-Anchor, a new, fast, and efficient pipeline that uses a strictly limited but highly effective set of local sequence alignments to anchor (or map) an animal genome to another species' reference genome. G-Anchor makes novel use of a databank of highly conserved DNA sequence elements. We demonstrate how these elements may be aligned to a pair of genomes, creating anchors. These anchors enable the rapid mapping of scaffolds from a de novo assembled genome to chromosome assemblies of a reference species. Our results demonstrate that G-Anchor can successfully anchor a vertebrate genome onto a phylogenetically related reference species genome using a desktop or laptop computer within a few hours and with accuracy comparable to that achieved by a highly accurate whole-genome alignment tool such as LASTZ. G-Anchor thus makes whole-genome comparisons accessible to researchers with limited computational resources. G-Anchor is a ready-to-use tool for anchoring a pair of vertebrate genomes. It may be used with large genomes that contain a significant fraction of evolutionarily conserved DNA sequences and that are not highly repetitive, polyploid, or excessively fragmented. G-Anchor is not a substitute for whole-genome alignment software but can be used for fast and accurate initial genome comparisons. G-Anchor is freely available and a ready-to-use tool for the pairwise comparison of two genomes.

  8. Pynamic: the Python Dynamic Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, G L; Ahn, D H; de Supinksi, B R

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.

  9. Ganalyzer: A tool for automatic galaxy image analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-05-01

    Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
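
    The measurement implied by this description can be sketched as follows (an illustration, not Ganalyzer's source code): build a radial intensity profile around the galaxy center, locate the brightest angle at each radius, and take the slope of that peak position versus radius as a simple spirality measure. The image, center, and radius below are placeholders.

        # Illustrative spirality measure from a radial intensity profile (not Ganalyzer).
        import numpy as np

        def radial_peak_slope(img, cx, cy, r_max, n_theta=360):
            """Slope of the brightest angle versus radius; near zero for ellipticals."""
            radii = np.arange(5, r_max)
            theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
            peak_angle = []
            for r in radii:
                cols = (cx + r * np.cos(theta)).astype(int)
                rows = (cy + r * np.sin(theta)).astype(int)
                ring = img[rows, cols]               # intensity along the ring of radius r
                peak_angle.append(theta[np.argmax(ring)])
            peak_angle = np.unwrap(np.array(peak_angle))
            return np.polyfit(radii, peak_angle, 1)[0]

        rng = np.random.default_rng(4)
        img = rng.random((200, 200))                 # placeholder for a galaxy image
        print(radial_peak_slope(img, 100, 100, 60))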

  10. RF Models for Plasma-Surface Interactions in VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.

  11. Accuracy of patient-specific organ dose estimates obtained using an automated image segmentation algorithm.

    PubMed

    Schmidt, Taly Gilat; Wang, Adam S; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh

    2016-10-01

    The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors.
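
    The quantity being compared is simple to state: the mean organ dose is the average of the Monte Carlo dose map over the voxels labeled as that organ, which is why modest boundary errors in the mask shift the mean only slightly. A toy numpy sketch, with invented dose values and masks, makes this concrete:

        # Illustrative mean organ dose from a dose map and a segmentation mask.
        import numpy as np

        def mean_organ_dose(dose_map, mask):
            """Mean dose over the voxels labeled as the organ."""
            return dose_map[mask].mean()

        rng = np.random.default_rng(5)
        dose = rng.gamma(2.0, 1.0, (64, 64, 64))         # toy dose map (mGy)
        expert = np.zeros_like(dose, dtype=bool)
        expert[20:40, 20:40, 20:40] = True               # expert organ mask
        auto = np.zeros_like(dose, dtype=bool)
        auto[21:41, 20:40, 20:40] = True                 # auto mask shifted by one voxel

        err = mean_organ_dose(dose, auto) / mean_organ_dose(dose, expert) - 1.0
        print(100.0 * err)                               # percent error in mean organ dose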

  12. Accuracy of patient-specific organ dose estimates obtained using an automated image segmentation algorithm

    PubMed Central

    Schmidt, Taly Gilat; Wang, Adam S.; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh

    2016-01-01

    Abstract. The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors. PMID:27921070

  13. Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications

    NASA Technical Reports Server (NTRS)

    Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei

    1999-01-01

    A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low-temperature fluid flows on pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to the pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. The pure-conduction analysis gives insufficient information about the overall thermal distribution, whereas the combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D modeling of the pump using multiple materials is needed. A comprehensive and accurate model can then be developed to account for the effects of multi-phase flow in the cooling loop and the magnetic interactions.

  14. Structural variation discovery in the cancer genome using next generation sequencing: Computational solutions and perspectives

    PubMed Central

    Liu, Biao; Conroy, Jeffrey M.; Morrison, Carl D.; Odunsi, Adekunle O.; Qin, Maochun; Wei, Lei; Trump, Donald L.; Johnson, Candace S.; Liu, Song; Wang, Jianmin

    2015-01-01

    Somatic Structural Variations (SVs) are a complex collection of chromosomal mutations that could directly contribute to carcinogenesis. Next Generation Sequencing (NGS) technology has emerged as the primary means of interrogating the SVs of the cancer genome in recent investigations. Sophisticated computational methods are required to accurately identify the SV events and delineate their breakpoints from the massive amounts of reads generated by an NGS experiment. In this review, we provide an overview of current analytic tools used for SV detection in NGS-based cancer studies. We summarize the features of common SV groups and the primary types of NGS signatures that can be used in SV detection methods. We discuss the principles and key similarities and differences of existing computational programs and comment on unresolved issues related to this research field. The aim of this article is to provide a practical guide to relevant concepts, computational methods, software tools and important factors for analyzing and interpreting NGS data for the detection of SVs in the cancer genome. PMID:25849937

  15. Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis

    PubMed Central

    2015-01-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  16. Scattered Dose Calculations and Measurements in a Life-Like Mouse Phantom

    PubMed Central

    Welch, David; Turner, Leah; Speiser, Michael; Randers-Pehrson, Gerhard; Brenner, David J.

    2017-01-01

    Anatomically accurate phantoms are useful tools for radiation dosimetry studies. In this work, we demonstrate the construction of a new generation of life-like mouse phantoms in which the methods have been generalized to be applicable to the fabrication of any small animal. The mouse phantoms, with built-in density inhomogeneity, exhibit different scattering behavior dependent on where the radiation is delivered. Computer models of the mouse phantoms and a small animal irradiation platform were devised in Monte Carlo N-Particle code (MCNP). A baseline test replicating the irradiation system in a computational model shows minimal differences from experimental results from 50 Gy down to 0.1 Gy. We observe excellent agreement between scattered dose measurements and simulation results from X-ray irradiations focused at either the lung or the abdomen within our phantoms. This study demonstrates the utility of our mouse phantoms as measurement tools with the goal of using our phantoms to verify complex computational models. PMID:28140787

  17. Internet (WWW) based system of ultrasonic image processing tools for remote image analysis.

    PubMed

    Zeng, Hong; Fei, Ding-Yu; Fu, Cai-Ting; Kraft, Kenneth A

    2003-07-01

    Ultrasonic Doppler color imaging can provide anatomic information and simultaneously render flow information within blood vessels for diagnostic purposes. Many researchers are currently developing ultrasound image processing algorithms in order to provide physicians with accurate clinical parameters from the images. Because researchers use a variety of computer languages and work on different computer platforms to implement their algorithms, it is difficult for other researchers and physicians to access those programs. A system has been developed using World Wide Web (WWW) technologies and HTTP communication protocols to publish our ultrasonic Angle Independent Doppler Color Image (AIDCI) processing algorithm and several general measurement tools on the Internet, where authorized researchers and physicians can easily access the program using web browsers to carry out remote analysis of their local ultrasonic images or of images provided from the database. In order to overcome potential incompatibility between programs and users' computer platforms, ActiveX technology was used in this project. The technique developed may also be used in other research fields.

  18. A review of imaging modalities in pulmonary hypertension

    PubMed Central

    Ascha, Mona; Renapurkar, Rahul D.; Tonelli, Adriano R.

    2017-01-01

    Pulmonary hypertension (PH) is defined as resting mean pulmonary artery pressure ≥25 mmHg measured by right heart catheterization. PH is a progressive, life-threatening disease with a variety of etiologies. Swift and accurate diagnosis of PH and appropriate classification into an etiologic group will allow for earlier treatment and improved outcomes. A number of imaging tools are utilized in the evaluation of PH, such as chest X-ray, computed tomography (CT), ventilation/perfusion (V/Q) scan, and cardiac magnetic resonance imaging. Newer imaging tools such as dual-energy CT and single-photon emission computed tomography/computed tomography V/Q scanning have also emerged; however, their place in the diagnostic evaluation of PH remains to be determined. In general, each imaging technique provides incremental information, with varying degrees of sensitivity and specificity, which helps clinicians suspect the presence of PH and identify its etiology. The present study aims to provide a comprehensive review of the utility, advantages, and shortcomings of the imaging modalities that may be used to evaluate patients with PH. PMID:28469715

  19. Remote Sensing: A valuable tool in the Forest Service decision making process. [in Utah

    NASA Technical Reports Server (NTRS)

    Stanton, F. L.

    1975-01-01

    Forest Service studies for integrating remotely sensed data into existing information systems highlight a need to: (1) re-examine present methods of collecting and organizing data, (2) develop an integrated information system for rapidly processing and interpreting data, (3) apply existing technological tools in new ways, and (4) provide accurate and timely information for making sound management decisions. The Forest Service developed an integrated information system using remote sensors, microdensitometers, computer hardware and software, and interactive accessories. Their efforts substantially reduce the time required to collect and process data.

  20. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as winding flux linkages and voltages; average, cogging, and ripple torques; stator core flux densities; core losses; efficiencies; and saturated machine winding inductances are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
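
    The optimization layer can be pictured as a basic differential evolution loop wrapped around a placeholder design-evaluation function. The evaluate_design function below stands in for the finite element-based analysis, and the single-objective DE/rand/1/bin form shown is a simplification of the multi-objective method developed in the dissertation.

      # Hedged sketch: single-objective differential evolution (DE/rand/1/bin)
      # around a placeholder design-evaluation function.
      import numpy as np

      def evaluate_design(x):
          # Placeholder for the FE-based performance evaluation (e.g., negative efficiency).
          return np.sum((x - 0.3) ** 2)

      def differential_evolution(bounds, pop_size=20, gens=100, F=0.8, CR=0.9, rng=None):
          rng = np.random.default_rng(rng)
          lo, hi = np.asarray(bounds).T
          dim = len(lo)
          pop = lo + rng.random((pop_size, dim)) * (hi - lo)
          cost = np.array([evaluate_design(x) for x in pop])
          for _ in range(gens):
              for i in range(pop_size):
                  a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
                  mutant = np.clip(a + F * (b - c), lo, hi)
                  cross = rng.random(dim) < CR
                  trial = np.where(cross, mutant, pop[i])
                  t_cost = evaluate_design(trial)
                  if t_cost < cost[i]:            # greedy selection
                      pop[i], cost[i] = trial, t_cost
          best = np.argmin(cost)
          return pop[best], cost[best]

      best_x, best_cost = differential_evolution([(0.0, 1.0)] * 4, rng=0)
      print(best_x, best_cost)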

  1. Steady-State Computation of Constant Rotational Rate Dynamic Stability Derivatives

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Green, Lawrence L.

    2000-01-01

    Dynamic stability derivatives are essential to predicting the open and closed loop performance, stability, and controllability of aircraft. Computational determination of constant-rate dynamic stability derivatives (derivatives of aircraft forces and moments with respect to constant rotational rates) is currently performed indirectly with finite differencing of multiple time-accurate computational fluid dynamics solutions. Typical time-accurate solutions require excessive amounts of computational time to complete. Formulating Navier-Stokes (N-S) equations in a rotating noninertial reference frame and applying an automatic differentiation tool to the modified code has the potential for directly computing these derivatives with a single, much faster steady-state calculation. The ability to rapidly determine static and dynamic stability derivatives by computational methods can benefit multidisciplinary design methodologies and reduce dependency on wind tunnel measurements. The CFL3D thin-layer N-S computational fluid dynamics code was modified for this study to allow calculations on complex three-dimensional configurations with constant rotation rate components in all three axes. These CFL3D modifications also have direct application to rotorcraft and turbomachinery analyses. The modified CFL3D steady-state calculation is a new capability that showed excellent agreement with results calculated by a similar formulation. The application of automatic differentiation to CFL3D allows the static stability and body-axis rate derivatives to be calculated quickly and exactly.
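
    The core idea of differentiating a solver output with respect to a rotation rate can be illustrated with forward-mode automatic differentiation on a toy force model. The dual-number class and the toy_pitching_moment function below are illustrative assumptions only; they are unrelated to CFL3D or to the differentiation tooling actually applied in the paper.

      # Hedged sketch: forward-mode automatic differentiation with dual numbers,
      # applied to a toy moment model Cm(omega).
      class Dual:
          """Number of the form a + b*eps with eps**2 = 0; b carries the derivative."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def toy_pitching_moment(omega):
          # Placeholder aerodynamic model: Cm = -0.8*omega + 0.05*omega**2
          return -0.8 * omega + 0.05 * omega * omega

      # Seed the rotation rate with derivative 1 to obtain dCm/domega directly.
      omega = Dual(0.2, 1.0)
      cm = toy_pitching_moment(omega)
      print(cm.val, cm.der)   # value and exact derivative at omega = 0.2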

  2. LOS selective fading and AN/FRC-170(V) radio hybrid computer simulation phase A report

    NASA Astrophysics Data System (ADS)

    Klukis, M. K.; Lyon, T. I.; Walker, R.

    1981-09-01

    This report documents results of the first phase of modeling, simulation and study of the dual diversity AN/FRC-170(V) radio and frequency selective fading line of sight channel. Both hybrid computer and circuit technologies were used to develop a fast, accurate and flexible simulation tool to investigate changes and proposed improvements to the design of the AN/FRC-170(V) radio. In addition to the simulation study, a remote hybrid computer terminal was provided to DCEC for interactive study of the modeled radio and channel. Simulated performance of the radio for Rayleigh, line of sight two ray channels, and additive noise are included in the report.

  3. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840

  4. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
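
    The simplest non-intrusive propagation strategy discussed in the review can be sketched as plain Monte Carlo sampling through a placeholder response function; fe_response and the input distributions below are invented for illustration and do not come from the reviewed article.

      # Hedged sketch: non-intrusive Monte Carlo uncertainty propagation through a
      # placeholder finite-element response function.
      import numpy as np

      def fe_response(youngs_modulus, load):
          # Placeholder for a real FE solve, e.g., tip displacement of a beam.
          return load / youngs_modulus

      rng = np.random.default_rng(0)
      n = 10_000
      E = rng.normal(1.0e9, 0.1e9, n)      # uncertain material stiffness (Pa)
      P = rng.normal(500.0, 50.0, n)       # uncertain applied load (N)

      u = fe_response(E, P)
      print(f"mean = {u.mean():.3e}, std = {u.std():.3e}, 95th pct = {np.percentile(u, 95):.3e}")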

  5. Parameter Estimation for a Turbulent Buoyant Jet Using Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Christopher, Jason D.; Wimer, Nicholas T.; Hayden, Torrey R. S.; Lapointe, Caelan; Grooms, Ian; Rieker, Gregory B.; Hamlington, Peter E.

    2016-11-01

    Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown model parameters in numerical simulations of real-world engineering systems. In this presentation, we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a simulation with known boundary conditions and problem parameters. Using spatially-sparse temperature statistics from the 2D buoyant jet truth simulation, we show that the ABC method provides accurate predictions of the true jet inflow temperature. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for engineering fluid dynamics research.
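
    A minimal rejection-ABC sketch of this kind of workflow, with a placeholder simulator standing in for the buoyant-jet computation; the sensor locations, noise level, prior range, and tolerance are invented for illustration.

      # Hedged sketch: rejection ABC for an unknown inflow temperature, using a
      # placeholder simulator and sparse "truth" statistics.
      import numpy as np

      def simulate_statistics(t_inflow, rng):
          # Placeholder for the buoyant-jet simulation: sparse temperature
          # statistics at a few sensor locations, with some model/sampling noise.
          sensors = np.array([0.9, 0.6, 0.4])
          return t_inflow * sensors + rng.normal(0.0, 1.0, sensors.size)

      rng = np.random.default_rng(1)
      truth_stats = simulate_statistics(450.0, rng)     # "truth" from a known-parameter run

      prior_samples = rng.uniform(300.0, 600.0, 20_000) # prior on inflow temperature (K)
      eps = 5.0                                         # acceptance tolerance
      accepted = [t for t in prior_samples
                  if np.linalg.norm(simulate_statistics(t, rng) - truth_stats) < eps]

      posterior = np.array(accepted)
      print(posterior.mean(), posterior.std())          # ABC posterior for the inflow temperature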

  6. Measurement of breast volume using body scan technology (computer-aided anthropometry).

    PubMed

    Veitch, Daisy; Burford, Karen; Dench, Phil; Dean, Nicola; Griffin, Philip

    2012-01-01

    Assessment of breast volume is an important tool for preoperative planning in various breast surgeries and other applications, such as bra development. Accurate assessment can improve the consistency and quality of surgery outcomes. This study outlines a non-invasive method to measure breast volume using a whole body 3D laser surface anatomy scanner, the Cyberware WBX. It expands on a previous publication where this method was validated against patients undergoing mastectomy. It specifically outlines and expands the computer-aided anthropometric (CAA) method for extracting breast volumes in a non-invasive way from patients enrolled in a breast reduction study at Flinders Medical Centre, South Australia. This step-by-step description allows others to replicate this work and provides an additional tool to assist them in their own clinical practice and development of designs.

  7. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Blattnig, Steve R.; Clowdsley, Martha S.; Qualls, Garry D.; Sandridge, Chris A.; Simonsen, Lisa C.; Norbury, John W.; Slaba, Tony C.; Walker, Steve A.; Badavi, Francis F.

    2009-01-01

    The On-Line Tool for the Assessment of Radiation In Space (OLTARIS) is a World Wide Web based tool that assesses the effects of space radiation on humans in items such as spacecraft, habitats, rovers, and spacesuits. This document explains the basis behind the interface and framework used to input the data, perform the assessment, and output the results to the user as well as the physics, engineering, and computer science used to develop OLTARIS. The physics is based on the HZETRN2005 and NUCFRG2 research codes. The OLTARIS website is the successor to the SIREST website from the early 2000s. Modifications have been made to the code to enable easy maintenance, additions, and configuration management along with a more modern web interface. Overall, the code has been verified, tested, and modified to enable faster and more accurate assessments. The next major areas of modification are more accurate transport algorithms, better uncertainty estimates, and electronic response functions. Improvements in the existing algorithms and data occur continuously and are logged in the change log section of the website.

  8. Effect of Counterflow Jet on a Supersonic Reentry Capsule

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan; Venkatachari, Balaji Shankar; Cheng, Gary C.

    2006-01-01

    Recent NASA initiatives for space exploration have reinvigorated research on Apollo-like capsule vehicles. Aerothermodynamic characteristics of these capsule configurations during reentry play a crucial role in the performance and safety of the planetary entry probes and the crew exploration vehicles. At issue are the forebody thermal shield protection and afterbody aeroheating predictions. Due to the lack of flight or wind tunnel measurements at hypersonic speed, design decisions on such vehicles would rely heavily on computational results. Validation of current computational tools against experimental measurement thus becomes one of the most important tasks for general hypersonic research. This paper is focused on time-accurate numerical computations of hypersonic flows over a set of capsule configurations, which employ a counterflow jet to offset the detached bow shock. The accompanying increased shock stand-off distance and modified heat transfer characteristics associated with the counterflow jet may provide guidance for future design of hypersonic reentry capsules. The newly emerged space-time conservation element solution element (CESE) method is used to perform time-accurate, unstructured mesh Navier-Stokes computations for all cases investigated. The results show good agreement between experimental and numerical Schlieren pictures. Surface heat flux and aerodynamic force predictions of the capsule configurations are discussed in detail.

  9. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
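
    The polynomial view of a coarse-grained isotopic distribution can be sketched as repeated convolution of per-element abundance polynomials over nominal-mass bins. The abundance values and the brute-force per-atom convolution below are illustrative and far less general than MIDAs.

      # Hedged sketch of the polynomial idea behind coarse-grained isotopic
      # distributions: one convolution per atom on nominal-mass bins.
      import numpy as np

      # Isotope abundance polynomials indexed by nominal mass offset (0, 1, 2, ...).
      ISOTOPES = {
          "C": [0.9893, 0.0107],            # 12C, 13C
          "H": [0.999885, 0.000115],        # 1H, 2H
          "O": [0.99757, 0.00038, 0.00205], # 16O, 17O, 18O
      }

      def isotopic_distribution(formula):
          """formula: dict like {'C': 6, 'H': 12, 'O': 6} -> relative abundances."""
          dist = np.array([1.0])
          for element, count in formula.items():
              poly = np.array(ISOTOPES[element])
              for _ in range(count):
                  dist = np.convolve(dist, poly)   # multiply the abundance polynomials
          return dist / dist.sum()

      print(isotopic_distribution({"C": 6, "H": 12, "O": 6})[:4])  # glucose, first 4 peaks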

  10. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    PubMed

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.

  11. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  12. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study

    PubMed Central

    Suenaga, Hideyuki; Hoang Tran, Huy; Liao, Hongen; Masamune, Ken; Dohi, Takeyoshi; Hoshi, Kazuto; Mori, Yoshiyuki; Takato, Tsuyoshi

    2013-01-01

    To evaluate the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the patient/surgical instrument's position. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. Change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient, without special glasses. The difference in three-dimensional position of each measuring point on the solid model and augmented reality navigation was almost negligible (<1 mm); this indicates that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site, with the naked eye. PMID:23703710

  13. Faster Aerodynamic Simulation With Cart3D

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.

  14. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphical techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency, and of achieving RCS control with modest structural changes, are becoming of paramount importance in stealth design. A computer code for evaluating the RCS of arbitrarily shaped metallic objects generated by computer aided design (CAD), together with its validation against measurements carried out using ALENIA RCS test facilities, is presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computation time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  15. Computational Methods for HSCT-Inlet Controls/CFD Interdisciplinary Research

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Melcher, Kevin J.; Chicatelli, Amy K.; Hartley, Tom T.; Chung, Joongkee

    1994-01-01

    A program aimed at facilitating the use of computational fluid dynamics (CFD) simulations by the controls discipline is presented. The objective is to reduce the development time and cost for propulsion system controls by using CFD simulations to obtain high-fidelity system models for control design and as numerical test beds for control system testing and validation. An interdisciplinary team has been formed to develop analytical and computational tools in three discipline areas: controls, CFD, and computational technology. The controls effort has focused on specifying requirements for an interface between the controls specialist and CFD simulations and a new method for extracting linear, reduced-order control models from CFD simulations. Existing CFD codes are being modified to permit time accurate execution and provide realistic boundary conditions for controls studies. Parallel processing and distributed computing techniques, along with existing system integration software, are being used to reduce CFD execution times and to support the development of an integrated analysis/design system. This paper describes: the initial application for the technology being developed, the high speed civil transport (HSCT) inlet control problem; activities being pursued in each discipline area; and a prototype analysis/design system in place for interactive operation and visualization of a time-accurate HSCT-inlet simulation.

  16. Medical imaging and registration in computer assisted surgery.

    PubMed

    Simon, D A; Lavallée, S

    1998-09-01

    Imaging, sensing, and computing technologies that are being introduced to aid in the planning and execution of surgical procedures are providing orthopaedic surgeons with a powerful new set of tools for improving clinical accuracy, reliability, and patient outcomes while reducing costs and operating times. Current computer assisted surgery systems typically include a measurement process for collecting patient specific medical data, a decision making process for generating a surgical plan, a registration process for aligning the surgical plan to the patient, and an action process for accurately achieving the goals specified in the plan. Some of the key concepts in computer assisted surgery applied to orthopaedics are outlined, with a focus on the basic framework and underlying technologies. In addition, technical challenges and future trends in the field are discussed.

  17. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
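
    The overlap-accounting idea can be sketched as a weighted panel sum, where a per-panel weight splits the contribution of panels that appear on more than one surface grid. The data layout and the weights below are illustrative assumptions, not the MIXSUR/OVERINT implementation.

      # Hedged sketch: pressure-force integration over overlapping surface panels,
      # with per-panel weights so overlapped regions are not double counted.
      import numpy as np

      def integrate_force(panels):
          """panels: iterable of (pressure, unit_normal, area, overlap_weight)."""
          force = np.zeros(3)
          for p, n_hat, area, w in panels:
              force += w * p * area * np.asarray(n_hat)   # w in [0, 1]; shared panels split the weight
          return force

      panels = [
          (101325.0, (0.0, 0.0, 1.0), 0.02, 1.0),   # panel owned by a single grid
          (101500.0, (0.0, 0.0, 1.0), 0.01, 0.5),   # panel overlapped by two grids
          (101500.0, (0.0, 0.0, 1.0), 0.01, 0.5),   # its duplicate on the other grid
      ]
      print(integrate_force(panels))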

  18. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED... STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE...

  19. ShatterProof: operational detection and quantification of chromothripsis.

    PubMed

    Govind, Shaylan K; Zia, Amin; Hennings-Yeomans, Pablo H; Watson, John D; Fraser, Michael; Anghel, Catalina; Wyatt, Alexander W; van der Kwast, Theodorus; Collins, Colin C; McPherson, John D; Bristow, Robert G; Boutros, Paul C

    2014-03-19

    Chromothripsis, a newly discovered type of complex genomic rearrangement, has been implicated in the evolution of several types of cancers. To date, it has been described in bone cancer, SHH-medulloblastoma and acute myeloid leukemia, amongst others; however, there are still no formal or automated methods for detecting or annotating it in high throughput sequencing data. As such, findings of chromothripsis are difficult to compare and many cases likely escape detection altogether. We introduce ShatterProof, a software tool for detecting and quantifying chromothriptic events. ShatterProof takes structural variation calls (translocations, copy-number variations, short insertions and loss of heterozygosity) produced by any algorithm and, using an operational definition of chromothripsis, performs robust statistical tests to accurately predict the presence and location of chromothriptic events. Validation of our tool was conducted using clinical data sets including matched normal and prostate cancer samples in addition to the colorectal cancer and SCLC data sets used in the original description of chromothripsis. ShatterProof is computationally efficient, having low memory requirements and near linear computation time. This allows it to become a standard component of sequencing analysis pipelines, enabling researchers to routinely and accurately assess samples for chromothripsis. Source code and documentation can be found at http://search.cpan.org/~sgovind/Shatterproof.

  20. AccuRT: A versatile tool for radiative transfer simulations in the coupled atmosphere-ocean system

    NASA Astrophysics Data System (ADS)

    Hamre, Børge; Stamnes, Snorre; Stamnes, Knut; Stamnes, Jakob

    2017-02-01

    Reliable, accurate, and efficient modeling of the transport of electromagnetic radiation in turbid media has important applications in the study of the Earth's climate by remote sensing. For example, such modeling is needed to develop forward-inverse methods used to quantify types and concentrations of aerosol and cloud particles in the atmosphere, the dissolved organic and particulate biogeochemical matter in lakes, rivers, coastal, and open-ocean waters. It is also needed to simulate the performance of remote sensing detectors deployed on aircraft, balloons, and satellites as well as radiometric detectors deployed on buoys, gliders and other aquatic observing systems. Accurate radiative transfer modeling is also required to compute irradiances and scalar irradiances that are used to compute warming/cooling and photolysis rates in the atmosphere and primary production and warming/cooling rates in the water column. AccuRT is a radiative transfer model for the coupled atmosphere-water system that is designed to be a versatile tool for researchers in the ocean optics and remote sensing communities. It addresses the needs of researchers interested in analyzing irradiance and radiance measurements in the field and laboratory as well as those interested in making simulations of the top-of-the-atmosphere radiance in support of remote sensing algorithm development.

  1. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600
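
    A rough sketch of a spatial-charge-style score, summing clustered negative partial charges around exposed atoms; the cutoff, exposure flags, and scoring rule are illustrative assumptions and differ in detail from the published SCM definition.

      # Hedged sketch: a spatial-charge-style score over an antibody structure.
      import numpy as np

      def spatial_charge_score(coords, charges, exposed, cutoff=10.0):
          """coords: (N,3) atom positions; charges: (N,) partial charges;
          exposed: (N,) boolean solvent-exposure flags."""
          coords, charges = np.asarray(coords), np.asarray(charges)
          exposed = np.asarray(exposed, dtype=bool)
          score = 0.0
          for i in np.flatnonzero(exposed):
              d = np.linalg.norm(coords - coords[i], axis=1)
              neighborhood = charges[(d < cutoff) & exposed]
              score += abs(min(neighborhood.sum(), 0.0))   # count only net-negative patches
          return score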

  2. Mobile Building Energy Audit and Modeling Tools: Cooperative Research and Development Final Report, CRADA Number CRD-11-00441

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brackney, L.

    Broadly accessible, low cost, accurate, and easy-to-use energy auditing tools remain out of reach for managers of the aging U.S. building population (over 80% of U.S. commercial buildings are more than 10 years old*). concept3D and NREL's commercial buildings group will work to translate and extend NREL's existing spreadsheet-based energy auditing tool for a browser-friendly and mobile-computing platform. NREL will also work with concept3D to further develop a prototype geometry capture and materials inference tool operable on a smart phone/pad platform. These tools will be developed to interoperate with NREL's Building Component Library and OpenStudio energy modeling platforms, and will be marketed by concept3D to commercial developers, academic institutions and governmental agencies. concept3D is NREL's lead developer and subcontractor of the Building Component Library.

  3. Simulations of recoiling black holes: adaptive mesh refinement and radiative transfer

    NASA Astrophysics Data System (ADS)

    Meliani, Zakaria; Mizuno, Yosuke; Olivares, Hector; Porth, Oliver; Rezzolla, Luciano; Younsi, Ziri

    2017-02-01

    Context. In many astrophysical phenomena, and especially in those that involve the high-energy regimes that always accompany the astronomical phenomenology of black holes and neutron stars, physical conditions that are achieved are extreme in terms of speeds, temperatures, and gravitational fields. In such relativistic regimes, numerical calculations are the only tool to accurately model the dynamics of the flows and the transport of radiation in the accreting matter. Aims: We here continue our effort of modelling the behaviour of matter when it orbits or is accreted onto a generic black hole by developing a new numerical code that employs advanced techniques geared towards solving the equations of general-relativistic hydrodynamics. Methods: More specifically, the new code employs a number of high-resolution shock-capturing Riemann solvers and reconstruction algorithms, exploiting the enhanced accuracy and the reduced computational cost of adaptive mesh-refinement (AMR) techniques. In addition, the code makes use of sophisticated ray-tracing libraries that, coupled with general-relativistic radiation-transfer calculations, allow us to accurately compute the electromagnetic emissions from such accretion flows. Results: We validate the new code by presenting an extensive series of stationary accretion flows either in spherical or axial symmetry that are performed either in two or three spatial dimensions. In addition, we consider the highly nonlinear scenario of a recoiling black hole produced in the merger of a supermassive black-hole binary interacting with the surrounding circumbinary disc. In this way, we can present for the first time ray-traced images of the shocked fluid and the light curve resulting from consistent general-relativistic radiation-transport calculations from this process. Conclusions: The work presented here lays the ground for the development of a generic computational infrastructure employing AMR techniques to accurately and self-consistently calculate general-relativistic accretion flows onto compact objects. In addition to the accurate handling of the matter, we provide a self-consistent electromagnetic emission from these scenarios by solving the associated radiative-transfer problem. While magnetic fields are currently excluded from our analysis, the tools presented here can have a number of applications to study accretion flows onto black holes or neutron stars.

  4. Radio Frequency Mass Gauging of Propellants

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Vaden, Karl R.; Herlacher, Michael D.; Buchanan, David A.; VanDresar, Neil T.

    2007-01-01

    A combined experimental and computer simulation effort was conducted to measure radio frequency (RF) tank resonance modes in a dewar partially filled with liquid oxygen, and compare the measurements with numerical simulations. The goal of the effort was to demonstrate that computer simulations of a tank's electromagnetic eigenmodes can be used to accurately predict ground-based measurements, thereby providing a computational tool for predicting tank modes in a low-gravity environment. Matching the measured resonant frequencies of several tank modes with computer simulations can be used to gauge the amount of liquid in a tank, thus providing a possible method to gauge cryogenic propellant tanks in low-gravity. Using a handheld RF spectrum analyzer and a small antenna in a 46 liter capacity dewar for experimental measurements, we have verified that the four lowest transverse magnetic eigenmodes can be accurately predicted as a function of liquid oxygen fill level using computer simulations. The input to the computer simulations consisted of tank dimensions, and the dielectric constant of the fluid. Without using any adjustable parameters, the calculated and measured frequencies agree such that the liquid oxygen fill level was gauged to within 2 percent full scale uncertainty. These results demonstrate the utility of using electromagnetic simulations to form the basis of an RF mass gauging technology with the power to simulate tank resonance frequencies from arbitrary fluid configurations.
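
    For intuition, the sketch below evaluates ideal TM_mnp resonances of a cylindrical cavity under a uniform-dielectric assumption; the example dimensions and the liquid-oxygen permittivity are illustrative only, since the partially filled dewar studied here requires a full electromagnetic simulation.

      # Hedged sketch: TM_mnp resonant frequencies of an ideal cylindrical cavity
      # uniformly filled with a dielectric of relative permittivity eps_r.
      import numpy as np
      from scipy.special import jn_zeros

      C0 = 299_792_458.0  # speed of light in vacuum, m/s

      def tm_mode_frequency(m, n, p, radius, length, eps_r=1.0):
          x_mn = jn_zeros(m, n)[-1]          # n-th zero of the Bessel function J_m
          k = np.sqrt((x_mn / radius) ** 2 + (p * np.pi / length) ** 2)
          return C0 * k / (2.0 * np.pi * np.sqrt(eps_r))

      # Example: lowest TM010 mode of a 0.2 m radius, 0.5 m tall cavity,
      # empty versus uniformly filled with liquid oxygen (eps_r assumed ~1.48).
      print(tm_mode_frequency(0, 1, 0, 0.2, 0.5))
      print(tm_mode_frequency(0, 1, 0, 0.2, 0.5, eps_r=1.48))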

  5. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools.

    PubMed

    Deshmukh, Rupesh K; Sonah, Humira; Bélanger, Richard R

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research.

  6. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459

  7. Computer analysis of lighting style in fine art: steps towards inter-artist studies

    NASA Astrophysics Data System (ADS)

    Stork, David G.

    2011-03-01

    Stylometry in visual art - the mathematical description of artists' styles - has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting cues. Perceptual studies show that observers are poor judges of properties of lighting in photographs, such as consistency (and thus, by extension, in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis and spherical harmonic based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, and Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and support diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements to these new computational tools.

  8. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    PubMed

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

    Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the gate/geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with gate/geant4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
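
    A compact sketch of the first two stages of this workflow (Latin Hypercube sampling plus a Gaussian-process response surface), with a placeholder drag-coefficient function standing in for the TPMC simulation and the MCMC evaluation step omitted; the sample size and parameters are illustrative.

      # Hedged sketch: LHS design + Gaussian-process response surface.
      import numpy as np
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor

      def tpmc_drag_coefficient(x):
          # Placeholder for a Test Particle Monte Carlo run over normalized inputs.
          return 2.2 + 0.5 * np.sin(x[:, 0]) + 0.1 * x[:, 1] ** 2

      # 1. Latin Hypercube Sample of a (here 2-D) normalized parameter space.
      sampler = qmc.LatinHypercube(d=2, seed=0)
      X = sampler.random(n=200)

      # 2. "Run" the simulation at each ensemble member.
      y = tpmc_drag_coefficient(X)

      # 3. Fit the response surface and query it at a new point.
      rsm = GaussianProcessRegressor(normalize_y=True).fit(X, y)
      cd, cd_std = rsm.predict(np.array([[0.4, 0.7]]), return_std=True)
      print(cd[0], cd_std[0])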

  10. Computational prediction of type III and IV secreted effectors in Gram-negative bacteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Corrigan, Abigail L.; Peterson, Elena S.

    2011-01-01

    In this review, we provide an overview of the methods employed by four recent papers that described novel methods for computational prediction of secreted effectors from type III and IV secretion systems in Gram-negative bacteria. We summarize the results of these studies in terms of performance at accurately predicting secreted effectors, and the similarities found between secretion signals that may reflect biologically relevant features for recognition. We discuss the web-based tools for secreted effector prediction described in these studies and announce the availability of our tool, the SIEVEserver (http://www.biopilot.org). Finally, we assess the accuracy of the three type III effector prediction methods on a small set of proteins not known prior to the development of these tools that we have recently discovered and validated using both experimental and computational approaches. Our comparison shows that all methods use similar approaches and, in general, arrive at similar conclusions. We discuss the possibility of an order-dependent motif in the secretion signal, which was a point of disagreement among the studies. Our results show that there may be classes of effectors in which the signal has a loosely defined motif, and others in which secretion is dependent only on compositional biases. Computational prediction of secreted effectors from protein sequences represents an important step toward better understanding the interaction between pathogens and hosts.

  11. Dosimetry applications in GATE Monte Carlo toolkit.

    PubMed

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications for simulated diagnostic and therapeutic protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described, including molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values and Brachytherapy parameters, and has been compared against various MC codes which have been considered standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  12. RATT: Rapid Annotation Transfer Tool

    PubMed Central

    Otto, Thomas D.; Dillon, Gary P.; Degrave, Wim S.; Berriman, Matthew

    2011-01-01

    Second-generation sequencing technologies have made large-scale sequencing projects commonplace. However, making use of these datasets often requires gene function to be ascribed genome wide. Although tool development has kept pace with the changes in sequence production for tasks such as mapping, de novo assembly or visualization, genome annotation remains a challenge. We have developed a method to rapidly provide accurate annotation for new genomes using previously annotated genomes as a reference. The method, implemented in a tool called RATT (Rapid Annotation Transfer Tool), transfers annotations from a high-quality reference to a new genome on the basis of conserved synteny. We demonstrate that a Mycobacterium tuberculosis genome or a single 2.5 Mb chromosome from a malaria parasite can be annotated in less than five minutes with only modest computational resources. RATT is available at http://ratt.sourceforge.net. PMID:21306991

  13. Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus

    PubMed Central

    Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda

    2018-01-01

    Volume measurements of the maxillary sinus may be useful to identify diseases affecting the paranasal sinuses. However, the literature shows a lack of consensus among studies measuring this volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, and focuses of investigation, among other factors. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
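
    The paper's exact pipeline is not reproduced in the abstract; the sketch below is a generic threshold-plus-morphology-plus-watershed chain on a CT volume using scikit-image and scipy, intended only to show how the named operators fit together. The HU threshold, structuring element, marker strategy, and the synthetic test volume are illustrative assumptions, not the validated method.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import morphology, segmentation

      def air_cavity_volume(ct_hu, voxel_volume_mm3, air_hu=-400):
          """Rough air-cavity volume from a CT volume in Hounsfield units."""
          air = ct_hu < air_hu                                       # threshold step
          air = morphology.binary_opening(air, morphology.ball(1))   # morphological cleanup
          # Watershed on the distance transform to separate touching cavities.
          distance = ndi.distance_transform_edt(air)
          markers, _ = ndi.label(distance > 0.7 * distance.max())
          labels = segmentation.watershed(-distance, markers, mask=air)
          # Keep the largest labelled cavity as a stand-in for one sinus.
          sizes = np.bincount(labels.ravel())[1:]
          largest = labels == (np.argmax(sizes) + 1)
          return largest.sum() * voxel_volume_mm3

      # Example with a synthetic volume: a -1000 HU sphere inside +40 HU tissue.
      ct = np.full((60, 60, 60), 40.0)
      zz, yy, xx = np.mgrid[:60, :60, :60]
      ct[(zz - 30) ** 2 + (yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2] = -1000.0
      print(air_cavity_volume(ct, voxel_volume_mm3=0.5 ** 3), "mm^3")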

  14. Physics education through computational tools: the case of geometrical and physical optics

    NASA Astrophysics Data System (ADS)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-09-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we show the pedagogic strategies implemented for the courses in geometrical and physical optics for students of optometry: the use of the GeoGebra software for the geometrical optics class, and the use of new in-house software for the physical optics class written in the high-level programming language Python, together with the corresponding activities developed for each of these applets.
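
    The in-house code itself is not reproduced in the abstract; as a flavor of what such a Python applet might compute for the physical optics class, here is a minimal sketch of the Fraunhofer double-slit intensity pattern. The wavelength, slit width, slit separation, and screen distance are arbitrary illustrative values.

      import numpy as np

      wavelength = 633e-9      # m (He-Ne red, arbitrary choice)
      slit_width = 50e-6       # m
      separation = 250e-6      # m
      screen_dist = 1.0        # m

      x = np.linspace(-0.02, 0.02, 2001)           # screen coordinate [m]
      theta = x / screen_dist                       # small-angle approximation

      beta = np.pi * slit_width * np.sin(theta) / wavelength
      delta = np.pi * separation * np.sin(theta) / wavelength
      envelope = np.sinc(beta / np.pi) ** 2         # single-slit diffraction envelope
      intensity = envelope * np.cos(delta) ** 2     # two-slit interference

      print("fringe spacing ~", wavelength * screen_dist / separation, "m")
      print("peak relative intensity:", intensity.max())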

  15. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    PubMed

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where the current tools do not perform well in order to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious and effective choices for each step of the genome assembly pipeline using nanopore sequence data. Also, the bottlenecks we have identified can help developers improve the current tools or build new ones that are both accurate and fast, to overcome the high error rates of nanopore sequencing technology.

  16. Solving subsurface structural problems using a computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.M.

    1987-02-01

    Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three-point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data, can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.
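
    The abstract names apparent-dip calculations without giving the formula; a minimal sketch of the standard relation, tan(apparent dip) = tan(true dip) * cos(phi) with phi the angle between the cross-section azimuth and the true-dip azimuth, is shown below. The function name and example values are mine, not part of the software described.

      import math

      def apparent_dip(true_dip_deg, dip_azimuth_deg, section_azimuth_deg):
          """Apparent dip (degrees) of a plane seen in a vertical cross-section.

          Uses tan(apparent) = tan(true) * cos(phi), where phi is the angle
          between the section direction and the true-dip direction.
          """
          phi = math.radians(section_azimuth_deg - dip_azimuth_deg)
          return math.degrees(math.atan(
              math.tan(math.radians(true_dip_deg)) * abs(math.cos(phi))))

      # A bed dipping 30 degrees toward 090 viewed in a section trending 045:
      print(round(apparent_dip(30.0, 90.0, 45.0), 1), "degrees")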

  17. Stereolithography: a potential new tool in forensic medicine.

    PubMed

    Dolz, M S; Cina, S J; Smith, R

    2000-06-01

    Stereolithography is a computer-mediated method that can be used to quickly create anatomically correct three-dimensional epoxy and acrylic resin models from various types of medical data. Multiple imaging modalities can be exploited, including computed tomography and magnetic resonance imaging. The technology was first developed and used in 1986 to overcome limitations in previous computer-aided manufacturing/milling techniques. Stereolithography is presently used to accurately reproduce both the external and internal anatomy of body structures. Current medical uses of stereolithography include preoperative planning of orthopedic and maxillofacial surgeries, the fabrication of custom prosthetic devices, and the assessment of the degree of bony and soft-tissue injury caused by trauma. We propose that there is a useful, as yet untapped, potential for this technology in forensic medicine.

  18. Modeling RF-induced Plasma-Surface Interactions with VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, David N.; Pankin, Alexei Y.; Roark, Christine M.; Stoltz, Peter H.; Zhou, Sean C.-D.; Kruger, Scott E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath dynamics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath (e.g. sputtering), can thus be simulated in complex, experimentally relevant geometries. Simulations of RF sheath-enhanced impurity production near surfaces of the C-Mod field-aligned ICRF antenna are presented to illustrate the model; impurity mitigation techniques are also explored. Model extensions to capture the physics of secondary electron emission and of multispecies plasmas are summarized, together with a discussion of improved tools for plasma chemistry and IEDF/EEDF visualization and modeling. The latter tools are also highly relevant for commercial plasma processing applications. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling fusion and industrial plasma processes. Supported by U.S. DoE SBIR Phase I/II Award DE-SC0009501.

  19. Ordinary kriging as a tool to estimate historical daily streamflow records

    USGS Publications Warehouse

    Farmer, William H.

    2016-01-01

    Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on a regional basis. Pooling parameters to estimate a single, time-invariant characterization of spatial semivariance structure is shown to produce accurate reproduction of streamflow. This approach is contrasted with a time-varying series of variograms, representing the temporal evolution and behavior of the spatial semivariance structure. Furthermore, the ordinary kriging approach is shown to produce more accurate time series than more common, single-index hydrologic transfers. A comparison between topological kriging and ordinary kriging is less definitive, showing the ordinary kriging approach to be significantly inferior in terms of Nash–Sutcliffe model efficiencies while maintaining significantly superior performance measured by root mean squared errors. Given the similarity of performance and the computational efficiency of ordinary kriging, it is concluded that ordinary kriging is useful for first-order approximation of daily streamflow time series in ungaged watersheds.
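
    To make the interpolation step concrete, the sketch below solves the ordinary-kriging system (with a Lagrange multiplier to enforce unit-sum weights) for one ungaged target site, given an assumed exponential semivariogram. The variogram parameters, distances, and flow values are placeholders; the study itself pooled semivariogram parameters across gages and time rather than using toy numbers like these.

      import numpy as np

      def exp_semivariogram(h, sill=1.0, range_km=100.0):
          return sill * (1.0 - np.exp(-h / range_km))

      def ordinary_kriging_weights(dist_between_gages, dist_to_target, gamma):
          """Solve the ordinary-kriging system with a Lagrange multiplier."""
          n = len(dist_to_target)
          A = np.ones((n + 1, n + 1))
          A[:n, :n] = gamma(dist_between_gages)
          A[n, n] = 0.0
          b = np.append(gamma(dist_to_target), 1.0)
          w = np.linalg.solve(A, b)
          return w[:n]                      # kriging weights (sum to 1)

      # Three gaged sites (pairwise distances in km) and one ungaged target site.
      D = np.array([[0.0, 40.0, 90.0],
                    [40.0, 0.0, 60.0],
                    [90.0, 60.0, 0.0]])
      d0 = np.array([30.0, 50.0, 70.0])
      w = ordinary_kriging_weights(D, d0, exp_semivariogram)

      flows_today = np.array([12.0, 9.5, 20.0])   # observed daily streamflow [m^3/s]
      print("weights:", np.round(w, 3), "estimate:", round(w @ flows_today, 2), "m^3/s")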

  20. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission (Delta)V offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.

  1. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

    Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the prediction of the RFA difficult. A monitoring tool which can be personalized for a given patient during the intervention would be helpful to achieve a complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations are performed on two cadaveric bovine livers, and we achieve an error of 2.2 °C on average between the computed temperatures and the thermistor measurements, and of 1.4 °C and 2.7 °C on average between the temperatures computed and those monitored by US during the ablation at two different time points (t = 240 s and t = 900 s).

  2. Conceptual Design Oriented Wing Structural Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Lau, May Yuen

    1996-01-01

    Airplane optimization has always been the goal of airplane designers. In the conceptual design phase, a designer's goal could be tradeoffs between maximum structural integrity, minimum aerodynamic drag, or maximum stability and control, objectives that are often pursued separately. Bringing all of these factors into an iterative preliminary design procedure was time consuming, tedious, and not always accurate. For example, the final weight estimate would often be based upon statistical data from past airplanes. The new design would be classified based on gross characteristics, such as number of engines, wingspan, etc., to see which airplanes of the past most closely resembled the new design. This procedure works well for conventional airplane designs, but not very well for new innovative designs. With the computing power of today, new methods are emerging for the conceptual design phase of airplanes. Using finite element methods, computational fluid dynamics, and other computer techniques, designers can make very accurate disciplinary analyses of an airplane design. These tools are computationally intensive, and when used repeatedly, they consume a great deal of computing time. In order to reduce the time required to analyze a design and still bring together all of the disciplines (such as structures, aerodynamics, and controls) into the analysis, simplified design computer analyses are linked together into one computer program. These design codes are very efficient for conceptual design. The work in this thesis is focused on a finite element based conceptual design oriented structural synthesis capability (CDOSS) tailored to be linked into ACSYNT.

  3. Chemical vapor deposition fluid flow simulation modelling tool

    NASA Technical Reports Server (NTRS)

    Bullister, Edward T.

    1992-01-01

    Accurate numerical simulation of chemical vapor deposition (CVD) processes requires a general purpose computational fluid dynamics package combined with specialized capabilities for high temperature chemistry. In this report, we describe the implementation of these specialized capabilities in the spectral element code NEKTON. The thermal expansion of the gases involved is shown to be accurately approximated by the low Mach number perturbation expansion of the incompressible Navier-Stokes equations. The radiative heat transfer between multiple interacting radiating surfaces is shown to be tractable using the method of Gebhart. The disparate rates of reaction and diffusion in CVD processes are calculated via a point-implicit time integration scheme. We demonstrate the use of the above capabilities on prototypical CVD applications.

  4. On the computational modeling of the viscosity of colloidal dispersions and its relation with basic molecular interactions

    NASA Astrophysics Data System (ADS)

    Gama Goicochea, A.; Balderas Altamirano, M. A.; Lopez-Esparza, R.; Waldo-Mendoza, Miguel A.; Perez, E.

    2015-09-01

    The connection between fundamental interactions acting in molecules in a fluid and macroscopically measured properties, such as the viscosity between colloidal particles coated with polymers, is studied here. The role that hydrodynamic and Brownian forces play in colloidal dispersions is also discussed. It is argued that many-body systems in which all these interactions take place can be accurately solved using computational simulation tools. One of those modern tools is the technique known as dissipative particle dynamics, which incorporates Brownian and hydrodynamic forces, as well as basic conservative interactions. A case study is reported, as an example of the applications of this technique, which consists of the prediction of the viscosity and friction between two opposing parallel surfaces covered with polymer chains, under the influence of a steady flow. This work is intended to serve as an introduction to the subject of colloidal dispersions and computer simulations, for final-year undergraduate students and beginning graduate students who are interested in beginning research in soft matter systems. To that end, a computational code is included that students can use right away to study complex fluids in equilibrium.
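
    The student code accompanying the paper is not reproduced in the abstract; as an independent, minimal illustration of the method named here, the sketch below evaluates the three standard DPD pairwise forces (conservative, dissipative, random) for one pair of beads in reduced units, following the usual Groot-Warren form. All parameter values are conventional defaults chosen for illustration.

      import numpy as np

      rng = np.random.default_rng(7)

      def dpd_pair_force(r_i, r_j, v_i, v_j, a=25.0, gamma=4.5, kT=1.0, rc=1.0, dt=0.04):
          """Total DPD force on bead i from bead j (reduced units, Groot-Warren form)."""
          r_ij = r_i - r_j
          r = np.linalg.norm(r_ij)
          if r >= rc:
              return np.zeros(3)
          e = r_ij / r
          w = 1.0 - r / rc                       # weight function w(r)
          sigma = np.sqrt(2.0 * gamma * kT)      # fluctuation-dissipation relation
          f_c = a * w * e                                     # conservative (soft repulsion)
          f_d = -gamma * w**2 * np.dot(e, v_i - v_j) * e      # dissipative (friction)
          f_r = sigma * w * rng.standard_normal() * e / np.sqrt(dt)   # random (thermal noise)
          return f_c + f_d + f_r

      f = dpd_pair_force(np.array([0.0, 0.0, 0.0]), np.array([0.6, 0.0, 0.0]),
                         np.array([0.1, 0.0, 0.0]), np.array([-0.1, 0.0, 0.0]))
      print(f)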

  5. CADRE-SS, an in Silico Tool for Predicting Skin Sensitization Potential Based on Modeling of Molecular Interactions.

    PubMed

    Kostal, Jakub; Voutchkova-Kostal, Adelina

    2016-01-19

    Using computer models to accurately predict toxicity outcomes is considered to be a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated in predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising with an overall very good concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.

  6. Reduced-order surrogate models for Green's functions in black hole spacetimes

    NASA Astrophysics Data System (ADS)

    Galley, Chad; Wardell, Barry

    2016-03-01

    The fundamental nature of linear wave propagation in curved spacetime is encoded in the retarded Green's function (or propagator). Green's functions are useful tools because almost any field quantity of interest can be computed via convolution integrals with a source. In addition, perturbation theories involving nonlinear wave propagation can be expressed in terms of multiple convolutions of the Green's function. Recently, numerical solutions for propagators in black hole spacetimes have been found that are globally valid and accurate for computing physical quantities. However, the data generated is too large for practical use because the propagator depends on two spacetime points that must be sampled finely to yield accurate convolutions. I describe how to build a reduced-order model that can be evaluated as a substitute, or surrogate, for solutions of the curved spacetime Green's function equation. The resulting surrogate accurately and quickly models the original and out-of-sample data. I discuss applications of the surrogate, including self-consistent evolutions and waveforms of extreme mass ratio binaries. Green's function surrogate models provide a new and practical way to handle many old problems involving wave propagation and motion in curved spacetimes.

  7. Accurate atom-mapping computation for biochemical reactions.

    PubMed

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to account for the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.

  8. Computational Fluid Dynamic Solutions of Optimized Heat Shields Designed for Earth Entry

    DTIC Science & Technology

    2010-01-01

    [Abstract not recoverable from the extracted text; only nomenclature and acronym fragments survive: ρ = density (kg/m3), σ = Stefan-Boltzmann constant, τ = shear stress tensor, τT-V = T-V relaxation time, τe-V = e-V relaxation time, φ = sweep angle; DES = Differential Evolutionary Scheme, DOR = Design Optimization Tools, DPLR = Data Parallel Line Relaxation, GSLR = Gauss-Seidel Line Relaxation. The surviving text notes that a radiation model based on the Stefan-Boltzmann constant provides accurate heating predictions, especially for the non-ablating heat shields explored in this work.]

  9. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  10. Computational Material Processing in Microgravity

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Working with Professor David Matthiesen at Case Western Reserve University (CWRU), a computer model of the DPIMS (Diffusion Processes in Molten Semiconductors) space experiment was developed that is able to predict the thermal field, flow field, and concentration profile within a molten germanium capillary under both ground-based and microgravity conditions. These models are coupled with a novel nonlinear statistical methodology for estimating the diffusion coefficient from measured concentration values after a given time that yields a more accurate estimate than traditional methods. This code was integrated into a web-based application that has become a standard tool used by engineers in the Materials Science Department at CWRU.

  11. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

    PubMed

    Yip, George W; Rajendran, Kanagasuntheram

    2008-06-01

    Computer-aided instruction materials are becoming increasingly popular in medical education and particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing a three-dimensional visualization of human structure that is essential to applications in clinical practice and the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

  12. Trends in Programming Languages for Neuroscience Simulations

    PubMed Central

    Davison, Andrew P.; Hines, Michael L.; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing. PMID:20198154

  13. Trends in programming languages for neuroscience simulations.

    PubMed

    Davison, Andrew P; Hines, Michael L; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing.

  14. Conic state extrapolation. [computer program for space shuttle navigation and guidance requirements

    NASA Technical Reports Server (NTRS)

    Shepperd, S. W.; Robertson, W. M.

    1973-01-01

    The Conic State Extrapolation Routine provides the capability to conically extrapolate any spacecraft inertial state vector either backwards or forwards as a function of time or as a function of transfer angle. It is merely the coded form of two versions of the solution of the two-body differential equations of motion of the spacecraft center of mass. Because of its relatively fast computation speed and moderate accuracy, it serves as a preliminary navigation tool and as a method of obtaining quick solutions for targeting and guidance functions. More accurate (but slower) results are provided by the Precision State Extrapolation Routine.
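
    As a minimal illustration of conic (two-body) extrapolation for the elliptic case only, the sketch below advances the mean anomaly, solves Kepler's equation by Newton iteration, and returns the new true anomaly. The universal-variable formulation typically used by such routines handles all conic types and both time- and transfer-angle-driven propagation, which this toy example does not; the orbit values are arbitrary.

      import math

      MU_EARTH = 398600.4418  # km^3/s^2

      def propagate_true_anomaly(a_km, e, nu0_rad, dt_s):
          """Elliptic two-body propagation: true anomaly after dt seconds."""
          # True -> eccentric -> mean anomaly at epoch.
          E0 = 2.0 * math.atan2(math.sqrt(1 - e) * math.sin(nu0_rad / 2),
                                math.sqrt(1 + e) * math.cos(nu0_rad / 2))
          M = E0 - e * math.sin(E0) + math.sqrt(MU_EARTH / a_km**3) * dt_s
          # Solve Kepler's equation M = E - e sin E by Newton iteration.
          E = M
          for _ in range(20):
              E -= (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
          # Eccentric -> true anomaly.
          return 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                                  math.sqrt(1 - e) * math.cos(E / 2))

      # Half an orbital period of a 7000 km, nearly circular orbit (expect ~180 degrees):
      half_period = 0.5 * 2 * math.pi * math.sqrt(7000.0**3 / MU_EARTH)
      print(math.degrees(propagate_true_anomaly(7000.0, 0.01, 0.0, half_period)))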

  15. A Design Tool for Liquid Rocket Engine Injectors

    NASA Technical Reports Server (NTRS)

    Farmer, R.; Cheng, G.; Trinh, H.; Tucker, K.

    2000-01-01

    A practical design tool which emphasizes the analysis of flowfields near the injector face of liquid rocket engines has been developed and used to simulate preliminary configurations of NASA's Fastrac and vortex engines. This computational design tool is sufficiently detailed to predict the interactive effects of injector element impingement angles and points and the momenta of the individual orifice flows and the combusting flow which results. In order to simulate a significant number of individual orifices, a homogeneous computational fluid dynamics model was developed. To describe sub- and supercritical liquid and vapor flows, the model utilized thermal and caloric equations of state which were valid over a wide range of pressures and temperatures. The model was constructed such that the local quality of the flow was determined directly. Since both the Fastrac and vortex engines utilize RP-1/LOX propellants, a simplified hydrocarbon combustion model was devised in order to accomplish three-dimensional, multiphase flow simulations. Such a model does not identify drops or their distribution, but it does allow the recirculating flow along the injector face and into the acoustic cavity and the film coolant flow to be accurately predicted.

  16. Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics

    NASA Astrophysics Data System (ADS)

    Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.

    2006-06-01

    Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.

  17. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  18. CFD Fuel Slosh Modeling of Fluid-Structure Interaction in Spacecraft Propellant Tanks with Diaphragms

    NASA Technical Reports Server (NTRS)

    Sances, Dillon J.; Gangadharan, Sathya N.; Sudermann, James E.; Marsell, Brandon

    2010-01-01

    Liquid sloshing within spacecraft propellant tanks causes rapid energy dissipation at resonant modes, which can result in attitude destabilization of the vehicle. Identifying resonant slosh modes currently requires experimental testing and mechanical pendulum analogs to characterize the slosh dynamics. Computational Fluid Dynamics (CFD) techniques have recently been validated as an effective tool for simulating fuel slosh within free-surface propellant tanks. Propellant tanks often incorporate an internal flexible diaphragm to separate ullage and propellant which increases modeling complexity. A coupled fluid-structure CFD model is required to capture the damping effects of a flexible diaphragm on the propellant. ANSYS multidisciplinary engineering software employs a coupled solver for analyzing two-way Fluid Structure Interaction (FSI) cases such as the diaphragm propellant tank system. Slosh models generated by ANSYS software are validated by experimental lateral slosh test results. Accurate data correlation would produce an innovative technique for modeling fuel slosh within diaphragm tanks and provide an accurate and efficient tool for identifying resonant modes and the slosh dynamic response.

  19. Using block pulse functions for seismic vibration semi-active control of structures with MR dampers

    NASA Astrophysics Data System (ADS)

    Rahimi Gendeshmin, Saeed; Davarnia, Daniel

    2018-03-01

    This article applies the idea of block pulse (BP) functions to the semi-active control of structures. BP functions provide effective tools for approximating complex problems. The applied control algorithm has a major effect on the performance of the controlled system and the requirements of the control devices. In control problems, it is important to devise an accurate analytical technique with low computational cost. BP functions have proved to be fundamental tools in approximation problems and have been applied in disparate areas over the last decades. This study focuses on employing BP functions in the control algorithm to reduce its computational cost. Magneto-rheological (MR) dampers are one of the well-known semi-active tools that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with results obtained by controlling the frame with an optimal control method based on linear quadratic regulator theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses; moreover, it has acceptable accuracy and agrees with the optimal control method at a lower computational cost.
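
    To make the block-pulse idea concrete, the sketch below approximates a signal by its averages over m equal subintervals (the BP coefficients) and reconstructs the piecewise-constant expansion; the test signal and number of blocks are arbitrary choices, not taken from the paper.

      import numpy as np

      def block_pulse_coeffs(f, T, m):
          """BP coefficients: the average of f over each of m equal subintervals of [0, T]."""
          edges = np.linspace(0.0, T, m + 1)
          t = np.linspace(0.0, T, 2000)
          y = f(t)
          return np.array([y[(t >= edges[i]) & (t < edges[i + 1])].mean() for i in range(m)])

      def block_pulse_eval(coeffs, T, t):
          """Evaluate the piecewise-constant BP expansion at times t."""
          idx = np.minimum((t / T * len(coeffs)).astype(int), len(coeffs) - 1)
          return coeffs[idx]

      f = lambda t: np.exp(-t) * np.sin(5 * t)        # arbitrary test signal
      T, m = 2.0, 16
      c = block_pulse_coeffs(f, T, m)
      t = np.linspace(0.0, T, 500)
      err = np.max(np.abs(f(t) - block_pulse_eval(c, T, t)))
      print(f"max approximation error with {m} blocks: {err:.3f}")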

  20. On the reliability of computed chaotic solutions of non-linear differential equations

    NASA Astrophysics Data System (ADS)

    Liao, Shijun

    2009-08-01

    A new concept, namely the critical predictable time Tc, is introduced to give a more precise description of computed chaotic solutions of non-linear differential equations: it is suggested that computed chaotic solutions are unreliable and doubtable when t > Tc. This provides a strategy for extracting the reliable portion of a given computed result. In this way, the computational phenomena, such as computational chaos (CC), computational periodicity (CP) and computational prediction uncertainty, which are mainly based on long-term properties of computed time-series, can be completely avoided. Using this concept, the famous conclusion `accurate long-term prediction of chaos is impossible' should be replaced by a more precise conclusion that `accurate prediction of chaos beyond the critical predictable time Tc is impossible'. So, this concept also provides a timescale to determine whether or not a particular time is long enough for a given non-linear dynamic system. Besides, the influence of data inaccuracy and various numerical schemes on the critical predictable time is investigated in detail by using symbolic computation software as a tool. A reliable chaotic solution of the Lorenz equation in a rather large interval 0 <= t < 1200 non-dimensional Lorenz time units is obtained for the first time. It is found that the precision of the initial condition and the computed data at each time step, which is mathematically necessary to get such a reliable chaotic solution over such a long time, is so high that it is physically impossible due to the Heisenberg uncertainty principle in quantum physics. This, however, provides a so-called `precision paradox of chaos', which suggests that the prediction uncertainty of chaos is physically unavoidable, and that even macroscopic phenomena might be essentially stochastic and thus could be described more economically by probability.

  1. MiRduplexSVM: A High-Performing MiRNA-Duplex Prediction and Evaluation Methodology

    PubMed Central

    Karathanasis, Nestoras; Tsamardinos, Ioannis; Poirazi, Panayiota

    2015-01-01

    We address the problem of predicting the position of a miRNA duplex on a microRNA hairpin via the development and application of a novel SVM-based methodology. Our method combines a unique problem representation and an unbiased optimization protocol to learn from mirBase19.0 an accurate predictive model, termed MiRduplexSVM. This is the first model that provides precise information about all four ends of the miRNA duplex. We show that (a) our method outperforms four state-of-the-art tools, namely MaturePred, MiRPara, MatureBayes, and MiRdup, as well as a Simple Geometric Locator, when applied on the same training datasets employed for each tool and evaluated on a common blind test set. (b) In all comparisons, MiRduplexSVM shows superior performance, achieving up to a 60% increase in prediction accuracy for mammalian hairpins and can generalize very well on plant hairpins, without any special optimization. (c) The tool has a number of important applications such as the ability to accurately predict the miRNA or the miRNA*, given the opposite strand of a duplex. Its performance on this task is superior to the 2-nt overhang rule commonly used in computational studies and similar to that of a comparative genomic approach, without the need for prior knowledge or the complexity of performing multiple alignments. Finally, it is able to evaluate novel, potential miRNAs found either computationally or experimentally. In relation to recent confidence evaluation methods used in miRBase, MiRduplexSVM was successful in identifying high confidence potential miRNAs. PMID:25961860

  2. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and on research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists that is caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of a computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing any scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models do. In this setting, users and developers depend on an entire software distribution, possibly involving multiple compilers and special build instructions that vary with the environment of the target machine. To solve this problem we have developed hashdist, a stateless package management tool, and a resulting portable, open-source scientific software distribution.

  3. Advanced Doubling Adding Method for Radiative Transfer in Planetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Liu, Quanhua; Weng, Fuzhong

    2006-12-01

    The doubling adding method (DA) is one of the most accurate tools for detailed multiple-scattering calculations. The principle of the method goes back to the nineteenth century in a problem dealing with reflection and transmission by glass plates. Since then the doubling adding method has been widely used as a reference tool for other radiative transfer models. The method has never been used in operational applications owing to tremendous demand on computational resources from the model. This study derives an analytical expression replacing the most complicated thermal source terms in the doubling adding method. The new development is called the advanced doubling adding (ADA) method. Thanks also to the efficiency of matrix and vector manipulations in FORTRAN 90/95, the advanced doubling adding method is about 60 times faster than the doubling adding method. The radiance (i.e., forward) computation code of ADA is easily translated into tangent linear and adjoint codes for radiance gradient calculations. The simplicity in forward and Jacobian computation codes is very useful for operational applications and for the consistency between the forward and adjoint calculations in satellite data assimilation.
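
    For readers unfamiliar with the adding/doubling principle, the sketch below combines the reflection and transmission operators of two homogeneous layers with the standard adding formulas and then doubles a thin starting layer to build a thick one. The 2x2 "stream" matrices are arbitrary toy numbers, not a real discrete-ordinate discretization, and the symmetric-layer assumption (same response to illumination from above and below) is made for simplicity.

      import numpy as np

      def add_layers(R1, T1, R2, T2):
          """Adding equations for two homogeneous (symmetric) layers, layer 1 on top."""
          n = R1.shape[0]
          S = np.linalg.inv(np.eye(n) - R1 @ R2)   # geometric series of inter-layer reflections
          T = T2 @ S @ T1                          # transmission of the combined layer
          R = R1 + T1 @ R2 @ S @ T1                # reflection of the combined layer
          return R, T

      def double(R, T, n_doublings):
          """Doubling: repeatedly combine a thin layer with itself to build a thick one."""
          for _ in range(n_doublings):
              R, T = add_layers(R, T, R, T)
          return R, T

      # A thin, weakly scattering starting layer in a 2-stream representation (toy numbers).
      R0 = np.array([[0.02, 0.01],
                     [0.01, 0.02]])
      T0 = np.array([[0.95, 0.01],
                     [0.01, 0.95]])
      R, T = double(R0, T0, 10)       # equivalent to 2**10 thin layers
      print("thick-layer reflection:\n", np.round(R, 3))
      print("thick-layer transmission:\n", np.round(T, 3))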

  4. NetCoffee: a fast and accurate global alignment approach to identify functionally conserved proteins in multiple networks.

    PubMed

    Hu, Jialu; Kehr, Birte; Reinert, Knut

    2014-02-15

    Owing to recent advancements in high-throughput technologies, protein-protein interaction networks of more and more species become available in public databases. The question of how to identify functionally conserved proteins across species attracts a lot of attention in computational biology. Network alignments provide a systematic way to solve this problem. However, most existing alignment tools encounter limitations in tackling this problem. Therefore, the demand for faster and more efficient alignment tools is growing. We present a fast and accurate algorithm, NetCoffee, which allows to find a global alignment of multiple protein-protein interaction networks. NetCoffee searches for a global alignment by maximizing a target function using simulated annealing on a set of weighted bipartite graphs that are constructed using a triplet approach similar to T-Coffee. To assess its performance, NetCoffee was applied to four real datasets. Our results suggest that NetCoffee remedies several limitations of previous algorithms, outperforms all existing alignment tools in terms of speed and nevertheless identifies biologically meaningful alignments. The source code and data are freely available for download under the GNU GPL v3 license at https://code.google.com/p/netcoffee/.

  5. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  6. Development of a ROV Deployed Video Analysis Tool for Rapid Measurement of Submerged Oil/Gas Leaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savas, Omer

    Expanded deep sea drilling around the globe makes it necessary to have readily available tools to quickly and accurately measure discharge rates from accidental submerged oil/gas leak jets, so that first responders can deploy adequate resources for containment. We have developed and tested a field-deployable video analysis software package which provides flow rate estimates that are sufficiently accurate for initial responders to accidental oil discharges in submarine operations. The essence of our approach is based on tracking coherent features at the interface in the near field of immiscible turbulent jets. The software package, UCB_Plume, is ready to be used by first responders for field implementation. We have tested the tool on submerged water and oil jets which are made visible using fluorescent dyes, and have been able to estimate the discharge rate within 20% accuracy. A high-end Windows laptop computer as the operating platform and a USB-connected high-speed, high-resolution monochrome camera as the imaging device are sufficient for acquiring flow images under continuous unidirectional illumination and running the software in the field. Results are obtained in a matter of minutes.

  7. High Accuracy Liquid Propellant Slosh Predictions Using an Integrated CFD and Controls Analysis Interface

    NASA Technical Reports Server (NTRS)

    Marsell, Brandon; Griffin, David; Schallhorn, Dr. Paul; Roth, Jacob

    2012-01-01

    Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs, which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool and the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.

  8. Integrated CFD and Controls Analysis Interface for High Accuracy Liquid Propellant Slosh Predictions

    NASA Technical Reports Server (NTRS)

    Marsell, Brandon; Griffin, David; Schallhorn, Paul; Roth, Jacob

    2012-01-01

    Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs, which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool and the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.

  9. A CFD-informed quasi-steady model of flapping wing aerodynamics.

    PubMed

    Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J

    2015-11-01

    Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems.
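
    A stripped-down version of the parameterisation step described above: given sampled kinematics and force histories, the proportional coefficients of a quasi-steady model can be obtained with ordinary least squares. The two regressors and the synthetic "CFD" force data below are placeholders invented for illustration, not the model terms or data used by the authors.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic flapping kinematics over one wingbeat (stand-ins for real data).
      t = np.linspace(0.0, 1.0, 200)
      phi_dot = np.cos(2 * np.pi * t)                  # stroke angular velocity
      alpha_dot = 0.5 * np.sin(2 * np.pi * t)          # pitching angular velocity

      # Quasi-steady regressors, e.g. translational and rotational force terms.
      X = np.column_stack([phi_dot * np.abs(phi_dot), alpha_dot * np.abs(phi_dot)])

      # Pretend these forces came from a CFD run (known coefficients plus noise).
      F_cfd = 1.8 * X[:, 0] + 0.6 * X[:, 1] + 0.02 * rng.standard_normal(len(t))

      coeffs, *_ = np.linalg.lstsq(X, F_cfd, rcond=None)
      print("fitted quasi-steady coefficients:", np.round(coeffs, 3))

      # Power is then formed from the product of the fitted quasi-steady torques and
      # angular velocity; illustrated here loosely with the stroke term only.
      power = (coeffs[0] * X[:, 0]) * phi_dot
      print("mean (dimensionless) power over the wingbeat:", round(power.mean(), 3))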

  10. A CFD-informed quasi-steady model of flapping wing aerodynamics

    PubMed Central

    Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J.

    2016-01-01

    Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems. PMID:27346891

  11. A Three-Dimensional Parallel Time-Accurate Turbopump Simulation Procedure Using Overset Grid System

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Chan, William; Kwak, Dochan

    2002-01-01

    The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows are based on relatively lower fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort provides developers with information such as transient flow phenomena at start-up and nonuniform inflows, and will eventually impact on system vibration and structures. In the proposed paper, the progress toward the capability of complete simulation of the turbo-pump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbo-pump is used as a test case for evaluation of the hybrid MPI/Open-MP and MLP versions of the INS3D code. CAD to solution auto-scripting capability is being developed for turbopump applications. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Unsteady computations for the SSME turbo-pump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 3000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability are presented along with the performance of parallel versions of the code.

  12. A computational approach for predicting off-target toxicity of antiviral ribonucleoside analogues to mitochondrial RNA polymerase.

    PubMed

    Freedman, Holly; Winter, Philip; Tuszynski, Jack; Tyrrell, D Lorne; Houghton, Michael

    2018-06-22

    In the development of antiviral drugs that target viral RNA-dependent RNA polymerases, off-target toxicity caused by the inhibition of the human mitochondrial RNA polymerase (POLRMT) is a major liability. Therefore, it is essential that all new ribonucleoside analogue drugs be accurately screened for POLRMT inhibition. A computational tool that can accurately predict NTP binding to POLRMT could assist in evaluating any potential toxicity and in designing possible salvaging strategies. Using the available crystal structure of POLRMT bound to an RNA transcript, here we created a model of POLRMT with an NTP molecule bound in the active site. Furthermore, we implemented a computational screening procedure that determines the relative binding free energy of an NTP analogue to POLRMT by free energy perturbation (FEP), i.e. a simulation in which the natural NTP molecule is slowly transformed into the analogue and back. In each direction, the transformation was performed over 40 ns of simulation on our IBM Blue Gene Q supercomputer. This procedure was validated across a panel of drugs for which experimental dissociation constants were available, showing that NTP relative binding free energies could be predicted to within 0.97 kcal/mol of the experimental values on average. These results demonstrate for the first time that free-energy simulation can be a useful tool for predicting binding affinities of NTP analogues to a polymerase. We expect that our model, together with similar models of viral polymerases, will be very useful in the screening and future design of NTP inhibitors of viral polymerases that have no mitochondrial toxicity. © 2018 Freedman et al.
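
    For orientation, the relative binding free energy in such a screen comes from a thermodynamic cycle: the free energy of transforming the natural NTP into the analogue in the active site minus the same transformation in solvent. The sketch below applies the simple exponential (Zwanzig) estimator with a naive forward/backward average to synthetic energy-difference samples; it is not the specific estimator, force field, or sampling protocol used in the study.

```python
import numpy as np

kT = 0.593  # kcal/mol at ~298 K

def zwanzig(dU, kT=kT):
    """Free-energy difference from the exponential (Zwanzig) estimator, given
    samples of dU = U_target - U_reference evaluated on reference-state configurations."""
    return -kT * np.log(np.mean(np.exp(-np.asarray(dU) / kT)))

def fep_two_sided(dU_forward, dU_backward, kT=kT):
    """Naive average of forward and (sign-flipped) backward exponential estimates.
    dU_forward: U_analogue - U_natural sampled in the natural-NTP ensemble.
    dU_backward: U_natural - U_analogue sampled in the analogue ensemble."""
    return 0.5 * (zwanzig(dU_forward, kT) - zwanzig(dU_backward, kT))

# Synthetic energy-difference samples standing in for the simulation output.
rng = np.random.default_rng(1)
dU_site_fwd, dU_site_bwd = rng.normal(2.0, 0.8, 4000), rng.normal(-1.6, 0.8, 4000)
dU_solv_fwd, dU_solv_bwd = rng.normal(1.0, 0.8, 4000), rng.normal(-0.8, 0.8, 4000)

# Thermodynamic cycle: relative binding free energy of the analogue vs. the
# natural NTP is the transformation free energy in the active site minus the
# same transformation free energy in solvent.
ddG_bind = fep_two_sided(dU_site_fwd, dU_site_bwd) - fep_two_sided(dU_solv_fwd, dU_solv_bwd)
print(f"Estimated relative binding free energy: {ddG_bind:+.2f} kcal/mol")
```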

  13. The Linear Interaction Energy Method for the Prediction of Protein Stability Changes Upon Mutation

    PubMed Central

    Wickstrom, Lauren; Gallicchio, Emilio; Levy, Ronald M.

    2011-01-01

    The coupling of protein energetics and sequence changes is a critical aspect of computational protein design, as well as for the understanding of protein evolution, human disease, and drug resistance. In order to study the molecular basis for this coupling, computational tools must be sufficiently accurate and computationally inexpensive to handle large amounts of sequence data. We have developed a computational approach based on the linear interaction energy (LIE) approximation to predict the changes in the free energy of the native state induced by a single mutation. This approach was applied to a set of 822 mutations in 10 proteins which resulted in an average unsigned error of 0.82 kcal/mol and a correlation coefficient of 0.72 between the calculated and experimental ΔΔG values. The method is able to accurately identify destabilizing hot spot mutations; however, it has difficulty in distinguishing between stabilizing and destabilizing mutations due to the distribution of stability changes for the set of mutations used to parameterize the model. In addition, the model performs quite well in initial tests on a small set of double mutations. Based on these promising results, we can begin to examine the relationship between protein stability and fitness, correlated mutations, and drug resistance. PMID:22038697
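
    In the LIE approximation, the free-energy change is modelled as a weighted sum of ensemble-averaged van der Waals and electrostatic interaction-energy differences plus a constant. A minimal sketch of fitting such weights to experimental ΔΔG values by least squares is shown below; all numbers are synthetic and the functional form is the generic LIE ansatz, not the exact parameterization of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_mut = 200

# Synthetic per-mutation averages of the change in van der Waals and
# electrostatic interaction energies between native and mutant simulations
# (kcal/mol); placeholders for values extracted from MD trajectories.
dE_vdw = rng.normal(0.5, 1.0, n_mut)
dE_elec = rng.normal(0.8, 1.5, n_mut)
ddG_exp = 0.4*dE_vdw + 0.3*dE_elec + 0.2 + rng.normal(0.0, 0.4, n_mut)

# LIE-style linear model: ddG ~= alpha*<dE_vdw> + beta*<dE_elec> + gamma.
A = np.column_stack([dE_vdw, dE_elec, np.ones(n_mut)])
(alpha, beta, gamma), *_ = np.linalg.lstsq(A, ddG_exp, rcond=None)

pred = A @ np.array([alpha, beta, gamma])
aue = np.mean(np.abs(pred - ddG_exp))     # average unsigned error
r = np.corrcoef(pred, ddG_exp)[0, 1]      # correlation coefficient
print(f"alpha={alpha:.2f} beta={beta:.2f} gamma={gamma:.2f}  AUE={aue:.2f}  r={r:.2f}")
```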

  14. A computable phenotype for asthma case identification in adult and pediatric patients: External validation in the Chicago Area Patient-Outcomes Research Network (CAPriCORN).

    PubMed

    Afshar, Majid; Press, Valerie G; Robison, Rachel G; Kho, Abel N; Bandi, Sindhura; Biswas, Ashvini; Avila, Pedro C; Kumar, Harsha Vardhan Madan; Yu, Byung; Naureckas, Edward T; Nyenhuis, Sharmilee M; Codispoti, Christopher D

    2017-10-13

    Comprehensive, rapid, and accurate identification of patients with asthma for clinical care and engagement in research efforts is needed. The original development and validation of a computable phenotype for asthma case identification occurred at a single institution in Chicago and demonstrated excellent test characteristics. However, its application in a diverse payer mix, across different health systems and multiple electronic health record vendors, and in both children and adults was not examined. The objective of this study is to externally validate the computable phenotype across diverse Chicago institutions to accurately identify pediatric and adult patients with asthma. A cohort of 900 asthma and control patients was identified from the electronic health record between January 1, 2012 and November 30, 2014. Two physicians at each site independently reviewed the patient chart to annotate cases. The inter-observer reliability between the physician reviewers had a κ-coefficient of 0.95 (95% CI 0.93-0.97). The accuracy, sensitivity, specificity, negative predictive value, and positive predictive value of the computable phenotype were all above 94% in the full cohort. The excellent positive and negative predictive values in this multi-center external validation study establish a useful tool to identify asthma cases in the electronic health record for research and care. This computable phenotype could be used in large-scale comparative-effectiveness trials.
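
    For reference, the reported test characteristics follow directly from the 2x2 confusion matrix of phenotype calls against the physician-adjudicated reference standard; a small helper is sketched below (the counts are illustrative, not the study's data).

```python
def test_characteristics(tp, fp, fn, tn):
    """Standard diagnostic accuracy metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                       # positive predictive value
    npv = tn / (tn + fn)                       # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, accuracy=accuracy)

# Illustrative counts only (not the CAPriCORN cohort).
print(test_characteristics(tp=430, fp=20, fn=15, tn=435))
```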

  15. A semi-analytical bearing model considering outer race flexibility for model based bearing load monitoring

    NASA Astrophysics Data System (ADS)

    Kerst, Stijn; Shyrokau, Barys; Holweg, Edward

    2018-05-01

    This paper proposes a novel semi-analytical bearing model addressing flexibility of the bearing outer race structure. It furthermore presents the application of this model in a bearing load condition monitoring approach. The bearing model is developed because current computationally low-cost bearing models, owing to their assumptions of rigidity, fail to provide an accurate description of the increasingly common flexible, size- and weight-optimized bearing designs. In the proposed bearing model raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated based on the modelled rolling element loads and a Fourier series based compliance approximation. The resulting model is computationally low cost and provides an accurate description of the rolling element loads for flexible outer raceway structures. The latter is validated by a simulation-based comparison study with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.

  16. Size assessment of breast lesions by means of a computer-aided detection (CAD) system for magnetic resonance mammography.

    PubMed

    Levrini, G; Sghedoni, R; Mori, C; Botti, A; Vacondio, R; Nitrosi, A; Iori, M; Nicoli, F

    2011-10-01

    The aim of this study was to investigate the efficacy of a dedicated software tool for automated volume measurement of breast lesions in contrast-enhanced (CE) magnetic resonance mammography (MRM). The size of 52 breast lesions with a known histopathological diagnosis (three benign, 49 malignant) was automatically evaluated using different techniques. The volume of all lesions was measured automatically (AVM) from CE 3D MRM examinations by means of a computer-aided detection (CAD) system and compared with the size estimates based on maximum diameter measurement (MDM) on MRM, ultrasonography (US), mammography and histopathology. Compared with histopathology as the reference method, AVM underestimated lesion size by 4% on average. This result was similar to MDM (3% underestimation, not significantly different) but significantly better than US and mammographic lesion measurements (24% and 33% size underestimation, respectively). AVM is as accurate as MDM but faster. Both methods are more accurate for size assessment of breast lesions compared with US and mammography.

  17. 3D multiscale crack propagation using the XFEM applied to a gas turbine blade

    NASA Astrophysics Data System (ADS)

    Holl, Matthias; Rogge, Timo; Loehnert, Stefan; Wriggers, Peter; Rolfes, Raimund

    2014-01-01

    This work presents a new multiscale technique to investigate advancing cracks in three dimensional space. This fully adaptive multiscale technique is designed to take into account cracks of different length scales efficiently, by enabling fine scale domains locally in regions of interest, i.e. where stress concentrations and high stress gradients occur. Due to crack propagation, these regions change during the simulation process. Cracks are modeled using the extended finite element method, such that an accurate and powerful numerical tool is achieved. Restricting ourselves to linear elastic fracture mechanics, the J-integral yields an accurate solution of the stress intensity factors, and with the criterion of maximum hoop stress, a precise direction of growth. If necessary, the crack surface computed on the finest scale is finally transferred to the corresponding scale. In a final step, the model is applied to a quadrature point of a gas turbine blade, to compute crack growth on the microscale of a real structure.

  18. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers for using such models on a more routine basis include massive amounts of spatial data that must be processed for each new scenario and lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482

  19. Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.

    PubMed

    Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth

    2017-09-01

    The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool, used for collection of nursing care quality indicators from the electronic health record, for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement or reliability of data abstracted using preliminary and finalized tools. In assessing the reliability of study data (n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of first half sample; 5% of last half sample) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
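
    The interrater agreement reported above is Cohen's kappa, which corrects the observed agreement between two raters for the agreement expected by chance. A minimal computation over categorical abstractions is sketched below; the example labels are made up.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same cases (categorical labels)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Made-up example: presence/absence of a documented nursing care element.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```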

  20. Toward transient finite element simulation of thermal deformation of machine tools in real-time

    NASA Astrophysics Data System (ADS)

    Naumann, Andreas; Ruprecht, Daniel; Wensch, Joerg

    2018-01-01

    Finite element models without simplifying assumptions can accurately describe the spatial and temporal distribution of heat in machine tools as well as the resulting deformation. In principle, this allows to correct for displacements of the Tool Centre Point and enables high precision manufacturing. However, the computational cost of FE models and restriction to generic algorithms in commercial tools like ANSYS prevents their operational use since simulations have to run faster than real-time. For the case where heat diffusion is slow compared to machine movement, we introduce a tailored implicit-explicit multi-rate time stepping method of higher order based on spectral deferred corrections. Using the open-source FEM library DUNE, we show that fully coupled simulations of the temperature field are possible in real-time for a machine consisting of a stock sliding up and down on rails attached to a stand.
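
    The implicit-explicit idea can be illustrated by treating the stiff heat-diffusion operator implicitly while handling the rapidly varying, but non-stiff, moving heat load explicitly. The first-order IMEX Euler scheme on a 1D rod below is only a toy stand-in for the higher-order multi-rate spectral-deferred-correction method developed in the paper; all parameter values are illustrative.

```python
import numpy as np

# 1D heat equation u_t = kappa*u_xx + q(x, t) with a moving heat source,
# advanced with first-order IMEX Euler: diffusion implicit, source explicit.
n, L, kappa = 100, 1.0, 1e-3
dx = L / (n - 1)
x = np.linspace(0.0, L, n)
dt, t_end = 0.05, 10.0

# Implicit operator (I - dt*kappa*D2) with homogeneous Dirichlet boundaries.
D2 = (np.diag(np.full(n - 1, 1.0), -1) - 2*np.eye(n) + np.diag(np.full(n - 1, 1.0), 1)) / dx**2
A = np.eye(n) - dt * kappa * D2
A[0, :], A[-1, :] = 0.0, 0.0
A[0, 0], A[-1, -1] = 1.0, 1.0

def moving_source(t, width=0.02, power=5.0, speed=0.3):
    """Gaussian heat source sweeping along the rod (placeholder for fast machine movement)."""
    c = 0.5 + 0.4 * np.sin(speed * t)
    return power * np.exp(-((x - c) / width) ** 2)

u, t = np.zeros(n), 0.0
while t < t_end:
    rhs = u + dt * moving_source(t)   # explicit treatment of the fast-moving source
    rhs[0], rhs[-1] = 0.0, 0.0        # Dirichlet boundary values
    u = np.linalg.solve(A, rhs)       # implicit treatment of the stiff diffusion
    t += dt
print(f"peak temperature rise after {t_end:.0f} s: {u.max():.3f}")
```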

  1. iProphet: Multi-level Integrative Analysis of Shotgun Proteomic Data Improves Peptide and Protein Identification Rates and Error Estimates*

    PubMed Central

    Shteynberg, David; Deutsch, Eric W.; Lam, Henry; Eng, Jimmy K.; Sun, Zhi; Tasman, Natalie; Mendoza, Luis; Moritz, Robert L.; Aebersold, Ruedi; Nesvizhskii, Alexey I.

    2011-01-01

    The combination of tandem mass spectrometry and sequence database searching is the method of choice for the identification of peptides and the mapping of proteomes. Over the last several years, the volume of data generated in proteomic studies has increased dramatically, which challenges the computational approaches previously developed for these data. Furthermore, a multitude of search engines have been developed that identify different, overlapping subsets of the sample peptides from a particular set of tandem mass spectrometry spectra. We present iProphet, the new addition to the widely used open-source suite of proteomic data analysis tools Trans-Proteomics Pipeline. Applied in tandem with PeptideProphet, it provides more accurate representation of the multilevel nature of shotgun proteomic data. iProphet combines the evidence from multiple identifications of the same peptide sequences across different spectra, experiments, precursor ion charge states, and modified states. It also allows accurate and effective integration of the results from multiple database search engines applied to the same data. The use of iProphet in the Trans-Proteomics Pipeline increases the number of correctly identified peptides at a constant false discovery rate as compared with both PeptideProphet and another state-of-the-art tool Percolator. As the main outcome, iProphet permits the calculation of accurate posterior probabilities and false discovery rate estimates at the level of sequence identical peptide identifications, which in turn leads to more accurate probability estimates at the protein level. Fully integrated with the Trans-Proteomics Pipeline, it supports all commonly used MS instruments, search engines, and computer platforms. The performance of iProphet is demonstrated on two publicly available data sets: data from a human whole cell lysate proteome profiling experiment representative of typical proteomic data sets, and from a set of Streptococcus pyogenes experiments more representative of organism-specific composite data sets. PMID:21876204

  2. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  3. Evaluating the accuracy of wear formulae for acetabular cup liners.

    PubMed

    Wu, James Shih-Shyn; Hsu, Shu-Ling; Chen, Jian-Horng

    2010-02-01

    This study proposes two methods for exploring the wear volume of a worn liner. The first method is a numerical method, in which SolidWorks software is used to create models of the worn out regions of liners at various wear directions and depths. The second method is an experimental one, in which a machining center is used to mill polyoxymethylene to manufacture worn and unworn liner models, then the volumes of the models are measured. The results show that the SolidWorks software is a good tool for presenting the wear pattern and volume of a worn liner. The formula provided by Ilchmann is the most suitable for computing liner volume loss, but is not accurate enough. This study suggests that a more accurate wear formula is required. This is crucial for accurate evaluation of the performance of hip components implanted in patients, as well as for designing new hip components.

  4. Toward high-speed 3D nonlinear soft tissue deformation simulations using Abaqus software.

    PubMed

    Idkaidek, Ashraf; Jasiuk, Iwona

    2015-12-01

    We aim to achieve a fast and accurate three-dimensional (3D) simulation of a porcine liver deformation under a surgical tool pressure using the commercial finite element software Abaqus. The liver geometry is obtained using magnetic resonance imaging, and a nonlinear constitutive law is employed to capture large deformations of the tissue. Effects of implicit versus explicit analysis schemes, element type, and mesh density on computation time are studied. We find that Abaqus explicit and implicit solvers are capable of simulating nonlinear soft tissue deformations accurately using first-order tetrahedral elements in a relatively short time by optimizing the element size. This study provides new insights and guidance on accurate and relatively fast nonlinear soft tissue simulations. Such simulations can provide force feedback during robotic surgery and allow visualization of tissue deformations for surgery planning and training of surgical residents.

  5. [Construction of abridged life table for health evaluation of local resident using Excel program].

    PubMed

    Chen, Qingsha; Wang, Feng; Li, Xiaozhen; Yang, Jian; Yu, Shouyi; Hu, Jun

    2012-05-01

    To provide an easy computational tool for evaluating the health condition of local residents. An abridged life table was programmed by applying mathematical functions and formulas in the Excel program and tested with the real study data to evaluate the results computed. Excel was capable of computing the age-group probability of dying in the life table (nqx), the number of survivors (lx), the number of deaths (ndx), the person-years lived in each age interval (nLx), the total person-years lived above the start of each interval (Tx) and the life expectancy (ex). The calculated results were consistent with those by SAS. The abridged life table constructed using Microsoft Excel can conveniently and accurately calculate the relevant indices for evaluating the health condition of the residents.
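
    An abridged life table of this kind follows a standard recursion from age-group mortality rates to survivors, deaths, person-years, and life expectancy. A compact sketch is given below; the age groups, mortality rates, and separation factors (nax) are illustrative assumptions, not the study's data.

```python
# Abridged life table from age-specific mortality rates (nMx), following the
# standard actuarial recursion; illustrative inputs, not the study's data.
ages   = [0, 1, 5, 15, 30, 45, 60, 75]           # start of each age interval
widths = [1, 4, 10, 15, 15, 15, 15, None]        # interval length; last is open-ended
nMx    = [0.010, 0.001, 0.0005, 0.001, 0.002, 0.006, 0.02, 0.09]
nax    = [0.3, 1.5, 5, 7.5, 7.5, 7.5, 7.5, None]  # average years lived by those who die

radix = 100_000
lx, rows = radix, []
for age, n, m, a in zip(ages, widths, nMx, nax):
    if n is None:                          # open-ended final interval
        nqx, ndx = 1.0, lx
        nLx = lx / m                       # person-years = survivors / death rate
    else:
        nqx = n * m / (1 + (n - a) * m)    # probability of dying in the interval
        ndx = lx * nqx
        nLx = n * (lx - ndx) + a * ndx     # person-years lived in the interval
    rows.append((age, lx, nqx, ndx, nLx))
    lx -= ndx

# Life expectancy: accumulate person-years from the oldest interval downwards.
ex_list = [0.0] * len(rows)
Tx = 0.0
for i in range(len(rows) - 1, -1, -1):
    Tx += rows[i][4]
    ex_list[i] = Tx / rows[i][1]

for (age, lx_i, nqx, ndx, nLx), ex in zip(rows, ex_list):
    print(f"age {age:>2}: lx={lx_i:9.0f}  nqx={nqx:.4f}  ex={ex:5.1f}")
```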

  6. Non-steady state modelling of wheel-rail contact problem

    NASA Astrophysics Data System (ADS)

    Guiral, A.; Alonso, A.; Baeza, L.; Giménez, J. G.

    2013-01-01

    Among all the algorithms to solve the wheel-rail contact problem, Kalker's FastSim has become the most useful computation tool since it combines a low computational cost and enough precision for most of the typical railway dynamics problems. However, some types of dynamic problems require the use of a non-steady state analysis. Alonso and Giménez developed a non-stationary method based on FastSim, which provides both sufficiently accurate results and a low computational cost. However, it presents some limitations; the method is developed for one time-dependent creepage, and its accuracy for varying normal forces has not been checked. This article presents the required changes in order to deal with both problems and compares its results with those given by Kalker's Variational Method for rolling contact.

  7. A computational workflow for designing silicon donor qubits

    DOE PAGES

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...

    2016-09-19

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  8. Survey of Bibliographies and Reference Works on Asia, Africa, Latin America, and Russia and East Europe and Compilation of Bibliographies on East Asia, South Asia, and Africa South of the Sahara for Undergraduate Libraries. Final Report.

    ERIC Educational Resources Information Center

    Morehouse, Ward

    The project was concerned with developing three up-to-date, accurate bibliographies on Asia and Africa as resource guides and book selection tools for undergraduate libraries. Existing bibliographies and information on newer books favorably received in journals were entered on a computer system. A preliminary, unedited, unselected bibliography was…

  9. Muver, a computational framework for accurately calling accumulated mutations.

    PubMed

    Burkholder, Adam B; Lujan, Scott A; Lavender, Christopher A; Grimm, Sara A; Kunkel, Thomas A; Fargo, David C

    2018-05-09

    Identification of mutations from next-generation sequencing data typically requires a balance between sensitivity and accuracy. This is particularly true of DNA insertions and deletions (indels), which can impart significant phenotypic consequences on cells but are harder to call than substitution mutations from whole genome mutation accumulation experiments. To overcome these difficulties, we present muver, a computational framework that integrates established bioinformatics tools with novel analytical methods to generate mutation calls with the extremely low false positive rates and high sensitivity required for accurate mutation rate determination and comparison. Muver uses statistical comparison of ancestral and descendant allelic frequencies to identify variant loci and assigns genotypes with models that include per-sample assessments of sequencing errors by mutation type and repeat context. Muver identifies maximally parsimonious mutation pathways that connect these genotypes, differentiating potential allelic conversion events and delineating ambiguities in mutation location, type, and size. Benchmarking with a human gold standard father-son pair demonstrates muver's sensitivity and low false positive rates. In DNA mismatch repair (MMR) deficient Saccharomyces cerevisiae, muver detects multi-base deletions in homopolymers longer than the replicative polymerase footprint at rates greater than predicted for sequential single-base deletions, implying a novel multi-repeat-unit slippage mechanism. Benchmarking results demonstrate the high accuracy and sensitivity achieved with muver, particularly for indels, relative to available tools. Applied to an MMR-deficient Saccharomyces cerevisiae system, muver mutation calls facilitate mechanistic insights into DNA replication fidelity.

  10. Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning

    NASA Technical Reports Server (NTRS)

    Otterstatter, Matthew R.

    2005-01-01

    The universe is infinitely complex; however, the human mind has a finite capacity. The possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.

  11. Simplified formulae for the estimation of offshore wind turbines clutter on marine radars.

    PubMed

    Grande, Olatz; Cañizo, Josune; Angulo, Itziar; Jenn, David; Danoon, Laith R; Guerra, David; de la Vega, David

    2014-01-01

    The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario.
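
    A simplified clutter assessment of this kind typically starts from the monostatic radar range equation. The sketch below computes the echo power returned by a turbine of assumed radar cross-section at several ranges; the equation is the generic textbook form and the parameter values are illustrative, not the simplified formulae proposed in the paper.

```python
import math

def received_power_dbm(pt_w, gain_dbi, freq_hz, rcs_m2, range_m, losses_db=3.0):
    """Monostatic radar range equation: Pr = Pt*G^2*lambda^2*sigma / ((4*pi)^3 * R^4 * L)."""
    lam = 3.0e8 / freq_hz
    g = 10 ** (gain_dbi / 10)
    loss = 10 ** (losses_db / 10)
    pr_w = pt_w * g**2 * lam**2 * rcs_m2 / ((4 * math.pi) ** 3 * range_m**4 * loss)
    return 10 * math.log10(pr_w * 1e3)   # convert W to dBm

# Illustrative X-band marine radar and turbine parameters (not from the paper).
for r_km in (2, 5, 10, 20):
    p = received_power_dbm(pt_w=25e3, gain_dbi=30, freq_hz=9.4e9,
                           rcs_m2=1e4, range_m=r_km * 1e3)
    print(f"range {r_km:>2} km: turbine echo ~ {p:6.1f} dBm")
```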

  12. Simplified Formulae for the Estimation of Offshore Wind Turbines Clutter on Marine Radars

    PubMed Central

    Grande, Olatz; Cañizo, Josune; Jenn, David; Danoon, Laith R.; Guerra, David

    2014-01-01

    The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario. PMID:24782682

  13. Use of a scanning optical profilometer for toolmark characterization

    NASA Astrophysics Data System (ADS)

    Chumbley, L. S.; Eisenmann, D. J.; Morris, M.; Zhang, S.; Craft, J.; Fisher, C.; Saxton, A.

    2009-05-01

    An optical profilometer has been used to obtain 3-dimensional data for use in two research projects concerning toolmark quantification and identification. In the first study quantitative comparisons between toolmarks made using data from the optical system proved superior to similar data obtained using a stylus profilometer. In the second study the ability of the instrument to obtain accurate data from two surfaces intersecting at a high angle (approximately 90 degrees) is demonstrated by obtaining measurements from the tip of a flat screwdriver. The data obtained was used to produce a computer generated "virtual tool," which was then employed to create "virtual tool marks." How these experiments were conducted and the results obtained will be presented and discussed.

  14. An Efficient Finite Element Framework to Assess Flexibility Performances of SMA Self-Expandable Carotid Artery Stents

    PubMed Central

    Ferraro, Mauro; Auricchio, Ferdinando; Boatti, Elisa; Scalet, Giulia; Conti, Michele; Morganti, Simone; Reali, Alessandro

    2015-01-01

    Computer-based simulations are nowadays widely exploited for the prediction of the mechanical behavior of different biomedical devices. In this respect, structural finite element analyses (FEA) are currently the preferred computational tool to evaluate the stent response under bending. This work aims at developing a computational framework based on linear and higher order FEA to evaluate the flexibility of self-expandable carotid artery stents. In particular, numerical simulations involving large deformations and inelastic shape memory alloy constitutive modeling are performed, and the results suggest that the employment of higher order FEA allows accurately representing the computational domain and getting a better approximation of the solution with a greatly reduced number of degrees of freedom with respect to linear FEA. Moreover, when buckling phenomena occur, higher order FEA presents a superior capability of reproducing the nonlinear local effects related to buckling phenomena. PMID:26184329

  15. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool simulating fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it struggles to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  16. Development and evaluation of a computer program to grade student performance on peripheral blood smears

    NASA Astrophysics Data System (ADS)

    Lehman, Donald Clifford

    Today's medical laboratories are dealing with cost containment health care policies and unfilled laboratory positions. Because there may be fewer experienced clinical laboratory scientists, students graduating from clinical laboratory science (CLS) programs are expected by their employers to perform accurately in entry-level positions with minimal training. Information in the CLS field is increasing at a dramatic rate, and instructors are expected to teach more content in the same amount of time with the same resources. With this increase in teaching obligations, instructors could use a tool to facilitate grading. The research question was, "Can computer-assisted assessment evaluate students in an accurate and time efficient way?" A computer program was developed to assess CLS students' ability to evaluate peripheral blood smears. Automated grading permits students to get results quicker and allows the laboratory instructor to devote less time to grading. This computer program could improve instruction by providing more time to students and instructors for other activities. To be valuable, the program should provide the same quality of grading as the instructor. These benefits must outweigh potential problems such as the time necessary to develop and maintain the program, monitoring of student progress by the instructor, and the financial cost of the computer software and hardware. In this study, surveys of students and an interview with the laboratory instructor were performed to provide a formative evaluation of the computer program. In addition, the grading accuracy of the computer program was examined. These results will be used to improve the program for use in future courses.

  17. Improving the Efficiency of Abdominal Aortic Aneurysm Wall Stress Computations

    PubMed Central

    Zelaya, Jaime E.; Goenezen, Sevan; Dargon, Phong T.; Azarbal, Amir-Farzin; Rugonyi, Sandra

    2014-01-01

    An abdominal aortic aneurysm is a pathological dilation of the abdominal aorta, which carries a high mortality rate if ruptured. The most commonly used surrogate marker of rupture risk is the maximal transverse diameter of the aneurysm. More recent studies suggest that wall stress from models of patient-specific aneurysm geometries extracted, for instance, from computed tomography images may be a more accurate predictor of rupture risk and an important factor in AAA size progression. However, quantification of wall stress is typically computationally intensive and time-consuming, mainly due to the nonlinear mechanical behavior of the abdominal aortic aneurysm walls. These difficulties have limited the potential of computational models in clinical practice. To facilitate computation of wall stresses, we propose to use a linear approach that ensures equilibrium of wall stresses in the aneurysms. This proposed linear model approach is easy to implement and eliminates the burden of nonlinear computations. To assess the accuracy of our proposed approach to compute wall stresses, results from idealized and patient-specific model simulations were compared to those obtained using conventional approaches and to those of a hypothetical, reference abdominal aortic aneurysm model. For the reference model, wall mechanical properties and the initial unloaded and unstressed configuration were assumed to be known, and the resulting wall stresses were used as reference for comparison. Our proposed linear approach accurately approximates wall stresses for varying model geometries and wall material properties. Our findings suggest that the proposed linear approach could be used as an effective, efficient, easy-to-use clinical tool to estimate patient-specific wall stresses. PMID:25007052
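
    As context for why fast stress estimates are attractive, the crudest possible sanity check on any wall-stress computation is the thin-walled Laplace-law estimate for an idealized cylindrical aneurysm. The sketch below is only that back-of-envelope estimate with illustrative parameter values; it is not the linear equilibrium approach proposed in the paper.

```python
# Thin-walled (Laplace) estimate of circumferential wall stress in an
# idealized cylindrical aneurysm: sigma = P * r / t.
# Back-of-envelope check only; not the paper's linear FEA approach.
p_mmhg = 120.0                  # systolic pressure
p_pa = p_mmhg * 133.322         # convert mmHg to Pa
diameter_m = 0.055              # 5.5 cm maximal transverse diameter (illustrative)
thickness_m = 0.0019            # ~1.9 mm wall thickness (illustrative)

sigma = p_pa * (diameter_m / 2) / thickness_m
print(f"Laplace-law circumferential stress ~ {sigma/1e3:.0f} kPa")
```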

  18. Molecular determinants of blood-brain barrier permeation.

    PubMed

    Geldenhuys, Werner J; Mohammad, Afroz S; Adkins, Chris E; Lockman, Paul R

    2015-01-01

    The blood-brain barrier (BBB) is a microvascular unit which selectively regulates the permeability of drugs to the brain. With the rise in CNS drug targets and diseases, there is a need to be able to accurately predict a priori which compounds in a company database should be pursued for favorable properties. In this review, we will explore the different computational tools available today, as well as underpin these to the experimental methods used to determine BBB permeability. These include in vitro models and the in vivo models that yield the dataset we use to generate predictive models. Understanding of how these models were experimentally derived determines our accurate and predicted use for determining a balance between activity and BBB distribution.

  19. Molecular determinants of blood–brain barrier permeation

    PubMed Central

    Geldenhuys, Werner J; Mohammad, Afroz S; Adkins, Chris E; Lockman, Paul R

    2015-01-01

    The blood–brain barrier (BBB) is a microvascular unit which selectively regulates the permeability of drugs to the brain. With the rise in CNS drug targets and diseases, there is a need to be able to accurately predict a priori which compounds in a company database should be pursued for favorable properties. In this review, we will explore the different computational tools available today, as well as underpin these to the experimental methods used to determine BBB permeability. These include in vitro models and the in vivo models that yield the dataset we use to generate predictive models. Understanding of how these models were experimentally derived determines our accurate and predicted use for determining a balance between activity and BBB distribution. PMID:26305616

  20. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, R.D.; Heck, G.M.; Kohler, S.M.; Watts, A.C.

    1982-09-08

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single offshore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors. 25 figures.

  1. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, Ronald D.; Heck, G. Michael; Kohler, Stewart M.; Watts, Alfred C.

    1991-01-01

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single off-shore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on the electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors.

  2. CTAS: Computer intelligence for air traffic control in the terminal area

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1992-01-01

    A system for the automated management and control of arrival traffic, referred to as the Center-TRACON Automation System (CTAS), has been designed by the ATC research group at NASA Ames research center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver and Dallas/Ft. Worth airports. CTAS consists of three types of integrated tools that provide computer-generated intelligence for both Center and TRACON controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), establishes optimized landing sequences and landing times for aircraft arriving in the center airspace several hundred miles from the airport. In the TRACON, TMA sequences missed-approach aircraft and unanticipated arrivals. Another tool, the Descent Advisor (DA), generates clearances for the center controllers to deliver aircraft at the crossing times provided by TMA. In the TRACON, the final approach spacing tool (FAST) provides heading and speed clearances that produce an accurately spaced flow of aircraft on the final approach course. A data base consisting of aircraft performance models, airline preferred operational procedures and real time wind measurements contributes to the effective operation of CTAS. Extensive simulator evaluations of CTAS have demonstrated controller acceptance, delay reductions, and fuel savings.

  3. 3D Displays And User Interface Design For A Radiation Therapy Treatment Planning CAD Tool

    NASA Astrophysics Data System (ADS)

    Mosher, Charles E.; Sherouse, George W.; Chaney, Edward L.; Rosenman, Julian G.

    1988-06-01

    The long term goal of the project described in this paper is to improve local tumor control through the use of computer-aided treatment design methods that can result in selection of better treatment plans compared with conventional planning methods. To this end, a CAD tool for the design of radiation treatment beams is described. Crucial to the effectiveness of this tool are high quality 3D display techniques. We have found that 2D and 3D display methods dramatically improve the comprehension of the complex spatial relationships between patient anatomy, radiation beams, and dose distributions. In order to take full advantage of these displays, an intuitive and highly interactive user interface was created. If the system is to be used by physicians unfamiliar with computer systems, it is essential that a user interface is incorporated that allows the user to navigate through each step of the design process in a manner similar to what they are used to. Compared with conventional systems, we believe our display and CAD tools will allow the radiotherapist to achieve more accurate beam targeting, leading to a better radiation dose configuration to the tumor volume. This would result in a reduction of the dose to normal tissue.

  4. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  5. SYSTID - A flexible tool for the analysis of communication systems.

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  6. Personal computer study of finite-difference methods for the transonic small disturbance equation

    NASA Technical Reports Server (NTRS)

    Bland, Samuel R.

    1989-01-01

    Calculation of unsteady flow phenomena requires careful attention to the numerical treatment of the governing partial differential equations. The personal computer provides a convenient and useful tool for the development of meshes, algorithms, and boundary conditions needed to provide time accurate solution of these equations. The one-dimensional equation considered provides a suitable model for the study of wave propagation in the equations of transonic small disturbance potential flow. Numerical results for effects of mesh size, extent, and stretching, time step size, and choice of far-field boundary conditions are presented. Analysis of the discretized model problem supports these numerical results. Guidelines for suitable mesh and time step choices are given.
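
    As a toy analogue of the mesh, time-step, and far-field boundary experiments described, the sketch below advances the 1D linear advection equation, a simple wave-propagation model rather than the transonic small disturbance equation itself, with a first-order upwind scheme whose outflow boundary lets disturbances leave the domain without reflection. All parameter values are illustrative.

```python
import numpy as np

# 1D linear advection u_t + c*u_x = 0 as a toy wave-propagation model,
# first-order upwind in space, explicit Euler in time.
c, L, n = 1.0, 10.0, 401
dx = L / (n - 1)
cfl = 0.8
dt = cfl * dx / c
x = np.linspace(0.0, L, n)

u = np.exp(-((x - 2.0) / 0.4) ** 2)   # initial Gaussian pulse
t, t_end = 0.0, 6.0
while t < t_end:
    u_new = u.copy()
    u_new[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])  # upwind update
    u_new[0] = 0.0                    # inflow boundary: no incoming disturbance
    # the upwind stencil itself acts as a non-reflecting outflow (far-field) boundary,
    # letting the pulse leave the domain without spurious reflections
    u = u_new
    t += dt
print(f"pulse peak after t={t_end}: {u.max():.3f} near x={x[np.argmax(u)]:.2f}")
```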

  7. Capillary device refilling. [liquid rocket propellant tank tests

    NASA Technical Reports Server (NTRS)

    Blatt, M. H.; Merino, F.; Symons, E. P.

    1980-01-01

    An analytical and experimental study was conducted dealing with refilling start baskets (capillary devices) with settled fluid. A computer program was written to include dynamic pressure, screen wicking, multiple-screen barriers, standpipe screens, variable vehicle mass for computing vehicle acceleration, and calculation of tank outflow rate and vapor pullthrough height. An experimental apparatus was fabricated and tested to provide data for correlation with the analytical model; the test program was conducted in normal gravity using a scale-model capillary device and ethanol as the test fluid. The test data correlated with the analytical model; the model is a versatile and apparently accurate tool for predicting start basket refilling under actual mission conditions.

  8. SAMICS Validation. SAMICS Support Study, Phase 3

    NASA Technical Reports Server (NTRS)

    1979-01-01

    SAMICS provides a consistent basis for estimating array costs and compares production technology costs. A review and a validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships being used by the model; and (3) to provide an independent verification to users of the model's value in decision making for allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.

  9. Real Time Flood Alert System (RTFAS) for Puerto Rico

    USGS Publications Warehouse

    Lopez-Trujillo, Dianne

    2010-01-01

    The Real Time Flood Alert System is a web-based computer program, developed as a data integration tool, and designed to increase the ability of emergency managers to rapidly and accurately predict flooding conditions of streams in Puerto Rico. The system includes software and a relational database to determine the spatial and temporal distribution of rainfall, water levels in streams and reservoirs, and associated storms to determine hazardous and potential flood conditions. The computer program was developed as part of a cooperative agreement between the U.S. Geological Survey Caribbean Water Science Center and the Puerto Rico Emergency Management Agency, and integrates information collected and processed by these two agencies and the National Weather Service.

  10. FlavonoidSearch: A system for comprehensive flavonoid annotation by mass spectrometry.

    PubMed

    Akimoto, Nayumi; Ara, Takeshi; Nakajima, Daisuke; Suda, Kunihiro; Ikeda, Chiaki; Takahashi, Shingo; Muneto, Reiko; Yamada, Manabu; Suzuki, Hideyuki; Shibata, Daisuke; Sakurai, Nozomu

    2017-04-28

    Currently, in mass spectrometry-based metabolomics, limited reference mass spectra are available for flavonoid identification. In the present study, a database of probable mass fragments for 6,867 known flavonoids (FsDatabase) was manually constructed based on new structure- and fragmentation-related rules using new heuristics to overcome flavonoid complexity. We developed the FlavonoidSearch system for flavonoid annotation, which consists of the FsDatabase and a computational tool (FsTool) to automatically search the FsDatabase using the mass spectra of metabolite peaks as queries. This system showed the highest identification accuracy for the flavonoid aglycone when compared to existing tools and revealed accurate discrimination between the flavonoid aglycone and other compounds. Sixteen new flavonoids were found in parsley, and the diversity of the flavonoid aglycone among different fruits and vegetables was investigated.
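
    The core matching step in such a search, finding database fragment masses that agree with an observed peak within a mass tolerance, can be sketched as below. The miniature fragment list and the 10 ppm tolerance are made up for illustration; the real FsDatabase is far larger and rule-based.

```python
import bisect

def match_fragments(observed_mz, fragment_db, tol_ppm=10.0):
    """Return database entries whose exact mass lies within tol_ppm of observed_mz.
    fragment_db is a list of (exact_mass, name) tuples sorted by mass."""
    tol = observed_mz * tol_ppm * 1e-6
    masses = [m for m, _ in fragment_db]
    lo = bisect.bisect_left(masses, observed_mz - tol)
    hi = bisect.bisect_right(masses, observed_mz + tol)
    return fragment_db[lo:hi]

# Made-up miniature fragment list (exact masses of a few flavonoid-like ions).
db = sorted([
    (153.0182, "A-ring retro-Diels-Alder fragment"),
    (271.0601, "apigenin [M+H]+ aglycone ion"),
    (287.0550, "luteolin/kaempferol [M+H]+ aglycone ion"),
    (303.0499, "quercetin [M+H]+ aglycone ion"),
])

for peak in (271.0605, 303.0560):
    print(peak, "->", match_fragments(peak, db, tol_ppm=10.0))
```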

  11. Personalizing oncology treatments by predicting drug efficacy, side-effects, and improved therapy: mathematics, statistics, and their integration.

    PubMed

    Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri

    2014-01-01

    Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the mainstream of clinical oncology. © 2014 Wiley Periodicals, Inc.

  12. X-ray system simulation software tools for radiology and radiography education.

    PubMed

    Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G

    2018-02-01

    To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
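
    The abstract above gives no implementation details, so the following is only a minimal sketch of the kind of two-dimensional source-filter-object-detector calculation it describes, assuming a monoenergetic beam and Beer-Lambert attenuation; the function name, attenuation coefficients, and thicknesses are illustrative, not taken from the paper.

```python
import numpy as np

def transmitted_intensity(i0, mu_t_pairs):
    """Attenuate a monoenergetic beam through a stack of materials.

    i0          -- incident intensity (arbitrary units)
    mu_t_pairs  -- iterable of (linear attenuation coefficient [1/cm], thickness [cm])
    Returns the intensity reaching the detector via the Beer-Lambert law.
    """
    total = sum(mu * t for mu, t in mu_t_pairs)
    return i0 * np.exp(-total)

# Illustrative values only: a 2 mm Al filter followed by a 5 cm soft-tissue test object.
beam = transmitted_intensity(1.0, [(0.55, 0.2), (0.21, 5.0)])
background = transmitted_intensity(1.0, [(0.55, 0.2)])

# Radiographic contrast between the object region and the unattenuated background.
contrast = (background - beam) / background
print(f"transmitted fraction = {beam:.3f}, contrast = {contrast:.3f}")
```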

  13. C-arm Cone Beam Computed Tomography: A New Tool in the Interventional Suite.

    PubMed

    Raj, Santhosh; Irani, Farah Gillan; Tay, Kiang Hiong; Tan, Bien Soo

    2013-11-01

    C-arm Cone Beam CT (CBCT) is a technology that is being integrated into many of the newer angiography systems in the interventional suite. Due to its ability to provide cross-sectional imaging, it has opened a myriad of opportunities for creating new clinical applications. We review the technical aspects, current reported clinical applications and potential benefits of this technology. Searches were made via PubMed using the strings "CBCT", "Cone Beam CT", "Cone Beam Computed Tomography" and "C-arm Cone Beam Computed Tomography". All relevant articles in the results were reviewed. CBCT clinical applications have been reported in both vascular and non-vascular interventions. They encompass many aspects of a procedure including preprocedural planning, intraprocedural guidance and postprocedural assessment. As a result, they have allowed the interventionalist to be safer and more accurate in performing image-guided procedures. There are, however, several technical limitations. The quality of images produced is not comparable to conventional computed tomography (CT). Radiation doses are also difficult to quantify when compared to CT and fluoroscopy. CBCT technology in the interventional suite has contributed significant benefits to the patient despite its current limitations. It is a tool that will evolve and potentially become an integral part of imaging guidance for intervention.

  14. A fast - Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy

    NASA Astrophysics Data System (ADS)

    Senzacqua, M.; Schiavi, A.; Patera, V.; Pioli, S.; Battistoni, G.; Ciocca, M.; Mairani, A.; Magro, G.; Molinelli, S.

    2017-10-01

    In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to standard software running on a CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are Multiple Coulomb Scattering, energy straggling and nuclear interactions of protons with the main nuclei composing the biological tissues. The FRED toolkit does not rely on the water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment. In fact, in about one minute on standard hardware, it can provide the dose map obtained by combining the treatment plan, computed earlier by the TPS, with the patient's current anatomic arrangement.
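
    FRED's actual physics models and GPU kernels are not described in this abstract; the sketch below only illustrates the voxel-by-voxel energy-deposition idea with a heavily simplified continuous-slowing-down step and a made-up stopping-power table, omitting scattering, straggling, and nuclear interactions.

```python
import numpy as np

# Illustrative stopping-power table S(E) in MeV/cm for a water-like tissue (not FRED's data).
energies = np.array([1.0, 10.0, 50.0, 100.0, 150.0, 200.0])   # MeV
stopping = np.array([260.0, 45.0, 12.5, 7.3, 5.4, 4.5])       # MeV/cm

def deposit_dose(e0, densities, dx=0.1):
    """March a proton through a 1D voxel column, depositing energy per voxel.

    e0        -- initial kinetic energy [MeV]
    densities -- relative density per voxel (assumed to come from the CT)
    dx        -- voxel size [cm]
    """
    dose = np.zeros_like(densities, dtype=float)
    e = e0
    for i, rho in enumerate(densities):
        if e <= 0:
            break
        s = np.interp(e, energies, stopping) * rho   # density-scaled stopping power
        de = min(e, s * dx)                           # energy lost in this voxel
        dose[i] += de
        e -= de
    return dose

column = deposit_dose(150.0, np.ones(300))  # homogeneous water-like column
print("approximate range:", np.argmax(column == 0) * 0.1, "cm")
```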

  15. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and to aid drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  16. Automated segmentation of the lungs from high resolution CT images for quantitative study of chronic obstructive pulmonary diseases

    NASA Astrophysics Data System (ADS)

    Garg, Ishita; Karwoski, Ronald A.; Camp, Jon J.; Bartholmai, Brian J.; Robb, Richard A.

    2005-04-01

    Chronic obstructive pulmonary diseases (COPD) are debilitating conditions of the lung and are the fourth leading cause of death in the United States. Early diagnosis is critical for timely intervention and effective treatment. The ability to quantify particular imaging features of specific pathology and accurately assess progression or response to treatment with current imaging tools is relatively poor. The goal of this project was to develop automated segmentation techniques that would be clinically useful as computer assisted diagnostic tools for COPD. The lungs were segmented using an optimized segmentation threshold and the trachea was segmented using a fixed threshold characteristic of air. The segmented images were smoothed by a morphological close operation using spherical elements of different sizes. The results were compared to other segmentation approaches using an optimized threshold to segment the trachea. Comparison of the segmentation results from 10 datasets showed that the method of trachea segmentation using a fixed air threshold followed by morphological closing with a spherical element of size 23x23x5 yielded the best results. Inclusion of a greater number of pulmonary vessels in the lung volume is important for the development of computer assisted diagnostic tools because the physiological changes of COPD can result in quantifiable anatomic changes in pulmonary vessels. Using a fixed threshold to segment the trachea removed airways from the lungs to a better extent as compared to using an optimized threshold. Preliminary measurements gathered from patients' CT scans suggest that segmented images can be used for accurate analysis of total lung volume and volumes of regional lung parenchyma. Additionally, reproducible segmentation allows for quantification of specific pathologic features, such as lower intensity pixels, which are characteristic of abnormal air spaces in diseases like emphysema.
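
    As a rough illustration of the thresholding-plus-morphological-closing step described above, the sketch below uses SciPy; the Hounsfield threshold and structuring-element dimensions are placeholders rather than the study's optimized parameters.

```python
import numpy as np
from scipy import ndimage

def segment_air(ct_volume, air_threshold=-950):
    """Segment air-like voxels with a fixed HU threshold, then smooth by closing.

    ct_volume     -- 3D array of Hounsfield units
    air_threshold -- fixed threshold characteristic of air (placeholder value)
    """
    mask = ct_volume < air_threshold

    # Ellipsoidal structuring element, loosely analogous to the 23x23x5 sphere-like
    # element mentioned in the abstract (sizes here are illustrative).
    zz, yy, xx = np.ogrid[-2:3, -11:12, -11:12]
    element = (xx / 11.0) ** 2 + (yy / 11.0) ** 2 + (zz / 2.0) ** 2 <= 1.0

    closed = ndimage.binary_closing(mask, structure=element)

    # Keep the largest connected component (e.g., the lungs/airway tree).
    labels, n = ndimage.label(closed)
    if n == 0:
        return closed
    sizes = ndimage.sum(closed, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)
```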

  17. An augmented reality tool for learning spatial anatomy on mobile devices.

    PubMed

    Jain, Nishant; Youngblood, Patricia; Hasel, Matthew; Srivastava, Sakti

    2017-09-01

    Augmented Reality (AR) offers a novel method of blending virtual and real anatomy for intuitive spatial learning. Our first aim in the study was to create a prototype AR tool for mobile devices. Our second aim was to complete a technical evaluation of our prototype AR tool focused on measuring the system's ability to accurately render digital content in the real world. We imported Computed Tomography (CT)-derived virtual surface models into a 3D Unity engine environment and implemented an AR algorithm to display these on mobile devices. We investigated the accuracy of the virtual renderings by comparing a physical cube with an identical virtual cube for dimensional accuracy. Our comparative study confirms that our AR tool renders 3D virtual objects with a high level of accuracy as evidenced by the degree of similarity between measurements of the dimensions of a virtual object (a cube) and the corresponding physical object. We developed an inexpensive and user-friendly prototype AR tool for mobile devices that creates highly accurate renderings. This prototype demonstrates an intuitive, portable, and integrated interface for spatial interaction with virtual anatomical specimens. Integrating this AR tool with a library of CT-derived surface models provides a platform for spatial learning in the anatomy curriculum. The segmentation methodology implemented to optimize human CT data for mobile viewing can be extended to include anatomical variations and pathologies. The ability of this inexpensive educational platform to deliver a library of interactive, 3D models to students worldwide demonstrates its utility as a supplemental teaching tool that could greatly benefit anatomical instruction. Clin. Anat. 30:736-741, 2017. © 2017 Wiley Periodicals, Inc.

  18. Lithographic image simulation for the 21st century with 19th-century tools

    NASA Astrophysics Data System (ADS)

    Gordon, Ronald L.; Rosenbluth, Alan E.

    2004-01-01

    Simulation of lithographic processes in semiconductor manufacturing has gone from a crude learning tool 20 years ago to a critical part of yield enhancement strategy today. Although many disparate models, championed by equally disparate communities, exist to describe various photoresist development phenomena, these communities would all agree that the one piece of the simulation picture that can, and must, be computed accurately is the image intensity in the photoresist. The imaging of a photomask onto a thin-film stack is one of the only phenomena in the lithographic process that is described fully by well-known, definitive physical laws. Although many approximations are made in the derivation of the Fourier transform relations between the mask object, the pupil, and the image, these and their impacts are well-understood and need little further investigation. The imaging process in optical lithography is modeled as a partially-coherent, Kohler illumination system. As Hopkins has shown, we can separate the computation into 2 pieces: one that takes information about the illumination source, the projection lens pupil, the resist stack, and the mask size or pitch, and the other that only needs the details of the mask structure. As the latter piece of the calculation can be expressed as a fast Fourier transform, it is the first piece that dominates. This piece involves computation of a potentially large number of numbers called Transmission Cross-Coefficients (TCCs), which are correlations of the pupil function weighted with the illumination intensity distribution. The advantage of performing the image calculations this way is that the computation of these TCCs represents an up-front cost, not to be repeated if one is only interested in changing the mask features, which is the case in Model-Based Optical Proximity Correction (MBOPC). The down side, however, is that the number of these expensive double integrals that must be performed increases as the square of the mask unit cell area; this number can cause even the fastest computers to balk if one needs to study medium- or long-range effects. One can reduce this computational burden by approximating with a smaller area, but accuracy is usually a concern, especially when building a model that will purportedly represent a manufacturing process. This work will review the current methodologies used to simulate the intensity distribution in air above the resist and address the above problems. More to the point, a methodology has been developed to eliminate the expensive numerical integrations in the TCC calculations, as the resulting integrals in many cases of interest can be either evaluated analytically, or replaced by analytical functions accurate to within machine precision. With the burden of computing these numbers lightened, more accurate representations of the image field can be realized, and better overall models are then possible.
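
    As a toy illustration of the Hopkins decomposition discussed above, the sketch below computes a one-dimensional, scalar version with an idealized pupil and a top-hat source; the grids, pupil, source, and mask pattern are all invented for the example and do not reproduce the authors' formulation.

```python
import numpy as np

# Spatial-frequency grid (in units of NA/lambda); everything here is illustrative.
f = np.linspace(-2.0, 2.0, 81)
df = f[1] - f[0]

pupil = (np.abs(f) <= 1.0).astype(float)          # ideal 1D "pupil"
source = (np.abs(f) <= 0.6).astype(float)         # top-hat partially coherent source
source /= source.sum() * df                       # normalize the source intensity

def tcc(f1, f2):
    """Transmission cross-coefficient: source-weighted correlation of the pupil."""
    return np.sum(source * np.interp(f + f1, f, pupil, left=0, right=0)
                         * np.interp(f + f2, f, pupil, left=0, right=0)) * df

# Mask spectrum for a simple line/space pattern (period chosen arbitrarily).
x = np.linspace(-2.0, 2.0, 401)
mask = (np.cos(2 * np.pi * x / 1.4) > 0).astype(float)
spectrum = np.array([np.trapz(mask * np.exp(-2j * np.pi * fi * x), x) for fi in f])

# Hopkins sum: I(x) = sum_{f1,f2} TCC(f1,f2) M(f1) M*(f2) exp(2*pi*i*(f1-f2)*x)
T = np.array([[tcc(f1, f2) for f2 in f] for f1 in f])
image = np.zeros_like(x)
for i, f1 in enumerate(f):
    for j, f2 in enumerate(f):
        image += np.real(T[i, j] * spectrum[i] * np.conj(spectrum[j])
                         * np.exp(2j * np.pi * (f1 - f2) * x)) * df * df
```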

  19. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
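
    The paper's actual convergence study is not reproduced here; the sketch below only illustrates the second kind of test described (how many shield thicknesses are needed before interpolation error over a dose-versus-depth curve becomes acceptable), with an invented dose curve standing in for real transport results.

```python
import numpy as np

def dose_vs_depth(t):
    """Stand-in for a computed dose-depth curve (arbitrary functional form)."""
    return 50.0 * np.exp(-t / 8.0) + 5.0 / (1.0 + t)

reference_t = np.linspace(0.1, 50.0, 2001)
reference_dose = dose_vs_depth(reference_t)

for n_points in (5, 10, 20, 40, 80):
    # Sample the curve at n_points thicknesses, then linearly interpolate back.
    t_grid = np.linspace(0.1, 50.0, n_points)
    interp = np.interp(reference_t, t_grid, dose_vs_depth(t_grid))
    max_rel_err = np.max(np.abs(interp - reference_dose) / reference_dose)
    print(f"{n_points:3d} thicknesses -> max relative error {max_rel_err:.2%}")
```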

  20. The use of computational inspection to identify process window limiting hotspots and predict sub-15nm defects with high capture rate

    NASA Astrophysics Data System (ADS)

    Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher

    2017-03-01

    As critical dimensions for advanced two dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method based on utilizing a focus-modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay-related bridging hotspots can also be identified.

  1. Parameter Estimation for a Pulsating Turbulent Buoyant Jet Using Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Christopher, Jason; Wimer, Nicholas; Lapointe, Caelan; Hayden, Torrey; Grooms, Ian; Rieker, Greg; Hamlington, Peter

    2017-11-01

    Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other ``truth'' data to be used for the prediction of unknown parameters, such as flow properties and boundary conditions, in numerical simulations of real-world engineering systems. Here we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a direct numerical simulation (DNS) with known boundary conditions and problem parameters, while the ABC procedure utilizes lower fidelity large eddy simulations. Using spatially-sparse statistics from the 2D buoyant jet DNS, we show that the ABC method provides accurate predictions of true jet inflow parameters. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for predicting flow information, such as boundary conditions, that can be difficult to determine experimentally.
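
    As a minimal, generic illustration of rejection ABC (not the buoyant-jet setup itself; the toy forward model, prior, summary statistics, and tolerance are all invented), an unknown simulation parameter can be estimated from sparse "truth" statistics as follows.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy forward model standing in for an expensive simulation."""
    return rng.normal(theta, 1.0, size=n)

def summary(data):
    """Sparse summary statistics, analogous to the spatially sparse truth data."""
    return np.array([data.mean(), data.std()])

truth_summary = summary(simulate(3.0))          # pretend these came from the DNS

# Rejection ABC: sample from the prior, keep parameters whose simulated
# summaries fall within a tolerance of the truth summaries.
prior_samples = rng.uniform(0.0, 10.0, size=20000)
tolerance = 0.3
accepted = [th for th in prior_samples
            if np.linalg.norm(summary(simulate(th)) - truth_summary) < tolerance]

print(f"accepted {len(accepted)} samples, posterior mean ~ {np.mean(accepted):.2f}")
```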

  2. Rapid Phenotyping of Root Systems of Brachypodium Plants Using X-ray Computed Tomography: a Comparative Study of Soil Types and Segmentation Tools

    NASA Astrophysics Data System (ADS)

    Varga, T.; McKinney, A. L.; Bingham, E.; Handakumbura, P. P.; Jansson, C.

    2017-12-01

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications to farming and thus human food supply. X-ray computed tomography (XCT) has been proven to be an effective tool for non-invasive root imaging and analysis. Selected Brachypodium distachyon phenotypes were grown in both natural and artificial soil mixes. The specimens were imaged by XCT, and the root architectures were extracted from the data using three different software-based methods: RooTrak, ImageJ-based WEKA segmentation, and the segmentation feature in VG Studio MAX. The 3D root image was successfully segmented at 30 µm resolution by all three methods. In this presentation, ease of segmentation and the accuracy of the extracted quantitative information (root volume and surface area) will be compared between soil types and segmentation methods. The best route to easy and accurate segmentation and root analysis will be highlighted.

  3. High performance TWT development for the microwave power module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whaley, D.R.; Armstrong, C.M.; Groshart, G.

    1996-12-31

    Northrop Grumman's ongoing development of microwave power modules (MPM) provides microwave power at various power levels, frequencies, and bandwidths for a variety of applications. Present day requirements for the vacuum power booster traveling wave tubes of the microwave power module are becoming increasingly more demanding, necessitating further enhancement of tube performance. The MPM development program at Northrop Grumman is designed specifically to meet this need by construction and test of a series of new tubes aimed at verifying computation and reaching high efficiency design goals. Tubes under test incorporate several different helix designs, as well as varying electron gun and magnetic confinement configurations. Current efforts also include further development of state-of-the-art TWT modeling and computational methods at Northrop Grumman, incorporating new, more accurate models into existing design tools and developing new tools to be used in all aspects of traveling wave tube design. Current status of the Northrop Grumman MPM TWT development program will be presented.

  4. NASA and CFD - Making investments for the future

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.; Richardson, P. F.

    1992-01-01

    From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.

  5. Engineering applications of metaheuristics: an introduction

    NASA Astrophysics Data System (ADS)

    Oliva, Diego; Hinojosa, Salvador; Demeshko, M. V.

    2017-01-01

    Metaheuristic algorithms are important tools that in recent years have been used extensively in several fields. In engineering, there is a large number of problems that can be solved from an optimization point of view. This paper is an introduction to how metaheuristics can be used to solve complex engineering problems. Their use produces accurate results in problems that are computationally expensive. Experimental results support the performance obtained by the selected algorithms in such specific problems as digital filter design, image processing and solar cell design.

  6. Time-dependent solution for axisymmetric flow over a blunt body with ideal gas, CF4, or equilibrium air chemistry

    NASA Technical Reports Server (NTRS)

    Hamilton, H. H., II; Spall, J. R.

    1986-01-01

    A time-asymptotic method has been used to obtain steady-flow solutions for axisymmetric inviscid flow over several blunt bodies including spheres, paraboloids, ellipsoids, and spherically blunted cones. Comparisons with experimental data and results of other computational methods have demonstrated that accurate solutions can be obtained using this approach. The method should prove useful as an analysis tool for comparing with experimental data and for making engineering calculations for blunt reentry vehicles.

  7. Time-dependent solution for axisymmetric flow over a blunt body with ideal gas, CF4, or equilibrium air chemistry

    NASA Astrophysics Data System (ADS)

    Hamilton, H. H., II; Spall, J. R.

    1986-07-01

    A time-asymptotic method has been used to obtain steady-flow solutions for axisymmetric inviscid flow over several blunt bodies including spheres, paraboloids, ellipsoids, and spherically blunted cones. Comparisons with experimental data and results of other computational methods have demonstrated that accurate solutions can be obtained using this approach. The method should prove useful as an analysis tool for comparing with experimental data and for making engineering calculations for blunt reentry vehicles.

  8. Exploration Technology Developments Program's Radiation Hardened Electronics for Space Environments (RHESE) Project Overview

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Adams, James H.; Darty, Ronald C.; Patrick, Marshall C.; Johnson, Michael A.; Cressler, John D.

    2008-01-01

    Primary Objective: 1) A computational tool to accurately predict electronics performance in the presence of space radiation in support of spacecraft design: a) Total dose; b) Single Event Effects; and c) Mean Time Between Failure. (Developed as a successor to CREME96.) Secondary Objectives: 2) To provide a detailed description of the natural radiation environment in support of radiation health and instrument design: a) In deep space; b) Inside the magnetosphere; and c) Behind shielding.

  9. Testing large flats with computer generated holograms

    NASA Astrophysics Data System (ADS)

    Pariani, Giorgio; Tresoldi, Daniela; Spanò, Paolo; Bianco, Andrea

    2012-09-01

    We describe the optical test of a large flat based on a spherical mirror and a dedicated CGH. The spherical mirror, which can be accurately manufactured and tested in an absolute way, provides a quasi-collimated light beam, and the hologram performs the residual wavefront correction. Alignment tools for the spherical mirror and the hologram itself are encoded in the CGH. Sensitivity to fabrication errors and alignment has been evaluated. Tests to verify the effectiveness of our approach are now under way.

  10. Simulation of Inviscid Compressible Multi-Phase Flow with Condensation

    NASA Technical Reports Server (NTRS)

    Kelleners, Philip

    2003-01-01

    Condensation of vapours in rapid expansions of compressible gases is investigated. In the case of high temperature gradients, condensation will start at conditions well away from thermodynamic equilibrium of the fluid. In those cases homogeneous condensation is dominant over heterogeneous condensation. The present work is concerned with the development of a simulation tool for the computation of high speed compressible flows with homogeneous condensation. The resulting flow solver should preferably be accurate and robust, so that it can be used for the simulation of industrial flows in general geometries.

  11. Performance of a Method to Standardize Breast Ultrasound Interpretation Using Image Processing and Case-Based Reasoning

    NASA Astrophysics Data System (ADS)

    André, M. P.; Galperin, M.; Berry, A.; Ojeda-Fournier, H.; O'Boyle, M.; Olson, L.; Comstock, C.; Taylor, A.; Ledgerwood, M.

    Our computer-aided diagnostic (CADx) tool uses advanced image processing and artificial intelligence to analyze findings on breast sonography images. The goal is to standardize reporting of such findings using well-defined descriptors and to improve accuracy and reproducibility of interpretation of breast ultrasound by radiologists. This study examined several factors that may impact accuracy and reproducibility of the CADx software, which proved to be highly accurate and stable over several operating conditions.

  12. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time accurate sensitivity analysis of the stage performance of hot gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial flow turbine is developed. A sensitivity analysis capability is added to the flow solver, by rendering it able to accurately evaluate the derivatives of the time varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed. Two of them are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite differences (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time dependent sensitivity derivatives are computed in run times comparable to the ones required by the FD approach.
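
    The CTSE referred to above is the complex-step derivative technique; the following minimal, standalone illustration (unrelated to the turbomachinery solver itself) compares it with a central finite-difference approximation on an arbitrary test function.

```python
import numpy as np

def f(x):
    """Example output functional (works for real or complex arguments)."""
    return np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

x0 = 1.5
h = 1e-20   # complex step can be made arbitrarily small without cancellation error

# Complex-step (CTSE) derivative: f'(x) ~ Im[f(x + ih)] / h
d_ctse = np.imag(f(x0 + 1j * h)) / h

# Central finite difference for comparison, limited by subtractive cancellation.
hf = 1e-6
d_fd = (f(x0 + hf) - f(x0 - hf)) / (2 * hf)

print(f"complex step: {d_ctse:.12f}")
print(f"central FD  : {d_fd:.12f}")
```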

  13. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    NASA Astrophysics Data System (ADS)

    Lu, Lu; Yu, Hua

    2018-05-01

    Finding new candidate diseases for known drugs provides an effective method for fast-speed and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for developing in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
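
    The abstract does not specify the classifier in detail, so the sketch below is only a generic stand-in for a regularized kernel classifier over drug-disease pairs: a Kronecker-product pair kernel fed to kernel ridge regression. The kernels, labels, and parameters are invented and are not DR2DI's.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)

def make_psd_kernel(n):
    """Random positive semi-definite matrix normalized to a similarity-like kernel."""
    a = rng.normal(size=(n, n))
    k = a @ a.T
    d = np.sqrt(np.diag(k))
    return k / np.outer(d, d)

# Toy drug and disease kernels; real ones would come from omics-based similarities.
k_drug, k_disease = make_psd_kernel(8), make_psd_kernel(6)
k_pair = np.kron(k_drug, k_disease)    # kernel over all 48 drug-disease pairs

labels = (rng.uniform(size=48) < 0.2).astype(float)   # known associations (toy)

# Regularized kernel classifier stand-in: kernel ridge on the precomputed pair kernel.
model = KernelRidge(alpha=1.0, kernel="precomputed").fit(k_pair, labels)
scores = model.predict(k_pair)         # association scores for every pair
print("top-scoring pair index:", int(np.argmax(scores)))
```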

  14. A Potential Tool for Clinicians; Evaluating a Computer-Led Dietary Assessment Method in Overweight and Obese Women during Weight Loss.

    PubMed

    Widaman, Adrianne M; Keim, Nancy L; Burnett, Dustin J; Miller, Beverly; Witbracht, Megan G; Widaman, Keith F; Laugero, Kevin D

    2017-03-01

    Many Americans are attempting to lose weight with the help of healthcare professionals. Clinicians can improve weight loss results by using technology. Accurate dietary assessment is crucial to effective weight loss. The aim of this study was to validate a computer-led dietary assessment method in overweight/obese women. Known dietary intake was compared to Automated Self-Administered 24-h recall (ASA24) reported intake in women (n = 45), 19-50 years, with body mass index of 27-39.9 kg/m². Participants received nutrition education and reduced body weight by 4%-10%. Participants completed one unannounced dietary recall and their responses were compared to actual intake. Accuracy of the recall and characteristics of respondent error were measured using linear and logistic regression. Energy was underreported by 5% with no difference for most nutrients except carbohydrates, vitamin B12, vitamin C, selenium, calcium and vitamin D (p = 0.002, p < 0.0001, p = 0.022, p = 0.010, p = 0.008 and p = 0.001, respectively). Overall, ASA24 is a valid dietary assessment tool in overweight/obese women participating in a weight loss program. The automated features eliminate the need for clinicians to be trained, to administer, or to analyze dietary intake. Computer-led dietary assessment tools should be considered as part of clinician-supervised weight loss programs.

  15. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    NASA Astrophysics Data System (ADS)

    Lu, Lu; Yu, Hua

    2018-04-01

    Finding new candidate diseases for known drugs provides an effective method for fast-speed and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for developing in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.

  16. Reverse engineering and analysis of large genome-scale gene networks

    PubMed Central

    Aluru, Maneesha; Zola, Jaroslaw; Nettleton, Dan; Aluru, Srinivas

    2013-01-01

    Reverse engineering the whole-genome networks of complex multicellular organisms continues to remain a challenge. While simpler models easily scale to large numbers of genes and gene expression datasets, more accurate models are compute intensive, limiting their scale of applicability. To enable fast and accurate reconstruction of large networks, we developed Tool for Inferring Network of Genes (TINGe), a parallel mutual information (MI)-based program. The novel features of our approach include: (i) a B-spline-based formulation for linear-time computation of MI, (ii) a novel algorithm for direct permutation testing and (iii) development of parallel algorithms to reduce run-time and facilitate construction of large networks. We assess the quality of our method by comparison with ARACNe (Algorithm for the Reconstruction of Accurate Cellular Networks) and GeneNet and demonstrate its unique capability by reverse engineering the whole-genome network of Arabidopsis thaliana from 3137 Affymetrix ATH1 GeneChips in just 9 min on a 1024-core cluster. We further report on the development of a new software tool, Gene Network Analyzer (GeNA), for extracting context-specific subnetworks from a given set of seed genes. Using TINGe and GeNA, we performed analysis of 241 Arabidopsis AraCyc 8.0 pathways, and the results are made available through the web. PMID:23042249
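
    TINGe's B-spline MI estimator and its direct permutation test are not reproduced here; the sketch below uses a crude histogram-based MI estimate and a naive permutation p-value purely to illustrate the idea, with toy expression profiles.

```python
import numpy as np

rng = np.random.default_rng(2)

def mutual_information(x, y, bins=10):
    """Histogram-based MI estimate in nats (a crude stand-in for B-spline MI)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def permutation_pvalue(x, y, n_perm=1000):
    """Fraction of permuted MI values at least as large as the observed MI."""
    observed = mutual_information(x, y)
    null = [mutual_information(rng.permutation(x), y) for _ in range(n_perm)]
    return (np.sum(np.array(null) >= observed) + 1) / (n_perm + 1)

# Two toy expression profiles with a weak dependence.
g1 = rng.normal(size=500)
g2 = 0.5 * g1 + rng.normal(size=500)
print(mutual_information(g1, g2), permutation_pvalue(g1, g2, n_perm=200))
```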

  17. Modeling of stress/strain behavior of fiber-reinforced ceramic matrix composites including stress redistribution

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Murthy, Pappu L. N.; Chamis, Christos C.

    1994-01-01

    A computational simulation procedure is presented for nonlinear analyses which incorporates microstress redistribution due to progressive fracture in ceramic matrix composites. This procedure facilitates an accurate simulation of the stress-strain behavior of ceramic matrix composites up to failure. The nonlinearity in the material behavior is accounted for at the constituent (fiber/matrix/interphase) level. This computational procedure is a part of recent upgrades to CEMCAN (Ceramic Matrix Composite Analyzer) computer code. The fiber substructuring technique in CEMCAN is used to monitor the damage initiation and progression as the load increases. The room-temperature tensile stress-strain curves for SiC fiber reinforced reaction-bonded silicon nitride (RBSN) matrix unidirectional and angle-ply laminates are simulated and compared with experimentally observed stress-strain behavior. Comparison between the predicted stress/strain behavior and experimental stress/strain curves is good. Collectively the results demonstrate that CEMCAN computer code provides the user with an effective computational tool to simulate the behavior of ceramic matrix composites.

  18. A Three Dimensional Parallel Time Accurate Turbopump Simulation Procedure Using Overset Grid Systems

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Chan, William; Kwak, Dochan

    2001-01-01

    The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows are based on relatively lower fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort provides developers with information such as transient flow phenomena at start up, and non-uniform inflows, and will eventually impact on system vibration and structures. In the proposed paper, the progress toward the capability of complete simulation of the turbo-pump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbo-pump is used as a test case for evaluation of the hybrid MPI/Open-MP and MLP versions of the INS3D code. CAD to solution auto-scripting capability is being developed for turbopump applications. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Unsteady computations for the SSME turbo-pump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 3000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability will be presented along with the performance of parallel versions of the code.

  19. Disease Staging and Prognosis in Smokers Using Deep Learning in Chest Computed Tomography.

    PubMed

    González, Germán; Ash, Samuel Y; Vegas-Sánchez-Ferrero, Gonzalo; Onieva Onieva, Jorge; Rahaghi, Farbod N; Ross, James C; Díaz, Alejandro; San José Estépar, Raúl; Washko, George R

    2018-01-15

    Deep learning is a powerful tool that may allow for improved outcome prediction. To determine if deep learning, specifically convolutional neural network (CNN) analysis, could detect and stage chronic obstructive pulmonary disease (COPD) and predict acute respiratory disease (ARD) events and mortality in smokers. A CNN was trained using computed tomography scans from 7,983 COPDGene participants and evaluated using 1,000 nonoverlapping COPDGene participants and 1,672 ECLIPSE participants. Logistic regression (C statistic and the Hosmer-Lemeshow test) was used to assess COPD diagnosis and ARD prediction. Cox regression (C index and the Greenwood-Nam-D'Agostino test) was used to assess mortality. In COPDGene, the C statistic for the detection of COPD was 0.856. A total of 51.1% of participants in COPDGene were accurately staged and 74.95% were within one stage. In ECLIPSE, 29.4% were accurately staged and 74.6% were within one stage. In COPDGene and ECLIPSE, the C statistics for ARD events were 0.64 and 0.55, respectively, and the Hosmer-Lemeshow P values were 0.502 and 0.380, respectively, suggesting no evidence of poor calibration. In COPDGene and ECLIPSE, CNN predicted mortality with fair discrimination (C indices, 0.72 and 0.60, respectively), and without evidence of poor calibration (Greenwood-Nam-D'Agostino P values, 0.307 and 0.331, respectively). A deep-learning approach that uses only computed tomography imaging data can identify those smokers who have COPD and predict those most likely to have ARD events and those with the highest mortality. At a population level, CNN analysis may be a powerful tool for risk assessment.

  20. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
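
    The following is a minimal sketch of the radial-intensity-profile step described above, using an intensity-weighted centroid for the galaxy centre and simple radial binning; none of Ganalyzer's actual code, thresholds, or slope analysis is used, and the test image is a toy Gaussian blob.

```python
import numpy as np

def radial_intensity_profile(image, n_bins=50):
    """Mean pixel intensity as a function of distance from the intensity centroid."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    cy, cx = (ys * image).sum() / total, (xs * image).sum() / total
    r = np.hypot(ys - cy, xs - cx)

    edges = np.linspace(0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), edges) - 1
    which = np.clip(which, 0, n_bins - 1)

    sums = np.bincount(which, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, sums / np.maximum(counts, 1)

# Toy "galaxy": a 2D Gaussian blob.
y, x = np.mgrid[:128, :128]
blob = np.exp(-((x - 64) ** 2 + (y - 70) ** 2) / (2 * 12.0 ** 2))
radii, profile = radial_intensity_profile(blob)
```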

  1. Data analysis of gravitational-wave signals from spinning neutron stars. III. Detection statistics and computational requirements

    NASA Astrophysics Data System (ADS)

    Jaranowski, Piotr; Królak, Andrzej

    2000-03-01

    We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection, which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work by other authors, that given 100 Gflops of computational power, an all-sky search for an observation time of 7 days and a directed search for an observation time of 120 days are possible, whereas an all-sky search for 120 days of observation time is computationally prohibitive.
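
    For readers unfamiliar with matched filtering, the following is a generic white-noise illustration of the statistic the authors refer to; the template, noise model, and numbers are invented, and this is not the paper's detector-specific formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 4096, 1.0

# A known (toy) signal template buried in white Gaussian noise.
t = np.arange(n)
template = np.sin(2 * np.pi * 0.01 * t) * np.exp(-0.5 * ((t - 2000) / 300.0) ** 2)
data = 0.4 * template + rng.normal(0.0, sigma, n)

# Matched-filter output: correlate the data with the (unit-norm) template.
# For white Gaussian noise this is the maximum-likelihood detection statistic.
h = template / np.linalg.norm(template)
snr_series = np.correlate(data, h, mode="same") / sigma
print("peak matched-filter SNR:", snr_series.max())
```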

  2. TRIM—3D: a three-dimensional model for accurate simulation of shallow water flow

    USGS Publications Warehouse

    Casulli, Vincenzo; Bertolazzi, Enrico; Cheng, Ralph T.

    1993-01-01

    A semi-implicit finite difference formulation for the numerical solution of three-dimensional tidal circulation is discussed. The governing equations are the three-dimensional Reynolds equations in which the pressure is assumed to be hydrostatic. A minimal degree of implicitness has been introduced in the finite difference formula so that the resulting algorithm permits the use of large time steps at a minimal computational cost. This formulation includes the simulation of flooding and drying of tidal flats, and is fully vectorizable for an efficient implementation on modern vector computers. The high computational efficiency of this method has made it possible to provide the fine details of circulation structure in complex regions that previous studies were unable to obtain. For proper interpretation of the model results suitable interactive graphics is also an essential tool.

  3. Parallel stochastic simulation of macroscopic calcium currents.

    PubMed

    González-Vélez, Virginia; González-Vélez, Horacio

    2007-06-01

    This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view for the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
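
    MACACO's rate constants and voltage dependence are not given in this abstract; the sketch below is only a generic discrete-time simulation of many independent three-state (closed-open-inactivated) channels whose unitary currents are summed into a macroscopic current, with an invented transition matrix.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative per-time-step transition matrix for states [closed, open, inactivated];
# rows sum to 1. A real model would make these rates voltage dependent.
P = np.array([[0.95, 0.05, 0.00],
              [0.02, 0.93, 0.05],
              [0.00, 0.01, 0.99]])

def simulate_macroscopic_current(n_channels=1000, n_steps=2000, i_unit=-0.3):
    """Sum unitary currents (pA) from independent channels; only the open state conducts."""
    states = np.zeros(n_channels, dtype=int)          # all channels start closed
    current = np.zeros(n_steps)
    for k in range(n_steps):
        # Draw the next state of every channel from its row of the transition matrix.
        u = rng.random(n_channels)
        cdf = np.cumsum(P[states], axis=1)
        states = (u[:, None] > cdf).sum(axis=1)
        current[k] = i_unit * np.count_nonzero(states == 1)
    return current

macroscopic = simulate_macroscopic_current()
```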

  4. Surface Traps in Colloidal Quantum Dots: A Combined Experimental and Theoretical Perspective.

    PubMed

    Giansante, Carlo; Infante, Ivan

    2017-10-19

    Surface traps are ubiquitous in nanoscopic semiconductor materials. Understanding their atomistic origin and manipulating them chemically are of key importance for designing defect-free colloidal quantum dots and making a leap forward in the development of efficient optoelectronic devices. Recent advances in computing power have established computational chemistry as a powerful tool for accurately describing complex chemical species, and it has now become feasible to model colloidal quantum dots with realistic sizes and shapes. In this Perspective, we combine the knowledge gathered in recent experimental findings with the computation of quantum dot electronic structures. We analyze three different systems, namely CdSe, PbS, and CsPbI3, as benchmark semiconductor nanocrystals, showing how different types of trap states can form at their surface. In addition, we suggest experimental healing of such traps according to their chemical origin and nanocrystal composition.

  5. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  6. Computer-assisted adjuncts for aneurysmal morphologic assessment: toward more precise and accurate approaches

    NASA Astrophysics Data System (ADS)

    Rajabzadeh-Oghaz, Hamidreza; Varble, Nicole; Davies, Jason M.; Mowla, Ashkan; Shakir, Hakeem J.; Sonig, Ashish; Shallwani, Hussain; Snyder, Kenneth V.; Levy, Elad I.; Siddiqui, Adnan H.; Meng, Hui

    2017-03-01

    Neurosurgeons currently base most of their treatment decisions for intracranial aneurysms (IAs) on morphological measurements made manually from 2D angiographic images. These measurements tend to be inaccurate because 2D measurements cannot capture the complex geometry of IAs and because manual measurements vary depending on the clinician's experience and opinion. Incorrect morphological measurements may lead to inappropriate treatment strategies. In order to improve the accuracy and consistency of morphological analysis of IAs, we have developed an image-based computational tool, AView. In this study, we quantified the accuracy of the computer-assisted adjuncts of AView for aneurysmal morphologic assessment by performing measurements on spheres of known size and on anatomical IA models. AView has an average morphological error of 0.56% in size and 2.1% in volume measurement. We also investigated the clinical utility of this tool on a retrospective clinical dataset and compared size and neck diameter measurements between 2D manual and 3D computer-assisted measurement. The average error was 22% and 30% in the manual measurement of size and aneurysm neck diameter, respectively. Inaccuracies due to manual measurements could therefore lead to wrong treatment decisions in 44% and inappropriate treatment strategies in 33% of the IAs. Furthermore, computer-assisted analysis of IAs improves the consistency in measurement among clinicians by 62% in size and 82% in neck diameter measurement. We conclude that AView dramatically improves accuracy for morphological analysis. These results illustrate the necessity of a computer-assisted approach for the morphological analysis of IAs.

  7. Role of Statistical Random-Effects Linear Models in Personalized Medicine.

    PubMed

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-03-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
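
    As a generic, hedged illustration of the kind of random-effects linear model discussed above, the sketch below fits a random-intercept model with statsmodels; the variable names and simulated data are invented, and this is not a dosing algorithm from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Simulated steady-state concentration data: each patient has an individual
# intercept around the population mean (purely illustrative numbers).
patients = np.repeat(np.arange(20), 6)
dose = np.tile([50, 100, 150, 200, 250, 300], 20)
patient_effect = rng.normal(0.0, 2.0, 20)[patients]
conc = 1.0 + 0.04 * dose + patient_effect + rng.normal(0.0, 1.0, len(dose))

df = pd.DataFrame({"patient": patients, "dose": dose, "conc": conc})

# Random-intercept linear model: conc ~ dose, with a per-patient random effect.
result = smf.mixedlm("conc ~ dose", df, groups=df["patient"]).fit()
print(result.summary())

# Empirical Bayes estimates of each patient's random intercept, the kind of
# individual-level quantity used for dosage individualization.
print(result.random_effects[0])
```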

  8. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo; Sterpin, Edmond

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10⁷ primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  9. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users.

    PubMed

    Veksler, Vladislav D; Buchler, Norbou; Hoffman, Blaine E; Cassenti, Daniel N; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group-level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting.

  10. Continuum approach for aerothermal flow through ablative porous material using discontinuous Galerkin discretization.

    NASA Astrophysics Data System (ADS)

    Schrooyen, Pierre; Chatelain, Philippe; Hillewaert, Koen; Magin, Thierry E.

    2014-11-01

    The atmospheric entry of spacecraft presents several challenges in simulating the aerothermal flow around the heat shield. Predicting an accurate heat flux is a complex task, especially regarding the interaction between the flow in the free stream and the erosion of the thermal protection material. To capture this interaction, a continuum approach is developed to go progressively from the region fully occupied by fluid to a receding porous medium. The volume-averaged Navier-Stokes equations are used to model both phases in the same computational domain, considering a single set of conservation laws. The porosity is itself a variable of the computation, allowing volumetric ablation to be taken into account through adequate source terms. This approach is implemented within a computational tool based on a high-order discontinuous Galerkin discretization. The multi-dimensional tool has already been validated and has demonstrated an efficient parallel implementation. Within this platform, a fully implicit method was developed to simulate multi-phase reacting flows. Numerical results to verify and validate the methodology are presented within this work. Interactions between the flow and the ablated geometry are also presented. Supported by the Fund for Research Training in Industry and Agriculture.

  11. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error by anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how accurate system models can be created from sensor (telemetry) data. The purpose of the software is to support the methodology: it provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).

  12. Required temporal resolution for accurate thoracic aortic pulse wave velocity measurements by phase-contrast magnetic resonance imaging and comparison with clinical standard applanation tonometry.

    PubMed

    Dorniak, Karolina; Heiberg, Einar; Hellmann, Marcin; Rawicz-Zegrzda, Dorota; Wesierska, Maria; Galaska, Rafal; Sabisz, Agnieszka; Szurowska, Edyta; Dudziak, Maria; Hedström, Erik

    2016-05-26

    Pulse wave velocity (PWV) is a biomarker for arterial stiffness, clinically assessed by applanation tonometry (AT). Increased use of phase-contrast cardiac magnetic resonance (CMR) imaging allows for PWV assessment with minor routine protocol additions. The aims were to investigate the acquired temporal resolution needed for accurate and precise measurements of CMR-PWV, and to develop a tool for CMR-PWV measurements. Computer phantoms were generated for PWV = 2-20 m/s based on human CMR-PWV data. The PWV measurements were performed in 13 healthy young subjects and 13 patients at risk for cardiovascular disease. The CMR-PWV was measured by through-plane phase-contrast CMR in the ascending aorta and at the diaphragm level. Centre-line aortic distance was determined between flow planes. The AT-PWV was assessed within 2 h after CMR. Three observers (CMR experience: 15, 4, and <1 year) determined CMR-PWV. The developed tool was based on the flow-curve foot transit time for PWV quantification. Computer phantoms showed a bias of 0.27 ± 0.32 m/s for a temporal resolution of at least 30 ms. Intraobserver variability for CMR-PWV was: 0 ± 0.03 m/s (15 years), -0.04 ± 0.33 m/s (4 years), and -0.02 ± 0.30 m/s (<1 year). Interobserver variability for CMR-PWV was below 0.02 ± 0.38 m/s. The AT-PWV overestimated CMR-PWV by 1.1 ± 0.7 m/s in healthy young subjects and 1.6 ± 2.7 m/s in patients. An acquired temporal resolution of at least 30 ms should be used to obtain accurate and precise thoracic aortic phase-contrast CMR-PWV. A new freely available research tool was used to measure PWV in healthy young subjects and in patients, showing low intra- and interobserver variability also for less experienced CMR observers.
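    The foot-transit-time principle mentioned above (PWV equals the aortic path length divided by the time delay between the "feet" of two flow curves) can be sketched in a few lines. The example below is not the published tool; it assumes uniformly sampled flow curves over one cardiac cycle, locates each foot with a simple fractional-upstroke threshold and linear interpolation, and uses made-up synthetic curves.

    ```python
    # Minimal foot-to-foot PWV sketch (not the published CMR tool).
    import numpy as np

    def foot_time(time_ms, flow, fraction=0.2):
        """Time at which flow first rises to `fraction` of its upstroke amplitude."""
        baseline = flow[:5].mean()            # assume the first samples are diastolic baseline
        level = baseline + fraction * (flow.max() - baseline)
        i = int(np.argmax(flow >= level))     # first sample at/above the level
        t0, t1, f0, f1 = time_ms[i - 1], time_ms[i], flow[i - 1], flow[i]
        return t0 + (level - f0) / (f1 - f0) * (t1 - t0)   # sub-sample interpolation

    def pwv_m_per_s(time_ms, flow_proximal, flow_distal, path_length_mm):
        dt_ms = foot_time(time_ms, flow_distal) - foot_time(time_ms, flow_proximal)
        return path_length_mm / dt_ms         # mm/ms is numerically equal to m/s

    # Synthetic example: two upstrokes shifted by 20 ms, 30 ms temporal resolution,
    # 120 mm centre-line distance between the two flow planes.
    t = np.arange(0, 600, 30.0)
    ascending = np.clip((t - 150) / 80, 0, 1) * 400
    diaphragm = np.clip((t - 170) / 80, 0, 1) * 400
    print(f"PWV ~ {pwv_m_per_s(t, ascending, diaphragm, 120.0):.1f} m/s")   # expect ~6 m/s
    ```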

  13. New Solar PV Tool Accurately Calculates Degradation Rates, Saving Money and Guiding Business Decisions

    Science.gov Websites

    NREL news release: "New Solar PV Tool Accurately Calculates Degradation Rates, Saving Money and Guiding Business Decisions." "We spent years building consensus ..." said Dirk Jordan, engineer and solar PV researcher at NREL.

  14. CoVaCS: a consensus variant calling system.

    PubMed

    Chiara, Matteo; Gioiosa, Silvia; Chillemi, Giovanni; D'Antonio, Mattia; Flati, Tiziano; Picardi, Ernesto; Zambelli, Federico; Horner, David Stephen; Pesole, Graziano; Castrignanò, Tiziana

    2018-02-05

    The advent and ongoing development of next generation sequencing (NGS) technologies has led to a rapid increase in the rate of human genome re-sequencing data, paving the way for personalized genomics and precision medicine. The body of genome resequencing data is progressively increasing, underlining the need for accurate and time-effective bioinformatics systems for genotyping, a crucial prerequisite for identification of candidate causal mutations in diagnostic screens. Here we present CoVaCS, a fully automated, highly accurate system with a web-based graphical interface for genotyping and variant annotation. Extensive tests on a gold standard benchmark data set, the NA12878 Illumina platinum genome, confirm that call-sets based on our consensus strategy are completely in line with those attained by similar command-line based approaches, and far more accurate than call-sets from any individual tool. Importantly, our system exhibits better sensitivity and higher specificity than equivalent commercial software. CoVaCS offers optimized pipelines integrating state-of-the-art tools for variant calling and annotation for whole genome sequencing (WGS), whole-exome sequencing (WES) and target-gene sequencing (TGS) data. The system is currently hosted at Cineca, and offers the speed of an HPC facility, a crucial consideration when large numbers of samples must be analysed. Importantly, all the analyses are performed automatically, allowing high reproducibility of the results. As such, we believe that CoVaCS can be a valuable tool for the analysis of human genome resequencing studies. CoVaCS is available at: https://bioinformatics.cineca.it/covacs.
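    The consensus strategy referred to above can be reduced to a simple voting rule over the call sets produced by the individual variant callers. The sketch below is an illustration of that idea rather than the CoVaCS pipeline itself; the caller names and the minimum-support threshold are placeholders.

    ```python
    # Hedged sketch of consensus variant calling (not the CoVaCS code): keep a
    # variant if it is reported by at least `min_support` of the callers, keying
    # each call by (chromosome, position, ref allele, alt allele).
    from collections import Counter

    def consensus_calls(callsets, min_support=2):
        """callsets: list of sets of (chrom, pos, ref, alt) tuples, one per caller."""
        counts = Counter(v for calls in callsets for v in calls)
        return {v for v, n in counts.items() if n >= min_support}

    # Toy call sets for three hypothetical callers.
    caller_a = {("chr1", 12345, "A", "G"), ("chr2", 555, "C", "T")}
    caller_b = {("chr1", 12345, "A", "G"), ("chr3", 42, "G", "A")}
    caller_c = {("chr1", 12345, "A", "G"), ("chr2", 555, "C", "T")}

    print(consensus_calls([caller_a, caller_b, caller_c], min_support=2))
    # the two variants supported by at least two callers
    ```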

  15. Different methods of image segmentation in the process of meat marbling evaluation

    NASA Astrophysics Data System (ADS)

    Ludwiczak, A.; Ślósarz, P.; Lisiak, D.; Przybylak, A.; Boniecki, P.; Stanisz, M.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Janczak, D.; Bykowska, M.

    2015-07-01

    Assessment of the level of marbling in meat based on digital images is increasingly popular, as computer vision tools are becoming more and more advanced. However, when muscle cross-sections are used as the data source for marbling evaluation, there are still a few problems to cope with. There is a need for an accurate method which would facilitate this evaluation procedure and increase its accuracy. The presented research was conducted in order to compare the effect of different image segmentation tools considering their usefulness in meat marbling evaluation on anatomical cross-sections of the muscle. This study is considered an initial trial in the presented field of research and an introduction to ultrasonic image processing and analysis.

  16. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  17. Diagnostic classification of cancer using DNA microarrays and artificial intelligence.

    PubMed

    Greer, Braden T; Khan, Javed

    2004-05-01

    The application of artificial intelligence (AI) to microarray data has been receiving much attention in recent years because of the possibility of automated diagnosis in the near future. Studies have been published predicting tumor type, estrogen receptor status, and prognosis using a variety of AI algorithms. The performance of intelligent computing decisions based on gene expression signatures is in some cases comparable to or better than the current clinical decision schemas. The goal of these tools is not to make clinicians obsolete, but rather to give clinicians one more tool in their armamentarium to accurately diagnose and hence better treat cancer patients. Several such applications are summarized in this chapter, and some of the common pitfalls are noted.

  18. A Comparative Study of Measuring Devices Used During Space Shuttle Processing for Inside Diameters

    NASA Technical Reports Server (NTRS)

    Rodriguez, Antonio

    2006-01-01

    During Space Shuttle processing, discrepancies between vehicle dimensions and per-print dimensions determine whether a part should be refurbished, replaced or accepted "as-is." The engineer's job is to address each discrepancy by choosing the most accurate procedure and tool available, sometimes with tolerances as tight as ten-thousandths of an inch. Four methods of measurement are commonly used at the Kennedy Space Center: 1) caliper, 2) mold impressions, 3) optical comparator, 4) dial bore gage. During a problem report evaluation, uncertainty arose between methods after measuring diameters with variations of up to 0.0004 inch. The results showed that computer-based measuring devices are extremely accurate, but when the human factor is involved in determining points of reference, the results may vary widely compared to more traditional methods.

  19. Background estimation and player detection in badminton video clips using histogram of pixel values along temporal dimension

    NASA Astrophysics Data System (ADS)

    Peng, Yahui; Ma, Xiao; Gao, Xinyu; Zhou, Fangxu

    2015-12-01

    Computer vision is an important tool for sports video processing. However, its application in badminton match analysis is very limited. In this study, we proposed straightforward but robust histogram-based background estimation and player detection methods for badminton video clips, and compared the results with the naive averaging method and the mixture of Gaussians method, respectively. The proposed method yielded better background estimation results than the naive averaging method and more accurate player detection results than the mixture of Gaussians player detection method. The preliminary results indicated that the proposed histogram-based method could estimate the background and extract the players accurately. We conclude that the proposed method can be used for badminton player tracking and that further studies are warranted for automated match analysis.
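    The core idea, building a histogram of each pixel's values along the temporal dimension and taking the dominant bin as background, can be sketched as follows. This is an assumption about the general approach rather than the authors' exact implementation; the bin count, threshold, and synthetic test clip are all arbitrary.

    ```python
    # Per-pixel temporal-histogram background estimation (illustrative sketch).
    import numpy as np

    def estimate_background(frames, n_bins=32):
        """frames: uint8 array of shape (T, H, W); returns an (H, W) background image."""
        t, h, w = frames.shape
        edges = np.linspace(0, 256, n_bins + 1)
        idx = np.digitize(frames, edges) - 1                  # (T, H, W) bin indices
        background = np.empty((h, w), dtype=np.uint8)
        for i in range(h):
            for j in range(w):
                counts = np.bincount(idx[:, i, j], minlength=n_bins)
                mode_bin = counts.argmax()                    # most frequent intensity bin
                background[i, j] = int((edges[mode_bin] + edges[mode_bin + 1]) / 2)
        return background

    def player_mask(frame, background, threshold=30):
        # Foreground = pixels that deviate strongly from the estimated background.
        return np.abs(frame.astype(int) - background.astype(int)) > threshold

    # Tiny synthetic demo: a nearly constant "court" with a moving bright "player".
    rng = np.random.default_rng(0)
    frames = rng.integers(90, 110, size=(50, 20, 20)).astype(np.uint8)
    for k in range(50):
        frames[k, 5:8, k % 20] = 240                          # moving foreground stripe
    bg = estimate_background(frames)
    print(player_mask(frames[0], bg).sum(), "foreground pixels detected in frame 0")
    ```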

  20. Accurate and efficient spin integration for particle accelerators

    DOE PAGES

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; ...

    2015-02-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
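    Representing each element's spin precession as a unit quaternion, composing the quaternions, and applying the product to the spin vector keeps the spin normalized to machine precision. The sketch below illustrates only that quaternion bookkeeping; the axes and angles are made-up numbers, not taken from the paper or from any accelerator lattice.

    ```python
    # Minimal quaternion spin-rotation sketch (not the GPUSPINTRACK code).
    import numpy as np

    def quat_from_axis_angle(axis, angle):
        axis = np.asarray(axis, dtype=float)
        axis /= np.linalg.norm(axis)
        return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

    def quat_multiply(q, r):
        # Hamilton product of two quaternions (w, x, y, z).
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([
            w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        ])

    def rotate_vector(q, v):
        # Rotate vector v by unit quaternion q via q * (0, v) * q_conjugate.
        qv = np.concatenate([[0.0], v])
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return quat_multiply(quat_multiply(q, qv), q_conj)[1:]

    # Precessions in two consecutive (made-up) elements, applied to a vertical spin.
    q_total = quat_multiply(quat_from_axis_angle([0, 1, 0], 0.02),   # second element
                            quat_from_axis_angle([1, 0, 0], 0.01))   # first element
    spin = rotate_vector(q_total, np.array([0.0, 1.0, 0.0]))
    print(spin, np.linalg.norm(spin))   # the spin magnitude stays 1 to machine precision
    ```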

  1. Medical image segmentation using 3D MRI data

    NASA Astrophysics Data System (ADS)

    Voronin, V.; Marchuk, V.; Semenishchev, E.; Cen, Yigang; Agaian, S.

    2017-05-01

    Precise segmentation of three-dimensional (3D) magnetic resonance imaging (MRI) images can be a very useful computer-aided diagnosis (CAD) tool in clinical routines. Accurate automatic extraction of a 3D component from images obtained by magnetic resonance imaging (MRI) is a challenging segmentation problem due to the small size of the objects of interest (e.g., blood vessels, bones) in each 2D MRA slice and the complex surrounding anatomical structures. Our objective is to develop a specific segmentation scheme for accurately extracting parts of bones from MRI images. In this paper, we use a segmentation algorithm to extract the parts of bones from magnetic resonance imaging (MRI) data sets based on a modified active contour method. As a result, the proposed method demonstrates good accuracy in comparison with existing segmentation approaches on real MRI data.

  2. Modelling the physics in iterative reconstruction for transmission computed tomography

    PubMed Central

    Nuyts, Johan; De Man, Bruno; Fessler, Jeffrey A.; Zbijewski, Wojciech; Beekman, Freek J.

    2013-01-01

    There is an increasing interest in iterative reconstruction (IR) as a key tool to improve quality and increase the applicability of X-ray CT imaging. IR has the ability to significantly reduce patient dose; it provides the flexibility to reconstruct images from arbitrary X-ray system geometries and it allows detailed models of photon transport and detection physics to be included, to accurately correct for a wide variety of image degrading effects. This paper reviews discretisation issues and modelling of finite spatial resolution, Compton scatter in the scanned object, data noise and the energy spectrum. Widespread implementation of IR with highly accurate model-based correction, however, still requires significant effort. In addition, new hardware will provide new opportunities and challenges to improve CT with new modelling. PMID:23739261

  3. Requirements for Large Eddy Simulation Computations of Variable-Speed Power Turbine Flows

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2016-01-01

    Variable-speed power turbines (VSPTs) operate at low Reynolds numbers and with a wide range of incidence angles. Transition, separation, and the relevant physics leading to them are important to VSPT flow. Higher fidelity tools such as large eddy simulation (LES) may be needed to resolve the flow features necessary for accurate predictive capability and design of such turbines. A survey conducted for this report explores the requirements for such computations. The survey is limited to the simulation of two-dimensional flow cases and endwalls are not included. It suggests that a grid resolution necessary for this type of simulation to accurately represent the physics may be of the order of Δx+ = 45, Δy+ = 2, and Δz+ = 17. Various subgrid-scale (SGS) models have been used and, except for the Smagorinsky model, all seem to perform well; in some instances the simulations worked well without SGS modeling. A method of specifying the inlet conditions such as synthetic eddy modeling (SEM) is necessary to correctly represent the inlet conditions.

  4. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    PubMed

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both aspects of hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are being suggested every year by industry and academic researchers which claim that they can improve the accuracy of measurements [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time consuming. This paper presents an accurate Simulink based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as a productivity enhancement tool during their research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.

  5. A Deep Learning Framework for Robust and Accurate Prediction of ncRNA-Protein Interactions Using Evolutionary Information.

    PubMed

    Yi, Hai-Cheng; You, Zhu-Hong; Huang, De-Shuang; Li, Xiao; Jiang, Tong-Hai; Li, Li-Ping

    2018-06-01

    The interactions between non-coding RNAs (ncRNAs) and proteins play an important role in many biological processes, and their biological functions are primarily achieved by binding with a variety of proteins. High-throughput biological techniques are used to identify protein molecules bound with specific ncRNA, but they are usually expensive and time consuming. Deep learning provides a powerful solution to computationally predict RNA-protein interactions. In this work, we propose the RPI-SAN model by using the deep-learning stacked auto-encoder network to mine the hidden high-level features from RNA and protein sequences and feed them into a random forest (RF) model to predict ncRNA binding proteins. Stacked assembling is further used to improve the accuracy of the proposed method. Four benchmark datasets, including RPI2241, RPI488, RPI1807, and NPInter v2.0, were employed for the unbiased evaluation of five established prediction tools: RPI-Pred, IPMiner, RPISeq-RF, lncPro, and RPI-SAN. The experimental results show that our RPI-SAN model achieves much better performance than other methods, with accuracies of 90.77%, 89.7%, 96.1%, and 99.33%, respectively. It is anticipated that RPI-SAN can be used as an effective computational tool for future biomedical research and can accurately predict the potential ncRNA-protein interaction pairs, which provides reliable guidance for biological research. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Learning to Detect Vandalism in Social Content Systems: A Study on Wikipedia

    NASA Astrophysics Data System (ADS)

    Javanmardi, Sara; McDonald, David W.; Caruana, Rich; Forouzan, Sholeh; Lopes, Cristina V.

    A challenge facing user generated content systems is vandalism, i.e. edits that damage content quality. The high visibility of and easy access to social networks make them popular targets for vandals. Detecting and removing vandalism is critical for these user generated content systems. Because vandalism can take many forms, there are many different kinds of features that are potentially useful for detecting it. The complex nature of vandalism, and the large number of potential features, make vandalism detection difficult and time consuming for human editors. Machine learning techniques hold promise for developing accurate, tunable, and maintainable models that can be incorporated into vandalism detection tools. We describe a method for training classifiers for vandalism detection that yields classifiers that are more accurate on the PAN 2010 corpus than others previously developed. Because of the high turnaround in social network systems, it is important for vandalism detection tools to run in real time. To this end, we use feature selection to find the minimal set of features consistent with high accuracy. In addition, because some features are more costly to compute than others, we use cost-sensitive feature selection to reduce the total computational cost of executing our models. Beyond the features previously used for spam detection, we introduce new features based on user action histories. The user history features contribute significantly to classifier performance. The approach we use is general and can easily be applied to other user generated content systems.
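    Cost-sensitive feature selection of the kind described above can be sketched as a greedy forward search that maximizes accuracy minus a weighted sum of per-feature computation costs. The feature names, costs, gains, penalty weight, and scoring function below are toy placeholders, not the paper's; a real implementation would score each subset with a trained classifier on held-out data.

    ```python
    # Greedy cost-sensitive forward feature selection (illustrative sketch).
    def select_features(features, costs, score_fn, penalty=0.01, max_features=5):
        """features: list of feature names; costs: dict name -> compute cost;
        score_fn(subset) -> validation accuracy of a model using `subset`."""
        selected = []
        best_objective = float("-inf")
        while len(selected) < max_features:
            best_candidate = None
            for f in features:
                if f in selected:
                    continue
                subset = selected + [f]
                # Objective: accuracy minus a penalty proportional to total cost.
                objective = score_fn(subset) - penalty * sum(costs[x] for x in subset)
                if objective > best_objective:
                    best_objective, best_candidate = objective, f
            if best_candidate is None:       # no remaining feature improves the objective
                break
            selected.append(best_candidate)
        return selected

    # Toy example: additive accuracy gains and very different per-feature costs.
    toy_gain = {"edit_size": 0.05, "user_history": 0.12, "text_similarity": 0.08, "regex_flags": 0.02}
    toy_cost = {"edit_size": 1.0, "user_history": 2.0, "text_similarity": 20.0, "regex_flags": 0.5}
    score = lambda subset: 0.80 + sum(toy_gain[f] for f in subset)
    print(select_features(list(toy_gain), toy_cost, score, penalty=0.01))
    ```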

  7. Comparison of high pressure transient PVT measurements and model predictions. Part I.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felver, Todd G.; Paradiso, Nicholas Joseph; Evans, Gregory Herbert

    2010-07-01

    A series of experiments consisting of vessel-to-vessel transfers of pressurized gas using Transient PVT methodology have been conducted to provide a data set for optimizing heat transfer correlations in high pressure flow systems. In rapid expansions such as these, the heat transfer conditions are neither adiabatic nor isothermal. Compressible flow tools exist, such as NETFLOW, that can accurately calculate the pressure and other dynamical mechanical properties of such a system as a function of time. However, to properly evaluate the mass that has transferred as a function of time, these computational tools rely on heat transfer correlations that must be confirmed experimentally. In this work new data sets using helium gas are used to evaluate the accuracy of these correlations for receiver vessel sizes ranging from 0.090 L to 13 L and initial supply pressures ranging from 2 MPa to 40 MPa. The comparisons show that the correlations developed in the 1980s from sparse data sets perform well for the supply vessels but are not accurate for the receivers, particularly at early time during the transfers. This report focuses on the experiments used to obtain high quality data sets that can be used to validate computational models. Part II of this report discusses how these data were used to gain insight into the physics of gas transfer and to improve vessel heat transfer correlations. Network flow modeling and CFD modeling are also discussed.

  8. Linearized Flux Evolution (LiFE): A technique for rapidly adapting fluxes from full-physics radiative transfer models

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Crisp, David

    2018-05-01

    Solar and thermal radiation are critical aspects of planetary climate, with gradients in radiative energy fluxes driving heating and cooling. Climate models require that radiative transfer tools be versatile, computationally efficient, and accurate. Here, we describe a technique that uses an accurate full-physics radiative transfer model to generate a set of atmospheric radiative quantities which can be used to linearly adapt radiative flux profiles to changes in the atmospheric and surface state: the Linearized Flux Evolution (LiFE) approach. These radiative quantities describe how each model layer in a plane-parallel atmosphere reflects and transmits light, as well as how the layer generates diffuse radiation by thermal emission and by scattering light from the direct solar beam. By computing derivatives of these layer radiative properties with respect to dynamic elements of the atmospheric state, we can then efficiently adapt the flux profiles computed by the full-physics model to new atmospheric states. We validate the LiFE approach, and then apply this approach to Mars, Earth, and Venus, demonstrating the information contained in the layer radiative properties and their derivatives, as well as how the LiFE approach can be used to determine the thermal structure of radiative and radiative-convective equilibrium states in one-dimensional atmospheric models.
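    At its simplest, the linear adaptation described above is a first-order Taylor update of a reference flux profile using precomputed derivatives with respect to the atmospheric state. The sketch below shows only that update step with made-up numbers; it is not the LiFE code, which works with layer reflection, transmission, and source properties rather than a single Jacobian.

    ```python
    # First-order flux adaptation sketch: F(x) ~ F(x0) + dF/dx * (x - x0).
    import numpy as np

    def adapt_fluxes(flux_ref, jacobian, state_ref, state_new):
        """flux_ref: (n_levels,) reference flux profile from the full-physics model.
        jacobian: (n_levels, n_params) dF/dx evaluated at state_ref.
        Returns the linearly adapted flux profile at state_new."""
        return flux_ref + jacobian @ (state_new - state_ref)

    # Made-up numbers: 4 flux levels, 2 state parameters (e.g. a temperature and
    # an absorber amount), small perturbation away from the reference state.
    F0 = np.array([240.0, 230.0, 215.0, 200.0])       # W m^-2
    J = np.array([[1.5, -8.0],
                  [1.2, -6.5],
                  [0.9, -5.0],
                  [0.7, -4.0]])
    x0 = np.array([288.0, 1.0])
    x = np.array([290.0, 1.1])
    print(adapt_fluxes(F0, J, x0, x))
    ```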

  9. How Haptic Size Sensations Improve Distance Perception

    PubMed Central

    Battaglia, Peter W.; Kersten, Daniel; Schrater, Paul R.

    2011-01-01

    Determining distances to objects is one of the most ubiquitous perceptual tasks in everyday life. Nevertheless, it is challenging because the information from a single image confounds object size and distance. Though our brains frequently judge distances accurately, the underlying computations employed by the brain are not well understood. Our work illuminates these computations by formulating a family of probabilistic models that encompass a variety of distinct hypotheses about distance and size perception. We compare these models' predictions to a set of human distance judgments in an interception experiment and use Bayesian analysis tools to quantitatively select the best hypothesis on the basis of its explanatory power and robustness over experimental data. The central question is whether, and how, human distance perception incorporates size cues to improve accuracy. Our conclusions are: 1) humans incorporate haptic object size sensations for distance perception, 2) the incorporation of haptic sensations is suboptimal given their reliability, 3) humans use environmentally accurate size and distance priors, 4) distance judgments are produced by perceptual “posterior sampling”. In addition, we compared our model's estimated sensory and motor noise parameters with previously reported measurements in the perceptual literature and found good correspondence between them. Taken together, these results represent a major step forward in establishing the computational underpinnings of human distance perception and the role of size information. PMID:21738457

  10. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Blattnig, Steve R.; Clowdsley, Martha S.; Qualls, Garry D.; Sandridge, Christopher A.; Simonsen, Lisa C.; Norbury, John W.; Slaba, Tony C.; Walker, Steven A.; Badavi, Francis F.; hide

    2010-01-01

    The On-Line Tool for the Assessment of Radiation In Space (OLTARIS) is a World Wide Web based tool that assesses the effects of space radiation on humans and electronics in items such as spacecraft, habitats, rovers, and spacesuits. This document explains the basis behind the interface and framework used to input the data, perform the assessment, and output the results to the user as well as the physics, engineering, and computer science used to develop OLTARIS. The transport and physics is based on the HZETRN and NUCFRG research codes. The OLTARIS website is the successor to the SIREST website from the early 2000's. Modifications have been made to the code to enable easy maintenance, additions, and configuration management along with a more modern web interface. Overall, the code has been verified, tested, and modified to enable faster and more accurate assessments.

  11. Icarus: visualizer for de novo assembly evaluation.

    PubMed

    Mikheenko, Alla; Valin, Gleb; Prjibelski, Andrey; Saveliev, Vladislav; Gurevich, Alexey

    2016-11-01

    Data visualization plays an increasingly important role in NGS data analysis. With advances in both sequencing and computational technologies, it has become a new bottleneck in genomics studies. Indeed, evaluation of de novo genome assemblies is one of the areas that can benefit from visualization. However, even though multiple quality assessment methods are now available, existing visualization tools are hardly suitable for this purpose. Here, we present Icarus, a novel genome visualizer for accurate assessment and analysis of genomic draft assemblies, which is based on the tool QUAST. Icarus can be used in studies where a related reference genome is available, as well as for non-model organisms. The tool is available online and as a standalone application at http://cab.spbu.ru/software/icarus. Contact: aleksey.gurevich@spbu.ru. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  12. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  13. Loci-STREAM Version 0.9

    NASA Technical Reports Server (NTRS)

    Wright, Jeffrey; Thakur, Siddharth

    2006-01-01

    Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.

  14. Algorithm for planning a double-jaw orthognathic surgery using a computer-aided surgical simulation (CASS) protocol. Part 1: planning sequence

    PubMed Central

    Xia, J. J.; Gateno, J.; Teichgraeber, J. F.; Yuan, P.; Chen, K.-C.; Li, J.; Zhang, X.; Tang, Z.; Alfi, D. M.

    2015-01-01

    The success of craniomaxillofacial (CMF) surgery depends not only on the surgical techniques, but also on an accurate surgical plan. The adoption of computer-aided surgical simulation (CASS) has created a paradigm shift in surgical planning. However, planning an orthognathic operation using CASS differs fundamentally from planning using traditional methods. With this in mind, the Surgical Planning Laboratory of Houston Methodist Research Institute has developed a CASS protocol designed specifically for orthognathic surgery. The purpose of this article is to present an algorithm using virtual tools for planning a double-jaw orthognathic operation. This paper will serve as an operation manual for surgeons wanting to incorporate CASS into their clinical practice. PMID:26573562

  15. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven, through many applications in fluid dynamics and structural mechanics, to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives, respectively. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process. The objective of this study is to verify the generalized body-modes approach in comparison to high-fidelity FSI simulations to accurately predict structural deflections and stress loads in a WEC. Two verification cases are considered, a free-floating barge and a fixed-bottom column. Details for both the generalized body-modes models and FSI models are first provided. Results for each of the models are then compared and discussed. Finally, based on the verification results obtained, future plans for incorporating the generalized body-modes method into the WEC simulation tool, WEC-Sim, and the overall WEC design process are discussed.

  17. VISPA2: a scalable pipeline for high-throughput identification and annotation of vector integration sites.

    PubMed

    Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio

    2017-11-25

    Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires computational software that is both highly accurate and efficient and able to correctly process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis with the following features: (1) the sequence analysis for the integration site processing is fully compliant with paired-end reads and includes a sequence quality filter before and after the alignment on the target genome; (2) a heuristic algorithm to reduce false positive integration sites at the nucleotide level, limiting the impact of Polymerase Chain Reaction or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface as the researcher front-end to perform integration site analyses without computational skills; (5) the time speedup of all steps through parallelization (Hadoop free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines. These performances indicate that VISPA2 is a fast, reliable and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, the web access of VISPA2 (http://openserver.itb.cnr.it/vispa/) ensures accessibility and ease of use for researchers working with a complex analytical tool. We released the source code of VISPA2 in a public repository (https://bitbucket.org/andreacalabria/vispa2).

  18. Micro-computed tomography of false starts produced on bone by different hand-saws.

    PubMed

    Pelletti, Guido; Viel, Guido; Fais, Paolo; Viero, Alessia; Visentin, Sindi; Miotto, Diego; Montisci, Massimo; Cecchetto, Giovanni; Giraudo, Chiara

    2017-05-01

    The analysis of macro- and microscopic characteristics of saw marks on bones can provide useful information about the class of the tool utilized to produce the injury. The aim of the present study was to test micro-computed tomography (micro-CT) for the analysis of false starts experimentally produced on 32 human bone sections using 4 different hand-saws in order to verify the potential utility of micro-CT for distinguishing false starts produced by different saws and to correlate the morphology of the tool with that of the bone mark. Each sample was analysed through stereomicroscopy and micro-CT. Stereomicroscopic analysis allowed the identification of the false starts and the detection of the number of tool marks left by each saw. Micro-CT scans, through the integration of 3D renders and multiplanar reconstructions (MPR), allowed the identification of the shape of each false start correlating it to the injuring tool. Our results suggest that micro-CT could be a useful technique for assessing false starts produced by different classes of saws, providing accurate morphological profiles of the bone marks with all the advantages of high resolution 3D imaging (e.g., high accuracy, non-destructive analysis, preservation and documentation of evidence). However, further studies are necessary to integrate qualitative data with quantitative metrical analysis in order to further characterize the false start and the related injuring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Some Advanced Concepts in Discrete Aerodynamic Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Green, Lawrence L.; Newman, Perry A.; Putko, Michele M.

    2003-01-01

    An efficient incremental iterative approach for differentiating advanced flow codes is successfully demonstrated on a two-dimensional inviscid model problem. The method employs the reverse-mode capability of the automatic differentiation software tool ADIFOR 3.0 and is proven to yield accurate first-order aerodynamic sensitivity derivatives. A substantial reduction in CPU time and computer memory is demonstrated in comparison with results from a straightforward, black-box reverse-mode application of ADIFOR 3.0 to the same flow code. An ADIFOR-assisted procedure for accurate second-order aerodynamic sensitivity derivatives is successfully verified on an inviscid transonic lifting airfoil example problem. The method requires that first-order derivatives are calculated first using both the forward (direct) and reverse (adjoint) procedures; then, a very efficient noniterative calculation of all second-order derivatives can be accomplished. Accurate second derivatives (i.e., the complete Hessian matrices) of lift, wave drag, and pitching-moment coefficients are calculated with respect to geometric shape, angle of attack, and freestream Mach number.

  20. Some Advanced Concepts in Discrete Aerodynamic Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Green, Lawrence L.; Newman, Perry A.; Putko, Michele M.

    2001-01-01

    An efficient incremental-iterative approach for differentiating advanced flow codes is successfully demonstrated on a 2D inviscid model problem. The method employs the reverse-mode capability of the automatic-differentiation software tool ADIFOR 3.0, and is proven to yield accurate first-order aerodynamic sensitivity derivatives. A substantial reduction in CPU time and computer memory is demonstrated in comparison with results from a straightforward, black-box reverse-mode application of ADIFOR 3.0 to the same flow code. An ADIFOR-assisted procedure for accurate second-order aerodynamic sensitivity derivatives is successfully verified on an inviscid transonic lifting airfoil example problem. The method requires that first-order derivatives are calculated first using both the forward (direct) and reverse (adjoint) procedures; then, a very efficient non-iterative calculation of all second-order derivatives can be accomplished. Accurate second derivatives (i.e., the complete Hessian matrices) of lift, wave-drag, and pitching-moment coefficients are calculated with respect to geometric shape, angle of attack, and freestream Mach number.

  1. Fast and Robust STEM Reconstruction in Complex Environments Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Wang, D.; Hollaus, M.; Puttonen, E.; Pfeifer, N.

    2016-06-01

    Terrestrial Laser Scanning (TLS) is an effective tool in forest research and management. However, accurate estimation of tree parameters still remains challenging in complex forests. In this paper, we present a novel algorithm for stem modeling in complex environments. This method does not require accurate delineation of stem points from the original point cloud. The stem reconstruction features a self-adaptive cylinder growing scheme. This algorithm is tested for a landslide region in the federal state of Vorarlberg, Austria. The algorithm results are compared with field reference data, which show that our algorithm is able to accurately retrieve the diameter at breast height (DBH) with a root mean square error (RMSE) of ~1.9 cm. This algorithm is further facilitated by applying an advanced sampling technique. Different sampling rates are applied and tested. It is found that a sampling rate of 7.5% is already able to retain the stem fitting quality and simultaneously reduce the computation time significantly by ~88%.

  2. Optimization of Microelectronic Devices for Sensor Applications

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Klimeck, Gerhard

    2000-01-01

    The NASA/JPL goal to reduce payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space (searching for optimized performance by repeated fabrication efforts) is unfeasible, it does motivate the development of reliable software design tools. These tools necessitate models based on the fundamental physics and mathematics of the device to accurately model effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly developed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications of the electromagnetic modeling of infrared filtering, as well as heterostructure device design, will be presented using genetic algorithm global optimization methods.

  3. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    PubMed

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate the migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
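    For orientation, a commonly used textbook short-time approximation for the migrated fraction from a polymer film into a well-mixed food (assuming Fickian diffusion and the food acting as a perfect sink) is m(t)/m_inf ≈ (2/L_P)·sqrt(D_P·t/π), where L_P is the film thickness and D_P the diffusion coefficient in the polymer. The sketch below implements only this generic approximation with made-up inputs; it is not the authors' optimised spreadsheet model.

    ```python
    # Short-time Fickian migration fraction (generic textbook approximation).
    import math

    def migrated_fraction(d_p_cm2_s, l_p_cm, t_s):
        # Fraction of the initial chemical mass that has migrated after t_s seconds,
        # capped at 1 since the approximation is only valid at short times.
        return min(2.0 / l_p_cm * math.sqrt(d_p_cm2_s * t_s / math.pi), 1.0)

    # Example: D_P = 1e-12 cm^2/s, 50 micrometre film, 10 days of contact.
    print(f"{migrated_fraction(1e-12, 50e-4, 10 * 24 * 3600):.2%}")
    ```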

  4. Surface Traps in Colloidal Quantum Dots: A Combined Experimental and Theoretical Perspective

    PubMed Central

    2017-01-01

    Surface traps are ubiquitous in nanoscopic semiconductor materials. Understanding their atomistic origin and manipulating them chemically are of capital importance for designing defect-free colloidal quantum dots and making a leap forward in the development of efficient optoelectronic devices. Recent advances in computing power have established computational chemistry as a powerful tool to describe complex chemical species accurately, and it has now become conceivable to model colloidal quantum dots with realistic sizes and shapes. In this Perspective, we combine the knowledge gathered in recent experimental findings with the computation of quantum dot electronic structures. We analyze three different systems, namely CdSe, PbS, and CsPbI3, as benchmark semiconductor nanocrystals, showing how different types of trap states can form at their surface. In addition, we suggest experimental healing of such traps according to their chemical origin and nanocrystal composition. PMID:28972763

  5. GBA manager: an online tool for querying low-complexity regions in proteins.

    PubMed

    Bandyopadhyay, Nirmalya; Kahveci, Tamer

    2010-01-01

    We developed GBA Manager, an online software tool that facilitates the Graph-Based Algorithm (GBA) we proposed in our earlier work. GBA identifies the low-complexity regions (LCRs) of protein sequences. GBA exploits a similarity matrix, such as BLOSUM62, to compute the complexity of the subsequences of the input protein sequence. It uses a graph-based algorithm to accurately compute the regions that have low complexities. GBA Manager is a user-friendly web service that enables online querying of protein sequences using GBA. In addition to the querying capabilities of the existing GBA algorithm, GBA Manager computes the p-values of the LCRs identified. The p-value gives an estimate of the possibility that the region appears by chance. GBA Manager presents the output in three different understandable formats. GBA Manager is freely accessible at http://bioinformatics.cise.ufl.edu/GBA/GBA.htm.
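    One simple way to see how a similarity matrix can expose low-complexity regions is to score each sliding window by the mean pairwise similarity of its residues: highly repetitive windows score high. The sketch below is only an illustration of that idea, not the GBA algorithm; the match/mismatch scores stand in for a real BLOSUM62 lookup, and the window size and cutoff are arbitrary.

    ```python
    # Sliding-window low-complexity scoring with a toy similarity matrix.
    from itertools import combinations

    def similarity(a, b):
        return 4 if a == b else -1          # placeholder for a BLOSUM62 lookup

    def window_score(window):
        # Mean pairwise similarity of the residues in the window.
        pairs = list(combinations(window, 2))
        return sum(similarity(a, b) for a, b in pairs) / len(pairs)

    def low_complexity_windows(seq, win=10, cutoff=1.5):
        hits = []
        for i in range(len(seq) - win + 1):
            if window_score(seq[i:i + win]) >= cutoff:
                hits.append((i, i + win))
        return hits

    seq = "MKTAYIAKQRQQQQQQQQQQISFVKSHFSRQLEERLGLIEVQ"
    print(low_complexity_windows(seq))      # flags windows covering the poly-Q stretch
    ```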

  6. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  7. Superadiabatic holonomic quantum computation in cavity QED

    NASA Astrophysics Data System (ADS)

    Liu, Bao-Jie; Huang, Zhen-Hua; Xue, Zheng-Yuan; Zhang, Xin-Ding

    2017-06-01

    Adiabatic quantum control is a powerful tool for quantum engineering and a key component in some quantum computation models, where accurate control over the timing of the involved pulses is not needed. However, the adiabatic condition requires that the process be very slow and thus limits its application in quantum computation, where quantum gates are preferred to be fast due to the limited coherent times of the quantum systems. Here, we propose a feasible scheme to implement universal holonomic quantum computation based on non-Abelian geometric phases with superadiabatic quantum control, where the adiabatic manipulation is sped up while retaining its robustness against errors in the timing control. Consolidating the advantages of both strategies, our proposal is thus both robust and fast. The cavity QED system is adopted as a typical example to illustrate the merits where the proposed scheme can be realized in a tripod configuration by appropriately controlling the pulse shapes and their relative strength. To demonstrate the distinct performance of our proposal, we also compare our scheme with the conventional adiabatic strategy.

  8. RighTime: A real time clock correcting program for MS-DOS-based computer systems

    NASA Technical Reports Server (NTRS)

    Becker, G. Thomas

    1993-01-01

    A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.

  9. Role of computational fluid dynamics in unsteady aerodynamics for aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Goorjian, Peter M.

    1989-01-01

    In the last two decades there have been extensive developments in computational unsteady transonic aerodynamics. Such developments are essential since the transonic regime plays an important role in the design of modern aircraft. Therefore, there has been a large effort to develop computational tools with which to accurately perform flutter analysis at transonic speeds. In the area of Computational Fluid Dynamics (CFD), unsteady transonic aerodynamics are characterized by the feature of modeling the motion of shock waves over aerodynamic bodies, such as wings. This modeling requires the solution of nonlinear partial differential equations. Most advanced codes such as XTRAN3S use the transonic small perturbation equation. Currently, XTRAN3S is being used for generic research in unsteady aerodynamics and aeroelasticity of almost full aircraft configurations. Use of Euler/Navier Stokes equations for simple typical sections has just begun. A brief history of the development of CFD for aeroelastic applications is summarized. The development of unsteady transonic aerodynamics and aeroelasticity are also summarized.

  10. VOFTools - A software package of calculation tools for volume of fluid methods using general convex grids

    NASA Astrophysics Data System (ADS)

    López, J.; Hernández, J.; Gómez, P.; Faura, F.

    2018-02-01

    The VOFTools library includes efficient analytical and geometrical routines for (1) area/volume computation, (2) truncation operations that typically arise in VOF (volume of fluid) methods, (3) area/volume conservation enforcement (VCE) in PLIC (piecewise linear interface calculation) reconstruction and (4) computation of the distance from a given point to the reconstructed interface. The computation of a polyhedron volume uses an efficient formula based on a quadrilateral decomposition and a 2D projection of each polyhedron face. The analytical VCE method is based on coupling an interpolation procedure to bracket the solution with an improved final calculation step based on the above volume computation formula. Although the library was originally created to help develop highly accurate advection and reconstruction schemes in the context of VOF methods, it may have more general applications. To assess the performance of the supplied routines, different tests, which are provided in FORTRAN and C, were implemented for several 2D and 3D geometries.
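    A generic way to compute a polyhedron volume from its faces, similar in spirit to (though not identical with) the face-projection formula mentioned above, is to fan-triangulate each face and accumulate signed tetrahedron volumes via the divergence theorem. The sketch below assumes faces are given with vertices ordered counter-clockwise as seen from outside the polyhedron; it is an illustration, not the VOFTools routine.

    ```python
    # Polyhedron volume from triangulated faces via the divergence theorem.
    import numpy as np

    def polyhedron_volume(vertices, faces):
        """vertices: (n, 3) array; faces: lists of vertex indices, each ordered
        counter-clockwise when viewed from outside the polyhedron."""
        v = np.asarray(vertices, dtype=float)
        volume = 0.0
        for face in faces:
            p0 = v[face[0]]
            for a, b in zip(face[1:-1], face[2:]):     # fan triangulation of the face
                volume += np.dot(p0, np.cross(v[a], v[b])) / 6.0
        return volume

    # Unit cube test case, faces ordered with outward normals.
    verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
             (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
    faces = [[0, 3, 2, 1], [4, 5, 6, 7], [0, 1, 5, 4],
             [1, 2, 6, 5], [2, 3, 7, 6], [3, 0, 4, 7]]
    print(polyhedron_volume(verts, faces))   # 1.0
    ```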

  11. Optic disc boundary segmentation from diffeomorphic demons registration of monocular fundus image sequences versus 3D visualization of stereo fundus image pairs for automated early stage glaucoma assessment

    NASA Astrophysics Data System (ADS)

    Gatti, Vijay; Hill, Jason; Mitra, Sunanda; Nutter, Brian

    2014-03-01

    Despite the current availability of advanced scanning and 3-D imaging technologies in ophthalmology practice in resource-rich regions, world-wide screening tests for early detection and progression of glaucoma still consist of a variety of simple tools, including fundus image-based parameters such as CDR (cup to disc diameter ratio) and CAR (cup to disc area ratio), especially in resource-poor regions. Reliable automated computation of the relevant parameters from fundus image sequences requires robust non-rigid registration and segmentation techniques. Recent research work demonstrated that proper non-rigid registration of multi-view monocular fundus image sequences could result in acceptable segmentation of cup boundaries for automated computation of CAR and CDR. This research work introduces a composite diffeomorphic demons registration algorithm for segmentation of cup boundaries from a sequence of monocular images and compares the resulting CAR and CDR values with those computed manually by experts and from 3-D visualization of stereo pairs. Our preliminary results show that the automated computation of CDR and CAR from composite diffeomorphic segmentation of monocular image sequences yields values comparable with those from the other two techniques and thus may provide global healthcare with a cost-effective yet accurate tool for management of glaucoma in its early stage.
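
    For reference, the two reported parameters are simple ratios once the cup and disc boundaries are segmented. The Python sketch below computes CDR and CAR from binary masks; it illustrates only the parameter computation, not the diffeomorphic demons registration itself, and the mask layout is an assumption.

      # CDR (cup-to-disc diameter ratio) and CAR (cup-to-disc area ratio) from
      # segmented binary masks; illustrative parameter computation only.
      import numpy as np

      def vertical_diameter(mask):
          rows = np.where(mask.any(axis=1))[0]
          return rows.max() - rows.min() + 1 if rows.size else 0

      def cdr_car(cup_mask, disc_mask):
          cdr = vertical_diameter(cup_mask) / vertical_diameter(disc_mask)
          car = cup_mask.sum() / disc_mask.sum()
          return cdr, car

      # Toy example: a 20x20-pixel disc containing an 8x8-pixel cup.
      disc = np.zeros((64, 64), dtype=bool); disc[10:30, 10:30] = True
      cup = np.zeros_like(disc);             cup[16:24, 16:24] = True
      print(cdr_car(cup, disc))   # -> (0.4, 0.16)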

  12. Implementing Nonlinear Buoyancy and Excitation Forces in the WEC-Sim Wave Energy Converter Modeling Tool: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawson, M.; Yu, Y. H.; Nelessen, A.

    2014-05-01

    Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins Equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the use of high performance computing resources necessitated by high-fidelity methods, such as Navier-Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the device is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to hereafter as instantaneous buoyancy and Froude-Krylov forces) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into WEC-Sim. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
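
    The distinction between the linear and instantaneous treatments can be pictured with a heaving vertical cylinder: the linearized model applies a constant hydrostatic stiffness, while the instantaneous model computes buoyancy from the actual submerged volume at each time step. The Python sketch below is an illustration under those assumptions, not WEC-Sim code.

      # Linear restoring force vs. buoyancy from the instantaneous submerged
      # volume, for a heaving vertical cylinder.  Illustrative only.
      import numpy as np

      RHO_G = 1025.0 * 9.81                 # seawater weight density [N/m^3]

      def linear_restoring(z, radius):
          """Linearized net vertical force about equilibrium: -rho*g*Awp*z."""
          return -RHO_G * np.pi * radius**2 * z

      def instantaneous_buoyancy(z, radius, draft, mass):
          """Net vertical force from the actual submerged volume at heave z."""
          submerged = np.clip(draft - z, 0.0, None)          # submerged length [m]
          return RHO_G * np.pi * radius**2 * submerged - mass * 9.81

      # A 2 m radius, 1 m draft cylinder displaced 1.5 m upward (out of the water):
      r, d = 2.0, 1.0
      m = 1025.0 * np.pi * r**2 * d                          # neutrally buoyant at rest
      print(linear_restoring(1.5, r), instantaneous_buoyancy(1.5, r, d, m))
      # The linear force keeps growing with displacement; the instantaneous
      # force saturates at the device weight once the hull leaves the water.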

  13. Proceedings Second Annual Cyber Security and Information Infrastructure Research Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, Frederick T; Krings, Axel; Yoo, Seong-Moo

    2006-01-01

    The workshop theme is Cyber Security: Beyond the Maginot Line. Recently the FBI reported that computer crime has skyrocketed, costing over $67 billion in 2005 alone and affecting more than 2.8 million businesses and organizations. Attack sophistication is unprecedented, along with the concomitant availability of open source tools. Private, academic, and public sectors invest significant resources in cyber security. Industry primarily performs cyber security research as an investment in future products and services. While the public sector also funds cyber security R&D, the majority of this activity focuses on the specific mission(s) of the funding agency. Thus, broad areas of cyber security remain neglected or underdeveloped. Consequently, this workshop endeavors to explore issues involving cyber security and related technologies toward strengthening such areas and enabling the development of new tools and methods for securing our information infrastructure critical assets. We aim to assemble new ideas and proposals about robust models on which we can build the architecture of a secure cyberspace including but not limited to: * Knowledge discovery and management * Critical infrastructure protection * De-obfuscating tools for the validation and verification of tamper-proofed software * Computer network defense technologies * Scalable information assurance strategies * Assessment-driven design for trust * Security metrics and testing methodologies * Validation of security and survivability properties * Threat assessment and risk analysis * Early accurate detection of the insider threat * Security hardened sensor networks and ubiquitous computing environments * Mobile software authentication protocols * A new "model" of the threat to replace the "Maginot Line" model and more . . .

  14. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Numerical implementation of equations for photon motion in Kerr spacetime

    NASA Astrophysics Data System (ADS)

    Bursa, Michal

    2017-12-01

    Raytracing is one of the essential tools for accurate modeling of spectra and variability of various astrophysical objects. It is of major importance in relativistic environments, where light is subject to a number of relativistic effects. Because the trajectories of light rays in curved spacetimes, and in Kerr spacetime in particular, are highly non-trivial, we summarize the equations governing the motion of a photon (or any other zero-rest-mass particle) and give an analytic solution of the equations that can be further used in practical computer implementations.

  16. Near-Infrared Fluorescence-Enhanced Optical Tomography

    PubMed Central

    2016-01-01

    Fluorescence-enhanced optical imaging using near-infrared (NIR) light developed for in vivo molecular targeting and reporting of cancer provides promising opportunities for diagnostic imaging. The current state of the art of NIR fluorescence-enhanced optical tomography is reviewed in the context of the principle of fluorescence, the different measurement schemes employed, and the mathematical tools established to tomographically reconstruct the fluorescence optical properties in various tissue domains. Finally, we discuss the recent advances in forward modeling and distributed memory parallel computation to provide robust, accurate, and fast fluorescence-enhanced optical tomography. PMID:27803924

  17. Ubiquitous Wireless Smart Sensing and Control

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond

    2013-01-01

    Need new technologies to reliably and safely have humans interact within sensored environments (integrated user interfaces, physical and cognitive augmentation, training, and human-systems integration tools). Areas of focus include: radio frequency identification (RFID), motion tracking, wireless communication, wearable computing, adaptive training and decision support systems, and tele-operations. The challenge is developing effective, low cost/mass/volume/power integrated monitoring systems to assess and control system, environmental, and operator health; and accurately determining and controlling the physical, chemical, and biological environments of the areas and associated environmental control systems.

  18. Ubiquitous Wireless Smart Sensing and Control. Pumps and Pipes JSC: Uniquely Houston

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond

    2013-01-01

    Need new technologies to reliably and safely have humans interact within sensored environments (integrated user interfaces, physical and cognitive augmentation, training, and human-systems integration tools). Areas of focus include: radio frequency identification (RFID), motion tracking, wireless communication, wearable computing, adaptive training and decision support systems, and tele-operations. The challenge is developing effective, low cost/mass/volume/power integrated monitoring systems to assess and control system, environmental, and operator health; and accurately determining and controlling the physical, chemical, and biological environments of the areas and associated environmental control systems.

  19. Automated payload experiment tool feasibility study

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Clark, James; Delugach, Harry; Hammons, Charles; Logan, Julie; Provancha, Anna

    1991-01-01

    To achieve an environment less dependent on the flow of paper, automated techniques of data storage and retrieval must be utilized. The prototype under development seeks to demonstrate the ability of a knowledge-based, hypertext computer system. This prototype is concerned with the logical links between two primary NASA support documents, the Science Requirements Document (SRD) and the Engineering Requirements Document (ERD). Once developed, the final system should have the ability to guide a principal investigator through the documentation process in a more timely and efficient manner, while supplying more accurate information to the NASA payload developer.

  20. Recommendations on Model Fidelity for Wind Turbine Gearbox Simulations; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, J.; Lacava, W.; Austin, J.

    2015-02-01

    This work investigates the minimum level of fidelity required to accurately simulate wind turbine gearboxes using state-of-the-art design tools. Excessive model fidelity, including drivetrain complexity, gearbox complexity, excitation sources, and imperfections, significantly increases computational time but may not provide a commensurate increase in the value of the results. Essential design parameters are evaluated, including the planetary load-sharing factor, gear tooth load distribution, and sun orbit motion. Based on the sensitivity study results, recommendations for the minimum model fidelities are provided.

  1. Near-Infrared Fluorescence-Enhanced Optical Tomography.

    PubMed

    Zhu, Banghe; Godavarty, Anuradha

    2016-01-01

    Fluorescence-enhanced optical imaging using near-infrared (NIR) light developed for in vivo molecular targeting and reporting of cancer provides promising opportunities for diagnostic imaging. The current state of the art of NIR fluorescence-enhanced optical tomography is reviewed in the context of the principle of fluorescence, the different measurement schemes employed, and the mathematical tools established to tomographically reconstruct the fluorescence optical properties in various tissue domains. Finally, we discuss the recent advances in forward modeling and distributed memory parallel computation to provide robust, accurate, and fast fluorescence-enhanced optical tomography.

  2. Experiences of registered nurses with regard to accessing health information at the point-of-care via mobile computing devices.

    PubMed

    Ricks, Esmeralda; Benjamin, Valencia; Williams, Margaret

    2015-11-19

    The volume of health information necessary to provide competent health care today has become overwhelming. Mobile computing devices are fast becoming an essential clinical tool for accessing health information at the point-of-care of patients. This study explored and described how registered nurses experienced accessing information at the point-of-care via mobile computing devices (MCDs). A qualitative, exploratory, descriptive and contextual design was used. Ten in-depth interviews were conducted with purposively sampled registered nurses employed by a state hospital in the Nelson Mandela Bay Municipality (NMBM). Interviews were recorded, transcribed verbatim and analysed using Tesch's data analysis technique. Ethical principles were adhered to throughout the study. Guba's model of trustworthiness was used to confirm integrity of the study. Four themes emerged which revealed that the registered nurses benefited from the training they received by enabling them to develop, and improve, their computer literacy levels. Emphasis was placed on the benefits that the accessed information had for educational purposes for patients and the public, for colleagues and students. Furthermore the ability to access information at the point-of-care was considered by registered nurses as valuable to improve patient care because of the wide range of accurate and readily accessible information available via the mobile computing device. The registered nurses in this study felt that being able to access information at the point-of-care increased their confidence and facilitated the provision of quality care because it assisted them in being accurate and sure of what they were doing.

  3. Role of Statistical Random-Effects Linear Models in Personalized Medicine

    PubMed Central

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-01-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392
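
    The empirical Bayesian dosage idea rests on shrinkage: an individual's parameter estimate is pulled toward the population mean in proportion to the relative between-patient and within-patient variances. The Python sketch below shows this for a random-intercept model with made-up numbers; it illustrates the principle only and is not the authors' model.

      # Empirical-Bayes shrinkage in a random-intercept model: a patient's
      # clearance estimate is pulled toward the population mean when the
      # patient's own data are sparse or noisy.  Numbers are made up.
      import numpy as np

      def empirical_bayes_mean(individual_obs, pop_mean, between_var, within_var):
          n = len(individual_obs)
          weight = between_var / (between_var + within_var / n)   # shrinkage factor
          return weight * np.mean(individual_obs) + (1.0 - weight) * pop_mean

      # Population clearance 10 L/h; a patient with only two noisy measurements.
      print(empirical_bayes_mean([14.0, 13.0], pop_mean=10.0,
                                 between_var=4.0, within_var=9.0))   # ~11.6 L/h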

  4. A Comparative Study of Simulated and Measured Gear-Flap Flow Interaction

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Mineck, Raymond E.; Yao, Chungsheng; Jenkins, Luther N.; Fares, Ehab

    2015-01-01

    The ability of two CFD solvers to accurately characterize the transient, complex, interacting flowfield associated with a realistic gear-flap configuration is assessed via comparison of simulated flow with experimental measurements. The simulated results, obtained with NASA's FUN3D and Exa's PowerFLOW® for a high-fidelity, 18% scale semi-span model of a Gulfstream aircraft in landing configuration (39 deg flap deflection, main landing gear on and off) are compared to two-dimensional and stereo particle image velocimetry measurements taken within the gear-flap flow interaction region during wind tunnel tests of the model. As part of the benchmarking process, direct comparisons of the mean and fluctuating velocity fields are presented in the form of planar contour plots and extracted line profiles at measurement planes in various orientations stationed in the main gear wake. The measurement planes in the vicinity of the flap side edge and downstream of the flap trailing edge are used to highlight the effects of gear presence on tip vortex development and the ability of the computational tools to accurately capture such effects. The present study indicates that both computed datasets contain enough detail to construct a relatively accurate depiction of gear-flap flow interaction. Such a finding increases confidence in using the simulated volumetric flow solutions to examine the behavior of pertinent aerodynamic mechanisms within the gear-flap interaction zone.

  5. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  6. Drooling in Parkinson's disease: a novel tool for assessment of swallow frequency.

    PubMed

    Marks, L; Weinreich, J

    2001-01-01

    A non-invasive way to obtain objective measurements of swallowing frequency, and thus indirectly of drooling, was required as part of the study 'Drooling in Parkinson's disease: objective measurement and response to therapy'. A hard-disk digital recorder was developed for use on a laptop computer, capable of collecting large quantities of swallowing data from an anticipated 40 patients and 10 controls. An electric microphone was taped to the subjects' larynx for recording the swallow sounds when drinking 150 ml of water and at rest for 30 minutes. The software provides an accurate visual display of the audio-signal, allowing the researcher easy access to any segment of the recording and to mark and extract the swallow events, so that swallow frequency may be efficiently and accurately ascertained. Preliminary results are presented.

  7. A macro-micro robot for precise force applications

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Wang, Yulun

    1993-01-01

    This paper describes an 8 degree-of-freedom macro-micro robot capable of performing tasks which require accurate force control. Applications such as polishing, finishing, grinding, deburring, and cleaning are a few examples of tasks which need this capability. Currently these tasks are either performed manually or with dedicated machinery because of the lack of a flexible and cost effective tool, such as a programmable force-controlled robot. The basic design and control of the macro-micro robot is described in this paper. A modular high-performance multiprocessor control system was designed to provide sufficient compute power for executing advanced control methods. An 8 degree of freedom macro-micro mechanism was constructed to enable accurate tip forces. Control algorithms based on the impedance control method were derived, coded, and load balanced for maximum execution speed on the multiprocessor system.
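
    As a pointer to the control law involved, the sketch below implements a basic impedance relation of the form F = K(x_d - x) + B(v_d - v), mapping position and velocity errors to a commanded tip force. The gains and the 1-DOF setup are assumptions for illustration; this is not the paper's multiprocessor controller.

      # Basic impedance-control law: commanded force follows a target
      # spring-damper relationship between desired and measured motion.
      # Gains and the 1-DOF example are illustrative only.
      import numpy as np

      def impedance_force(x, xd, v, vd, K, B):
          """F = K*(xd - x) + B*(vd - v) for target stiffness K and damping B."""
          return K @ (xd - x) + B @ (vd - v)

      K = np.array([[400.0]])               # target stiffness [N/m]
      B = np.array([[20.0]])                # target damping [N*s/m]
      f = impedance_force(x=np.array([0.012]), xd=np.array([0.010]),
                          v=np.array([0.0]), vd=np.array([0.0]), K=K, B=B)
      print(f)   # -> [-0.8] N, pushing the tip back toward the reference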

  8. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. VEDA: a web-based virtual environment for dynamic atomic force microscopy.

    PubMed

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, ranging from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.
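
    The core of such tip-dynamics computations is typically a driven, damped oscillator for the cantilever with a nonlinear tip-sample force. The Python sketch below integrates a single-mode model with a toy van der Waals plus repulsive-wall force; all parameters and the contact model are illustrative assumptions, not VEDA's implementation.

      # Single-mode dAFM tip dynamics: driven, damped oscillator plus a toy
      # tip-sample force.  Parameters and the force model are illustrative.
      import numpy as np

      k, Q, f0 = 40.0, 100.0, 300e3            # stiffness [N/m], quality factor, resonance [Hz]
      w0 = 2.0 * np.pi * f0
      m, c = k / w0**2, k / (w0 * Q)           # effective mass and damping
      F0, Z = 4e-9, 8e-9                       # drive force [N], mean tip-sample gap [m]

      def f_ts(d):
          """Toy tip-sample force: van der Waals attraction plus a repulsive wall."""
          H, R, a0 = 1e-19, 10e-9, 0.3e-9
          if d > a0:
              return -H * R / (6.0 * d * d)
          return -H * R / (6.0 * a0 * a0) + 1.0 * (a0 - d)

      dt, n = 1.0 / (200.0 * f0), 80000        # 400 drive cycles, 200 steps per cycle
      x = v = 0.0
      xs = np.empty(n)
      for i in range(n):                       # semi-implicit Euler integration
          a = (F0 * np.cos(w0 * i * dt) - k * x - c * v + f_ts(Z + x)) / m
          v += a * dt
          x += v * dt
          xs[i] = x
      print("steady-state amplitude ~ %.1f nm" % (xs[-2000:].max() * 1e9))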

  10. Invited Article: VEDA: A web-based virtual environment for dynamic atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, ranging from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.

  11. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system as new lots have to wait until the previous lot is measured. One solution is to use a less dense overlay sampling scheme and computationally up-sample the data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system, shown in Fig. 1, that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.
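
    The hybrid idea can be summarized as: fit a smooth global model to the sparse measurements, then add back interpolated local residuals so that measured local errors survive in the dense output. The Python sketch below uses a simple linear wafer model and nearest-neighbour residual interpolation purely as stand-ins; it does not reproduce the paper's actual OCM models or sampling.

      # "Hybrid" fingerprint sketch: global model fit plus interpolated local
      # residuals.  The linear model and nearest-neighbour interpolation are
      # illustrative stand-ins only.
      import numpy as np
      from scipy.interpolate import griddata

      def hybrid_fingerprint(xy_meas, dxy_meas, xy_dense):
          # Global model: overlay error linear in wafer coordinates.
          A = np.column_stack([np.ones(len(xy_meas)), xy_meas[:, 0], xy_meas[:, 1]])
          cx, *_ = np.linalg.lstsq(A, dxy_meas[:, 0], rcond=None)
          cy, *_ = np.linalg.lstsq(A, dxy_meas[:, 1], rcond=None)
          Ad = np.column_stack([np.ones(len(xy_dense)), xy_dense[:, 0], xy_dense[:, 1]])
          model = np.column_stack([Ad @ cx, Ad @ cy])
          # Local residuals at the measured sites, spread onto the dense grid.
          resid = dxy_meas - np.column_stack([A @ cx, A @ cy])
          resid_dense = np.column_stack(
              [griddata(xy_meas, resid[:, k], xy_dense, method="nearest") for k in (0, 1)])
          return model + resid_dense

      # Tiny example: five measured sites up-sampled onto three dense sites.
      xy_m = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5]])
      d_m = np.array([[1., 0.], [2., 0.], [1., 1.], [2., 1.], [5., 0.5]])
      xy_d = np.array([[0.25, 0.25], [0.5, 0.5], [0.75, 0.75]])
      print(hybrid_fingerprint(xy_m, d_m, xy_d))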

  12. The Continual Intercomparison of Radiation Codes: Results from Phase I

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri

    2011-01-01

    The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order not to impose an undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. The manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC), where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality, and will guide the development of future phases of CIRC.

  13. Edge control in a computer controlled optical surfacing process using a heterocercal tool influence function.

    PubMed

    Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun

    2016-11-14

    Edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to weigh up the consequences of the two following cases. Operating CCOS in a large overhang condition affects the accuracy of material removal, while in a small overhang condition, it achieves a more accurate performance, but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner place to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the explosive power. The heterocercal TIF was theoretically analyzed, and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.

  14. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  15. Benchmarking of density functionals for a soft but accurate prediction and assignment of 1H and 13C NMR chemical shifts in organic and biological molecules.

    PubMed

    Benassi, Enrico

    2017-01-15

    A number of programs and tools that simulate 1H and 13C nuclear magnetic resonance (NMR) chemical shifts using empirical approaches are available. These tools are user-friendly, but they provide a very rough (and sometimes misleading) estimation of the NMR properties, especially for complex systems. Rigorous and reliable ways to predict and interpret NMR properties of simple and complex systems are available in many popular computational program packages. Nevertheless, experimentalists keep relying on these "unreliable" tools in their daily work because, to have a sufficiently high accuracy, these rigorous quantum mechanical methods need high levels of theory. An alternative, efficient, semi-empirical approach has been proposed by Bally, Rablen, Tantillo, and coworkers. This idea consists of creating linear calibration models, on the basis of the application of different combinations of functionals and basis sets. Following this approach, the predictive capability of a wider range of popular functionals was systematically investigated and tested. The NMR chemical shifts were computed in solvated phase at density functional theory level, using 30 different functionals coupled with three different triple-ζ basis sets. © 2016 Wiley Periodicals, Inc.
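
    The linear-calibration idea amounts to regressing computed isotropic shieldings against experimental shifts for a training set and reusing the fitted line on new shieldings. The Python sketch below shows this with made-up values; the actual slopes and intercepts depend on the functional, basis set, and solvent model being benchmarked.

      # Linear-scaling NMR prediction: fit delta = slope*sigma + intercept on a
      # reference set of computed shieldings and experimental shifts, then apply
      # the fit to new shieldings.  The numbers below are made up.
      import numpy as np

      sigma_calc = np.array([31.8, 29.5, 26.1, 24.3, 21.0])    # computed shieldings [ppm]
      delta_exp = np.array([0.9, 3.1, 6.7, 8.4, 11.8])         # experimental shifts [ppm]

      slope, intercept = np.polyfit(sigma_calc, delta_exp, 1)  # slope comes out negative

      def predict_shift(sigma):
          return slope * sigma + intercept

      print("slope %.3f, intercept %.1f, predicted shift for sigma=27.0: %.2f ppm"
            % (slope, intercept, predict_shift(27.0)))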

  16. CFD: computational fluid dynamics or confounding factor dissemination? The role of hemodynamics in intracranial aneurysm rupture risk assessment.

    PubMed

    Xiang, J; Tutino, V M; Snyder, K V; Meng, H

    2014-10-01

    Image-based computational fluid dynamics holds a prominent position in the evaluation of intracranial aneurysms, especially as a promising tool to stratify rupture risk. Current computational fluid dynamics findings correlating both high and low wall shear stress with intracranial aneurysm growth and rupture puzzle researchers and clinicians alike. These conflicting findings may stem from inconsistent parameter definitions, small datasets, and intrinsic complexities in intracranial aneurysm growth and rupture. In Part 1 of this 2-part review, we proposed a unifying hypothesis: both high and low wall shear stress drive intracranial aneurysm growth and rupture through mural cell-mediated and inflammatory cell-mediated destructive remodeling pathways, respectively. In the present report, Part 2, we delineate different wall shear stress parameter definitions and survey recent computational fluid dynamics studies, in light of this mechanistic heterogeneity. In the future, we expect that larger datasets, better analyses, and increased understanding of hemodynamic-biologic mechanisms will lead to more accurate predictive models for intracranial aneurysm risk assessment from computational fluid dynamics. © 2014 by American Journal of Neuroradiology.

  17. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  18. Postflight aerothermodynamic analysis of Pegasus(tm) using computational fluid dynamic techniques

    NASA Technical Reports Server (NTRS)

    Kuhn, Gary D.

    1992-01-01

    The objective was to validate the computational capability of the NASA Ames Navier-Stokes code, F3D, for flows at high Mach numbers using comparison flight test data from the Pegasus (tm) air launched, winged space booster. Comparisons were made with temperature and heat fluxes estimated from measurements on the wing surfaces and wing-fuselage fairings. Tests were conducted for solution convergence, sensitivity to grid density, and effects of distributing grid points to provide high density near temperature and heat flux sensors. The measured temperatures were from sensors embedded in the ablating thermal protection system. Surface heat fluxes were from plugs fabricated of highly insulative, nonablating material, and mounted level with the surface of the surrounding ablative material. As a preflight design tool, the F3D code produces accurate predictions of heat transfer and other aerodynamic properties, and it can provide detailed data for assessment of boundary layer separation, shock waves, and vortex formation. As a postflight analysis tool, the code provides a way to clarify and interpret the measured results.

  19. Fast simulation tool for ultraviolet radiation at the earth's surface

    NASA Astrophysics Data System (ADS)

    Engelsen, Ola; Kylling, Arve

    2005-04-01

    FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
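
    The speed comes from replacing repeated radiative-transfer runs with interpolation in precomputed tables. The Python sketch below shows the bare look-up-table idea for a mock transmittance over solar zenith angle and ozone column; the grid and values are made up and are not FastRT's tables.

      # Look-up-table idea: precompute a smooth quantity on a coarse grid and
      # bilinearly interpolate instead of re-running the radiative-transfer
      # model.  The mock transmittance table below is made up.
      import numpy as np

      sza_grid = np.array([0.0, 20.0, 40.0, 60.0, 80.0])      # solar zenith angle [deg]
      o3_grid = np.array([200.0, 300.0, 400.0])                # ozone column [DU]
      table = np.array([[0.62, 0.55, 0.49],
                        [0.60, 0.53, 0.47],
                        [0.52, 0.45, 0.39],
                        [0.38, 0.31, 0.26],
                        [0.15, 0.11, 0.08]])                   # shape (sza, ozone)

      def bilinear(sza, o3):
          i = np.clip(np.searchsorted(sza_grid, sza) - 1, 0, len(sza_grid) - 2)
          j = np.clip(np.searchsorted(o3_grid, o3) - 1, 0, len(o3_grid) - 2)
          tx = (sza - sza_grid[i]) / (sza_grid[i + 1] - sza_grid[i])
          ty = (o3 - o3_grid[j]) / (o3_grid[j + 1] - o3_grid[j])
          lo = table[i, j] * (1 - tx) + table[i + 1, j] * tx
          hi = table[i, j + 1] * (1 - tx) + table[i + 1, j + 1] * tx
          return lo * (1 - ty) + hi * ty

      print(bilinear(sza=35.0, o3=325.0))   # interpolated transmittance (~0.46)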

  20. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using think aloud technique and video recording, we captured their computer screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found the experts' modeling processes followed the linear sequence built in the modeling program with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered, and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  1. MINIVER upgrade for the AVID system. Volume 3: EXITS user's and input guide

    NASA Technical Reports Server (NTRS)

    Pond, J. E.; Schmitz, C. P.

    1983-01-01

    The successful design of thermal protection systems for vehicles operating in atmosphere and near-space environments requires accurate analyses of heating rate and temperature histories encountered along a trajectory. For preliminary design calculations, however, the requirement for accuracy must be tempered by the need for speed and versatility in computational tools used to determine thermal environments and structural thermal response. The MINIVER program was found to provide the proper balance between versatility, speed and accuracy for an aerothermal prediction tool. The advancement in computer aided design concepts at Langley Research Center (LaRC) in the past few years has made it desirable to incorporate the MINIVER program into the LaRC Advanced Vehicle Integrated Design, AVID, system. In order to effectively incorporate MINIVER into the AVID system, several changes to MINIVER were made. The thermal conduction options in MINIVER were removed and a new Explicit Interactive Thermal Structures (EXITS) code was developed. Many upgrades to the MINIVER code were made and a new Langley version of MINIVER called LANMIN was created.

  2. CaFE: a tool for binding affinity prediction using end-point free energy methods.

    PubMed

    Liu, Hui; Hou, Tingjun

    2016-07-15

    Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among those methods for binding affinity predictions, the end-point approaches, such as MM/PBSA and LIE, have been widely used because they can achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin. It is a VMD plugin written in Tcl and the usage is platform-independent. tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
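
    The end-point bookkeeping behind MM/PBSA-style estimates is simple: average a per-frame energy G = E_MM + G_solv over snapshots of the complex, receptor, and ligand, then subtract. The Python sketch below shows that arithmetic with made-up per-frame values and omits the entropy term; it is not CaFE code.

      # End-point (MM/PBSA-style) bookkeeping: dG_bind ~= <G_complex> -
      # <G_receptor> - <G_ligand>, with G = E_MM + G_solv per frame.  Entropy
      # is omitted and the per-frame numbers are made up.
      import numpy as np

      def mean_G(e_mm, g_solv):
          return np.mean(np.asarray(e_mm) + np.asarray(g_solv))

      G_complex = mean_G([-5210.0, -5198.0, -5225.0], [-880.0, -872.0, -890.0])
      G_receptor = mean_G([-4300.0, -4285.0, -4310.0], [-760.0, -755.0, -770.0])
      G_ligand = mean_G([-850.0, -845.0, -855.0], [-95.0, -93.0, -97.0])

      print("dG_bind ~ %.1f kcal/mol" % (G_complex - G_receptor - G_ligand))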

  3. MINIVER upgrade for the AVID system. Volume 1: LANMIN user's manual

    NASA Technical Reports Server (NTRS)

    Engel, C. D.; Praharaj, S. C.

    1983-01-01

    The successful design of thermal protection systems for vehicles operating in atmosphere and near space environments requires accurate analyses of heating rate and temperature histories encountered along a trajectory. For preliminary design calculations, however, the requirement for accuracy must be tempered by the need for speed and versatility in computational tools used to determine thermal environments and structural thermal response. The MINIVER program has been found to provide the proper balance between versatility, speed and accuracy for an aerothermal prediction tool. The advancement in computer aided design concepts at Langley Research Center (LaRC) in the past few years has made it desirable to incorporate the MINIVER program into the LaRC Advanced Vehicle Integrated Design, AVID, system. In order to effectively incorporate MINIVER into the AVID system, several changes to MINIVER were made. The thermal conduction options in MINIVER were removed and a new Explicit Interactive Thermal Structures (EXITS) code was developed. Many upgrades to the MINIVER code were made and a new Langley version of MINIVER called LANMIN was created. The theoretical methods and subroutine functions used in LANMIN are described.

  4. A new digitized reverse correction method for hypoid gears based on a one-dimensional probe

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo

    2017-12-01

    In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.
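
    The reverse-correction loop can be pictured as a least-squares problem: a sensitivity (Jacobian) matrix relates small changes in machine-tool settings to changes in the grid of tooth-surface deviations, and each trial cut supplies new deviations to be driven toward zero. The Python sketch below uses a made-up linear "machine" purely to illustrate the iteration; it is not the paper's model.

      # Reverse-correction loop sketch: iteratively solve a least-squares update
      # of machine-tool settings from measured tooth-surface deviations, using a
      # sensitivity matrix J.  J and the toy "machine" below are made up.
      import numpy as np

      rng = np.random.default_rng(0)
      J = rng.normal(size=(40, 5))                  # 40 grid deviations, 5 settings
      true_offset = np.array([0.02, -0.01, 0.03, 0.005, -0.015])

      def measure_deviations(settings):
          """Stand-in for cutting and probing a trial gear."""
          return J @ (settings - true_offset)

      settings = np.zeros(5)
      for trial in range(3):                        # "trial cuts"
          dev = measure_deviations(settings)
          print("trial %d: RMS deviation %.2e" % (trial, np.sqrt(np.mean(dev**2))))
          correction, *_ = np.linalg.lstsq(J, dev, rcond=None)
          settings -= correction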

  5. Sma3s: A universal tool for easy functional annotation of proteomes and transcriptomes.

    PubMed

    Casimiro-Soriguer, Carlos S; Muñoz-Mérida, Antonio; Pérez-Pulido, Antonio J

    2017-06-01

    The current cheapening of next-generation sequencing has led to an enormous growth in the number of sequenced genomes and transcriptomes, allowing wet labs to get the sequences from their organisms of study. To make the most of these data, one of the first things that should be done is the functional annotation of the protein-coding genes. But this used to be a slow and tedious step that can involve the characterization of thousands of sequences. Sma3s is an accurate computational tool for annotating proteins in an unattended way. Now, we have developed a completely new version, which includes functionalities that will be of utility for fundamental and applied science. Currently, the results provide functional categories such as biological processes, which become useful both for characterizing particular sequence datasets and for comparing results from different projects. But one of the most important implemented innovations is that it now has low computational requirements, and the complete annotation of a simple proteome or transcriptome usually takes around 24 hours on a personal computer. Sma3s has been tested with a large number of complete proteomes and transcriptomes, and it has demonstrated its potential in health science and other specific projects. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. SiteDB: Marshalling people and resources available to CMS

    NASA Astrophysics Data System (ADS)

    Metson, S.; Bonacorsi, D.; Dias Ferreira, M.; Egeland, R.

    2010-04-01

    In a collaboration the size of CMS (approx. 3000 users, and almost 100 computing centres of varying size), communication and accurate information about the sites it has access to are vital in co-ordinating the multitude of computing tasks required for smooth running. SiteDB is a tool developed by CMS to track sites available to the collaboration, the allocation to CMS of resources available at those sites and the associations between CMS members and the sites (as either a manager/operator of the site or a member of a group associated to the site). It is used to track the roles a person has for an associated site or group. SiteDB eases the coordination load for the operations teams by providing a consistent interface to manage communication with the people working at a site, by identifying who is responsible for a given task or service at a site and by offering a uniform interface to information on CMS contacts and sites. SiteDB provides APIs and reports for other CMS tools to use to access the information it contains, for instance enabling CRAB to use "user friendly" names when black/white listing CEs, providing role based authentication and authorisation for other web based services and populating various troubleshooting squads in external ticketing systems in use daily by CMS Computing operations.

  7. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users

    PubMed Central

    Veksler, Vladislav D.; Buchler, Norbou; Hoffman, Blaine E.; Cassenti, Daniel N.; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group-level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting. PMID:29867661

  8. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    PubMed Central

    2016-01-01

    Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G‐LoSA. G‐LoSA aligns protein local structures in a sequence order independent way and provides a GA‐score, a chemical feature‐based and size‐independent structure similarity score. Our benchmark validation shows the robust performance of G‐LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure‐centric comparative biology studies. In particular, G‐LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G‐LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer‐aided drug design. We hope that G‐LoSA can be a useful computational method for exploring interesting biological problems through large‐scale comparison of protein local structures and facilitating drug discovery research and development. G‐LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.

    The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.

  10. Modeling the direct sun component in buildings using matrix algebraic approaches: Methods and validation

    DOE PAGES

    Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory

    2017-12-23

    Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics such as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.
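
    For orientation, the three-phase operation itself is a chain of matrix products: a sensor result is V · T · D · s, with a view matrix V, the fenestration BSDF T, a daylight matrix D, and a sky vector s. The Python sketch below only shows the shapes and the product; the matrices are random stand-ins, not real BSDF or sky data.

      # Three-phase matrix operation: result = V @ T @ D @ s.  V maps window
      # outgoing directions to sensors, T is the fenestration BSDF, D maps sky
      # patches to window incident directions, s is a sky vector.  All values
      # here are random stand-ins; only the structure is illustrated.
      import numpy as np

      n_sensors, n_out, n_in, n_sky = 4, 145, 145, 146
      rng = np.random.default_rng(1)
      V = rng.random((n_sensors, n_out)) * 1e-3     # view matrix
      T = rng.random((n_out, n_in)) * 1e-2          # low-resolution BSDF
      D = rng.random((n_in, n_sky)) * 1e-2          # daylight matrix
      s = rng.random(n_sky) * 1e4                   # sky vector for one timestep

      illuminance = V @ T @ D @ s                   # one value per interior sensor
      print(illuminance)
      # For an annual run, s becomes an (n_sky, n_timesteps) matrix and the same
      # product yields every sensor value at every timestep at once.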

  11. Modeling the direct sun component in buildings using matrix algebraic approaches: Methods and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory

    Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics such as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.

  12. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has come into wide use. In particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology within computer software systems. Computer graphics theory and its applications have therefore become an important topic in computer science, and graphics technology is being applied in an ever wider range of fields. In recent years, with the growth of the economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer meet management needs effectively. Communication resource management still relies on the original tools and methods for managing and maintaining equipment, which has caused many problems: it is difficult for non-specialists to understand the equipment and its status, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. Doing so not only makes communication resource management more intuitive, but also reduces management costs and improves work efficiency.

  13. Real-time structured light intraoral 3D measurement pipeline

    NASA Astrophysics Data System (ADS)

    Gheorghe, Radu; Tchouprakov, Andrei; Sokolov, Roman

    2013-02-01

    Computer-aided design and manufacturing (CAD/CAM) is increasingly becoming a standard feature and service provided to patients in dental offices and denture manufacturing laboratories. Although the quality of the tools and data has slowly improved in recent years, owing to various surface measurement challenges, practical, accurate, in-vivo, real-time acquisition and processing of high-quality 3D data still needs improvement. Advances in GPU computational power have made near real-time 3D intraoral in-vivo scanning of patients' teeth achievable. In this paper we explore, from a real-time perspective, a hardware-software-GPU solution that addresses all of the requirements mentioned before. Moreover, we exemplify and quantify the hard and soft deadlines required by such a system and illustrate how they are supported in our implementation.

  14. Mining Software Usage with the Automatic Library Tracking Database (ALTD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadri, Bilel; Fahey, Mark R

    2013-01-01

    Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third party software applications on a system managed by the National Institute for Computational Sciences.
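
    The link-time interception idea described above can be sketched in a few lines. The Python shim below is purely illustrative: it records the libraries appearing on a link command line to a log file and then forwards the command to the real linker. The environment variables, log format, and parsing rules are assumptions made for this sketch and are not ALTD's actual implementation, which stores its records in a database on the Cray systems mentioned in the abstract.

        #!/usr/bin/env python3
        """Toy linker shim: log libraries named on the link line, then call the real linker."""
        import os
        import subprocess
        import sys
        import time

        REAL_LD = os.environ.get("REAL_LD", "/usr/bin/ld")            # assumed real linker path
        LOG_FILE = os.environ.get("LINK_LOG", "/tmp/link_usage.log")  # hypothetical log target

        def extract_libraries(args):
            """Collect -l flags and explicit .a/.so paths from a link command line."""
            libs = []
            for arg in args:
                if arg.startswith("-l"):
                    libs.append(arg)
                elif arg.endswith((".a", ".so")):
                    libs.append(os.path.basename(arg))
            return libs

        def main():
            args = sys.argv[1:]
            record = {
                "time": time.strftime("%Y-%m-%dT%H:%M:%S"),
                "user": os.environ.get("USER", "unknown"),
                "libraries": extract_libraries(args),
            }
            with open(LOG_FILE, "a") as log:
                log.write(repr(record) + "\n")
            # Forward to the real linker so the build behaves exactly as before.
            sys.exit(subprocess.call([REAL_LD] + args))

        if __name__ == "__main__":
            main()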

  15. Recent advances in imaging technologies in dentistry.

    PubMed

    Shah, Naseem; Bansal, Nikhil; Logani, Ajay

    2014-10-28

    Dentistry has witnessed tremendous advances in all its branches over the past three decades. With these advances, the need for more precise diagnostic tools, especially imaging methods, has become mandatory. Beyond simple intra-oral periapical X-rays, advanced imaging techniques such as computed tomography, cone beam computed tomography, magnetic resonance imaging and ultrasound have also found a place in modern dentistry. Changing from analogue to digital radiography has not only made the process simpler and faster but has also made image storage, manipulation (brightness/contrast, image cropping, etc.) and retrieval easier. Three-dimensional imaging has made complex cranio-facial structures more accessible for examination and allows early and accurate diagnosis of deep-seated lesions. This paper reviews current advances in imaging technology and their uses in the different disciplines of dentistry.

  16. Recent advances in imaging technologies in dentistry

    PubMed Central

    Shah, Naseem; Bansal, Nikhil; Logani, Ajay

    2014-01-01

    Dentistry has witnessed tremendous advances in all its branches over the past three decades. With these advances, the need for more precise diagnostic tools, especially imaging methods, has become mandatory. Beyond simple intra-oral periapical X-rays, advanced imaging techniques such as computed tomography, cone beam computed tomography, magnetic resonance imaging and ultrasound have also found a place in modern dentistry. Changing from analogue to digital radiography has not only made the process simpler and faster but has also made image storage, manipulation (brightness/contrast, image cropping, etc.) and retrieval easier. Three-dimensional imaging has made complex cranio-facial structures more accessible for examination and allows early and accurate diagnosis of deep-seated lesions. This paper reviews current advances in imaging technology and their uses in the different disciplines of dentistry. PMID:25349663

  17. Calibration of a γ- Re θ transition model and its application in low-speed flows

    NASA Astrophysics Data System (ADS)

    Wang, YunTao; Zhang, YuLun; Meng, DeHong; Wang, GunXue; Li, Song

    2014-12-01

    The prediction of laminar-turbulent transition in the boundary layer is very important for obtaining accurate aerodynamic characteristics with computational fluid dynamics (CFD) tools, because laminar-turbulent transition is directly related to complex flow phenomena in the boundary layer and to separated flow. Unfortunately, the transition effect is not included in most of today's major CFD tools because transition modeling involves non-local calculations. In this paper, Menter's γ-Re θ transition model is calibrated and incorporated into a Reynolds-Averaged Navier-Stokes (RANS) code, the Trisonic Platform (TRIP) developed at the China Aerodynamic Research and Development Center (CARDC). Based on flat-plate experimental data from the literature, the empirical correlations involved in the transition model are modified and calibrated numerically. Numerical simulation of the low-speed flow over the Trapezoidal Wing (Trap Wing) is performed and compared with the corresponding experimental data. The results indicate that the γ-Re θ transition model can accurately predict the location of separation-induced transition and natural transition in flow regions with moderate pressure gradients. The transition model effectively improves the simulation accuracy of the boundary layer and aerodynamic characteristics.
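
    For context, the model transports two quantities, an intermittency γ and a transition-onset momentum-thickness Reynolds number, through equations of the general form below (the notation is generic; the calibrated empirical correlations mentioned in the abstract enter through the production and destruction source terms):

        \frac{\partial(\rho\gamma)}{\partial t} + \frac{\partial(\rho U_j \gamma)}{\partial x_j}
            = P_\gamma - E_\gamma + \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_f}\right)\frac{\partial\gamma}{\partial x_j}\right]

        \frac{\partial(\rho\widetilde{Re}_{\theta t})}{\partial t} + \frac{\partial(\rho U_j \widetilde{Re}_{\theta t})}{\partial x_j}
            = P_{\theta t} + \frac{\partial}{\partial x_j}\left[\sigma_{\theta t}\,(\mu + \mu_t)\,\frac{\partial\widetilde{Re}_{\theta t}}{\partial x_j}\right]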

  18. Augmented Endoscopic Images Overlaying Shape Changes in Bone Cutting Procedures.

    PubMed

    Nakao, Megumi; Endo, Shota; Nakao, Shinichi; Yoshida, Munehito; Matsuda, Tetsuya

    2016-01-01

    In microendoscopic discectomy for spinal disorders, bone cutting procedures are performed in tight spaces while observing a small portion of the target structures. Although optical tracking systems are able to measure the tip of the surgical tool during surgery, the poor shape information available during surgery makes accurate cutting difficult, even if preoperative computed tomography and magnetic resonance images are used for reference. Shape estimation and visualization of the target structures are essential for accurate cutting. However, time-varying shape changes during cutting procedures are still challenging issues for intraoperative navigation. This paper introduces a concept of endoscopic image augmentation that overlays shape changes to support bone cutting procedures. This framework handles the history of the location of the measured drill tip as a volume label and visualizes the remains to be cut overlaid on the endoscopic image in real time. A cutting experiment was performed with volunteers, and the feasibility of this concept was examined using a clinical navigation system. The efficacy of the cutting aid was evaluated with respect to the shape similarity, total moved distance of a cutting tool, and required cutting time. The results of the experiments showed that cutting performance was significantly improved by the proposed framework.
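
    The volume-label bookkeeping described above can be illustrated with a minimal sketch: tracked tool-tip positions are rasterized into a voxel grid, and the set of planned-cut voxels not yet visited is what would be overlaid on the endoscopic image. The grid size, voxel spacing, tool-tip radius, and positions below are arbitrary illustrative values, not those of the clinical system.

        import numpy as np

        # Illustrative voxel grid: 1 mm isotropic spacing over a 100 mm cube.
        shape = (100, 100, 100)
        voxel_mm = 1.0
        planned_cut = np.zeros(shape, dtype=bool)
        planned_cut[40:60, 40:60, 45:55] = True          # hypothetical planned resection region

        visited = np.zeros(shape, dtype=bool)
        tip_radius_mm = 1.5                               # assumed effective drill radius

        def mark_tip(position_mm, visited, voxel_mm, radius_mm):
            """Label all voxels within radius_mm of the measured tool tip as removed."""
            center = np.asarray(position_mm) / voxel_mm
            r = int(np.ceil(radius_mm / voxel_mm))
            lo = np.maximum(np.floor(center).astype(int) - r, 0)
            hi = np.minimum(np.floor(center).astype(int) + r + 1, visited.shape)
            ii, jj, kk = np.mgrid[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
            dist = np.sqrt((ii - center[0])**2 + (jj - center[1])**2 + (kk - center[2])**2)
            visited[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]] |= dist * voxel_mm <= radius_mm

        # Simulated stream of tracked tip positions (in mm).
        for pos in [(50, 50, 50), (51, 50, 50), (52, 51, 50)]:
            mark_tip(pos, visited, voxel_mm, tip_radius_mm)

        remaining = planned_cut & ~visited                # voxels still to cut: overlay on the image
        print("voxels remaining to cut:", int(remaining.sum()))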

  19. Using quantum chemistry muscle to flex massive systems: How to respond to something perturbing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertoni, Colleen

    Computational chemistry uses the theoretical advances of quantum mechanics and the algorithmic and hardware advances of computer science to give insight into chemical problems. It is currently possible to do highly accurate quantum chemistry calculations, but the most accurate methods are very computationally expensive. Thus it is only feasible to do highly accurate calculations on small molecules, since typically more computationally efficient methods are also less accurate. The overall goal of my dissertation work has been to try to decrease the computational expense of calculations without decreasing the accuracy. In particular, my dissertation work focuses on fragmentation methods, intermolecular interactions methods, analytic gradients, and taking advantage of new hardware.

  20. A numerical tool for the calculation of non-equilibrium ionisation states in the solar corona and other astrophysical plasma environments

    NASA Astrophysics Data System (ADS)

    Bradshaw, S. J.

    2009-07-01

    Context: The effects of non-equilibrium processes on the ionisation state of strongly emitting elements in the solar corona can be extremely difficult to assess and yet they are critically important. For example, there is much interest in dynamic heating events localised in the solar corona because they are believed to be responsible for its high temperature and yet recent work has shown that the hottest (≥10⁷ K) emission predicted to be associated with these events can be observationally elusive due to the difficulty of creating the highly ionised states from which the expected emission arises. This leads to the possibility of observing instruments missing such heating events entirely. Aims: The equations describing the evolution of the ionisation state are a very stiff system of coupled, partial differential equations whose solution can be numerically challenging and time-consuming. Without access to specialised codes and significant computational resources it is extremely difficult to avoid the assumption of an equilibrium ionisation state even when it clearly cannot be justified. The aim of the current work is to develop a computational tool to allow straightforward calculation of the time-dependent ionisation state for a wide variety of physical circumstances. Methods: A numerical model comprising the system of time-dependent ionisation equations for a particular element and tabulated values of plasma temperature as a function of time is developed. The tabulated values can be the solutions of an analytical model, the output from a numerical code or a set of observational measurements. An efficient numerical method to solve the ionisation equations is implemented. Results: A suite of tests is designed and run to demonstrate that the code provides reliable and accurate solutions for a number of scenarios including equilibration of the ion population and rapid heating followed by thermal conductive cooling. It is found that the solver can evolve the ionisation state to recover exactly the equilibrium state found by an independent, steady-state solver for all temperatures, resolve the extremely small ionisation/recombination timescales associated with rapid temperature changes at high densities, and provide stable and accurate solutions for both dominant and minor ion population fractions. Rapid heating and cooling of low to moderate density plasma is characterised by significant non-equilibrium ionisation conditions. The effective ionisation temperatures are significantly lower than the electron temperature and the values found are in close agreement with the previous work of others. At the very highest densities included in the present study an assumption of equilibrium ionisation is found to be robust. Conclusions: The computational tool presented here provides a straightforward and reliable way to calculate ionisation states for a wide variety of physical circumstances. The numerical code gives results that are accurate and consistent with previous studies, has relatively undemanding computational requirements and is freely available from the author.
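
    In generic notation, the stiff system referred to above couples the population fractions n_i of adjacent ionisation stages of an element through electron-impact ionisation rate coefficients C_i(T_e) and recombination rate coefficients R_i(T_e):

        \frac{\mathrm{d}n_i}{\mathrm{d}t} = n_e\left[\,C_{i-1}\,n_{i-1} - (C_i + R_i)\,n_i + R_{i+1}\,n_{i+1}\,\right]

    The stiffness arises because the rate coefficients, and hence the ionisation and recombination timescales, vary by many orders of magnitude across the stages and with temperature, which is why an implicit or otherwise specialised integration scheme is needed.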

  1. Validation of a general practice audit and data extraction tool.

    PubMed

    Peiris, David; Agaliotis, Maria; Patel, Bindu; Patel, Anushka

    2013-11-01

    We assessed how accurately a common general practitioner (GP) audit tool extracts data from two software systems. First, pathology test codes were audited at 33 practices covering nine companies. Second, a manual audit of chronic disease data from 200 random patient records at two practices was compared with audit tool data. Pathology review: all companies assigned correct codes for cholesterol, creatinine and glycated haemoglobin; four companies assigned incorrect codes for albuminuria tests, precluding accurate detection with the audit tool. Case record review: there was strong agreement between the manual audit and the tool for all variables except chronic kidney disease diagnoses, which was due to a tool-related programming error. The audit tool accurately detected most chronic disease data in two GP record systems. The one exception, however, highlights the importance of surveillance systems to promptly identify errors. This will maximise potential for audit tools to improve healthcare quality.

  2. New insights into galaxy structure from GALPHAT- I. Motivation, methodology and benchmarks for Sérsic models

    NASA Astrophysics Data System (ADS)

    Yoon, Ilsang; Weinberg, Martin D.; Katz, Neal

    2011-06-01

    We introduce a new galaxy image decomposition tool, GALPHAT (GALaxy PHotometric ATtributes), which is a front-end application of the Bayesian Inference Engine (BIE), a parallel Markov chain Monte Carlo package, to provide full posterior probability distributions and reliable confidence intervals for all model parameters. The BIE relies on GALPHAT to compute the likelihood function. GALPHAT generates scale-free cumulative image tables for the desired model family with precise error control. Interpolation of this table yields accurate pixellated images with any centre, scale and inclination angle. GALPHAT then rotates the image by position angle using a Fourier shift theorem, yielding high-speed, accurate likelihood computation. We benchmark this approach using an ensemble of simulated Sérsic model galaxies over a wide range of observational conditions: the signal-to-noise ratio S/N, the ratio of galaxy size to the point spread function (PSF) and the image size, and errors in the assumed PSF; and a range of structural parameters: the half-light radius re and the Sérsic index n. We characterize the strength of parameter covariance in the Sérsic model, which increases with S/N and n, and the results strongly motivate the need for the full posterior probability distribution in galaxy morphology analyses and later inferences. The test results for simulated galaxies successfully demonstrate that, with a careful choice of Markov chain Monte Carlo algorithms and fast model image generation, GALPHAT is a powerful analysis tool for reliably inferring morphological parameters from a large ensemble of galaxies over a wide range of different observational conditions.
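
    For reference, the Sérsic surface-brightness profile whose parameters (half-light radius r_e and index n) are inferred by GALPHAT has the standard form

        I(r) = I_e \exp\left\{ -b_n \left[ \left( \frac{r}{r_e} \right)^{1/n} - 1 \right] \right\},

    where I_e is the intensity at r_e and b_n is defined so that r_e encloses half of the total light (b_n ≈ 2n − 1/3 for moderate to large n).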

  3. Tools for Early Prediction of Drug Loading in Lipid-Based Formulations

    PubMed Central

    2015-01-01

    Identification of the usefulness of lipid-based formulations (LBFs) for delivery of poorly water-soluble drugs is to date mainly experimentally based. In this work we used a diverse drug data set, and more than 2,000 solubility measurements to develop experimental and computational tools to predict the loading capacity of LBFs. Computational models were developed to enable in silico prediction of solubility, and hence drug loading capacity, in the LBFs. Drug solubility in mixed mono-, di-, triglycerides (Maisine 35-1 and Capmul MCM EP) correlated (R² 0.89) as well as the drug solubility in Carbitol and other ethoxylated excipients (PEG400, R² 0.85; Polysorbate 80, R² 0.90; Cremophor EL, R² 0.93). A melting point below 150 °C was observed to result in a reasonable solubility in the glycerides. The loading capacity in LBFs was accurately calculated from solubility data in single excipients (R² 0.91). In silico models, without the demand of experimentally determined solubility, also gave good predictions of the loading capacity in these complex formulations (R² 0.79). The framework established here gives a better understanding of drug solubility in single excipients and of LBF loading capacity. The large data set studied revealed that experimental screening efforts can be rationalized by solubility measurements in key excipients or from solid state information. For the first time it was shown that loading capacity in complex formulations can be accurately predicted using molecular information extracted from calculated descriptors and thermal properties of the crystalline drug. PMID:26568134

  4. Tools for Early Prediction of Drug Loading in Lipid-Based Formulations.

    PubMed

    Alskär, Linda C; Porter, Christopher J H; Bergström, Christel A S

    2016-01-04

    Identification of the usefulness of lipid-based formulations (LBFs) for delivery of poorly water-soluble drugs is to date mainly experimentally based. In this work we used a diverse drug data set, and more than 2,000 solubility measurements to develop experimental and computational tools to predict the loading capacity of LBFs. Computational models were developed to enable in silico prediction of solubility, and hence drug loading capacity, in the LBFs. Drug solubility in mixed mono-, di-, triglycerides (Maisine 35-1 and Capmul MCM EP) correlated (R² 0.89) as well as the drug solubility in Carbitol and other ethoxylated excipients (PEG400, R² 0.85; Polysorbate 80, R² 0.90; Cremophor EL, R² 0.93). A melting point below 150 °C was observed to result in a reasonable solubility in the glycerides. The loading capacity in LBFs was accurately calculated from solubility data in single excipients (R² 0.91). In silico models, without the demand of experimentally determined solubility, also gave good predictions of the loading capacity in these complex formulations (R² 0.79). The framework established here gives a better understanding of drug solubility in single excipients and of LBF loading capacity. The large data set studied revealed that experimental screening efforts can be rationalized by solubility measurements in key excipients or from solid state information. For the first time it was shown that loading capacity in complex formulations can be accurately predicted using molecular information extracted from calculated descriptors and thermal properties of the crystalline drug.

  5. Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT

    PubMed Central

    Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896

  6. Relativistic force field: parametric computations of proton-proton coupling constants in ¹H NMR spectra.

    PubMed

    Kutateladze, Andrei G; Mukhina, Olga A

    2014-09-05

    Spin-spin coupling constants in ¹H NMR carry a wealth of structural information and offer a powerful tool for deciphering molecular structures. However, accurate ab initio or DFT calculations of spin-spin coupling constants have been very challenging and expensive. Scaling of (easy) Fermi contacts, fc, especially in the context of recent findings by Bally and Rablen (Bally, T.; Rablen, P. R. J. Org. Chem. 2011, 76, 4818), offers a framework for achieving practical evaluation of spin-spin coupling constants. We report a faster and more precise parametrization approach utilizing a new basis set for hydrogen atoms optimized in conjunction with (i) inexpensive B3LYP/6-31G(d) molecular geometries, (ii) inexpensive 4-31G basis set for carbon atoms in fc calculations, and (iii) individual parametrization for different atom types/hybridizations, not unlike a force field in molecular mechanics, but designed for the fc's. With the training set of 608 experimental constants we achieved rmsd <0.19 Hz. The methodology performs very well as we illustrate with a set of complex organic natural products, including strychnine (rmsd 0.19 Hz), morphine (rmsd 0.24 Hz), etc. This precision is achieved with much shorter computational times: accurate spin-spin coupling constants for the two conformers of strychnine were computed in parallel on two 16-core nodes of a Linux cluster within 10 min.
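
    Schematically, the parametrization amounts to an empirical linear map from the cheaply computed Fermi-contact term to the observable coupling, with coefficients that depend on the atom types and hybridizations of the coupled protons. The notation below is illustrative only; the actual functional form and fitted parameter values are those of the cited work and are not reproduced here:

        J_{HH'} \approx a_{t}\,\mathrm{fc}_{HH'} + b_{t},

    where t indexes the atom-type/hybridization class of the coupled pair.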

  7. Computational Prediction of Neutralization Epitopes Targeted by Human Anti-V3 HIV Monoclonal Antibodies

    PubMed Central

    Shmelkov, Evgeny; Krachmarov, Chavdar; Grigoryan, Arsen V.; Pinter, Abraham; Statnikov, Alexander; Cardozo, Timothy

    2014-01-01

    The extreme diversity of HIV-1 strains presents a formidable challenge for HIV-1 vaccine design. Although antibodies (Abs) can neutralize HIV-1 and potentially protect against infection, antibodies that target the immunogenic viral surface protein gp120 have widely variable and poorly predictable cross-strain reactivity. Here, we developed a novel computational approach, the Method of Dynamic Epitopes, for identification of neutralization epitopes targeted by anti-HIV-1 monoclonal antibodies (mAbs). Our data demonstrate that this approach, based purely on calculated energetics and 3D structural information, accurately predicts the presence of neutralization epitopes targeted by V3-specific mAbs 2219 and 447-52D in any HIV-1 strain. The method was used to calculate the range of conservation of these specific epitopes across all circulating HIV-1 viruses. Accurately identifying an Ab-targeted neutralization epitope in a virus by computational means enables easy prediction of the breadth of reactivity of specific mAbs across the diversity of thousands of different circulating HIV-1 variants and facilitates rational design and selection of immunogens mimicking specific mAb-targeted epitopes in a multivalent HIV-1 vaccine. The defined epitopes can also be used for the purpose of epitope-specific analyses of breakthrough sequences recorded in vaccine clinical trials. Thus, our study is a prototype for a valuable tool for rational HIV-1 vaccine design. PMID:24587168

  8. Oncentra brachytherapy planning system.

    PubMed

    Yang, Jack

    2018-03-27

    In modern cancer management, treatment planning has progressed as a contemporary tool with all the advances in computing power in recent years. One of the advanced planning tools uses 3-dimensional (3D) data sets for accurate dose distributions in patient prescription. Among these planning processes, brachytherapy has been a very important part of a successful cancer management program, offering clinical benefits with specific or combined treatments with external beam therapy. In this chapter, we mainly discussed the Elekta Oncentra planning system, which is the main treatment planning tool for high-dose rate (HDR) modality in our facility and in many other facilities in the United States. HDR is a technically advanced form of brachytherapy; a high-intensity radiation source (3.6 mm in length) is delivered with step motor in submillimeter precision under computer guidance directly into the tumor areas while minimizing injury to surrounding normal healthy tissue. Oncentra planning is the key component to generate a deliverable brachytherapy procedure, which is executed on the microSelectron V3 remote afterloader treatment system. Creating a highly conformal plan can be a time-consuming task. The development of Oncentra software (version 4.5.3) offers a variety of useful tools that facilitate many of the clinical challenging tasks for planning, such as contouring and image reconstruction, as well as rapid planning calculations with dose and dose volume histogram analysis. Oncentra Brachy module creates workflow and optimizes the planning accuracy for wide varieties of clinical HDR treatments, such as skin, gynecologic (GYN), breast, prostate, and many other applications. The treatment file can also be transferred to the afterloader control station for speedy delivery. The design concept, calculation algorithms, and optimization modules presented some key characteristics to plan and treat the patients effectively and accurately. The dose distribution and accuracy of several clinical sample cases were discussed to illustrate the effectiveness and clinical efficacy. The American Association of Physicists in Medicine brachytherapy reports of TG-43 and TG-186 were also described and compared in evaluations of fundamental calculation methodologies. Copyright © 2018 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
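
    For readers unfamiliar with the TG-43 formalism mentioned above, the dose rate around a brachytherapy line source is computed as

        \dot{D}(r,\theta) = S_K\,\Lambda\,\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\,g_L(r)\,F(r,\theta),

    where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the geometry function, g_L the radial dose function, F the 2D anisotropy function, and (r_0, θ_0) = (1 cm, 90°) the reference point. TG-186 extends this water-based formalism toward model-based dose calculation that accounts for tissue and applicator heterogeneities.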

  9. An accurate model for the computation of the dose of protons in water.

    PubMed

    Embriaco, A; Bellinzona, V E; Fontana, A; Rotondi, A

    2017-06-01

    The accurate and fast calculation of the dose in proton radiation therapy is an essential ingredient for successful treatments. We propose a novel approach with a minimal number of parameters. The approach is based on the exact calculation of the electromagnetic part of the interaction, namely the Molière theory of the multiple Coulomb scattering for the transversal 1D projection and the Bethe-Bloch formula for the longitudinal stopping power profile, including a gaussian energy straggling. To this e.m. contribution the nuclear proton-nucleus interaction is added with a simple two-parameter model. Then, the non gaussian lateral profile is used to calculate the radial dose distribution with a method that assumes the cylindrical symmetry of the distribution. The results, obtained with a fast C++ based computational code called MONET (MOdel of ioN dosE for Therapy), are in very good agreement with the FLUKA MC code, within a few percent in the worst case. This study provides a new tool for fast dose calculation or verification, possibly for clinical use. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
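
    The longitudinal ingredient referred to above is the mean electronic stopping power, which in its standard Bethe-Bloch form (shown here without density-effect and shell corrections, and without the straggling and nuclear terms added in the model) reads

        -\left\langle \frac{\mathrm{d}E}{\mathrm{d}x} \right\rangle
            = \frac{4\pi N_A r_e^2 m_e c^2 z^2}{\beta^2}\,\frac{Z}{A}
              \left[ \ln\frac{2 m_e c^2 \beta^2 \gamma^2}{I} - \beta^2 \right],

    with z the projectile charge, β and γ the usual relativistic factors, Z/A the charge-to-mass ratio of the medium, and I its mean excitation energy.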

  10. Accurate Energies and Orbital Description in Semi-Local Kohn-Sham DFT

    NASA Astrophysics Data System (ADS)

    Lindmaa, Alexander; Kuemmel, Stephan; Armiento, Rickard

    2015-03-01

    We present our progress on a scheme in semi-local Kohn-Sham density-functional theory (KS-DFT) for improving the orbital description while still retaining the level of accuracy of the usual semi-local exchange-correlation (xc) functionals. DFT is a widely used tool for first-principles calculations of properties of materials. A given task normally requires a balance of accuracy and computational cost, which is well achieved with semi-local DFT. However, commonly used semi-local xc functionals have important shortcomings which often can be attributed to features of the corresponding xc potential. One shortcoming is an overly delocalized representation of localized orbitals. Recently a semi-local GGA-type xc functional was constructed to address these issues, however, it has the trade-off of lower accuracy of the total energy. We discuss the source of this error in terms of a surplus energy contribution in the functional that needs to be accounted for, and offer a remedy for this issue which formally stays within KS-DFT, and, which does not harshly increase the computational effort. The end result is a scheme that combines accurate total energies (e.g., relaxed geometries) with an improved orbital description (e.g., improved band structure).

  11. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144

  12. Off-design Performance Analysis of Multi-Stage Transonic Axial Compressors

    NASA Astrophysics Data System (ADS)

    Du, W. H.; Wu, H.; Zhang, L.

    Because of the complex flow fields and component interactions in modern gas turbine engines, extensive experiments are required to validate performance and stability, and the experimental process can become expensive and complex. Modeling and simulation of gas turbine engines are a way to reduce experimental costs, provide fidelity, and enhance the quality of the essential experiments. The flow field of a transonic compressor contains all of the flow features that are difficult to capture: boundary layer transition and separation, shock-boundary layer interactions, and large flow unsteadiness. Accurate off-design performance prediction for transonic axial compressors is especially difficult, due in large part to three-dimensional blade design and the resulting flow field. Although recent advances in computing capacity have brought computational fluid dynamics to the forefront of turbomachinery design and analysis, the grid and turbulence model still limit Reynolds-averaged Navier-Stokes (RANS) approximations in the multi-stage transonic axial compressor flow field. Streamline curvature methods therefore remain the dominant numerical approach and an important analysis and design tool for turbomachinery, and it is generally accepted that streamline curvature solution techniques provide satisfactory flow prediction as long as the losses, deviation and blockage are accurately predicted.

  13. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography

    PubMed Central

    Jørgensen, J. S.; Sidky, E. Y.

    2015-01-01

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620

  14. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.

    PubMed

    Jørgensen, J S; Sidky, E Y

    2015-06-13

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.

  15. Eclipse-Free-Time Assessment Tool for IRIS

    NASA Technical Reports Server (NTRS)

    Eagle, David

    2012-01-01

    IRIS_EFT is a scientific simulation that can be used to perform an Eclipse-Free-Time (EFT) assessment of IRIS (Infrared Imaging Surveyor) mission orbits. EFT is defined to be those time intervals longer than one day during which the IRIS spacecraft is not in the Earth's shadow. Program IRIS_EFT implements a special perturbation of orbital motion to numerically integrate Cowell's form of the system of differential equations. Shadow conditions are predicted by embedding this integrator within Brent's method for finding the root of a nonlinear equation. The IRIS_EFT software models the effects of the following types of orbit perturbations on the long-term evolution and shadow characteristics of IRIS mission orbits: (1) non-spherical Earth gravity, (2) atmospheric drag, (3) point-mass gravity of the Sun, and (4) point-mass gravity of the Moon. The objective of this effort was to create an in-house computer program that would perform eclipse-free-time analysis of candidate IRIS spacecraft mission orbits in an accurate and timely fashion. The software is a suite of Fortran subroutines and data files organized as a "computational" engine that is used to accurately predict the long-term orbit evolution of IRIS mission orbits while searching for Earth shadow conditions.
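
    The actual tool is a suite of Fortran subroutines; the Python sketch below only illustrates the underlying idea of wrapping a root finder (Brent's method) around a numerically propagated orbit to locate shadow entry and exit. The dynamics are reduced to a point-mass Earth, the shadow model to a simple cylinder with a fixed Sun direction, and all numerical values are placeholders.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        MU = 398600.4418      # Earth's gravitational parameter, km^3/s^2
        R_EARTH = 6378.137    # km
        SUN_DIR = np.array([1.0, 0.0, 0.0])   # unit vector toward the Sun (held fixed here)

        def two_body(t, y):
            """Simplified Cowell-type equations of motion: point-mass Earth only."""
            r = y[:3]
            a = -MU * r / np.linalg.norm(r) ** 3
            return np.concatenate([y[3:], a])

        def shadow_function(t, sol):
            """Negative inside a cylindrical Earth shadow, positive in sunlight."""
            r = sol.sol(t)[:3]
            along = np.dot(r, SUN_DIR)
            perp = np.linalg.norm(r - along * SUN_DIR)
            if along >= 0.0:            # on the sunlit side of the Earth
                return perp
            return perp - R_EARTH       # negative when hidden behind the Earth

        # Propagate one orbit of a hypothetical 600 km circular orbit.
        r0 = np.array([R_EARTH + 600.0, 0.0, 0.0])
        v0 = np.array([0.0, np.sqrt(MU / np.linalg.norm(r0)), 0.0])
        period = 2 * np.pi * np.sqrt(np.linalg.norm(r0) ** 3 / MU)
        sol = solve_ivp(two_body, (0.0, period), np.concatenate([r0, v0]),
                        dense_output=True, rtol=1e-9, atol=1e-9)

        # Scan for sign changes of the shadow function and refine them with Brent's method.
        times = np.linspace(0.0, period, 2000)
        values = np.array([shadow_function(t, sol) for t in times])
        for i in range(len(times) - 1):
            if values[i] * values[i + 1] < 0.0:
                crossing = brentq(shadow_function, times[i], times[i + 1], args=(sol,))
                print(f"shadow boundary at t = {crossing:.1f} s")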

  16. Computed tomography in hypersensitivity pneumonitis: main findings, differential diagnosis and pitfalls.

    PubMed

    Dias, Olívia Meira; Baldi, Bruno Guedes; Pennati, Francesca; Aliverti, Andrea; Chate, Rodrigo Caruso; Sawamura, Márcio Valente Yamada; Carvalho, Carlos Roberto Ribeiro de; Albuquerque, André Luis Pereira de

    2018-01-01

    Hypersensitivity pneumonitis (HP) is a disease with variable clinical presentation in which inflammation in the lung parenchyma is caused by the inhalation of specific organic antigens or low molecular weight substances in genetically susceptible individuals. Alterations of the acute, subacute and chronic forms may eventually overlap, and the diagnosis based on temporality and presence of fibrosis (acute/inflammatory HP vs. chronic HP) seems to be more feasible and useful in clinical practice. Differential diagnosis of chronic HP with other interstitial fibrotic diseases is challenging due to the overlap of the clinical history, and the functional and imaging findings of these pathologies in the terminal stages. Areas covered: This article reviews the essential features of HP with emphasis on imaging features. Moreover, the main methodological limitations of high-resolution computed tomography (HRCT) interpretation are discussed, as well as new perspectives with volumetric quantitative CT analysis as a useful tool for retrieving detailed and accurate information from the lung parenchyma. Expert commentary: Mosaic attenuation is a prominent feature of this disease, but air trapping in chronic HP seems overestimated. Quantitative analysis has the potential to estimate the involvement of the pulmonary parenchyma more accurately and could correlate better with pulmonary function results.

  17. Acceleration of FDTD mode solver by high-performance computing techniques.

    PubMed

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigen mode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than 30 times improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigen mode solver and yet requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when the conventional eigenvalue mode solvers are no longer applicable due to memory limitations.
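
    The data-parallel structure that makes an FDTD solver a good fit for GPUs can be seen in even the simplest explicit stencil update. The sketch below is a generic 2D scalar wave-equation leapfrog update in NumPy, not the compact FDTD formulation or the MPM post-processing of the paper; every grid-point update depends only on neighbouring values from previous steps, which is exactly the pattern that maps well onto CUDA threads. Grid size, step sizes, and the excitation are placeholder values chosen only to satisfy the stability condition.

        import numpy as np

        nx, ny = 256, 256
        dx = dy = 0.05                         # spatial step (arbitrary units)
        c = 1.0                                # normalized wave speed
        dt = 0.7 * dx / (c * np.sqrt(2.0))     # CFL-limited time step

        u_prev = np.zeros((nx, ny))
        u = np.zeros((nx, ny))
        u[nx // 2, ny // 2] = 1.0              # point excitation

        coef = (c * dt) ** 2
        for step in range(200):
            lap = np.zeros_like(u)
            lap[1:-1, 1:-1] = ((u[2:, 1:-1] - 2 * u[1:-1, 1:-1] + u[:-2, 1:-1]) / dx**2 +
                               (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / dy**2)
            u_next = 2 * u - u_prev + coef * lap
            u_prev, u = u, u_next

        print("field energy proxy:", float(np.sum(u**2)))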

  18. Massively Parallel Processing for Fast and Accurate Stamping Simulations

    NASA Astrophysics Data System (ADS)

    Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu

    2005-08-01

    The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead times. Fast tooling development is one of the key areas supporting fast, short vehicle development programs (VDPs). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to supporting fast VDPs and tooling readiness. Since 1997, General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.

  19. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2014-02-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

  20. Smart tissue anastomosis robot (STAR): a vision-guided robotics system for laparoscopic suturing.

    PubMed

    Leonard, Simon; Wu, Kyle L; Kim, Yonjae; Krieger, Axel; Kim, Peter C W

    2014-04-01

    This paper introduces the smart tissue anastomosis robot (STAR). Currently, the STAR is a proof-of-concept for a vision-guided robotic system featuring an actuated laparoscopic suturing tool capable of executing running sutures from image-based commands. The STAR tool is designed around a commercially available laparoscopic suturing tool that is attached to a custom-made motor stage and the STAR supervisory control architecture that enables a surgeon to select and track incisions and the placement of stitches. The STAR supervisory-control interface provides two modes: a manual mode that enables a surgeon to specify the placement of each stitch and an automatic mode that automatically computes equally-spaced stitches based on an incision contour. Our experiments on planar phantoms demonstrate that the STAR in either mode is more accurate, up to four times more consistent and five times faster than surgeons using a state-of-the-art robotic surgical system, four times faster than surgeons using a manual Endo360°®, and nine times faster than surgeons using manual laparoscopic tools.
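
    The automatic mode described above reduces, at its core, to placing stitch targets at equal arc-length intervals along the tracked incision contour. The sketch below shows that geometric step in Python with a synthetic contour; the names and values are illustrative and this is not the STAR control code.

        import numpy as np

        def equally_spaced_stitches(contour, n_stitches):
            """Place n_stitches points at equal arc-length intervals along a 2D contour.

            contour: (N, 2) array of ordered points along the incision.
            Returns an (n_stitches, 2) array of stitch positions.
            """
            contour = np.asarray(contour, dtype=float)
            seg = np.linalg.norm(np.diff(contour, axis=0), axis=1)
            arclen = np.concatenate([[0.0], np.cumsum(seg)])
            targets = np.linspace(0.0, arclen[-1], n_stitches)
            x = np.interp(targets, arclen, contour[:, 0])
            y = np.interp(targets, arclen, contour[:, 1])
            return np.column_stack([x, y])

        # Hypothetical incision contour (e.g. extracted from image-based tracking), in mm.
        t = np.linspace(0, np.pi, 50)
        incision = np.column_stack([30 * t / np.pi, 5 * np.sin(t)])
        print(equally_spaced_stitches(incision, 8))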

  1. Accurate derivation of heart rate variability signal for detection of sleep disordered breathing in children.

    PubMed

    Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S

    2004-01-01

    The electrocardiogram (ECG) signal is used extensively as a low cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular, reliable detection of the R wave peak, is essential in computer based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was achieved and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincare plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on ECG data from the Physionet, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing sleep disordered breathing in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in extraction of reliable measures to distinguish between periods of normal and sleep disordered breathing (SDB) in children.
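
    As background for the approach described above, the sketch below shows a plain Hilbert-transform-based R-peak detector and RR-interval (HRV) derivation in Python on a synthetic signal. It is only a generic illustration of the HT family of detectors; it does not include the enhancements or the missing-beat handling of the EHT algorithm, and the filter band and thresholds are assumed values.

        import numpy as np
        from scipy.signal import hilbert, find_peaks, butter, filtfilt

        def derive_hrv(ecg, fs):
            """Toy Hilbert-transform R-peak detector and RR-interval (HRV) series."""
            # Band-pass around the QRS energy band (assumed 8-20 Hz here).
            b, a = butter(3, [8 / (fs / 2), 20 / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            # Envelope of the analytic signal emphasizes QRS complexes.
            envelope = np.abs(hilbert(filtered))
            peaks, _ = find_peaks(envelope, distance=int(0.4 * fs),
                                  height=0.5 * np.max(envelope))
            rr_intervals = np.diff(peaks) / fs       # seconds between successive R waves
            return peaks, rr_intervals

        # Synthetic ECG-like test signal: 1 Hz spikes plus noise at fs = 250 Hz.
        fs = 250
        t = np.arange(0, 10, 1 / fs)
        ecg = np.zeros_like(t)
        ecg[(np.arange(len(t)) % fs) == 0] = 1.0
        ecg += 0.05 * np.random.randn(len(t))
        peaks, rr = derive_hrv(ecg, fs)
        print("detected beats:", len(peaks), "mean RR (s):", float(np.mean(rr)))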

  2. TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, S; Nazareth, D; Bellor, M

    Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, will allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high performance computing cluster accessible to our clinic. MATLAB and PYTHON scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools which indicated the behavior of the constituent routines in the code, e.g. the bremsstrahlung splitting routine, and the specified random number generator. This information aided in determining the most efficient compiling parallel configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10⁸–10⁹ particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose on the order of 10–15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved when compared with the open-source compiler BEAMnrc packages. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.

  3. A global approach to analysis and interpretation of metabolic data for plant natural product discovery.

    PubMed

    Hur, Manhoi; Campbell, Alexis Ann; Almeida-de-Macedo, Marcia; Li, Ling; Ransom, Nick; Jose, Adarsh; Crispin, Matt; Nikolau, Basil J; Wurtele, Eve Syrkin

    2013-04-01

    Discovering molecular components and their functionality is key to the development of hypotheses concerning the organization and regulation of metabolic networks. The iterative experimental testing of such hypotheses is the trajectory that can ultimately enable accurate computational modelling and prediction of metabolic outcomes. This information can be particularly important for understanding the biology of natural products, whose metabolism itself is often only poorly defined. Here, we describe factors that must be in place to optimize the use of metabolomics in predictive biology. A key to achieving this vision is a collection of accurate time-resolved and spatially defined metabolite abundance data and associated metadata. One formidable challenge associated with metabolite profiling is the complexity and analytical limits associated with comprehensively determining the metabolome of an organism. Further, for metabolomics data to be efficiently used by the research community, it must be curated in publicly available metabolomics databases. Such databases require clear, consistent formats, easy access to data and metadata, data download, and accessible computational tools to integrate genome system-scale datasets. Although transcriptomics and proteomics integrate the linear predictive power of the genome, the metabolome represents the nonlinear, final biochemical products of the genome, which results from the intricate system(s) that regulate genome expression. For example, the relationship of metabolomics data to the metabolic network is confounded by redundant connections between metabolites and gene-products. However, connections among metabolites are predictable through the rules of chemistry. Therefore, enhancing the ability to integrate the metabolome with anchor-points in the transcriptome and proteome will enhance the predictive power of genomics data. We detail a public database repository for metabolomics, tools and approaches for statistical analysis of metabolomics data, and methods for integrating these datasets with transcriptomic data to create hypotheses concerning specialized metabolisms that generate the diversity in natural product chemistry. We discuss the importance of close collaborations among biologists, chemists, computer scientists and statisticians throughout the development of such integrated metabolism-centric databases and software.

  4. A global approach to analysis and interpretation of metabolic data for plant natural product discovery†

    PubMed Central

    Hur, Manhoi; Campbell, Alexis Ann; Almeida-de-Macedo, Marcia; Li, Ling; Ransom, Nick; Jose, Adarsh; Crispin, Matt; Nikolau, Basil J.

    2013-01-01

    Discovering molecular components and their functionality is key to the development of hypotheses concerning the organization and regulation of metabolic networks. The iterative experimental testing of such hypotheses is the trajectory that can ultimately enable accurate computational modelling and prediction of metabolic outcomes. This information can be particularly important for understanding the biology of natural products, whose metabolism itself is often only poorly defined. Here, we describe factors that must be in place to optimize the use of metabolomics in predictive biology. A key to achieving this vision is a collection of accurate time-resolved and spatially defined metabolite abundance data and associated metadata. One formidable challenge associated with metabolite profiling is the complexity and analytical limits associated with comprehensively determining the metabolome of an organism. Further, for metabolomics data to be efficiently used by the research community, it must be curated in publicly available metabolomics databases. Such databases require clear, consistent formats, easy access to data and metadata, data download, and accessible computational tools to integrate genome system-scale datasets. Although transcriptomics and proteomics integrate the linear predictive power of the genome, the metabolome represents the nonlinear, final biochemical products of the genome, which results from the intricate system(s) that regulate genome expression. For example, the relationship of metabolomics data to the metabolic network is confounded by redundant connections between metabolites and gene-products. However, connections among metabolites are predictable through the rules of chemistry. Therefore, enhancing the ability to integrate the metabolome with anchor-points in the transcriptome and proteome will enhance the predictive power of genomics data. We detail a public database repository for metabolomics, tools and approaches for statistical analysis of metabolomics data, and methods for integrating these datasets with transcriptomic data to create hypotheses concerning specialized metabolism that generates the diversity in natural product chemistry. We discuss the importance of close collaborations among biologists, chemists, computer scientists and statisticians throughout the development of such integrated metabolism-centric databases and software. PMID:23447050

  5. Computationally efficient simulation of electrical activity at cell membranes interacting with self-generated and externally imposed electric fields

    NASA Astrophysics Data System (ADS)

    Agudelo-Toro, Andres; Neef, Andreas

    2013-04-01

    Objective. We present a computational method that implements a reduced set of Maxwell's equations to allow simulation of cells under realistic conditions: sub-micron cell morphology, a conductive non-homogeneous space and various ion channel properties and distributions. Approach. While a reduced set of Maxwell's equations can be used to couple membrane currents to extra- and intracellular potentials, this approach is rarely taken, most likely because adequate computational tools are missing. By using these equations, and introducing an implicit solver, numerical stability is attained even with large time steps. The time steps are limited only by the time development of the membrane potentials. Main results. This method allows simulation times of tens of minutes instead of weeks, even for complex problems. The extracellular fields are accurately represented, including secondary fields, which originate at inhomogeneities of the extracellular space and can reach several millivolts. We present a set of instructive examples that show how this method can be used to obtain reference solutions for problems that might not be accurately captured by the traditional approaches. This includes the simulation of realistic magnitudes of extracellular action potential signals in restricted extracellular space. Significance. The electric activity of neurons creates extracellular potentials. Recent findings show that these endogenous fields act back onto the neurons, contributing to the synchronization of population activity. The influence of endogenous fields is also relevant for understanding therapeutic approaches such as transcranial direct current, transcranial magnetic and deep brain stimulation. The mutual interaction between fields and membrane currents is not captured by today's concepts of cellular electrophysiology, including the commonly used activation function, as those concepts are based on isolated membranes in an infinite, isopotential extracellular space. The presented tool makes simulations with detailed morphology and implicit interactions of currents and fields available to the electrophysiology community.
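
    The abstract attributes the large admissible time steps to an implicit solver. As a minimal, hypothetical illustration of why implicitness helps (the equation, parameter values, and step sizes below are illustrative assumptions, not the paper's model), backward Euler stays stable on a stiff membrane-charging equation dV/dt = -(V - E)/tau at step sizes where forward Euler diverges:

    ```python
    # Explicit vs. implicit (backward) Euler on a stiff, membrane-like ODE:
    #   dV/dt = -(V - E) / tau
    # Illustrative sketch only; not the solver of the cited method.

    def forward_euler(v0, e, tau, dt, steps):
        v = v0
        for _ in range(steps):
            v = v + dt * (-(v - e) / tau)        # explicit update: unstable if dt > 2*tau
        return v

    def backward_euler(v0, e, tau, dt, steps):
        v = v0
        for _ in range(steps):
            # implicit update: solve v_new = v + dt * (-(v_new - e)/tau) for v_new
            v = (v + dt * e / tau) / (1.0 + dt / tau)   # unconditionally stable
        return v

    tau, e, v0 = 1e-3, -70.0, 0.0     # fast time constant (s), resting potential (mV)
    dt, steps = 5e-3, 200             # time step five times larger than tau
    print(forward_euler(v0, e, tau, dt, steps))   # diverges (magnitude grows each step)
    print(backward_euler(v0, e, tau, dt, steps))  # settles near e = -70 mV
    ```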

  6. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations either by the analytic method or by symbolic manipulation is time-consuming, inconvenient, and prone to introduce human errors. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with an adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis on DDE models with less user intervention. Compared in theory with direct-coupled methods, the extended algorithm is efficient and accurate, and it is easy for end users without a programming background to apply it to dynamic sensitivity analysis of complex biological systems with time delays.
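
    To make the automatic-differentiation step concrete, here is a minimal forward-mode (dual-number) sketch that evaluates a Jacobian column by column; the two-state toy system and all names are illustrative assumptions, not the implementation of the cited algorithm:

    ```python
    # Forward-mode automatic differentiation with dual numbers (illustrative sketch).
    class Dual:
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__
        def __sub__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val - o.val, self.der - o.der)
        def __rsub__(self, o):
            return Dual(o) - self
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
        __rmul__ = __mul__

    def rhs(x):
        # toy right-hand side of a two-state system (hypothetical)
        x1, x2 = x
        return [2.0 * x1 * x2 - x1, x1 - 3.0 * x2]

    def jacobian(func, x):
        n = len(x)
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):   # seed one input direction at a time
            duals = [Dual(xi, 1.0 if i == j else 0.0) for i, xi in enumerate(x)]
            for i, out in enumerate(func(duals)):
                J[i][j] = out.der
        return J

    print(jacobian(rhs, [1.0, 2.0]))   # [[3.0, 2.0], [1.0, -3.0]]
    ```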

  7. Computational Prediction of Electron Ionization Mass Spectra to Assist in GC/MS Compound Identification.

    PubMed

    Allen, Felicity; Pon, Allison; Greiner, Russ; Wishart, David

    2016-08-02

    We describe a tool, competitive fragmentation modeling for electron ionization (CFM-EI), that, given a chemical structure (e.g., in SMILES or InChI format), computationally predicts an electron ionization mass spectrum (EI-MS) (i.e., the type of mass spectrum commonly generated by gas chromatography mass spectrometry). The predicted spectra produced by this tool can be used for putative compound identification, complementing measured spectra in reference databases by expanding the range of compounds able to be considered when the availability of measured spectra is limited. The tool extends CFM-ESI, a recently developed method for computational prediction of electrospray tandem mass spectra (ESI-MS/MS), but unlike CFM-ESI, CFM-EI can handle odd-electron ions and isotopes and incorporates an artificial neural network. Tests on EI-MS data from the NIST database demonstrate that CFM-EI is able to model fragmentation likelihoods in low-resolution EI-MS data, producing predicted spectra whose dot product scores are significantly better than full enumeration "bar-code" spectra. CFM-EI also outperformed previously reported results for MetFrag, MOLGEN-MS, and Mass Frontier on one compound identification task. It also outperformed MetFrag in a range of other compound identification tasks involving a much larger data set, containing both derivatized and nonderivatized compounds. While replicate EI-MS measurements of chemical standards are still a more accurate point of comparison, CFM-EI's predictions provide a much-needed alternative when no reference standard is available for measurement. CFM-EI is available at https://sourceforge.net/projects/cfm-id/ for download and http://cfmid.wishartlab.com as a web service.
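
    The dot product scores mentioned above are weighted cosine similarities between aligned spectra. A minimal sketch of that kind of comparison (the weighting exponents and the example peak lists are illustrative assumptions, not CFM-EI's exact scoring code):

    ```python
    import math

    def dot_product_score(spec_a, spec_b, mz_power=1.0, intensity_power=0.5):
        """Weighted cosine (dot product) similarity between two spectra.

        spec_a, spec_b: dicts mapping integer m/z bins to intensities.
        Weights of the form (m/z)^mz_power * intensity^intensity_power are a
        common convention in EI-MS library matching; the values here are illustrative.
        """
        def weights(spec):
            return {mz: (mz ** mz_power) * (inten ** intensity_power)
                    for mz, inten in spec.items() if inten > 0}

        wa, wb = weights(spec_a), weights(spec_b)
        shared = set(wa) & set(wb)
        num = sum(wa[mz] * wb[mz] for mz in shared)
        den = math.sqrt(sum(v * v for v in wa.values())) * \
              math.sqrt(sum(v * v for v in wb.values()))
        return num / den if den else 0.0

    # Hypothetical predicted vs. measured low-resolution spectra (m/z: relative intensity)
    predicted = {41: 30, 43: 100, 58: 55, 71: 10}
    measured  = {41: 25, 43: 100, 58: 60, 72: 5}
    print(round(dot_product_score(predicted, measured), 3))
    ```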

  8. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low- and high-pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major benefactor of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate, cavitating inducer-impeller computational fluid dynamics (CFD) tools. These investments together have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved that are intended to provide blade designs that can be used with confidence and that strike an appropriate balance between performance and management of self-induced loads.

  9. Application of Nexus copy number software for CNV detection and analysis.

    PubMed

    Darvishi, Katayoon

    2010-04-01

    Among human structural genomic variation, copy number variants (CNVs) are the most common known component, comprising gains/losses of DNA segments that are generally 1 kb in length or longer. Array-based comparative genomic hybridization (aCGH) has emerged as a powerful tool for detecting genomic CNVs. With the rapid increase in the density of array technology and with the adaptation of new high-throughput technology, a reliable and computationally scalable method for accurate mapping of recurring DNA copy number aberrations has become a main focus in research. Here we introduce Nexus Copy Number software, a platform-independent tool, to analyze the output files of all types of commercial and custom-made comparative genomic hybridization (CGH) and single-nucleotide polymorphism (SNP) arrays, such as those manufactured by Affymetrix, Agilent Technologies, Illumina, and Roche NimbleGen. It also supports data generated by various array image-analysis software tools such as GenePix, ImaGene, and BlueFuse. (c) 2010 by John Wiley & Sons, Inc.

  10. BUSCA: an integrative web server to predict subcellular localization of proteins.

    PubMed

    Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Profiti, Giuseppe; Casadio, Rita

    2018-04-30

    Here, we present BUSCA (http://busca.biocomp.unibo.it), a novel web server that integrates different computational tools for predicting protein subcellular localization. BUSCA combines methods for identifying signal and transit peptides (DeepSig and TPpred3), GPI-anchors (PredGPI) and transmembrane domains (ENSEMBLE3.0 and BetAware) with tools for discriminating subcellular localization of both globular and membrane proteins (BaCelLo, MemLoci and SChloro). Outcomes from the different tools are processed and integrated for annotating subcellular localization of both eukaryotic and bacterial protein sequences. We benchmark BUSCA against protein targets derived from recent CAFA experiments and other specific data sets, reporting state-of-the-art performance. BUSCA scores better than all other evaluated methods on 2732 targets from CAFA2, with an F1 value of 0.49, and it is among the best methods when predicting targets from CAFA3. We propose BUSCA as an integrated and accurate resource for the annotation of protein subcellular localization.

  11. Prompt and Precise Prototyping

    NASA Technical Reports Server (NTRS)

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  12. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    PubMed

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms, as well as ECGs at the body surface, with high fidelity and offers vast computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
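
    For orientation, the eikonal half of such a reaction-eikonal scheme solves a front-propagation problem for the activation time rather than a full reaction-diffusion system. In its common anisotropic form (generic notation, not a transcription from the paper),

    $$ \sqrt{\nabla t_a^{\top}\,\mathbf{V}(\mathbf{x})\,\nabla t_a} \;=\; 1 , $$

    where t_a(x) is the local activation time and V is a tensor of squared conduction velocities aligned with the fiber architecture; the ionic (reaction) model is then triggered at t_a to recover transmembrane currents, from which extracellular potentials and ECGs are computed.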

  13. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    NASA Astrophysics Data System (ADS)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small-scale damage; however, due to the complex nature of GWs, accurate and efficient computational tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber-reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface-bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  14. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model

    NASA Astrophysics Data System (ADS)

    Neic, Aurel; Campos, Fernando O.; Prassl, Anton J.; Niederer, Steven A.; Bishop, Martin J.; Vigmond, Edward J.; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms, as well as ECGs at the body surface, with high fidelity and offers vast computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  15. CAST: a new program package for the accurate characterization of large and flexible molecular systems.

    PubMed

    Grebner, Christoph; Becker, Johannes; Weber, Daniel; Bellinger, Daniel; Tafipolski, Maxim; Brückner, Charlotte; Engels, Bernd

    2014-09-15

    The presented program package, Conformational Analysis and Search Tool (CAST), allows the accurate treatment of large and flexible (macro) molecular systems. For the determination of thermally accessible minima, CAST offers the newly developed TabuSearch algorithm, but algorithms such as Monte Carlo (MC), MC with minimization, and molecular dynamics are implemented as well. For the determination of reaction paths, CAST provides the PathOpt, the nudged elastic band, and the umbrella sampling approach. Access to free energies is possible through the free energy perturbation approach. Along with a number of standard force fields, a newly developed symmetry-adapted perturbation theory-based force field is included. Semiempirical computations are possible through DFTB+ and MOPAC interfaces. For calculations based on density functional theory, a Message Passing Interface (MPI) interface to the Graphics Processing Unit (GPU)-accelerated TeraChem program is available. The program is available on request. Copyright © 2014 Wiley Periodicals, Inc.

  16. Study on Material Parameters Identification of Brain Tissue Considering Uncertainty of Friction Coefficient

    NASA Astrophysics Data System (ADS)

    Guan, Fengjiao; Zhang, Guanjun; Liu, Jie; Wang, Shujing; Luo, Xu; Zhu, Feng

    2017-10-01

    Accurate material parameters are critical for constructing high-biofidelity finite element (FE) models. However, it is hard to obtain brain tissue parameters accurately because of the effects of irregular geometry and uncertain boundary conditions. Considering the complexity of material testing and the uncertainty of the friction coefficient, a computational inverse method for identifying the viscoelastic material parameters of brain tissue is presented based on the interval analysis method. First, intervals are used to quantify the friction coefficient in the boundary condition. Then the inverse problem of material parameter identification under an uncertain friction coefficient is transformed into two deterministic inverse problems. Finally, an intelligent optimization algorithm is used to solve the two deterministic inverse problems quickly and accurately, and the range of material parameters can be acquired without requiring a large number of samples. The efficiency and convergence of this method are demonstrated by identifying the material parameters of the thalamus. The proposed method provides a potentially effective tool for building high-biofidelity human finite element models in the study of traffic accident injury.

  17. Performance of Frozen Density Embedding for Modeling Hole Transfer Reactions.

    PubMed

    Ramos, Pablo; Papadakis, Markos; Pavanello, Michele

    2015-06-18

    We have carried out a thorough benchmark of the frozen density-embedding (FDE) method for calculating hole transfer couplings. We have considered 10 exchange-correlation functionals, 3 nonadditive kinetic energy functionals, and 3 basis sets. Overall, we conclude that with a 7% mean relative unsigned error, the PBE and PW91 functionals coupled with the PW91k nonadditive kinetic energy functional and a TZP basis set constitute the most stable and accurate levels of theory for hole transfer coupling calculations. The FDE-ET method is found to be an excellent tool for computing diabatic couplings for hole transfer reactions.

  18. Radiology of pancreatic neoplasms: An update

    PubMed Central

    de la Santa, Luis Gijón; Retortillo, José Antonio Pérez; Miguel, Ainhoa Camarero; Klein, Lea Marie

    2014-01-01

    Diagnostic imaging is an important tool to evaluate pancreatic neoplasms. We describe the imaging features of pancreatic malignancies and their benign mimics. Accurate detection and staging are essential for ensuring appropriate selection of patients who will benefit from surgery and for preventing unnecessary surgeries in patients with unresectable disease. Ultrasound, multidetector computed tomography with multiplanar reconstruction and magnetic resonance imaging can help establish the correct diagnosis. Radiologists should be aware of the wide variety of anatomic variants and pathologic conditions that may mimic pancreatic neoplasms. Knowledge of the most important characteristic findings may facilitate the correct diagnosis. PMID:25232458

  19. Radiology of pancreatic neoplasms: An update.

    PubMed

    de la Santa, Luis Gijón; Retortillo, José Antonio Pérez; Miguel, Ainhoa Camarero; Klein, Lea Marie

    2014-09-15

    Diagnostic imaging is an important tool to evaluate pancreatic neoplasms. We describe the imaging features of pancreatic malignancies and their benign mimics. Accurate detection and staging are essential for ensuring appropriate selection of patients who will benefit from surgery and for preventing unnecessary surgeries in patients with unresectable disease. Ultrasound, multidetector computed tomography with multiplanar reconstruction and magnetic resonance imaging can help establish the correct diagnosis. Radiologists should be aware of the wide variety of anatomic variants and pathologic conditions that may mimic pancreatic neoplasms. Knowledge of the most important characteristic findings may facilitate the correct diagnosis.

  20. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847
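
    As one concrete member of this family (written with a common normalization convention; the notation is generic rather than the paper's), the rank-two translation-invariant tensor built from surface normals is

    $$ W_1^{0,2}(K) \;=\; \tfrac{1}{3}\int_{\partial K} \mathbf{n}\otimes\mathbf{n}\;\mathrm{d}A , $$

    and ratios of its eigenvalues give a simple scalar measure of interfacial anisotropy.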

  1. Electromagnetic Modelling of MMIC CPWs for High Frequency Applications

    NASA Astrophysics Data System (ADS)

    Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.

    2018-02-01

    The theoretical electrical characteristics of components can be realised through modelling carried out with computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, fabrication of the Monolithic Microwave Integrated Circuit (MMIC) can proceed for experimental verification purposes, and improvements can be suggested before mass fabrication takes place. This research concentrates on the development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved Electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high-frequency applications.

  2. MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.

    PubMed

    Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk

    2018-05-29

    Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.
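
    Statistical potentials of the kind MyPMFs trains are generally obtained by inverse-Boltzmann conversion of observed distance statistics into pseudo-energies. In generic form (standard textbook symbols, not necessarily the program's internal notation),

    $$ E(r) \;=\; -k_{\mathrm{B}} T \,\ln\frac{p_{\mathrm{obs}}(r)}{p_{\mathrm{ref}}(r)} , $$

    where p_obs(r) is the pair-distance distribution measured in the chosen training structures and p_ref(r) a reference (background) distribution, which is why training on membrane or secreted proteins yields potentials adapted to those environments.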

  3. orthoFind Facilitates the Discovery of Homologous and Orthologous Proteins.

    PubMed

    Mier, Pablo; Andrade-Navarro, Miguel A; Pérez-Pulido, Antonio J

    2015-01-01

    Finding homologous and orthologous protein sequences is often the first step in evolutionary studies, annotation projects, and experiments of functional complementation. Despite all currently available computational tools, there is a requirement for easy-to-use tools that provide functional information. Here, a new web application called orthoFind is presented, which allows a quick search for homologous and orthologous proteins given one or more query sequences, allowing a recurrent and exhaustive search against reference proteomes, and being able to include user databases. It addresses the protein multidomain problem, searching for homologs with the same domain architecture, and gives a simple functional analysis of the results to help in the annotation process. orthoFind is easy to use and has been proven to provide accurate results with different datasets. Availability: http://www.bioinfocabd.upo.es/orthofind/.

  4. DKIST Adaptive Optics System: Simulation Results

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Schmidt, Dirk

    2016-05-01

    The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field dependent distortions and varying contrast of the WFS sub-aperture images.
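
    The core measurement in an extended-field Shack-Hartmann WFS is the image shift of each sub-aperture scene relative to a reference, usually obtained by cross-correlation. A minimal numpy sketch of the FFT-based, integer-pixel version of that step (the test data and function names are illustrative assumptions, not the DKIST simulator's code):

    ```python
    import numpy as np

    def correlation_shift(img, ref):
        """Estimate the (dy, dx) shift of `img` relative to `ref` via FFT cross-correlation."""
        a = img - img.mean()          # remove the mean so the peak reflects structure,
        b = ref - ref.mean()          # not overall brightness
        corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        shift = np.array(peak, dtype=float)
        for k in range(2):            # map wrapped indices to signed shifts
            if shift[k] > corr.shape[k] // 2:
                shift[k] -= corr.shape[k]
        return shift                  # integer-pixel estimate; real sensors refine to sub-pixel

    rng = np.random.default_rng(0)
    ref = rng.random((32, 32))                      # stand-in for a granulation scene
    img = np.roll(ref, (2, -3), axis=(0, 1))        # known shift of (2, -3) pixels
    print(correlation_shift(img, ref))              # -> [ 2. -3.]
    ```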

  5. Propellant Chemistry for CFD Applications

    NASA Technical Reports Server (NTRS)

    Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.

    1996-01-01

    Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high pressure and tri-propellant propulsion systems. Such designs require the use of an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel molecules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamic (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Upon the establishment of such a description, the fluid description will be simplified by using the minimum of empiricism necessary to maintain accurate combustion analyses and including such empirical models into an appropriate CFD code. An additional benefit of this approach is that the real fluid properties analysis simplifies the introduction of the effects of droplet sprays into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat transfer CFD analyses for high pressure, tri-propellant propulsion systems will be available.

  6. Application of hybrid methodology to rotors in steady and maneuvering flight

    NASA Astrophysics Data System (ADS)

    Rajmohan, Nischint

    Helicopters are versatile flying machines that have capabilities that are unparalleled by fixed-wing aircraft, such as operating in hover and performing vertical takeoff and landing on unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Therefore, improvement in rotorcraft design to reduce the noise and vibration levels requires understanding of the underlying physical phenomena and accurate prediction capabilities of the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using a hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface, but the effect of the far wake is computed through a wake model. The hybrid CFD methodology is computationally efficient and its wake modeling approach is nondissipative, making it an attractive tool to study rotorcraft aeromechanics. Several enhancements were made to the CFD methodology and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform a trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight, were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved the airloads predictions for a variety of flight conditions. It was found that the tightly coupled method did not impact the loads significantly for steady flight conditions compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in a time-accurate mode. The flight test control angles were employed to enable the maneuvering flight analysis. The fully coupled model captured the presence of three dynamic stall cycles on the rotor during the maneuver. It is important to mention that analysis of maneuvering flight requires knowledge of the pilot control pitch inputs and the vehicle states. As a result, these computational tools cannot be used for analysis of loads in a maneuver that has not been duplicated in a real flight. This is a significant limitation if these tools are to be selected during the design phase of a helicopter where its handling qualities are evaluated in different trajectories. Therefore, a methodology was developed to couple the CFD/CSD simulation with an inverse flight mechanics simulation to perform the maneuver analysis without using the flight test control input. The methodology showed reasonable convergence in the steady flight regime, and the predicted control angles compared fairly well with test data. In the maneuvering flight regions, the convergence was slower due to the relaxation techniques used for numerical stability. The subsequent computed control angles for the maneuvering flight regions compared well with test data. Further, the enhancement of the rotor inflow computations in the inverse simulation through implementation of a Lagrangian wake model improved the convergence of the coupling methodology.

  7. Validity of the Born approximation for beyond Gaussian weak lensing observables

    DOE PAGES

    Petri, Andrea; Haiman, Zoltan; May, Morgan

    2017-06-06

    Accurate forward modeling of weak lensing (WL) observables from cosmological parameters is necessary for upcoming galaxy surveys. Because WL probes structures in the nonlinear regime, analytical forward modeling is very challenging, if not impossible. Numerical simulations of WL features rely on ray tracing through the outputs of N-body simulations, which requires knowledge of the gravitational potential and accurate solvers for light ray trajectories. A less accurate procedure, based on the Born approximation, only requires knowledge of the density field, and can be implemented more efficiently and at a lower computational cost. In this work, we use simulations to show that deviations of the Born-approximated convergence power spectrum, skewness and kurtosis from their fully ray-traced counterparts are consistent with the smallest nontrivial O(Φ³) post-Born corrections (so-called geodesic and lens-lens terms). Our results imply a cancellation among the larger O(Φ⁴) (and higher order) terms, consistent with previous analytic work. We also find that cosmological parameter bias induced by the Born-approximated power spectrum is negligible even for a LSST-like survey, once galaxy shape noise is considered. When considering higher order statistics such as the κ skewness and kurtosis, however, we find significant bias of up to 2.5σ. Using the LensTools software suite, we show that the Born approximation saves a factor of 4 in computing time with respect to the full ray tracing in reconstructing the convergence.
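
    For reference, the Born-approximated convergence only requires the density contrast along the unperturbed line of sight. In the standard flat-universe form (usual cosmological symbols; a generic expression, not a transcription from the paper),

    $$ \kappa(\boldsymbol{\theta}) \;=\; \frac{3 H_0^{2}\,\Omega_m}{2 c^{2}} \int_0^{\chi_s} \mathrm{d}\chi\; \frac{\chi\,(\chi_s-\chi)}{\chi_s}\, \frac{\delta(\chi\boldsymbol{\theta},\chi)}{a(\chi)} , $$

    whereas full ray tracing additionally needs the transverse gradients of the potential evaluated along each deflected trajectory.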

  8. Validity of the Born approximation for beyond Gaussian weak lensing observables

    NASA Astrophysics Data System (ADS)

    Petri, Andrea; Haiman, Zoltán; May, Morgan

    2017-06-01

    Accurate forward modeling of weak lensing (WL) observables from cosmological parameters is necessary for upcoming galaxy surveys. Because WL probes structures in the nonlinear regime, analytical forward modeling is very challenging, if not impossible. Numerical simulations of WL features rely on ray tracing through the outputs of N-body simulations, which requires knowledge of the gravitational potential and accurate solvers for light ray trajectories. A less accurate procedure, based on the Born approximation, only requires knowledge of the density field, and can be implemented more efficiently and at a lower computational cost. In this work, we use simulations to show that deviations of the Born-approximated convergence power spectrum, skewness and kurtosis from their fully ray-traced counterparts are consistent with the smallest nontrivial O(Φ³) post-Born corrections (so-called geodesic and lens-lens terms). Our results imply a cancellation among the larger O(Φ⁴) (and higher order) terms, consistent with previous analytic work. We also find that cosmological parameter bias induced by the Born-approximated power spectrum is negligible even for a LSST-like survey, once galaxy shape noise is considered. When considering higher order statistics such as the κ skewness and kurtosis, however, we find significant bias of up to 2.5σ. Using the LensTools software suite, we show that the Born approximation saves a factor of 4 in computing time with respect to the full ray tracing in reconstructing the convergence.

  9. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
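
    Telecommunication link analyses of this kind revolve around a link budget: transmitted power plus antenna gains, minus path and system losses, compared against the noise floor to yield a margin. A minimal sketch of such a calculation (the loss terms, parameter names, and numbers are hypothetical and do not represent MMTAT's models or API):

    ```python
    import math

    def free_space_path_loss_db(distance_m, freq_hz):
        """Free-space path loss: 20*log10(4*pi*d*f/c), in dB."""
        c = 299_792_458.0
        return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

    def link_margin_db(tx_power_dbw, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz,
                       other_losses_db, data_rate_bps, noise_temp_k, required_ebn0_db):
        """Received Eb/N0 minus the required Eb/N0 (all terms in dB units)."""
        k_boltzmann_dbw = -228.6                                  # 10*log10(k), dBW/K/Hz
        eirp = tx_power_dbw + tx_gain_dbi
        received_dbw = (eirp - free_space_path_loss_db(distance_m, freq_hz)
                        + rx_gain_dbi - other_losses_db)
        noise_density = k_boltzmann_dbw + 10.0 * math.log10(noise_temp_k)
        ebn0 = received_dbw - noise_density - 10.0 * math.log10(data_rate_bps)
        return ebn0 - required_ebn0_db

    # Hypothetical X-band downlink example
    print(round(link_margin_db(tx_power_dbw=13.0, tx_gain_dbi=42.0, rx_gain_dbi=74.0,
                               distance_m=2.0e11, freq_hz=8.4e9, other_losses_db=3.0,
                               data_rate_bps=2.0e3, noise_temp_k=40.0,
                               required_ebn0_db=4.0), 1))   # link margin in dB
    ```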

  10. Modification of Hazen's equation in coarse grained soils by soft computing techniques

    NASA Astrophysics Data System (ADS)

    Kaynar, Oguz; Yilmaz, Isik; Marschalko, Marian; Bednarik, Martin; Fojtova, Lucie

    2013-04-01

    A relationship between the coefficient of permeability (k) and the effective grain size (d10) was first proposed by Hazen and later extended by other researchers. Although many attempts have been made to estimate k, the correlation coefficients (R2) of the resulting models were generally lower than ~0.80, and whole grain-size distribution curves were not included in the assessments. Soft computing techniques such as artificial neural networks, fuzzy inference systems, genetic algorithms, and their hybrids are now being used successfully as alternative tools. In this study, the use of soft computing techniques such as Artificial Neural Networks (ANNs) (MLP, RBF, etc.) and the Adaptive Neuro-Fuzzy Inference System (ANFIS) for predicting the permeability of coarse-grained soils is described, and Hazen's equation is then modified. It was found that the soft computing models exhibited high performance in predicting the permeability coefficient. Although the four different kinds of ANN algorithms showed similar prediction performance, the MLP results were found to be relatively more accurate than those of the RBF models. The most reliable predictions were obtained from the ANFIS model.
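
    For reference, the original relationship being modified is Hazen's single-parameter formula (conventional empirical units; the coefficient value reflects typical usage rather than this study),

    $$ k \;\approx\; C\, d_{10}^{2} , $$

    with k in cm/s, d10 in mm, and C an empirical coefficient commonly taken near unity; the soft computing models replace this single grain-size dependence with information from the whole grain-size distribution curve.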

  11. CRADA Final Report: Weld Predictor App

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billings, Jay Jay

    Welding is an important manufacturing process used in a broad range of industries and market sectors, including automotive, aerospace, heavy manufacturing, medical, and defense. During welded fabrication, high localized heat input and subsequent rapid cooling result in the creation of residual stresses and distortion. These residual stresses can significantly affect the fatigue resistance, cracking behavior, and load-carrying capacity of welded structures during service. Further, additional fitting and tacking time is often required to fit distorted subassemblies together, resulting in non-value added cost. Using trial-and-error methods to determine which welding parameters, welding sequences, and fixture designs will most effectively reduce distortion is a time-consuming and expensive process. For complex structures with many welds, this approach can take several months. For this reason, efficient and accurate methods of mitigating distortion are in demand across all industries where welding is used. Analytical and computational methods and commercial software tools have been developed to predict welding-induced residual stresses and distortion. Welding process parameters, fixtures, and tooling can be optimized to reduce the HAZ softening and minimize weld residual stress and distortion, improving performance and reducing design, fabrication and testing costs. However, weld modeling technology tools are currently accessible only to engineers and designers with a background in finite element analysis (FEA) who work with large manufacturers, research institutes, and universities with access to high-performance computing (HPC) resources. Small and medium enterprises (SMEs) in the US do not typically have the human and computational resources needed to adopt and utilize weld modeling technology. To allow engineers with no background in FEA, and SMEs, to gain access to this important design tool, EWI and the Ohio Supercomputer Center (OSC) developed the online weld application software tool "WeldPredictor" (https://eweldpredictor.ewi.org). About 1400 users have tested this application. This project marked the beginning of development of the next version of WeldPredictor, which addresses many features missing from the original: it supports 3D models, allows more material hardening laws, models material phase transformations, and uses open-source finite element solvers (as opposed to expensive commercial tools) to solve problems quickly.

  12. Coherent multiscale image processing using dual-tree quaternion wavelets.

    PubMed

    Chan, Wai Lam; Choi, Hyeokho; Baraniuk, Richard G

    2008-07-01

    The dual-tree quaternion wavelet transform (QWT) is a new multiscale analysis tool for geometric image features. The QWT is a near shift-invariant tight frame representation whose coefficients sport a magnitude and three phases: two phases encode local image shifts while the third contains image texture information. The QWT is based on an alternative theory for the 2-D Hilbert transform and can be computed using a dual-tree filter bank with linear computational complexity. To demonstrate the properties of the QWT's coherent magnitude/phase representation, we develop an efficient and accurate procedure for estimating the local geometrical structure of an image. We also develop a new multiscale algorithm for estimating the disparity between a pair of images that is promising for image registration and flow estimation applications. The algorithm features multiscale phase unwrapping, linear complexity, and sub-pixel estimation accuracy.

  13. Computed myography: three-dimensional reconstruction of motor functions from surface EMG data

    NASA Astrophysics Data System (ADS)

    van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.

    2008-12-01

    We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
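
    The generalized Tikhonov step referred to here has the generic form (standard inverse-problem notation, not the paper's symbols)

    $$ \hat{\mathbf{x}} \;=\; \arg\min_{\mathbf{x}} \;\|\mathbf{A}\mathbf{x}-\mathbf{b}\|^{2} + \lambda\,\|\mathbf{L}\mathbf{x}\|^{2} , \qquad \hat{\mathbf{x}} \;=\; \left(\mathbf{A}^{\top}\mathbf{A} + \lambda\,\mathbf{L}^{\top}\mathbf{L}\right)^{-1}\mathbf{A}^{\top}\mathbf{b} , $$

    where A maps candidate source distributions to surface electrode voltages, b holds the measurements, and L is chosen to favor smooth sources inside the muscles while penalizing sources outside them.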

  14. Knowing when to give up: early-rejection stratagems in ligand docking

    NASA Astrophysics Data System (ADS)

    Skone, Gwyn; Voiculescu, Irina; Cameron, Stephen

    2009-10-01

    Virtual screening is an important resource in the drug discovery community, of which protein-ligand docking is a significant part. Much software has been developed for this purpose, largely by biochemists and those in related disciplines, who pursue ever more accurate representations of molecular interactions. The resulting tools, however, are very processor-intensive. This paper describes some initial results from a project to review computational chemistry techniques for docking from a non-chemistry standpoint. An abstract blueprint for protein-ligand docking using empirical scoring functions is suggested, and this is used to discuss potential improvements. By introducing computer science tactics such as lazy function evaluation, dramatic increases in throughput can be, and have been, realized using a real-world docking program. Naturally, they can be extended to any system that approximately corresponds to the architecture outlined.
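
    The early-rejection idea is simply to stop scoring a candidate pose as soon as its accumulated penalty can no longer beat the best pose found so far. A minimal, hypothetical sketch (the terms and names are illustrative; this is not the code of any particular docking package, and it assumes all scoring terms are non-negative):

    ```python
    def score_with_early_rejection(terms, best_so_far):
        """Lazily accumulate non-negative scoring terms; abandon once the pose cannot win.

        terms: iterable of zero-argument callables, each returning a non-negative
        penalty (e.g. a per-atom clash term). Returns the full score, or None if
        the pose was rejected before all terms were evaluated.
        """
        total = 0.0
        for term in terms:                 # terms are evaluated only on demand
            total += term()
            if total >= best_so_far:       # cannot improve on the incumbent
                return None
        return total

    # Hypothetical per-term penalties for two candidate poses
    pose_a = [lambda: 0.4, lambda: 0.2, lambda: 0.1]
    pose_b = [lambda: 0.9, lambda: 0.5, lambda: 2.0]

    best = float("inf")
    for pose in (pose_a, pose_b):
        s = score_with_early_rejection(pose, best)
        if s is not None:
            best = min(best, s)
    print(best)   # 0.7 -- pose_b is abandoned after its first term (0.9 >= 0.7)
    ```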

  15. Algorithm for planning a double-jaw orthognathic surgery using a computer-aided surgical simulation (CASS) protocol. Part 1: planning sequence.

    PubMed

    Xia, J J; Gateno, J; Teichgraeber, J F; Yuan, P; Chen, K-C; Li, J; Zhang, X; Tang, Z; Alfi, D M

    2015-12-01

    The success of craniomaxillofacial (CMF) surgery depends not only on the surgical techniques, but also on an accurate surgical plan. The adoption of computer-aided surgical simulation (CASS) has created a paradigm shift in surgical planning. However, planning an orthognathic operation using CASS differs fundamentally from planning using traditional methods. With this in mind, the Surgical Planning Laboratory of Houston Methodist Research Institute has developed a CASS protocol designed specifically for orthognathic surgery. The purpose of this article is to present an algorithm using virtual tools for planning a double-jaw orthognathic operation. This paper will serve as an operation manual for surgeons wanting to incorporate CASS into their clinical practice. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  16. Integral equation and discontinuous Galerkin methods for the analysis of light-matter interaction

    NASA Astrophysics Data System (ADS)

    Baczewski, Andrew David

    Light-matter interaction is among the most enduring interests of the physical sciences. The understanding and control of this physics is of paramount importance to the design of myriad technologies ranging from stained glass, to molecular sensing and characterization techniques, to quantum computers. The development of complex engineered systems that exploit this physics is predicated at least partially upon in silico design and optimization that properly capture the light-matter coupling. In this thesis, the details of computational frameworks that enable this type of analysis, based upon both Integral Equation and Discontinuous Galerkin formulations will be explored. There will be a primary focus on the development of efficient and accurate software, with results corroborating both. The secondary focus will be on the use of these tools in the analysis of a number of exemplary systems.

  17. Eye Tracking Based Control System for Natural Human-Computer Interaction

    PubMed Central

    Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design. PMID:29403528

  18. Molecular Dynamics Simulations and Kinetic Measurements to Estimate and Predict Protein-Ligand Residence Times.

    PubMed

    Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea

    2016-08-11

    Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying our methodology to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions and experimental measures. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.

  19. Three-camera stereo vision for intelligent transportation systems

    NASA Astrophysics Data System (ADS)

    Bergendahl, Jason; Masaki, Ichiro; Horn, Berthold K. P.

    1997-02-01

    A major obstacle in the application of stereo vision to intelligent transportation systems is their high computational cost. In this paper, a PC-based three-camera stereo vision system constructed with off-the-shelf components is described. The system serves as a tool for developing and testing robust algorithms which approach real-time performance. We present an edge-based, subpixel stereo algorithm which is adapted to permit accurate distance measurements to objects in the field of view using a compact camera assembly. Once computed, the 3D scene information may be directly applied to a number of in-vehicle applications, such as adaptive cruise control, obstacle detection, and lane tracking. Moreover, since the largest computational cost is incurred in generating the 3D scene information, multiple applications that leverage this information can be implemented in a single system with minimal cost. On-road applications, such as vehicle counting and incident detection, are also possible. Preliminary in-vehicle road trial results are presented.
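
    The distance measurement in such a stereo system follows the standard triangulation relation (generic pinhole-camera form, not a transcription from the paper),

    $$ Z \;=\; \frac{f\,B}{d} , \qquad \Delta Z \;\approx\; \frac{Z^{2}}{f\,B}\,\Delta d , $$

    where Z is the range to the object, f the focal length in pixels, B the camera baseline, and d the disparity; the quadratic growth of the range error with distance is why the sub-pixel disparity estimation mentioned above matters at vehicle-scale distances.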

  20. Computation of Relative Magnetic Helicity in Spherical Coordinates

    NASA Astrophysics Data System (ADS)

    Moraitis, Kostas; Pariat, Étienne; Savcheva, Antonia; Valori, Gherardo

    2018-06-01

    Magnetic helicity is a quantity of great importance in solar studies because it is conserved in ideal magnetohydrodynamics. While many methods for computing magnetic helicity in Cartesian finite volumes exist, in spherical coordinates, the natural coordinate system for solar applications, helicity is only treated approximately. We present here a method for properly computing the relative magnetic helicity in spherical geometry. The volumes considered are finite, of shell or wedge shape, and the three-dimensional magnetic field is considered to be fully known throughout the studied domain. Testing of the method with well-known, semi-analytic, force-free magnetic-field models reveals that it has excellent accuracy. Further application to a set of nonlinear force-free reconstructions of the magnetic field of solar active regions and comparison with an approximate method used in the past indicates that the proposed method can be significantly more accurate, thus making our method a promising tool in helicity studies that employ spherical geometry. Additionally, we determine and discuss the applicability range of the approximate method.
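
    The quantity computed by such finite-volume methods is normally the gauge-invariant relative helicity of Finn and Antonsen (generic form; the spherical-geometry specifics enter through the boundary treatment, not this definition),

    $$ H_{R} \;=\; \int_{V} \left(\mathbf{A} + \mathbf{A}_{p}\right)\cdot\left(\mathbf{B} - \mathbf{B}_{p}\right)\,\mathrm{d}V , $$

    where B = ∇×A is the studied field and B_p = ∇×A_p is a reference (usually potential) field with the same normal component on the boundary of the volume V.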

  1. Support vector machine firefly algorithm based optimization of lens system.

    PubMed

    Shamshirband, Shahaboddin; Petković, Dalibor; Pavlović, Nenad T; Ch, Sudheer; Altameem, Torki A; Gani, Abdullah

    2015-01-01

    Lens system design is an important factor in image quality. The main aspect of the lens system design methodology is the optimization procedure. Since optimization is a complex, nonlinear task, soft computing optimization algorithms can be used. There are many tools that can be employed to measure optical performance, but the spot diagram is the most useful. The spot diagram gives an indication of the image of a point object. In this paper, the spot size radius is considered an optimization criterion. An intelligent soft computing scheme, support vector machines (SVMs) coupled with the firefly algorithm (FFA), is implemented. The performance of the proposed estimators is confirmed with the simulation results. The results of the proposed SVM-FFA model have been compared with support vector regression (SVR), artificial neural network, and genetic programming methods. The results show that the SVM-FFA model performs more accurately than the other methodologies. Therefore, SVM-FFA can be used as an efficient soft computing technique in the optimization of lens system designs.

  2. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    PubMed

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  3. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline

    PubMed Central

    2014-01-01

    Background Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911

  4. Millisecond precision psychological research in a world of commodity computers: new hardware, new problems?

    PubMed

    Plant, Richard R; Turner, Garry

    2009-08-01

    Since the publication of Plant, Hammond, and Turner (2004), which highlighted a pressing need for researchers to pay more attention to sources of error in computer-based experiments, the landscape has undoubtedly changed, but not necessarily for the better. Readily available hardware has improved in terms of raw speed; multi-core processors abound; graphics cards now have hundreds of megabytes of RAM; main memory is measured in gigabytes; drive space is measured in terabytes; ever-larger thin-film-transistor displays capable of single-digit response times, together with newer Digital Light Processing multimedia projectors, enable much greater graphic complexity; and new 64-bit operating systems, such as Microsoft Vista, are now commonplace. However, have millisecond-accurate presentation and response timing improved, and will they ever be available in commodity computers and peripherals? In the present article, we used a Black Box ToolKit to measure the variability in the timing characteristics of hardware commonly used in psychological research.
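    A minimal software-side sketch, not the Black Box ToolKit itself: estimating how far a commodity machine's requested intervals drift, which is the kind of variability the hardware toolkit quantifies externally. The target interval and sample count are illustrative assumptions.

```python
import time
import statistics

def measure_sleep_jitter(target_ms=10.0, n=200):
    """Request a fixed sleep repeatedly and record how far each actual interval deviates."""
    errors_ms = []
    for _ in range(n):
        t0 = time.perf_counter()
        time.sleep(target_ms / 1000.0)
        actual_ms = (time.perf_counter() - t0) * 1000.0
        errors_ms.append(actual_ms - target_ms)
    return errors_ms

errs = measure_sleep_jitter()
print(f"mean error {statistics.mean(errs):+.3f} ms, "
      f"sd {statistics.stdev(errs):.3f} ms, max {max(errs):+.3f} ms")
```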

  5. Free Energy Calculations using a Swarm-Enhanced Sampling Molecular Dynamics Approach.

    PubMed

    Burusco, Kepa K; Bruce, Neil J; Alibay, Irfan; Bryce, Richard A

    2015-10-26

    Free energy simulations are an established computational tool in modelling chemical change in the condensed phase. However, sampling of kinetically distinct substates remains a challenge to these approaches. As a route to addressing this, we link the methods of thermodynamic integration (TI) and swarm-enhanced sampling molecular dynamics (sesMD), where simulation replicas interact cooperatively to aid transitions over energy barriers. We illustrate the approach by using alchemical alkane transformations in solution, comparing them with the multiple independent trajectory TI (IT-TI) method. Free energy changes for transitions computed by using IT-TI grew increasingly inaccurate as the intramolecular barrier was heightened. By contrast, swarm-enhanced sampling TI (sesTI) calculations showed clear improvements in sampling efficiency, leading to more accurate computed free energy differences, even in the case of the highest barrier height. The sesTI approach, therefore, has potential in addressing chemical change in systems where conformations exist in slow exchange. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
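    A minimal sketch of the thermodynamic-integration estimate that both IT-TI and sesTI ultimately compute: the free energy change is the integral over the coupling parameter lambda of the ensemble average of dU/dlambda. The window values below are made-up placeholders, not simulation output.

```python
import numpy as np

# Hypothetical lambda windows and their time-averaged dU/dlambda values (kcal/mol).
lambdas = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0])
dudl    = np.array([12.3, 10.8,  8.1,  4.0,  1.2,  0.3, 0.1])

# Delta G = integral_0^1 <dU/dlambda> dlambda, evaluated here with the trapezoidal rule.
delta_g = np.sum(0.5 * (dudl[1:] + dudl[:-1]) * np.diff(lambdas))
print(f"TI estimate of Delta G: {delta_g:.2f} kcal/mol")
```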

  6. Using a combined computational-experimental approach to predict antibody-specific B cell epitopes.

    PubMed

    Sela-Culang, Inbal; Benhnia, Mohammed Rafii-El-Idrissi; Matho, Michael H; Kaever, Thomas; Maybeno, Matt; Schlossman, Andrew; Nimrod, Guy; Li, Sheng; Xiang, Yan; Zajonc, Dirk; Crotty, Shane; Ofran, Yanay; Peters, Bjoern

    2014-04-08

    Antibody epitope mapping is crucial for understanding B cell-mediated immunity and required for characterizing therapeutic antibodies. In contrast to T cell epitope mapping, no computational tools are in widespread use for prediction of B cell epitopes. Here, we show that, utilizing the sequence of an antibody, it is possible to identify discontinuous epitopes on its cognate antigen. The predictions are based on residue-pairing preferences and other interface characteristics. We combined these antibody-specific predictions with results of cross-blocking experiments that identify groups of antibodies with overlapping epitopes to improve the predictions. We validate the high performance of this approach by mapping the epitopes of a set of antibodies against the previously uncharacterized D8 antigen, using complementary techniques to reduce method-specific biases (X-ray crystallography, peptide ELISA, deuterium exchange, and site-directed mutagenesis). These results suggest that antibody-specific computational predictions and simple cross-blocking experiments allow for accurate prediction of residues in conformational B cell epitopes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Wetland Assessment Using Unmanned Aerial Vehicle (uav) Photogrammetry

    NASA Astrophysics Data System (ADS)

    Boon, M. A.; Greenfield, R.; Tesfamichael, S.

    2016-06-01

    Unmanned Aerial Vehicle (UAV) photogrammetry is a valuable tool to enhance our understanding of wetlands. Accurate planning derived from this technological advancement allows for more effective management and conservation of wetland areas. This paper presents results of a study that investigated the use of UAV photogrammetry as a tool to enhance the assessment of wetland ecosystems. The UAV images were collected during a single flight within 2½ hours over a 100 ha area at the Kameelzynkraal farm, Gauteng Province, South Africa. An AKS Y-6 MKII multi-rotor UAV and a digital camera on a motion-compensated gimbal mount were utilised for the survey. Twenty ground control points (GCPs) were surveyed using a Trimble GPS to achieve geometrical precision and georeferencing accuracy. Structure-from-Motion (SfM) computer vision techniques were used to derive ultra-high-resolution point clouds, orthophotos and 3D models from the multi-view photos. The geometric accuracy of the data based on the 20 GCPs was 0.018 m overall and 0.0025 m for the vertical root-mean-square error (RMSE), with an overall root-mean-square reprojection error of 0.18 pixel. The UAV products were then edited and subsequently analysed, interpreted and key attributes extracted using a selection of tools/software applications to enhance the wetland assessment. The results exceeded our expectations and provided a valuable and accurate enhancement to the wetland delineation, classification and health assessment, which would have been difficult to achieve even with detailed field studies.
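    A minimal sketch, with made-up residuals, of how georeferencing figures of this kind are typically derived: root-mean-square errors over the ground-control-point residuals and over the per-observation reprojection errors.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 3D residuals (metres) at 20 GCPs and per-observation reprojection errors (pixels).
residuals_m = rng.normal(scale=[0.01, 0.01, 0.002], size=(20, 3))
reproj_px = np.abs(rng.normal(scale=0.18, size=200))

rmse_total = np.sqrt(np.mean(np.sum(residuals_m ** 2, axis=1)))
rmse_vertical = np.sqrt(np.mean(residuals_m[:, 2] ** 2))
rms_reproj = np.sqrt(np.mean(reproj_px ** 2))
print(f"overall RMSE {rmse_total:.4f} m, vertical RMSE {rmse_vertical:.4f} m, "
      f"RMS reprojection {rms_reproj:.2f} px")
```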

  8. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    PubMed

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. Software validation showed similar outputs from ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, a convenient report format, accurate statistics and the familiar Excel platform.
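    A minimal sketch (using SciPy rather than ArraySolver) of the core comparison the tool automates: a paired Wilcoxon signed-rank test between two groups of expression values for a small gene signature. The numbers are illustrative.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired expression values (e.g., log intensities) for a small gene signature.
control = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.2, 6.1, 5.0])
treated = np.array([6.3, 5.1, 6.8, 6.0, 5.4, 6.5, 6.9, 5.2])

stat, p_value = wilcoxon(control, treated)
print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")
```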

  9. ArraySolver: An Algorithm for Colour-Coded Graphical Display and Wilcoxon Signed-Rank Statistics for Comparing Microarray Gene Expression Data

    PubMed Central

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann–Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. Software validation showed similar outputs from ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, a convenient report format, accurate statistics and the familiar Excel platform. PMID:18629036

  10. SFESA: a web server for pairwise alignment refinement by secondary structure shifts.

    PubMed

    Tong, Jing; Pei, Jimin; Grishin, Nick V

    2015-09-03

    Protein sequence alignment is essential for a variety of tasks such as homology modeling and active site prediction. Alignment errors remain the main cause of low-quality structure models. A bioinformatics tool to refine alignments is needed to make protein alignments more accurate. We developed the SFESA web server to refine pairwise protein sequence alignments. Compared to the previous version of SFESA, which required a set of 3D coordinates for a protein, the new server will search a sequence database for the closest homolog with an available 3D structure to be used as a template. For each alignment block defined by secondary structure elements in the template, SFESA evaluates alignment variants generated by local shifts and selects the best-scoring alignment variant. A scoring function that combines the sequence score of profile-profile comparison and the structure score of template-derived contact energy is used for evaluation of alignments. PROMALS pairwise alignments refined by SFESA are more accurate than those produced by current advanced alignment methods such as HHpred and CNFpred. In addition, SFESA also improves alignments generated by other software. SFESA is a web-based tool for alignment refinement, designed for researchers to compute, refine, and evaluate pairwise alignments with a combined sequence and structure scoring of alignment blocks. To our knowledge, the SFESA web server is the only tool that refines alignments by evaluating local shifts of secondary structure elements. The SFESA web server is available at http://prodata.swmed.edu/sfesa.

  11. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  12. On the study of control effectiveness and computational efficiency of reduced Saint-Venant model in model predictive control of open channel flow

    NASA Astrophysics Data System (ADS)

    Xu, M.; van Overloop, P. J.; van de Giesen, N. C.

    2011-02-01

    Model predictive control (MPC) of open channel flow is becoming an important tool in water management. The complexity of the prediction model has a large influence on the MPC application in terms of control effectiveness and computational efficiency. The Saint-Venant equations, called the SV model in this paper, and the Integrator Delay (ID) model are, respectively, accurate but computationally costly, and simple but restricted in the flow changes they allow. In this paper, a reduced Saint-Venant (RSV) model is developed by applying a model reduction technique, Proper Orthogonal Decomposition (POD), to the SV equations. The RSV model keeps the main flow dynamics and functions over a large flow range but is easier to implement in MPC. In the test case of a modeled canal reach, the numbers of states and disturbances in the RSV model are about 45 and 16 times smaller than in the SV model, respectively. The computational time of MPC with the RSV model is significantly reduced, while the controller remains effective. Thus, the RSV model is a promising means to balance control effectiveness and computational efficiency.
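    A minimal sketch of the Proper Orthogonal Decomposition step on which such a reduced model rests: an SVD of a snapshot matrix, keeping only the leading modes that capture most of the energy. The snapshot data here are synthetic placeholders, not Saint-Venant solutions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_snapshots = 400, 60
# Synthetic snapshots with an underlying low-dimensional structure plus small noise.
snapshots = rng.standard_normal((n_states, 3)) @ rng.standard_normal((3, n_snapshots)) \
            + 0.01 * rng.standard_normal((n_states, n_snapshots))

# POD modes are the left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.999)) + 1     # smallest basis capturing 99.9% of the energy
basis = U[:, :r]

# A full state x is reduced to r coefficients a and reconstructed as basis @ a.
x = snapshots[:, 0]
a = basis.T @ x
rel_err = np.linalg.norm(x - basis @ a) / np.linalg.norm(x)
print(f"kept {r} modes out of {n_snapshots} snapshots; reconstruction error {rel_err:.2e}")
```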

  13. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    NASA Astrophysics Data System (ADS)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report a parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to the cores present on a dual quad-core shared-memory system (eight processors in total). We note that this message-passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared-memory platform indicate nearly ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel, and yields excellent speed-up on SMP hardware.

  14. Iliac screw fixation using computer-assisted computer tomographic image guidance: technical note.

    PubMed

    Shin, John H; Hoh, Daniel J; Kalfas, Iain H

    2012-03-01

    Iliac screw fixation is a powerful tool used by spine surgeons to achieve fusion across the lumbosacral junction for a number of indications, including deformity, tumor, and pseudarthrosis. Complications associated with screw placement are related to blind trajectory selection and excessive soft tissue dissection. To describe the technique of iliac screw fixation using computed tomographic (CT)-based image guidance. Intraoperative registration and verification of anatomic landmarks are performed with the use of a preoperatively acquired CT of the lumbosacral spine. With the navigation probe, the ideal starting point for screw placement is selected while visualizing the intended trajectory and target on a computer screen. Once the starting point is selected and marked with a burr, a drill guide is docked within this point and the navigation probe re-inserted, confirming the trajectory. The probe is then removed and the high-speed drill reinserted within the drill guide. Drilling is performed to a depth measured on the computer screen and a screw is placed. Confirmation of accurate placement of iliac screws can be performed with standard radiographs. CT-guided navigation allows for 3-dimensional visualization of the pelvis and minimizes complications associated with soft-tissue dissection and breach of the ilium during screw placement.

  15. Toward a Global Bundle Adjustment of SPOT 5 - HRS Images

    NASA Astrophysics Data System (ADS)

    Massera, S.; Favé, P.; Gachet, R.; Orsoni, A.

    2012-07-01

    The HRS (High Resolution Stereoscopic) instrument carried on SPOT 5 enables quasi-simultaneous acquisition of stereoscopic images on wide segments - 120 km wide - with two forward- and backward-looking telescopes observing the Earth at an angle of 20° ahead of and behind the vertical. For 8 years IGN (Institut Géographique National) has been developing techniques to achieve spatiotriangulation of these images. During this time the capacity for bundle adjustment of SPOT 5 - HRS spatial images has largely improved. Today a single global block composed of about 20,000 images can be computed in reasonable calculation time. The progression was achieved step by step: the first computed blocks were composed of only 40 images, then bigger blocks were computed, and finally only one global block is now computed. At the same time calculation tools have improved: for example, the adjustment of 2,000 images of North Africa now takes about 2 minutes, whereas 8 hours were needed two years ago. To reach such a result, a new independent software package was developed to compute fast and efficient bundle adjustments. The equipment - GCPs (Ground Control Points) and tie points - and techniques have also evolved over the last 10 years. Studies were made to derive recommendations about the equipment needed to make an accurate single block. Tie points can now be quickly and automatically computed with SURF (Speeded Up Robust Features) techniques. Today the updated equipment is composed of about 500 GCPs, and studies show that the ideal configuration is around 100 tie points per square degree. With such equipment, the location of the global HRS block becomes accurate to a few meters, whereas non-adjusted images are only accurate to about 15 m. This paper describes the methods used in IGN Espace to compute a global single block composed of almost 20,000 HRS images, 500 GCPs and several million tie points in reasonable calculation time. Such a block offers many advantages. Because the global block is unique, it becomes easier to manage the history and the different evolutions of the computations (new images, new GCPs or tie points). The location is now unique and consequently coherent all around the world, avoiding steps and artifacts at the borders of DSMs (Digital Surface Models) and OrthoImages historically calculated from different blocks. No extrapolation far from GCPs at the limits of images is done anymore. Using the global block as a reference will allow new images from other sources to be easily located on this reference.

  16. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.

  17. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu. PMID:18509477

  18. Computational Investigation and Validation of Twin-Tail Buffet Response Including Dynamics and Control

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.

    1998-01-01

    Multidisciplinary tools for prediction of single rectangular-tail buffet are extended to single swept-back-tail buffet in transonic-speed flow, and multidisciplinary tools for prediction and control of twin-tail buffet are developed and presented. The configuration model consists of a sharp-edged delta wing with single or twin tails that are oriented normal to the wing surface. The tails are treated as cantilevered beams fixed at the root and allowed to oscillate in both bending and torsion. This complex multidisciplinary problem is solved sequentially using three sets of equations on a dynamic single or multi-block grid structure. The first set is the unsteady, compressible, Reynolds-averaged Navier-Stokes equations, which are used for obtaining the flow field vector and the aerodynamic loads on the tails. The Navier-Stokes equations are solved accurately in time using the implicit, upwind, flux-difference splitting, finite volume scheme. The second set is the coupled bending and torsion aeroelastic equations of cantilevered beams, which are used for obtaining the bending and torsion deflections of the tails. The aeroelastic equations are solved accurately in time using a fifth-order-accurate Runge-Kutta scheme. The third set is the grid-displacement equations and the rigid-body dynamics equations, which are used for updating the grid coordinates due to the tail deflections and rigid-body motions. The tail-buffet phenomenon is predicted for a highly swept single vertical tail placed at the plane of geometric symmetry, and for highly swept vertical twin tails placed at three different spanwise separation distances. The investigation demonstrates the effects of structural inertial coupling and uncoupling of the bending and torsion modes of vibration, spanwise positions of the twin tails, angle of attack, and pitching and rolling dynamic motions of the configuration model on the tail buffet loading and response. The fundamental issue of twin-tail buffet alleviation is addressed using two active flow-control methods: tangential leading-edge blowing and flow suction from the leading-edge vortex cores along their paths. Qualitative and quantitative comparisons with the available experimental data are presented. The comparisons indicate that the present multidisciplinary aeroelastic analysis tools are robust, accurate and efficient.

  19. SarcOptiM for ImageJ: high-frequency online sarcomere length computing on stimulated cardiomyocytes.

    PubMed

    Pasqualin, Côme; Gannier, François; Yu, Angèle; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2016-08-01

    Accurate measurement of cardiomyocyte contraction is a critical issue for scientists working on cardiac physiology and the pathophysiology of diseases involving contraction impairment. Cardiomyocyte contraction can be quantified by measuring sarcomere length, but few tools are available for this, and none is freely distributed. We developed a plug-in (SarcOptiM) for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. SarcOptiM computes sarcomere length via fast Fourier transform analysis of video frames captured or displayed in ImageJ and thus is not tied to a dedicated video camera. It can work in real time or offline, the latter overcoming rotating motion or displacement-related artifacts. SarcOptiM includes a simulator and video generator of cardiomyocyte contraction. Acquisition parameters, such as pixel size and camera frame rate, were tested with both experimental recordings of rat ventricular cardiomyocytes and synthetic videos. It is freely distributed, and its source code is available. It works under Windows, Mac, or Linux operating systems. The camera speed is the limiting factor, since the algorithm can compute online sarcomere shortening at frame rates >10 kHz. In conclusion, SarcOptiM is a free and validated user-friendly tool for studying cardiomyocyte contraction in all species, including human. Copyright © 2016 the American Physiological Society.
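    A minimal sketch of the FFT principle behind the plug-in (not SarcOptiM's own code): the dominant spatial frequency of an intensity profile taken along the myocyte gives the sarcomere length. The profile and pixel size are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
pixel_size_um = 0.1                         # assumed camera calibration (microns per pixel)
x = np.arange(512) * pixel_size_um
true_sl_um = 1.8                            # synthetic "ground truth" sarcomere length
profile = np.sin(2 * np.pi * x / true_sl_um) + 0.2 * rng.standard_normal(x.size)

# Dominant spatial frequency of the striation pattern gives the sarcomere length.
spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
freqs = np.fft.rfftfreq(profile.size, d=pixel_size_um)   # cycles per micron
peak = np.argmax(spectrum[1:]) + 1                       # skip the DC bin
print(f"estimated sarcomere length: {1.0 / freqs[peak]:.2f} um")
```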

  20. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    PubMed

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  1. Analysis and Design of Novel Nanophotonic Structures

    NASA Astrophysics Data System (ADS)

    Shugayev, Roman

    Nanophotonic devices hold promise to revolutionize the fields of optical communications, quantum computing and bioimaging. Designing viable solutions to these pressing problems requires developing accurate models of the relevant systems. While a great deal of work has been performed in terms of developing individual models with varying levels of fidelity, some of these more complex systems still require improved links between scales to allow for accurate design and optimization within a reasonable amount of computing time. For instance, color centers in nanocrystals appear to be a promising platform for room-temperature scalable quantum information science, but questions remain about the optimal structures to control single-photon emitter rates, coupling fidelity, and suitable scaling architectures. In this work, a method for efficient optical access and readout of nanocrystal states via magnetic transitions was demonstrated. Separately, novel Mie-resonant devices that guarantee on-demand enhancement of emission from single-vacancy sources were demonstrated. To improve the addressability of crystal-based impurities, a new approach to the realization of single-photon electro-optical devices is also proposed in this work. Furthermore, the color centers in nanocrystals studied in this work have been shown to be sensitive to the local refractive-index environment. This allows the system to be adapted to biomedical applications, such as sensitive, minimally invasive cancer detection. In this work, a novel scheme for propagation-loss-free sensing of the local refractive index using nanocrystal probes with broken symmetry is carefully investigated. In conclusion, this thesis develops several novel simulation and optimization techniques that combine existing nanophotonic modeling tools into a unique multi-scale modeling tool. It has been successfully applied to nanophotonically tuned color vacancy centers. Potential applications span optical communications, quantum information processing, and biomedical sensing.

  2. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horsey, Henry; Fleming, Katherine; Ball, Brian

    Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path for reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option to accurately estimate building energy consumption in real time is to pre-compute large datasets of potential building energy models, and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but can also be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and examining the cost-effectiveness of this approach.
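    A minimal sketch of the metamodeling idea under stated assumptions: fit a fast surrogate to a pre-computed table of simulation results so that new estimates require no simulation run. The inputs, outputs, and regressor choice are invented for illustration and are not the OpenStudio workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Hypothetical pre-computed dataset: [floor area m2, window fraction, setpoint C] -> annual kWh.
X = rng.uniform([500, 0.1, 19], [5000, 0.6, 24], size=(2000, 3))
y = 25 * X[:, 0] + 8000 * X[:, 1] - 300 * X[:, 2] + rng.normal(0, 500, 2000)

# The surrogate stands in for the expensive simulation at query time.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Near-instant estimate for a new, unsimulated building description.
print(surrogate.predict([[1200.0, 0.35, 21.0]]))
```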

  3. An improved method for estimating capillary pressure from 3D microtomography images and its application to the study of disconnected nonwetting phase

    NASA Astrophysics Data System (ADS)

    Li, Tianyi; Schlüter, Steffen; Dragila, Maria Ines; Wildenschild, Dorthe

    2018-04-01

    We present an improved method for estimating interfacial curvatures from x-ray computed microtomography (CMT) data that significantly advances the potential for this tool to unravel the mechanisms and phenomena associated with multi-phase fluid motion in porous media. CMT data, used to analyze the spatial distribution and capillary pressure-saturation (Pc-S) relationships of liquid phases, requires accurate estimates of interfacial curvature. Our improved method for curvature estimation combines selective interface modification and distance weighting approaches. It was verified against synthetic (analytical computer-generated) and real image data sets, demonstrating a vast improvement over previous methods. Using this new tool on a previously published data set (multiphase flow) yielded important new insights regarding the pressure state of the disconnected nonwetting phase during drainage and imbibition. The trapped and disconnected non-wetting phase delimits its own hysteretic Pc-S curve that inhabits the space within the main hysteretic Pc-S loop of the connected wetting phase. Data suggests that the pressure of the disconnected, non-wetting phase is strongly modified by the pore geometry rather than solely by the bulk liquid phase that surrounds it.

  4. Creation of a simple natural language processing tool to support an imaging utilization quality dashboard.

    PubMed

    Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo

    2017-05-01

    Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
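    A minimal sketch, not the published tool, of a user-configurable keyword-and-negation classifier of the kind described. The term lists, window size, and example impressions are invented for illustration.

```python
import re

# Hypothetical, user-editable parameters (the published tool exposes a handful of such settings).
POSITIVE_TERMS = ["deep venous thrombosis", "pulmonary embolism", "dvt", "occlusive thrombus"]
NEGATION_CUES = ["no evidence of", "negative for", "without", "no "]
NEGATION_WINDOW = 40   # characters of preceding text searched for a negation cue

def classify_report(impression: str) -> str:
    """Label a report impression as VTE-positive or VTE-negative."""
    text = impression.lower()
    for term in POSITIVE_TERMS:
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            window = text[max(0, match.start() - NEGATION_WINDOW):match.start()]
            if not any(cue in window for cue in NEGATION_CUES):
                return "VTE-positive"
    return "VTE-negative"

print(classify_report("No evidence of deep venous thrombosis in either lower extremity."))
print(classify_report("Filling defect consistent with acute pulmonary embolism."))
```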

  5. SimPhospho: a software tool enabling confident phosphosite assignment.

    PubMed

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  6. Cyclic Symmetry Finite Element Forced Response Analysis of a Distortion-Tolerant Fan with Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Reddy, T. S. R.; Bakhle, M. A.; Coroneos, R. M.; Stefko, G. L.; Provenza, A. J.; Duffy, K. P.

    2018-01-01

    Accurate prediction of blade vibration stress is required to determine the overall durability of a fan blade design under Boundary Layer Ingestion (BLI) distorted flow environments. Traditional single-blade modeling techniques are incapable of accurately representing the entire rotor blade system subject to the complex dynamic loading behaviors and vibrations of distorted flow conditions. A particular objective of our work was to develop a high-fidelity full-rotor aeromechanics analysis capability for a system subjected to a distorted inlet flow by applying a cyclic symmetry finite element modeling methodology. This reduction modeling method allows computationally very efficient analysis using a small periodic section of the full rotor blade system. Experimental testing was also carried out in the 8-foot by 6-foot Supersonic Wind Tunnel facility at NASA Glenn Research Center for the system designated as the Boundary Layer Ingesting Inlet/Distortion-Tolerant Fan (BLI2DTF) technology development. The results obtained from the present numerical modeling technique were evaluated against those of the wind tunnel experimental test, toward establishing a computationally efficient aeromechanics analysis modeling tool for analyses of full rotor blade systems subjected to distorted inlet flow conditions. Fairly good correlations were achieved; hence, our computational modeling techniques were fully demonstrated. The analysis results showed that the safety margin requirement set in the BLI2DTF fan blade design provided a sufficient margin with respect to the operating speed range.

  7. Percutaneous Transcatheter Mitral Valve Replacement: Patient-specific Three-dimensional Computer-based Heart Model and Prototyping.

    PubMed

    Vaquerizo, Beatriz; Theriault-Lauzier, Pascal; Piazza, Nicolo

    2015-12-01

    Mitral regurgitation is the most prevalent valvular heart disease worldwide. Despite the widespread availability of curative surgical intervention, a considerable proportion of patients with severe mitral regurgitation are not referred for treatment, largely due to the presence of left ventricular dysfunction, advanced age, and comorbid illnesses. Transcatheter mitral valve replacement is a promising therapeutic alternative to traditional surgical valve replacement. The complex anatomical and pathophysiological nature of the mitral valvular complex, however, presents significant challenges to the successful design and implementation of novel transcatheter mitral replacement devices. Patient-specific 3-dimensional computer-based models enable accurate assessment of the mitral valve anatomy and preprocedural simulations for transcatheter therapies. Such information may help refine the design features of novel transcatheter mitral devices and enhance procedural planning. Herein, we describe a novel medical image-based processing tool that facilitates accurate, noninvasive assessment of the mitral valvular complex, by creating precise three-dimensional heart models. The 3-dimensional computer reconstructions are then converted to a physical model using 3-dimensional printing technology, thereby enabling patient-specific assessment of the interaction between device and patient. It may provide new opportunities for a better understanding of the mitral anatomy-pathophysiology-device interaction, which is of critical importance for the advancement of transcatheter mitral valve replacement. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  8. An automated benchmarking platform for MHC class II binding prediction methods.

    PubMed

    Andreatta, Massimo; Trolle, Thomas; Yan, Zhen; Greenbaum, Jason A; Peters, Bjoern; Nielsen, Morten

    2018-05-01

    Computational methods for the prediction of peptide-MHC binding have become an integral and essential component of candidate selection in experimental T cell epitope discovery studies. The sheer number of published prediction methods, and the often discordant reports on their performance, poses a considerable quandary to the experimentalist who needs to choose the best tool for their research. With the goal of providing an unbiased, transparent evaluation of the state of the art in the field, we created an automated platform to benchmark peptide-MHC class II binding prediction tools. The platform evaluates the absolute and relative predictive performance of all participating tools on data newly entered into the Immune Epitope Database (IEDB) before they are made public, thereby providing a frequent, unbiased assessment of available prediction tools. The benchmark runs on a weekly basis, is fully automated, and displays up-to-date results on a publicly accessible website. The initial benchmark described here included six commonly used prediction servers, but other tools are encouraged to join with a simple sign-up procedure. Performance evaluation on 59 data sets composed of over 10 000 binding affinity measurements suggested that NetMHCIIpan is currently the most accurate tool, followed by NN-align and the IEDB consensus method. Weekly reports on the participating methods can be found online at: http://tools.iedb.org/auto_bench/mhcii/weekly/. mniel@bioinformatics.dtu.dk. Supplementary data are available at Bioinformatics online.

  9. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. 3-DOF Force-Sensing Motorized Micro-Forceps for Robot-Assisted Vitreoretinal Surgery

    PubMed Central

    Gonenc, Berk; Chamani, Alireza; Handa, James; Gehlbach, Peter; Taylor, Russell H.; Iordachita, Iulian

    2017-01-01

    In vitreoretinal surgery, membrane peeling is a prototypical task where a layer of fibrous tissue is delaminated off the retina with a micro-forceps by applying very fine forces that are mostly imperceptible to the surgeon. Previously we developed sensitized ophthalmic surgery tools based on fiber Bragg grating (FBG) strain sensors, which were shown to precisely detect forces at the instrument’s tip in two degrees of freedom perpendicular to the tool axis. This paper presents a new design that employs an additional sensor to capture also the tensile force along the tool axis. The grasping functionality is provided via a compact motorized unit. To compute forces, we investigate two distinct fitting methods: a linear regression and a nonlinear fitting based on second-order Bernstein polynomials. We carry out experiments to test the repeatability of sensor outputs, calibrate the sensor and validate its performance. Results demonstrate sensor wavelength repeatability within 2 pm. Although the linear method provides sufficient accuracy in measuring transverse forces, in the axial direction it produces a root mean square (rms) error over 3 mN even for a confined magnitude and direction of forces. On the other hand, the nonlinear method provides a more consistent and accurate measurement of both the transverse and axial forces for the entire force range (0–25 mN). Validation including random samples shows that our tool with the nonlinear force computation method can predict 3-D forces with an rms error under 0.15 mN in the transverse plane and within 2 mN accuracy in the axial direction. PMID:28736508
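    A minimal sketch of the two calibration strategies the paper contrasts: mapping FBG wavelength shifts to tip forces with ordinary linear least squares versus a second-order polynomial feature expansion (standing in here for the Bernstein-polynomial fit). The calibration data are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
shifts = rng.uniform(-1, 1, size=(n, 4))        # wavelength shifts of four FBG sensors (a.u.)
# Hypothetical true mapping to a 3-D force, with a mild nonlinearity in one channel.
forces = shifts @ rng.normal(size=(4, 3)) + 0.3 * (shifts[:, :1] ** 2)

def design_linear(s):
    return np.hstack([s, np.ones((s.shape[0], 1))])

def design_quadratic(s):
    cross = np.hstack([s[:, i:i + 1] * s[:, j:j + 1]
                       for i in range(s.shape[1]) for j in range(i, s.shape[1])])
    return np.hstack([s, cross, np.ones((s.shape[0], 1))])

for name, design in [("linear", design_linear), ("second-order", design_quadratic)]:
    A = design(shifts)
    coef, *_ = np.linalg.lstsq(A, forces, rcond=None)
    rms = np.sqrt(np.mean((A @ coef - forces) ** 2))
    print(f"{name:>12} fit: rms residual {rms:.4f}")
```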

  11. Computable visually observed phenotype ontological framework for plants

    PubMed Central

    2011-01-01

    Background The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966

  12. Computational simulations of supersonic magnetohydrodynamic flow control, power and propulsion systems

    NASA Astrophysics Data System (ADS)

    Wan, Tian

    This work is motivated by the lack of a fully coupled computational tool that successfully solves the turbulent, chemically reacting Navier-Stokes equations, the electron energy conservation equation and the electric current Poisson equation. In the present work, the abovementioned equations are solved in a fully coupled manner using fully implicit parallel GMRES methods. The system of Navier-Stokes equations is solved using a GMRES method with combined Schwarz and ILU(0) preconditioners. The electron energy equation and the electric current Poisson equation are solved using a GMRES method with combined SOR and Jacobi preconditioners. The fully coupled method has also been implemented successfully in an unstructured solver, US3D, and convergence test results are presented. This new method is shown to be two to five times faster than the original DPLR method. The Poisson solver is validated with analytic test problems. Four problems are then selected; two of them are computed to explore the possibility of onboard MHD control and power generation, and the other two are simulations of experiments. First, the possibility of onboard re-entry shock control by a magnetic field is explored. As part of a previous project, MHD power generation onboard a re-entry vehicle is also simulated. Then, the MHD acceleration experiments conducted at NASA Ames Research Center are simulated. Lastly, the MHD power generation experiments known as the HVEPS project are simulated; for code validation, the scramjet experiments at the University of Queensland are simulated first, and the generator section of the HVEPS test facility is then computed. The main conclusion is that the computational tool is accurate for different types of problems and flow conditions, and its accuracy and efficiency are necessary as the flow complexity increases.
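    A minimal sketch of the linear-algebra kernel named in the abstract (GMRES with an incomplete-LU preconditioner), shown here with SciPy on a small sparse test matrix rather than the coupled MHD system.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Simple 2D Poisson-like sparse matrix as a stand-in for the implicit system Jacobian.
n = 50
A = sp.diags([-1.0, -1.0, 4.0, -1.0, -1.0], [-n, -1, 0, 1, n],
             shape=(n * n, n * n), format="csc")
b = np.ones(n * n)

# Incomplete LU factorization applied as a preconditioner for GMRES.
ilu = spla.spilu(A, drop_tol=0.0, fill_factor=1)    # roughly ILU(0): no extra fill-in
M = spla.LinearOperator(A.shape, ilu.solve)

x, info = spla.gmres(A, b, M=M, restart=30, atol=1e-8)
residual = np.linalg.norm(b - A @ x)
print("converged" if info == 0 else f"gmres info = {info}", f"residual {residual:.2e}")
```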

  13. Accomplishments and challenges of surgical simulation.

    PubMed

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  14. Advanced Usage of Vehicle Sketch Pad for CFD-Based Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2013-01-01

    Conceptual design is the most fluid phase of aircraft design. It is important to be able to perform large scale design space exploration of candidate concepts that can achieve the design intent to avoid more costly configuration changes in later stages of design. This also means that conceptual design is highly dependent on the disciplinary analysis tools to capture the underlying physics accurately. The required level of analysis fidelity can vary greatly depending on the application. Vehicle Sketch Pad (VSP) allows the designer to easily construct aircraft concepts and make changes as the design matures. More recent development efforts have enabled VSP to bridge the gap to high-fidelity analysis disciplines such as computational fluid dynamics and structural modeling for finite element analysis. This paper focuses on the current state-of-the-art geometry modeling for the automated process of analysis and design of low-boom supersonic concepts using VSP and several capability-enhancing design tools.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    A mechanistic source term (MST) calculation attempts to realistically assess the transport and release of radionuclides from a reactor system to the environment during a specific accident sequence. The U.S. Nuclear Regulatory Commission (NRC) has repeatedly stated its expectation that advanced reactor vendors will utilize an MST during the U.S. reactor licensing process. As part of a project to examine possible impediments to sodium fast reactor (SFR) licensing in the U.S., an analysis was conducted regarding the current capabilities to perform an MST for a metal fuel SFR. The purpose of the project was to identify and prioritize any gaps in current computational tools, and the associated database, for the accurate assessment of an MST. The results of the study demonstrate that an SFR MST is possible with current tools and data, but several gaps exist that may lead to unacceptable levels of uncertainty, depending on the goals of the MST analysis.

  16. Electronics Shielding and Reliability Design Tools

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; O'Neill, P. M.; Zang, Thomas A., Jr.; Pandolf, John E.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.

    2006-01-01

    It is well known that electronics placement in large-scale human-rated systems provides opportunity to optimize electronics shielding through materials choice and geometric arrangement. For example, several hundred single event upsets (SEUs) occur within the Shuttle avionic computers during a typical mission. An order of magnitude larger SEU rate would occur without careful placement in the Shuttle design. These results used basic physics models (linear energy transfer (LET), track structure, Auger recombination) combined with limited SEU cross section measurements allowing accurate evaluation of target fragment contributions to Shuttle avionics memory upsets. Electronics shielding design on human-rated systems provides opportunity to minimize radiation impact on critical and non-critical electronic systems. Implementation of shielding design tools requires adequate methods for evaluation of design layouts, guiding qualification testing, and an adequate follow-up on final design evaluation including results from a systems/device testing program tailored to meet design requirements.

  17. extrap: Software to assist the selection of extrapolation methods for moving-boat ADCP streamflow measurements

    USGS Publications Warehouse

    Mueller, David S.

    2013-01-01

    profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user with tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers’ software.
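    A minimal sketch of the power velocity distribution law fit that underlies this kind of extrapolation: the exponent is estimated by least squares on log-transformed, normalized depth and velocity data and then used to estimate the unmeasured near-surface and near-bed discharge fractions. The sample profile values are made up for illustration; this is not the extrap code itself.

      # Illustrative only: fit v = a * (z/D)**b to normalized profile data and
      # use the fitted power law to estimate unmeasured top/bottom fractions.
      import numpy as np

      # Normalized height above the bed (z/D) and normalized velocity for the
      # measured portion of the water column (hypothetical values).
      z_norm = np.array([0.20, 0.30, 0.40, 0.50, 0.60, 0.70, 0.80])
      v_norm = np.array([0.78, 0.85, 0.90, 0.94, 0.97, 1.00, 1.02])

      # Least-squares fit in log space: log v = log a + b log z
      b, log_a = np.polyfit(np.log(z_norm), np.log(v_norm), 1)
      a = np.exp(log_a)
      print(f"fitted exponent b = {b:.3f} (the classic 1/6 value is ~0.167)")

      # With v = a*z**b, discharge per unit width over [z1, z2] is
      # a/(b+1) * (z2**(b+1) - z1**(b+1)); use this to split the water column.
      def q(z1, z2):
          return a / (b + 1.0) * (z2 ** (b + 1) - z1 ** (b + 1))

      z_bottom, z_top = z_norm[0], z_norm[-1]      # edges of the measured zone
      q_total = q(0.0, 1.0)
      print(f"unmeasured bottom fraction: {q(0.0, z_bottom) / q_total:.1%}")
      print(f"unmeasured top fraction:    {q(z_top, 1.0) / q_total:.1%}")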

  18. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project; it is defined in a detailed technical specification and is accompanied by an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply to the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about the exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
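    The sketch below illustrates the overall shape of such a description: a model, a time-course simulation, a task tying them together, and an output, generated with the Python standard library. The element names follow the SED-ML Level 1 Version 1 concepts as described above, but namespace declarations and several required attributes are omitted, so this is a structural illustration rather than a schema-valid SED-ML file; consult the specification for the authoritative format.

      # Structural sketch of a SED-ML-style experiment description.
      import xml.etree.ElementTree as ET

      root = ET.Element("sedML", attrib={"level": "1", "version": "1"})

      models = ET.SubElement(root, "listOfModels")
      ET.SubElement(models, "model", id="model1",
                    language="urn:sedml:language:sbml", source="oscillator.xml")

      sims = ET.SubElement(root, "listOfSimulations")
      ET.SubElement(sims, "uniformTimeCourse", id="sim1",
                    initialTime="0", outputStartTime="0",
                    outputEndTime="100", numberOfPoints="1000")

      tasks = ET.SubElement(root, "listOfTasks")
      ET.SubElement(tasks, "task", id="task1",
                    modelReference="model1", simulationReference="sim1")

      outputs = ET.SubElement(root, "listOfOutputs")
      ET.SubElement(outputs, "plot2D", id="plot1", name="time course")

      print(ET.tostring(root, encoding="unicode"))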

  19. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, mascons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to run on any engineer's desktop computer.
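    A minimal sketch (not G-CAT itself) of the core operation a linearized-covariance tool performs: propagating an error covariance through linearized dynamics and reading off an error ellipse from the result, here for a toy two-state example with assumed dynamics and noise levels.

      # Illustrative linear covariance propagation, P_{k+1} = F P F^T + Q,
      # followed by a 3-sigma error-ellipse extraction.
      import numpy as np

      dt = 1.0
      F = np.array([[1.0, dt],           # toy position/velocity dynamics (assumed)
                    [0.0, 1.0]])
      Q = np.diag([1e-4, 1e-3])          # process noise covariance (assumed)
      P = np.diag([1.0, 0.25])           # initial knowledge covariance (assumed)

      for _ in range(100):               # propagate 100 steps with no measurements
          P = F @ P @ F.T + Q

      # 3-sigma error ellipse from the eigen-decomposition of the covariance
      eigvals, eigvecs = np.linalg.eigh(P)
      print("3-sigma semi-axes:", 3.0 * np.sqrt(eigvals))
      print("principal directions:\n", eigvecs)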

  20. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  1. Efficiently accounting for ion correlations in electrokinetic nanofluidic devices using density functional theory.

    PubMed

    Gillespie, Dirk; Khair, Aditya S; Bardhan, Jaydeep P; Pennathur, Sumita

    2011-07-15

    The electrokinetic behavior of nanofluidic devices is dominated by the electrical double layers at the device walls. Therefore, accurate, predictive models of double layers are essential for device design and optimization. In this paper, we demonstrate that density functional theory (DFT) of electrolytes is an accurate and computationally efficient method for computing finite ion size effects and the resulting ion-ion correlations that are neglected in classical double layer theories such as Poisson-Boltzmann. Because DFT is derived from liquid-theory thermodynamic principles, it is ideal for nanofluidic systems with small spatial dimensions, high surface charge densities, high ion concentrations, and/or large ions. Ion-ion correlations are expected to be important in these regimes, leading to nonlinear phenomena such as charge inversion, wherein more counterions adsorb at the wall than is necessary to neutralize its surface charge, leading to a second layer of co-ions. We show that DFT, unlike other theories that do not include ion-ion correlations, can predict charge inversion and other nonlinear phenomena that lead to qualitatively different current densities and ion velocities for both pressure-driven and electro-osmotic flows. We therefore propose that DFT can be a valuable modeling and design tool for nanofluidic devices as they become smaller and more highly charged. Copyright © 2011 Elsevier Inc. All rights reserved.
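    For context, here is a minimal sketch of the classical Poisson-Boltzmann baseline that the paper compares against: the 1-D nonlinear PB equation for a charged wall, written in reduced units (potential in kT/e, distance in Debye lengths) and solved with SciPy's boundary-value solver. The wall potential and domain size are arbitrary illustrative choices, and the DFT approach itself involves considerably more machinery than this.

      # Illustrative 1-D Poisson-Boltzmann solve: psi'' = sinh(psi), with
      # psi = psi_wall at x = 0 and psi -> 0 far from the wall.
      import numpy as np
      from scipy.integrate import solve_bvp

      psi_wall = 3.0          # wall potential in units of kT/e (assumed)
      L = 10.0                # domain length in Debye lengths (assumed "far away")

      def rhs(x, y):
          # y[0] = psi, y[1] = dpsi/dx
          return np.vstack([y[1], np.sinh(y[0])])

      def bc(ya, yb):
          return np.array([ya[0] - psi_wall, yb[0]])

      x = np.linspace(0.0, L, 200)
      y0 = np.zeros((2, x.size))
      y0[0] = psi_wall * np.exp(-x)            # Debye-Hueckel initial guess
      sol = solve_bvp(rhs, bc, x, y0)

      print("converged:", sol.status == 0)
      print("potential one Debye length from the wall:", sol.sol(1.0)[0])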

  2. Hybrid CFD/CAA Modeling for Liftoff Acoustic Predictions

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Liever, Peter A.

    2011-01-01

    This paper presents development efforts at the NASA Marshall Space Flight Center to establish a hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) simulation system for launch vehicle liftoff acoustics environment analysis. Acoustic prediction engineering tools based on empirical jet acoustic strength and directivity models or scaled historical measurements are of limited value in efforts to proactively design and optimize launch vehicles and launch facility configurations for liftoff acoustics. CFD-based modeling approaches are now able to capture the important details of the vehicle-specific plume flow environment, identify the noise generation sources, and allow assessment of the influence of launch pad geometric details and sound mitigation measures such as water injection. However, CFD methodologies are numerically too dissipative to accurately capture the propagation of the acoustic waves in the large CFD models. The hybrid CFD/CAA approach combines the high-fidelity CFD analysis capable of identifying the acoustic sources with a fast and efficient Boundary Element Method (BEM) that accurately propagates the acoustic field from the source locations. The BEM approach was chosen for its ability to properly account for reflections and scattering of acoustic waves from launch pad structures. The paper will present an overview of the technology components of the CFD/CAA framework and discuss plans for demonstration and validation against test data.

  3. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering

    PubMed Central

    2012-01-01

    Background Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Results Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike sorting algorithms. Conclusions This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces. PMID:22871125
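    A minimal sketch of the two stages named above, SVD-based feature extraction followed by fuzzy C-means clustering, applied to synthetic spike waveforms. It is written in NumPy rather than LabVIEW and is not the FSPS implementation; the waveform shapes, noise level, and number of clusters are illustrative assumptions.

      # Illustrative spike sorting pipeline: project waveforms onto their leading
      # SVD components, then cluster the projections with fuzzy C-means.
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 48)

      # Two synthetic spike templates plus noise (hypothetical data)
      templates = np.vstack([np.exp(-((t - 0.3) / 0.05) ** 2),
                             -0.8 * np.exp(-((t - 0.5) / 0.08) ** 2)])
      labels_true = rng.integers(0, 2, size=300)
      waveforms = templates[labels_true] + 0.05 * rng.standard_normal((300, t.size))

      # SVD feature extraction: keep the first two right singular vectors
      centered = waveforms - waveforms.mean(axis=0)
      _, _, Vt = np.linalg.svd(centered, full_matrices=False)
      features = centered @ Vt[:2].T          # shape (n_spikes, 2)

      def fuzzy_c_means(X, c=2, m=2.0, n_iter=100):
          """Plain fuzzy C-means; returns cluster centers and memberships."""
          u = rng.random((X.shape[0], c))
          u /= u.sum(axis=1, keepdims=True)
          for _ in range(n_iter):
              um = u ** m
              centers = (um.T @ X) / um.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
              u = 1.0 / (d ** (2.0 / (m - 1.0)))
              u /= u.sum(axis=1, keepdims=True)
          return centers, u

      centers, u = fuzzy_c_means(features)
      labels = u.argmax(axis=1)
      print("cluster sizes:", np.bincount(labels))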

  4. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering.

    PubMed

    Oliynyk, Andriy; Bonifazzi, Claudio; Montani, Fernando; Fadiga, Luciano

    2012-08-08

    Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike sorting algorithms. This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces.

  5. Computational Intelligence in Early Diabetes Diagnosis: A Review

    PubMed Central

    Shankaracharya; Odedra, Devang; Samanta, Subir; Vidyarthi, Ambarish S.

    2010-01-01

    The development of an effective diabetes diagnosis system by taking advantage of computational intelligence is regarded as a primary goal nowadays. Many approaches based on artificial neural networks and machine learning algorithms have been developed and tested against diabetes datasets, which were mostly related to individuals of Pima Indian origin. Yet, despite high accuracies of up to 99% in predicting the correct diabetes diagnosis, none of these approaches have reached clinical application so far. One reason for this failure may be that diabetologists or clinical investigators are sparsely informed about, or trained in the use of, computational diagnosis tools. Therefore, this article aims at sketching out an outline of the wide range of options, recent developments, and potentials in machine learning algorithms as diabetes diagnosis tools. One focus is on supervised and unsupervised methods, which have made significant impacts in the detection and diagnosis of diabetes at primary and advanced stages. Particular attention is paid to algorithms that show promise in improving diabetes diagnosis. A key advance has been the development of a more in-depth understanding and theoretical analysis of critical issues related to algorithmic construction and learning theory. These include trade-offs for maximizing generalization performance, use of physically realistic constraints, and incorporation of prior knowledge and uncertainty. The review presents and explains the most accurate algorithms, and discusses advantages and pitfalls of methodologies. This should provide a good resource for researchers from all backgrounds interested in computational intelligence-based diabetes diagnosis methods, and allows them to extend their knowledge into this kind of research. PMID:21713313
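    As a concrete illustration of the kind of supervised pipeline the review surveys, the sketch below trains and cross-validates a logistic-regression classifier on the Pima Indians diabetes dataset with scikit-learn. The file name and column layout are assumptions (the dataset is commonly distributed as a CSV with eight feature columns and an outcome column); this is not code from the review.

      # Illustrative diabetes-diagnosis baseline: standardize features, fit
      # logistic regression, and report cross-validated accuracy.
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Assumed local copy of the Pima Indians diabetes dataset: 8 feature
      # columns followed by a 0/1 "Outcome" column.
      data = pd.read_csv("pima_indians_diabetes.csv")
      X = data.drop(columns=["Outcome"])
      y = data["Outcome"]

      model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
      print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")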

  6. Computational intelligence in early diabetes diagnosis: a review.

    PubMed

    Shankaracharya; Odedra, Devang; Samanta, Subir; Vidyarthi, Ambarish S

    2010-01-01

    The development of an effective diabetes diagnosis system by taking advantage of computational intelligence is regarded as a primary goal nowadays. Many approaches based on artificial neural networks and machine learning algorithms have been developed and tested against diabetes datasets, which were mostly related to individuals of Pima Indian origin. Yet, despite high accuracies of up to 99% in predicting the correct diabetes diagnosis, none of these approaches have reached clinical application so far. One reason for this failure may be that diabetologists or clinical investigators are sparsely informed about, or trained in the use of, computational diagnosis tools. Therefore, this article aims at sketching out an outline of the wide range of options, recent developments, and potentials in machine learning algorithms as diabetes diagnosis tools. One focus is on supervised and unsupervised methods, which have made significant impacts in the detection and diagnosis of diabetes at primary and advanced stages. Particular attention is paid to algorithms that show promise in improving diabetes diagnosis. A key advance has been the development of a more in-depth understanding and theoretical analysis of critical issues related to algorithmic construction and learning theory. These include trade-offs for maximizing generalization performance, use of physically realistic constraints, and incorporation of prior knowledge and uncertainty. The review presents and explains the most accurate algorithms, and discusses advantages and pitfalls of methodologies. This should provide a good resource for researchers from all backgrounds interested in computational intelligence-based diabetes diagnosis methods, and allows them to extend their knowledge into this kind of research.

  7. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
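    A minimal back-of-the-envelope sketch of the vortex-shedding concern described above: the shedding frequency follows the Strouhal relation f = St * U / D, and trouble arises when it approaches a lightly damped structural mode. The Strouhal number, diameter, and mode frequency below are assumed illustrative values, not data from the NASA tool.

      # Illustrative vortex-shedding check: compare the Strouhal shedding
      # frequency with a structural mode frequency over a range of wind speeds.
      import numpy as np

      St = 0.2            # Strouhal number for a circular cylinder (typical value)
      D = 5.0             # vehicle diameter, m (assumed)
      f_mode = 0.8        # first bending-mode frequency, Hz (assumed)

      U = np.linspace(1.0, 30.0, 30)      # ground wind speeds, m/s
      f_shed = St * U / D

      # Flag speeds where shedding falls within +/-10% of the structural mode
      lock_in = np.abs(f_shed - f_mode) < 0.1 * f_mode
      print("potential lock-in wind speeds (m/s):", U[lock_in])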

  8. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) provide intense, coherent, and bright radiation in the hard X-ray regime. Because of the low wall-plug efficiency of FEL facilities, it is crucial to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on the finite-difference time-domain/particle-in-cell (FDTD/PIC) method in a Lorentz-boosted coordinate system, which is able to perform a full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength, and the undulator period to the same order of magnitude. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible with the available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
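    The benefit of the boosted frame can be seen with a small worked example: in the laboratory frame the undulator period (centimeters) and the radiated wavelength (set by the FEL resonance condition) differ by many orders of magnitude, while in the bunch rest frame both appear at roughly the contracted undulator scale. The beam and undulator parameters below are generic assumed values, not MITHRA inputs.

      # Illustrative scale comparison for a Lorentz boost to the bunch rest frame.
      import math

      gamma = 1000.0          # beam Lorentz factor (assumed)
      K = 1.0                 # undulator parameter (assumed)
      lambda_u = 0.03         # undulator period in the lab frame, m (assumed)

      # Lab frame: FEL resonance wavelength
      lambda_r = lambda_u / (2.0 * gamma**2) * (1.0 + K**2 / 2.0)

      # Bunch rest frame: the undulator period Lorentz-contracts by 1/gamma,
      # while the co-propagating radiation is Doppler-shifted to roughly
      # 2*gamma times its lab wavelength, so the two scales become comparable.
      lambda_u_bunch = lambda_u / gamma
      lambda_r_bunch = 2.0 * gamma * lambda_r

      print(f"lab frame:   lambda_u = {lambda_u:.2e} m, lambda_r = {lambda_r:.2e} m")
      print(f"bunch frame: lambda_u' = {lambda_u_bunch:.2e} m, lambda_r' ~ {lambda_r_bunch:.2e} m")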

  9. Six Suggestions for Research on Games in Cognitive Science.

    PubMed

    Chabris, Christopher F

    2017-04-01

    Games are more varied and occupy more of daily life than ever before. At the same time, the tools available to study game play and players are more powerful than ever, especially massive data sets from online platforms and computational engines that can accurately evaluate human decisions. This essay offers six suggestions for future cognitive science research on games: (1) Don't forget about chess, (2) Look beyond action games and chess, (3) Use (near)-optimal play to understand human play and players, (4) Investigate social phenomena, (5) Raise the standards for studies of games as treatments, (6) Talk to real experts. Copyright © 2017 Cognitive Science Society, Inc.

  10. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
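    A minimal sketch of grid-based Bayesian parameter estimation of the kind described: a Gaussian (chi-square) likelihood is evaluated for every synthetic spectrum in a model grid, multiplied by a flat prior, normalized into a posterior, and marginalized over one parameter. The toy model spectra, grid, and noise level below are synthetic stand-ins, not TLUSTY output or the authors' pipeline.

      # Illustrative grid-based Bayesian fit: posterior over (Teff, log g) from
      # a chi-square likelihood against a grid of model spectra.
      import numpy as np

      rng = np.random.default_rng(1)
      wave = np.linspace(4000.0, 4500.0, 200)          # wavelengths, Angstroms

      def model_spectrum(teff, logg):
          """Toy stand-in for a synthetic spectrum: one absorption line whose
          depth and width depend on the parameters."""
          depth = 0.5 * (teff / 20000.0)
          width = 2.0 + 3.0 * (logg - 3.0)
          return 1.0 - depth * np.exp(-0.5 * ((wave - 4250.0) / width) ** 2)

      # "Observed" spectrum: a grid point plus noise (hypothetical)
      sigma = 0.02
      obs = model_spectrum(18000.0, 4.0) + sigma * rng.standard_normal(wave.size)

      teff_grid = np.linspace(14000.0, 24000.0, 41)
      logg_grid = np.linspace(3.0, 4.5, 31)

      log_post = np.empty((teff_grid.size, logg_grid.size))
      for i, teff in enumerate(teff_grid):
          for j, logg in enumerate(logg_grid):
              chi2 = np.sum((obs - model_spectrum(teff, logg)) ** 2) / sigma**2
              log_post[i, j] = -0.5 * chi2            # flat prior

      post = np.exp(log_post - log_post.max())
      post /= post.sum()

      teff_marginal = post.sum(axis=1)                 # marginalize over log g
      best = teff_grid[np.argmax(teff_marginal)]
      print(f"marginal posterior peaks at Teff ~ {best:.0f} K")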

  11. The role of positron emission tomography in the diagnosis, staging and response assessment of non-small cell lung cancer

    PubMed Central

    Ali, Jason M.; Tasker, Angela; Peryt, Adam; Aresu, Giuseppe; Coonar, Aman S.

    2018-01-01

    Lung cancer is a common disease and the leading cause of cancer-related mortality, with non-small cell lung cancer (NSCLC) accounting for the majority of cases. Following diagnosis of lung cancer, accurate staging is essential to guide clinical management and inform prognosis. Positron emission tomography (PET) in conjunction with computed tomography (CT), known as PET-CT, has developed into an important tool in the multi-disciplinary management of lung cancer. This article will review the current evidence for the role of 18F-fluorodeoxyglucose (FDG) PET-CT in NSCLC diagnosis, staging, response assessment and follow up. PMID:29666818

  12. Study of Some Planetary Atmospheres Features by Probe Entry and Descent Simulations

    NASA Technical Reports Server (NTRS)

    Gil, P. J. S.; Rosa, P. M. B.

    2005-01-01

    Planetary atmospheres are characterized through their effects on the entry and descent trajectories of probes. Emphasis is on the most important variables that characterize atmospheres, e.g., the density profile with altitude. Probe trajectories are numerically determined with ENTRAP, a developing multi-purpose computational tool for entry and descent trajectory simulations capable of taking into account many features and perturbations. Real data from the Mars Pathfinder mission are used. The goal is to be able to determine the atmosphere structure more accurately by observing real trajectories and to establish what changes are to be expected in probe descent trajectories if atmospheres have different properties than the ones assumed initially.
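    A minimal sketch of the kind of trajectory integration involved: a planar, non-lifting entry with an exponential density profile, integrated with SciPy. The vehicle and atmosphere parameters are generic assumptions (planetary curvature is neglected), not ENTRAP inputs or Mars Pathfinder data.

      # Illustrative planar entry trajectory with an exponential atmosphere.
      import numpy as np
      from scipy.integrate import solve_ivp

      rho0, H = 0.02, 11000.0        # surface density (kg/m^3), scale height (m)
      g = 3.7                        # Mars-like gravity, m/s^2 (assumed)
      beta = 65.0                    # ballistic coefficient m/(Cd*A), kg/m^2

      def rhs(t, y):
          v, gamma, h = y                      # speed, flight-path angle, altitude
          rho = rho0 * np.exp(-h / H)          # exponential atmosphere model
          dv = -rho * v**2 / (2.0 * beta) - g * np.sin(gamma)
          dgamma = -g * np.cos(gamma) / v      # curvature term neglected
          dh = v * np.sin(gamma)
          return [dv, dgamma, dh]

      y0 = [5500.0, np.radians(-14.0), 125e3]  # entry speed, angle, altitude (assumed)
      hit_ground = lambda t, y: y[2]
      hit_ground.terminal = True

      sol = solve_ivp(rhs, (0.0, 600.0), y0, events=hit_ground, max_step=1.0)
      peak_decel = np.max(-np.gradient(sol.y[0], sol.t)) / 9.81
      print(f"descent lasts {sol.t[-1]:.0f} s, peak deceleration ~ {peak_decel:.1f} g")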

  13. Specialized CFD Grid Generation Methods for Near-Field Sonic Boom Prediction

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Campbell, Richard L.; Elmiligui, Alaa; Cliff, Susan E.; Nayani, Sudheer N.

    2014-01-01

    Ongoing interest in analysis and design of low sonic boom supersonic transports requires accurate and efficient Computational Fluid Dynamics (CFD) tools. Specialized grid generation techniques are employed to predict near-field acoustic signatures of these configurations. A fundamental examination of grid properties is performed, including grid alignment with flow characteristics and element type. The issues affecting the robustness of cylindrical surface extrusion are illustrated. This study will compare three methods in the extrusion family of grid generation methods that produce grids aligned with the freestream Mach angle. These methods are applied to configurations from the First AIAA Sonic Boom Prediction Workshop.
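    The Mach-angle alignment mentioned above follows from mu = arcsin(1/M); a trivial illustration of the angles involved (the Mach numbers are just example values):

      # Freestream Mach angle mu = arcsin(1/M), the angle the extruded grid
      # lines are aligned with for near-field boom propagation.
      import math

      for mach in (1.4, 1.6, 2.0, 3.0):
          mu = math.degrees(math.asin(1.0 / mach))
          print(f"M = {mach:.1f}  ->  Mach angle = {mu:.1f} deg")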

  14. NASA's Aviation Safety and Modeling Project

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Statler, Irving C.

    2006-01-01

    The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to provide means for increasing safety by enabling the identification and correction of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.

  15. Validation of catchment models for predicting land-use and climate change impacts. 1. Method

    NASA Astrophysics Data System (ADS)

    Ewen, J.; Parkin, G.

    1996-02-01

    Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new generally applicable method is proposed. This involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).

  16. A proposal for a CT driven classification of left colon acute diverticulitis.

    PubMed

    Sartelli, Massimo; Moore, Frederick A; Ansaloni, Luca; Di Saverio, Salomone; Coccolini, Federico; Griffiths, Ewen A; Coimbra, Raul; Agresta, Ferdinando; Sakakushev, Boris; Ordoñez, Carlos A; Abu-Zidan, Fikri M; Karamarkovic, Aleksandar; Augustin, Goran; Costa Navarro, David; Ulrych, Jan; Demetrashvili, Zaza; Melo, Renato B; Marwah, Sanjay; Zachariah, Sanoop K; Wani, Imtiaz; Shelat, Vishal G; Kim, Jae Il; McFarlane, Michael; Pintar, Tadaja; Rems, Miran; Bala, Miklosh; Ben-Ishay, Offir; Gomes, Carlos Augusto; Faro, Mario Paulo; Pereira, Gerson Alves; Catani, Marco; Baiocchi, Gianluca; Bini, Roberto; Anania, Gabriele; Negoi, Ionut; Kecbaja, Zurabs; Omari, Abdelkarim H; Cui, Yunfeng; Kenig, Jakub; Sato, Norio; Vereczkei, Andras; Skrovina, Matej; Das, Koray; Bellanova, Giovanni; Di Carlo, Isidoro; Segovia Lohse, Helmut A; Kong, Victor; Kok, Kenneth Y; Massalou, Damien; Smirnov, Dmitry; Gachabayov, Mahir; Gkiokas, Georgios; Marinis, Athanasios; Spyropoulos, Charalampos; Nikolopoulos, Ioannis; Bouliaris, Konstantinos; Tepp, Jaan; Lohsiriwat, Varut; Çolak, Elif; Isik, Arda; Rios-Cruz, Daniel; Soto, Rodolfo; Abbas, Ashraf; Tranà, Cristian; Caproli, Emanuele; Soldatenkova, Darija; Corcione, Francesco; Piazza, Diego; Catena, Fausto

    2015-01-01

    Computed tomography (CT) imaging is the most appropriate diagnostic tool to confirm suspected left colonic diverticulitis. However, the utility of CT imaging goes beyond accurate diagnosis of diverticulitis; the grade of severity on CT imaging may drive treatment planning of patients presenting with acute diverticulitis. The appropriate management of left colon acute diverticulitis remains debated because of the vast spectrum of clinical presentations and the different treatment approaches proposed. The authors present a new, simple classification system based on CT scan findings to guide decision making in the management of acute diverticulitis, one that may be universally accepted for day-to-day practice.

  17. Space Flight Operations Center local area network

    NASA Technical Reports Server (NTRS)

    Goodman, Ross V.

    1988-01-01

    The existing Mission Control and Computer Center at JPL will be replaced by the Space Flight Operations Center (SFOC). One part of the SFOC is the LAN-based distribution system. The purpose of the LAN is to distribute the processed data among the various elements of the SFOC. The SFOC LAN will provide a robust subsystem that will support the Magellan launch configuration and future project adaptation. Its capabilities include (1) a proven cable medium as the backbone for the entire network; (2) hardware components that are reliable, varied, and follow OSI standards; (3) accurate and detailed documentation for fault isolation and future expansion; and (4) proven monitoring and maintenance tools.

  18. Teens, technology, and health care.

    PubMed

    Leanza, Francesco; Hauser, Diane

    2014-09-01

    Teens are avid users of new technologies and social media. Nearly 95% of US adolescents are online at least occasionally. Health care professionals and organizations that work with teens should identify online health information that is both accurate and teen friendly. Early studies indicate that some of the new health technology tools are acceptable to teens, particularly texting, computer-based psychosocial screening, and online interventions. Technology is being used to provide sexual health education, medication reminders for contraception, and information on locally available health care services. This article reviews early and emerging studies of technology use to promote teen health. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from that which normally exists in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have a mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  20. Computer Simulations of Intrinsically Disordered Proteins

    NASA Astrophysics Data System (ADS)

    Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun

    2017-05-01

    The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.
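    As one concrete example of the configurational-entropy estimators this literature discusses, the sketch below implements a quasi-harmonic estimate: quasi-harmonic frequencies are obtained from the eigenvalues of the mass-weighted covariance matrix of atomic fluctuations, and each mode contributes a quantum harmonic-oscillator entropy. The "trajectory" here is random synthetic data, and this particular estimator is only one of several approaches surveyed, not necessarily the one the review recommends.

      # Illustrative quasi-harmonic configurational entropy from a synthetic
      # ensemble of Cartesian coordinates.
      import numpy as np

      kB = 1.380649e-23        # J/K
      hbar = 1.054571817e-34   # J*s
      T = 300.0                # K
      amu = 1.66053906660e-27  # kg

      rng = np.random.default_rng(0)
      n_frames, n_atoms = 2000, 10
      masses = np.full(n_atoms, 12.0) * amu          # carbon-like atoms (assumed)

      # Synthetic "trajectory": correlated Gaussian fluctuations about a mean (meters)
      coords = 1e-10 * rng.multivariate_normal(
          mean=np.zeros(3 * n_atoms),
          cov=0.05 * np.eye(3 * n_atoms) + 0.02,
          size=n_frames)

      # Mass-weighted covariance of the fluctuations
      w = np.sqrt(np.repeat(masses, 3))
      fluct = (coords - coords.mean(axis=0)) * w
      cov = np.cov(fluct, rowvar=False)

      # Quasi-harmonic mode frequencies and harmonic-oscillator entropies
      lam = np.linalg.eigvalsh(cov)
      lam = lam[lam > 1e-60]                     # discard near-zero modes
      omega = np.sqrt(kB * T / lam)
      x = hbar * omega / (kB * T)
      S = kB * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))
      print(f"quasi-harmonic configurational entropy: {S:.3e} J/K")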
