Sample records for small tool computer

  1. Evaluating Gaze-Based Interface Tools to Facilitate Point-and-Select Tasks with Small Targets

    ERIC Educational Resources Information Center

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. less than 12 x 12 pixels) point-and-select tasks. We conducted two…

  2. Proposed method of producing large optical mirrors Single-point diamond crushing followed by polishing with a small-area tool

    NASA Technical Reports Server (NTRS)

    Wright, G.; Bryan, J. B.

    1986-01-01

    Faster production of large optical mirrors may result from combining single-point diamond crushing of the glass with polishing using a small area tool to smooth the surface and remove the damaged layer. Diamond crushing allows a surface contour accurate to 0.5 microns to be generated, and the small area computer-controlled polishing tool allows the surface roughness to be removed without destroying the initial contour. Final contours with an accuracy of 0.04 microns have been achieved.

  3. IMS - MS Data Extractor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    An automated software tool that extracts drift times and computes associated collision cross sections for small-molecule analysis with ion mobility spectrometry-mass spectrometry (IMS-MS). The software automatically extracts drift times and computes associated collision cross sections for small molecules analyzed using IMS-MS, based on a target list of expected ions provided by the user.
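
    The drift-time-to-CCS conversion that such a tool automates is typically the Mason-Schamp relation. Below is a minimal, illustrative Python sketch of that conversion; it is not the OSTI tool's actual code, and the instrument parameters (drift length, voltage, gas, temperature, pressure) are hypothetical placeholders.

```python
import math
from scipy import constants as c

def ccs_mason_schamp(t_drift_s, z, m_ion_da, m_gas_da=28.0,
                     L_m=0.9, V=3000.0, T_K=300.0, p_Pa=530.0):
    """Collision cross section (m^2) from drift time via Mason-Schamp.

    All instrument parameters here are illustrative placeholders.
    """
    K = L_m**2 / (V * t_drift_s)       # ion mobility, m^2/(V*s)
    N = p_Pa / (c.k * T_K)             # buffer gas number density, m^-3
    mu = (m_ion_da * m_gas_da / (m_ion_da + m_gas_da)) * c.atomic_mass  # reduced mass, kg
    return (3.0 * z * c.e / (16.0 * N)) * math.sqrt(2.0 * math.pi / (mu * c.k * T_K)) / K

# Example: a singly charged 500 Da ion with a 25 ms drift time
print(ccs_mason_schamp(25e-3, z=1, m_ion_da=500.0))  # CCS in m^2; divide by 1e-20 for A^2
```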

  4. INTEGRATION OF POLLUTION PREVENTION TOOLS

    EPA Science Inventory

    A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...

  5. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background: Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results: The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion: The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
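
    As a concrete illustration of the kind of test these programs perform, the sketch below fits a two-locus genotype model and F-tests the SNP × SNP interaction with statsmodels on synthetic data. It is a generic two-way ANOVA analogue, not the extended Kempthorne model or the EPISNP code itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 2000
snp1 = rng.integers(0, 3, n)          # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)
# Synthetic trait with a small epistatic (interaction) signal
trait = 0.3 * snp1 + 0.2 * snp2 + 0.25 * (snp1 == 2) * (snp2 == 2) + rng.normal(size=n)
df = pd.DataFrame({"trait": trait, "snp1": snp1, "snp2": snp2})

# F-tests for the single-locus genotypic effects and the two-locus interaction
model = smf.ols("trait ~ C(snp1) * C(snp2)", data=df).fit()
print(anova_lm(model, typ=2))
```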

  6. DECUS Proceedings; Fall 1971, Papers and Presentations.

    ERIC Educational Resources Information Center

    1971

    Papers and presentations at the 1971 symposium of the Digital Equipment Computer Users Society (DECUS) are presented. The papers deal with medical and physiological applications, computer graphics, simulation education, small computer executive systems, management information tools, data acquisition systems, and high level languages. Although many…

  7. Computational Prediction of miRNA Genes from Small RNA Sequencing Data

    PubMed Central

    Kang, Wenjing; Friedländer, Marc R.

    2015-01-01

    Next-generation sequencing now for the first time allows researchers to gauge the depth and variation of entire transcriptomes. However, as rare transcripts present in cells at single copies can now be detected, more advanced computational tools are needed to accurately annotate and profile them. microRNAs (miRNAs) are 22 nucleotide small RNAs (sRNAs) that post-transcriptionally reduce the output of protein coding genes. They have established roles in numerous biological processes, including cancers and other diseases. During miRNA biogenesis, the sRNAs are sequentially cleaved from precursor molecules that have a characteristic hairpin RNA structure. The vast majority of new miRNA genes that are discovered are mined from small RNA sequencing (sRNA-seq), which can detect more than a billion RNAs in a single run. However, given that many of the detected RNAs are degradation products from all types of transcripts, the accurate identification of miRNAs remains a non-trivial computational problem. Here, we review the tools available to predict animal miRNAs from sRNA sequencing data. We present tools for generalist and specialist use cases, including prediction from massively pooled data or in species without a reference genome. We also present wet-lab methods used to validate predicted miRNAs, and approaches to computationally benchmark prediction accuracy. For each tool, we reference validation experiments and benchmarking efforts. Last, we discuss the future of the field. PMID:25674563
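
    A core step in the reviewed predictors is checking that a candidate precursor can fold back on itself into a hairpin. The sketch below is a deliberately crude, dependency-free stand-in for a real RNA-folding call (e.g., ViennaRNA): it just counts Watson-Crick/GU pairs between the 5' arm and the reversed 3' arm of a let-7-like toy precursor.

```python
def looks_like_hairpin(seq, loop_len=10, min_pair_frac=0.6):
    """Crude hairpin check: pair the 5' arm against the reversed 3' arm."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    arm = (len(seq) - loop_len) // 2
    five, three = seq[:arm], seq[-arm:][::-1]
    paired = sum((a, b) in pairs for a, b in zip(five, three))
    return paired / arm >= min_pair_frac

# Toy precursor: 5' arm, loop, then the arm's reverse complement as the 3' arm
precursor = "UGAGGUAGUAGGUUGUAUAGUU" + "AGGGUCACACCC" + "AACUAUACAACCUACUACCUCA"
print(looks_like_hairpin(precursor))  # True
```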

  8. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    NASA Astrophysics Data System (ADS)

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  9. Small scale sequence automation pays big dividends

    NASA Technical Reports Server (NTRS)

    Nelson, Bill

    1994-01-01

    Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC based sequence review tools evolved. Each tool performs a specific task but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.

  10. THE COMPUTER CONCEPT OF SELF-INSTRUCTIONAL DEVICES.

    ERIC Educational Resources Information Center

    SILBERMAN, HARRY F.

    The computer system concept will be developed in two ways--first, a description will be made of the small computer-based teaching machine which is being used as a research tool; second, a description will be made of the large computer laboratory for automated school systems which are being developed. The first machine consists of three elements--…

  11. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    PubMed

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is provided of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  12. SLUG - stochastically lighting up galaxies - III. A suite of tools for simulated photometry, spectroscopy, and Bayesian inference with stochastic stellar populations

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan

    2015-09-01

    Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly-enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
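
    The stochasticity problem SLUG addresses can be seen in a few lines of numpy: draw stars from a Salpeter IMF until a target cluster mass is reached and compare the luminosity scatter of small versus large clusters. The crude L ∝ M^3.5 mass-luminosity relation below is an illustrative stand-in, not SLUG's actual spectral synthesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_cluster_luminosity(m_target, alpha=2.35, m_min=0.1, m_max=100.0):
    """Draw Salpeter-IMF stars to ~m_target (Msun); return total L (crude L ~ M^3.5)."""
    a, b = m_min ** (1 - alpha), m_max ** (1 - alpha)
    mean_m = 0.35                       # rough mean Salpeter mass, only to size the batch
    n = int(3 * m_target / mean_m) + 10
    m = (a + rng.random(n) * (b - a)) ** (1 / (1 - alpha))  # inverse-CDF draws
    keep = np.cumsum(m) < m_target      # stop filling once the target mass is reached
    return np.sum(m[keep] ** 3.5)

for m_cl in (100.0, 1e4):               # small vs large cluster, in solar masses
    L = [sample_cluster_luminosity(m_cl) for _ in range(200)]
    print(m_cl, np.std(L) / np.mean(L))  # relative scatter shrinks for the large cluster
```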

  13. Development of computational small animal models and their applications in preclinical imaging and therapy research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Tianwu; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch; Geneva Neuroscience Center, Geneva University, Geneva CH-1205

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is provided of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  14. An Interactive, Versatile, Three-Dimensional Display, Manipulation and Plotting System for Biomedical Research

    ERIC Educational Resources Information Center

    Feldmann, Richard J.; And Others

    1972-01-01

    Computer graphics provides a valuable tool for the representation and a better understanding of structures, both small and large. Accurate and rapid construction, manipulation, and plotting of structures, such as macromolecules as complex as hemoglobin, are performed by a collection of computer programs and a time-sharing computer. (21 references)…

  15. Design Tool

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Developed under a Small Business Innovation Research (SBIR) contract, RAMPANT is a CFD software package for computing flow around complex shapes. The package is flexible, fast and easy to use. It has found a great number of applications, including computation of air flow around a Nordic ski jumper, prediction of flow over an airfoil and computation of the external aerodynamics of motor vehicles.

  16. Robotic surgery

    MedlinePlus

    Robot-assisted surgery; Robotic-assisted laparoscopic surgery; Laparoscopic surgery with robotic assistance ... computer station and directs the movements of a robot. Small surgical tools are attached to the robot's ...

  17. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."

  18. Computer Controlled Optical Surfacing With Orbital Tool Motion

    NASA Astrophysics Data System (ADS)

    Jones, Robert A.

    1985-11-01

    Asymmetric aspheric optical surfaces are very difficult to fabricate using classical techniques and laps the same size as the workpiece. Opticians can produce such surfaces by hand grinding and polishing, using small laps with orbital tool motion. However, this is a time consuming process unsuitable for large optical elements.

  19. Efficient hybrid-symbolic methods for quantum mechanical calculations

    NASA Astrophysics Data System (ADS)

    Scott, T. C.; Zhang, Wenxing

    2015-06-01

    We present hybrid symbolic-numerical tools to generate optimized numerical code for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis set calculation with a variational principle applied to its linear and non-linear parameters.
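
    In the same spirit, though far simpler than the paper's tools, a CAS such as SymPy can manipulate a variational energy expression symbolically and then emit fast numerical code. The sketch below uses the textbook hydrogen-atom trial function exp(-a*r), whose closed-form energy in atomic units is E(a) = a^2/2 - a.

```python
import numpy as np
import sympy as sp

a = sp.symbols("a", positive=True)
E = a**2 / 2 - a                      # <H> for trial function exp(-a*r), in hartree
dE = sp.diff(E, a)                    # symbolic derivative from the CAS
print(sp.solve(dE, a))                # exact optimum of the non-linear parameter: [1]

f_E = sp.lambdify(a, E, "numpy")      # generated numerical code for fast evaluation
grid = np.linspace(0.5, 1.5, 101)
best = grid[np.argmin(f_E(grid))]
print(best, f_E(best))                # a ~ 1.0, E ~ -0.5 hartree
```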

  20. Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.

    PubMed

    Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of the predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
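
    The pilot abstraction decouples resource acquisition from task execution. As a rough local stand-in (not the SAGA-Pilot API), Python's concurrent.futures shows the task-level parallelism pattern: one pool of workers acquired once, many independent per-sequence threading tasks. The sequence IDs and the per-task function are hypothetical placeholders.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def thread_sequence(seq_id: str) -> tuple[str, str]:
    """Placeholder for one compute-intensive eThread-style job on one sequence."""
    return seq_id, f"model-for-{seq_id}"

sequences = [f"prot{i:04d}" for i in range(100)]  # hypothetical genome's sequences

if __name__ == "__main__":
    # The pool plays the role of a pilot: acquired once, reused for all tasks.
    with ProcessPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(thread_sequence, s) for s in sequences]
        for fut in as_completed(futures):
            seq_id, model = fut.result()
            print(seq_id, model)
```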

  1. Developing eThread Pipeline Using SAGA-Pilot Abstraction for Large-Scale Structural Bioinformatics

    PubMed Central

    Ragothaman, Anjani; Feinstein, Wei; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of the predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread—a meta-threading protein structure modeling tool that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure. PMID:24995285

  2. Second-generation DNA-templated macrocycle libraries for the discovery of bioactive small molecules.

    PubMed

    Usanov, Dmitry L; Chan, Alix I; Maianti, Juan Pablo; Liu, David R

    2018-07-01

    DNA-encoded libraries have emerged as a widely used resource for the discovery of bioactive small molecules, and offer substantial advantages compared with conventional small-molecule libraries. Here, we have developed and streamlined multiple fundamental aspects of DNA-encoded and DNA-templated library synthesis methodology, including computational identification and experimental validation of a 20 × 20 × 20 × 80 set of orthogonal codons, chemical and computational tools for enhancing the structural diversity and drug-likeness of library members, a highly efficient polymerase-mediated template library assembly strategy, and library isolation and purification methods. We have integrated these improved methods to produce a second-generation DNA-templated library of 256,000 small-molecule macrocycles with improved drug-like physical properties. In vitro selection of this library for insulin-degrading enzyme affinity resulted in novel insulin-degrading enzyme inhibitors, including one of unusual potency and novel macrocycle stereochemistry (IC50 = 40 nM). Collectively, these developments enable DNA-templated small-molecule libraries to serve as more powerful, accessible, streamlined and cost-effective tools for bioactive small-molecule discovery.
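
    Computational identification of orthogonal codons amounts to picking DNA words that are far apart in sequence space so they do not cross-hybridize. A greedy minimum-Hamming-distance filter, sketched below, captures the core idea; the paper's actual validation also involved experiments, and the length and distance parameters here are illustrative, not the published ones.

```python
from itertools import product

def hamming(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

def greedy_orthogonal_set(length=8, min_dist=4, max_size=80):
    """Greedily collect codons whose pairwise Hamming distance is >= min_dist."""
    chosen = []
    for codon in ("".join(p) for p in product("ACGT", repeat=length)):
        if all(hamming(codon, c) >= min_dist for c in chosen):
            chosen.append(codon)
            if len(chosen) == max_size:
                break
    return chosen

codons = greedy_orthogonal_set()
print(len(codons), codons[:5])
```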

  3. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  4. A Review of the Suitability of Available Computer Aided Software Engineering (CASE) Tools for the Small Software Development Environment

    DTIC Science & Technology

    1989-07-11

    LITERATURE CITED: [Boeh73] Boehm, Barry W., "Software and its Impact: A Quantitative Assessment," Datamation, 19, 5 (May 1973), pp. 48-59. [Boeh76] Boehm, Barry W., "Software Engineering," IEEE Transactions on Computers, C-25, 12 (December 1976), pp. 1226-1241. [Boeh81a] Boehm, Barry W., Software Engineering Economics, Prentice-Hall, Inc., Englewood Cliffs, NJ (1981). [Boeh81b] Boehm, Barry W., "An Experiment in Small Scale Application Software

  5. Scattered Dose Calculations and Measurements in a Life-Like Mouse Phantom

    PubMed Central

    Welch, David; Turner, Leah; Speiser, Michael; Randers-Pehrson, Gerhard; Brenner, David J.

    2017-01-01

    Anatomically accurate phantoms are useful tools for radiation dosimetry studies. In this work, we demonstrate the construction of a new generation of life-like mouse phantoms in which the methods have been generalized to be applicable to the fabrication of any small animal. The mouse phantoms, with built-in density inhomogeneity, exhibit different scattering behavior dependent on where the radiation is delivered. Computer models of the mouse phantoms and a small animal irradiation platform were devised in Monte Carlo N-Particle code (MCNP). A baseline test replicating the irradiation system in a computational model shows minimal differences from experimental results from 50 Gy down to 0.1 Gy. We observe excellent agreement between scattered dose measurements and simulation results from X-ray irradiations focused at either the lung or the abdomen within our phantoms. This study demonstrates the utility of our mouse phantoms as measurement tools with the goal of using our phantoms to verify complex computational models. PMID:28140787

  6. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572

  7. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  8. Developing Strategic and Reasoning Abilities with Computer Games at Primary School Level

    ERIC Educational Resources Information Center

    Bottino, R. M.; Ferlino, L.; Ott, M.; Tavella, M.

    2007-01-01

    The paper reports a small-scale, long-term pilot project designed to foster strategic and reasoning abilities in young primary school pupils by engaging them in a number of computer games, mainly those usually called mind games (brainteasers, puzzlers, etc.). In this paper, the objectives, work methodology, experimental setting, and tools used in…

  9. When "Promotoras" and Technology Meet: A Qualitative Analysis of "Promotoras'" Use of Small Media to Increase Cancer Screening among South Texas Latinos

    ERIC Educational Resources Information Center

    Arvey, Sarah R.; Fernandez, Maria E.; LaRue, Denise M.; Bartholomew, L. Kay

    2012-01-01

    Computer-based multimedia technologies can be used to tailor health messages, but "promotoras" (Spanish-speaking community health workers) rarely use these tools. "Promotoras" delivered health messages about colorectal cancer screening to medically underserved Latinos in South Texas using two small media formats: a…

  10. Computational systems chemical biology.

    PubMed

    Oprea, Tudor I; May, Elebeoba E; Leitão, Andrei; Tropsha, Alexander

    2011-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole body physiologically based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB) (Nat Chem Biol 3: 447-450, 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules, and, where applicable, mutants and variants of those proteins. There is yet an unmet need to develop an integrated in silico pharmacology/systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology, and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology.

  11. Computational Systems Chemical Biology

    PubMed Central

    Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander

    2013-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole body physiologically-based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB) (Oprea et al., 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules and, where applicable, mutants and variants of those proteins. There is yet an unmet need to develop an integrated in silico pharmacology / systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology. PMID:20838980

  12. Production of rotational parts in small-series and computer-aided planning of its production engineering

    NASA Astrophysics Data System (ADS)

    Dudas, Illes; Berta, Miklos; Cser, Istvan

    1998-12-01

    Up-to-date manufacturing equipment for producing rotational parts in small series includes lathe centers and CNC grinding machines with a high concentration of manufacturing operations. With these machine tools, parts can be produced to requirements of increased accuracy and surface quality. On lathe centers, which combine the manufacturing procedures of lathes using stationary tools with those of drilling-milling machine tools using rotational tools, non-rotational surfaces of rotational parts can also be produced. The high concentration of manufacturing operations makes it necessary to plan and program measuring, monitoring, and quality control into the technological process during the manufacturing operation. In this way, taking into consideration the technological possibilities of lathe centers, the scope of computer-aided technological planning duties increases significantly. It is a trivial requirement that the descriptions of the prefabricated parts and ready-made parts be given only once. Starting from these considerations, we have been developing a technological planning system for bodies of revolution on the basis of the GTIPROG/EC system, which is used for programming lathe centers. Our paper deals with the results of this development and the problems encountered.

  13. Using Computer-Based Continuing Professional Education of Training Staff to Develop Small- and Medium-Sized Enterprises in Thailand

    ERIC Educational Resources Information Center

    Sooraksa, Nanta

    2012-01-01

    This paper describes a career development program for staff involved in providing training for small- and medium-sized enterprises (SMEs) in Thailand. Most of these staff were professional vocational teachers in schools. The program uses information communication technology (ICT), and its main objective is to teach Moodle software as a tool for…

  14. An Integrated Unix-based CAD System for the Design and Testing of Custom VLSI Chips

    NASA Technical Reports Server (NTRS)

    Deutsch, L. J.

    1985-01-01

    A computer aided design (CAD) system that is being used at the Jet Propulsion Laboratory for the design of custom and semicustom very large scale integrated (VLSI) chips is described. The system consists of a Digital Equipment Corporation VAX computer with the UNIX operating system and a collection of software tools for the layout, simulation, and verification of microcircuits. Most of these tools were written by the academic community and are, therefore, available to JPL at little or no cost. Some small pieces of software have been written in-house in order to make all the tools interact with each other with a minimal amount of effort on the part of the designer.

  15. Childhood CT scans linked to leukemia and brain cancer later in life

    Cancer.gov

    Children and young adults scanned multiple times by computed tomography (CT), a commonly used diagnostic tool, have a small increased risk of leukemia and brain tumors in the decade following their first scan.

  16. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
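
    For readers who want to try the medium-resolution approach described above, the sketch below runs LDA topic modeling over a handful of toy abstracts with scikit-learn; a real review would feed in thousands of abstracts and track topic weights by publication year. The toy abstracts are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "stormwater runoff urban hydrology model",
    "species habitat loss biodiversity survey",
    "urban runoff pollution stormwater treatment",
    "habitat fragmentation species decline field survey",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(abstracts)          # document-term count matrix
vocab = vec.get_feature_names_out()

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
for k, comp in enumerate(lda.components_):
    top = comp.argsort()[-4:][::-1]       # four highest-weight terms per topic
    print(f"topic {k}:", [vocab[i] for i in top])
```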

  17. Study of Image Qualities From 6D Robot-Based CBCT Imaging System of Small Animal Irradiator.

    PubMed

    Sharma, Sunil; Narayanasamy, Ganesh; Clarkson, Richard; Chao, Ming; Moros, Eduardo G; Zhang, Xin; Yan, Yulong; Boerma, Marjan; Paudel, Nava; Morrill, Steven; Corry, Peter; Griffin, Robert J

    2017-01-01

    To assess the quality of cone beam computed tomography images obtained by a robotic arm-based and image-guided small animal conformal radiation therapy device. The small animal conformal radiation therapy device is equipped with a 40 to 225 kV X-ray tube mounted on a custom made gantry, a 1024 × 1024 pixel flat panel detector (200 μm resolution), and a programmable 6 degrees of freedom robot for cone beam computed tomography imaging and conformal delivery of radiation doses. A series of 2-dimensional radiographic projection images were recorded in cone beam mode by placing and rotating microcomputed tomography phantoms on the "palm" of the robotic arm. Reconstructed images were studied for image quality (spatial resolution, image uniformity, computed tomography number linearity, voxel noise, and artifacts). Geometric accuracy was measured to be 2%, corresponding to 0.7 mm accuracy on a Shelley microcomputed tomography QA phantom. Qualitative resolution of reconstructed axial computed tomography slices using the resolution coils was within 200 μm. Quantitative spatial resolution was found to be 3.16 lp/mm. Uniformity of the system was measured within 34 Hounsfield units on a QRM microcomputed tomography water phantom. Computed tomography numbers measured using the linearity plate were linear with material density (R² > 0.995). Cone beam computed tomography images of the QRM multidisk phantom had minimal artifacts. Results showed that the small animal conformal radiation therapy device is capable of producing high-quality cone beam computed tomography images for precise and conformal small animal dose delivery. With its high-caliber imaging capabilities, the small animal conformal radiation therapy device is a powerful tool for small animal research.
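
    The CT-number linearity check reported here is, at heart, a linear fit of measured numbers against known insert densities. A minimal numpy version is sketched below with made-up phantom values; the actual Shelley/QRM phantom measurements are not reproduced.

```python
import numpy as np

density = np.array([0.00, 0.30, 0.95, 1.16, 1.53, 1.82])       # g/cm^3, hypothetical inserts
ct_num = np.array([-1000.0, -688.0, -2.0, 220.0, 610.0, 915.0])  # measured HU, hypothetical

slope, intercept = np.polyfit(density, ct_num, 1)
pred = slope * density + intercept
r2 = 1.0 - np.sum((ct_num - pred) ** 2) / np.sum((ct_num - np.mean(ct_num)) ** 2)
print(f"slope={slope:.1f} HU per g/cm^3, R^2={r2:.4f}")  # linearity criterion: R^2 > 0.995
```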

  18. Role of computed tomography angiography in detection and staging of small bowel carcinoid tumors

    PubMed Central

    Bonekamp, David; Raman, Siva P; Horton, Karen M; Fishman, Elliot K

    2015-01-01

    Small-bowel carcinoid tumors are the most common form (42%) of gastrointestinal carcinoids, which by themselves comprise 70% of neuroendocrine tumors. Although primary small bowel neoplasms are overall rare (3%-6% of all gastrointestinal neoplasms), carcinoids still represent the second most common (20%-30%) primary small-bowel malignancy after small bowel adenocarcinoma. Their imaging evaluation is often challenging. State-of-the-art high-resolution multiphasic computed tomography together with advanced postprocessing methods provides an excellent tool for their depiction. The manifold interactive parameter choices, however, require knowledge of when to use which technique. Here, we discuss the imaging appearance and evaluation of duodenal, jejunal and ileal carcinoid tumors, including the imaging features of the primary tumor, locoregional mesenteric nodal metastases, and distant metastatic disease. A protocol for optimal lesion detection is presented, including the use of computed tomography enterography, volume acquisition, computed tomography angiography and three-dimensional mapping. Imaging findings are illustrated with a series of challenging cases which illustrate the spectrum of possible disease in the small bowel and mesentery, the range of possible appearances in the bowel itself on multiphase data, and extraluminal findings such as the desmoplastic reaction in mesentery and hypervascular liver metastases. Typical imaging pitfalls and pearls are illustrated. PMID:26435774

  19. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  20. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400
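
    The diagnosis step such a tool performs can be thought of as weighted questionnaire scoring. The sketch below is purely hypothetical; the question texts, weights, and threshold are invented for illustration and are not the published tool's logic.

```python
# Hypothetical yes/no diagnosis questions with weights toward SaaS readiness
QUESTIONS = {
    "Is your customer data accessed from multiple sites?": 3,
    "Do you lack dedicated in-house IT staff?": 2,
    "Are your workloads highly variable month to month?": 2,
    "Do regulations require on-premises data storage?": -3,
}

def saas_readiness(answers: dict[str, bool]) -> str:
    """Sum the weights of 'yes' answers and recommend a direction."""
    score = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    return "consider SaaS migration" if score >= 3 else "stay on-premises for now"

answers = {q: True for q in QUESTIONS}
answers["Do regulations require on-premises data storage?"] = False
print(saas_readiness(answers))  # -> consider SaaS migration
```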

  1. Optimization of the propulsion for multistage solid rocket motor launchers

    NASA Astrophysics Data System (ADS)

    Calabro, M.; Dufour, A.; Macaire, A.

    2002-02-01

    Some tools focused on a rapid multidisciplinary optimization capability for multistage launch vehicle design were developed at EADS-LV. These tools may be broken down into two categories, those related to propulsion design optimization and a computer code devoted to trajectories and under constraints optimization. Both are linked in order to obtain optimal vehicle design after an iterative process. After a description of the two categories tools, an example of application is given on a small space launcher.

  2. A meta-analysis of pedagogical tools used in introductory programming courses

    NASA Astrophysics Data System (ADS)

    Trees, Frances P.

    Programming is recognized as being challenging for teachers to teach and difficult for students to learn. For decades, computer science educators have looked at innovative approaches by creating pedagogical software tools that attempt to facilitate both the teaching of and the learning of programming. This dissertation investigates the motivations for the integration of pedagogical tools in introductory programming courses and the characteristics that are perceived to contribute to the effectiveness of these tools. The study employs three research stages that examine the tool characteristics and their use. The first stage surveys teachers who use pedagogical tools in an introductory programming course. The second interviews teachers to explore the survey results in more detail and to add greater depth into the choice and use of pedagogical tools in the introductory programming class. The third interviews tool developers to provide an explanatory insight of the tool and the motivation for its creation. The results indicate that the pedagogical tools perceived to be effective share common characteristics: They provide an environment that is manageable, flexible and visual; they provide for active engagement in learning activities and support programming in small pieces; they allow for an easy transition to subsequent courses and more robust environments; they provide technical support and resource materials. The results of this study also indicate that recommendations from other computer science educators have a strong impact on a teacher's initial tool choice for an introductory programming course. This study informs present and future tool developers of the characteristics that the teachers perceive to contribute to the effectiveness of a pedagogical tool and how to present their tools to encourage a more efficient and more effective widespread adoption of the tool into the teacher's curriculum. The teachers involved in this study are actively involved in the computer science education community. The results of this study, based on the perceptions of these computer science educators, provide guidance to those educators choosing to introduce a new pedagogical tool into their programming course.

  3. Gage for micromachining system

    DOEpatents

    Miller, Donald M.

    1979-02-27

    A gage for measuring the contour of the surface of an element of a micromachining tool system and of a work piece machined by the micromachining tool system. The gage comprises a glass plate containing two electrical contacts and supporting a steel ball resting against the contacts. As the element or workpiece is moved against the steel ball, the very slight contact pressure causes an extremely small movement of the steel ball which breaks the electrical circuit between the two contacts. The contour information is supplied to a dedicated computer controlling the micromachining tool so that the computer knows the contour of the element and the work piece to an accuracy of .+-. 25 nm. The micromachining tool system with X- and omega-axes is used to machine spherical, aspherical, and irregular surfaces with a maximum contour error of 100 nanometers (nm) and surface waviness of no more than 0.8 nm RMS.

  4. Tools and procedures for visualization of proteins and other biomolecules.

    PubMed

    Pan, Lurong; Aller, Stephen G

    2015-04-01

    Protein, peptides, and nucleic acids are biomolecules that drive biological processes in living organisms. An enormous amount of structural data for a large number of these biomolecules has been described with atomic precision in the form of structural "snapshots" that are freely available in public repositories. These snapshots can help explain how the biomolecules function, the nature of interactions between multi-molecular complexes, and even how small-molecule drugs can modulate the biomolecules for clinical benefits. Furthermore, these structural snapshots serve as inputs for sophisticated computer simulations to turn the biomolecules into moving, "breathing" molecular machines for understanding their dynamic properties in real-time computer simulations. In order for the researcher to take advantage of such a wealth of structural data, it is necessary to gain competency in the use of computer molecular visualization tools for exploring the structures and visualizing three-dimensional spatial representations. Here, we present protocols for using two common visualization tools--the Web-based Jmol and the stand-alone PyMOL package--as well as a few examples of other popular tools.
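
    As a taste of scripted visualization, the snippet below drives PyMOL's Python API; run it with `pymol -cq script.py` (requires PyMOL installed). 1UBQ (ubiquitin) is used here only as a familiar example structure, not one discussed in the protocols.

```python
from pymol import cmd

cmd.fetch("1ubq")                 # download ubiquitin from the PDB
cmd.hide("everything")
cmd.show("cartoon")
cmd.color("skyblue", "ss h")      # color helices
cmd.color("salmon", "ss s")       # color beta strands
cmd.png("1ubq_cartoon.png", width=800, height=600, ray=1)  # ray-traced render
```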

  5. Runtime Performance Monitoring Tool for RTEMS System Software

    NASA Astrophysics Data System (ADS)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and during in-orbit operation as well. Our implemented target agent is lightweight and has small overhead using the SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.

  6. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollingsworth, Jeff

    2014-07-31

    The purpose of this project was to develop tools and techniques to improve the ability of computational scientists to investigate and correct problems (bugs) in their programs. Specifically, the University of Maryland component of this project focused on the problems associated with the finite number of bits available in a computer to represent numeric values. In large scale scientific computation, numbers are frequently added to and multiplied with each other billions of times. Thus even small errors due to the representation of numbers can accumulate into big errors. However, using too many bits to represent a number results in additional computation, memory, and energy costs. Thus it is critical to find the right size for numbers. This project focused on several aspects of this general problem. First, we developed a tool to look for cancellations: the catastrophic loss of precision that occurs when two nearly equal numbers are subtracted, so that their leading digits cancel and mostly rounding error remains. Second, we developed a suite of tools to allow programmers to identify exactly how much precision is required for each operation in their program. This tool allows programmers to verify that enough precision is available and, more importantly, to find cases where extra precision could be eliminated to allow the program to use less memory, computer time, or energy. These tools use advanced binary modification techniques to allow the analysis of actual optimized code. The system, called Craft, has been applied to a number of benchmarks and real applications.
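
    Precision loss of the kind the Craft tool hunts for is easy to reproduce. The snippet below shows a naive left-to-right sum silently losing a small addend, which the final subtraction then exposes, while math.fsum, which tracks rounding error internally, keeps it.

```python
import math

values = [1e16, 1.0, -1e16]

naive = 0.0
for v in values:
    naive += v            # 1e16 + 1.0 rounds back to 1e16, so the 1.0 is lost

print(naive)              # 0.0  (the small addend has vanished)
print(math.fsum(values))  # 1.0  (correctly rounded sum)
```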

  7. An Efficient Computational Framework for the Analysis of Whole Slide Images: Application to Follicular Lymphoma Immunohistochemistry

    PubMed Central

    Samsi, Siddharth; Krishnamurthy, Ashok K.; Gurcan, Metin N.

    2012-01-01

    Follicular Lymphoma (FL) is one of the most common non-Hodgkin Lymphoma in the United States. Diagnosis and grading of FL is based on the review of histopathological tissue sections under a microscope and is influenced by human factors such as fatigue and reader bias. Computer-aided image analysis tools can help improve the accuracy of diagnosis and grading and act as another tool at the pathologist’s disposal. Our group has been developing algorithms for identifying follicles in immunohistochemical images. These algorithms have been tested and validated on small images extracted from whole slide images. However, the use of these algorithms for analyzing the entire whole slide image requires significant changes to the processing methodology since the images are relatively large (on the order of 100k × 100k pixels). In this paper we discuss the challenges involved in analyzing whole slide images and propose potential computational methodologies for addressing these challenges. We discuss the use of parallel computing tools on commodity clusters and compare performance of the serial and parallel implementations of our approach. PMID:22962572
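
    The tiling pattern such a pipeline needs looks roughly like the sketch below, which assumes the openslide-python package and uses a made-up dark-pixel count as a stand-in for the follicle-detection analysis; the real pipeline, tile size, filenames, and cluster scheduling details differ.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

import numpy as np
import openslide  # assumes openslide-python is installed

SLIDE, TILE = "slide.svs", 4096  # hypothetical whole-slide file and tile size

def analyze_tile(xy):
    """Score one tile; the slide is re-opened per task to avoid sharing handles."""
    x, y = xy
    region = openslide.OpenSlide(SLIDE).read_region((x, y), 0, (TILE, TILE))
    rgb = np.asarray(region)[:, :, :3]
    return xy, int((rgb.mean(axis=2) < 100).sum())  # stand-in: count dark pixels

if __name__ == "__main__":
    w, h = openslide.OpenSlide(SLIDE).dimensions
    coords = list(product(range(0, w, TILE), range(0, h, TILE)))
    with ProcessPoolExecutor() as pool:
        for xy, score in pool.map(analyze_tile, coords):
            print(xy, score)
```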

  8. Rational, computer-enabled peptide drug design: principles, methods, applications and future directions.

    PubMed

    Diller, David J; Swanson, Jon; Bayden, Alexander S; Jarosinski, Mark; Audie, Joseph

    2015-01-01

    Peptides provide promising templates for developing drugs to occupy a middle space between small molecules and antibodies and for targeting 'undruggable' intracellular protein-protein interactions. Importantly, rational or in cerebro design, especially when coupled with validated in silico tools, can be used to efficiently explore chemical space and identify islands of 'drug-like' peptides to satisfy diverse drug discovery program objectives. Here, we consider the underlying principles of and recent advances in rational, computer-enabled peptide drug design. In particular, we consider the impact of basic physicochemical properties, potency and ADME/Tox opportunities and challenges, and recently developed computational tools for enabling rational peptide drug design. Key principles and practices are spotlighted by recent case studies. We close with a hypothetical future case study.

  9. Centralized Authentication with Kerberos 5, Part I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wachsmann, A

    Account administration in a distributed Unix/Linux environment can become very complicated and messy if done by hand. Large sites use special tools to deal with this problem. I will describe how even very small installations, like your three-computer network at home, can take advantage of the very same tools. The problem in a distributed environment is that password and shadow files need to be changed individually on each machine if an account change occurs. Account changes include: password changes, addition/removal of accounts, name changes of an account (UID/GID changes are a big problem in any case), additional or removed login privileges to a (group of) computer(s), etc. In this article, I will show how Kerberos 5 solves the authentication problem in a distributed computing environment. A second article will describe a solution for the authorization problem.

  10. Diagnostic and therapeutic value of laparoscopy for small bowel blunt injuries: A case report.

    PubMed

    Addeo, Pietro; Calabrese, Daniela Paola

    2011-01-01

    Small bowel injuries after blunt abdominal trauma represent both a diagnostic and a therapeutic challenge. Early diagnosis and prompt treatment are necessary in order to avoid a dangerous diagnostic delay. Laparoscopy can represent a diagnostic and therapeutic tool in patients with uncertain clinical symptoms. We report the case of a 25-year-old man, haemodynamically stable, admitted for acute abdominal pain a few hours after a physical assault. Given the persistence of the abdominal pain and the presence of free fluid on the computed tomography examination, an exploratory laparoscopy was performed. At the laparoscopic exploration, an isolated small bowel perforation was found, 60 cm distal to the ligament of Treitz. The injury was repaired by laparoscopic suturing and the patient was discharged home on postoperative day 3 after an uneventful postoperative course. Laparoscopy represents a valuable tool for patients with small bowel blunt injuries, allowing timely diagnosis and prompt treatment.

  11. National Stormwater Calculator User's Guide – VERSION 1.1

    EPA Science Inventory

    This document is the user's guide for running EPA's National Stormwater Calculator (http://www.epa.gov/nrmrl/wswrd/wq/models/swc/). The National Stormwater Calculator is a simple-to-use tool for computing small site hydrology for any location within the US.

  12. Protocols for Molecular Dynamics Simulations of RNA Nanostructures.

    PubMed

    Kim, Taejin; Kasprzak, Wojciech K; Shapiro, Bruce A

    2017-01-01

    Molecular dynamics (MD) simulations have been used as one of the main research tools to study a wide range of biological systems and bridge the gap between X-ray crystallography or NMR structures and biological mechanism. In the field of RNA nanostructures, MD simulations have been used to fix steric clashes in computationally designed RNA nanostructures, characterize the dynamics, and investigate the interaction between RNA and other biomolecules such as delivery agents and membranes. In this chapter we present examples of computational protocols for molecular dynamics simulations in explicit and implicit solvent using the Amber Molecular Dynamics Package. We also show examples of post-simulation analysis steps and briefly mention selected tools beyond the Amber package. Limitations of the methods, tools, and protocols are also discussed. Most of the examples are illustrated for a small RNA duplex (helix), but the protocols are applicable to any nucleic acid structure, subject only to the computational speed and memory limitations of the hardware available to the user.
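
    As a concrete illustration of such a protocol, the sketch below writes a typical explicit-solvent Amber `&cntrl` input and launches `pmemd`; the parameter values are common textbook choices and the file names are hypothetical, not the chapter's exact protocol.

    ```python
    import subprocess

    # Typical NPT production settings (illustrative, not the chapter's protocol):
    # 2 fs step with SHAKE on H-bonds, 9 A cutoff, Langevin thermostat at 300 K.
    mdin = """production MD, NPT, 2 fs step
     &cntrl
       imin=0, irest=1, ntx=5,
       nstlim=500000, dt=0.002,
       ntc=2, ntf=2, cut=9.0,
       ntb=2, ntp=1, taup=2.0,
       ntt=3, gamma_ln=2.0, temp0=300.0,
       ntpr=5000, ntwx=5000,
     /
    """
    with open("prod.in", "w") as f:
        f.write(mdin)

    # Hypothetical file names; the prmtop/restart files come from tleap system
    # preparation and prior equilibration steps.
    subprocess.run(["pmemd", "-O", "-i", "prod.in", "-p", "rna.prmtop",
                    "-c", "equil.rst", "-o", "prod.out", "-x", "prod.nc"],
                   check=True)
    ```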

  13. Migrating the Belle II collaborative services and tools

    NASA Astrophysics Data System (ADS)

    Braun, N.; Dossett, D.; Dramburg, M.; Frost, O.; Gellrich, A.; Grygier, J.; Hauth, T.; Jahnke-Zumbusch, D.; Knittel, D.; Kuhr, T.; Levonian, S.; Moser, H.-G.; Li, L.; Nakao, N.; Prim, M.; Reest, P. v. d.; Schwenssen, F.; Urquijo, P.; Vennemann, B.

    2017-10-01

    The Belle II collaboration decided in 2016 to migrate its collaborative services and tools into the existing IT infrastructure at DESY. The goal was to reduce the maintenance effort for solutions operated by Belle II members as well as to deploy state-of-the-art technologies. In addition, some new services and tools were or will be introduced. Planning and migration work was carried out by small teams consisting of experts from Belle II and the involved IT divisions. The migration was successfully accomplished before the KEK computer centre replacement in August 2016.

  14. Distinguishing humans from computers in the game of go: A complex network approach

    NASA Astrophysics Data System (ADS)

    Coquidé, C.; Georgeot, B.; Giraud, O.

    2017-08-01

    We compare complex networks built from the game of go and obtained from databases of human-played games with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.
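
    The construction can be sketched as follows (an illustrative outline, not the authors' code): moves are nodes, successive moves form weighted directed edges, and simple graph statistics such as the degree distribution become the quantities compared between human and computer play.

    ```python
    import networkx as nx

    def build_move_network(games):
        """games: iterable of move sequences; each move is a hashable label."""
        g = nx.DiGraph()
        for moves in games:
            for a, b in zip(moves, moves[1:]):
                if g.has_edge(a, b):
                    g[a][b]["weight"] += 1  # reinforce an already-seen transition
                else:
                    g.add_edge(a, b, weight=1)
        return g

    # Toy input: two short games recorded as board-coordinate labels.
    net = build_move_network([["D4", "Q16", "Q3", "D16"],
                              ["Q16", "D4", "D16", "Q3"]])
    # A simple estimator: the out-degree distribution of the move network.
    print(sorted(d for _, d in net.out_degree()))
    ```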

  15. Universal quantum computation with little entanglement.

    PubMed

    Van den Nest, Maarten

    2013-02-08

    We show that universal quantum computation can be achieved in the standard pure-state circuit model while the entanglement entropy of every bipartition is small in each step of the computation. The entanglement entropy required for large-scale quantum computation even tends to zero. Moreover we show that the same conclusion applies to many entanglement measures commonly used in the literature. This includes, e.g., the geometric measure, localizable entanglement, multipartite concurrence, squashed entanglement, witness-based measures, and more generally any entanglement measure that is continuous in a certain natural sense. These results demonstrate that many entanglement measures are unsuitable tools to assess the power of quantum computers.
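
    For reference, the bipartite measure at issue is the standard entanglement entropy (a textbook definition, not a formula specific to this paper):

    ```latex
    % For a pure state |psi> and a bipartition A|B of the qubits:
    \[
      \rho_A = \operatorname{Tr}_B |\psi\rangle\langle\psi| , \qquad
      S(\rho_A) = -\operatorname{Tr}\!\left( \rho_A \log_2 \rho_A \right) ,
    \]
    % and the paper's claim is that universal computation is possible while
    % S(rho_A) remains small for every bipartition at every computational step.
    ```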

  16. A framework for optimizing micro-CT in dual-modality micro-CT/XFCT small-animal imaging system

    NASA Astrophysics Data System (ADS)

    Vedantham, Srinivasan; Shrestha, Suman; Karellas, Andrew; Cho, Sang Hyun

    2017-09-01

    Dual-modality Computed Tomography (CT)/X-ray Fluorescence Computed Tomography (XFCT) can be a valuable tool for imaging and quantifying the organ and tissue distribution of small concentrations of high atomic number materials in small-animal systems. In this work, the framework for optimizing the micro-CT imaging system component of the dual-modality system is described, either when the micro-CT images are acquired concurrently with XFCT, using the x-ray spectral conditions for XFCT, or when the micro-CT images are acquired sequentially and independently of XFCT. This framework utilizes cascaded systems analysis for task-specific determination of the detectability index using numerical observer models at a given radiation dose, where the radiation dose is determined using Monte Carlo simulations.
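
    One common frequency-domain form of such a task-based detectability index, here for a non-prewhitening (NPW) observer, is shown below for orientation; the paper's exact formulation may differ. The cascaded-systems analysis supplies the system MTF and noise power spectrum (NPS), and W_task is the task function describing the signal to be detected.

    ```latex
    \[
      {d'}^2_{\mathrm{NPW}} =
      \frac{\left[ \iint \left| W_{\mathrm{task}}(u,v) \right|^2
                   \mathrm{MTF}^2(u,v) \, du \, dv \right]^2}
           {\iint \left| W_{\mathrm{task}}(u,v) \right|^2
                  \mathrm{MTF}^2(u,v) \, \mathrm{NPS}(u,v) \, du \, dv}
    \]
    ```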

  17. Stereo Correspondence Using Moment Invariants

    NASA Astrophysics Data System (ADS)

    Premaratne, Prashan; Safaei, Farzad

    Autonomous navigation is seen as a vital tool in harnessing the enormous potential of Unmanned Aerial Vehicles (UAV) and small robotic vehicles for both military and civilian use. Even though laser-based scanning solutions for Simultaneous Localization And Mapping (SLAM) are considered the most reliable for depth estimation, they are not feasible for use in UAVs and land-based small vehicles due to their physical size and weight. Stereovision is considered the best approach for any autonomous navigation solution, as stereo rigs are lightweight and inexpensive. However, stereoscopy, which estimates depth information through pairs of stereo images, can still be computationally expensive and unreliable. This is mainly because some of the algorithms used in successful stereovision solutions have computational requirements that cannot be met by small robotic vehicles. In our research, we implement a feature-based stereovision solution using moment invariants as a metric to find corresponding regions in image pairs, which reduces the computational complexity and improves the accuracy of the disparity measures; this is significant for use in UAVs and in small robotic vehicles.
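
    The matching metric can be illustrated with OpenCV's built-in Hu moment invariants (a hedged sketch of the idea, not the authors' implementation):

    ```python
    import cv2
    import numpy as np

    def hu_signature(patch):
        """Log-scaled Hu moment invariants of a single-channel grayscale patch."""
        m = cv2.HuMoments(cv2.moments(patch)).flatten()
        return -np.sign(m) * np.log10(np.abs(m) + 1e-30)  # compress dynamic range

    def best_match(left_patch, right_patches):
        """Pick the right-image patch whose moment signature is closest."""
        target = hu_signature(left_patch)
        dists = [np.linalg.norm(hu_signature(p) - target) for p in right_patches]
        return int(np.argmin(dists))

    # The disparity then follows from the horizontal offset between the left
    # patch and its best-matching patch along the corresponding epipolar line.
    ```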

  18. National Stormwater Calculator User's Guide - Version 1.2

    EPA Science Inventory

    The National Stormwater Calculator is a simple-to-use tool for computing small site hydrology for any location within the US. It estimates the amount of stormwater runoff generated from a site under different development and control scenarios over a long term period of historica...

  19. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  20. Field Level Computer Exploitation Package

    DTIC Science & Technology

    2007-03-01

    to take advantage of the data retrieved from the computer. Major Barge explained that if a tool could be designed that nearly anyone could use...the study of network forensics. This has become a necessity because of the constantly growing eCommerce industry and the stiff competition between...Security. One big advantage that Insert has is the fact that it is quite small compared to most bootable CDs. At only 60 megabytes it can be burned

  1. A computational continuum model of poroelastic beds

    PubMed Central

    Zampogna, G. A.

    2017-01-01

    Despite the ubiquity of fluid flows interacting with porous and elastic materials, we lack a validated non-empirical macroscale method for characterizing the flow over and through a poroelastic medium. We propose a computational tool to describe such configurations by deriving and validating a continuum model for the poroelastic bed and its interface with the free fluid above. We show that, using a stress continuity condition and a slip velocity condition at the interface, the effective model correctly captures the effects of small changes in the microstructure anisotropy and predicts the overall behaviour in a physically consistent and controllable manner. Moreover, we show that the performance of the effective model is accurate by validating it against fully resolved microscopic simulations. The proposed computational tool can be used in investigations in a wide range of fields, including mechanical engineering, bio-engineering and geophysics. PMID:28413355
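
    For orientation, interface conditions of this type are conventionally written as stress continuity plus a slip relation; the classical Beavers–Joseph form below is shown as a representative example only (the paper derives its own effective conditions via homogenization):

    ```latex
    \[
      \boldsymbol{\sigma}_f \cdot \mathbf{n}
        = \boldsymbol{\sigma}_p \cdot \mathbf{n}
      \quad \text{and} \quad
      \left. \frac{\partial u_f}{\partial n} \right|_{\Gamma}
        = \frac{\alpha}{\sqrt{K}} \left( u_f - u_D \right)
      \quad \text{on the interface } \Gamma ,
    \]
    % where the first relation is stress continuity, u_f is the tangential
    % free-fluid velocity, u_D the Darcy velocity in the bed, K the
    % permeability, and alpha an empirical slip coefficient.
    ```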

  2. When promotoras and technology meet: A qualitative analysis of promotoras’ use of small media to increase cancer screening among South Texas Latinos

    PubMed Central

    Fernandez, Maria E.; LaRue, Denise M.; Bartholomew, L. Kay

    2012-01-01

    Computer-based multimedia technologies can be used to tailor health messages, but promotoras (Spanish-speaking community health workers) rarely use these tools. Promotoras delivered health messages about colorectal cancer screening to medically underserved Latinos in South Texas using two small media formats: a “low-tech” format (flipchart and video); and a “high-tech” format consisting of a tailored, interactive computer program delivered on a tablet computer. Using qualitative methods, we observed promotora training and intervention delivery, and conducted interviews with five promotoras to compare and contrast program implementation of both formats. We discuss the ways each format aided or challenged promotoras’ intervention delivery. Findings reveal that some aspects of both formats enhanced intervention delivery by tapping into Latino health communication preferences and facilitating interpersonal communication, while other aspects hindered intervention delivery. This study contributes to our understanding of how community health workers use low- and high-tech small media formats when delivering health messages to Latinos. PMID:21986243

  3. Taking Care of the Small Computer: A Guide for Librarians.

    ERIC Educational Resources Information Center

    Williams, Gene

    1986-01-01

    Describes how to identify microcomputer problems and determine whether the services of a technician are required by troubleshooting, or using a process of elimination, without needing a technical background or special tools. Prevention methods and the use of diagnostic programs are also explained. (EM)

  4. NATIONAL STORMWATER CALCULATOR USER’S GUIDE – VERSION 1.1

    EPA Science Inventory

    The National Stormwater Calculator is a simple-to-use tool for computing small site hydrology for any location within the US. It estimates the amount of stormwater runoff generated from a site under different development and control scenarios over a long term period of historica...

  5. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
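
    The task-farm pattern that TaskDL implements for IDL can be sketched in Python with mpi4py (an analogue for illustration only, not the product itself): rank 0 feeds independent tasks to whichever worker finishes first, keeping per-task communication small.

    ```python
    # Run with, e.g.: mpiexec -n 4 python taskfarm.py   (file name hypothetical)
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    NTASKS = 100

    def process(task):
        return task * task  # stand-in for one independent data-analysis task

    if rank == 0:
        tasks = list(range(NTASKS))
        results = []
        for worker in range(1, size):            # prime every worker with one task
            comm.send(tasks.pop() if tasks else None, dest=worker)
        while len(results) < NTASKS:
            status = MPI.Status()
            results.append(comm.recv(source=MPI.ANY_SOURCE, status=status))
            # feed the worker that just finished, or send None to stop it
            comm.send(tasks.pop() if tasks else None, dest=status.Get_source())
        print("collected", len(results), "results")
    else:
        while (task := comm.recv(source=0)) is not None:
            comm.send(process(task), dest=0)
    ```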

  6. Personal computer study of finite-difference methods for the transonic small disturbance equation

    NASA Technical Reports Server (NTRS)

    Bland, Samuel R.

    1989-01-01

    Calculation of unsteady flow phenomena requires careful attention to the numerical treatment of the governing partial differential equations. The personal computer provides a convenient and useful tool for the development of meshes, algorithms, and boundary conditions needed to provide time accurate solution of these equations. The one-dimensional equation considered provides a suitable model for the study of wave propagation in the equations of transonic small disturbance potential flow. Numerical results for effects of mesh size, extent, and stretching, time step size, and choice of far-field boundary conditions are presented. Analysis of the discretized model problem supports these numerical results. Guidelines for suitable mesh and time step choices are given.
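
    The flavor of such a model study can be reproduced in a few lines (our illustrative example, not the paper's code): a first-order upwind scheme for the 1-D linear advection equation u_t + c u_x = 0, the simplest model of one-way wave propagation, where mesh size and time step must satisfy the CFL condition c·dt/dx ≤ 1 for stability.

    ```python
    import numpy as np

    c, L, nx = 1.0, 10.0, 201
    dx = L / (nx - 1)
    dt = 0.8 * dx / c                  # CFL number 0.8: stable
    x = np.linspace(0.0, L, nx)
    u = np.exp(-((x - 2.0) ** 2))      # Gaussian pulse as initial condition

    for _ in range(int(5.0 / dt)):     # march to t ~ 5
        u[1:] -= c * dt / dx * (u[1:] - u[:-1])  # upwind difference
        u[0] = 0.0                     # simple inflow boundary condition

    print(f"pulse peak now near x = {x[np.argmax(u)]:.2f}")  # ~7 after advecting
    ```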

  7. Visual analysis of fluid dynamics at NASA's numerical aerodynamic simulation facility

    NASA Technical Reports Server (NTRS)

    Watson, Velvin R.

    1991-01-01

    A study aimed at describing and illustrating visualization tools used in Computational Fluid Dynamics (CFD), and indicating how these tools are likely to change by showing a projected evolution of the human-computer interface, is presented. The following are outlined using a graphically based text format: the revolution of human-computer environments for CFD research; comparison of current environments; comparison of current environments with the ideal; predictions for future CFD environments; and what can be done to accelerate the improvements. The following comments are given: when acquiring visualization tools, potentially rapid changes must be considered; the environmental changes over the next ten years due to the human-computer interface cannot be fully anticipated; data-flow packages such as AVS, apE, Explorer and Data Explorer are easy to learn and use for small problems and excellent for prototyping, but not as efficient for large problems; the approximation techniques used in visualization software must be appropriate for the data; it has become more cost effective to move jobs that fit onto workstations and to run only memory-intensive jobs on the supercomputer; and use of three-dimensional skills will be maximized when the three-dimensional environment is built in from the start.

  8. Computer Controlled Optical Surfacing With Orbital Tool Motion

    NASA Astrophysics Data System (ADS)

    Jones, Robert A.

    1985-10-01

    Asymmetric aspheric optical surfaces are very difficult to fabricate using classical techniques and laps the same size as the workpiece. Opticians can produce such surfaces by grinding and polishing, using small laps with orbital tool motion. However, hand correction is a time consuming process unsuitable for large optical elements. Itek has developed Computer Controlled Optical Surfacing (CCOS) for fabricating such aspheric optics. Automated equipment moves a nonrotating orbiting tool slowly over the workpiece surface. The process corrects low frequency surface errors by figuring. The velocity of the tool assembly over the workpiece surface is purposely varied. Since the amount of material removal is proportional to the polishing or grinding time, accurate control over material removal is achieved. The removal of middle and high frequency surface errors is accomplished by pad smoothing. For a soft pad material, the pad will compress to fit the workpiece surface producing greater pressure and more removal at the surface high areas. A harder pad will ride on only the high regions resulting in removal only for those locations.
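
    The removal-proportional-to-time relation exploited here is conventionally modeled by Preston's law (a standard model, not a formula quoted from this paper):

    ```latex
    \[
      \frac{dz}{dt} = k_p \, P \, V ,
    \]
    % where dz/dt is the local material removal rate, P the tool pressure, V the
    % relative tool-workpiece speed, and k_p the empirical Preston coefficient;
    % dwelling longer (moving the tool more slowly) over a region therefore
    % removes proportionally more material there.
    ```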

  9. Distributing digital video to multiple computers

    PubMed Central

    Murray, James A.

    2004-01-01

    Video is an effective teaching tool, and live video microscopy is especially helpful in teaching dissection techniques and the anatomy of small neural structures. Digital video equipment is more affordable now and allows easy conversion from older analog video devices. I here describe a simple technique for bringing digital video from one camera to all of the computers in a single room. This technique allows students to view and record the video from a single camera on a microscope. PMID:23493464

  10. 75 FR 77934 - Small Business Information Security Task Force

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ... on them. The Task Force has until the end of 2013 to complete the report but it is hoped that the... computing technology industry itself. Mr. Aaron Berstein then volunteered to contact Microsoft to inquire into the possibility of Microsoft providing an online collaborative space software tool for use...

  11. Post hoc support vector machine learning for impedimetric biosensors based on weak protein-ligand interactions.

    PubMed

    Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S

    2018-04-30

    Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near-real-time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better than, the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use in mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All code was based on scikit-learn, an open source machine learning library for Python, and was processed in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
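
    The classification step can be sketched with the same open-source stack the authors name (scikit-learn); the feature matrix below is a synthetic stand-in for impedance-spectrum features, not their data.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # 200 spectra x 10 features (e.g. impedance magnitude/phase at chosen frequencies)
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```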

  12. Discovery of the Kalman filter as a practical tool for aerospace and industry

    NASA Technical Reports Server (NTRS)

    Mcgee, L. A.; Schmidt, S. F.

    1985-01-01

    The sequence of events which led the researchers at Ames Research Center to the early discovery of the Kalman filter shortly after its introduction into the literature is recounted. The scientific breakthroughs and reformulations that were necessary to transform Kalman's work into a useful tool for a specific aerospace application are described. The resulting extended Kalman filter, as it is now known, is often still referred to simply as the Kalman filter. As the filter's use gained in popularity in the scientific community, the problems of implementation on small spaceborne and airborne computers led to a square-root formulation of the filter to overcome numerical difficulties associated with computer word length. The work that led to this new formulation is also discussed, including the first airborne computer implementation and flight test. Since then the applications of the extended and square-root formulations of the Kalman filter have grown rapidly throughout the aerospace industry.
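
    For readers unfamiliar with the filter under discussion, a minimal scalar Kalman predict/update cycle looks as follows (standard textbook form, not the Ames implementation):

    ```python
    def kalman_step(x, p, z, q, r):
        """One predict/update cycle for a constant-state model.

        x, p: state estimate and its variance;  z: new measurement;
        q, r: process and measurement noise variances.
        """
        p = p + q                # predict: uncertainty grows with process noise
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update: blend prediction with measurement
        p = (1.0 - k) * p        # updated uncertainty shrinks
        return x, p

    x, p = 0.0, 1.0
    for z in [1.2, 0.9, 1.1, 1.0]:
        x, p = kalman_step(x, p, z, q=1e-4, r=0.1)
    print(x, p)  # estimate converges toward ~1.0 with shrinking variance
    ```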

  13. Satellite Systems Design/Simulation Environment: A Systems Approach to Pre-Phase A Design

    NASA Technical Reports Server (NTRS)

    Ferebee, Melvin J., Jr.; Troutman, Patrick A.; Monell, Donald W.

    1997-01-01

    A toolset for the rapid development of small satellite systems has been created. The objective of this tool is to support the definition of spacecraft mission concepts to satisfy a given set of mission and instrument requirements. The objective of this report is to provide an introduction to understanding and using the SMALLSAT Model. SMALLSAT is a computer-aided Phase A design and technology evaluation tool for small satellites. SMALLSAT enables satellite designers, mission planners, and technology program managers to observe the likely consequences of their decisions in terms of satellite configuration, non-recurring and recurring cost, and mission life cycle costs and availability statistics. It was developed by Princeton Synergetic, Inc. and User Systems, Inc. as a revision of the previous TECHSAT Phase A design tool, which modeled medium-sized Earth observation satellites. Both TECHSAT and SMALLSAT were developed for NASA.

  14. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    PubMed

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short noncoding RNAs, typically 20-24 nt in length, known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanisms. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets facilitated the computational analysis of sRNA patterns of variation within biological samples such as time-point experiments, tissue series or various treatments. Rapid technological advances enable larger experiments, often with biological replicates, leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed transcripts, and a summary of their corresponding patterns. Finally we describe individual analysis tools such as PAREsnip, for the analysis of PARE (degradome) data, or CoLIde, for the identification of sRNA loci based on their expression patterns, and the visualization of the results using the software. We illustrate the features of the UEA sRNA Workbench on Arabidopsis thaliana and Homo sapiens datasets.
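
    As one concrete example of the normalization step mentioned above, scaling raw counts to reads per million (RPM) makes samples of different sequencing depth comparable (an illustrative fragment, not Workbench code; the sequences shown are hypothetical):

    ```python
    def reads_per_million(counts):
        """counts: dict mapping sRNA sequence -> raw read count for one sample."""
        total = sum(counts.values())
        return {seq: 1e6 * c / total for seq, c in counts.items()}

    sample = {"TTGGACTGAAGGGAGCTCCC": 1500, "TCGGACCAGGCTTCATTCCC": 300}
    print(reads_per_million(sample))  # counts rescaled to a common depth
    ```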

  15. Teaching NMR spectra analysis with nmr.cheminfo.org.

    PubMed

    Patiny, Luc; Bolaños, Alejandro; Castillo, Andrés M; Bernal, Andrés; Wist, Julien

    2018-06-01

    Teaching spectra analysis and structure elucidation requires students to get trained on real problems. This involves solving exercises of increasing complexity and, when necessary, using computational tools. Although desktop software packages exist for this purpose, the nmr.cheminfo.org platform offers students an online alternative. It provides a set of exercises and tools to help solve them. Only a small number of exercises are currently available, but contributors are invited to submit new ones and suggest new types of problems. Copyright © 2018 John Wiley & Sons, Ltd.

  16. EMHP: an accurate automated hole masking algorithm for single-particle cryo-EM image processing.

    PubMed

    Berndsen, Zachary; Bowman, Charles; Jang, Haerin; Ward, Andrew B

    2017-12-01

    The Electron Microscopy Hole Punch (EMHP) is a streamlined suite of tools for quick assessment, sorting and hole masking of electron micrographs. With recent advances in single-particle electron cryo-microscopy (cryo-EM) data processing allowing for the rapid determination of protein structures using a smaller computational footprint, we saw the need for a fast and simple tool for data pre-processing that could run independently of existing high-performance computing (HPC) infrastructures. EMHP provides a data preprocessing platform in a small package that requires minimal Python dependencies to function. https://www.bitbucket.org/chazbot/emhp Apache 2.0 License. bowman@scripps.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  17. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.

  18. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    NASA Astrophysics Data System (ADS)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    Mobile computing devices have many limitations, such as relatively small user interfaces and slow computing speeds. Augmented reality usually requires face pose estimation, which can be used as an HCI and entertainment tool. As far as real-time implementation of head pose estimation on relatively resource-limited mobile platforms is concerned, the method must satisfy various constraints while retaining sufficient face pose estimation accuracy. The proposed face pose estimation method met this objective. Experimental results on a test Android mobile device showed satisfactory real-time performance and accuracy.

  19. An interactive computer code for calculation of gas-phase chemical equilibrium (EQLBRM)

    NASA Technical Reports Server (NTRS)

    Pratt, B. S.; Pratt, D. T.

    1984-01-01

    A user-friendly, menu-driven, interactive computer program known as EQLBRM, which calculates the adiabatic equilibrium temperature and product composition resulting from the combustion of hydrocarbon fuels with air at specified constant pressure and enthalpy, is discussed. The program was developed primarily as an instructional tool to be run on small computers, allowing the user to economically and efficiently explore the effects of varying fuel type, air/fuel ratio, inlet air and/or fuel temperature, and operating pressure on the performance of continuous combustion devices such as gas turbine combustors, Stirling engine burners, and power generation furnaces.

  20. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  1. A fast ultrasonic simulation tool based on massively parallel implementations

    NASA Astrophysics Data System (ADS)

    Lambert, Jason; Rougeron, Gilles; Lacassagne, Lionel; Chatillon, Sylvain

    2014-02-01

    This paper presents a CIVA-optimized ultrasonic inspection simulation tool which takes advantage of the power of massively parallel architectures: graphics processing units (GPU) and multi-core general-purpose processors (GPP). This tool is based on the classical approach used in CIVA: the interaction model is based on Kirchhoff, and the ultrasonic field around the defect is computed by the pencil method. The model has been adapted and parallelized for both architectures. At this stage, the configurations addressed by the tool are: multi- and mono-element probes, planar specimens made of simple isotropic materials, and planar rectangular defects or side-drilled holes of small diameter. Validations of the model accuracy and performance measurements are presented.

  2. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-06-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, a geologist and a geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth from 8000 to 10,000 ft.

  3. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-09-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, a geologist and a geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth from 8000 to 10,000 ft.
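
    The clustering element common to both reports above can be sketched with scikit-learn (our hedged illustration, not the ICS tool kit itself): group map locations by their seismic attributes so clusters can be related to reservoir characteristics such as phi-h.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # rows = map locations; columns = seismic attributes (amplitude, frequency, ...)
    attributes = rng.normal(size=(500, 4))

    labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(
        StandardScaler().fit_transform(attributes)  # put attributes on one scale
    )
    print(np.bincount(labels))  # how many locations fall in each attribute facies
    ```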

  4. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  5. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules—Search Options and Applications in Food Science

    PubMed Central

    Minkiewicz, Piotr; Darewicz, Małgorzata; Iwaniak, Anna; Bucholska, Justyna; Starowicz, Piotr; Czyrko, Emilia

    2016-01-01

    Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate the introduction of secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or structure, annotating with the help of chemical codes or drawn using molecule editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. Exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of daily intake of bioactive compounds, prediction of metabolism of food components, and their biological activity as well as for prediction of interactions between food component and drugs. PMID:27929431

  6. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules-Search Options and Applications in Food Science.

    PubMed

    Minkiewicz, Piotr; Darewicz, Małgorzata; Iwaniak, Anna; Bucholska, Justyna; Starowicz, Piotr; Czyrko, Emilia

    2016-12-06

    Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate the introduction of secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or structure, annotating with the help of chemical codes or drawn using molecule editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. Exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of daily intake of bioactive compounds, prediction of metabolism of food components, and their biological activity as well as for prediction of interactions between food component and drugs.

  7. Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT

    PubMed Central

    Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose: In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures: This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results: We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1 : 2.73 : 3.54 : 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions: The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896

  8. psRNATarget: a plant small RNA target analysis server

    PubMed Central

    Dai, Xinbin; Zhao, Patrick Xuechun

    2011-01-01

    Plant endogenous non-coding short small RNAs (20–24 nt), including microRNAs (miRNAs) and a subset of small interfering RNAs (ta-siRNAs), play an important role in gene expression regulatory networks (GRNs). For example, many transcription factors and development-related genes have been reported as targets of these regulatory small RNAs. Although a number of miRNA target prediction algorithms and programs have been developed, most of them were designed for animal miRNAs, which are significantly different from plant miRNAs in the target recognition process. These differences demand the development of separate plant miRNA (and ta-siRNA) target analysis tool(s). We present psRNATarget, a plant small RNA target analysis server, which features two important analysis functions: (i) reverse complementary matching between small RNA and target transcript using a proven scoring schema, and (ii) target-site accessibility evaluation by calculating unpaired energy (UPE) required to ‘open’ secondary structure around the small RNA’s target site on the mRNA. psRNATarget incorporates recent discoveries in plant miRNA target recognition, e.g. it distinguishes translational and post-transcriptional inhibition, and it reports the number of small RNA/target site pairs that may affect small RNA binding activity to the target transcript. The psRNATarget server is designed for high-throughput analysis of next-generation data with an efficient distributed computing back-end pipeline that runs on a Linux cluster. The server front-end integrates three simplified user-friendly interfaces to accept user-submitted or preloaded small RNAs and transcript sequences, and outputs a comprehensive list of small RNA/target pairs along with online tools for batch downloading, keyword searching and result sorting. The psRNATarget server is freely available at http://plantgrn.noble.org/psRNATarget/. PMID:21622958
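
    The first analysis function can be illustrated with a toy version of reverse-complement matching (not the psRNATarget scoring schema itself; the sequences are hypothetical):

    ```python
    COMP = str.maketrans("ACGU", "UGCA")

    def reverse_complement(srna):
        """Reverse complement of an RNA sequence (5'->3')."""
        return srna.translate(COMP)[::-1]

    def mismatches(srna, site):
        """Count non-complementary positions between an sRNA and a same-length site."""
        rc = reverse_complement(srna)
        return sum(a != b for a, b in zip(rc, site))

    srna = "UGGAGCUCCCUUCAGUCCAA"   # hypothetical 20-nt small RNA
    site = "UUGGACUGAAGGGAGCUCCA"   # candidate target site on the mRNA
    print(mismatches(srna, site))   # low counts suggest a plausible target
    ```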

  9. Logic integration of mRNA signals by an RNAi-based molecular computer.

    PubMed

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-05-01

    Synthetic in vivo molecular 'computers' could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that 'transduce' mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi 'computational' module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting.

  10. VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs

    USDA-ARS?s Scientific Manuscript database

    Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...

  11. Molecular Docking of Enzyme Inhibitors: A Computational Tool for Structure-Based Drug Design

    ERIC Educational Resources Information Center

    Rudnitskaya, Aleksandra; Torok, Bela; Torok, Marianna

    2010-01-01

    Molecular docking is a frequently used method in structure-based rational drug design. It is used for evaluating the complex formation of small ligands with large biomolecules, predicting the strength of the bonding forces and finding the best geometrical arrangements. The major goal of this advanced undergraduate biochemistry laboratory exercise…

  12. The diminiode: A research and development tool for nuclear thermionics

    NASA Technical Reports Server (NTRS)

    Morris, J. F.

    1972-01-01

    Diminiodes are fixed-or variable-gap cesium diodes with plane miniature emitters and guarded collectors. In addition to smallness, their relative advantages are simplicity, precision, ease of fabrication, interchangeability of parts, cleanliness, full instrumentation, ruggedness, and economy. With diminiodes and computers used in thermionic performance mapping, a thorough electrode screening program becomes practical.

  13. An Undergraduate Survey Course on Asynchronous Sequential Logic, Ladder Logic, and Fuzzy Logic

    ERIC Educational Resources Information Center

    Foster, D. L.

    2012-01-01

    For a basic foundation in computer engineering, universities traditionally teach synchronous sequential circuit design, using discrete gates or field programmable gate arrays, and a microcomputers course that includes basic I/O processing. These courses, though critical, expose students to only a small subset of tools. At co-op schools like…

  14. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models, allowing scientists to focus effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null-cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data and output only a small amount of statistical information for each calibration run. A typical auto-calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto-calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed-computing, cross-platform environment. They allow incorporation of 'smart' calibration parameter generation (using artificial intelligence processing techniques). Null-cycle computing similar to SETI@home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
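
    The parameter-sweep pattern described (a model run thousands of times, each returning only a small summary statistic) can be sketched as follows (our outline, not the prototype's code):

    ```python
    from itertools import product
    from multiprocessing import Pool

    def run_model(params):
        """Stand-in for one model run; returns (params, goodness-of-fit)."""
        a, b = params
        error = (a - 0.3) ** 2 + (b - 1.7) ** 2  # pretend misfit surface
        return params, error

    if __name__ == "__main__":
        grid = list(product([i / 100 for i in range(100)],
                            [i / 10 for i in range(100)]))  # 10,000 parameter sets
        with Pool() as pool:
            # each worker returns only a tiny summary, so communication stays cheap
            best = min(pool.imap_unordered(run_model, grid), key=lambda r: r[1])
        print("best parameters:", best[0], "misfit:", best[1])
    ```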

  15. Current And Future Directions Of Lens Design Software

    NASA Astrophysics Data System (ADS)

    Gustafson, Darryl E.

    1983-10-01

    The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family; and a rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.

  16. Electrophoresis of tear proteins as a new diagnostic tool for two high risk groups for dry eye: computer users and contact lens wearers.

    PubMed

    Chiva, Andreea

    2011-08-15

    Dry eye is the most prevalent condition seen by the ophthalmologist, in particular in the elderly. The identification of new common risk factors (computer use and contact lens wear) extends the disease to young people as well. Early diagnosis of dry eye is essential but difficult, because the biochemical changes in the tear film usually occur before any detectable signs. Due to its advantages, electrophoresis of tear proteins could be an important tool for the diagnosis of tear film impairment in groups at high risk for dry eye. The role of tear protein electrophoresis in the early diagnosis of dry eye related to computer use and contact lens wear, as well as the biochemical changes in these high-risk groups, are presented. This review will summarize current data concerning the electrophoretic changes of tear proteins in computer users and contact lens wearers, two common high-risk groups for dry eye. Electrophoresis of tear proteins using the automated Hyrys-Hydrasys system (SEBIA, France) is an important tool for early diagnosis of tear film alterations and for monitoring of therapy. The quantification of many proteins in a single analysis using a small quantity of unconcentrated reflex tears is the main advantage of this technique. Electrophoresis of tear proteins should become a prerequisite, in particular for computer users of less than 3 h/day, as well as when prescribing contact lenses.

  17. Conformational analysis by intersection: CONAN.

    PubMed

    Smellie, Andrew; Stanton, Robert; Henne, Randy; Teig, Steve

    2003-01-15

    As high-throughput techniques in chemical synthesis and screening improve, more demands are placed on computer-assisted design and virtual screening. Many of these computational methods require one or more three-dimensional conformations for molecules, creating a demand for a conformational analysis tool that can rapidly and robustly cover the low-energy conformational spaces of small molecules. A new algorithm of intersection is presented here, which quickly generates (on average <0.5 seconds per stereoisomer) a complete description of the low-energy conformational space of a small molecule. The molecule is first decomposed into nonoverlapping nodes N (usually rings) and overlapping paths P, with conformations (N and P) generated in an offline process. In a second step, the node and path data are combined to form distinct conformers of the molecule. Finally, heuristics are applied after intersection to generate a small representative collection of conformations that span the conformational space. In a study of approximately 97,000 randomly selected molecules from the MDDR, results are presented that explore these conformations and their ability to cover low-energy conformational space. Copyright 2002 Wiley Periodicals, Inc. J Comput Chem 24: 10-20, 2003
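
    For comparison, a common open-source route to a small conformer ensemble is RDKit's distance-geometry embedding (shown for context only; this is not the intersection algorithm CONAN uses):

    ```python
    from rdkit import Chem
    from rdkit.Chem import AllChem

    # Paracetamol as an example small molecule; add hydrogens before embedding.
    mol = Chem.AddHs(Chem.MolFromSmiles("CC(=O)Nc1ccc(O)cc1"))
    cids = AllChem.EmbedMultipleConfs(mol, numConfs=20, randomSeed=7)

    # Force-field minimize each conformer; returns (not_converged, energy) pairs.
    results = AllChem.MMFFOptimizeMoleculeConfs(mol)
    energies = sorted(e for _, e in results)
    print(f"{len(cids)} conformers, lowest MMFF energy {energies[0]:.1f} kcal/mol")
    ```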

  18. MOLA: a bootable, self-configuring system for virtual screening using AutoDock4/Vina on computer clusters.

    PubMed

    Abreu, Rui Mv; Froufe, Hugo Jc; Queiroz, Maria João Rp; Ferreira, Isabel Cfr

    2010-10-28

    Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large-scale virtual screening is time demanding and usually requires dedicated computer clusters. There are a number of software tools that perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters. Also, no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina on bootable, non-dedicated computer clusters. MOLA automates several tasks including ligand preparation, parallel AutoDock4/Vina job distribution, and result analysis. When the virtual screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All result files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized Live CD GNU/Linux operating system, developed by us, that bypasses the operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via ethernet connections. MOLA is an ideal virtual screening tool for non-experienced users with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can simply be restarted to their original operating system. The originality of MOLA lies in the fact that any available computer, regardless of platform, can be added to the cluster without ever using the computer's hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a potential maximum speed-up of 10×, the parallel algorithm of MOLA achieved a speed-up of 8.64× using AutoDock4 and 8.60× using Vina.
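
    The job-distribution idea can be sketched in a few lines. The toy version below fans ligand-docking jobs out over local worker processes with Python's multiprocessing, standing in for MOLA's distribution across slave nodes; the file names are hypothetical, and the Vina command line is illustrative and may need adjusting to a given installation.

        import subprocess
        from multiprocessing import Pool

        LIGANDS = ["lig_001.pdbqt", "lig_002.pdbqt"]   # hypothetical inputs

        def dock(ligand):
            # Illustrative Vina call; flags and paths depend on your setup.
            out = ligand.replace(".pdbqt", "_out.pdbqt")
            subprocess.run(["vina", "--receptor", "receptor.pdbqt",
                            "--ligand", ligand, "--out", out], check=True)
            return out

        if __name__ == "__main__":
            with Pool(processes=4) as pool:   # workers stand in for slave nodes
                for finished in pool.imap_unordered(dock, LIGANDS):
                    print("done:", finished)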

  19. Instability Mechanisms of Thermally-Driven Interfacial Flows in Liquid-Encapsulated Crystal Growth

    NASA Technical Reports Server (NTRS)

    Haj-Hariri, Hossein; Borhan, Ali

    1997-01-01

    During the past year, a great deal of effort was focused on the enhancement and refinement of the computational tools developed as part of our previous NASA grant. In particular, the interface mollification algorithm developed earlier was extended to incorporate the effects of surface-rheological properties in order to allow the study of thermocapillary flows in the presence of surface contamination. These tools will be used in the computational component of the proposed research in the remaining years of this grant. A detailed description of the progress made in this area is provided elsewhere. Briefly, the method developed allows for the convection and diffusion of bulk-insoluble surfactants on a moving and deforming interface. The novelty of the method is its grid independence: there is no need for front tracking, surface reconstruction, body-fitted grid generation, or metric evaluations; these are all very expensive computational tasks in three dimensions. For small local radii of curvature there is a need for local grid adaption so that the smearing thickness remains a small fraction of the radius of curvature. A special Neumann boundary condition was devised and applied so that the calculated surfactant concentration has no variations normal to the interface, and it is hence truly a surface-defined quantity. The discretized governing equations are solved subsequently using a time-split integration scheme which updates the concentration and the shape successively. Results demonstrate excellent agreement between the computed and exact solutions.
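
    As a rough illustration of the time-split integration idea, the sketch below advances a 1D periodic surfactant concentration by applying advection and diffusion in successive sub-steps. It is a toy analogue of the successive concentration/shape updates described above, not the authors' mollification scheme, and all parameter values are arbitrary.

        import numpy as np

        nx, L = 200, 1.0
        dx, dt = L / nx, 1e-4
        u, D = 0.5, 1e-3                               # arbitrary velocity, diffusivity
        x = np.linspace(0.0, L, nx, endpoint=False)
        c = np.exp(-((x - 0.5 * L) ** 2) / 0.01)       # initial surfactant blob

        def split_step(c):
            # Sub-step 1: advect (first-order upwind, valid for u > 0).
            c = c - dt * u * (c - np.roll(c, 1)) / dx
            # Sub-step 2: diffuse (explicit central differences).
            c = c + dt * D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx ** 2
            return c

        mass0 = c.sum() * dx
        for _ in range(2000):
            c = split_step(c)
        # Both sub-steps conserve total surfactant on a periodic grid:
        print("total mass drift:", abs(c.sum() * dx - mass0))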

  20. Materials by numbers: Computations as tools of discovery

    PubMed Central

    Landman, Uzi

    2005-01-01

    Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold, a metal that in bulk form is notorious for being chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210

  1. Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine.

    PubMed

    Greer, Andrew Im; Della-Rosa, Benoit; Khokhar, Ali Z; Gadegaard, Nikolaj

    2016-12-01

    The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm² of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large-area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts the capacity to handle larger substrates, temperature control, active force control, up to ten times more curing dose, and compactness. Actual devices are fabricated using the machine, including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.

  2. Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine

    NASA Astrophysics Data System (ADS)

    Greer, Andrew IM; Della-Rosa, Benoit; Khokhar, Ali Z.; Gadegaard, Nikolaj

    2016-03-01

    The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm² of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large-area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts the capacity to handle larger substrates, temperature control, active force control, up to ten times more curing dose, and compactness. Actual devices are fabricated using the machine, including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.

  3. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Foudriat, E. C.

    1991-01-01

    A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. While the probability of deadlock is quite small, its effects on performance can be significant.

  4. Prompt and Precise Prototyping

    NASA Technical Reports Server (NTRS)

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  5. Novel Robotic Tools for Piping Inspection and Repair

    DTIC Science & Technology

    2015-01-14

    was selected due to its small size and peripheral capability. The SoM measures 50 mm x 44 mm. The SoM processor is an ARM Cortex-A8 running at 720 MHz ... designing an embedded computing system from scratch. The SoM is a single integrated module which contains the processor, RAM, power management, and

  6. An open source tool for automatic spatiotemporal assessment of calcium transients and local ‘signal-close-to-noise’ activity in calcium imaging data

    PubMed Central

    Martin, Corinna; Jablonka, Sibylle

    2018-01-01

    Local and spontaneous calcium signals play important roles in neurons and neuronal networks. Spontaneous or cell-autonomous calcium signals may be difficult to assess because they appear in an unpredictable spatiotemporal pattern and in very small neuronal loci of axons or dendrites. We developed an open source bioinformatics tool for an unbiased assessment of calcium signals in x,y-t imaging series. The tool bases its algorithm on a continuous wavelet transform-guided peak detection to identify calcium signal candidates. The highly sensitive calcium event definition is based on identification of peaks in 1D data through analysis of a 2D wavelet transform surface. For spatial analysis, the tool uses a grid to separate the x,y-image field in independently analyzed grid windows. A document containing a graphical summary of the data is automatically created and displays the loci of activity for a wide range of signal intensities. Furthermore, the number of activity events is summed up to create an estimated total activity value, which can be used to compare different experimental situations, such as calcium activity before or after an experimental treatment. All traces and data of active loci become documented. The tool can also compute the signal variance in a sliding window to visualize activity-dependent signal fluctuations. We applied the calcium signal detector to monitor activity states of cultured mouse neurons. Our data show that both the total activity value and the variance area created by a sliding window can distinguish experimental manipulations of neuronal activity states. Notably, the tool is powerful enough to compute local calcium events and ‘signal-close-to-noise’ activity in small loci of distal neurites of neurons, which remain during pharmacological blockade of neuronal activity with inhibitors such as tetrodotoxin, to block action potential firing, or inhibitors of ionotropic glutamate receptors. The tool can also offer information about local homeostatic calcium activity events in neurites. PMID:29601577
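
    The wavelet-guided peak detection at the core of such a tool can be approximated with SciPy's off-the-shelf CWT peak finder. The sketch below, run on a synthetic trace, is only an analogue of the published algorithm; the transient shapes and the width range are assumptions.

        import numpy as np
        from scipy import signal

        rng = np.random.default_rng(0)
        t = np.arange(2000.0)
        # Synthetic trace: two calcium transients of different widths plus noise.
        trace = (np.exp(-((t - 500.0) ** 2) / 200.0)
                 + 0.6 * np.exp(-((t - 1400.0) ** 2) / 800.0)
                 + 0.05 * rng.normal(size=t.size))

        # CWT-guided detection over an assumed range of transient widths.
        peaks = signal.find_peaks_cwt(trace, widths=np.arange(5, 40))
        print("candidate peak frames:", peaks)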

  7. CAD for small hydro projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bishop, N.A. Jr.

    1994-04-01

    Over the past decade, computer-aided design (CAD) has become a practical and economical design tool. Today, specifying CAD hardware and software is relatively easy once you know what the design requirements are. But finding experienced CAD professionals is often more difficult. Most CAD users have only two or three years of design experience; more experienced design personnel are frequently not CAD literate. However, effective use of CAD can be the key to lowering design costs and improving design quality--a quest familiar to every manager and designer. By emphasizing computer-aided design literacy at all levels of the firm, a Canadian joint-venture company that specializes in engineering small hydroelectric projects has cut costs, become more productive and improved design quality. This article describes how they did it.

  8. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  9. Optical Micromachining

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovation Research) contract with Marshall Space Flight Center, Potomac Photonics, Inc., constructed and demonstrated a unique tool that fills a need in the area of diffractive and refractive micro-optics. It is an integrated computer-aided design and computer-aided micro-machining workstation that will extend the benefits of diffractive and micro-optic technology to optical designers. Applications of diffractive optics include sensors and monitoring equipment, analytical instruments, and fiber optic distribution and communication. The company has been making diffractive elements with the system as a commercial service for the last year.

  10. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools.

    PubMed

    Deshmukh, Rupesh K; Sonah, Humira; Bélanger, Richard R

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research.

  11. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459

  12. Logic integration of mRNA signals by an RNAi-based molecular computer

    PubMed Central

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-01-01

    Synthetic in vivo molecular ‘computers’ could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that ‘transduce’ mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi ‘computational’ module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting. PMID:20194121
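
    Abstracting away the biochemistry, the logic layer being evaluated can be pictured as thresholded Boolean functions on mRNA concentrations. The toy sketch below shows a two-input AND under an assumed activation threshold; it models only the logic, not the strand-exchange mechanism reported in the paper.

        # Toy threshold logic on mRNA concentrations (arbitrary units).
        THRESHOLD = 1.0    # assumed activation level, not from the paper

        def sensor(mrna):
            # A 'sensor' reports True when its input transcript is abundant.
            return mrna > THRESHOLD

        def and_gate(mrna_a, mrna_b):
            # Output is ON only when both input mRNAs are high.
            return sensor(mrna_a) and sensor(mrna_b)

        print(and_gate(2.0, 0.4))   # False: second input below threshold
        print(and_gate(2.0, 1.5))   # True: both inputs high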

  13. FunRich proteomics software analysis, let the fun begin!

    PubMed

    Benito-Martin, Alberto; Peinado, Héctor

    2015-08-01

    Protein MS analysis is the preferred method for unbiased protein identification and is routinely applied in both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich, an open-access software package that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software: a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The Computational Science Education Reference Desk: A tool for increasing inquiry based learning in the science classroom

    NASA Astrophysics Data System (ADS)

    Joiner, D. A.; Stevenson, D. E.; Panoff, R. M.

    2000-12-01

    The Computational Science Education Reference Desk (CSERD) is an online tool designed to provide educators in math, physics, astronomy, biology, chemistry, and engineering with information on how to use computational science to enhance inquiry-based learning in the undergraduate and pre-college classroom. The Reference Desk features a showcase of original content exploration activities, including lesson plans and background materials; a catalog of websites which contain models, lesson plans, software, and instructional resources; and a forum to allow educators to communicate their ideas. Many of the recent advances in astronomy rely on the use of computer simulation, and tools are being developed by CSERD to allow students to experiment with some of the models that have guided scientific discovery. One of these models allows students to study how scientists use spectral information to determine the makeup of the interstellar medium by modeling the interstellar extinction curve using spherical grains of silicate, amorphous carbon, or graphite. Students can directly compare their model to the average interstellar extinction curve and experiment with how small changes in their model alter the shape of the curve. A simpler model allows students to visualize spatial relationships between the Earth, Moon, and Sun to understand the cause of the phases of the moon. A report on the usefulness of these models in two classes, the Computational Astrophysics workshop at The Shodor Education Foundation and the Conceptual Astronomy class at the University of North Carolina at Greensboro, will be presented.

  15. Introductory Tools for Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Kuai, L.; Natraj, V.; Yung, Y.

    2006-12-01

    Satellite data are currently so voluminous that, despite their unprecedented quality and potential for scientific application, only a small fraction is analyzed due to two factors: researchers' computational constraints and a relatively small number of researchers actively utilizing the data. Ultimately it is hoped that the terabytes of unanalyzed data being archived can receive scientific scrutiny, but this will require a popularization of the methods associated with the analysis. Since a large portion of the complexity is associated with the proper implementation of the radiative transfer model, it is reasonable and appropriate to make the model as accessible as possible to general audiences. Unfortunately, the algorithmic and conceptual details that are necessary for state-of-the-art analysis also tend to frustrate accessibility for those new to remote sensing. Several efforts have been made to offer web-based radiative transfer calculations, and these are useful for limited calculations, but analysis of more than a few spectra requires the utilization of home- or server-based computing resources. We present a system that is designed to allow easier access to radiative transfer models, with implementation on a home computing platform, in the hope that this system can be utilized and expanded upon in advanced high school and introductory college settings. This learning-by-doing process is aided through the use of several powerful tools. The first is a wikipedia-style introduction to the salient features of radiative transfer that references the seminal works in the field and refers to more complicated calculations and algorithms sparingly. The second feature is a technical forum, commonly referred to as a tiki-wiki, that addresses technical and conceptual questions through public postings, private messages, and a ranked searching routine. Together, these tools may be able to facilitate greater interest in the field of remote sensing.

  16. Computational prediction of type III and IV secreted effectors in Gram-negative bacteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Corrigan, Abigail L.; Peterson, Elena S.

    2011-01-01

    In this review, we provide an overview of the methods employed by four recent papers that described novel methods for computational prediction of secreted effectors from type III and IV secretion systems in Gram-negative bacteria. We summarize the results of the studies in terms of performance at accurately predicting secreted effectors, and the similarities found between secretion signals that may reflect biologically relevant features for recognition. We discuss the web-based tools for secreted effector prediction described in these studies and announce the availability of our tool, the SIEVEserver (http://www.biopilot.org). Finally, we assess the accuracy of the three type III effector prediction methods on a small set of proteins not known prior to the development of these tools that we have recently discovered and validated using both experimental and computational approaches. Our comparison shows that all methods use similar approaches and, in general, arrive at similar conclusions. We discuss the possibility of an order-dependent motif in the secretion signal, which was a point of disagreement among the studies. Our results show that there may be classes of effectors in which the signal has a loosely defined motif, and others in which secretion depends only on compositional biases. Computational prediction of secreted effectors from protein sequences represents an important step toward better understanding the interaction between pathogens and hosts.

  17. SmallTool - a toolkit for realizing shared virtual environments on the Internet

    NASA Astrophysics Data System (ADS)

    Broll, Wolfgang

    1998-09-01

    With increasing graphics capabilities of computers and higher network communication speed, networked virtual environments have become available to a large number of people. While the virtual reality modelling language (VRML) provides users with the ability to exchange 3D data, there is still a lack of appropriate support to realize large-scale multi-user applications on the Internet. In this paper we will present SmallTool, a toolkit to support shared virtual environments on the Internet. The toolkit consists of a VRML-based parsing and rendering library, a device library, and a network library. This paper will focus on the networking architecture, provided by the network library - the distributed worlds transfer and communication protocol (DWTP). DWTP provides an application-independent network architecture to support large-scale multi-user environments on the Internet.

  18. Simple re-instantiation of small databases using cloud computing.

    PubMed

    Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M

    2013-01-01

    Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.

  19. Simple re-instantiation of small databases using cloud computing

    PubMed Central

    2013-01-01

    Background Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. Results We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Conclusions Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear. PMID:24564380

  20. DOE's Computer Incident Advisory Capability (CIAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, E.

    1990-09-01

    Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future. 4 refs.

  1. StarScan: a web server for scanning small RNA targets from degradome sequencing data.

    PubMed

    Liu, Shun; Li, Jun-Hao; Wu, Jie; Zhou, Ke-Ren; Zhou, Hui; Yang, Jian-Hua; Qu, Liang-Hu

    2015-07-01

    Endogenous small non-coding RNAs (sRNAs), including microRNAs, PIWI-interacting RNAs and small interfering RNAs, play important gene regulatory roles in animals and plants by pairing with protein-coding and non-coding transcripts. However, computationally assigning these various sRNAs to their regulatory target genes remains technically challenging. Recently, a high-throughput degradome sequencing method was applied to identify biologically relevant sRNA cleavage sites. In this study, an integrated web-based tool, StarScan (sRNA target Scan), was developed for scanning sRNA targets using degradome sequencing data from 20 species. Given an sRNA sequence from plants or animals, our web server performs an ultrafast and exhaustive search for potential sRNA-target interactions in annotated and unannotated genomic regions. The interactions between small RNAs and target transcripts are further evaluated using a novel tool, alignScore. A second tool, degradomeBinomTest, was developed to quantify the abundance of degradome fragments located at the 9th-11th nucleotides from the sRNA 5' end. This is the first web server for discovering potential sRNA-mediated RNA cleavage events in plants and animals, which affords mechanistic insights into the regulatory roles of sRNAs. The StarScan web server is available at http://mirlab.sysu.edu.cn/starscan/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
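
    A test in the spirit of degradomeBinomTest can be sketched with SciPy's exact binomial test: ask whether degradome reads are enriched in the 9-11 nt window relative to a uniform background. The counts and the assumed 21-nt background window below are hypothetical, not the tool's actual defaults.

        from scipy.stats import binomtest

        # Hypothetical counts: reads whose 5' ends fall in the 9-11 nt window
        # of an assumed 21-nt pairing region, tested against a uniform background.
        k, n = 42, 120
        p0 = 3.0 / 21.0                        # 3 favourable positions out of 21

        res = binomtest(k, n, p0, alternative="greater")
        print(f"enrichment p-value: {res.pvalue:.3g}")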

  2. vvtools v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, Richard R.

    Vvtools is a suite of testing tools with a focus on reproducible verification and validation. The tools are written in pure Python and comprise a test harness and an automated process management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.

  3. Portable computing - A fielded interactive scientific application in a small off-the-shelf package

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Hazelton, Lyman; Frainier, Rich; Compton, Michael; Colombano, Silvano; Szolovits, Peter

    1993-01-01

    Experience with the design and implementation of a portable computing system for STS crew-conducted science is discussed. Principal-Investigator-in-a-Box (PI) will help the SLS-2 astronauts perform vestibular (human orientation system) experiments in flight. PI is an interactive system that provides data acquisition and analysis, experiment step rescheduling, and various other forms of reasoning to astronaut users. The hardware architecture of PI consists of a computer and an analog interface box. 'Off-the-shelf' equipment is employed in the system wherever possible in an effort to use widely available tools and then to add custom functionality and application codes to them. Other projects which can help prospective teams to learn more about portable computing in space are also discussed.

  4. Computer Aided Drug Design: Success and Limitations.

    PubMed

    Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho

    2016-01-01

    Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.

  5. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues, understanding how detail at one scale makes its signature felt at other scales and how to relate phenomena across scales, cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  6. Efficient computation of the joint probability of multiple inherited risk alleles from pedigree data.

    PubMed

    Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo

    2018-06-25

    The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows peeling-based models to be extended to include many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate, we create an extended five-gene version of BRCAPRO, a widely used model for estimating the carrier probabilities of BRCA1 and BRCA2 risk alleles, and assess its computational properties. © 2018 WILEY PERIODICALS, INC.
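
    The exponential bottleneck is easy to see: with three genotypes per locus (AA, Aa, aa), the joint genotype space that an exact peeling pass must track grows as 3^n. A one-function sketch:

        # Genotypes per locus (AA, Aa, aa) -> joint space grows as 3**n_loci,
        # the scaling that limits exact peeling to a handful of loci.
        def n_joint_genotypes(n_loci, genotypes_per_locus=3):
            return genotypes_per_locus ** n_loci

        for n in (1, 2, 5, 10, 20):
            print(f"{n:>2} loci: {n_joint_genotypes(n):,} joint genotypes")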

  7. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.
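
    The kind of convergence summary such a tool automates can be sketched as follows, assuming a simple two-column (iteration, residual) history file; the real OVERFLOW output layout differs, and the file name is hypothetical.

        import numpy as np

        # Assumed two-column history file: iteration number, L2 residual.
        hist = np.loadtxt("resid_history.dat")       # hypothetical file name
        iters, resid = hist[:, 0], hist[:, 1]
        orders = np.log10(resid[0] / resid[-1])      # orders of magnitude converged
        print(f"residual fell {orders:.1f} orders of magnitude "
              f"over {int(iters[-1])} iterations")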

  8. Computer-based synthetic data to assess the tree delineation algorithm from airborne LiDAR survey

    Treesearch

    Lei Wang; Andrew G. Birt; Charles W. Lafon; David M. Cairns; Robert N. Coulson; Maria D. Tchakerian; Weimin Xi; Sorin C. Popescu; James M. Guldin

    2013-01-01

    Small Footprint LiDAR (Light Detection And Ranging) has been proposed as an effective tool for measuring detailed biophysical characteristics of forests over broad spatial scales. However, by itself LiDAR yields only a sample of the true 3D structure of a forest. In order to extract useful forestry relevant information, this data must be interpreted using mathematical...

  9. Sodium dopants in helium clusters: Structure, equilibrium and submersion kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo, F.

    Alkali impurities bind to helium nanodroplets very differently depending on their size and charge state, with large neutral or charged dopants being wetted by the droplet whereas small neutral impurities prefer to reside at the droplet surface. Using various computational modeling tools such as quantum Monte Carlo and path-integral molecular dynamics simulations, we have revisited some aspects of the physical chemistry of helium droplets interacting with sodium impurities, including the onset of snowball formation in the presence of many-body polarization forces, the transition from non-wetted to wetted behavior in larger sodium clusters, and the kinetics of submersion of small dopants after sudden ionization.

  10. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer-aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.

  11. A precise goniometer/tensiometer using a low cost single-board computer

    NASA Astrophysics Data System (ADS)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest in small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
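
    As a much simpler stand-in for the Young-Laplace-based ADSA fit, the sketch below estimates a contact angle by least-squares fitting a circle to a synthetic spherical-cap profile; real sessile drops deviate from circularity under gravity, so this is illustrative only.

        import numpy as np

        def circle_fit(x, y):
            # Algebraic least-squares fit of x^2 + y^2 = p*x + q*y + c.
            A = np.column_stack([x, y, np.ones_like(x)])
            (p, q, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
            xc, yc = p / 2.0, q / 2.0
            return xc, yc, np.sqrt(c + xc ** 2 + yc ** 2)

        # Synthetic edge profile: spherical cap with a 70 degree contact angle.
        r0, yc0 = 1.0, -np.cos(np.deg2rad(70.0))   # centre below the baseline
        a = np.linspace(0.0, 2.0 * np.pi, 720)
        xs, ys = r0 * np.cos(a), yc0 + r0 * np.sin(a)
        keep = ys >= 0.0                           # keep points above baseline
        xc, yc, r = circle_fit(xs[keep], ys[keep])
        theta = np.degrees(np.arccos(-yc / r))     # angle at the y = 0 baseline
        print(f"recovered contact angle: {theta:.1f} deg")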

  12. Nonlinear Aerodynamics and the Design of Wing Tips

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan

    1991-01-01

    The analysis and design of wing tips for fixed wing and rotary wing aircraft still remains part art, part science. Although the design of airfoil sections and basic planform geometry is well developed, the tip regions require more detailed consideration. This is important because of the strong impact of wing tip flow on wing drag; although the tip region constitutes a small portion of the wing, its effect on the drag can be significant. The induced drag of a wing is, for a given lift and speed, inversely proportional to the square of the wing span. Concepts are proposed as a means of reducing drag. Modern computational methods provide a tool for studying these issues in greater detail. The purpose of the current research program is to improve the understanding of the fundamental issues involved in the design of wing tips and to develop the range of computational and experimental tools needed for further study of these ideas.

  13. Characterization of Protein Flexibility Using Small-Angle X-Ray Scattering and Amplified Collective Motion Simulations

    PubMed Central

    Wen, Bin; Peng, Junhui; Zuo, Xiaobing; Gong, Qingguo; Zhang, Zhiyong

    2014-01-01

    Large-scale flexibility within a multidomain protein often plays an important role in its biological function. Despite its inherently low resolution, small-angle x-ray scattering (SAXS) is well suited to investigate protein flexibility and to determine, with the help of computational modeling, what kinds of protein conformations coexist in solution. In this article, we develop a tool that combines SAXS data with a previously developed sampling technique called amplified collective motions (ACM) to elucidate structures of highly dynamic multidomain proteins in solution. We demonstrate the use of this tool on two proteins, bacteriophage T4 lysozyme and the tandem WW domains of formin-binding protein 21. The ACM simulations can sample the conformational space of proteins much more extensively than standard molecular dynamics (MD) simulations. Therefore, conformations generated by ACM are significantly better at reproducing the SAXS data than are those from MD simulations. PMID:25140431
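
    The forward step of comparing model conformations against SAXS data can be illustrated with the classical Debye formula for point scatterers, a drastically simplified analogue of the scattering calculations used in such studies; the coordinates and q-range below are synthetic.

        import numpy as np

        def debye_profile(coords, q):
            # Debye formula for identical unit scatterers:
            # I(q) = sum_ij sin(q r_ij) / (q r_ij), with i = j terms equal to 1.
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            # np.sinc(x/pi) evaluates sin(x)/x and handles the r_ij = 0 diagonal.
            return np.array([np.sinc(qk * d / np.pi).sum() for qk in q])

        rng = np.random.default_rng(1)
        conf = 10.0 * rng.normal(size=(50, 3))   # fake C-alpha coordinates (angstroms)
        q = np.linspace(0.01, 0.5, 60)           # scattering vector (1/angstrom)
        profile = debye_profile(conf, q)
        print(profile[:3] / profile[0])          # normalized low-q behaviour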

  14. FITSManager: Management of Personal Astronomical Data

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Fan, Dongwei; Zhao, Yongheng; Kembhavi, Ajit; He, Boliang; Cao, Zihuang; Li, Jian; Nandrekar, Deoyani

    2011-07-01

    With the increase of personal storage capacity, it is easy to find hundreds to thousands of FITS files on the personal computer of an astrophysicist. Because Flexible Image Transport System (FITS) is a professional data format initiated by astronomers and used mainly within a small community, data management toolkits for FITS files are very few. Astronomers need a powerful tool to help them manage their local astronomical data. Although the Virtual Observatory (VO) is a network-oriented astronomical research environment, its applications and related technologies provide useful solutions to enhance the management and utilization of astronomical data hosted on an astronomer's personal computer. FITSManager is such a tool, providing astronomers efficient management and utilization of their local data and bringing the VO to astronomers in a seamless and transparent way. FITSManager provides useful functions for FITS file management, such as thumbnails, previews, type-dependent icons, header keyword indexing and search, and collaborative working with other tools and online services. The development of FITSManager is an effort to fill the gap between management and analysis of astronomical data.

  15. Approximating local observables on projected entangled pair states

    NASA Astrophysics Data System (ADS)

    Schwarz, M.; Buerschaper, O.; Eisert, J.

    2017-06-01

    Tensor network states are for good reasons believed to capture ground states of gapped local Hamiltonians arising in the condensed matter context, states which are in turn expected to satisfy an entanglement area law. However, the computational hardness of contracting projected entangled pair states in two- and higher-dimensional systems is often seen as a significant obstacle when devising higher-dimensional variants of the density-matrix renormalization group method. In this work, we show that for those projected entangled pair states that are expected to provide good approximations of such ground states of local Hamiltonians, one can compute local expectation values in quasipolynomial time. We therefore provide a complexity-theoretic justification of why state-of-the-art numerical tools work so well in practice. We finally turn to the computation of local expectation values on quantum computers, providing a meaningful application for a small-scale quantum computer.

  16. A Multiphysics and Multiscale Software Environment for Modeling Astrophysical Systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; O'Nualláin, Breanndán; Heggie, Douglas; Lombardi, James; Hut, Piet; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Fuji, Michiko; Gaburov, Evghenii; Glebbeek, Evert; Groen, Derek; Harfst, Stefan; Izzard, Rob; Jurić, Mario; Justham, Stephen; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    We present MUSE, a software framework for tying together existing computational tools for different astrophysical domains into a single multiphysics, multiscale workload. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for a generalized stellar systems workload. MUSE has now reached a "Noah's Ark" milestone, with two available numerical solvers for each domain. MUSE can treat small stellar associations, galaxies and everything in between, including planetary systems, dense stellar clusters and galactic nuclei. Here we demonstrate an example calculated with MUSE: the merger of two galaxies. In addition, we demonstrate the working of MUSE on a distributed computer. The current MUSE code base is publicly available as open source at http://muse.li.

  17. The Internet and Computer User Profile: a questionnaire for determining intervention targets in occupational therapy at mental health vocational centers.

    PubMed

    Regev, Sivan; Hadas-Lidor, Noami; Rosenberg, Limor

    2016-08-01

    In this study, the assessment tool "Internet and Computer User Profile" questionnaire (ICUP) is presented and validated. It was developed in order to gather information for setting intervention goals that meet current demands. Sixty-eight subjects aged 23-68 participated in the study. The study group (n = 28) was sampled from two vocational centers. The control group consisted of 40 participants from the general population, sampled by convenience sampling based on the demographics of the study group. Subjects from both groups answered the ICUP questionnaire. Subjects in the study group also answered the General Self-Efficacy (GSE) questionnaire and performed the Assessment of Computer Task Performance (ACTP) test in order to examine the convergent validity of the ICUP. Twenty subjects from both groups retook the ICUP questionnaire in order to obtain test-retest results. Differences between groups were tested using multiple analysis of variance (MANOVA) tests. Pearson and Spearman tests were used for calculating correlations. Cronbach's alpha coefficient and the kappa equivalent were used to assess internal consistency. The results indicate that the questionnaire is valid and reliable. They emphasize that the layout of the ICUP items facilitates a comprehensive examination of the client's perception regarding his or her participation in computer and internet activities. Implications for Rehabilitation: The assessment tool "Internet and Computer User Profile" (ICUP) questionnaire is a novel assessment tool that evaluates operative use and individual perception of computer activities. The questionnaire is valid and reliable for use with participants of vocational centers dealing with mental illness. It is essential to facilitate access to computers for people with mental illnesses, seeing that they express similar interest in computers and the internet as people from the general population of the same age. Early intervention will be particularly effective for young adults dealing with mental illness, since the digital gap between them and young people in general is relatively small.
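
    The internal-consistency statistic mentioned above is straightforward to compute; a minimal sketch of Cronbach's alpha for an item-score matrix (with made-up Likert responses) follows.

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_respondents, n_items) matrix of item scores.
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1.0) * (1.0 - item_var / total_var)

        # Made-up Likert responses (rows: respondents, columns: items):
        scores = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 4, 3], [1, 2, 2]]
        print(f"alpha = {cronbach_alpha(scores):.2f}")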

  18. Small Business Innovations

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under an Army Small Business Innovation Research (SBIR) grant, Symbiotics, Inc. developed a software system that permits users to upgrade products from standalone applications so they can communicate in a distributed computing environment. Under a subsequent NASA SBIR grant, Symbiotics added additional tools to the SOCIAL product to enable NASA to coordinate conventional systems for planning Shuttle launch support operations. Using SOCIAL, data may be shared among applications in a computer network even when the applications are written in different programming languages. The product was introduced to the commercial market in 1993 and is used to monitor and control equipment for operation support and to integrate financial networks. The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry. InQuisiX is a reuse library providing high performance classification, cataloging, searching, browsing, retrieval and synthesis capabilities. These form the foundation for software reuse, producing higher quality software at lower cost and in less time. Software Productivity Solutions, Inc. developed the technology under Small Business Innovation Research (SBIR) projects funded by NASA and the Army and is marketing InQuisiX in conjunction with Science Applications International Corporation (SAIC). The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry.

  19. Current Capabilities at SNL for the Integration of Small Modular Reactors onto Smart Microgrids Using Sandia's Smart Microgrid Technology High Performance Computing and Advanced Manufacturing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Salvador B.

    Smart grids are a crucial component for enabling the nation’s future energy needs, as part of a modernization effort led by the Department of Energy. Smart grids and smart microgrids are being considered in niche applications, and as part of a comprehensive energy strategy to help manage the nation’s growing energy demands, for critical infrastructures, military installations, small rural communities, and large populations with limited water supplies. As part of a far-reaching strategic initiative, Sandia National Laboratories (SNL) presents herein a unique, three-pronged approach to integrate small modular reactors (SMRs) into microgrids, with the goal of providing economically competitive, reliable, and secure energy to meet the nation’s needs. SNL’s triad methodology involves an innovative blend of smart microgrid technology, high performance computing (HPC), and advanced manufacturing (AM). In this report, Sandia’s current capabilities in those areas are summarized, as well as paths forward that will enable DOE to achieve its energy goals. In the area of smart grid/microgrid technology, Sandia’s current computational capabilities can model the entire grid, including temporal aspects and cyber security issues. Our tools include system development, integration, testing and evaluation, monitoring, and sustainment.

  20. Assessing a mini-application as a performance proxy for a finite element method engineering application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Paul T.; Heroux, Michael A.; Barrett, Richard F.

    The performance of a large-scale, production-quality science and engineering application (‘app’) is often dominated by a small subset of the code. Even within that subset, computational and data access patterns are often repeated, so that an even smaller portion can represent the performance-impacting features. If application developers, parallel computing experts, and computer architects can together identify this representative subset and then develop a small mini-application (‘miniapp’) that can capture these primary performance characteristics, then this miniapp can be used both to improve the performance of the app and to provide a tool for co-design for the high-performance computing community. However, a critical question is whether a miniapp can effectively capture key performance behavior of an app. This study provides a comparison of an implicit finite element semiconductor device modeling app on unstructured meshes with an implicit finite element miniapp on unstructured meshes. The goal is to assess whether the miniapp is predictive of the performance of the app. Single compute node performance is compared, as well as scaling up to 16,000 cores. Results indicate that the miniapp can be reasonably predictive of the performance characteristics of the app for a single iteration of the solver on a single compute node.

  1. Assessing a mini-application as a performance proxy for a finite element method engineering application

    DOE PAGES

    Lin, Paul T.; Heroux, Michael A.; Barrett, Richard F.; ...

    2015-07-30

    The performance of a large-scale, production-quality science and engineering application (‘app’) is often dominated by a small subset of the code. Even within that subset, computational and data access patterns are often repeated, so that an even smaller portion can represent the performance-impacting features. If application developers, parallel computing experts, and computer architects can together identify this representative subset and then develop a small mini-application (‘miniapp’) that can capture these primary performance characteristics, then this miniapp can be used both to improve the performance of the app and to provide a tool for co-design for the high-performance computing community. However, a critical question is whether a miniapp can effectively capture key performance behavior of an app. This study provides a comparison of an implicit finite element semiconductor device modeling app on unstructured meshes with an implicit finite element miniapp on unstructured meshes. The goal is to assess whether the miniapp is predictive of the performance of the app. Single compute node performance is compared, as well as scaling up to 16,000 cores. Results indicate that the miniapp can be reasonably predictive of the performance characteristics of the app for a single iteration of the solver on a single compute node.

  2. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED... STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  3. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.

  4. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu. PMID:18509477

  5. Vibrational Spectroscopy and Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina M.; Kwak, D. (Technical Monitor)

    2001-01-01

    The role of vibrational spectroscopy in solving problems related to astrobiology will be discussed. Vibrational (infrared) spectroscopy is a very sensitive tool for identifying molecules. The theoretical approach used in this work is based on the direct computation of anharmonic vibrational frequencies and intensities from electronic structure codes. One application of this computational technique is the possible identification of biological building blocks (amino acids, small peptides, DNA bases) in the interstellar medium (ISM). Identifying small biological molecules in the ISM is very important from the point of view of the origin of life. Hybrid (quantum mechanics/molecular mechanics) theoretical techniques will be discussed that may make it possible to obtain accurate vibrational spectra of biomolecular building blocks and to create a database of spectroscopic signatures that can assist observations of these molecules in space. Another application of the direct computational spectroscopy technique is to help design and analyze experimental observations of the ice surface of one of Jupiter's moons, Europa, which possibly contains hydrated salts. The presence of hydrated salts on the surface can be an indication of a subsurface ocean and the possible existence of life forms inhabiting such an ocean.
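
    As a much simpler illustration of the quantities involved, the harmonic baseline that such anharmonic methods refine can be computed for a diatomic from the textbook relation: wavenumber = (1 / 2*pi*c) * sqrt(k / mu). A sketch in Python; the CO force constant used here is an approximate literature-style value chosen for illustration, not a result from this work:

```python
import math

C_CM_PER_S = 2.998e10    # speed of light in cm/s
AMU_TO_KG = 1.66054e-27  # atomic mass unit in kg

def harmonic_wavenumber(k_force: float, mu_amu: float) -> float:
    """Harmonic vibrational wavenumber (cm^-1) of a diatomic:
    nu~ = (1 / 2*pi*c) * sqrt(k / mu), with k in N/m and mu in amu."""
    mu = mu_amu * AMU_TO_KG
    omega = math.sqrt(k_force / mu)  # angular frequency, rad/s
    return omega / (2.0 * math.pi * C_CM_PER_S)

# CO: force constant ~1902 N/m (approximate), reduced mass 12*16/28 amu
print(f"CO stretch ~ {harmonic_wavenumber(1902.0, 12 * 16 / 28):.0f} cm^-1")
# ~2170 cm^-1 harmonic, vs ~2143 cm^-1 for the observed fundamental;
# the gap is the anharmonicity that direct anharmonic methods capture.
```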

  6. Computational methods for analysis and inference of kinase/inhibitor relationships

    PubMed Central

    Ferrè, Fabrizio; Palmeri, Antonio; Helmer-Citterich, Manuela

    2014-01-01

    The central role of kinases in virtually all signal transduction networks is the driving motivation for the development of compounds modulating their activity. ATP-mimetic inhibitors are essential tools for elucidating signaling pathways and are emerging as promising therapeutic agents. However, off-target ligand binding and complex and sometimes unexpected kinase/inhibitor relationships can occur for seemingly unrelated kinases, stressing that computational approaches are needed for learning the interaction determinants and for the inference of the effect of small compounds on a given kinase. Recently published high-throughput profiling studies assessed the effects of thousands of small compound inhibitors, covering a substantial portion of the kinome. This wealth of data paved the way for computational resources and methods that can offer a major contribution in understanding the reasons for inhibition, helping in the rational design of more specific molecules, in the in silico prediction of inhibition for those neglected kinases for which no systematic analysis has been carried out yet, in the selection of novel inhibitors with desired selectivity, and offering novel avenues for personalized therapies. PMID:25071826

  7. Navigation bronchoscopy for diagnosis and small nodule location

    PubMed Central

    Muñoz-Largacha, Juan A.; Litle, Virginia R.

    2017-01-01

    Lung cancer continues to be the most common cause of cancer death. Screening programs for high-risk patients using low-dose computed tomography (CT) have led to the identification of small lung lesions that were difficult to identify using previous imaging modalities. Electromagnetic navigational bronchoscopy (ENB) is a novel technique that has been shown to be of great utility in the evaluation of small, peripheral lesions that would otherwise be challenging to evaluate with conventional bronchoscopy. The diagnostic yield of navigational bronchoscopy, however, is highly variable, with reports ranging from 59% to 94%. This variability suggests that well-defined selection criteria and standardized protocols for the use of ENB are lacking. Despite this variability, we believe that this technique is a useful tool for evaluating small peripheral lung lesions when patients are properly selected. PMID:28446971

  8. High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography

    NASA Astrophysics Data System (ADS)

    Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre; Yildirim, Ali Önder; Hertz, Hans M.

    2016-12-01

    X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than more fundamental constraints like, e.g., dose. Here we demonstrate laboratory tomography with few-ten μm spatial resolution and few-minute exposure time at an acceptable dose for small-animal imaging, both with absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. The tomographic imaging is demonstrated on intact mouse, phantoms and excised lungs, both healthy and with pulmonary emphysema.

  9. Battery Pack Thermal Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pesaran, Ahmad

    This presentation describes the thermal design of battery packs at the National Renewable Energy Laboratory. A battery thermal management system is essential for xEVs, both for normal operation during daily driving (achieving life and performance) and for off-normal operation during abuse conditions (achieving safety). The battery thermal management system needs to be optimized with the right tools for the lowest cost. Experimental tools such as NREL's isothermal battery calorimeter, thermal imaging, and heat transfer setups are needed. Thermal models and computer-aided engineering tools are useful for robust designs. During abuse conditions, designs should prevent cell-to-cell propagation in a module/pack (i.e., keep the fire small and manageable). NREL's battery ISC device can be used for evaluating the robustness of a module/pack to cell-to-cell propagation.

  10. Implementation of Biogas Stations into Smart Heating and Cooling Network

    NASA Astrophysics Data System (ADS)

    Milčák, P.; Konvička, J.; Jasenská, M.

    2016-10-01

    The paper describes the implementation of a biogas station in the software environment for "Smart Heating and Cooling Networks". The aim of this project is the creation of a software tool for planning the operation and optimizing the supply of heat and cooling in small regions. In this case, the biogas station represents a renewable energy source which, however, has operational specifics of its own that need to be taken into account when creating an implementation project. For a specific biogas station, a detailed computational model was elaborated, parameterized in particular to optimize the total computation time.

  11. Automatic analysis of stereoscopic satellite image pairs for determination of cloud-top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Woodward, R. H.; Pierce, H.

    1991-01-01

    Results are presented on an automatic stereo analysis of cloud-top heights from nearly simultaneous satellite image pairs from the GOES and NOAA satellites, using a massively parallel processor computer. Comparisons of computer-derived height fields with manually analyzed fields indicate that the automatic technique shows promise for performing routine stereo analysis in a real-time environment, providing a useful forecasting tool by augmenting observational data sets of severe thunderstorms and hurricanes. Simulations using synthetic stereo data show that it is possible to automatically resolve small-scale features such as 4000-m-diameter clouds to about 1500 m in the vertical.

  12. Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology

    NASA Astrophysics Data System (ADS)

    Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya

    2017-09-01

    Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. Currently, there are many review articles introducing thermal error research on CNC machine tools, but these focus mainly on thermal issues in small and medium-sized CNC machine tools and seldom introduce thermal error monitoring technologies. This paper gives an overview of research on the thermal error of CNC machine tools and emphasizes the study of thermal error in heavy-duty CNC machine tools in three areas: the causes of thermal error in heavy-duty CNC machine tools, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology called "fiber Bragg grating (FBG) distributed sensing" for heavy-duty CNC machine tools is introduced in detail. This technology forms an intelligent sensing and monitoring system for heavy-duty CNC machine tools. This paper fills a gap in the review literature, guiding the development of this industrial field and opening up new areas of research on heavy-duty CNC machine tool thermal error.

  13. Software tools for interactive instruction in radiologic anatomy.

    PubMed

    Alvarez, Antonio; Gold, Garry E; Tobin, Brian; Desser, Terry S

    2006-04-01

    The aim was to promote active learning in an introductory Radiologic Anatomy course through the use of computer-based exercises. DICOM datasets from our hospital PACS system were transferred to a networked cluster of desktop computers in a medical school classroom. Medical students in the Radiologic Anatomy course were divided into four small groups and assigned to work on a clinical case for 45 minutes. The groups used iPACS, a free DICOM viewer, to view images and annotate anatomic structures. The classroom instructor monitored and displayed each group's work sequentially on the master screen by running SynchronEyes, a software tool for controlling PC desktops remotely. Students were able to execute the assigned tasks using the iPACS software with minimal oversight or instruction. Course instructors displayed each group's work on the main display screen of the classroom as the students presented the rationale for their decisions. The interactive component of the course received high ratings from the students, and overall course ratings were higher than in prior years when the course was given solely in lecture format. DICOM viewing software is an excellent tool for enabling students to learn radiologic anatomy from real-life clinical datasets. Interactive exercises performed in groups can be powerful tools for stimulating students to learn radiologic anatomy.

  14. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    NASA Astrophysics Data System (ADS)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostic tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression with inherent assumptions about the data and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this study establish the feasibility and importance of including influential point detection diagnostics as a standard tool in hydrological model calibration. They provide the hydrologist with important information on whether model calibration is susceptible to a small number of highly influential data points. This enables the hydrologist to make a more informed decision on whether to (1) remove/retain the calibration data; (2) adjust the calibration strategy and/or hydrological model to reduce the susceptibility of model predictions to a small number of influential observations.
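
    The analytical diagnostic discussed above has a closed form for ordinary least squares: D_i = e_i^2 / (p * s^2) * h_ii / (1 - h_ii)^2, where h_ii is the leverage and e_i the residual. A minimal sketch in Python of that textbook Cook's distance, applied to hypothetical rating-curve-style data (not the study's models or data):

```python
import numpy as np

def cooks_distance(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Cook's distance for each observation in an OLS fit of y on X."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T  # hat matrix
    h = np.diag(H)                        # leverages h_ii
    e = y - H @ y                         # residuals
    s2 = e @ e / (n - p)                  # residual variance estimate
    return e**2 / (p * s2) * h / (1.0 - h) ** 2

# Hypothetical stage/discharge-style data with one contaminated point
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)
y = 2.0 * x + rng.normal(0.0, 0.5, size=x.size)
y[-1] += 8.0                              # corrupt the largest observation
X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept
D = cooks_distance(X, y)
print("most influential point:", int(np.argmax(D)), "D =", round(D.max(), 2))
```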

  15. Virtual substitution scan via single-step free energy perturbation.

    PubMed

    Chiang, Ying-Chih; Wang, Yi

    2016-02-05

    With the rapid expansion of our computing power, molecular dynamics (MD) simulations ranging from hundreds of nanoseconds to microseconds or even milliseconds have become increasingly common. The majority of these long trajectories are obtained from plain (vanilla) MD simulations, where no enhanced sampling or free energy calculation method is employed. To promote the 'recycling' of these trajectories, we developed the Virtual Substitution Scan (VSS) toolkit as a plugin of the open-source visualization and analysis software VMD. Based on the single-step free energy perturbation (sFEP) method, VSS enables the user to post-process a vanilla MD trajectory for a fast free energy scan of substituting aryl hydrogens by small functional groups. Dihedrals of the functional groups are sampled explicitly in VSS, which improves the performance of the calculation and is found to be particularly important for certain groups. As a proof-of-concept demonstration, we employ VSS to compute the solvation free energy change upon substituting the hydrogen of a benzene molecule by 12 small functional groups frequently considered in lead optimization. Additionally, VSS is used to compute the relative binding free energy of four selected ligands of the T4 lysozyme. Overall, the computational cost of VSS is only a fraction of that of the corresponding multi-step FEP (mFEP) calculation, while its results agree reasonably well with those of mFEP, indicating that VSS offers a promising tool for rapid free energy scans of small functional group substitutions. © 2016 Wiley Periodicals, Inc.
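
    The estimator underlying single-step FEP is the Zwanzig relation, dG = -kT * ln < exp(-dU / kT) >, averaged over frames of the unperturbed reference trajectory. A minimal numerical sketch in Python using synthetic per-frame energy differences; this illustrates the estimator itself, not the VSS code:

```python
import numpy as np

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def sfep_delta_g(delta_u: np.ndarray, temperature: float = 300.0) -> float:
    """Single-step FEP (Zwanzig) free energy estimate,
    dG = -kT * ln < exp(-dU / kT) >, with a log-sum-exp for stability."""
    beta = 1.0 / (KB * temperature)
    x = -beta * delta_u
    xmax = x.max()  # shift before exponentiating to avoid overflow
    return -(xmax + np.log(np.mean(np.exp(x - xmax)))) / beta

# Hypothetical per-frame energy differences (kcal/mol) for one substitution,
# as if evaluated over 5000 frames of a vanilla MD trajectory
rng = np.random.default_rng(1)
dU = rng.normal(1.5, 0.8, size=5000)
print(f"dG ~ {sfep_delta_g(dU):.2f} kcal/mol")
```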

  16. The role of positron emission tomography in the diagnosis, staging and response assessment of non-small cell lung cancer

    PubMed Central

    Ali, Jason M.; Tasker, Angela; Peryt, Adam; Aresu, Giuseppe; Coonar, Aman S.

    2018-01-01

    Lung cancer is a common disease and the leading cause of cancer-related mortality, with non-small cell lung cancer (NSCLC) accounting for the majority of cases. Following diagnosis of lung cancer, accurate staging is essential to guide clinical management and inform prognosis. Positron emission tomography (PET) in conjunction with computed tomography (CT), as PET-CT, has developed into an important tool in the multi-disciplinary management of lung cancer. This article will review the current evidence for the role of 18F-fluorodeoxyglucose (FDG) PET-CT in NSCLC diagnosis, staging, response assessment and follow up. PMID:29666818

  17. Adaptive awareness for personal and small group decision making.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.

    2003-12-01

    Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
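
    As an illustration of the clustering component, a minimal self-organizing map can be trained with the classic best-matching-unit update. This sketch and its synthetic "body state" features are illustrative assumptions, not the novel clustering algorithm developed at Sandia:

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal SOM: pull the best-matching unit (and its grid neighbours)
    toward each sample while shrinking the learning rate and neighbourhood."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), -1)
    n_steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = t / n_steps
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            # best-matching unit: node whose weight vector is closest to x
            bmu = np.unravel_index(
                np.argmin(((weights - x) ** 2).sum(-1)), (h, w))
            # Gaussian neighbourhood function on the 2D grid
            d2 = ((coords - np.array(bmu)) ** 2).sum(-1)
            nb = np.exp(-d2 / (2 * sigma ** 2))[..., None]
            weights += lr * nb * (x - weights)
            t += 1
    return weights

# Hypothetical physiological samples: [heart rate, body temperature]
rng = np.random.default_rng(2)
rest = rng.normal([60, 36.8], [3, 0.2], size=(100, 2))
exertion = rng.normal([140, 37.9], [10, 0.3], size=(100, 2))
som = train_som(np.vstack([rest, exertion]))  # two body states, two clusters
```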

  18. Higher-order ice-sheet modelling accelerated by multigrid on graphics cards

    NASA Astrophysics Data System (ADS)

    Brædstrup, Christian; Egholm, David

    2013-04-01

    Higher-order ice flow modelling is a very computationally intensive process, owing primarily to the nonlinear influence of the horizontal stress coupling. When applied to simulating long-term glacial landscape evolution, ice-sheet models must consider very long time series, while both high temporal and spatial resolution are needed to resolve small effects. The use of higher-order and full-Stokes models has therefore been very limited in this field. However, recent advances in graphics card (GPU) technology for high-performance computing have proven extremely efficient in accelerating many large-scale scientific computations. The general-purpose GPU (GPGPU) technology is cheap, has a low power consumption and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists working on ice flow models. Our current research focuses on utilising the GPU as a tool in ice-sheet and glacier modelling. To this end we have implemented the Integrated Second-Order Shallow Ice Approximation (iSOSIA) equations on the device using the finite difference method. To accelerate the computations, the GPU solver uses a non-linear Red-Black Gauss-Seidel iterator coupled with a Full Approximation Scheme (FAS) multigrid setup to further aid convergence. The GPU finite difference implementation provides the inherent parallelization that scales from hundreds to several thousands of cores on newer cards. We demonstrate the efficiency of the GPU multigrid solver using benchmark experiments.
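
    The colour splitting that makes the iterator GPU-friendly is easiest to see in a small serial sketch. Below is a red-black Gauss-Seidel smoother for a 2D Poisson model problem in Python; this illustrates the iteration scheme only, under that simplifying assumption, not the iSOSIA equations or the FAS multigrid hierarchy:

```python
import numpy as np

def red_black_gauss_seidel(u, f, h, sweeps=100):
    """Red-black Gauss-Seidel for -laplacian(u) = f on a unit-square grid
    with Dirichlet boundaries. All cells of one checkerboard colour depend
    only on the other colour, so each half-sweep parallelizes on a GPU."""
    for _ in range(sweeps):
        for colour in (0, 1):                  # one colour per half-sweep
            for i in range(1, u.shape[0] - 1):
                j0 = 1 + (i + colour) % 2      # checkerboard column offset
                u[i, j0:-1:2] = 0.25 * (
                    u[i - 1, j0:-1:2] + u[i + 1, j0:-1:2]
                    + u[i, j0 - 1:-2:2] + u[i, j0 + 1::2]
                    + h * h * f[i, j0:-1:2])
    return u

n = 65                              # odd grid size keeps the slices aligned
h = 1.0 / (n - 1)
u = np.zeros((n, n))                # zero boundary values and initial guess
f = np.ones((n, n))                 # constant source term
u = red_black_gauss_seidel(u, f, h)
```

    In a multigrid setting such a smoother is applied a few times per level, with the FAS correction transferring the remaining error to coarser grids to accelerate convergence of the long-wavelength components.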

  19. 1001 Ways to run AutoDock Vina for virtual screening

    NASA Astrophysics Data System (ADS)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
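
    Points (1) and (2) can be illustrated with a small driver that runs one single-CPU Vina process per ligand and records the reported random seed so each docking can be reproduced. A hedged sketch in Python, assuming a vina binary on the PATH, a conf.txt holding the receptor and search-box settings, a ligands/ directory of .pdbqt files, and a "random seed: ..." line in Vina's output (the exact log wording is an assumption):

```python
import re
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def dock(ligand: Path) -> tuple[str, str]:
    """Dock one ligand with a single-CPU Vina process and capture the
    random seed from its output for later reproduction of the run."""
    out = ligand.with_name(ligand.stem + "_out.pdbqt")
    result = subprocess.run(
        ["vina", "--config", "conf.txt", "--ligand", str(ligand),
         "--out", str(out), "--cpu", "1"],
        capture_output=True, text=True, check=True)
    m = re.search(r"random seed:\s*(-?\d+)", result.stdout)
    return ligand.name, m.group(1) if m else "unknown"

if __name__ == "__main__":
    ligands = sorted(Path("ligands").glob("*.pdbqt"))
    # Process-level parallelism on top of (instead of) Vina's own threads:
    # one single-CPU Vina per core, the extra parallelization level of (1).
    with ProcessPoolExecutor(max_workers=8) as pool:
        for name, seed in pool.map(dock, ligands):
            print(name, "seed:", seed)
```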

  20. 1001 Ways to run AutoDock Vina for virtual screening.

    PubMed

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  1. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
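
    ADIFOR works by transforming Fortran source, but the idea of forward-mode automatic differentiation is language-independent and can be sketched with dual numbers: carrying a derivative component through arithmetic yields exact sensitivities alongside values. A minimal Python illustration of the principle (not ADIFOR itself):

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    """Dual number val + dot*eps with eps^2 = 0; 'dot' accumulates the
    exact derivative of every expression built from Duals."""
    val: float
    dot: float = 0.0

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

# d/dx [ x*sin(x) + 3x ] at x = 2, exact to machine precision
x = Dual(2.0, 1.0)   # seed the input derivative with 1
y = x * sin(x) + 3 * x
print(y.val, y.dot)  # derivative equals sin(2) + 2*cos(2) + 3
```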

  2. Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software

    NASA Astrophysics Data System (ADS)

    Hellekson, Ron; Campbell, Scott

    1988-06-01

    Many optical systems have demanding requirements to package the system in a small three-dimensional space. The use of computer graphics tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra-Physics grocery store bar code scanner employs an especially complex three-dimensional scan pattern to read bar code labels. By using a specially written program which interfaces with a computer-aided design system, we have simulated many of the functions of this complex optical system. In this paper we will illustrate how a recent version of the scanner has been designed. We will discuss the use of computer graphics in the design process, including interactive tweaking of the scan pattern, analysis of collected light, analysis of the scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.

  3. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability http://informatics.systemsbiology.net/EPEPT/ PMID:22024252
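
    The baseline that EPEPT improves upon is the standard empirical estimator p = (b + 1) / (n + 1), where b counts permutations at least as extreme as the observed statistic; it cannot resolve P-values much below 1/n, which is why small P-values demand enormous permutation counts. A minimal sketch in Python of this baseline for a two-sample difference in means (EPEPT's enhanced tail-approximation estimators are not reproduced here):

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=10_000, seed=0):
    """Empirical two-sided permutation P-value, p = (b + 1) / (n + 1).
    Never returns exactly 0, but its resolution is limited to ~1/n_perm."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = abs(x.mean() - y.mean())
    b = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = abs(perm[:len(x)].mean() - perm[len(x):].mean())
        b += stat >= observed
    return (b + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
print(permutation_pvalue(rng.normal(0.5, 1, 30), rng.normal(0.0, 1, 30)))
```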

  4. Maestro: an orchestration framework for large-scale WSN simulations.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2014-03-18

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.

  5. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  6. High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre

    X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than more fundamental constraints like, e.g., dose. Here we demonstrate laboratory tomography with few-ten μm spatial resolution and few-minute exposure time at an acceptable dose for small-animal imaging, both with absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. Lastly, the tomographic imaging is demonstrated on intact mouse, phantoms and excised lungs, both healthy and with pulmonary emphysema.

  7. High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography

    DOE PAGES

    Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre; ...

    2016-12-13

    X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than more fundamental constraints like, e.g., dose. Here we demonstrate laboratory tomography with few-ten μm spatial resolution and few-minute exposure time at an acceptable dose for small-animal imaging, both with absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. Lastly, the tomographic imaging is demonstrated on intact mouse, phantoms and excised lungs, both healthy and with pulmonary emphysema.

  8. Comparison of Computational Approaches for Rapid Aerodynamic Assessment of Small UAVs

    NASA Technical Reports Server (NTRS)

    Shafer, Theresa C.; Lynch, C. Eric; Viken, Sally A.; Favaregh, Noah; Zeune, Cale; Williams, Nathan; Dansie, Jonathan

    2014-01-01

    Computational Fluid Dynamics (CFD) methods were used to determine the basic aerodynamic, performance, and stability and control characteristics of the unmanned air vehicle (UAV), Kahu. Accurate and timely prediction of the aerodynamic characteristics of small UAVs is an essential part of military system acquisition and airworthiness evaluations. The forces and moments of the UAV were predicted using a variety of analytical methods for a range of configurations and conditions. The methods included Navier-Stokes (N-S) flow solvers (USM3D, Kestrel and Cobalt) that take days to set up and hours to converge on a single solution; potential flow methods (PMARC, LSAERO, and XFLR5) that take hours to set up and minutes to compute; empirical methods (Datcom) that involve table lookups and produce a solution quickly; and handbook calculations. A preliminary aerodynamic database can be developed very efficiently by using a combination of computational tools. The database can be generated with low-order and empirical methods in linear regions, then replacing or adjusting the data as predictions from higher-order methods are obtained. A comparison of results from all the data sources as well as experimental data obtained from a wind-tunnel test will be shown and the methods will be evaluated on their utility during each portion of the flight envelope.

  9. Unraveling the Web of Viroinformatics: Computational Tools and Databases in Virus Research

    PubMed Central

    Priyadarshini, Pragya; Vrati, Sudhanshu

    2014-01-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain—viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. PMID:25428870

  10. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    PubMed

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  11. PyOperators: Operators and solvers for high-performance computing

    NASA Astrophysics Data System (ADS)

    Chanial, P.; Barbey, N.

    2012-12-01

    PyOperators is a publicly available library that provides basic operators and solvers for small-to-very large inverse problems (http://pchanial.github.com/pyoperators). It forms the backbone of the package PySimulators, which implements specific operators to construct an instrument model and means to conveniently represent a map, a timeline or a time-dependent observation (http://pchanial.github.com/pysimulators). Both are part of the Tamasis (Tools for Advanced Map-making, Analysis and SImulations of Submillimeter surveys) toolbox, aiming at providing versatile, reliable, easy-to-use, and optimal map-making tools for Herschel and future generation of sub-mm instruments. The project is a collaboration between 4 institutes (ESO Garching, IAS Orsay, CEA Saclay, Univ. Leiden).
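
    The operator style of such libraries can be sketched generically with SciPy's matrix-free LinearOperator; the sketch below is an illustrative stand-in under that assumption, not the PyOperators API. A toy "instrument model" operator is composed into normal equations and a small regularized inverse problem is solved by conjugate gradients, without ever forming a matrix:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 200

def smooth(x):
    """Matrix-free toy instrument model: a periodic 3-point moving average."""
    return (np.roll(x, -1) + x + np.roll(x, 1)) / 3.0

H = LinearOperator((n, n), matvec=smooth, rmatvec=smooth)  # symmetric here

# Synthetic map and a noisy observation y = H m + noise
rng = np.random.default_rng(0)
m_true = np.zeros(n)
m_true[60:80] = 1.0
y = H.matvec(m_true) + rng.normal(0.0, 0.01, n)

# Regularized least-squares map estimate: solve (H^T H + eps I) m = H^T y
# with conjugate gradients, composing operators instead of building matrices.
eps = 1e-3
normal_eq = LinearOperator(
    (n, n), matvec=lambda x: H.rmatvec(H.matvec(x)) + eps * x)
m_hat, info = cg(normal_eq, H.rmatvec(y))
print("CG converged:", info == 0)
```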

  12. Quantum chemistry simulation on quantum computers: theories and experiments.

    PubMed

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth in required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the developments in both theory and experiment. We then present a brief introduction to quantum chemistry as evaluated on classical computers, followed by typical procedures for quantum simulation of quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.

  13. Accounting for aquifer heterogeneity from geological data to management tools.

    PubMed

    Blouin, Martin; Martel, Richard; Gloaguen, Erwan

    2013-01-01

    A nested workflow of multiple-point geostatistics (MPG) and sequential Gaussian simulation (SGS) was tested on a study area of 6 km² located about 20 km northwest of Quebec City, Canada. In order to assess its geological and hydrogeological parameter heterogeneity and to provide tools to evaluate uncertainties in aquifer management, direct and indirect field measurements are used as inputs to the geostatistical simulations to reproduce large- and small-scale heterogeneities. To do so, the lithological information is first associated with equivalent hydrogeological facies (hydrofacies) according to hydraulic properties measured at several wells. Then, heterogeneous hydrofacies (HF) realizations are generated using a prior geological model as a training image (TI) with the MPG algorithm. The hydraulic conductivity (K) heterogeneity within each HF is finally modelled using the SGS algorithm. Different K models are integrated in a finite-element hydrogeological model to calculate multiple transport simulations. Different scenarios exhibit variations in mass transport path and dispersion associated with the large- and small-scale heterogeneity, respectively. Three-dimensional maps showing the probability of exceeding different thresholds are presented as examples of management tools. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.

  14. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
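
    As an example of the class-(i) statistics that can be computed directly from intermediate metadata such as allele frequencies, gene diversity (expected heterozygosity) is H_e = 1 - sum over alleles of p_i^2. A minimal sketch in Python of this standard formula (an illustration of the kind of calculation PopSc performs, not PopSc's own code):

```python
import numpy as np

def expected_heterozygosity(freqs) -> float:
    """Gene diversity (expected heterozygosity) at one locus from allele
    frequencies: H_e = 1 - sum(p_i^2)."""
    p = np.asarray(freqs, dtype=float)
    assert np.isclose(p.sum(), 1.0), "allele frequencies must sum to 1"
    return 1.0 - np.sum(p ** 2)

# A locus with three alleles at frequencies 0.5, 0.3 and 0.2
print(f"He = {expected_heterozygosity([0.5, 0.3, 0.2]):.2f}")  # He = 0.62
```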

  15. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.

  16. Recent advances in the UltraScan SOlution MOdeller (US-SOMO) hydrodynamic and small-angle scattering data analysis and simulation suite.

    PubMed

    Brookes, Emre; Rocco, Mattia

    2018-03-28

    The UltraScan SOlution MOdeller (US-SOMO) is a comprehensive, public domain, open-source suite of computer programs centred on hydrodynamic modelling and small-angle scattering (SAS) data analysis and simulation. We describe here the advances that have been implemented since its last official release (#3087, 2017), which are available from release #3141 for Windows, Linux and Mac operating systems. A major effort has been the transition from the legacy Qt3 cross platform software development and user interface library to the modern Qt5 release. Apart from improved graphical support, this has allowed the direct implementation of the newest, almost two orders of magnitude faster version of the ZENO hydrodynamic computation algorithm for all operating systems. Coupled with the SoMo-generated bead models with overlaps, ZENO provides the most accurate translational friction computations from atomic-level structures available (Rocco and Byron Eur Biophys J 44:417-431, 2015a), with computational times comparable with or faster than those of other methods. In addition, it has allowed us to introduce the direct representation of each atom in a structure as a (hydrated) bead, opening interesting new modelling possibilities. In the small-angle scattering (SAS) part of the suite, an indirect Fourier transform Bayesian algorithm has been implemented for the computation of the pairwise distance distribution function from SAS data. Finally, the SAS HPLC module, recently upgraded with improved baseline correction and Gaussian decomposition of non-baseline-resolved peaks and with advanced statistical evaluation tools (Brookes et al. J Appl Cryst 49:1827-1841, 2016), now allows automatic top-peak frame selection and averaging.

  17. The Biomolecular Interaction Network Database and related tools 2005 update

    PubMed Central

    Alfarano, C.; Andrade, C. E.; Anthony, K.; Bahroos, N.; Bajec, M.; Bantoft, K.; Betel, D.; Bobechko, B.; Boutilier, K.; Burgess, E.; Buzadzija, K.; Cavero, R.; D'Abreo, C.; Donaldson, I.; Dorairajoo, D.; Dumontier, M. J.; Dumontier, M. R.; Earles, V.; Farrall, R.; Feldman, H.; Garderman, E.; Gong, Y.; Gonzaga, R.; Grytsan, V.; Gryz, E.; Gu, V.; Haldorsen, E.; Halupa, A.; Haw, R.; Hrvojic, A.; Hurrell, L.; Isserlin, R.; Jack, F.; Juma, F.; Khan, A.; Kon, T.; Konopinsky, S.; Le, V.; Lee, E.; Ling, S.; Magidin, M.; Moniakis, J.; Montojo, J.; Moore, S.; Muskat, B.; Ng, I.; Paraiso, J. P.; Parker, B.; Pintilie, G.; Pirone, R.; Salama, J. J.; Sgro, S.; Shan, T.; Shu, Y.; Siew, J.; Skinner, D.; Snyder, K.; Stasiuk, R.; Strumpf, D.; Tuekam, B.; Tao, S.; Wang, Z.; White, M.; Willis, R.; Wolting, C.; Wong, S.; Wrong, A.; Xin, C.; Yao, R.; Yates, B.; Zhang, S.; Zheng, K.; Pawson, T.; Ouellette, B. F. F.; Hogue, C. W. V.

    2005-01-01

    The Biomolecular Interaction Network Database (BIND) (http://bind.ca) archives biomolecular interaction, reaction, complex and pathway information. Our aim is to curate the details about molecular interactions that arise from published experimental research and to provide this information, as well as tools to enable data analysis, freely to researchers worldwide. BIND data are curated into a comprehensive machine-readable archive of computable information that provides users with methods to discover interactions and molecular mechanisms. BIND has worked to develop new methods for visualization that amplify the underlying annotation of genes and proteins to facilitate the study of molecular interaction networks. BIND has maintained an open database policy since its inception in 1999. Data growth has proceeded at a tremendous rate, now approaching 100 000 records. New services provided include a new BIND Query and Submission interface, a Simple Object Access Protocol (SOAP) service and the Small Molecule Interaction Database (http://smid.blueprint.org), which allows users to determine probable small-molecule binding sites of new sequences and examine conserved binding residues. PMID:15608229

  18. Aerodynamic Modeling of Transonic Aircraft Using Vortex Lattice Coupled with Transonic Small Disturbance for Conceptual Design

    NASA Technical Reports Server (NTRS)

    Chaparro, Daniel; Fujiwara, Gustavo E. C.; Ting, Eric; Nguyen, Nhan

    2016-01-01

    The need to rapidly scan large design spaces during conceptual design calls for computationally inexpensive tools such as the vortex lattice method (VLM). Although some VLM tools, such as Vorview, have been extended to model fully supersonic flow, VLM solutions are typically limited to inviscid, subcritical flow regimes. Many transport aircraft operate at transonic speeds, which limits the applicability of VLM for such applications. This paper presents a novel approach to correct three-dimensional VLM through coupling of two-dimensional transonic small disturbance (TSD) solutions along the span of an aircraft wing in order to accurately predict transonic aerodynamic loading and wave drag for transport aircraft. The approach is extended to predict flow separation and capture the attenuation of aerodynamic forces due to boundary layer viscosity by coupling the TSD solver with an integral boundary layer (IBL) model. The modeling framework is applied to the NASA General Transport Model (GTM) integrated with a novel control surface known as the Variable Camber Continuous Trailing Edge Flap (VCCTEF).

  19. Electronic Handbooks Simplify Process Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  20. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    Increasing computational power and a better understanding of the underlying mathematical and physical systems have resulted in a growing number of ocean models. Long ago, modelers were like a secret organization, recognizing each other through codes and languages that only a select group of people could understand. Access to computational systems was limited: on one hand, equipment and computing time were expensive and restricted; on the other, the systems required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 1980s. This availability of resources has led to much wider access to all kinds of models. Today computer speed, computing time, and the algorithms themselves no longer seem to be a problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution different offices may use different models for the same phenomena, developed by different researchers. The results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while preserving the model's essential properties, which makes it a very useful tool for diminishing the computational burden, so that sophisticated models can run on "small" computational systems and become available to a greater community.
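
    The technique is straightforward to sketch. Collecting model states as columns of a snapshot matrix, the POD modes are the left singular vectors of that matrix; the NumPy sketch below (synthetic data standing in for ocean-model fields) keeps only the modes needed to capture 99% of the variance.

```python
import numpy as np

# Snapshot matrix X: each column is one state of the ocean field (flattened),
# each row one grid point. Synthetic data stands in for model output.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 50))          # 1000 grid points, 50 snapshots
X -= X.mean(axis=1, keepdims=True)           # remove the temporal mean

# Thin SVD: columns of U are the POD modes, ordered by captured variance.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
energy = (s ** 2) / (s ** 2).sum()

# Keep the r leading modes that capture 99% of the variance.
r = int(np.searchsorted(np.cumsum(energy), 0.99)) + 1
X_reduced = U[:, :r].T @ X                   # r coefficients per snapshot
```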

  1. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
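
    As a flavour of that computation model, the snippet below uses the publicly documented Earth Engine Python client to build a simple median composite from a year of Landsat 8 scenes; the dataset ID, coordinates, and dates are illustrative, and running it requires an authenticated Earth Engine account.

```python
import ee  # Earth Engine Python client; needs an authenticated account

ee.Initialize()

# Dataset ID, region and dates are illustrative; consult the current catalog.
scenes = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
          .filterDate('2020-01-01', '2020-12-31')
          .filterBounds(ee.Geometry.Point(-122.26, 37.87)))

# A per-pixel median over a year of scenes gives a simple cloud-suppressed
# composite. The reduction runs lazily in Google's datacenters; only the
# small requested answer travels back to the client.
composite = scenes.median()
print(composite.bandNames().getInfo())
```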

  2. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country, yet they often appear without the benefit of research into their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool designed for this study (VideoTool) to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, or group behavior, and did not interact with gender.

  3. Acoustic Signature from Flames as a Combustion Diagnostic Tool

    DTIC Science & Technology

    1983-11-01

    empirical visual flame length had to be input to the computer for the inversion method to give good results. That is, if the experiment and inversion...method were asked to yield the flame length, poor results were obtained. Since this was part of the information sought for practical application of the...to small experimental uncertainty. The method gave reasonably good results for the open flame but substantial input (the flame length) had to be

  4. The Preliminary Design of a Standardized Spacecraft Bus for Small Tactical Satellites (Volume 2)

    DTIC Science & Technology

    1996-11-01

    this requirement, conditions of the model need to be modified to provide some flexibility to the original solution set. In the business world this...time The mission modules modeled in the Modsat computer model are necessarily "generic" in nature to provide both flexibility in design evaluation and...methods employed during the study, the scope of the problem, the value system used to evaluate alternatives, tradeoff studies performed, modeling tools

  5. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close-packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and based on limited (small) data. The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that the ranking of influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
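
    For contrast with the authors' non-parametric treatment, the sketch below computes plain local (logarithmic) sensitivity indices by finite differences on a textbook competitive-Langmuir coverage expression; it ignores parameter correlations, which is precisely the limitation the paper addresses. Parameter values are invented.

```python
import numpy as np

def log_sensitivity(f, theta, h=1e-6):
    """Finite-difference logarithmic sensitivity d ln f / d ln theta_i."""
    theta = np.asarray(theta, dtype=float)
    base = f(theta)
    out = np.empty_like(theta)
    for i in range(theta.size):
        t = theta.copy()
        t[i] += h * max(abs(t[i]), 1.0)
        out[i] = (f(t) - base) / (t[i] - theta[i]) * theta[i] / base
    return out

# Competitive Langmuir coverage of species A at fixed partial pressures:
# theta_A = K_A p_A / (1 + K_A p_A + K_B p_B), with p_A = p_B = 0.1
cover_A = lambda K: K[0] * 0.1 / (1.0 + K[0] * 0.1 + K[1] * 0.1)
print(log_sensitivity(cover_A, [2.0, 5.0]))   # K_A helps, K_B competes
```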

  6. Removal of millimeter-scale rolled edges using bevel-cut-like tool influence function in magnetorheological jet polishing.

    PubMed

    Yang, Hao; Cheng, Haobo; Feng, Yunpeng; Jing, Xiaoli

    2018-05-01

    Subaperture polishing techniques usually produce rolled edges due to the edge effect. Rolled edges, especially those at millimeter scale on small components, are difficult to eliminate using conventional polishing methods. Magnetorheological jet polishing (MJP) offers the possibility of removing these structures, owing to its small tool influence function (TIF). Hence, we investigate the removal characteristics of inclined MJP jetting models by means of computational fluid dynamics (CFD) simulations and polishing experiments. A discrete phase model (DPM) is introduced in the simulation to capture the influence of abrasive particle concentration on the removal mechanism, yielding a more accurate model of MJP material removal. With several critical problems solved, a small bevel-cut-like TIF (B-TIF), which has fine acentric and unimodal characteristics, is obtained through inclined jetting. The B-TIF proves to have little edge effect and is applied to the polishing of thin rolled edges. Finally, the RMS of the experimental section profile converges from 10.5 nm to 1.4 nm, and the rolled edges are successfully suppressed. Consequently, it is validated that the B-TIF has remarkable ability in the removal of millimeter-scale rolled edges.

  7. Identification of MicroRNAs and transcript targets in Camelina sativa by deep sequencing and computational methods

    DOE PAGES

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu; ...

    2015-03-31

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials.

  8. 3D reconstruction software comparison for short sequences

    NASA Astrophysics Data System (ADS)

    Strupczewski, Adam; Czupryński, Błażej

    2014-11-01

    Large-scale multiview reconstruction has recently become a very popular area of research. There are many open source tools that can be downloaded and run on a personal computer. However, there are few, if any, comparisons of all the available software in terms of accuracy on the small datasets that a single user can create. The typical datasets for testing the software are archeological sites or cities, comprising thousands of images. This paper presents a comparison of currently available open source multiview reconstruction software for small datasets. It also compares the open source solutions with a simple structure-from-motion pipeline developed by the authors from scratch with the use of the OpenCV and Eigen libraries.
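
    A minimal two-view fragment of such a structure-from-motion pipeline, in the spirit of the authors' OpenCV-based implementation (the intrinsics and file names are invented, and this is not their code): detect and match features, estimate the essential matrix with RANSAC, and recover the relative camera pose.

```python
import cv2
import numpy as np

img1 = cv2.imread('view1.jpg', cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread('view2.jpg', cv2.IMREAD_GRAYSCALE)

# Detect and describe local features, then keep ratio-test matches
sift = cv2.SIFT_create()
k1, d1 = sift.detectAndCompute(img1, None)
k2, d2 = sift.detectAndCompute(img2, None)
pairs = cv2.BFMatcher().knnMatch(d1, d2, k=2)
good = [m for m, n in pairs if m.distance < 0.75 * n.distance]

p1 = np.float32([k1[m.queryIdx].pt for m in good])
p2 = np.float32([k2[m.trainIdx].pt for m in good])

# Assumed pinhole intrinsics; a real pipeline would calibrate the camera
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC)
_, R, t, inliers = cv2.recoverPose(E, p1, p2, K, mask=inliers)
# R, t give the relative pose; triangulation and bundle adjustment follow
```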

  9. Diagnosis of peritoneal mesothelioma: computed tomography, sonography, and fine-needle aspiration biopsy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reuter, K.; Raptopoulos, V.; Reale, F.

    1983-06-01

    The diagnosis of peritoneal mesothelioma was made prospectively and noninvasively in four patients with the use of sonography, computed tomography, and sonographically guided fine-needle aspiration biopsy. The imaging methods revealed information similar to the operative findings, with clear superiority of computed tomography over sonography. These noninvasive methods may be used as screening tools, especially among groups or in regional areas with a high risk for asbestos exposure. The findings included soft-tissue masses with invariable involvement of the omentum; small intraperitoneal nodules; thickened peritoneum, mesentery, and bowel wall; pleural plaques; and usually minimal, if any, ascites. Since the differential diagnosis from peritoneal carcinomatosis may be difficult, sonographically (or CT) guided aspiration biopsy is needed to produce diagnostic cytologic specimens. The use of this type of biopsy should obviate surgical exploration.

  10. A novel in silico approach to drug discovery via computational intelligence.

    PubMed

    Hecht, David; Fogel, Gary B

    2009-04-01

    A computational intelligence drug discovery platform is introduced as an innovative technology designed to accelerate high-throughput drug screening for generalized protein-targeted drug discovery. This technology results in collections of novel small molecule compounds that bind to protein targets as well as details on predicted binding modes and molecular interactions. The approach was tested on dihydrofolate reductase (DHFR) for novel antimalarial drug discovery; however, the methods developed can be applied broadly in early stage drug discovery and development. For this purpose, an initial fragment library was defined, and an automated fragment assembly algorithm was generated. These were combined with a computational intelligence screening tool for prescreening of compounds relative to DHFR inhibition. The entire method was assayed relative to spaces of known DHFR inhibitors and with chemical feasibility in mind, leading to experimental validation in future studies.

  11. [Obscure gastrointestinal bleeding due to gastrointestinal stromal tumors].

    PubMed

    Romero-Espinosa, Larry; Souza-Gallardo, Luis Manuel; Martínez-Ordaz, José Luis; Romero-Hernández, Teodoro; de la Fuente-Lira, Mauricio; Arellano-Sotelo, Jorge

    Gastrointestinal stromal tumours (GIST) are the most common soft tissue sarcomas of the digestive tract. They are usually found in the stomach (60-70%) and small intestine (25-30%) and, less commonly, in the oesophagus, mesentery, colon, or rectum. The symptoms present at diagnosis are gastrointestinal bleeding, abdominal pain, abdominal mass, or intestinal obstruction; the symptomatology depends on the location and size of the tumour. The definitive diagnosis is histopathological, with 95% of the tumours being positive for CD117. This is an observational and descriptive study of 5 cases of small intestinal GIST that presented with gastrointestinal bleeding as the main symptom. The period from the initial symptom to the diagnosis varied from 1 to 84 months. Endoscopy was inconclusive in all of the patients, and the diagnosis was made using computed tomography and angiography. Treatment included resection in all patients. The histopathological results are also described. GIST can have multiple clinical presentations and unusual symptoms, such as obscure gastrointestinal bleeding. Computed tomography and angiography have proved to be important tools in the diagnosis of patients with small intestinal GIST. Copyright © 2016. Published by Masson Doyma México S.A.

  12. Design of a radiation facility for very small specimens used in radiobiology studies

    NASA Astrophysics Data System (ADS)

    Rodriguez, Manuel; Jeraj, Robert

    2008-06-01

    A design of a radiation facility for very small specimens used in radiobiology is presented. This micro-irradiator has been designed primarily to irradiate partial bodies of zebrafish embryos 3-4 mm in length. A miniature x-ray source producing a 50 kV photon beam is used as the radiation source. The source is inserted in a cylindrical brass collimator with a 1.0 mm diameter pinhole along the central axis to produce a pencil photon beam. The collimator with the source is attached underneath a computer-controlled movable table which holds the specimens. Using a 45° tilted mirror, a digital camera connected to the computer takes pictures of the specimen and the pinhole collimator. From the image provided by the camera, the relative distance from the specimen to the pinhole axis is calculated and coordinates are sent to the movable table to position the samples in the beam path. Owing to its monitoring system, the characteristics of the radiation beam, the accuracy and precision of specimen positioning, and automatic image-based specimen recognition, this radiation facility is a suitable tool for irradiating partial bodies of zebrafish embryos, cell cultures, or any other small specimen used in radiobiology research.

  13. Automated computer-based detection of encounter behaviours in groups of honeybees.

    PubMed

    Blut, Christina; Crespi, Alessandro; Mersch, Danielle; Keller, Laurent; Zhao, Linlin; Kollmann, Markus; Schellscheidt, Benjamin; Fülber, Carsten; Beye, Martin

    2017-12-15

    Honeybees form societies in which thousands of members integrate their behaviours to act as a single functional unit. We have little knowledge of how the collaborative features are regulated by workers' activities because we lack methods that enable the collection of simultaneous and continuous behavioural information for each worker bee. In this study, we introduce the Bee Behavioral Annotation System (BBAS), which enables the automated detection of bees' behaviours in small observation hives. Continuous information on position and orientation was obtained by marking worker bees with 2D barcodes in a small observation hive. We computed behavioural and social features from the tracking information to train a behaviour classifier for encounter behaviours (interaction of workers via antennation) using a machine learning-based system. The classifier correctly detected 93% of the encounter behaviours in a group of bees, whereas 13% of the falsely classified behaviours were unrelated to encounter behaviours. The possibility of building accurate classifiers for automatically annotating behaviours may allow for the examination of individual behaviours of worker bees in the social environments of small observation hives. We envisage that BBAS will be a powerful tool for detecting the effects of experimental manipulation of social attributes and sub-lethal effects of pesticides on behaviour.

  14. Epiviz: a view inside the design of an integrated visual analysis software for genomics

    PubMed Central

    2015-01-01

    Background Computational and visual data analysis for genomics has traditionally involved a combination of tools and resources, of which the most ubiquitous consist of genome browsers, focused mainly on integrative visualization of large numbers of big datasets, and computational environments, focused on data modeling of a small number of moderately sized datasets. Workflows that involve the integration and exploration of multiple heterogeneous data sources, small and large, public and user-specific, have been poorly addressed by these tools. In our previous work, we introduced Epiviz, which bridges the gap between the two types of tools, simplifying these workflows. Results In this paper we expand on the design decisions behind Epiviz, and introduce a series of new advanced features that further support the type of interactive exploratory workflow we have targeted. We discuss three ways in which Epiviz advances the field of genomic data analysis: 1) it brings code to interactive visualizations at various levels; 2) it takes the first steps in the direction of collaborative data analysis by incorporating user plugins from source control providers, as well as by allowing analysis states to be shared among the scientific community; 3) it combines established analysis features that have never before been available simultaneously in a genome browser. In our discussion section, we present security implications of the current design, as well as a series of limitations and future research steps. Conclusions Since many of the design choices of Epiviz are novel in genomics data analysis, this paper serves both as a document of our own approaches with lessons learned, and as a starting point for future efforts in the same direction for the genomics community. PMID:26328750

  15. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.

  16. Analyzing huge pathology images with open source software.

    PubMed

    Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc

    2013-06-06

    Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. The tools are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre doing image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
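
    The general tactic for such images, which the NDPITools mosaics also exploit, is to touch only one small region at a time. A hedged Python sketch using the OpenSlide library, which reads virtual-slide formats including NDPI (file name illustrative, not the authors' code):

```python
import openslide  # reads virtual-slide formats, including Hamamatsu NDPI

slide = openslide.OpenSlide('slide.ndpi')       # file name illustrative
width, height = slide.dimensions
tile = 2048

# Visit the slide tile by tile; memory use stays bounded by one tile
for y in range(0, height, tile):
    for x in range(0, width, tile):
        w, h = min(tile, width - x), min(tile, height - y)
        region = slide.read_region((x, y), 0, (w, h))   # RGBA PIL image
        # ... analyze the tile here (thresholding, cell counting, ...)
slide.close()
```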

  17. The Mental Disability Military Assessment Tool: A Reliable Tool for Determining Disability in Veterans with Post-traumatic Stress Disorder.

    PubMed

    Fokkens, Andrea S; Groothoff, Johan W; van der Klink, Jac J L; Popping, Roel; Stewart, Roy E; van de Ven, Lex; Brouwer, Sandra; Tuinstra, Jolanda

    2015-09-01

    An assessment tool was developed to assess disability in veterans who suffer from post-traumatic stress disorder (PTSD) due to a military mission. The objective of this study was to determine the reliability and the intra-rater and inter-rater variation of the Mental Disability Military (MDM) assessment tool. Twenty-four assessment interviews between veterans and an insurance physician were videotaped. Each videotaped interview was assessed for the veterans' limitations by a group of five independent raters using the MDM assessment tool. After 2 months the raters repeated this procedure. Next, the intra-rater and inter-rater variation was assessed with an adjusted version of AG09, computing weighted percentage agreement. The results of this study showed that both the intra-rater and inter-rater variation on the ten subcategories of the MDM assessment tool were small, with an agreement of 84-100% within raters and 93-100% between raters. The MDM assessment tool proves to be a reliable instrument to measure PTSD-related limitations in functioning in Dutch military veterans who apply for disability compensation. Further research is needed to assess the validity of this instrument.

  18. Implementing Nonlinear Buoyancy and Excitation Forces in the WEC-Sim Wave Energy Converter Modeling Tool: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawson, M.; Yu, Y. H.; Nelessen, A.

    2014-05-01

    Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins Equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the use of high performance computing resources necessitated by high-fidelity methods, such as Navier-Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the devices is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to as instantaneous buoyancy and Froude-Krylov forces from herein) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into WEC-Sim. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
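
    To make the distinction concrete, the sketch below compares a linearized buoyancy restoring force with one computed from the instantaneous submerged volume for an idealized wall-sided vertical cylinder; all numbers are illustrative and this is not WEC-Sim code. For a wall-sided float the two agree for small heave but diverge once the body nears exiting the water.

```python
import numpy as np

rho, g = 1025.0, 9.81            # sea water density (kg/m^3), gravity (m/s^2)
R, draft = 1.0, 2.0              # cylinder radius and equilibrium draft (m)
Aw = np.pi * R ** 2              # waterplane area

def buoyancy_linear(z):
    """Linearized restoring force about equilibrium heave z = 0."""
    return -rho * g * Aw * z

def buoyancy_instantaneous(z):
    """Net buoyancy from the actually submerged volume at heave z."""
    submerged = np.clip(draft - z, 0.0, None)    # body can leave the water
    return rho * g * Aw * (submerged - draft)

for z in (0.1, 1.0, 2.5):        # identical for small z, saturates for large z
    print(z, buoyancy_linear(z), buoyancy_instantaneous(z))
```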

  19. Construction of a cost effective optical tweezers for manipulation of birefringent materials using circularly polarized light

    NASA Astrophysics Data System (ADS)

    McMahon, Allison; Sauncy, Toni

    2008-10-01

    Light manipulation is a very powerful tool in physics, biology, and chemistry. There are several physical principles underlying the apparatus known as the ``optical tweezers,'' the term given to using focused light to manipulate and control small objects. By carefully controlling the orientation and position of a focused laser beam, dielectric particles can be effectively trapped and manipulated. We have designed a cost-efficient and effective undergraduate optical tweezers apparatus using standard ``off the shelf'' components and starting with a standard undergraduate laboratory microscope. Images are recorded using a small CCD camera interfaced to a computer and controlled by LabVIEW^TM software. By using wave plates to produce circularly polarized light, rotational motion can be induced in small particles of birefringent materials such as calcite and mica.

  20. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform compatibility; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment, so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. Our CMDA tool has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
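
    As an indication of how small such a wrapper can be, the sketch below exposes a placeholder analysis routine as a JSON web service with Flask; the endpoint and parameter names are invented, not CMDA's actual interface.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def compute_anomaly(dataset, variable):
    """Placeholder standing in for an existing science analysis routine."""
    return {'dataset': dataset, 'variable': variable, 'anomaly': 0.42}

@app.route('/svc/anomaly')
def anomaly():
    dataset = request.args.get('dataset', 'model_A')
    variable = request.args.get('variable', 'tas')
    return jsonify(compute_anomaly(dataset, variable))

if __name__ == '__main__':
    app.run(port=8080)   # development server; production sits behind Gunicorn
```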

  1. A survey of current trends in computational drug repositioning.

    PubMed

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  2. A multiphysics and multiscale software environment for modeling astrophysical systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    2009-05-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  3. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a ``Noah's Ark'' milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  4. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a review of the relevant literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability to use computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science class.

  5. Pattern formation in binary colloidal assemblies: hidden symmetries in a kaleidoscope of structures.

    PubMed

    Lotito, Valeria; Zambelli, Tomaso

    2018-06-10

    In this study we present a detailed investigation of the morphology of binary colloidal structures formed by self-assembly at the air/water interface of particles of two different sizes, with a size ratio such that the larger particles do not retain a hexagonal arrangement in the binary assembly. While the structure and symmetry of binary mixtures in which such hexagonal order is preserved have been thoroughly scrutinized, binary colloids in the regime of non-preservation of the hexagonal order have not been examined with the same level of detail, owing in part to the difficulty of finding analysis tools suitable for recognizing hidden symmetries in seemingly amorphous and disordered arrangements. For this purpose, we resorted to a combination of different analysis tools based on computational geometry and computational topology in order to get a comprehensive picture of the morphology of the assemblies. By carrying out an extensive investigation of binary assemblies in this regime with variable concentration of smaller particles with respect to larger particles, we identify the main patterns that coexist in the apparently disordered assemblies and detect transitions in the symmetries upon increase in the number of small particles. As the concentration of small particles increases, large particle arrangements become more dilute and a transition from hexagonal to rhombic and square symmetries occurs, accompanied also by an increase in clusters of small particles; the relative weight of each specific symmetry can be controlled by varying the composition of the assemblies. The demonstration of the possibility of controlling the morphology of apparently disordered binary colloidal assemblies by varying experimental conditions, and the definition of a route for the investigation of disordered assemblies, are valuable for future studies of complex colloidal patterns, both to understand self-assembly mechanisms and to tailor the physical properties of colloidal assemblies.

  6. Investigation of Small-Caliber Primer Function Using a Multiphase Computational Model

    DTIC Science & Technology

    2008-07-01

    all solid walls along with specified inflow at the primer orifice (-0.102 cm < Y < 0.102 cm at X = 0). Initially, the entire flowfield is filled...to explicitly treat both the gas and solid phase. The model is based on the One Dimensional Turbulence modeling approach that has recently emerged as...a powerful tool in multiphase simulations. Initial results are shown for the model run as a stand-alone code and are compared to recent experiments

  7. NASA and CFD - Making investments for the future

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.; Richardson, P. F.

    1992-01-01

    From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.

  8. Small Business Innovation Research (SBIR) Program, FY 1994. Program Solicitation 94.1, Closing Date: 14 January 1994

    DTIC Science & Technology

    1994-01-01

    is to design and develop a diode laser and associated driver circuitry with high peak power, high pulse repetition frequency (PRF), and good beam...Computer modeling tools shall be used to design and optimize a breadboard model of a multi-terminal high speed ring bus for flight critical applications... design, fabricate, and test a fiber optic interface device which will improve coupling of high energy, pulsed lasers into commercial fiber optics at a

  9. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.

  10. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882
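
    As a toy illustration of the hash-table idea (not the actual PLAT scheme, which is richer), the sketch below indexes the signal-value transitions observed in a nominal log and then flags, in near-constant time per record, any later transition that was never seen during fault-free operation.

```python
from collections import defaultdict

def build_index(nominal_log):
    """Hash-index every (previous value -> value) transition per signal."""
    index, prev = defaultdict(set), {}
    for t, signal, value in nominal_log:
        if signal in prev:
            index[signal].add((prev[signal], value))
        prev[signal] = value
    return index

def find_anomalies(index, log):
    """Flag transitions never seen in the nominal run; O(1) per record."""
    prev, hits = {}, []
    for t, signal, value in log:
        if signal in prev and (prev[signal], value) not in index[signal]:
            hits.append((t, signal, prev[signal], value))
        prev[signal] = value
    return hits

nominal = [(0, 'valve', 0), (1, 'valve', 1), (2, 'valve', 0)]
faulty = [(0, 'valve', 0), (1, 'valve', 2)]       # 0 -> 2 never seen
print(find_anomalies(build_index(nominal), faulty))
```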

  11. POVME 2.0: An Enhanced Tool for Determining Pocket Shape and Volume Characteristics

    PubMed Central

    2015-01-01

    Analysis of macromolecular/small-molecule binding pockets can provide important insights into molecular recognition and receptor dynamics. Since its release in 2011, the POVME (POcket Volume MEasurer) algorithm has been widely adopted as a simple-to-use tool for measuring and characterizing pocket volumes and shapes. We here present POVME 2.0, which is an order of magnitude faster, has improved accuracy, includes a graphical user interface, and can produce volumetric density maps for improved pocket analysis. To demonstrate the utility of the algorithm, we use it to analyze the binding pocket of RNA editing ligase 1 from the unicellular parasite Trypanosoma brucei, the etiological agent of African sleeping sickness. The POVME analysis characterizes the full dynamics of a potentially druggable transient binding pocket and so may guide future antitrypanosomal drug-discovery efforts. We are hopeful that this new version will be a useful tool for the computational- and medicinal-chemist community. PMID:25400521
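
    The core grid-counting idea behind pocket-volume measurement is simple to sketch (POVME itself adds convex-hull, contiguity, and trajectory handling on top): count grid points that lie inside a user-defined inclusion sphere but outside every receptor atom. A minimal NumPy version, with invented coordinates:

```python
import numpy as np

def pocket_volume(atoms, radii, center, r_incl, spacing=0.5):
    """Volume of grid points inside the inclusion sphere, outside all atoms."""
    lin = [np.arange(c - r_incl, c + r_incl + spacing, spacing) for c in center]
    gx, gy, gz = np.meshgrid(*lin, indexing='ij')
    pts = np.stack([gx.ravel(), gy.ravel(), gz.ravel()], axis=1)
    keep = np.linalg.norm(pts - center, axis=1) <= r_incl
    for a, r in zip(atoms, radii):
        keep &= np.linalg.norm(pts - a, axis=1) > r
    return keep.sum() * spacing ** 3

# One 1.5 A atom intruding into a 5 A inclusion sphere (coordinates invented)
atoms = np.array([[3.0, 0.0, 0.0]])
print(pocket_volume(atoms, [1.5], np.array([0.0, 0.0, 0.0]), 5.0))
```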

  12. The PDA as a reference tool: libraries' role in enhancing nursing education.

    PubMed

    Scollin, Patrick; Callahan, John; Mehta, Apurva; Garcia, Elizabeth

    2006-01-01

    "The PDA as a Reference Tool: The Libraries' Role in Enhancing Nursing Education" is a pilot project funded by the University of Massachusetts President's Office Information Technology Council through their Professional Development Grant program in 2004. The project's goal is to offer faculty and students in nursing programs at two University of Massachusetts campuses access to an array of medical reference information, such as handbooks, dictionaries, calculators, and diagnostic tools, on small handheld computers called personal digital assistants. Through exposure to the variety of information resources in this digital format, participants can discover and explore these resources at no personal financial cost. Participants borrow handhelds from the University Library's circulation desks. The libraries provide support in routine resynchronizing of handhelds to update information. This report will discuss how the projects were administered, what we learned about what did and did not work, the problems and solutions, and where we hope to go from here.

  13. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    PubMed Central

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, analysis, and evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats) was shown to have the potential to regulate gene expression at both the transcriptional and post-transcriptional levels in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches. PMID:25566532

  14. Status of the Combustion Devices Injector Technology Program at the NASA MSFC

    NASA Technical Reports Server (NTRS)

    Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James

    2005-01-01

    To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for the fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus include injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi-element effects and improve capability and validity to analyze heat transfer and ignition in large, multi-element injectors.

  15. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data have been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single-run data sets increasing into the tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs on modern HPC systems such as ORNL's Jaguar.
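
    ParCAT itself is C code built on parallel-netcdf, but the parallel pattern it applies is easy to sketch in Python with mpi4py and netCDF4 (file and variable names below are placeholders): each rank reads only its own slab of the time axis, computes a partial sum, and a reduction yields the temporal mean.

```python
import numpy as np
from mpi4py import MPI
from netCDF4 import Dataset

comm = MPI.COMM_WORLD
nc = Dataset('climate_run.nc', 'r')      # file and variable names invented
var = nc.variables['TSA']                # dimensions: (time, lat, lon)
nt = var.shape[0]

# Split the time axis across ranks; each rank reads only its own slab
lo = comm.rank * nt // comm.size
hi = (comm.rank + 1) * nt // comm.size
partial = np.asarray(var[lo:hi]).sum(axis=0)

total = np.zeros_like(partial)
comm.Reduce(partial, total, op=MPI.SUM, root=0)
if comm.rank == 0:
    print('temporal mean, field average:', (total / nt).mean())
nc.close()
```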

  16. Wear Detection of Drill Bit by Image-based Technique

    NASA Astrophysics Data System (ADS)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in tool condition monitoring in the manufacturing industries. This study proposes a dependable direct-measurement method for tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter the image and convert it from colour to binary data. An edge detection method was then applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. The cross-correlation graphs were able to detect the worn edge despite only small differences between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
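
    As a rough illustration of the processing chain described above (Otsu thresholding, edge detection, then cross-correlation of edge profiles), the following Python/OpenCV sketch compares a reference image and a worn-bit image; the file names and the column-profile reduction are assumptions, not the authors' exact pipeline.

        import cv2
        import numpy as np

        def edge_profile(path):
            """Threshold, detect edges, and reduce to a 1-D column profile."""
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            _, bw = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            edges = cv2.Canny(bw, 100, 200)
            return edges.sum(axis=0).astype(float)   # edge density per column

        a = edge_profile("new_bit.png")    # hypothetical reference image
        b = edge_profile("worn_bit.png")   # hypothetical worn-bit image

        # Normalized cross-correlation: a peak near 1 means the edges match closely.
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        corr = np.correlate(a, b, mode="full")
        print("peak normalized cross-correlation:", corr.max())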

  17. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic-scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic-scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than had previously been possible, and have shown how the anomaly is related to the details of the amorphous structure. To approach this kind of problem we have constructed a mini-grid, a small-scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We believe that the lessons learned and tools developed will be useful in many areas of science beyond computational mineralogy. Key tools that will be described include: a pure Fortran XML library (FoX) that presents XPath, SAX and DOM interfaces as well as permitting the easy production of valid XML from legacy Fortran programs; a job submission framework that automatically schedules calculations to remote grid resources and handles data staging and metadata capture; and a tool (AgentX) that maps concepts from an ontology onto locations in documents of various formats, which we use to enable data exchange.

  18. Computational Methods in Drug Discovery

    PubMed Central

    Sliwoski, Gregory; Kothiwale, Sandeepkumar; Meiler, Jens

    2014-01-01

    Computer-aided drug discovery/design methods have played a major role in the development of therapeutically important small molecules for over three decades. These methods are broadly classified as either structure-based or ligand-based. Structure-based methods are in principle analogous to high-throughput screening in that both target and ligand structure information is imperative. Structure-based approaches include ligand docking, pharmacophore, and ligand design methods. The article discusses the theory behind the most important methods and recent successful applications. Ligand-based methods use only ligand information to predict activity based on similarity or dissimilarity to previously known active ligands. We review widely used ligand-based methods such as ligand-based pharmacophores, molecular descriptors, and quantitative structure-activity relationships. In addition, important tools such as target/ligand databases, homology modeling, and ligand fingerprint methods, necessary for successful implementation of various computer-aided drug discovery/design methods in a drug discovery campaign, are discussed. Finally, computational methods for toxicity prediction and optimization for favorable physiologic properties are discussed with successful examples from the literature. PMID:24381236

  19. RighTime: A real time clock correcting program for MS-DOS-based computer systems

    NASA Technical Reports Server (NTRS)

    Becker, G. Thomas

    1993-01-01

    A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, and display of its history log, and which provide data for graphing the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
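
    The abstract does not disclose RighTime's internals, but the learning step it alludes to can be pictured as fitting a linear drift model between successive accurate time sets. A conceptual Python sketch follows (all names hypothetical; the real program operated on the DOS and CMOS clocks directly):

        import time

        class DriftCorrector:
            """Conceptual linear drift model: clock error grows at a constant rate."""

            def __init__(self):
                self.ref = None      # (true_time, raw_clock) at the last accurate set
                self.rate = 0.0      # seconds of error accumulated per second

            def time_set(self, true_time):
                """Record an accurate time set; learn the drift rate from the last one."""
                raw = time.time()    # stand-in for the uncorrected system clock
                if self.ref is not None:
                    t0, raw0 = self.ref
                    elapsed = true_time - t0
                    if elapsed > 0:
                        # Change in clock error divided by elapsed true time.
                        self.rate = ((raw - true_time) - (raw0 - t0)) / elapsed
                self.ref = (true_time, raw)

            def now(self):
                """Return the drift-corrected time."""
                raw = time.time()
                t0, raw0 = self.ref
                # Remove the offset at the last set plus drift accrued since then.
                return raw - (raw0 - t0) - self.rate * (raw - raw0)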

  20. Role of computational fluid dynamics in unsteady aerodynamics for aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Goorjian, Peter M.

    1989-01-01

    In the last two decades there have been extensive developments in computational unsteady transonic aerodynamics. Such developments are essential since the transonic regime plays an important role in the design of modern aircraft. There has therefore been a large effort to develop computational tools with which to accurately perform flutter analysis at transonic speeds. In the area of Computational Fluid Dynamics (CFD), unsteady transonic aerodynamics is characterized by the modeling of the motion of shock waves over aerodynamic bodies, such as wings. This modeling requires the solution of nonlinear partial differential equations. Most advanced codes, such as XTRAN3S, use the transonic small perturbation equation. Currently, XTRAN3S is being used for generic research in unsteady aerodynamics and aeroelasticity of almost-full aircraft configurations. Use of the Euler/Navier-Stokes equations for simple typical sections has just begun. A brief history of the development of CFD for aeroelastic applications is summarized, as are developments in unsteady transonic aerodynamics and aeroelasticity.
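
    For reference, the nonlinearity these codes must capture already appears in the steady two-dimensional form of the transonic small-perturbation equation for the perturbation potential \phi (XTRAN3S itself solves an unsteady three-dimensional variant; this form is quoted only to show the characteristic mixed-type term):

        \left[\, 1 - M_\infty^2 - (\gamma + 1)\, M_\infty^2\, \frac{\phi_x}{U_\infty} \right] \phi_{xx} + \phi_{yy} = 0

    When the bracketed coefficient changes sign, the equation switches from elliptic (locally subsonic) to hyperbolic (locally supersonic), which is what allows embedded shock waves to be represented.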

  1. Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers

    NASA Astrophysics Data System (ADS)

    Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott

    2017-11-01

    Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, ``non-intrusive'' LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
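
    Roughly following the published LSS literature (the abstract itself does not spell out the formulation), for dynamics \dot{u} = f(u, s) with design parameter s, least-squares shadowing replaces the ill-posed initial-value sensitivity problem with the least-squares problem

        \min_{v,\,\eta}\; \frac{1}{2T} \int_0^T \left( \|v\|^2 + \alpha^2 \eta^2 \right) dt
        \quad \text{subject to} \quad
        \dot{v} = \frac{\partial f}{\partial u}\, v + \frac{\partial f}{\partial s} + \eta\, f(u),

    so that sensitivities are computed along a perturbed trajectory that stays uniformly close to the reference one instead of diverging exponentially, at the cost of solving a coupled space-time system; the "non-intrusive" variant mentioned above reduces that cost.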

  2. Simulation of X-ray absorption spectra with orthogonality constrained density functional theory.

    PubMed

    Derricotte, Wallace D; Evangelista, Francesco A

    2015-06-14

    Orthogonality constrained density functional theory (OCDFT) [F. A. Evangelista, P. Shushkov and J. C. Tully, J. Phys. Chem. A, 2013, 117, 7378] is a variational time-independent approach for the computation of electronic excited states. In this work we extend OCDFT to compute core-excited states and generalize the original formalism to determine multiple excited states. Benchmark computations on a set of 13 small molecules and 40 excited states show that unshifted OCDFT/B3LYP excitation energies have a mean absolute error of 1.0 eV. Contrary to time-dependent DFT, OCDFT excitation energies for first- and second-row elements are computed with near-uniform accuracy. OCDFT core excitation energies are insensitive to the choice of the functional and the amount of Hartree-Fock exchange. We show that OCDFT is a powerful tool for the assignment of X-ray absorption spectra of large molecules by simulating the gas-phase near-edge spectrum of adenine and thymine.

  3. Comparative assessment of methods for the fusion transcripts detection from RNA-Seq data

    PubMed Central

    Kumar, Shailesh; Vo, Angie Duy; Qin, Fujun; Li, Hui

    2016-01-01

    RNA-Seq has made possible the global identification of fusion transcripts, i.e. “chimeric RNAs”. Even though various software packages have been developed to serve this purpose, they behave differently on different datasets provided by different developers. It is important for both users and developers to have an unbiased assessment of the performance of existing fusion detection tools. Toward this goal, we compared the performance of 12 well-known fusion detection software packages. We evaluated the sensitivity, false discovery rate, computing time, and memory usage of these tools in four different datasets (positive, negative, mixed, and test). We conclude that some tools are better than others in terms of sensitivity, positive predictive value, time consumption and memory usage. We also observed small overlaps between the fusions detected by different tools in the real dataset (test dataset). This could be due to false discoveries by the various tools, but could also mean that none of the tools is comprehensive. We have found that the performance of the tools depends on the quality, read length, and number of reads of the RNA-Seq data. We recommend that users choose the proper tools for their purpose based on the properties of their RNA-Seq data. PMID:26862001

  4. Edge control in a computer controlled optical surfacing process using a heterocercal tool influence function.

    PubMed

    Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun

    2016-11-14

    The edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to balance the consequences of the two following cases. Operating CCOS with a large tool overhang affects the accuracy of material removal, while a small overhang achieves more accurate removal but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' material removal from the inner region to the edge, while maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the propulsive power. The heterocercal TIF was theoretically analyzed and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of the entire surface error in large tool-to-mirror size-ratio conditions. This improvement will greatly help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.

  5. User manual for two simple postscript output FORTRAN plotting routines

    NASA Technical Reports Server (NTRS)

    Nguyen, T. X.

    1991-01-01

    Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, and standard FORTRAN plotting routines that generate output in the PostScript language seem a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
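
    The routines themselves are FORTRAN, but the underlying idea, emitting plain PostScript drawing operators to a text file, is language independent. A minimal sketch in Python (hypothetical output name and data; real routines would also scale data to page coordinates and draw axes):

        # Points are in PostScript units (1/72 inch), origin at lower left of page.
        pts = [(50, 100), (200, 300), (350, 150), (500, 400)]

        with open("plot.ps", "w") as f:
            f.write("%!PS-Adobe-3.0\n")          # PostScript header comment
            f.write("1 setlinewidth newpath\n")
            x0, y0 = pts[0]
            f.write(f"{x0} {y0} moveto\n")       # start the polyline
            for x, y in pts[1:]:
                f.write(f"{x} {y} lineto\n")     # segment to the next point
            f.write("stroke showpage\n")         # render the path and emit the page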

  6. The impact of computer self-efficacy, computer anxiety, and perceived usability and acceptability on the efficacy of a decision support tool for colorectal cancer screening

    PubMed Central

    Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian

    2011-01-01

    Objective This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024

  7. Parallel Calculation of Sensitivity Derivatives for Aircraft Design using Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Bischof, C. H.; Green, L. L.; Haigler, K. J.; Knauff, T. L., Jr.

    1994-01-01

    Sensitivity derivative (SD) calculation via automatic differentiation (AD) typical of that required for the aerodynamic design of a transport-type aircraft is considered. Two ways of computing SD via code generated by the ADIFOR automatic differentiation tool are compared for efficiency and applicability to problems involving large numbers of design variables. A vector implementation on a Cray Y-MP computer is compared with a coarse-grained parallel implementation on an IBM SP1 computer, employing a Fortran M wrapper. The SD are computed for a swept transport wing in turbulent, transonic flow; the number of geometric design variables varies from 1 to 60 with coupling between a wing grid generation program and a state-of-the-art, 3-D computational fluid dynamics program, both augmented for derivative computation via AD. For a small number of design variables, the Cray Y-MP implementation is much faster. As the number of design variables grows, however, the IBM SP1 becomes an attractive alternative in terms of compute speed, job turnaround time, and total memory available for solutions with large numbers of design variables. The coarse-grained parallel implementation also can be moved easily to a network of workstations.
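
    ADIFOR works by transforming Fortran source, which is beyond a short sketch, but the forward-mode rule it implements, propagating (value, derivative) pairs through every arithmetic operation, can be illustrated with a minimal dual-number class in Python (illustrative only; this is not the ADIFOR mechanism):

        class Dual:
            """Forward-mode AD: carry f(x) and f'(x) through arithmetic."""
            def __init__(self, val, dot=0.0):
                self.val, self.dot = val, dot
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.dot + o.dot)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                # Product rule: (uv)' = u'v + uv'
                return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
            __rmul__ = __mul__

        x = Dual(3.0, 1.0)           # seed dx/dx = 1
        y = x * x + 2.0 * x + 1.0    # y = x^2 + 2x + 1
        print(y.val, y.dot)          # 16.0 and 8.0 (dy/dx = 2x + 2 at x = 3)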

  8. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    The advent of supercomputers and of modern computational chemistry algorithms and codes created a powerful tool to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and in a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  9. NASA Tech Briefs, December 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics covered include: Video Mosaicking for Inspection of Gas Pipelines; Shuttle-Data-Tape XML Translator; Highly Reliable, High-Speed, Unidirectional Serial Data Links; Data-Analysis System for Entry, Descent, and Landing; Hybrid UV Imager Containing Face-Up AlGaN/GaN Photodiodes; Multiple Embedded Processors for Fault-Tolerant Computing; Hybrid Power Management; Magnetometer Based on Optoelectronic Microwave Oscillator; Program Predicts Time Courses of Human/ Computer Interactions; Chimera Grid Tools; Astronomer's Proposal Tool; Conservative Patch Algorithm and Mesh Sequencing for PAB3D; Fitting Nonlinear Curves by Use of Optimization Techniques; Tool for Viewing Faults Under Terrain; Automated Synthesis of Long Communication Delays for Testing; Solving Nonlinear Euler Equations With Arbitrary Accuracy; Self-Organizing-Map Program for Analyzing Multivariate Data; Tool for Sizing Analysis of the Advanced Life Support System; Control Software for a High-Performance Telerobot; Java Radar Analysis Tool; Architecture for Verifiable Software; Tool for Ranking Research Options; Enhanced, Partially Redundant Emergency Notification System; Close-Call Action Log Form; Task Description Language; Improved Small-Particle Powders for Plasma Spraying; Bonding-Compatible Corrosion Inhibitor for Rinsing Metals; Wipes, Coatings, and Patches for Detecting Hydrazines; Rotating Vessels for Growing Protein Crystals; Oscillating-Linear-Drive Vacuum Compressor for CO2; Mechanically Biased, Hinged Pairs of Piezoelectric Benders; Apparatus for Precise Indium-Bump Bonding of Microchips; Radiation Dosimetry via Automated Fluorescence Microscopy; Multistage Magnetic Separator of Cells and Proteins; Elastic-Tether Suits for Artificial Gravity and Exercise; Multichannel Brain-Signal-Amplifying and Digitizing System; Ester-Based Electrolytes for Low-Temperature Li-Ion Cells; Hygrometer for Detecting Water in Partially Enclosed Volumes; Radio-Frequency Plasma Cleaning of a Penning Malmberg Trap; Reduction of Flap Side Edge Noise - the Blowing Flap; and Preventing Accidental Ignition of Upper-Stage Rocket Motors.

  10. The Small Bodies Imager Browser --- finding asteroid and comet images without pain

    NASA Astrophysics Data System (ADS)

    Palmer, E.; Sykes, M.; Davis, D.; Neese, C.

    2014-07-01

    To facilitate accessing and downloading spatially resolved imagery of asteroids and comets in the NASA Planetary Data System (PDS), we have created the Small Bodies Image Browser. It is an HTML5 web page that runs inside a standard web browser and requires no installation (http://sbn.psi.edu/sbib/). The volume of data returned by spacecraft missions has grown substantially over the last decade. While this wealth of data provides scientists with ample support for research, it has greatly increased the difficulty of managing, accessing and processing these data. Further, the complexity necessary for a long-term archive results in an architecture that is efficient for computers, but not user friendly. The Small Bodies Image Browser (SBIB) is tied into the PDS archive of the Small Bodies Asteroid Subnode hosted at the Planetary Science Institute [1]. Currently, the tool contains the entire repository of the Dawn mission's encounter with Vesta [2], and other datasets will be added in the future. For Vesta, this includes both the level 1A and 1B images from the Framing Camera (FC) and the level 1B spectral cubes from the Visual and Infrared (VIR) spectrometer, providing over 30,000 individual images. A key strength of the tool is quick and easy access to these data. The tool allows searches based on clicking on a map or typing in coordinates. The SBIB can show an entire mission phase (such as cycle 7 of the Low Altitude Mapping Orbit) and the associated footprints, and it can also search by image name. The search can be narrowed by mission phase, resolution or instrument. Imagery archived in the PDS is generally provided by missions in a single or narrow range of formats. To enhance the value and usability of these data to researchers, SBIB makes them available in the original formats as well as PNG, JPEG and ArcGIS-compatible ISIS cubes [3]. Additionally, we provide header files for the VIR cubes so they can be read into ENVI without additional processing. Finally, we also provide both camera-based and map-projected products with geometric data embedded for use within ArcGIS and ISIS. We use the Gaskell shape model for terrain projections [4]. There are several other outstanding data analysis tools that have access to asteroid and comet data: JAsteroid (a derivative of JMARS [5]) and the Applied Physics Laboratory's Small Body Mapping Tool [6]. The SBIB has specifically focused on providing data in the easiest manner possible rather than trying to be an analytical tool.

  11. Planning chemical syntheses with deep neural networks and symbolic AI

    NASA Astrophysics Data System (ADS)

    Segler, Marwin H. S.; Preuss, Mike; Waller, Mark P.

    2018-03-01

    To plan the syntheses of small organic molecules, chemists use retrosynthesis, a problem-solving technique in which target molecules are recursively transformed into increasingly simpler precursors. Computer-aided retrosynthesis would be a valuable tool but at present it is slow and provides results of unsatisfactory quality. Here we use Monte Carlo tree search and symbolic artificial intelligence (AI) to discover retrosynthetic routes. We combined Monte Carlo tree search with an expansion policy network that guides the search, and a filter network to pre-select the most promising retrosynthetic steps. These deep neural networks were trained on essentially all reactions ever published in organic chemistry. Our system solves for almost twice as many molecules, thirty times faster than the traditional computer-aided search method, which is based on extracted rules and hand-designed heuristics. In a double-blind AB test, chemists on average considered our computer-generated routes to be equivalent to reported literature routes.
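
    The abstract does not state the exact selection rule used inside the tree search; one commonly used policy-guided rule in this family of methods (PUCT, shown here purely as a plausible illustration) picks the next transformation a in state s as

        a^{*} = \arg\max_a \left[ Q(s, a) + c\, P(s, a)\, \frac{\sqrt{N(s)}}{1 + N(s, a)} \right],

    where Q(s, a) is the mean value of rollouts through (s, a), P(s, a) is the prior from the expansion policy network, N counts visits, and c balances exploration against exploitation.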

  12. Mechanistic Insights and Computational Design of Transition-Metal Catalysts for Hydrogenation and Dehydrogenation Reactions.

    PubMed

    Chen, Xiangyang; Yang, Xinzheng

    2016-10-01

    Catalytic hydrogenation and dehydrogenation reactions are fundamentally important in chemical synthesis and industrial processes, and have potential applications in the storage and conversion of renewable energy. Modern computational quantum chemistry has become a powerful tool for understanding the structures and properties of compounds and elucidating the mechanisms of chemical reactions, and therefore holds great promise for the design of new catalysts. Herein, we review our computational studies on the catalytic hydrogenation of carbon dioxide and small organic carbonyl compounds, and on the dehydrogenation of amine-borane and alcohols, with an emphasis on elucidating reaction mechanisms and predicting new catalytic reactions, and in turn provide some general ideas for the design of high-efficiency, low-cost transition-metal complexes for hydrogenation and dehydrogenation reactions. © 2016 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single-discipline analysis (aerodynamics only) to multidisciplinary analysis - in this case, static aero-structural analysis - and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis, and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the wing (single-discipline analysis), the method, as implemented here, may not show a significant reduction in computational cost. Similar reductions were seen in the two-design-variable (DV) problem results, but not in the 8-DV results given here.

  14. Inheritance on processes, exemplified on distributed termination detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomsen, K.S.

    1987-02-01

    A multiple inheritance mechanism on processes is designed and presented within the framework of a small object-oriented language. Processes are described in classes, and the different action parts of a process inherited from different classes are executed in a coroutine-like style called alternation. The inheritance mechanism is a useful tool for factorizing the description of common aspects of processes. This is demonstrated within the domain of distributed programming by using the inheritance mechanism to factorize the description of distributed termination detection algorithms from the description of the distributed main computations for which termination is to be detected. A clear separation of concerns is obtained, and arbitrary combinations of termination detection algorithms and main computations can be formed. The same termination detection classes can also be used for more general purposes within distributed programming, such as detecting termination of each phase in a multi-phase main computation.

  15. Design of a fault-tolerant reversible control unit in molecular quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Bahadori, Golnaz; Houshmand, Monireh; Zomorodi-Moghadam, Mariam

    Quantum-dot cellular automata (QCA) is a promising emerging nanotechnology that has been attracting considerable attention due to its small feature size, ultra-low power consumption, and high clock frequency. Therefore, there have been many efforts to design computational units based on this technology. Despite these advantages of QCA-based nanotechnologies, their implementation is susceptible to a high error rate. On the other hand, using reversible computing leads to zero bit erasures and no energy dissipation. Because reversible computation does not lose information, fault detection happens with a high probability. In this paper, we first propose a fault-tolerant control unit using reversible gates which improves on the previous design. The proposed design is then synthesized to the QCA technology and simulated with the QCADesigner tool. Evaluation results indicate the performance of the proposed approach.

  16. Occupational risk identification using hand-held or laptop computers.

    PubMed

    Naumanen, Paula; Savolainen, Heikki; Liesivuori, Jyrki

    2008-01-01

    This paper describes the Work Environment Profile (WEP) program and its use in risk identification by computer. It is installed into a hand-held computer or a laptop to be used in risk identification during work site visits. A 5-category system is used to describe the identified risks in 7 groups, i.e., accidents, biological and physical hazards, ergonomic and psychosocial load, chemicals, and information technology hazards. Each group contains several qualifying factors. These 5 categories are colour-coded at this stage to aid with visualization. Risk identification produces visual summary images the interpretation of which is facilitated by colours. The WEP program is a tool for risk assessment which is easy to learn and to use both by experts and nonprofessionals. It is especially well adapted to be used both in small and in larger enterprises. Considerable time is saved as no paper notes are needed.

  17. Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools

    ERIC Educational Resources Information Center

    Jeon, Moongee

    2014-01-01

    This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…

  18. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…

  19. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase in sequencing data, which, in turn, requires efficient management of computational resources, such as computing time and memory, as well as rapid prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
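
    The heart of such region algebra is interval arithmetic. As a small illustration (in Python rather than the platform's C++), here is the intersection of two sorted lists of half-open intervals, one of the primitives a pipeline like this composes:

        def intersect(a, b):
            """Intersect two lists of half-open (start, end) intervals, each sorted."""
            out, i, j = [], 0, 0
            while i < len(a) and j < len(b):
                lo = max(a[i][0], b[j][0])
                hi = min(a[i][1], b[j][1])
                if lo < hi:                      # overlapping piece
                    out.append((lo, hi))
                # Advance whichever interval ends first.
                if a[i][1] < b[j][1]:
                    i += 1
                else:
                    j += 1
            return out

        print(intersect([(0, 10), (20, 30)], [(5, 25)]))   # [(5, 10), (20, 25)]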

  20. Beyond directed evolution - semi-rational protein engineering and design

    PubMed Central

    Lutz, Stefan

    2010-01-01

    Over the last two decades, directed evolution has transformed the field of protein engineering. The advances in understanding protein structure and function, in no small part a result of directed evolution studies, are increasingly empowering scientists and engineers to devise more effective methods for manipulating and tailoring biocatalysts. Abandoning large combinatorial libraries, the focus has shifted to small, functionally rich libraries and rational design. A critical component of the success of these emerging engineering strategies is the set of computational tools for the evaluation of protein sequence datasets and the analysis of conformational variations of amino acids in proteins. Highlighting the opportunities and limitations of such approaches, this review focuses on recent engineering and design examples that require screening or selection of small libraries. PMID:20869867

  1. gene2drug: a computational tool for pathway-based rational drug repositioning.

    PubMed

    Napolitano, Francesco; Carrella, Diego; Mandriani, Barbara; Pisonero-Vaquero, Sandra; Sirci, Francesco; Medina, Diego L; Brunetti-Pierri, Nicola; di Bernardo, Diego

    2018-05-01

    Drug repositioning has been proposed as an effective shortcut to drug discovery. The availability of large collections of transcriptional responses to drugs enables computational approaches to drug repositioning directly based on measured molecular effects. We introduce a novel computational methodology for rational drug repositioning, which exploits the transcriptional responses following treatment with small molecules. Specifically, given a therapeutic target gene, a prioritization of potentially effective drugs is obtained by assessing their impact on the transcription of genes in the pathway(s) that include the target. We performed in silico validation and comparison with a state-of-the-art technique based on similar principles. We next performed experimental validation in two different real-case drug repositioning scenarios: (i) upregulation of the glutamate-pyruvate transaminase (GPT), which has been shown to induce reduction of oxalate levels in a mouse model of primary hyperoxaluria, and (ii) activation of the transcription factor TFEB, a master regulator of lysosomal biogenesis and autophagy, whose modulation may be beneficial in neurodegenerative disorders. A web tool for Gene2drug is freely available at http://gene2drug.tigem.it. An R package is under development and can be obtained from https://github.com/franapoli/gep2pep. dibernardo@tigem.it. Supplementary data are available at Bioinformatics online.

  2. RegulonDB version 9.0: high-level integration of gene regulation, coexpression, motif clustering and beyond

    PubMed Central

    Gama-Castro, Socorro; Salgado, Heladia; Santos-Zavaleta, Alberto; Ledezma-Tejeida, Daniela; Muñiz-Rascado, Luis; García-Sotelo, Jair Santiago; Alquicira-Hernández, Kevin; Martínez-Flores, Irma; Pannier, Lucia; Castro-Mondragón, Jaime Abraham; Medina-Rivera, Alejandra; Solano-Lira, Hilda; Bonavides-Martínez, César; Pérez-Rueda, Ernesto; Alquicira-Hernández, Shirley; Porrón-Sotelo, Liliana; López-Fuentes, Alejandra; Hernández-Koutoucheva, Anastasia; Moral-Chávez, Víctor Del; Rinaldi, Fabio; Collado-Vides, Julio

    2016-01-01

    RegulonDB (http://regulondb.ccg.unam.mx) is one of the most useful and important resources on bacterial gene regulation, as it integrates the scattered scientific knowledge of the best-characterized organism, Escherichia coli K-12, in a database that organizes large amounts of data. Its electronic format enables researchers to compare their results with the legacy of previous knowledge and supports bioinformatics tools and model building. Here, we summarize our progress with RegulonDB since our last Nucleic Acids Research publication describing RegulonDB, in 2013. In addition to keeping curation up to date, we report a collection of 232 interactions with small RNAs affecting 192 genes, and the complete repertoire of 189 Elementary Genetic Sensory-Response units (GENSOR units), integrating the signal, regulatory interactions, and metabolic pathways they govern. These additions represent major progress toward a higher level of understanding of regulated processes. We have updated the computationally predicted transcription factors, which total 304 (184 with experimental evidence and 120 from computational predictions); we updated our position-weight matrices and have included tools for clustering them into evolutionary families. We describe our semiautomatic strategy to accelerate curation, including datasets from high-throughput experiments, a novel coexpression distance to search for ‘neighborhood’ genes related to known operons and regulons, and computational developments. PMID:26527724

  3. Stereoscopic vascular models of the head and neck: A computed tomography angiography visualization.

    PubMed

    Cui, Dongmei; Lynch, James C; Smith, Andrew D; Wilson, Timothy D; Lehman, Michael N

    2016-01-01

    Computer-assisted 3D models are used in some medical and allied health science schools; however, they are often limited to online use and 2D flat-screen-based imaging. Few schools take advantage of 3D stereoscopic learning tools, or of clinically relevant anatomical variations, when teaching anatomy. A new approach to teaching anatomy includes the use of computed tomography angiography (CTA) images of the head and neck to create clinically relevant 3D stereoscopic virtual models. These high-resolution images of the arteries can be used in unique and innovative ways to create 3D virtual models of the vasculature as a tool for teaching anatomy. Blood vessel 3D models are presented stereoscopically in a virtual reality environment, can be rotated 360° in all axes, and magnified according to need. In addition, flexible views of internal structures are possible. Images are displayed in a stereoscopic mode, and students view images in a small theater-like classroom while wearing polarized 3D glasses. Reconstructed 3D models enable students to visualize vascular structures with clinically relevant anatomical variations in the head and neck and appreciate spatial relationships among the blood vessels, the skull and the skin. © 2015 American Association of Anatomists.

  4. Applications of computer-aided approaches in the development of hepatitis C antiviral agents.

    PubMed

    Ganesan, Aravindhan; Barakat, Khaled

    2017-04-01

    Hepatitis C virus (HCV) is a global health problem that causes several chronic life-threatening liver diseases. The number of people affected by HCV is rising annually. Since 2011, the FDA has approved several anti-HCV drugs, while many other promising HCV drugs are currently in late clinical trials. Areas covered: This review discusses the applications of different computational approaches in HCV drug design. Expert opinion: Molecular docking and virtual screening approaches have emerged as a low-cost means to screen large databases and identify potential small-molecule hits against HCV targets. Ligand-based approaches are useful for filtering out compounds lacking the physicochemical properties needed to inhibit HCV targets. Molecular dynamics (MD) remains a useful tool for optimizing ligand-protein complexes and understanding ligand binding modes and drug resistance mechanisms in HCV. Despite their varied roles, the application of in-silico approaches in HCV drug design is still in its infancy. A more mature application should aim at modelling the whole HCV replicon in its active form and help to identify new effective druggable sites within the replicon system. With more technological advancements, the role of computer-aided methods is only going to increase severalfold in the development of next-generation HCV drugs.

  5. Computational Identification and Functional Predictions of Long Noncoding RNA in Zea mays

    PubMed Central

    Boerner, Susan; McGinnis, Karen M.

    2012-01-01

    Background Computational analysis of cDNA sequences from multiple organisms suggests that a large portion of transcribed DNA does not code for a functional protein. In mammals, noncoding transcription is abundant, and often results in functional RNA molecules that do not appear to encode proteins. Many long noncoding RNAs (lncRNAs) appear to have epigenetic regulatory function in humans, including HOTAIR and XIST. While epigenetic gene regulation is clearly an essential mechanism in plants, relatively little is known about the presence or function of lncRNAs in plants. Methodology/Principal Findings To explore the connection between lncRNA and epigenetic regulation of gene expression in plants, a computational pipeline using the programming language Python has been developed and applied to maize full length cDNA sequences to identify, classify, and localize potential lncRNAs. The pipeline was used in parallel with an SVM tool for identifying ncRNAs to identify the maximal number of ncRNAs in the dataset. Although the available library of sequences was small and potentially biased toward protein coding transcripts, 15% of the sequences were predicted to be noncoding. Approximately 60% of these sequences appear to act as precursors for small RNA molecules and may function to regulate gene expression via a small RNA dependent mechanism. ncRNAs were predicted to originate from both genic and intergenic loci. Of the lncRNAs that originated from genic loci, ∼20% were antisense to the host gene loci. Conclusions/Significance Consistent with similar studies in other organisms, noncoding transcription appears to be widespread in the maize genome. Computational predictions indicate that maize lncRNAs may function to regulate expression of other genes through multiple RNA mediated mechanisms. PMID:22916204

  6. The interpretation of dream meaning: Resolving ambiguity using Latent Semantic Analysis in a small corpus of text.

    PubMed

    Altszyler, Edgar; Ribeiro, Sidarta; Sigman, Mariano; Fernández Slezak, Diego

    2017-11-01

    Computer-based dream content analysis relies on word frequencies within predefined categories in order to identify different elements in text. As a complementary approach, we explored the capabilities and limitations of word-embedding techniques for identifying word usage patterns among dream reports. These tools allow us to quantify word associations in text and to identify the meaning of target words. Word embeddings have been extensively studied in large datasets, but only a few studies analyze semantic representations in small corpora. To fill this gap, we compared the capabilities of Skip-gram and Latent Semantic Analysis (LSA) to extract semantic associations from dream reports. LSA showed better performance than Skip-gram on small corpora in two tests. Furthermore, LSA captured relevant word associations in the dream collection, even in cases with low-frequency words or small numbers of dreams. Word associations in dream reports can thus be quantified by LSA, which opens new avenues for dream interpretation and decoding. Copyright © 2017 Elsevier Inc. All rights reserved.
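
    For readers unfamiliar with LSA, the pipeline is compact: build a term-document matrix, take a truncated SVD, and compare terms by cosine similarity in the reduced space. A minimal scikit-learn sketch on a hypothetical three-report corpus (not the authors' data or code):

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # Hypothetical miniature corpus of dream reports.
        reports = [
            "i was flying over water and the water was dark",
            "flying above a dark river at night",
            "walking through a bright house with many doors",
        ]

        vec = CountVectorizer()
        X = vec.fit_transform(reports)          # documents x terms
        svd = TruncatedSVD(n_components=2)      # latent semantic dimensions
        term_vecs = svd.fit_transform(X.T)      # terms x k: LSA term vectors

        terms = list(vec.get_feature_names_out())
        sim = cosine_similarity(term_vecs)
        i, j = terms.index("flying"), terms.index("dark")
        print(f"sim(flying, dark) = {sim[i, j]:.2f}")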

  7. CFD: computational fluid dynamics or confounding factor dissemination? The role of hemodynamics in intracranial aneurysm rupture risk assessment.

    PubMed

    Xiang, J; Tutino, V M; Snyder, K V; Meng, H

    2014-10-01

    Image-based computational fluid dynamics holds a prominent position in the evaluation of intracranial aneurysms, especially as a promising tool to stratify rupture risk. Current computational fluid dynamics findings correlating both high and low wall shear stress with intracranial aneurysm growth and rupture puzzle researchers and clinicians alike. These conflicting findings may stem from inconsistent parameter definitions, small datasets, and intrinsic complexities in intracranial aneurysm growth and rupture. In Part 1 of this 2-part review, we proposed a unifying hypothesis: both high and low wall shear stress drive intracranial aneurysm growth and rupture through mural cell-mediated and inflammatory cell-mediated destructive remodeling pathways, respectively. In the present report, Part 2, we delineate different wall shear stress parameter definitions and survey recent computational fluid dynamics studies, in light of this mechanistic heterogeneity. In the future, we expect that larger datasets, better analyses, and increased understanding of hemodynamic-biologic mechanisms will lead to more accurate predictive models for intracranial aneurysm risk assessment from computational fluid dynamics. © 2014 by American Journal of Neuroradiology.

  8. Fast simulation tool for ultraviolet radiation at the earth's surface

    NASA Astrophysics Data System (ADS)

    Engelsen, Ola; Kylling, Arve

    2005-04-01

    FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
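
    The CIE erythemal weighting and UV-index convention mentioned above are compact enough to sketch directly: given a surface spectral irradiance E(lambda) in W m-2 nm-1, the UV index is 40 m2 W-1 times the erythemally weighted integral over 290-400 nm. A Python illustration (the flat test spectrum is hypothetical; FastRT would supply the real one):

        import numpy as np

        def cie_erythemal(wl):
            """CIE erythemal action spectrum (wl in nm), piecewise per ISO/CIE."""
            w = np.ones_like(wl, dtype=float)
            w = np.where((wl > 298) & (wl <= 328), 10 ** (0.094 * (298 - wl)), w)
            w = np.where((wl > 328) & (wl <= 400), 10 ** (0.015 * (140 - wl)), w)
            return w

        wl = np.arange(290.0, 400.05, 0.05)      # nm, FastRT's finest grid spacing
        E = np.full_like(wl, 5e-4)               # hypothetical flat spectrum, W m-2 nm-1
        uvi = 40.0 * np.trapz(E * cie_erythemal(wl), wl)
        print(f"UV index ~ {uvi:.2f}")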

  9. A Rat Body Phantom for Radiation Analysis

    NASA Technical Reports Server (NTRS)

    Qualls, Garry D.; Clowdsley, Martha S.; Slaba, Tony C.; Walker, Steven A.

    2010-01-01

    To reduce the uncertainties associated with estimating the biological effects of ionizing radiation in tissue, researchers rely on laboratory experiments in which mono-energetic, single-species beams are applied to cell cultures, insects, and small animals. To estimate the radiation effects on astronauts in deep space or low Earth orbit, who are exposed to mixed-field, broad-spectrum radiation, these experimental results are extrapolated and combined with other data to produce radiation quality factors, radiation weighting factors, and other risk-related quantities for humans. One way to reduce the uncertainty associated with such extrapolations is to utilize analysis tools that are applicable to both laboratory and space environments. The use of physical and computational body phantoms to predict radiation exposure and its effects is well established, and a wide range of human and non-human phantoms are in use today. In this paper, a computational rat phantom is presented, along with a description of the process through which that phantom has been coupled to existing radiation analysis tools. Sample results are presented for two space radiation environments.

  10. SwissADME: a free web tool to evaluate pharmacokinetics, drug-likeness and medicinal chemistry friendliness of small molecules

    PubMed Central

    Daina, Antoine; Michielin, Olivier; Zoete, Vincent

    2017-01-01

    To be effective as a drug, a potent molecule must reach its target in the body in sufficient concentration, and stay there in a bioactive form long enough for the expected biologic events to occur. Drug development involves assessment of absorption, distribution, metabolism and excretion (ADME) increasingly earlier in the discovery process, at a stage when the compounds considered are numerous but access to physical samples is limited. In that context, computer models constitute valid alternatives to experiments. Here, we present the new SwissADME web tool that gives free access to a pool of fast yet robust predictive models for physicochemical properties, pharmacokinetics, drug-likeness and medicinal chemistry friendliness, among which are in-house proficient methods such as the BOILED-Egg, iLOGP and Bioavailability Radar. Easy and efficient input and interpretation are ensured thanks to a user-friendly interface through the login-free website http://www.swissadme.ch. Specialists, but also nonexperts in cheminformatics or computational chemistry, can rapidly predict key parameters for a collection of molecules to support their drug discovery endeavours. PMID:28256516

  11. Integrating Computational Science Tools into a Thermodynamics Course

    NASA Astrophysics Data System (ADS)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tatli, Emre; Ferroni, Paolo; Mazzoccoli, Jason

    The possible use of compact heat exchangers (HXs) in sodium-cooled fast reactors (SFR) employing a Brayton cycle is promising due to their high power density and resulting small volume in comparison with conventional shell-and-tube HXs. However, the small diameter of their channels makes them more susceptible to plugging due to Na2O deposition during accident conditions. Although cold traps are designed to reduce oxygen impurity levels in the sodium coolant, their failure, in conjunction with accidental air ingress into the sodium boundary, could result in coolant oxygen levels that are above the saturation limit in the cooler parts of the HX channels. This can result in Na2O crystallization and the formation of solid deposits on cooled channel surfaces, limiting or even blocking coolant flow. The development of analysis tools capable of modeling the formation of these deposits in the presence of sodium flow will allow designers of SFRs to properly size the HX channels so that, in the scenario mentioned above, the reactor operator has sufficient time to detect and react to the affected HX. Until now, analytical methodologies to predict the formation of these deposits have been developed, but never implemented in a high-fidelity computational tool suited to modern reactor design techniques. This paper summarizes the challenges and the current status in the development of a Computational Fluid Dynamics (CFD) methodology to predict deposit formation, with particular emphasis on sensitivity studies on some parameters affecting deposition.

  13. Mixing the Green-Ampt model and Curve Number method as an empirical tool for rainfall excess estimation in small ungauged catchments.

    NASA Astrophysics Data System (ADS)

    Grimaldi, S.; Petroselli, A.; Romano, N.

    2012-04-01

    The Soil Conservation Service - Curve Number (SCS-CN) method is a popular rainfall-runoff model that is widely used to estimate direct runoff from small and ungauged basins. The SCS-CN is a simple and valuable approach for estimating the total stream-flow volume generated by a storm rainfall, but it was developed for use with daily rainfall data. To overcome this drawback, we propose to include the Green-Ampt (GA) infiltration model in a mixed procedure, referred to as CN4GA (Curve Number for Green-Ampt), which aims to distribute in time the information provided by the SCS-CN method so as to provide estimates of sub-daily incremental rainfall excess. For a given storm, the computed SCS-CN total net rainfall amount is used to calibrate the soil hydraulic conductivity parameter of the Green-Ampt model. The proposed procedure was evaluated by analyzing 100 rainfall-runoff events observed in four small catchments of varying size. CN4GA appears to be an encouraging tool for predicting net rainfall peak and duration values and has shown, at least for the test cases considered in this study, better agreement with observed hydrographs than the classic SCS-CN method.
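
    The SCS-CN water balance at the core of the procedure is a closed-form formula, so the daily step of CN4GA can be sketched directly in Python (depths in mm; the Green-Ampt calibration is only indicated in a comment, since its details depend on the event):

        def scs_cn_runoff(P, CN, ia_ratio=0.2):
            """SCS-CN direct runoff Q (mm) for storm depth P (mm) and curve number CN."""
            S = 25400.0 / CN - 254.0        # potential maximum retention, mm
            Ia = ia_ratio * S               # initial abstraction
            if P <= Ia:
                return 0.0
            return (P - Ia) ** 2 / (P - Ia + S)

        P, CN = 60.0, 75                    # hypothetical storm and catchment
        Q = scs_cn_runoff(P, CN)
        print(f"total net rainfall (runoff) = {Q:.1f} mm")
        # CN4GA would now tune the Green-Ampt hydraulic conductivity so that the
        # infiltrated depth P - Q is reproduced, distributing Q within the storm.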

  14. Ligand.Info small-molecule Meta-Database.

    PubMed

    von Grotthuss, Marcin; Koczyk, Grzegorz; Pas, Jakub; Wyrwicz, Lucjan S; Rychlewski, Leszek

    2004-12-01

    Ligand.Info is a compilation of various publicly available databases of small molecules. The total size of the Meta-Database is over 1 million entries. The compound records contain calculated three-dimensional coordinates and sometimes information about biological activity. Some molecules carry information about FDA drug-approval status or about anti-HIV activity. The Meta-Database can be downloaded from the http://Ligand.Info web page. The database can also be screened using a Java-based tool. The tool can interactively cluster sets of molecules on the user side and automatically download similar molecules from the server. The application requires the Java Runtime Environment 1.4 or higher, which can be automatically downloaded from Sun Microsystems or Apple Computer and installed during the first use of Ligand.Info on desktop systems that support Java (MS Windows, Mac OS, Solaris, and Linux). The Ligand.Info Meta-Database can be used for virtual high-throughput screening of new potential drugs. The presented examples showed that, using a known antiviral drug as the query, the system was able to find other antiviral drugs and inhibitors.

  15. High content screening of defined chemical libraries using normal and glioma-derived neural stem cell lines.

    PubMed

    Danovi, Davide; Folarin, Amos A; Baranowski, Bart; Pollard, Steven M

    2012-01-01

    Small molecules with potent biological effects on the fate of normal and cancer-derived stem cells represent both useful research tools and new drug leads for regenerative medicine and oncology. Long-term expansion of mouse and human neural stem cells is possible using adherent monolayer culture. These cultures represent a useful cellular resource for carrying out image-based high content screening of small chemical libraries. Improvements in automated microscopy, desktop computational power, and freely available image processing tools now mean that such chemical screens are realistic to undertake in individual academic laboratories. Here we outline a cost-effective and versatile time-lapse imaging strategy suitable for chemical screening. Protocols are described for the handling and screening of human fetal Neural Stem (NS) cell lines and their malignant counterparts, Glioblastoma-derived neural stem cells (GNS). We focus on identification of cytostatic and cytotoxic "hits" and discuss future possibilities and challenges for extending this approach to assay lineage commitment and differentiation. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input and output can only be displayed on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm - 1 (SHA-1), the Message Digest - 5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate in the software confirmation tool we developed. This paper gives a brief overview of the SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps of the AES to reduce a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), which is a mathematically unique representation or signature of the former that could be displayed on a computer's monitor. This paper starts with a simple definition and example to illustrate the use of a hash function. It concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
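
    The paper's exact construction is not reproduced here, but a common way to build a hash from a block cipher such as the AES is the Davies-Meyer compression function, sketched below with the Python cryptography package. This is an illustration of the general idea only, not PNNL's tool:

        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        BLOCK = 16  # AES block size in bytes

        def aes_encrypt_block(key16: bytes, block16: bytes) -> bytes:
            enc = Cipher(algorithms.AES(key16), modes.ECB()).encryptor()
            return enc.update(block16) + enc.finalize()

        def davies_meyer_hash(data: bytes) -> bytes:
            # Pad with 0x80 then zeros to a whole number of 16-byte blocks.
            data += b"\x80" + b"\x00" * ((-len(data) - 1) % BLOCK)
            h = b"\x00" * BLOCK                 # fixed initial value
            for i in range(0, len(data), BLOCK):
                m = data[i:i + BLOCK]           # message block acts as the AES key
                c = aes_encrypt_block(m, h)
                h = bytes(a ^ b for a, b in zip(c, h))  # H = E_m(H) XOR H
            return h

        # The digest is a short, display-friendly signature of a large input.
        print(davies_meyer_hash(b"contents of the software image").hex())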

  17. CRADA Final Report: Weld Predictor App

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billings, Jay Jay

    Welding is an important manufacturing process used in a broad range of industries and market sectors, including automotive, aerospace, heavy manufacturing, medical, and defense. During welded fabrication, high localized heat input and subsequent rapid cooling result in the creation of residual stresses and distortion. These residual stresses can significantly affect the fatigue resistance, cracking behavior, and load-carrying capacity of welded structures during service. Further, additional fitting and tacking time is often required to fit distorted subassemblies together, resulting in non-value-added cost. Using trial-and-error methods to determine which welding parameters, welding sequences, and fixture designs will most effectively reduce distortion is a time-consuming and expensive process. For complex structures with many welds, this approach can take several months. For this reason, efficient and accurate methods of mitigating distortion are in demand across all industries where welding is used. Analytical and computational methods and commercial software tools have been developed to predict welding-induced residual stresses and distortion. Welding process parameters, fixtures, and tooling can be optimized to reduce softening of the heat-affected zone (HAZ) and minimize weld residual stress and distortion, improving performance and reducing design, fabrication, and testing costs. However, weld modeling technology tools are currently accessible only to engineers and designers with a background in finite element analysis (FEA) who work with large manufacturers, research institutes, and universities with access to high-performance computing (HPC) resources. Small and medium enterprises (SMEs) in the US do not typically have the human and computational resources needed to adopt and utilize weld modeling technology. To give engineers with no background in FEA, and SMEs, access to this important design tool, EWI and the Ohio Supercomputer Center (OSC) developed the online weld application software tool "WeldPredictor" ( https://eweldpredictor.ewi.org ). About 1400 users have tested this application. This project marked the beginning of development on the next version of WeldPredictor, which addresses many features left outstanding in the original: it adds 3D models, allows more material hardening laws, models material phase transformation, and uses open-source finite element solvers to solve problems quickly (as opposed to expensive commercial tools).

  18. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  19. THE COMPUTER AND SMALL BUSINESS.

    DTIC Science & Technology

    The place of the computer in small business is investigated with respect to what type of problems it can solve for small business and how the small...firm can acquire time on one. The decision-making process and the importance of information are discussed in relation to small business. Several...applications of computers are examined to show how the firm can use the computer in day-to-day business operations. The capabilities of a digital computer

  20. A Proposal for Automatic Fruit Harvesting by Combining a Low Cost Stereovision Camera and a Robotic Arm

    PubMed Central

    Font, Davinia; Pallejà, Tomàs; Tresanchez, Marcel; Runcan, David; Moreno, Javier; Martínez, Dani; Teixidó, Mercè; Palacín, Jordi

    2014-01-01

    This paper proposes the development of an automatic fruit harvesting system combining a low cost stereovision camera, placed in the gripper tool, and a robotic arm. The stereovision camera is used to estimate the size, distance and position of the fruits whereas the robotic arm is used to mechanically pick up the fruits. The low cost stereovision system has been tested in laboratory conditions with a reference small object, an apple and a pear at 10 different intermediate distances from the camera. The average distance error was from 4% to 5%, and the average diameter error was up to 30% in the case of a small object and in a range from 2% to 6% in the case of a pear and an apple. The stereovision system has been attached to the gripper tool in order to obtain the relative distance, orientation and size of the fruit. The harvesting stage requires the initial fruit location, the computation of the inverse kinematics of the robotic arm in order to place the gripper tool in front of the fruit, and a final pickup approach by iteratively adjusting the vertical and horizontal position of the gripper tool in a closed visual loop. The complete system has been tested in controlled laboratory conditions with uniform illumination applied to the fruits. As a future work, this system will be tested and improved in conventional outdoor farming conditions. PMID:24984059
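
    The abstract does not give the camera model, but for a calibrated, rectified stereo pair the distance and size estimates follow from standard triangulation; a minimal sketch with hypothetical numbers:

        def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
            """Depth from a rectified stereo pair: Z = f * B / d."""
            return focal_px * baseline_m / disparity_px

        # Hypothetical calibration: 700 px focal length, 6 cm baseline, 42 px disparity
        z = stereo_depth_m(700.0, 0.06, 42.0)   # -> 1.0 m to the fruit
        # Size then follows from the pinhole model: width = w_px * Z / f
        width_m = 55.0 * z / 700.0              # a 55 px wide apple is ~7.9 cm
        print(round(z, 2), round(width_m, 3))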

  1. A proposal for automatic fruit harvesting by combining a low cost stereovision camera and a robotic arm.

    PubMed

    Font, Davinia; Pallejà, Tomàs; Tresanchez, Marcel; Runcan, David; Moreno, Javier; Martínez, Dani; Teixidó, Mercè; Palacín, Jordi

    2014-06-30

    This paper proposes the development of an automatic fruit harvesting system combining a low cost stereovision camera, placed in the gripper tool, and a robotic arm. The stereovision camera is used to estimate the size, distance and position of the fruits whereas the robotic arm is used to mechanically pick up the fruits. The low cost stereovision system has been tested in laboratory conditions with a reference small object, an apple and a pear at 10 different intermediate distances from the camera. The average distance error was from 4% to 5%, and the average diameter error was up to 30% in the case of a small object and in a range from 2% to 6% in the case of a pear and an apple. The stereovision system has been attached to the gripper tool in order to obtain the relative distance, orientation and size of the fruit. The harvesting stage requires the initial fruit location, the computation of the inverse kinematics of the robotic arm in order to place the gripper tool in front of the fruit, and a final pickup approach by iteratively adjusting the vertical and horizontal position of the gripper tool in a closed visual loop. The complete system has been tested in controlled laboratory conditions with uniform illumination applied to the fruits. As a future work, this system will be tested and improved in conventional outdoor farming conditions.

  2. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, a statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include research on the molecular mode of action of disease, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds, and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
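
    MetaCore's exact statistics are proprietary, but list enrichment of the kind described is commonly computed as a one-sided hypergeometric test; a minimal sketch with SciPy:

        from scipy.stats import hypergeom

        def enrichment_pvalue(n_universe: int, n_pathway: int,
                              n_list: int, n_overlap: int) -> float:
            """P(overlap >= observed) when drawing n_list genes at random from a
            universe that contains n_pathway pathway members."""
            return hypergeom.sf(n_overlap - 1, n_universe, n_pathway, n_list)

        # 20000-gene universe, 150-gene pathway, 300-gene hit list, 12 hits
        print(enrichment_pvalue(20000, 150, 300, 12))  # small p => over-represented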

  3. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    NASA Astrophysics Data System (ADS)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small scale damage; however, due to the complex nature of GWs, accurate and efficient computational tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  4. Cognitive simulation as a tool for cognitive task analysis.

    PubMed

    Roth, E M; Woods, D D; Pople, H E

    1992-10-01

    Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.

  5. Measurement of sound velocity made easy using harmonic resonant frequencies with everyday mobile technology

    NASA Astrophysics Data System (ADS)

    Hirth, Michael; Kuhn, Jochen; Müller, Andreas

    2015-02-01

    Recent articles about smartphone experiments have described their applications as experimental tools in different physical contexts [1-4]. They have established that smartphones facilitate experimental setups, thanks to the small size and diverse functions of mobile devices, in comparison to setups with computer-based measurements. In the experiment described in this article, the experimental setup is reduced to a minimum. The objective of the experiment is to determine the speed of sound with a high degree of accuracy using everyday tools. A recently published article proposes a time-of-flight method in which sound or acoustic pulses are reflected at the ends of an open tube [5]. In contrast, the following experimental idea is based on the harmonic resonant frequencies of such a tube, excited simultaneously by a noise signal.
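
    For a tube open at both ends the resonances sit at f_n = n*v/(2L), so the speed of sound can be read off the spacing of adjacent peaks; a minimal sketch (the peak frequencies are hypothetical):

        import numpy as np

        def speed_of_sound(peak_freqs_hz, tube_length_m):
            """v from the mean spacing of harmonic resonance peaks:
            f_n = n * v / (2 L)  =>  v = 2 L * mean(diff(f_n))."""
            df = np.mean(np.diff(np.sort(peak_freqs_hz)))
            return 2.0 * tube_length_m * df

        # Peaks picked from a smartphone spectrum of a 0.50 m open tube
        peaks = [343.0, 686.5, 1029.0, 1372.5]          # Hz
        print(round(speed_of_sound(peaks, 0.50), 1))    # -> ~343 m/s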

  6. A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Buonanno, Michael; Mavris, Dimitri

    2005-01-01

    The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criteria of aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimal baseline vehicle. These concept decisions, such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements. In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.

  7. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer-aided design, computer-aided manufacturing, and computer-aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  8. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.

  9. Computational biology of RNA interactions.

    PubMed

    Dieterich, Christoph; Stadler, Peter F

    2013-01-01

    The biodiversity of the RNA world has been underestimated for decades. RNA molecules are key building blocks, sensors, and regulators of modern cells. The biological function of RNA molecules cannot be separated from their ability to bind to and interact with a wide space of chemical species, including small molecules, nucleic acids, and proteins. Computational chemists, physicists, and biologists have developed a rich tool set for modeling and predicting RNA interactions. These interactions are to some extent determined by the binding conformation of the RNA molecule. RNA binding conformations are approximated with often acceptable accuracy by sequence and secondary structure motifs. Secondary structure ensembles of a given RNA molecule can be efficiently computed in many relevant situations by employing a standard energy model for base pair interactions and dynamic programming techniques. The case of bi-molecular RNA-RNA interactions can be seen as an extension of this approach. However, unbiased transcriptome-wide scans for local RNA-RNA interactions are computationally challenging, yet they become efficient if the binding motif/mode is known and other external information can be used to confine the search space. Computational methods are less developed for proteins and small molecules, which bind to RNA with very high specificity. Binding descriptors of proteins are usually determined by in vitro high-throughput assays (e.g., microarrays or sequencing). Intriguingly, recent experimental advances, which are mostly based on light-induced cross-linking of binding partners, render in vivo binding patterns accessible, yet they require new computational methods for careful data interpretation. The grand challenge is to model the in vivo situation where a complex interplay of RNA binders competes for the same target RNA molecule. Evidently, bioinformaticians are just catching up with the impressive pace of these developments. Copyright © 2012 John Wiley & Sons, Ltd.
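
    As a concrete (toy) instance of the dynamic programming mentioned above, the classic Nussinov recursion maximizes the number of nested base pairs; production tools such as those in the ViennaRNA package use full nearest-neighbor energy models instead:

        def nussinov_max_pairs(seq: str, min_loop: int = 3) -> int:
            """Maximum number of nested base pairs (Nussinov recursion)."""
            ok = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                  ("G", "U"), ("U", "G")}
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i][j - 1]                   # j left unpaired
                    for k in range(i, j - min_loop):      # j pairs with k
                        if (seq[k], seq[j]) in ok:
                            left = dp[i][k - 1] if k > i else 0
                            best = max(best, left + 1 + dp[k + 1][j - 1])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(nussinov_max_pairs("GGGAAAUCC"))  # -> 3 pairs for this toy hairpin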

  10. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
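
    The published model details are in the cited paper; the general shape of the computation, a Weibull distribution in natural time (event count) conditioned on the number of small earthquakes since the last large one, can be sketched as follows (parameters hypothetical):

        import math

        def ntw_probability(n_small: int, tau: float, beta: float,
                            horizon: int) -> float:
            """Probability a large event occurs within the next `horizon` small
            events, given n_small small events since the last large one."""
            cdf = lambda n: 1.0 - math.exp(-((n / tau) ** beta))
            return 1.0 - (1.0 - cdf(n_small + horizon)) / (1.0 - cdf(n_small))

        # Hypothetical fit: characteristic count tau = 1200, shape beta = 1.4
        print(round(ntw_probability(900, 1200.0, 1.4, 100), 3))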

  11. Ligand design by a combinatorial approach based on modeling and experiment: application to HLA-DR4

    NASA Astrophysics Data System (ADS)

    Evensen, Erik; Joseph-McCarthy, Diane; Weiss, Gregory A.; Schreiber, Stuart L.; Karplus, Martin

    2007-07-01

    Combinatorial synthesis and large scale screening methods are being used increasingly in drug discovery, particularly for finding novel lead compounds. Although these "random" methods sample larger areas of chemical space than traditional synthetic approaches, only a relatively small percentage of all possible compounds are practically accessible. It is therefore helpful to select regions of chemical space that have greater likelihood of yielding useful leads. When three-dimensional structural data are available for the target molecule this can be achieved by applying structure-based computational design methods to focus the combinatorial library. This is advantageous over the standard usage of computational methods to design a small number of specific novel ligands, because here computation is employed as part of the combinatorial design process and so is required only to determine a propensity for binding of certain chemical moieties in regions of the target molecule. This paper describes the application of the Multiple Copy Simultaneous Search (MCSS) method, an active site mapping and de novo structure-based design tool, to design a focused combinatorial library for the class II MHC protein HLA-DR4. Methods for synthesizing and screening the computationally designed library are presented, and evidence is provided to show that binding was achieved. Although the structure of the protein-ligand complex could not be determined, experimental results, including cross-exclusion of a known HLA-DR4 peptide ligand (HA) by a compound from the library, and computational model building suggest that at least one of the ligands designed and identified by the methods described binds in a mode similar to that of native peptides.

  12. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    PubMed

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  13. A Computer Model for Red Blood Cell Chemistry

    DTIC Science & Technology

    1996-10-01

    There is a growing need for interactive computational tools for medical education and research. The most exciting...paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at...to a modern Windows, object oriented interface. This development will provide students with a useful computational tool for learning. More important

  14. Theoretical analysis of microwave propagation

    NASA Astrophysics Data System (ADS)

    Parl, S.; Malaga, A.

    1984-04-01

    This report documents a comprehensive investigation of microwave propagation. The structure of line-of-sight multipath is determined and the impact on practical diversity is discussed. A new model of diffraction propagation for multiple rounded obstacles is developed. A troposcatter model valid at microwave frequencies is described. New results for the power impulse response, and the delay spread and Doppler spread are developed. A 2-component model separating large and small scale scatter effects is proposed. The prediction techniques for diffraction and troposcatter have been implemented in a computer program intended as a tool to analyze propagation experiments.

  15. Primary care physicians' perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study.

    PubMed

    Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva

    2015-09-24

    Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We assessed physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical record (EMR) system might allow health information from the medical record to auto-populate the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  16. PIPI: PTM-Invariant Peptide Identification Using Coding Method.

    PubMed

    Yu, Fengchao; Li, Ning; Yu, Weichuan

    2016-12-02

    In computational proteomics, the identification of peptides with an unlimited number of post-translational modification (PTM) types is a challenging task. The computational cost associated with database search increases exponentially with respect to the number of modified amino acids and linearly with respect to the number of potential PTM types at each amino acid. The problem becomes intractable very quickly if we want to enumerate all possible PTM patterns. To address this issue, one group of methods, named restricted tools (including Mascot, Comet, and MS-GF+), only allows a small number of PTM types in the database search process. Alternatively, the other group of methods, named unrestricted tools (including MS-Alignment, ProteinProspector, and MODa), avoids enumerating PTM patterns with an alignment-based approach to localizing and characterizing modified amino acids. However, because of the large search space and the PTM localization issue, the sensitivity of these unrestricted tools is low. This paper proposes a novel method named PIPI to achieve PTM-invariant peptide identification. PIPI belongs to the category of unrestricted tools. It first codes peptide sequences into Boolean vectors and codes experimental spectra into real-valued vectors. For each coded spectrum, it then searches the coded sequence database to find the top scored peptide sequences as candidates. After that, PIPI uses dynamic programming to localize and characterize modified amino acids in each candidate. We used simulation experiments and real data experiments to evaluate the performance in comparison with restricted tools (i.e., Mascot, Comet, and MS-GF+) and unrestricted tools (i.e., Mascot with error tolerant search, MS-Alignment, ProteinProspector, and MODa). Comparison with restricted tools shows that PIPI has comparable sensitivity and running speed. Comparison with unrestricted tools shows that PIPI has the highest sensitivity except for Mascot with error tolerant search and ProteinProspector. These two tools simplify the task by considering at most one modified amino acid per peptide, which yields a higher sensitivity but has difficulty dealing with multiple modified amino acids. The simulation experiments also show that PIPI has the lowest false discovery proportion, the highest PTM characterization accuracy, and the shortest running time among the unrestricted tools.

  17. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  18. Static versus dynamic sampling for data mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John, G.H.; Langley, P.

    1996-12-31

    As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
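
    A minimal sketch of the dynamic-sampling idea: grow the sample geometrically, watch the mining tool's own learning curve, and stop once it flattens. Here train_and_score is a hypothetical stand-in for whichever mining tool is in use:

        def dynamic_sample_size(data, train_and_score, start=1000, eps=0.002):
            """Double the sample until the held-out score stops improving."""
            size, prev = start, None
            while size <= len(data):
                score = train_and_score(data[:size])
                if prev is not None and score - prev < eps:
                    return size, score      # learning curve has flattened
                prev, size = score, size * 2
            return len(data), prev          # needed the whole database after all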

  19. Touchfree medical interfaces.

    PubMed

    Rossol, Nathaniel; Cheng, Irene; Rui Shen; Basu, Anup

    2014-01-01

    Real-time control of visual display systems via mid-air hand gestures offers many advantages over traditional interaction modalities. In medicine, for example, it allows a practitioner to adjust display values, e.g. contrast or zoom, on a medical visualization interface without the need to re-sterilize the interface. However, when users are holding a small tool (such as a pen, surgical needle, or computer stylus) the need to constantly put the tool down in order to make hand gesture interactions is not ideal. This work presents a novel interface that automatically adjusts for gesturing with hands and hand-held tools to precisely control medical displays. The novelty of our interface is that it uses a single set of gestures designed to be equally effective for fingers and hand-held tools without using markers. This type of interface was previously not feasible with low-resolution depth sensors such as Kinect, but is now achieved by using the recently released Leap Motion controller. Our interface is validated through a user study on a group of people given the task of adjusting parameters on a medical image.

  20. Cloud computing for comparative genomics

    PubMed Central

    2010-01-01

    Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems. PMID:20482786

  1. Cloud computing for comparative genomics.

    PubMed

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  2. Shuffle Optimizer: A Program to Optimize DNA Shuffling for Protein Engineering.

    PubMed

    Milligan, John N; Garry, Daniel J

    2017-01-01

    DNA shuffling is a powerful tool to develop libraries of variants for protein engineering. Here, we present a protocol to use our freely available and easy-to-use computer program, Shuffle Optimizer. Shuffle Optimizer is written in the Python computer language and increases the nucleotide homology between two pieces of DNA desired to be shuffled together without changing the amino acid sequence. In addition, we include sections on optimal primer design for DNA shuffling and library construction, a small-volume ultrasonicator method to create sheared DNA, and finally a method to reassemble the sheared fragments and recover and clone the library. The Shuffle Optimizer program and these protocols will be useful to anyone desiring to perform any of the nucleotide homology-dependent shuffling methods.
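
    The program itself should be consulted for the real algorithm; its central move, recoding one sequence with synonymous codons so that it matches the other at more nucleotide positions while leaving the protein unchanged, can be sketched with Biopython's codon table (assumes two aligned, equal-length coding sequences for the same protein, with no stop codons):

        from Bio.Data import CodonTable

        fwd = CodonTable.unambiguous_dna_by_id[1].forward_table  # codon -> amino acid
        synonyms = {}
        for codon, aa in fwd.items():
            synonyms.setdefault(aa, []).append(codon)

        def boost_homology(template: str, target: str) -> str:
            """Recode `target` codon-by-codon with the synonymous codon closest
            to `template`; the encoded protein is unchanged."""
            out = []
            for i in range(0, len(target), 3):
                t, c = template[i:i + 3], target[i:i + 3]
                best = max(synonyms[fwd[c]],
                           key=lambda s: sum(a == b for a, b in zip(s, t)))
                out.append(best)
            return "".join(out)

        # Both encode Met-Leu; recoding lifts identity from 4/6 to 6/6 positions
        print(boost_homology("ATGCTG", "ATGTTA"))  # -> ATGCTG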

  3. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance have enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  4. Self-Directed Cooperative Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Zilberstein, Shlomo; Morris, Robert (Technical Monitor)

    2003-01-01

    The project is concerned with the development of decision-theoretic techniques to optimize the scientific return of planetary rovers. Planetary rovers are small unmanned vehicles equipped with cameras and a variety of sensors used for scientific experiments. They must operate under tight constraints over such resources as operation time, power, storage capacity, and communication bandwidth. Moreover, the limited computational resources of the rover limit the complexity of on-line planning and scheduling. We have developed a comprehensive solution to this problem that involves high-level tools to describe a mission; a compiler that maps a mission description and additional probabilistic models of the components of the rover into a Markov decision problem; and algorithms for solving the rover control problem that are sensitive to the limited computational resources and high-level of uncertainty in this domain.
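
    As a generic illustration of the last step, a finite Markov decision problem of the kind such a compiler would emit can be solved by standard value iteration (toy example; the states, actions, and rewards are hypothetical):

        def value_iteration(n_states, actions, P, R, gamma=0.95, tol=1e-6):
            """P[s][a] = list of (prob, next_state); R[s][a] = immediate reward.
            Returns optimal state values and a greedy policy."""
            V = [0.0] * n_states
            while True:
                delta = 0.0
                for s in range(n_states):
                    best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                               for a in actions)
                    delta, V[s] = max(delta, abs(best - V[s])), best
                if delta < tol:
                    break
            policy = [max(actions, key=lambda a: R[s][a] +
                          gamma * sum(p * V[t] for p, t in P[s][a]))
                      for s in range(n_states)]
            return V, policy

        # Toy rover: state 0 = idle, 1 = at rock; action 0 = drive, 1 = sample
        P = [[[(1.0, 1)], [(1.0, 0)]], [[(1.0, 1)], [(1.0, 0)]]]
        R = [[-1.0, 0.0], [-1.0, 5.0]]
        print(value_iteration(2, [0, 1], P, R))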

  5. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Clauses 252.227-7018 Rights in noncommercial technical data and computer software—Small Business... Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...

  6. Computational tool for simulation of power and refrigeration cycles

    NASA Astrophysics Data System (ADS)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles makes it possible to model the optimal changes for best performance. There is also a boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because heat sources from cogeneration differ widely and each case calls for a custom design. In this work we present the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed in the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch, and finally generates a very instructive report in PDF format via the LaTeX tool.
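
    As a flavor of what such a tool computes, here is a minimal ideal-Rankine-cycle efficiency estimate using the same CoolProp library (Python bindings; the pressures and turbine-inlet temperature are hypothetical):

        from CoolProp.CoolProp import PropsSI  # SI units: Pa, K, J/kg

        p_cond, p_boil, T_in = 10e3, 8e6, 773.15  # 10 kPa, 8 MPa, 500 C

        h1 = PropsSI("H", "P", p_cond, "Q", 0, "Water")      # saturated liquid
        v1 = 1.0 / PropsSI("D", "P", p_cond, "Q", 0, "Water")
        h2 = h1 + v1 * (p_boil - p_cond)                     # ideal pump work
        h3 = PropsSI("H", "P", p_boil, "T", T_in, "Water")   # turbine inlet
        s3 = PropsSI("S", "P", p_boil, "T", T_in, "Water")
        h4 = PropsSI("H", "P", p_cond, "S", s3, "Water")     # isentropic expansion
        eta = ((h3 - h4) - (h2 - h1)) / (h3 - h2)
        print(f"thermal efficiency ~ {eta:.1%}")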

  7. The Small Body Mapping Tool (SBMT) for Accessing, Visualizing, and Analyzing Spacecraft Data in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barnouin, O. S.; Ernst, C. M.; Daly, R. T.

    2018-04-01

    The free, publicly available Small Body Mapping Tool (SBMT) developed at the Johns Hopkins University Applied Physics Laboratory is a powerful, easy-to-use tool for accessing and analyzing data from small bodies.

  8. A versatile localization system for microscopic multiparametric analysis of cells.

    PubMed

    Thaw, H H; Rundquist, I; Johansson, U; Svensson, I; Collins, V P

    1983-03-01

    A new, simple and relatively inexpensive electronic digital position readout (DPRO) system which can be applied to the rapid localization and recovery of microscopic material is described. It is based upon a commercially available digital position readout system which is routinely utilized by industry for small machine tools and measuring equipment. This has been mounted onto the stage of various microscope instruments to provide X and Y coordinates relative to an arbitrary reference point. The integration of small computers interfaced to scanning interferometric, microdensitometric and fluorescence microscopes was used to demonstrate the reliability, versatility and ease of application of this system to problems of multiparametric measurements and analysis of cultured cells. The system may be expanded and applied to clinical material to obtain automated, multiparametric measurements of cells in haematology and clinical cytology.

  9. Prediction of Unsteady Flows in Turbomachinery Using the Linearized Euler Equations on Deforming Grids

    NASA Technical Reports Server (NTRS)

    Clark, William S.; Hall, Kenneth C.

    1994-01-01

    A linearized Euler solver for calculating unsteady flows in turbomachinery blade rows due to both incident gusts and blade motion is presented. The model accounts for blade loading, blade geometry, shock motion, and wake motion. Assuming that the unsteadiness in the flow is small relative to the nonlinear mean solution, the unsteady Euler equations can be linearized about the mean flow. This yields a set of linear variable-coefficient equations that describe the small amplitude harmonic motion of the fluid. These linear equations are then discretized on a computational grid and solved using standard numerical techniques. For transonic flows, however, one must use a linear discretization which is a conservative linearization of the nonlinear discretized Euler equations to ensure that shock impulse loads are accurately captured. Other important features of this analysis include a continuously deforming grid, which eliminates extrapolation errors and hence increases accuracy, and a new numerically exact, nonreflecting far-field boundary condition treatment based on an eigenanalysis of the discretized equations. Computational results are presented which demonstrate the accuracy and efficiency of the method and the effectiveness of the deforming grid, far-field nonreflecting boundary conditions, and shock capturing techniques. A comparison of the present unsteady flow predictions to other numerical, semi-analytical, and experimental methods shows excellent agreement. In addition, the linearized Euler method presented requires one to two orders of magnitude less computational time than traditional time marching techniques, making the present method a viable design tool for aeroelastic analyses.
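
    The linearization step the abstract refers to can be written compactly (notation ours, for illustration). Splitting the flow into a steady nonlinear mean and a small harmonic perturbation,

        U(\mathbf{x},t) = \bar{U}(\mathbf{x}) + \mathrm{Re}\!\left[ u(\mathbf{x})\, e^{i\omega t} \right], \qquad \lVert u \rVert \ll \lVert \bar{U} \rVert,

    substituting into the Euler equations and discarding terms of order u^2 leaves a linear, variable-coefficient system for the complex amplitude u, with the excitation frequency \omega entering as a parameter:

        i\omega\, u + \mathcal{L}(\bar{U})\, u = f,

    where \mathcal{L}(\bar{U}) collects the mean-flow-dependent spatial operators and f is the forcing from incident gusts or prescribed blade motion.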

  10. Membrane Packing Problems: A short Review on computational Membrane Modeling Methods and Tools

    PubMed Central

    Sommer, Björn

    2013-01-01

    The use of model membranes is currently part of the daily workflow for many biochemical and biophysical disciplines. These membranes are used to analyze the behavior of small substances, to simulate transport processes, to study the structure of macromolecules or for illustrative purposes. But, how can these membrane structures be generated? This mini review discusses a number of ways to obtain these structures. First, the problem will be formulated as the Membrane Packing Problem. It will be shown that the theoretical problem of placing proteins and lipids onto a membrane area differ significantly. Thus, two sub-problems will be defined and discussed. Then, different – partly historical – membrane modeling methods will be introduced. And finally, membrane modeling tools will be evaluated which are able to semi-automatically generate these model membranes and thus, drastically accelerate and simplify the membrane generation process. The mini review concludes with advice about which tool is appropriate for which application case. PMID:24688707

  11. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laganà, Alessandro, E-mail: alessandro.lagana@osumc.edu; Shasha, Dennis; Croce, Carlo Maria

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats) was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  12. Advances in computational approaches for prioritizing driver mutations and significantly mutated genes in cancer genomes.

    PubMed

    Cheng, Feixiong; Zhao, Junfei; Zhao, Zhongming

    2016-07-01

    Cancer is often driven by the accumulation of genetic alterations, including single nucleotide variants, small insertions or deletions, gene fusions, copy-number variations, and large chromosomal rearrangements. Recent advances in next-generation sequencing technologies have helped investigators generate massive amounts of cancer genomic data and catalog somatic mutations in both common and rare cancer types. So far, the somatic mutation landscapes and signatures of >10 major cancer types have been reported; however, pinpointing driver mutations and cancer genes from millions of available cancer somatic mutations remains a monumental challenge. To tackle this important task, many methods and computational tools have been developed during the past several years and, thus, a review of these advances is urgently needed. Here, we first summarize the main features of these methods and tools for whole-exome, whole-genome and whole-transcriptome sequencing data. Then, we discuss major challenges like tumor intra-heterogeneity, tumor sample saturation and functionality of synonymous mutations in cancer, all of which may result in false-positive discoveries. Finally, we highlight new directions in studying regulatory roles of noncoding somatic mutations and quantitatively measuring circulating tumor DNA in cancer. This review may help investigators find an appropriate tool for detecting potential driver or actionable mutations in rapidly emerging precision cancer medicine. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  13. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.

  14. Cutting tool form compensation system and method

    DOEpatents

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.
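
    The vision step can be pictured with a small NumPy sketch: threshold the image and, column by column, take the first dark pixel as a sample of the cutting edge. This is illustrative only; the patent does not specify the algorithm at this level, and it assumes a dark tool against a bright background:

        import numpy as np

        def edge_points(image: np.ndarray, thresh: float):
            """(row, col) samples along the edge of a dark tool against a bright
            background: the first below-threshold pixel in each image column."""
            pts = []
            dark = image < thresh
            for col in range(image.shape[1]):
                rows = np.flatnonzero(dark[:, col])
                if rows.size:
                    pts.append((rows[0], col))
            return pts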

  15. Cutting tool form compensation system and method

    DOEpatents

    Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.

    1993-01-01

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.

  16. Computational methods and challenges in hydrogen/deuterium exchange mass spectrometry.

    PubMed

    Claesen, Jürgen; Burzykowski, Tomasz

    2017-09-01

    Hydrogen/Deuterium exchange (HDX) has been applied, since the 1930s, as an analytical tool to study the structure and dynamics of (small) biomolecules. The popularity of using HDX to study proteins increased drastically in the last two decades due to the successful combination with mass spectrometry (MS). Together with this growth in popularity, several technological advances have been made, such as improved quenching and fragmentation. As a consequence of these experimental improvements and the increased use of protein-HDXMS, large amounts of complex data are generated, which require appropriate analysis. Computational analysis of HDXMS requires several steps. A typical workflow for proteins consists of identification of (non-)deuterated peptides or fragments of the protein under study (local analysis), or identification of the deuterated protein as a whole (global analysis); determination of the deuteration level; estimation of the protection extent or exchange rates of the labile backbone amide hydrogen atoms; and a statistically sound interpretation of the estimated protection extent or exchange rates. Several algorithms, specifically designed for HDX analysis, have been proposed. They range from procedures that focus on one specific step in the analysis of HDX data to complete HDX workflow analysis tools. In this review, we provide an overview of the computational methods and discuss outstanding challenges. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 36:649-667, 2017.
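
    As an illustration of the rate-estimation step, the sketch below fits a single-exponential uptake curve to hypothetical peptide-level deuteration data; real HDX-MS analyses typically fit sums of exponentials per amide and correct for back-exchange.

```python
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, n_amides, k):
    """Single-exponential deuterium uptake: n_amides * (1 - exp(-k*t))."""
    return n_amides * (1.0 - np.exp(-k * t))

# Hypothetical deuteration levels (Da) at several labeling times (s)
t = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])
d = np.array([1.8, 3.9, 6.5, 7.8, 8.1])

(n_fit, k_fit), _ = curve_fit(uptake, t, d, p0=(8.0, 0.01))
print(f"exchangeable amides ~ {n_fit:.1f}, rate ~ {k_fit:.3g} 1/s")
```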

  17. Integrating a Narrative Medicine Telephone Interview with Online Life Review Education for Cancer Patients: Lessons Learned and Future Directions

    PubMed Central

    Wise, Meg; Marchand, Lucille; Cleary, James F.; Aeschlimann, Elizabeth; Causier, Daniel

    2012-01-01

    We describe an online narrative and life review education program for cancer patients and the results of a small implementation test to inform future directions for further program development and full-scale evaluation research. The intervention combined three types of psycho-oncology narrative interventions that have been shown to help patients address emotional and existential issues: 1) a physician-led dignity-enhancing telephone interview to elicit the life narrative and delivery of an edited life manuscript, 2) life review education, delivered via 3) a website with self-directed instructional materials and expert consultation to help people revise and share their story. Eleven cancer patients tested the intervention and provided feedback in an in-depth exit interview. While everyone said telling and receiving the edited story manuscript was helpful and meaningful, only people with high death salience and prior computer experience used the web tools to enhance and share their story. Computer users prodded us to provide more sophisticated tools, and older (>70 years) users needed more staff and family support. We conclude that combining a telephone expert-led interview with online life review education can extend access to integrative oncology services, is most feasible for computer-savvy patients with advanced cancer, and must use platforms that allow patients to upload files and invite their social network. PMID:19476731

  18. RegulonDB version 9.0: high-level integration of gene regulation, coexpression, motif clustering and beyond.

    PubMed

    Gama-Castro, Socorro; Salgado, Heladia; Santos-Zavaleta, Alberto; Ledezma-Tejeida, Daniela; Muñiz-Rascado, Luis; García-Sotelo, Jair Santiago; Alquicira-Hernández, Kevin; Martínez-Flores, Irma; Pannier, Lucia; Castro-Mondragón, Jaime Abraham; Medina-Rivera, Alejandra; Solano-Lira, Hilda; Bonavides-Martínez, César; Pérez-Rueda, Ernesto; Alquicira-Hernández, Shirley; Porrón-Sotelo, Liliana; López-Fuentes, Alejandra; Hernández-Koutoucheva, Anastasia; Del Moral-Chávez, Víctor; Rinaldi, Fabio; Collado-Vides, Julio

    2016-01-04

    RegulonDB (http://regulondb.ccg.unam.mx) is one of the most useful and important resources on bacterial gene regulation, as it integrates the scattered scientific knowledge of the best-characterized organism, Escherichia coli K-12, in a database that organizes large amounts of data. Its electronic format enables researchers to compare their results with the legacy of previous knowledge and supports bioinformatics tools and model building. Here, we summarize our progress with RegulonDB since our last Nucleic Acids Research publication describing RegulonDB, in 2013. In addition to keeping curation up to date, we report a collection of 232 interactions with small RNAs affecting 192 genes, and the complete repertoire of 189 Elementary Genetic Sensory-Response units (GENSOR units), integrating the signal, regulatory interactions, and metabolic pathways they govern. These additions represent major progress to a higher level of understanding of regulated processes. We have updated the computationally predicted transcription factors, which total 304 (184 with experimental evidence and 120 from computational predictions); we updated our position-weight matrices and have included tools for clustering them in evolutionary families. We describe our semiautomatic strategy to accelerate curation, including datasets from high-throughput experiments, a novel coexpression distance to search for 'neighborhood' genes close to known operons and regulons, and computational developments. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are archivable, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  20. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, mascons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to run on any engineer's desktop computer.
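
    The core advantage described above, one covariance propagation replacing thousands of Monte Carlo runs, can be shown with a toy two-state example. The dynamics and noise values below are invented for illustration; G-CAT's actual 6-DOF, 120-state formulation is far richer.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # position/velocity transition
Q = np.diag([1e-4, 1e-6])               # process noise covariance
P = np.diag([1.0, 0.01])                # initial state covariance

# Single-pass covariance propagation: P' = F P F^T + Q
for _ in range(100):
    P = F @ P @ F.T + Q
print("1-sigma position error:", np.sqrt(P[0, 0]))

# Monte Carlo cross-check (the expensive route covariance analysis avoids)
rng = np.random.default_rng(0)
x = rng.multivariate_normal([0, 0], np.diag([1.0, 0.01]), size=20000)
for _ in range(100):
    x = x @ F.T + rng.multivariate_normal([0, 0], Q, size=20000)
print("Monte Carlo estimate:  ", x[:, 0].std())
```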

  1. Analyzing huge pathology images with open source software

    PubMed Central

    2013-01-01

    Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open source software enables dealing with huge images with standard software on average computers. The tools are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272 PMID:23829479
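
    The mosaic strategy, processing a huge slide in bounded-memory tiles, can be sketched with the openslide-python package, an alternative open-source reader that also handles NDPI files (this is not the NDPITools code itself, and the file name is hypothetical).

```python
import numpy as np
import openslide

slide = openslide.OpenSlide("sample.ndpi")   # hypothetical virtual slide
level = 0
width, height = slide.level_dimensions[level]
tile = 2048
total = 0.0

# Visit the slide tile by tile so memory use stays bounded.
for y in range(0, height, tile):
    for x in range(0, width, tile):
        w, h = min(tile, width - x), min(tile, height - y)
        region = np.asarray(slide.read_region((x, y), level, (w, h)))
        total += region[..., :3].mean() * (w * h)   # toy per-tile statistic
print("mean RGB intensity:", total / (width * height))
```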

  2. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The ability to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to determine whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.

  3. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The ability to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to determine whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315

  4. G-LoSA for Prediction of Protein-Ligand Binding Sites and Structures.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2017-01-01

    Recent advances in high-throughput structure determination and computational protein structure prediction have significantly enriched the universe of protein structure. However, there is still a large gap between the number of available protein structures and the number of proteins whose function is annotated with high accuracy. Computational structure-based protein function prediction has emerged to reduce this knowledge gap. The identification of a ligand binding site and its structure is critical to the determination of a protein's molecular function. We present a computational methodology for predicting small-molecule ligand binding sites and ligand structure using G-LoSA, our protein local structure alignment and similarity measurement tool. All the computational procedures described here can be easily implemented using G-LoSA Toolkit, a package of standalone software programs and preprocessed PDB structure libraries. G-LoSA and G-LoSA Toolkit are freely available to academic users at http://compbio.lehigh.edu/GLoSA . We also illustrate a case study to show the potential of our template-based approach harnessing G-LoSA for protein function prediction.

  5. QUADrATiC: scalable gene expression connectivity mapping for repurposing FDA-approved therapeutics.

    PubMed

    O'Reilly, Paul G; Wen, Qing; Bankhead, Peter; Dunne, Philip D; McArt, Darragh G; McPherson, Suzanne; Hamilton, Peter W; Mills, Ken I; Zhang, Shu-Dong

    2016-05-04

    Gene expression connectivity mapping has proven to be a powerful and flexible tool for research. Its application has been shown in a broad range of research topics, most commonly as a means of identifying potential small molecule compounds, which may be further investigated as candidates for repurposing to treat diseases. The public release of voluminous data from the Library of Integrated Cellular Signatures (LINCS) programme further enhanced the utility and potential of gene expression connectivity mapping in biomedicine. We describe QUADrATiC ( http://go.qub.ac.uk/QUADrATiC ), a user-friendly tool for the exploration of gene expression connectivity on the subset of the LINCS data set corresponding to FDA-approved small molecule compounds. It enables the identification of compounds with therapeutic repurposing potential. The software is designed to cope with the increased volume of data over existing tools by taking advantage of multicore computing architectures to provide a scalable solution, which may be installed and operated on a range of computers, from laptops to servers. This scalability is provided by the use of the modern concurrent programming paradigm provided by the Akka framework. The QUADrATiC Graphical User Interface (GUI) has been developed using advanced Javascript frameworks, providing novel visualization capabilities for further analysis of connections. There is also a web services interface, allowing integration with other programs or scripts. QUADrATiC has been shown to provide an improvement over existing connectivity map software, in terms of scope (based on the LINCS data set), applicability (using FDA-approved compounds), usability and speed. It offers biological researchers the potential to analyze transcriptional data and generate potential therapeutics for focussed study in the lab. QUADrATiC represents a step change in the process of investigating gene expression connectivity and provides more biologically relevant results than previous alternative solutions.
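
    A toy version of a connectivity score conveys the idea: compare where a query signature's up- and down-regulated genes fall in a compound's ranked expression profile. This is a simplified stand-in, not the statistic QUADrATiC actually implements.

```python
import numpy as np

def connection_score(ranked_genes, up_query, down_query):
    """Positive when the compound's profile mimics the query signature,
    negative when it opposes it (a repurposing candidate).
    `ranked_genes` runs from most up- to most down-regulated."""
    pos = {g: i / (len(ranked_genes) - 1) for i, g in enumerate(ranked_genes)}
    up = np.mean([pos[g] for g in up_query if g in pos])      # 0 = top of list
    down = np.mean([pos[g] for g in down_query if g in pos])
    return down - up

profile = [f"GENE{i}" for i in range(100)]
print(connection_score(profile,
                       up_query=["GENE95", "GENE90"],    # found near bottom
                       down_query=["GENE2", "GENE5"]))   # found near top -> reversal
```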

  6. An Accurate and Computationally Efficient Model for Membrane-Type Circular-Symmetric Micro-Hotplates

    PubMed Central

    Khan, Usman; Falconi, Christian

    2014-01-01

    Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation tools for the optimization. In order to circumvent these issues, here we propose a model for practical circular-symmetric micro-hotplates which takes advantage of modified Bessel functions, a computationally efficient matrix approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and an external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated with the undesired heating in the electrical contacts, are small (e.g., a few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214
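
    The modified-Bessel-function core of such a model can be illustrated with the homogeneous radial solution for the excess temperature in an annular membrane, theta(r) = A*I0(r/L) + B*K0(r/L), with the two constants fixed by boundary temperatures. All numbers below are illustrative, not the paper's parameters.

```python
import numpy as np
from scipy.special import i0, k0

L = 2e-4                        # thermal healing length (m), assumed
r1, r2 = 1e-4, 5e-4             # heater edge and membrane rim radii (m)
theta1, theta2 = 780.0, 0.0     # excess temperatures at r1 and the rim (K)

# Solve A*I0(r/L) + B*K0(r/L) = theta at the two boundaries.
M = np.array([[i0(r1 / L), k0(r1 / L)],
              [i0(r2 / L), k0(r2 / L)]])
A, B = np.linalg.solve(M, [theta1, theta2])

r = np.linspace(r1, r2, 5)
print(A * i0(r / L) + B * k0(r / L))   # radial excess-temperature profile
```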

  7. Tools for Administration of a UNIX-Based Network

    NASA Technical Reports Server (NTRS)

    LeClaire, Stephen; Farrar, Edward

    2004-01-01

    Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.

  8. Computer-automated silica aerosol generator and animal inhalation exposure system

    PubMed Central

    McKinney, Walter; Chen, Bean; Schwegler-Berry, Diane; Frazer, Dave G.

    2015-01-01

    Inhalation exposure systems are necessary tools for determining the dose response relationship of inhaled toxicants under a variety of exposure conditions. The objective of this study was to develop an automated computer controlled system to expose small laboratory animals to precise concentrations of uniformly dispersed airborne silica particles. An acoustical aerosol generator was developed which was capable of re-suspending particles from bulk powder. The aerosolized silica output from the generator was introduced into the throat of a venturi tube. The turbulent high-velocity air stream within the venturi tube increased the dispersion of the re-suspended powder. That aerosol was then used to expose small laboratory animals to constant aerosol concentrations, up to 20 mg/m3, for durations lasting up to 8 h. Particle distribution and morphology of the silica aerosol delivered to the exposure chamber were characterized to verify that a fully dispersed and respirable aerosol was being produced. The inhalation exposure system utilized a combination of airflow controllers, particle monitors, data acquisition devices and custom software with automatic feedback control to achieve constant and repeatable exposure environments. The automatic control algorithm was capable of maintaining median aerosol concentrations to within ±0.2 mg/m3 of a user selected target concentration during exposures lasting from 2 to 8 h. The system was able to reach 95% of the desired target value in <10 min during the beginning phase of an exposure. This exposure system provided a highly automated tool for conducting inhalation toxicology studies involving silica particles. PMID:23796015
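
    The flavor of such a feedback algorithm can be conveyed with a minimal proportional-integral loop; the gains and the one-line "chamber response" are invented stand-ins for the real generator dynamics.

```python
target = 10.0            # desired concentration (mg/m^3)
kp, ki = 0.4, 0.05       # assumed controller gains
integral = output = conc = 0.0

for step in range(200):                      # one pass per sampling interval
    conc += 0.3 * (output - conc)            # toy generator/chamber response
    error = target - conc                    # monitor reading vs. target
    integral += error
    output = kp * error + ki * integral      # drive signal to the generator

print(f"settled concentration: {conc:.2f} mg/m^3")
```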

  9. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  10. A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example

    ERIC Educational Resources Information Center

    Elnagar, Ashraf; Lulu, Leena

    2007-01-01

    We introduce an effective computer-aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or implement practical systems in different areas of computer science such as graphics, computational…

  11. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    ERIC Educational Resources Information Center

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  12. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefits of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.

  13. Machine Learning to Discover and Optimize Materials

    NASA Astrophysics Data System (ADS)

    Rosenbrock, Conrad Waldhar

    For centuries, scientists have dreamed of creating materials by design. Rather than discovery by accident, bespoke materials could be tailored to fulfill specific technological needs. Quantum theory and computational methods are essentially equal to the task, and computational power is the new bottleneck. Machine learning has the potential to solve that problem by approximating material behavior at multiple length scales. A full end-to-end solution must allow us to approximate the quantum mechanics, microstructure and engineering tasks well enough to be predictive in the real world. In this dissertation, I present algorithms and methodology to address some of these problems at various length scales. In the realm of enumeration, systems with many degrees of freedom, such as high-entropy alloys, may contain prohibitively many unique possibilities, so that enumerating all of them would exhaust available compute memory. One possible way to address this problem is to know in advance how many possibilities there are, so that the user can reduce their search space by restricting the occupation of certain lattice sites. Although tools to calculate this number were available, none performed well for very large systems and none could easily be integrated into low-level languages for use in existing scientific codes. I present an algorithm to solve these problems; a sketch of the underlying counting idea appears below. Testing the robustness of machine-learned models is an essential component in any materials discovery or optimization application. While it is customary to perform a small number of system-specific tests to validate an approach, this may be insufficient in many cases. In particular, for Cluster Expansion (CE) models, the expansion may not converge quickly enough to be useful and reliable. Although the method has been used for decades, a rigorous investigation across many systems to determine when CE "breaks" was still lacking. This dissertation includes this investigation, along with heuristics that use only a small training database to predict whether a model is worth pursuing in detail. To be useful, computational materials discovery must lead to experimental validation. However, experiments are difficult due to sample purity, environmental effects and a host of other considerations. In many cases, it is difficult to connect theory to experiment because computation is deterministic. By combining advanced group theory with machine learning, we created a new tool that bridges the gap between experiment and theory so that experimental and computed phase diagrams can be harmonized. Grain boundaries in real materials control many important material properties such as corrosion, thermal conductivity, and creep. Because of their high dimensionality, learning the underlying physics needed to optimize grain boundaries is extremely complex. By leveraging a mathematically rigorous representation of local atomic environments, machine learning becomes a powerful tool for approximating properties of grain boundaries. But it also goes beyond predicting properties by highlighting those atomic environments that are most important in influencing the boundary properties. This provides an immense dimensionality reduction that empowers grain boundary scientists to know where to look for deeper physical insights.
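
    The counting problem mentioned above is classically handled with Burnside's lemma; the toy sketch below counts symmetry-distinct decorations of a ring of lattice sites under rotation, a one-dimensional stand-in for the full space-group enumeration.

```python
from math import gcd

def distinct_colorings_ring(n_sites, n_species):
    """Burnside's lemma for a cyclic group: average, over all rotations,
    of the number of colorings each rotation leaves fixed."""
    total = sum(n_species ** gcd(n_sites, shift) for shift in range(n_sites))
    return total // n_sites

# 8 sites and 3 atomic species: 834 distinct configurations instead of
# the naive 3**8 = 6561.
print(distinct_colorings_ring(8, 3))
```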

  14. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to small craft marinas in California has become an important concern. The talk will outline an assessment tool that can be used to evaluate the tsunami hazard to small craft harbors. The methodology is based on the demand on, and structural capacity of, the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology. Monte Carlo methodology is a probabilistic computational tool for problems where the governing equations may be well known, but the independent variables of the input (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach assigns a distribution to each variable, draws a random value from each within the described parameters, and generates a single computation. The process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporate the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once calibrated, the model was able to hindcast the damage produced in Santa Cruz Harbor during the 2010 Chile and 2011 Japan events. Results of the Santa Cruz analysis will be presented and discussed.
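
    A stripped-down version of the demand-versus-capacity Monte Carlo loop might look like the following; the distributions and the drag coefficient are purely illustrative, not values from the inspection program.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Demand: drag on a mooring component, 0.5 * rho * (Cd*A) * u^2, with the
# current speed u drawn from an assumed lognormal distribution.
u = rng.lognormal(mean=0.5, sigma=0.4, size=n)          # current speed (m/s)
demand = 0.5 * 1025.0 * 2.0 * u**2                      # force (N), Cd*A = 2

# Capacity: nominal strength degraded by an age/condition factor.
capacity = rng.normal(12_000.0, 2_500.0, size=n)        # as-new strength (N)
capacity *= rng.uniform(0.6, 1.0, size=n)               # condition reduction

print("estimated failure probability:", np.mean(demand > capacity))
```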

  15. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  16. Quantitative 3-D Imaging, Segmentation and Feature Extraction of the Respiratory System in Small Mammals for Computational Biophysics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trease, Lynn L.; Trease, Harold E.; Fowler, John

    2007-03-15

    One of the critical steps toward performing computational biology simulations, using mesh-based integration methods, is using topologically faithful geometry derived from experimental digital image data as the basis for generating the computational meshes. Digital image data representations contain both the topology of the geometric features and experimental field data distributions. The geometric features that need to be captured from the digital image data are three-dimensional; therefore, the process and tools we have developed work with volumetric image data represented as data-cubes. This allows us to take advantage of 2D curvature information during the segmentation and feature extraction process. The process is basically: 1) segmenting to isolate and enhance the contrast of the features that we wish to extract and reconstruct, 2) extracting the geometry of the features with an isosurfacing technique, and 3) building the computational mesh using the extracted feature geometry. "Quantitative" image reconstruction and feature extraction is done for the purpose of generating computational meshes, not just for producing graphics "screen" quality images. For example, the surface geometry that we extract must represent a closed, water-tight surface.
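
    Step (2), extracting a closed isosurface from a segmented data cube, is commonly done with marching cubes; the sketch below uses scikit-image on a synthetic sphere standing in for a segmented anatomical feature.

```python
import numpy as np
from skimage import measure

# Synthetic 64^3 data cube: distance from the center, so the 0.5 level
# set is a closed sphere (a stand-in for a segmented airway surface).
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = np.sqrt(x**2 + y**2 + z**2)

verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(verts.shape, faces.shape)   # vertex and triangle arrays for meshing
```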

  17. GPU based framework for geospatial analyses

    NASA Astrophysics Data System (ADS)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in geosciences has increased in the last 10 years, mostly due to the increase in available datasets. These datasets are becoming more and more detailed, and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets in small parts, one by one. This way of computing speeds up the process, because instead of using just one process for each dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometry analyses at the multiscale level. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have countless other sources. These uncertainties play an important role when a spatial delineation of the phenomena is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia Cuda technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
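
    As a sketch of the data-parallel style described, the following computes a slope raster from a digital elevation model with CuPy, a Python GPU array library (a stand-in used here for illustration; the actual framework is CUDA/C++, and the data are random).

```python
import cupy as cp

# Hypothetical digital elevation model held in GPU memory.
dem = cp.random.random((8192, 8192)).astype(cp.float32) * 100.0  # heights (m)
cell = 30.0                                                      # cell size (m)

# Thousands of GPU cores evaluate the finite differences in parallel.
dzdy, dzdx = cp.gradient(dem, cell)
slope_deg = cp.degrees(cp.arctan(cp.hypot(dzdx, dzdy)))
print(float(slope_deg.mean()))
```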

  18. Methods of photoelectrode characterization with high spatial and temporal resolution

    DOE PAGES

    Esposito, Daniel V.; Baxter, Jason B.; John, Jimmy; ...

    2015-06-19

    Here, materials and photoelectrode architectures that are highly efficient, extremely stable, and made from low cost materials are required for commercially viable photoelectrochemical (PEC) water-splitting technology. A key challenge is the heterogeneous nature of real-world materials, which often possess spatial variation in their crystal structure, morphology, and/or composition at the nano-, micro-, or macro-scale. Different structures and compositions can have vastly different properties and can therefore strongly influence the overall performance of the photoelectrode through complex structure–property relationships. A complete understanding of photoelectrode materials would also involve elucidation of processes such as carrier collection and electrochemical charge transfer that occur at very fast time scales. We present herein an overview of a broad suite of experimental and computational tools that can be used to define the structure–property relationships of photoelectrode materials at small dimensions and on fast time scales. A major focus is on in situ scanning-probe measurement (SPM) techniques that possess the ability to measure differences in optical, electronic, catalytic, and physical properties with nano- or micro-scale spatial resolution. In situ ultrafast spectroscopic techniques, used to probe carrier dynamics involved with processes such as carrier generation, recombination, and interfacial charge transport, are also discussed. Complementing all of these experimental techniques are computational atomistic modeling tools, which can be invaluable for interpreting experimental results, aiding in materials discovery, and interrogating PEC processes at length and time scales not currently accessible by experiment. In addition to reviewing the basic capabilities of these experimental and computational techniques, we highlight key opportunities and limitations of applying these tools for the development of PEC materials.

  19. Electromagnetic Simulations for Aerospace Application Final Report CRADA No. TC-0376-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madsen, N.; Meredith, S.

    Electromagnetic (EM) simulation tools play an important role in the design cycle, allowing optimization of a design before it is fabricated for testing. The purpose of this cooperative project was to provide Lockheed with state-of-the-art electromagnetic (EM) simulation software that will enable the optimal design of the next generation of low-observable (LO) military aircraft through the VHF regime. More particularly, the project was principally code development and validation, its goal to produce a 3-D, conforming-grid, time-domain (TD) EM simulation tool, consisting of a mesh generator, a DSI3D-based simulation kernel, and an RCS postprocessor, which was useful in the optimization of LO aircraft, both for full-aircraft simulations run on a massively parallel computer and for small-scale problems run on a UNIX workstation.
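
    For flavor, here is a generic one-dimensional finite-difference time-domain (FDTD) update in normalized units, illustrating the kind of time-domain kernel such a tool is built around; it is unrelated to the actual DSI3D implementation.

```python
import numpy as np

nx, nt = 400, 800
ez = np.zeros(nx)        # electric field samples
hy = np.zeros(nx - 1)    # magnetic field on the staggered half-grid

for t in range(nt):
    hy += np.diff(ez)                                 # dH/dt ~ curl E
    ez[1:-1] += np.diff(hy)                           # dE/dt ~ curl H
    ez[nx // 4] += np.exp(-((t - 60.0) / 20.0) ** 2)  # Gaussian pulse source

print("peak |Ez| after propagation:", np.abs(ez).max())
```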

  20. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.
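
    One evidence-construction step that is simple to show is fingerprinting acquired media so later analysis can be proven not to have altered it; the file name below is hypothetical.

```python
import hashlib

def fingerprint(path, chunk=1 << 20):
    """SHA-256 digest of an evidence file, hashed in 1 MiB chunks so
    large disk images never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

print(fingerprint("disk_image.dd"))   # record alongside chain-of-custody notes
```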

  1. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    PubMed

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is through the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages: design, manufacturing, distribution and supply chain, use phase, and end-of-life, as well as life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject matter. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  2. Shaping Small Bioactive Molecules to Untangle Their Biological Function: A Focus on Fluorescent Plant Hormones.

    PubMed

    Lace, Beatrice; Prandi, Cristina

    2016-08-01

    Modern biology overlaps with chemistry in explaining the structure and function of all cellular processes at the molecular level. Plant hormone research is perfectly located at the interface between these two disciplines, taking advantage of synthetic and computational chemistry as tools to decipher the complex biological mechanisms regulating the action of plant hormones. These small signaling molecules regulate a wide range of developmental processes, adapting plant growth to ever-changing environmental conditions. The synthesis of small bioactive molecules mimicking the activity of endogenous hormones allows us to unveil many molecular features of their functioning, giving rise to a new field, plant chemical biology. In this framework, fluorescence labeling of plant hormones is emerging as a successful strategy to track the fate of these challenging molecules inside living organisms. Thanks to the increasing availability of new fluorescent probes as well as advanced and innovative imaging technologies, we are now in a position to investigate many of the dynamic mechanisms through which plant hormones exert their action. Such a deep and detailed comprehension is mandatory for the development of new green technologies for practical applications. In this review, we summarize the results obtained so far concerning the fluorescent labeling of plant hormones, highlighting the basic steps leading to the design and synthesis of these compelling molecular tools and their applications. Copyright © 2016 The Author. Published by Elsevier Inc. All rights reserved.

  3. Irena: tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
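
    Of the analyses listed, the Guinier fit is the simplest to sketch: at low q, ln I(q) = ln I0 - (Rg^2/3) q^2, so a straight-line fit of ln I against q^2 yields the radius of gyration. The data below are synthetic; Irena itself runs inside Igor Pro, not Python.

```python
import numpy as np

rg_true = 5.0                                   # nm, for the synthetic data
q = np.linspace(0.01, 0.2, 50)                  # 1/nm, within q*Rg <~ 1.3
intensity = 100.0 * np.exp(-(q * rg_true) ** 2 / 3.0)

# Linear fit of ln I versus q^2: slope = -Rg^2/3, intercept = ln I0
slope, ln_i0 = np.polyfit(q**2, np.log(intensity), 1)
print("Rg =", np.sqrt(-3.0 * slope), "nm;  I0 =", np.exp(ln_i0))
```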

  4. Application of X-ray computed microtomography to soil craters formed by raindrop splash

    NASA Astrophysics Data System (ADS)

    Beczek, Michał; Ryżak, Magdalena; Lamorski, Krzysztof; Sochan, Agata; Mazur, Rafał; Bieganowski, Andrzej

    2018-02-01

    The creation of craters on the soil surface is part of splash erosion. Due to the small size of these craters, they are difficult to study. The main aim of this paper was to test X-ray computed microtomography as a means of investigating craters formed by raindrop impacts. Measurements were made on soil samples moistened to three different levels, corresponding to soil water potentials of 0.1, 3.16 and 16 kPa. Using images obtained by X-ray microtomography, geometric parameters of the craters were recorded and analysed. X-ray computed microtomography proved to be a useful and efficient tool for the investigation of craters formed on the soil surface after the impact of water drops. The parameters of the craters changed with the energy of the water drops and were dependent on the initial moisture content of the soil. Crater depth depended more strongly on the energy of the water drop than crater diameter did.

  5. Protein Structure Determination by Assembling Super-Secondary Structure Motifs Using Pseudocontact Shifts.

    PubMed

    Pilla, Kala Bharath; Otting, Gottfried; Huber, Thomas

    2017-03-07

    Computational and nuclear magnetic resonance hybrid approaches provide efficient tools for 3D structure determination of small proteins, but currently available algorithms struggle to perform with larger proteins. Here we demonstrate a new computational algorithm that assembles the 3D structure of a protein from its constituent super-secondary structural motifs (Smotifs) with the help of pseudocontact shift (PCS) restraints for backbone amide protons, where the PCSs are produced from different metal centers. The algorithm, DINGO-PCS (3D assembly of Individual Smotifs to Near-native Geometry as Orchestrated by PCSs), employs the PCSs to recognize, orient, and assemble the constituent Smotifs of the target protein without any other experimental data or computational force fields. Using a universal Smotif database, the DINGO-PCS algorithm exhaustively enumerates any given Smotif. We benchmarked the program against ten different protein targets ranging from 100 to 220 residues with different topologies. For nine of these targets, the method was able to identify near-native Smotifs. Copyright © 2017 Elsevier Ltd. All rights reserved.
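
    The PCS restraint itself has a closed form; the sketch below evaluates the standard expression Δδ = [Δχ_ax(3cos²θ − 1) + (3/2)Δχ_rh sin²θ cos 2φ] / (12πr³) for a Δχ tensor assumed to be aligned with the laboratory axes. The tensor magnitudes are typical lanthanide-tag values chosen only for illustration.

```python
import numpy as np

def pcs_ppm(pos, metal, dchi_ax, dchi_rh):
    """Pseudocontact shift (ppm) of a nucleus at `pos` (m) for a metal at
    `metal`, with axial/rhombic susceptibility anisotropies in m^3."""
    v = np.asarray(pos) - np.asarray(metal)
    r = np.linalg.norm(v)
    cos_t = v[2] / r                      # polar angle in the tensor frame
    phi = np.arctan2(v[1], v[0])
    geom = (dchi_ax * (3 * cos_t**2 - 1)
            + 1.5 * dchi_rh * (1 - cos_t**2) * np.cos(2 * phi))
    return 1e6 * geom / (12 * np.pi * r**3)

# Amide proton about 1 nm from the metal, along a skew direction.
print(pcs_ppm([0.6e-9, 0.3e-9, 0.74e-9], [0.0, 0.0, 0.0],
              dchi_ax=30e-32, dchi_rh=5e-32))
```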

  6. CREW CHIEF: A computer graphics simulation of an aircraft maintenance technician

    NASA Technical Reports Server (NTRS)

    Aume, Nilss M.

    1990-01-01

    Approximately 35 percent of the lifetime cost of a military system is spent on maintenance. Excessive repair time is caused by not considering maintenance during design. Problems are usually discovered only after a mock-up has been constructed, when it is too late to make changes. CREW CHIEF will reduce the incidence of such problems by catching design defects in the early design stages. CREW CHIEF is a computer-graphics human factors evaluation system interfaced to commercial computer-aided design (CAD) systems. It creates a three-dimensional man model, either male or female, large or small, with various types of clothing and in several postures. It can perform analyses for physical accessibility, strength capability with tools, visual access, and strength capability for manual materials handling. The designer would produce a drawing on a CAD system and introduce CREW CHIEF into it. CREW CHIEF's analyses would then indicate places where problems could be foreseen and corrected before the design is frozen.

  7. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  8. Effect of high-frequency spectral components in computer recognition of dysarthric speech based on a Mel-cepstral stochastic model.

    PubMed

    Polur, Prasad D; Miller, Gerald E

    2005-01-01

    Computer speech recognition of individuals with dysarthria, such as cerebral palsy patients, requires a robust technique that can handle conditions of very high variability and limited training data. In this study, a hidden Markov model (HMM) was constructed and conditions were investigated that would provide improved performance for a dysarthric speech (isolated word) recognition system intended to act as an assistive/control tool. In particular, we investigated the effect of high-frequency spectral components on the recognition rate of the system, to determine whether they contributed useful additional information. A small vocabulary spoken by three cerebral palsy subjects was chosen. Mel-frequency cepstral coefficients extracted using 15 ms frames served as training input to an ergodic HMM setup. Subsequent results demonstrated that, in the current set of dysarthric data, no significant useful information was available above 5.5 kHz for enhancing the system's ability to discriminate dysarthric speech. The level of variability in input dysarthric speech patterns limits the reliability of the system. However, its application as a rehabilitation/control tool to assist dysarthric motor-impaired individuals such as cerebral palsy subjects holds sufficient promise.
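
    The front end described, MFCCs over 15 ms frames, can be reproduced with the librosa package (a modern stand-in chosen for illustration; the study predates it, and the file name is hypothetical).

```python
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)   # hypothetical recording
frame = int(0.015 * sr)                           # 15 ms analysis window

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=12,
                            n_fft=frame, hop_length=frame)
print(mfcc.shape)   # (coefficients, frames): one observation per HMM step
```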

  9. Foreign Object Damage Identification in Turbine Engines

    NASA Technical Reports Server (NTRS)

    Strack, William; Zhang, Desheng; Turso, James; Pavlik, William; Lopez, Isaac

    2005-01-01

    This report summarizes the collective work of a five-person team from different organizations examining the problem of detecting foreign object damage (FOD) events in turbofan engines from gas path thermodynamic and bearing accelerometer sensors, and determining the severity of damage to each component (diagnosis). Several detection and diagnostic approaches were investigated and a software tool (FODID) was developed to assist researchers in detecting and diagnosing FOD events. These approaches include (1) fan efficiency deviation computed from upstream and downstream temperature/pressure measurements, (2) gas path weighted least squares estimation of component health parameter deficiencies, (3) Kalman filter estimation of component health parameters, and (4) use of structural vibration signal processing to detect both large and small FOD events. The last three of these approaches require a significant amount of computation in conjunction with a physics-based analytic model of the underlying phenomenon: the NPSS thermodynamic cycle code for approaches 1 to 3 and the DyRoBeS reduced-order rotor dynamics code for approach 4. A potential application of the FODID software tool, in addition to its detection/diagnosis role, is using its sensitivity results to help identify the best types of sensors and their optimum locations within the gas path, and similarly for bearing accelerometers.
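
    Approach 1 above rests on a standard relation: for a fan or compressor, isentropic efficiency can be computed from the measured pressure and temperature ratios, and a drop relative to the healthy baseline signals possible damage. The sketch below illustrates that computation in Python; the function name, threshold, and sample values are illustrative assumptions, not values from the FODID tool.

        # Isentropic fan efficiency from upstream/downstream measurements (illustrative).
        def fan_isentropic_efficiency(p_ratio, t_ratio, gamma=1.4):
            """eta = (PR^((gamma-1)/gamma) - 1) / (TR - 1) for a compression process."""
            return (p_ratio ** ((gamma - 1.0) / gamma) - 1.0) / (t_ratio - 1.0)

        # Hypothetical measurements: healthy baseline vs. post-event reading.
        eta_baseline = fan_isentropic_efficiency(p_ratio=1.60, t_ratio=1.155)
        eta_current = fan_isentropic_efficiency(p_ratio=1.55, t_ratio=1.152)

        deviation = eta_baseline - eta_current
        if deviation > 0.01:          # assumed efficiency-drop threshold
            print(f"possible FOD event: efficiency down {deviation:.3f}")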

  10. miRNAtools: Advanced Training Using the miRNA Web of Knowledge.

    PubMed

    Stępień, Ewa Ł; Costa, Marina C; Enguita, Francisco J

    2018-02-16

    Micro-RNAs (miRNAs) are small non-coding RNAs that act as negative regulators of the genomic output. Their intrinsic importance within cell biology and human disease is well known. Their mechanism of action based on the base pairing binding to their cognate targets have helped the development not only of many computer applications for the prediction of miRNA target recognition but also of specific applications for functional assessment and analysis. Learning about miRNA function requires practical training in the use of specific computer and web-based applications that are complementary to wet-lab studies. In order to guide the learning process about miRNAs, we have created miRNAtools (http://mirnatools.eu), a web repository of miRNA tools and tutorials. This article compiles tools with which miRNAs and their regulatory action can be analyzed and that function to collect and organize information dispersed on the web. The miRNAtools website contains a collection of tutorials that can be used by students and tutors engaged in advanced training courses. The tutorials engage in analyses of the functions of selected miRNAs, starting with their nomenclature and genomic localization and finishing with their involvement in specific cellular functions.

  11. iBIOMES Lite: Summarizing Biomolecular Simulation Data in Limited Settings

    PubMed Central

    2015-01-01

    As the amount of data generated by biomolecular simulations dramatically increases, new tools need to be developed to help manage this data at the individual investigator or small research group level. In this paper, we introduce iBIOMES Lite, a lightweight tool for biomolecular simulation data indexing and summarization. The main goal of iBIOMES Lite is to provide a simple interface to summarize computational experiments in a setting where the user might have limited privileges and limited access to IT resources. A command-line interface allows the user to summarize, publish, and search local simulation data sets. Published data sets are accessible via static hypertext markup language (HTML) pages that summarize the simulation protocols and also display data analysis graphically. The publication process is customized via extensible markup language (XML) descriptors while the HTML summary template is customized through extensible stylesheet language (XSL). iBIOMES Lite was tested on different platforms and at several national computing centers using various data sets generated through classical and quantum molecular dynamics, quantum chemistry, and QM/MM. The associated parsers currently support AMBER, GROMACS, Gaussian, and NWChem data set publication. The code is available at https://github.com/jcvthibault/ibiomes. PMID:24830957

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagberg, Aric; Swart, Pieter; S Chult, Daniel

    NetworkX is a Python language package for exploration and analysis of networks and network algorithms. The core package provides data structures for representing many types of networks, or graphs, including simple graphs, directed graphs, and graphs with parallel edges and self loops. The nodes in NetworkX graphs can be any (hashable) Python object and edges can contain arbitrary data; this flexibility makes NetworkX ideal for representing networks found in many different scientific fields. In addition to the basic data structures, many graph algorithms are implemented for calculating network properties and structure measures: shortest paths, betweenness centrality, clustering, degree distribution, and many more. NetworkX can read and write various graph formats for easy exchange with existing data, and generators for many classic graphs and popular graph models, such as the Erdős–Rényi, small-world, and Barabási–Albert models, are included. The ease of use and flexibility of the Python programming language, together with its connection to the SciPy tools, make NetworkX a powerful tool for scientific computations. We discuss some of our recent work studying synchronization of coupled oscillators to demonstrate how NetworkX enables research in the field of computational networks.
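
    As a concrete illustration of the package described above, the short Python session below builds an Erdős–Rényi random graph and computes two of the properties mentioned (shortest paths and betweenness centrality). The graph size, probability, and seed are arbitrary choices for the example.

        # Small NetworkX session illustrating the features described above.
        import networkx as nx

        # One of the classic generators mentioned: an Erdos-Renyi random graph
        # with 100 nodes and edge probability 0.05.
        G = nx.erdos_renyi_graph(100, 0.05, seed=42)

        # Nodes can be any hashable object and edges can carry arbitrary data.
        G.add_node("sensor-7")
        G.add_edge("sensor-7", 0, weight=2.5)

        # Network properties and structure measures.
        if nx.has_path(G, 0, 50):
            print("shortest path 0 -> 50:", nx.shortest_path(G, 0, 50))
        bc = nx.betweenness_centrality(G)
        print("max betweenness:", max(bc, key=bc.get))
        print("degree of node 0:", G.degree[0])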

  13. NASA Tech Briefs, August 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics include: Hidden Identification on Parts: Magnetic Machine-Readable Matrix Symbols; System for Processing Coded OFDM Under Doppler and Fading; Multipurpose Hyperspectral Imaging System; Magnetic-Flux-Compensated Voltage Divider; High-Performance Satellite/Terrestrial-Network Gateway; Internet-Based System for Voice Communication With the ISS; Stripline/Microstrip Transition in Multilayer Circuit Board; Dual-Band Feed for a Microwave Reflector Antenna; Quadratic Programming for Allocating Control Effort; Range Process Simulation Tool; Simulator of Space Communication Networks; Computing Q-D Relationships for Storage of Rocket Fuels; Contour Error Map Algorithm; Portfolio Analysis Tool; Glass Frit Filters for Collecting Metal Oxide Nanoparticles; Anhydrous Proton-Conducting Membranes for Fuel Cells; Portable Electron-Beam Free-Form Fabrication System; Miniature Laboratory for Detecting Sparse Biomolecules; Multicompartment Liquid-Cooling/Warming Protective Garments; Laser Metrology for an Optical-Path-Length Modulator; PCM Passive Cooling System Containing Active Subsystems; Automated Electrostatics Environmental Chamber; Estimating Aeroheating of a 3D Body Using a 2D Flow Solver; Artificial Immune System for Recognizing Patterns; Computing the Thermodynamic State of a Cryogenic Fluid; Safety and Mission Assurance Performance Metric; Magnetic Control of Concentration Gradient in Microgravity; Avionics for a Small Robotic Inspection Spacecraft; and Simulation of Dynamics of a Flexible Miniature Airplane.

  14. Students' Use of Electronic Support Tools in Mathematics

    ERIC Educational Resources Information Center

    Crawford, Lindy; Higgins, Kristina N.; Huscroft-D'Angelo, Jacqueline N.; Hall, Lindsay

    2016-01-01

    This study investigated students' use of electronic support tools within a computer-based mathematics program. Electronic support tools are tools, such as hyperlinks or calculators, available within many computer-based instructional programs. A convenience sample of 73 students in grades 4-6 was selected to participate in the study. Students…

  15. tRF2Cancer: A web server to detect tRNA-derived small RNA fragments (tRFs) and their expression in multiple cancers.

    PubMed

    Zheng, Ling-Ling; Xu, Wei-Lin; Liu, Shun; Sun, Wen-Ju; Li, Jun-Hao; Wu, Jie; Yang, Jian-Hua; Qu, Liang-Hu

    2016-07-08

    tRNA-derived small RNA fragments (tRFs) are one class of small non-coding RNAs derived from transfer RNAs (tRNAs). tRFs play important roles in cellular processes and are involved in multiple cancers. High-throughput small RNA (sRNA) sequencing experiments can detect all the cellular expressed sRNAs, including tRFs. However, distinguishing genuine tRFs from RNA fragments generated by random degradation remains a major challenge. In this study, we developed an integrated web-based computing system, tRF2Cancer, to accurately identify tRFs from sRNA deep-sequencing data and evaluate their expression in multiple cancers. The binomial test was introduced to evaluate whether reads from a small RNA-seq data set represent tRFs or degraded fragments. A classification method was then used to annotate the types of tRFs based on their sites of origin in pre-tRNA or mature tRNA. We applied the pipeline to analyze 10 991 data sets from 32 types of cancers and identified thousands of expressed tRFs. A tool called 'tRFinCancer' was developed to facilitate the users to inspect the expression of tRFs across different types of cancers. Another tool called 'tRFBrowser' shows both the sites of origin and the distribution of chemical modification sites in tRFs on their source tRNA. The tRF2Cancer web server is available at http://rna.sysu.edu.cn/tRFfinder/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
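
    The core statistical idea, testing whether reads pile up at one site more strongly than random degradation would allow, can be sketched with a simple binomial test. The sketch below uses SciPy and invented read counts; the server's actual statistic and thresholds may differ.

        # Sketch of the binomial idea behind tRF calling (illustrative, not tRF2Cancer's code).
        from scipy.stats import binomtest

        trna_length = 75        # length of the source tRNA (hypothetical)
        window = 22             # length of the candidate tRF
        reads_total = 500       # reads mapped anywhere on this tRNA
        reads_in_window = 410   # reads whose 5' ends fall in the candidate window

        # Under uniform random degradation, a read lands in the window with
        # probability roughly window / trna_length.
        p_uniform = window / trna_length
        result = binomtest(reads_in_window, reads_total, p_uniform, alternative="greater")
        print(f"p-value = {result.pvalue:.3e}")
        if result.pvalue < 1e-3:
            print("reads are concentrated: candidate looks like a genuine tRF")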

  16. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  17. An Exploration of the Role of Visual Programming Tools in the Development of Young Children's Computational Thinking

    ERIC Educational Resources Information Center

    Rose, Simon P.; Habgood, M. P. Jacob; Jay, Tim

    2017-01-01

    Programming tools are being used in education to teach computer science to children as young as 5 years old. This research aims to explore young children's approaches to programming in two tools with contrasting programming interfaces, ScratchJr and Lightbot, and considers the impact of programming approaches on developing computational thinking.…

  18. A Decade-Long European-Scale Convection-Resolving Climate Simulation on GPUs

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.

    2016-12-01

    Convection-resolving models have proven to be very useful tools in numerical weather prediction and in climate research. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in the supercomputing domain have led to new supercomputer designs that involve conventional multi-core CPUs and accelerators such as graphics processing units (GPUs). One of the first atmospheric models that has been fully ported to GPUs is the Consortium for Small-Scale Modeling weather and climate model COSMO. This new version allows us to expand the size of the simulation domain to areas spanning continents and the time period up to one decade. We present results from a decade-long, convection-resolving climate simulation over Europe using the GPU-enabled COSMO version on a computational domain with 1536x1536x60 gridpoints. The simulation is driven by the ERA-Interim reanalysis. The results illustrate how the approach allows for the representation of interactions between synoptic-scale and meso-scale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss some of the advantages and prospects of using GPUs, and focus on the performance of the convection-resolving modeling approach on the European scale. Specifically, we investigate the organization of convective clouds and validate hourly rainfall distributions against various high-resolution data sets.

  19. Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?

    ERIC Educational Resources Information Center

    Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine

    2014-01-01

    Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…

  20. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  1. High Fidelity Modeling of Turbulent Mixing and Chemical Kinetics Interactions in a Post-Detonation Flow Field

    NASA Astrophysics Data System (ADS)

    Sinha, Neeraj; Zambon, Andrea; Ott, James; Demagistris, Michael

    2015-06-01

    Driven by continuing rapid advances in high-performance computing, multi-dimensional high-fidelity modeling is an increasingly reliable predictive tool capable of providing valuable physical insight into complex post-detonation reacting flow fields. Utilizing a series of test cases featuring blast waves interacting with combustible dispersed clouds in a small-scale test setup under well-controlled conditions, the predictive capabilities of a state-of-the-art code are demonstrated and validated. Leveraging physics-based, first-principles models and solving large systems of equations on highly resolved grids, the combined effects of finite-rate/multi-phase chemical processes (including thermal ignition), turbulent mixing, and shock interactions are captured across the spectrum of relevant time scales and length scales. Since many scales of motion are generated in a post-detonation environment, even if the initial ambient conditions are quiescent, turbulent mixing plays a major role in the fireball afterburning as well as in dispersion, mixing, ignition, and burn-out of combustible clouds in its vicinity. Validating these capabilities at the small scale is critical to establishing a reliable predictive tool applicable to more complex and large-scale geometries of practical interest.

  2. First-Principles Design of Novel Catalytic and Chemoresponsive Materials

    NASA Astrophysics Data System (ADS)

    Roling, Luke T.

    An emerging trend in materials design is the use of computational chemistry tools to accelerate materials discovery and implementation. In particular, the parallel nature of computational models enables high-throughput screening approaches that would be laborious and time-consuming with experiments alone, and can be useful for identifying promising candidate materials for experimental synthesis and evaluation. Additionally, atomic-scale modeling allows researchers to obtain a detailed understanding of phenomena invisible to many current experimental techniques. In this thesis, we highlight mechanistic studies and successes in catalyst design for heterogeneous electrochemical reactions, discussing both anode and cathode chemistries. In particular, we evaluate the properties of a new class of Pd-Pt core-shell and hollow nanocatalysts toward the oxygen reduction reaction. We do not limit our study to electrochemical reactivity, but also consider these catalysts in a broader context by performing in-depth studies of their stability at elevated temperatures as well as investigating the mechanisms by which they are able to form. We also present fundamental surface science studies, investigating graphene formation and H2 dissociation, which are processes of both fundamental and practical interest in many catalytic applications. Finally, we extend our materials design paradigm outside the field of catalysis to develop and apply a model for the detection of small chemical analytes by chemoresponsive liquid crystals, and offer several predictions for improving the detection of small chemicals. A close connection between computation, synthesis, and experimental evaluation is essential to the work described herein, as computations are used to gain fundamental insight into experimental observations, and experiments and synthesis are in turn used to validate predictions of material activities from computational models.

  3. Predicting "Hot" and "Warm" Spots for Fragment Binding.

    PubMed

    Rathi, Prakash Chandra; Ludlow, R Frederick; Hall, Richard J; Murray, Christopher W; Mortenson, Paul N; Verdonk, Marcel L

    2017-05-11

    Computational fragment mapping methods aim to predict hotspots on protein surfaces where small fragments will bind. Such methods are popular for druggability assessment as well as structure-based design. However, to date researchers developing or using such tools have had no clear way of assessing their performance. Here, we introduce the first diverse, high-quality validation set for computational fragment mapping. The set contains 52 diverse examples of fragment binding "hot" and "warm" spots from the Protein Data Bank (PDB). Additionally, we describe PLImap, a novel protocol for fragment mapping based on the Protein-Ligand Informatics force field (PLIff). We evaluate PLImap against the new fragment mapping test set, and compare its performance to that of simple shape-based algorithms and fragment docking using GOLD. PLImap is made publicly available from https://bitbucket.org/AstexUK/pli.

  4. GASP- General Aviation Synthesis Program. Volume 1: Main program. Part 1: Theoretical development

    NASA Technical Reports Server (NTRS)

    Hague, D.

    1978-01-01

    The General Aviation Synthesis Program (GASP) performs tasks generally associated with aircraft preliminary design and allows an analyst the capability of performing parametric studies in a rapid manner. GASP emphasizes small fixed-wing aircraft employing propulsion systems varying from a single piston engine with fixed-pitch propeller through twin turboprop/turbofan-powered business or transport type aircraft. The program, which may be operated from a computer terminal in either the batch or interactive graphic mode, is comprised of modules representing the various technical disciplines integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedure. The model is a useful tool for comparing configurations, assessing aircraft performance and economics, performing tradeoff and sensitivity studies, and assessing the impact of advanced technologies on aircraft performance and economics.

  5. Imaging of pharmacokinetic rates of indocyanine green in mouse liver with a hybrid fluorescence molecular tomography/x-ray computed tomography system.

    PubMed

    Zhang, Guanglei; Liu, Fei; Zhang, Bin; He, Yun; Luo, Jianwen; Bai, Jing

    2013-04-01

    Pharmacokinetic rates have the potential to provide quantitative physiological and pathological information for biological studies and drug development. Fluorescence molecular tomography (FMT) is an attractive imaging tool for three-dimensionally resolving fluorophore distribution in small animals. In this letter, pharmacokinetic rates of indocyanine green (ICG) in mouse liver are imaged with a hybrid FMT and x-ray computed tomography (XCT) system. A recently developed FMT method using structural priors from an XCT system is adopted to improve the quality of FMT reconstruction. In the in vivo experiments, images of uptake and excretion rates of ICG in mouse liver are obtained, which can be used to quantitatively evaluate liver function. The accuracy of the results is validated by a fiber-based fluorescence measurement system.
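
    As background on what imaging pharmacokinetic rates means computationally: once FMT yields an ICG concentration time course in each voxel or region, uptake and excretion rates are typically obtained by fitting a compartment model. The sketch below fits a common bi-exponential one-compartment model with SciPy; the model form and all numbers are generic assumptions, not the authors' protocol.

        # Fitting uptake/excretion rates to a fluorescence time course (generic sketch).
        import numpy as np
        from scipy.optimize import curve_fit

        def one_compartment(t, amp, k_in, k_out):
            """Bi-exponential model: first-order uptake (k_in) and excretion (k_out)."""
            return amp * (np.exp(-k_out * t) - np.exp(-k_in * t))

        # Synthetic "measured" ICG curve standing in for reconstructed FMT values.
        t = np.linspace(0, 30, 61)                       # minutes
        true = one_compartment(t, 1.0, 0.8, 0.1)
        rng = np.random.default_rng(0)
        measured = true + rng.normal(0, 0.02, t.size)

        params, _ = curve_fit(one_compartment, t, measured, p0=(1.0, 0.5, 0.05))
        amp, k_in, k_out = params
        print(f"uptake rate ~ {k_in:.3f} 1/min, excretion rate ~ {k_out:.3f} 1/min")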

  6. Space shuttle rendezvous, radiation and reentry analysis code

    NASA Technical Reports Server (NTRS)

    Mcglathery, D. M.

    1973-01-01

    A preliminary space shuttle mission design and analysis tool is reported, emphasizing versatility, flexibility, and user interaction through the use of a relatively small computer (IBM-7044). The Space Shuttle Rendezvous, Radiation and Reentry Analysis Code is used to perform mission and space radiation environmental analyses for four typical space shuttle missions. Included also is a version of the proposed Apollo/Soyuz rendezvous and docking test mission. Tangential-steering circle-to-circle low-thrust tug orbit raising and the effects of the trapped radiation environment on trajectory shaping due to solar electric power losses are also features of this mission analysis code. The computational results include a parametric study on single-impulse versus double-impulse deorbiting for relatively low space shuttle orbits as well as some definitive data on the magnetically trapped protons and electrons encountered on a particular mission.
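
    To make the single- versus double-impulse comparison concrete, the vis-viva equation gives the retro burn needed to drop the perigee of a circular orbit to a reentry interface. The sketch below computes the single-impulse case in Python; the altitudes are illustrative assumptions, and the 1973 code's actual methods are not reproduced here, only the textbook relation.

        # Single-impulse deorbit delta-v from a circular orbit (textbook sketch).
        import math

        MU = 398600.4418          # Earth GM, km^3/s^2
        R_EARTH = 6378.137        # km

        def deorbit_dv(alt_orbit_km, alt_entry_km=120.0):
            """Retro burn lowering perigee from a circular orbit to the entry interface."""
            r0 = R_EARTH + alt_orbit_km
            rp = R_EARTH + alt_entry_km
            a_transfer = 0.5 * (r0 + rp)                      # transfer-ellipse semi-major axis
            v_circ = math.sqrt(MU / r0)                       # circular speed
            v_apo = math.sqrt(MU * (2.0 / r0 - 1.0 / a_transfer))  # vis-viva at apogee
            return v_circ - v_apo

        for alt in (200.0, 300.0, 400.0):                     # typical low orbits, km
            print(f"{alt:.0f} km orbit: deorbit dv = {deorbit_dv(alt)*1000:.1f} m/s")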

  7. A Summary of NASA Research Exploring the Acoustics of Small Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Zawodny, Nikolas S.; Christian, Andrew; Cabell, Randolph

    2018-01-01

    Proposed uses of small unmanned aerial systems (sUAS) have the potential to expose large portions of communities to a new noise source. In order to understand the potential noise impact of sUAS, NASA initiated acoustics research as one component of the 3-year DELIVER project, with the goal of documenting the feasibility of using existing aircraft design tools and methods on this class of vehicles. This paper summarizes the acoustics research conducted within the DELIVER project. The research described here represents an initial study, and subsequent research building on the findings of this work has been proposed for other NASA projects. The paper summarizes acoustics research in four areas: measurements of noise generated by flyovers of small unmanned aerial vehicles, measurements in controlled test facilities to understand the noise generated by components of these vehicles, computational predictions of component and full vehicle noise, and psychoacoustic tests including auralizations conducted to assess human annoyance to the noise generated by these vehicles.

  8. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  9. An Interactive Computer Tool for Teaching About Desalination and Managing Water Demand in the US

    NASA Astrophysics Data System (ADS)

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

    This paper presents an interactive tool to geospatially and temporally analyze desalination developments and trends in the US over the time span 1950-2013, their current contribution to satisfying water demands, and their future potential. The computer tool is open access and can be used by any user with an Internet connection, thus facilitating interactive learning about water resources. The tool can also be used by stakeholders and policy makers for decision-making support and in designing sustainable water management strategies. Desalination technology has been acknowledged as a solution for sustainable water demand management across many sectors, including municipalities, industry, agriculture, power generation, and other users. Desalination has been applied successfully in the US and many countries around the world since the 1950s. As of 2013, around 1,336 desalination plants were operating in the US alone, with a daily production capacity of 2 BGD (billion gallons per day) (GWI, 2013). Despite a steady increase in the number of new desalination plants and growing production capacity, in many regions the costs of desalination are still prohibitive. At the same time, the technology offers a tremendous potential for 'enormous supply expansion that exceeds all likely demands' (Chowdhury et al., 2013). The model and tool are based on data from Global Water Intelligence (GWI, 2013). The analysis shows that more than 90% of all the plants in the US are small-scale plants with capacity below 4.31 MGD. Most of the plants (and especially the larger plants) are located on the US East Coast, as well as in California, Texas, Oklahoma, and Florida. The models and the tool provide information about the economic feasibility of potential new desalination plants based on access to feed water, energy sources, water demand, and the experiences of other plants in the region.

  10. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large user base through the advantages of openness and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.

  11. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  12. Business grants

    NASA Astrophysics Data System (ADS)

    Twelve small businesses who are developing equipment and computer programs for geophysics have won Small Business Innovative Research (SBIR) grants from the National Science Foundation for their 1989 proposals. The SBIR program was set up to encourage the private sector to undertake costly, advanced experimental work that has potential for great benefit.The geophysical research projects are a long-path intracavity laser spectrometer for measuring atmospheric trace gases, optimizing a local weather forecast model, a new platform for high-altitude atmospheric science, an advanced density logging tool, a deep-Earth sampling system, superconducting seismometers, a phased-array Doppler current profiler, monitoring mesoscale surface features of the ocean through automated analysis, krypton-81 dating in polar ice samples, discrete stochastic modeling of thunderstorm winds, a layered soil-synthetic liner base system to isolate buildings from earthquakes, and a low-cost continuous on-line organic-content monitor for water-quality determination.

  13. Spectral Factorization and Homogenization Methods for Modeling and Control of Flexible Structures.

    DTIC Science & Technology

    1986-12-15

    to the computation of hybrid, state-space modeling of an integrated space platform. Throughout this effort we have focused on the potential for ... models can provide an effective tool for analysis of dynamics of vibrations and their effect on small angle motions for complex space platforms. [The remainder of this record is unrecoverable OCR residue; the only legible fragments mention miscellaneous routines and a power series expansion.]

  14. The application of simulation modeling to the cost and performance ranking of solar thermal power plants

    NASA Technical Reports Server (NTRS)

    Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.

    1981-01-01

    Small solar thermal power systems (up to 10 MWe in size) were tested. The solar thermal power plant ranking study was performed to aid in experiment activity and support decisions for the selection of the most appropriate technological approach. The cost and performance were determined for insolation conditions by utilizing the Solar Energy Simulation computer code (SESII). This model optimizes the size of the collector field and energy storage subsystem for given engine generator and energy transport characteristics. The development of the simulation tool, its operation, and the results achieved from the analysis are discussed.

  15. Modeling Tools Predict Flow in Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "Because rocket engines operate under extreme temperature and pressure, they present a unique challenge to designers who must test and simulate the technology. To this end, CRAFT Tech Inc., of Pipersville, Pennsylvania, won Small Business Innovation Research (SBIR) contracts from Marshall Space Flight Center to develop software to simulate cryogenic fluid flows and related phenomena. CRAFT Tech enhanced its CRUNCH CFD (computational fluid dynamics) software to simulate phenomena in various liquid propulsion components and systems. Today, both government and industry clients in the aerospace, utilities, and petrochemical industries use the software for analyzing existing systems as well as designing new ones."

  16. Simulation and evaluation of the Sh-2F helicopter in a shipboard environment using the interchangeable cab system

    NASA Technical Reports Server (NTRS)

    Paulk, C. H., Jr.; Astill, D. L.; Donley, S. T.

    1983-01-01

    The operation of the SH-2F helicopter from the decks of small ships in adverse weather was simulated using a large amplitude vertical motion simulator, a wide angle computer generated imagery visual system, and an interchangeable cab (ICAB). The simulation facility, the mathematical programs, and the validation method used to ensure simulation fidelity are described. The results show the simulator to be a useful tool in simulating the ship-landing problem. Characteristics of the ICAB system and ways in which the simulation can be improved are presented.

  17. CFD Analysis of Turbo Expander for Cryogenic Refrigeration and Liquefaction Cycles

    NASA Astrophysics Data System (ADS)

    Verma, Rahul; Sam, Ashish Alex; Ghosh, Parthasarathi

    Computational Fluid Dynamics (CFD) analysis has emerged as a necessary tool for the design of turbomachinery. It helps to understand the various sources of inefficiency through investigation of the flow physics of the turbine. In this paper, 3D turbulent flow analysis of a cryogenic turboexpander for small-scale air separation was performed using Ansys CFX®. The turboexpander was designed following assumptions based on the meanline blade generation procedure provided in the open literature and good engineering judgement. Through analysis of the flow field, modifications and further analyses required to evolve a more robust design procedure have been suggested.

  18. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  19. Evaluation of Internet Social Networks using Net scoring Tool: A Case Study in Adverse Drug Reaction Mining.

    PubMed

    Katsahian, Sandrine; Simond Moreau, Erica; Leprovost, Damien; Lardon, Jeremy; Bousquet, Cedric; Kerdelhué, Gaétan; Abdellaoui, Redhouane; Texier, Nathalie; Burgun, Anita; Boussadi, Abdelali; Faviez, Carole

    2015-01-01

    Suspected adverse drug reactions (ADRs) reported by patients through social media can be a complementary tool to existing ADR signal detection processes. However, several studies have shown that the quality of medical information published online varies drastically, whatever the health topic addressed. The aim of this study is to apply an existing rating tool to a set of social network web sites in order to assess its capability to guide experts in selecting the social network web site best suited to mining ADRs. First, we reviewed and rated 132 Internet forums and social networks according to three major criteria: the number of visits, the notoriety of the forum, and the number of messages posted in relation to health and drug therapy. Second, a pharmacist reviewed the topic-oriented message boards against a small list of drug names to ensure that they were not off topic. Six experts were chosen to assess the selected Internet forums using a French scoring tool, the Net Scoring. Three different scores, and the agreement between experts for each set of scores, were computed using weighted kappa pooled by its mean. Three Internet forums were retained at the end of the selection step. Some criteria received high scores (3-4) regardless of the website evaluated, such as accessibility (criteria 45-46) and design (34-36); conversely, some criteria consistently received poor scores, such as quantitative aspects (40-42), ethical aspects (43-44), and hyperlink updating (30-33). Kappa values were positive but very small, corresponding to weak agreement between experts. The personal opinion of each expert seems to have a major impact, undermining the relevance of the criteria. Our future work is to collect the results given by this evaluation grid and propose a new scoring tool for assessing Internet social networks.
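
    The agreement computation described, pairwise weighted kappa pooled by its mean across expert pairs, can be sketched as follows. The sketch assumes scikit-learn and invented ratings; it shows the pooling idea, not the study's data.

        # Pairwise weighted kappa pooled by mean (illustrative; invented ratings).
        from itertools import combinations
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        # Rows: six experts; columns: criteria scored on an ordinal 0-4 scale.
        ratings = np.array([
            [4, 3, 0, 1, 4, 2, 3, 0],
            [4, 2, 1, 1, 3, 2, 3, 0],
            [3, 3, 0, 0, 4, 1, 2, 1],
            [4, 3, 1, 1, 4, 2, 3, 0],
            [2, 1, 0, 2, 3, 1, 1, 1],
            [4, 3, 0, 1, 4, 2, 3, 0],
        ])

        kappas = [cohen_kappa_score(ratings[i], ratings[j], weights="linear")
                  for i, j in combinations(range(ratings.shape[0]), 2)]
        print(f"pooled weighted kappa (mean over pairs): {np.mean(kappas):.2f}")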

  20. Small-Body Extensions for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Stodden, David; Coggi, John

    2008-01-01

    An extension to the SOAP software allows users to work with tri-axial ellipsoid-based representations of planetary bodies, primarily for working with small natural satellites, asteroids, and comets. SOAP is a widely used tool for the visualization and analysis of space missions. The small-body extension provides the same visualization and analysis constructs for use with small bodies. These constructs allow the user to characterize satellite path and instrument coverage information for small bodies in both 3D display and numerical output formats. Tri-axial ellipsoids are geometric shapes the diameters of which are different in each of three principal x, y, and z dimensions. This construct provides a better approximation than using spheres or oblate spheroids (ellipsoids comprising two common equatorial diameters and a distinct polar diameter). However, the tri-axial ellipsoid is considerably more difficult to work with from a modeling perspective. In addition, the SOAP small-body extensions allow the user to employ a plate model for highly irregular surfaces. Both tri-axial ellipsoids and plate models can be assigned to coordinate frames, thus allowing for the modeling of arbitrary changes to body orientation. A variety of features have been extended to support tri-axial ellipsoids, including the computation and display of the spacecraft sub-orbital point, ground trace, instrument footprints, and swathes. Displays of 3D instrument volumes can be shown interacting with the ellipsoids. Longitude/latitude grids, contour plots, and texture maps can be displayed on the ellipsoids using a variety of projections. The distance along an arbitrary line of sight can be computed between the spacecraft and the ellipsoid, and the coordinates of that intersection can be plotted as a function of time. The small-body extension supports the same visual and analytical constructs that are supported for spheres and oblate spheroids in SOAP, making the implementation of the more complex algorithms largely transparent to the user.
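
    The line-of-sight distance mentioned at the end reduces to a ray/tri-axial-ellipsoid intersection, which becomes a small quadratic once the problem is scaled so the ellipsoid is a unit sphere. The sketch below shows that computation in Python with NumPy; the axes and vectors are illustrative, and SOAP's internal algorithm is not described in this record.

        # Line-of-sight intersection with a tri-axial ellipsoid (geometry sketch).
        import numpy as np

        def los_distance(origin, direction, semi_axes):
            """Distance from `origin` along unit vector `direction` to an ellipsoid
            x^2/a^2 + y^2/b^2 + z^2/c^2 = 1 centered at the body-frame origin.
            Returns None if the ray misses the body."""
            o = np.asarray(origin) / semi_axes       # scale so ellipsoid -> unit sphere
            d = np.asarray(direction) / semi_axes
            a = d @ d
            b = 2.0 * (o @ d)
            c = o @ o - 1.0
            disc = b * b - 4.0 * a * c
            if disc < 0.0:
                return None                          # ray misses the ellipsoid
            t = (-b - np.sqrt(disc)) / (2.0 * a)     # nearest intersection
            return t if t > 0.0 else None            # body must be ahead of the spacecraft

        axes = np.array([300.0, 200.0, 150.0])       # km; hypothetical small body
        sc = np.array([1000.0, 50.0, -20.0])         # spacecraft position, km
        boresight = np.array([-1.0, 0.0, 0.0])       # unit line-of-sight vector
        print(los_distance(sc, boresight, axes))     # distance to the surface, km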

  1. A novel algorithm for determining contact area between a respirator and a headform.

    PubMed

    Lei, Zhipeng; Yang, James; Zhuang, Ziqing

    2014-01-01

    The contact area, as well as the contact pressure, is created when a respiratory protection device (a respirator or surgical mask) contacts a human face. A computer-based algorithm for determining the contact area between a headform and N95 filtering facepiece respirator (FFR) was proposed. Six N95 FFRs were applied to five sizes of standard headforms (large, medium, small, long/narrow, and short/wide) to simulate respirator donning. After the contact simulation between a headform and an N95 FFR was conducted, a contact area was determined by extracting the intersection surfaces of the headform and the N95 FFR. Using computer-aided design tools, a superimposed contact area and an average contact area, which are non-uniform rational basis spline (NURBS) surfaces, were developed for each headform. Experiments that directly measured dimensions of the contact areas between headform prototypes and N95 FFRs were used to validate the simulation results. Headform sizes influenced all contact area dimensions (P < 0.0001), and N95 FFR sizing systems influenced all contact area dimensions (P < 0.05) except the left and right chin regions. The medium headform produced the largest contact area, while the large and small headforms produced the smallest.

  2. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144

  3. Comparison of Monte Carlo simulated and measured performance parameters of miniPET scanner

    NASA Astrophysics Data System (ADS)

    Kis, S. A.; Emri, M.; Opposits, G.; Bükki, T.; Valastyán, I.; Hegyesi, Gy.; Imrek, J.; Kalinka, G.; Molnár, J.; Novák, D.; Végh, J.; Kerek, A.; Trón, L.; Balkay, L.

    2007-02-01

    In vivo imaging of small laboratory animals is a valuable tool in the development of new drugs. For this purpose, miniPET, an easy to scale modular small animal PET camera has been developed at our institutes. The system has four modules, which makes it possible to rotate the whole detector system around the axis of the field of view. Data collection and image reconstruction are performed using a data acquisition (DAQ) module with Ethernet communication facility and a computer cluster of commercial PCs. Performance tests were carried out to determine system parameters, such as energy resolution, sensitivity and noise equivalent count rate. A modified GEANT4-based GATE Monte Carlo software package was used to simulate PET data analogous to those of the performance measurements. GATE was run on a Linux cluster of 10 processors (64 bit, Xeon with 3.0 GHz) and controlled by a SUN grid engine. The application of this special computer cluster reduced the time necessary for the simulations by an order of magnitude. The simulated energy spectra, maximum rate of true coincidences and sensitivity of the camera were in good agreement with the measured parameters.

  4. Salary Management System for Small and Medium-sized Enterprises

    NASA Astrophysics Data System (ADS)

    Hao, Zhang; Guangli, Xu; Yuhuan, Zhang; Yilong, Lei

    In small and medium-sized enterprises (SMEs), wage entry, calculation, and totaling were done manually in the past; the data volume is quite large, processing speed is low, and errors are easy to make, resulting in low efficiency. The main purpose of this paper is to present the basis of a salary management system: establish a scientific database and a computerized payroll system, using the computer to replace much of the former manual work in order to reduce duplicated staff labor and improve working efficiency. The system addresses the actual needs of SMEs through in-depth study and practice of the client/server (C/S) model, the PowerBuilder 10.0 development tool, databases, and the SQL language, completing the needs analysis, database design, and application design and development of a payroll system. Wage, department, unit, and personnel database files are included in the system, which offers data management, department management, personnel management, and other functions; query, add, delete, and modify operations are realized through control and management of the database. The system has a sound design and relatively complete functionality, and testing has shown stable operation that meets the basic needs of payroll work.
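
    To illustrate the kind of database operations the abstract lists (query, add, delete, modify), the sketch below uses Python's built-in sqlite3 module rather than the PowerBuilder/SQL stack described, and an invented schema; it shows the pattern, not the authors' system.

        # Payroll-style CRUD sketch with an invented schema (Python sqlite3,
        # standing in for the PowerBuilder/SQL stack described above).
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE payroll (
            emp_id INTEGER PRIMARY KEY,
            name TEXT, dept TEXT,
            base REAL, bonus REAL, deductions REAL)""")

        # Add (what used to be manual wage entry).
        con.executemany("INSERT INTO payroll VALUES (?,?,?,?,?,?)",
                        [(1, "Li", "Sales", 5000, 600, 450),
                         (2, "Wang", "IT", 7000, 800, 700)])

        # Modify and delete.
        con.execute("UPDATE payroll SET bonus = 900 WHERE emp_id = 2")
        con.execute("DELETE FROM payroll WHERE emp_id = 1")

        # Query: net pay per department, computed instead of tallied by hand.
        for dept, net in con.execute(
                "SELECT dept, SUM(base + bonus - deductions) FROM payroll GROUP BY dept"):
            print(dept, net)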

  5. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  6. National Energy Audit Tool for Multifamily Buildings Development Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malhotra, Mini; MacDonald, Michael; Accawi, Gina K

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional development in the future is expected to be needed if more capabilities are to be added. A rough schedule for development of the version 1 tool is presented. The components and capabilities described in this plan will serve as the starting point for development of the proposed new multifamily energy audit tool for WAP.

  7. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…

  8. EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning

    ERIC Educational Resources Information Center

    Kitchakarn, Orachorn

    2015-01-01

    The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regards, some variables which might be potential antecedents of attitudes toward computer including gender, experience of using computers and perceived abilities in using programs were examined.…

  9. Evaluating effectiveness of small group information literacy instruction for Undergraduate Medical Education students using a pre- and post-survey study design.

    PubMed

    McClurg, Caitlin; Powelson, Susan; Lang, Eddy; Aghajafari, Fariba; Edworthy, Steven

    2015-06-01

    The Undergraduate Medical Education (UME) programme at the University of Calgary is a three-year programme with a strong emphasis on small group learning. The purpose of our study was to determine whether librarian-led small group information literacy instruction, closely integrated with course content and faculty participation but without a hands-on component, was an effective means to convey EBM literacy skills. Five 15-minute EBM information literacy sessions were delivered by three librarians to 12 small groups of 15 students, each led by a practicing physician. Students were asked to complete an online survey before and after the sessions. Data analysis was performed through simple descriptive statistics. A total of 144 of 160 students responded to the pre-survey, and 112 students answered the post-survey. Instruction in a small group environment without a mandatory hands-on component had a positive impact on students' evidence-based information literacy skills. Students were more likely to consult a librarian and had increased confidence in their abilities to search and find relevant information. Our study demonstrates that student engagement and faculty involvement are effective tools for delivering information literacy skills when working with students in a small group setting outside of a computer classroom. © 2015 Health Libraries Group.

  10. Small Business Management Training Tools Directory.

    ERIC Educational Resources Information Center

    American Association of Community and Junior Colleges, Washington, DC. National Small Business Training Network.

    This directory is designed to assist in the identification of supplementary materials to support program development for small businesses. Following introductory comments and an overview of small business management training, section I lists training tools available from the Small Business Administration (SBA). Section II provides descriptions and…

  11. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
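
    The scheduling idea in the passage, estimate each job's runtime in advance and submit the longest jobs first so that no single slow comparison strands an otherwise idle cluster, is essentially longest-processing-time-first assignment. The sketch below shows that heuristic in Python with a toy runtime model; the real Roundup cost model and Elastic MapReduce details are not reproduced.

        # Longest-processing-time-first job ordering (toy model of the strategy above).
        import heapq

        # Hypothetical genome pairs with runtimes (hours) estimated from genome sizes.
        jobs = [("humanXmouse", 9.5), ("yeastXecoli", 0.4), ("humanXrat", 9.1),
                ("flyXworm", 2.2), ("mouseXrat", 7.8), ("ecoliXsalm", 0.3)]

        def schedule(jobs, n_workers):
            """Assign jobs to the least-loaded worker, longest estimates first."""
            workers = [(0.0, i, []) for i in range(n_workers)]   # (load, id, assigned)
            heapq.heapify(workers)
            for name, est in sorted(jobs, key=lambda j: -j[1]):
                load, wid, assigned = heapq.heappop(workers)
                assigned.append(name)
                heapq.heappush(workers, (load + est, wid, assigned))
            return workers

        for load, wid, assigned in sorted(schedule(jobs, 2)):
            print(f"worker {wid}: {load:.1f} h -> {assigned}")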

  12. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst case conditions such as summer peak, winter peak days, etc. With widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real-time would become increasingly important. The state-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance of single-processor computers, but the simulation is still several times slower than real-time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, the expectations have been rising towards more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.

  13. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL maintains a variety of software tools on the Peregrine high-performance computing system, including debuggers and performance-analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  14. Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study

    DTIC Science & Technology

    2016-11-01

    Report by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory, November 2016, covering work from January 2013 to September 2015; approved for public release. The report evaluates visualization tools for computer network defense analysts, presenting display design, methods, and results for a user study.

  15. A Java-based tool for creating KML files from GPS waypoints

    NASA Astrophysics Data System (ADS)

    Kinnicutt, P. G.; Rivard, C.; Rimer, S.

    2008-12-01

    Google Earth provides a free tool with powerful capabilities for visualizing geoscience images and data. Commercial software tools exist for doing sophisticated digitizing and spatial modeling, but for the purposes of presentation, visualization and overlaying aerial images with data, Google Earth provides much of the functionality. Likewise, with current technologies in GPS (Global Positioning System) systems and with Google Earth Plus, it is possible to upload GPS waypoints, tracks and routes directly into Google Earth for visualization. However, older technology GPS units and even low-cost GPS units found today may lack the necessary communications interface to a computer (e.g. no Bluetooth, no WiFi, no USB, no Serial, etc.) or may have an incompatible interface, such as a Serial port but no USB adapter available. In such cases, any waypoints, tracks and routes saved in the GPS unit or recorded in a field notebook must be manually transferred to a computer for use in a GIS system or other program. This presentation describes a Java-based tool developed by the author that enables users to enter GPS coordinates in a user-friendly manner, then save these coordinates in a Keyhole Markup Language (KML) file format for visualization in Google Earth. This tool either accepts user-interactive input or accepts input from a CSV (Comma Separated Value) file, which can be generated from any spreadsheet program. This tool accepts input in the form of lat/long or UTM (Universal Transverse Mercator) coordinates. This presentation describes the system's applicability through several small case studies. This free and lightweight tool simplifies the task of manually inputting GPS data into Google Earth for people working in the field without an automated mechanism for uploading the data; for instance, the user may not have internet connectivity or may not have the proper hardware or software. Since it is a Java application and not a web-based tool, it can be installed on one's field laptop and the GPS data can be manually entered without the need for internet connectivity. This tool provides a table view of the GPS data, but lacks a KML viewer to view the data overlain on top of an aerial view, as this viewer functionality is provided in Google Earth. The tool's primary contribution lies in its more convenient method for entering the GPS data manually when automated technologies are not available.
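
    As a minimal sketch of the conversion such a tool performs (the tool described above is written in Java; this illustration uses Python, and the CSV column names name, lat, and lon are assumptions), a waypoint table can be rewritten as KML Placemarks as follows. Note that KML stores coordinates in longitude,latitude order, a frequent source of bugs.

      import csv
      from xml.sax.saxutils import escape

      def csv_to_kml(csv_path, kml_path):
          # Convert a waypoint CSV (hypothetical columns: name, lat, lon)
          # into a minimal KML file of Placemarks for Google Earth.
          with open(csv_path, newline="") as f:
              rows = list(csv.DictReader(f))
          marks = "\n".join(
              "  <Placemark><name>{}</name><Point><coordinates>"
              "{},{}</coordinates></Point></Placemark>".format(
                  escape(r["name"]), float(r["lon"]), float(r["lat"]))
              for r in rows)
          with open(kml_path, "w") as f:
              f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                      '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
                      "<Document>\n" + marks + "\n</Document>\n</kml>\n")

      # Example usage: csv_to_kml("waypoints.csv", "waypoints.kml")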

  16. Incorporating Concept Mapping in Project-Based Learning: Lessons from Watershed Investigations

    NASA Astrophysics Data System (ADS)

    Rye, James; Landenberger, Rick; Warner, Timothy A.

    2013-06-01

    The concept map tool set forth by Novak and colleagues is underutilized in education. A meta-analysis has encouraged teachers to make extensive use of concept mapping, and researchers have advocated computer-based concept mapping applications that exploit hyperlink technology. Through an NSF-sponsored geosciences education grant, middle and secondary science teachers participated in professional development to apply computer-based concept mapping in project-based learning (PBL) units that investigated local watersheds. Participants attended a summer institute, engaged in a summer-through-spring online learning academy, and presented PBL units at a subsequent fall science teachers' convention. The majority of the 17 teachers who attended the summer institute had previously used the concept mapping strategy with students and rated it highly. Of the 12 teachers who continued beyond summer, applications of concept mapping ranged from collaborative planning of PBL projects to building students' vocabulary to students producing maps related to the PBL driving question. Barriers to the adoption and use of concept mapping included technology access at the schools, lack of time for teachers to advance their technology skills, lack of student motivation to choose to learn, and student difficulty with linking terms. In addition to mitigating these barriers, projects targeting teachers' use of technology tools may enhance adoption by recruiting partner teachers from the same schools, including a small number who are already proficient in the targeted technology, and by emphasizing the utility of the concept map as a planning tool.

  17. Adaptable Computing Environment/Self-Assembling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.

    Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.

  18. Combustor design tool for a gas fired thermophotovoltaic energy converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindler, K.W.; Harper, M.J.

    1995-12-31

    Recently, there has been a renewed interest in thermophotovoltaic (TPV) energy conversion. A TPV device converts radiant energy from a high temperature incandescent emitter directly into electricity by photovoltaic cells. The current Department of Energy sponsored research involves the design, construction and demonstration of a prototype TPV converter that uses a hydrocarbon fuel (such as natural gas) as the energy source. As the photovoltaic cells are designed to efficiently convert radiant energy at a prescribed wavelength, it is important that the temperature of the emitter be nearly constant over its entire surface. The U. S. Naval Academy has been tasked with the development of a small emitter (with a high emissivity) that can be maintained at 1756 K (2700 F). This paper describes the computer spreadsheet model that was developed as a tool to be used for the design of the high temperature emitter.
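
    To make the design problem concrete: how useful a 1756 K emitter is depends on what fraction of its radiant output falls in the band a photovoltaic cell can convert. The sketch below is not the Naval Academy spreadsheet model; it is a minimal blackbody calculation (Planck's law integrated numerically and normalized by the Stefan-Boltzmann total), with a hypothetical 1.0-1.8 μm convertible band.

      import math

      H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann

      def planck(lam, T):
          # Blackbody spectral emissive power, W / m^2 / m
          return (2 * math.pi * H * C**2) / (
              lam**5 * (math.exp(H * C / (lam * KB * T)) - 1))

      def band_fraction(T, lo, hi, n=10000):
          # Fraction of total emission between wavelengths lo and hi
          # (simple rectangle-rule integration over n steps)
          dl = (hi - lo) / n
          in_band = sum(planck(lo + i * dl, T) for i in range(n)) * dl
          return in_band / (5.670e-8 * T**4)  # Stefan-Boltzmann total

      # Hypothetical convertible band for a low-bandgap TPV cell: 1.0-1.8 um
      print(f"{band_fraction(1756, 1.0e-6, 1.8e-6):.1%} of emission in band")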

  19. Using fragmentation trees and mass spectral trees for identifying unknown compounds in metabolomics.

    PubMed

    Vaniya, Arpana; Fiehn, Oliver

    2015-06-01

    Identification of unknown metabolites is the bottleneck in advancing metabolomics, leaving interpretation of metabolomics results ambiguous. The chemical diversity of metabolism is vast, making structure identification arduous and time consuming. Currently, comprehensive analysis of mass spectra in metabolomics is limited to library matching, but tandem mass spectral libraries are small compared to the large number of compounds found in the biosphere, including xenobiotics. Resolving this bottleneck requires richer data acquisition and better computational tools. Multi-stage mass spectrometry (MSn) trees show promise to aid in this regard. Fragmentation trees explore the fragmentation process, generate fragmentation rules and aid in sub-structure identification, while mass spectral trees delineate the dependencies in multi-stage MS of collision-induced dissociations. This review covers advancements in these tools for metabolite identification over the past 10 years, including the algorithms, software and databases used to build and implement fragmentation trees and mass spectral annotations.

  20. XenoSite server: a web-available site of metabolism prediction tool.

    PubMed

    Matlock, Matthew K; Hughes, Tyler B; Swamidass, S Joshua

    2015-04-01

    Cytochrome P450 enzymes (P450s) are metabolic enzymes that process the majority of FDA-approved, small-molecule drugs. Understanding how these enzymes modify molecule structure is key to the development of safe, effective drugs. XenoSite server is an online implementation of XenoSite, a recently published computational model for P450 metabolism. XenoSite predicts which atomic sites of a molecule (sites of metabolism, or SOMs) are modified by P450s. XenoSite server accepts input in common chemical file formats including SDF and SMILES and provides tools for visualizing the likelihood that each atomic site is a site of metabolism for a variety of important P450s, as well as a flat file download of SOM predictions. XenoSite server is available at http://swami.wustl.edu/xenosite. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Toward anthropomimetic robotics: development, simulation, and control of a musculoskeletal torso.

    PubMed

    Wittmeier, Steffen; Alessandro, Cristiano; Bascarevic, Nenad; Dalamagkidis, Konstantinos; Devereux, David; Diamond, Alan; Jäntsch, Michael; Jovanovic, Kosta; Knight, Rob; Marques, Hugo Gravato; Milosavljevic, Predrag; Mitra, Bhargav; Svetozarevic, Bratislav; Potkonjak, Veljko; Pfeifer, Rolf; Knoll, Alois; Holland, Owen

    2013-01-01

    Anthropomimetic robotics differs from conventional approaches by capitalizing on the replication of the inner structures of the human body, such as muscles, tendons, bones, and joints. Here we present our results of more than three years of research in constructing, simulating, and, most importantly, controlling anthropomimetic robots. We manufactured four physical torsos, each more complex than its predecessor, and developed the tools required to simulate their behavior. Furthermore, six different control approaches, inspired by classical control theory, machine learning, and neuroscience, were developed and evaluated via these simulations or in small-scale setups. While the obtained results are encouraging, we are aware that we have barely exploited the potential of the anthropomimetic design so far. But, with the tools developed, we are confident that this novel approach will contribute to our understanding of morphological computation and human motor control in the future.

  2. Hypercard Another Computer Tool.

    ERIC Educational Resources Information Center

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  3. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  4. Learning Disabled Students and Computers: A Teacher's Guide Book.

    ERIC Educational Resources Information Center

    Metzger, Merrianne; And Others

    This booklet is provided as a guide to teachers working with learning disabled (LD) students who are interested in using computers as a teaching tool. The computer is presented as a powerful option to enhance educational opportunities for LD children. The author outlines the three main modes in educational computer use (tutor, tool, and tutee) and…

  5. Reading and Computers: Issues for Theory and Practice. Computers and Education Series.

    ERIC Educational Resources Information Center

    Reinking, David, Ed.

    Embodying two themes--that the computer can become an even more exciting instructional tool than it is today, and that the research necessary for developing the potential of this tool is already underway, this book explores the theoretical, research, and instructional issues concerning computers and reading. The titles of the essays and their…

  6. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  7. Computers: Tools of Oppression, Tools of Liberation.

    ERIC Educational Resources Information Center

    Taylor, Jefferey H.

    This paper contends that students who are learning to use computers can benefit from having an overview of the history and social context of computers. The paper highlights some milestones in the history of computers, from ancient times to ENIAC to Altair to Bill Gates to the Internet. It also suggests some things for students to think about and…

  8. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  9. Support for Systematic Code Reviews with the SCRUB Tool

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerald J.

    2010-01-01

    SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) as well as individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface (see figure) that facilitates browsing code and reports. Improvements over existing software include significant simplification and leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is made for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.

  10. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
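
    The core computation of such an architecture-based model can be sketched briefly. In the toy fragment below (an illustration under invented component reliabilities, not the paper's COSMIC-FFP-based method), each component is a transient state of a discrete-time Markov chain, failure is absorbing, and tool reliability is the probability of absorbing in the success state, obtained from the standard absorbing-chain equation x = (I - Q)^-1 b.

      import numpy as np

      # Hypothetical 3-component forensic tool: acquire -> parse -> report.
      # Component i hands control to i+1 with reliability R[i]; any error
      # drops into an absorbing failure state.
      R = np.array([0.99, 0.97, 0.995])
      n = len(R)

      Q = np.zeros((n, n))            # transient-to-transient transitions
      for i in range(n - 1):
          Q[i, i + 1] = R[i]

      b = np.zeros(n)                 # one-step absorption into "success"
      b[-1] = R[-1]                   # last component completes the run

      # Probability of reaching "success" from the entry state:
      # x = (I - Q)^-1 b  (fundamental-matrix solution)
      x = np.linalg.solve(np.eye(n) - Q, b)
      print(f"predicted tool reliability: {x[0]:.4f}")  # 0.99*0.97*0.995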

  11. Assessing Child Nutrient Intakes Using a Tablet-Based 24-Hour Recall Tool in Rural Zambia.

    PubMed

    Caswell, Bess L; Talegawkar, Sameera A; Dyer, Brian; Siamusantu, Ward; Klemm, Rolf D W; Palmer, Amanda C

    2015-12-01

    Background: Detailed dietary intake data in low-income populations are needed for research and program evaluation. However, collection of such data by paper-based 24-hour recall imposes substantial demands for staff time and expertise, training, materials, and data entry. Objective: To describe our development and use of a tablet-based 24-hour recall tool for conducting dietary intake surveys in remote settings. Methods: We designed a 24-hour recall tool using Open Data Kit software on an Android tablet platform. The tool contains a list of local foods; questions on portion size, cooking method, ingredients, and food source; and prompts to guide interviewers. We used this tool to interview caregivers on dietary intakes of children participating in an efficacy trial of provitamin A-biofortified maize conducted in Mkushi, a rural district in central Zambia. Participants were children aged 4 to 8 years not yet enrolled in school (n = 938). Dietary intake data were converted to nutrient intakes using local food composition and recipe tables. Results: We developed a tablet-based 24-hour recall tool and used it to collect dietary data among 928 children. The majority of foods consumed were maize, leafy vegetable, or small fish dishes. Median daily energy intake was 6416 kJ (1469 kcal). Food and nutrient intakes assessed using the tablet-based tool were consistent with those reported in prior research, and the tool was easily used by interviewers without prior nutrition training or computing experience. Conclusions: Challenges remain to improve programming, but the tool is an innovation that enables efficient collection of 24-hour recall data in remote settings. © The Author(s) 2015.

  12. Virtual reality based surgical assistance and training system for long duration space missions.

    PubMed

    Montgomery, K; Thonier, G; Stephanides, M; Schendel, S

    2001-01-01

    Access to medical care during long duration space missions is extremely important. Numerous unanticipated medical problems will need to be addressed promptly and efficiently. Although telemedicine provides a convenient tool for remote diagnosis and treatment, it is impractical due to the long delay between data transmission and reception to Earth. While a well-trained surgeon-internist-astronaut would be an essential addition to the crew, the vast number of potential medical problems necessitate instant access to computerized, skill-enhancing and diagnostic tools. A functional prototype of a virtual reality based surgical training and assistance tool was created at our center, using low-power, small, lightweight components that would be easy to transport on a space mission. The system consists of a tracked, head-mounted display, a computer system, and a number of tracked surgical instruments. The software provides a real-time surgical simulation system with integrated monitoring and information retrieval and a voice input/output subsystem. Initial medical content for the system has been created, comprising craniofacial, hand, inner ear, and general anatomy, as well as information on a number of surgical procedures and techniques. One surgical specialty in particular, microsurgery, was provided as a full simulation due to its long training requirements, significant impact on result due to experience, and likelihood for need. However, the system is easily adapted to realistically simulate a large number of other surgical procedures. By providing a general system for surgical simulation and assistance, the astronaut-surgeon can maintain their skills, acquire new specialty skills, and use tools for computer-based surgical planning and assistance to minimize overall crew and mission risk.

  13. sRNAdb: A small non-coding RNA database for gram-positive bacteria

    PubMed Central

    2012-01-01

    Background The class of small non-coding RNA molecules (sRNA) regulates gene expression by different mechanisms and enables bacteria to mount a physiological response due to adaptation to the environment or infection. Over the last decades the number of sRNAs has been increasing rapidly. Several databases like Rfam or fRNAdb were extended to include sRNAs as a class of its own. Furthermore new specialized databases like sRNAMap (gram-negative bacteria only) and sRNATarBase (target prediction) were established. To the best of the authors’ knowledge no database focusing on sRNAs from gram-positive bacteria is publicly available so far. Description In order to understand sRNA’s functional and phylogenetic relationships we have developed sRNAdb and provide tools for data analysis and visualization. The data compiled in our database is assembled from experiments as well as from bioinformatics analyses. The software enables comparison and visualization of gene loci surrounding the sRNAs of interest. To accomplish this, we use a client–server based approach. Offline versions of the database including analyses and visualization tools can easily be installed locally on the user’s computer. This feature facilitates customized local addition of unpublished sRNA candidates and related information such as promoters or terminators using tab-delimited files. Conclusion sRNAdb allows a user-friendly and comprehensive comparative analysis of sRNAs from available sequenced gram-positive prokaryotic replicons. Offline versions including analysis and visualization tools facilitate complex user specific bioinformatics analyses. PMID:22883983

  14. A new approach for developing adjoint models

    NASA Astrophysics Data System (ADS)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and supplies callbacks to compute the action of these operators. The library, called libadjoint, is then capable of symbolically manipulating the forward annotation to automatically assemble the adjoint equations. Libadjoint is open source, and is explicitly designed to be bolted-on to an existing discrete model. It can be applied to any discretisation, steady or time-dependent problems, and both linear and nonlinear systems. Using libadjoint has several advantages. It requires the application of an AD tool only to small pieces of code, making the use of AD far more tractable. As libadjoint derives the adjoint equations, the expertise required to develop an adjoint model is greatly diminished. One major advantage of this approach is that the model developer is freed from implementing complex checkpointing strategies for the adjoint model: libadjoint has sufficient information about the forward model to re-play the entire forward solve when necessary, and thus the checkpointing algorithm can be implemented entirely within the library itself. Examples are shown using the Fluidity/ICOM framework, a complex ocean model under development at Imperial College London.
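
    The "model as a sequence of linear solves" abstraction is easy to illustrate. The toy Python below is not libadjoint's actual API: it records each forward solve A u_i = B u_{i-1} on a tape, then assembles the adjoint by solving the transposed systems in reverse order, yielding the gradient of a functional of the final state with respect to the initial condition without any hand-derived adjoint code.

      import numpy as np

      class Tape:
          # Toy annotation of "model = sequence of linear solves"
          # (an illustration, not libadjoint's API). Each step records
          # the operator A and the dependency B of the solve A u = B u_prev.
          def __init__(self):
              self.steps = []

          def solve(self, A, B, u_prev):
              self.steps.append((A, B))
              return np.linalg.solve(A, B @ u_prev)

      def adjoint_gradient(tape, dJ_du_final):
          # Replay the tape backwards, solving the transposed system at
          # each step; returns dJ/du_0 for a functional J of the final state.
          lam = dJ_du_final
          for A, B in reversed(tape.steps):
              lam = B.T @ np.linalg.solve(A.T, lam)
          return lam

      # Forward model: two implicit steps of a small linear system
      tape = Tape()
      A = np.array([[2.0, -0.5], [0.0, 1.5]])
      B = np.eye(2)
      u = tape.solve(A, B, tape.solve(A, B, np.array([1.0, 2.0])))

      # J(u) = sum(u); its gradient w.r.t. the initial condition equals
      # ((A^-1 B)^T)^2 applied to the vector of ones.
      print(adjoint_gradient(tape, np.ones(2)))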

  15. Face and construct validity of a computer-based virtual reality simulator for ERCP.

    PubMed

    Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V

    2010-02-01

    Background: Currently, little evidence supports computer-based simulation for ERCP training. Objective: To determine face and construct validity of a computer-based simulator for ERCP and to assess its perceived utility as a training tool. Design: Novice and expert endoscopists completed 2 simulated ERCP cases using the GI Mentor II. Setting: Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Main Outcome Measurements: Times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects also assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Results: Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices from experts, based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) stated that the simulator has definite training potential or should be required for training. Limitations: Small sample size, single institution. Conclusions: The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibited face validity, and subjects deemed it a useful training tool.

  16. Web-Based Computational Chemistry Education with CHARMMing II: Coarse-Grained Protein Folding

    PubMed Central

    Schalk, Vinushka; Lerner, Michael G.; Woodcock, H. Lee; Brooks, Bernard R.

    2014-01-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field. PMID:25058338

  17. Web-based computational chemistry education with CHARMMing II: Coarse-grained protein folding.

    PubMed

    Pickard, Frank C; Miller, Benjamin T; Schalk, Vinushka; Lerner, Michael G; Woodcock, H Lee; Brooks, Bernard R

    2014-07-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field.
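
    For readers without CHARMM access, the lesson's ingredients (one bead per residue, a potential acting between native-contact pairs, and Langevin dynamics) can be sketched in a few dozen lines. The fragment below is a toy illustration, not the CHARMMing implementation: an invented 8-bead chain with harmonic backbone bonds, a Gō-style 12-10 native-contact potential, and overdamped (Brownian) Langevin integration in reduced units.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy coarse-grained chain: n beads, harmonic bonds, and Go-style
      # 12-10 attractions between (invented) native-contact pairs.
      n, k_bond, r0 = 8, 100.0, 1.0
      contacts = [(0, 5), (1, 6), (2, 7)]
      eps, sigma = 1.0, 1.0
      kT, gamma, dt = 0.5, 1.0, 1e-4

      def forces(x):
          f = np.zeros_like(x)
          for i in range(n - 1):               # backbone bonds
              d = x[i + 1] - x[i]
              r = np.linalg.norm(d)
              fb = k_bond * (r - r0) * d / r
              f[i] += fb
              f[i + 1] -= fb
          for i, j in contacts:                # native contacts
              d = x[j] - x[i]
              r = np.linalg.norm(d)
              # U(r) = eps * (5 (s/r)^12 - 6 (s/r)^10), minimum at r = s
              dU = eps * (-60 * sigma**12 / r**13 + 60 * sigma**10 / r**11)
              f[i] += dU * d / r
              f[j] -= dU * d / r
          return f

      # Overdamped Langevin: dx = F dt / gamma + sqrt(2 kT dt / gamma) * noise
      x = np.cumsum(np.full((n, 3), r0 / 3**0.5), axis=0)  # stretched start
      for step in range(5000):
          x += forces(x) * dt / gamma + rng.normal(
              0.0, np.sqrt(2 * kT * dt / gamma), x.shape)

      print("mean native-contact distance:",
            np.mean([np.linalg.norm(x[j] - x[i]) for i, j in contacts]))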

  18. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
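
    As a toy illustration of the final quantification step (this is not CRISPResso's algorithm, and the amplicon, sgRNA, and reads below are invented), editing outcomes can be counted by asking whether each sequencing read still contains the unmodified reference window spanning the expected cut site, three bases from the PAM-proximal end of the protospacer:

      def quantify_editing(reads, amplicon, guide, window=10):
          # A read is called "unedited" if it still contains the reference
          # sequence around the blunt Cas9 cut site (3 bp from the PAM end).
          cut = amplicon.index(guide) + len(guide) - 3
          ref_window = amplicon[cut - window: cut + window]
          unedited = sum(ref_window in read for read in reads)
          return 1.0 - unedited / len(reads)

      amplicon = "ACGTTGCACCTGAAGTCAGAGGACAGAGTTTAGAGCCACAAGGTACGT"
      guide = "GCACCTGAAGTCAGAGGACA"            # hypothetical 20-nt sgRNA
      reads = [
          amplicon,                             # unedited read
          amplicon[:25] + amplicon[28:],        # 3-bp deletion at the cut
          amplicon[:26] + "T" + amplicon[26:],  # 1-bp insertion
      ]
      print(f"edited fraction: {quantify_editing(reads, amplicon, guide):.2f}")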

  19. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. Synopses of the talks and the key figures from each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  20. A computational pipeline for quantification of pulmonary infections in small animal models using serial PET-CT imaging.

    PubMed

    Bagci, Ulas; Foster, Brent; Miller-Jaster, Kirsten; Luna, Brian; Dey, Bappaditya; Bishai, William R; Jonsson, Colleen B; Jain, Sanjay; Mollura, Daniel J

    2013-07-23

    Infectious diseases are the second leading cause of death worldwide. In order to better understand and treat them, an accurate evaluation using multi-modal imaging techniques for anatomical and functional characterizations is needed. For non-invasive imaging techniques such as computed tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET), there have been many engineering improvements that have significantly enhanced the resolution and contrast of the images, but there are still insufficient computational algorithms available for researchers to use when accurately quantifying imaging data from anatomical structures and functional biological processes. Since the development of such tools may potentially translate basic research into the clinic, this study focuses on the development of a quantitative and qualitative image analysis platform that provides a computational radiology perspective for pulmonary infections in small animal models. Specifically, we designed (a) a fast and robust automated and semi-automated image analysis platform and a quantification tool that can facilitate accurate diagnostic measurements of pulmonary lesions as well as volumetric measurements of anatomical structures, and incorporated (b) an image registration pipeline to our proposed framework for volumetric comparison of serial scans. This is an important investigational tool for small animal infectious disease models that can help advance researchers' understanding of infectious diseases. We tested the utility of our proposed methodology by using sequentially acquired CT and PET images of rabbit, ferret, and mouse models with respiratory infections of Mycobacterium tuberculosis (TB), H1N1 flu virus, and an aerosolized respiratory pathogen (necrotic TB) for a total of 92, 44, and 24 scans for the respective studies with half of the scans from CT and the other half from PET. Institutional Administrative Panel on Laboratory Animal Care approvals were obtained prior to conducting this research. First, the proposed computational framework registered PET and CT images to provide spatial correspondences between images. Second, the lungs from the CT scans were segmented using an interactive region growing (IRG) segmentation algorithm with mathematical morphology operations to avoid false positive (FP) uptake in PET images. Finally, we segmented significant radiotracer uptake from the PET images in lung regions determined from CT and computed metabolic volumes of the significant uptake. All segmentation processes were compared with expert radiologists' delineations (ground truths). Metabolic and gross volume of lesions were automatically computed with the segmentation processes using PET and CT images, and percentage changes in those volumes over time were calculated. Standardized uptake value (SUV) analysis from PET images was conducted as a complementary quantitative metric for disease severity assessment. Thus, severity and extent of pulmonary lesions were examined through both PET and CT images using the aforementioned quantification metrics outputted from the proposed framework. Each animal study was evaluated within the same subject class, and all steps of the proposed methodology were evaluated separately. We quantified the accuracy of the proposed algorithm with respect to the state-of-the-art segmentation algorithms.
For evaluation of the segmentation results, the Dice similarity coefficient (DSC) was used as an overlap measure and the Hausdorff distance as a shape dissimilarity measure. Significant correlations regarding the estimated lesion volumes were obtained in both CT and PET images with respect to the ground truths (R²=0.8922, p<0.01 and R²=0.8664, p<0.01, respectively). The segmentation accuracy (DSC (%)) was 93.4±4.5% for normal lung CT scans and 86.0±7.1% for pathological lung CT scans. Experiments showed excellent agreements (all above 85%) with expert evaluations for both structural and functional imaging modalities. Apart from quantitative analysis of each animal, we also qualitatively showed how metabolic volumes were changing over time by examining serial PET/CT scans. Evaluation of the registration processes was based on precisely defined anatomical landmark points by expert clinicians. An average of 2.66, 3.93, and 2.52 mm errors was found in rabbit, ferret, and mouse data (all within the resolution limits), respectively. Quantitative results obtained from the proposed methodology were visually related to the progress and severity of the pulmonary infections as verified by the participating radiologists. Moreover, we demonstrated that lesions due to the infections were metabolically active and appeared multi-focal in nature, and we observed similar patterns in the CT images as well. Consolidation and ground glass opacity were the main abnormal imaging patterns and consistently appeared in all CT images. We also found that the gross and metabolic lesion volume percentage follow the same trend as the SUV-based evaluation in the longitudinal analysis. We explored the feasibility of using PET and CT imaging modalities in three distinct small animal models for two diverse pulmonary infections. We concluded from the clinical findings, derived from the proposed computational pipeline, that PET-CT imaging is an invaluable hybrid modality for tracking pulmonary infections longitudinally in small animals and has great potential to become routinely used in clinics. Our proposed methodology showed that automated computer-aided lesion detection and quantification of pulmonary infections in small animal models are efficient and accurate as compared to the clinical standard of manual and semi-automated approaches. Automated analysis of images in pre-clinical applications can increase the efficiency and quality of pre-clinical findings that ultimately inform downstream experimental design in human clinical studies; this innovation will allow researchers and clinicians to more effectively allocate study resources with respect to research demands without compromising accuracy.
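
    The overlap measure used above has a compact definition: for binary masks A and B, DSC = 2|A ∩ B| / (|A| + |B|), with 1.0 indicating perfect overlap. A minimal sketch of the computation on toy masks (not the paper's pipeline) follows.

      import numpy as np

      def dice(a, b):
          # Dice similarity coefficient between two binary masks:
          # DSC = 2|A intersect B| / (|A| + |B|); 1.0 = perfect overlap.
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Toy 2D "lung masks": algorithm output vs. expert ground truth
      auto = np.zeros((8, 8), dtype=bool); auto[2:6, 2:6] = True
      truth = np.zeros((8, 8), dtype=bool); truth[3:7, 2:6] = True
      print(f"DSC = {dice(auto, truth):.3f}")   # 2*12 / (16+16) = 0.750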

  1. Development of Anthropometric Analogous Headforms. Phase 1.

    DTIC Science & Technology

    1994-10-31

    shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools. Rendering of...computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases requires specialized...tooling. Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool. In cases where the

  2. Deployment and evaluation of a dual-sensor autofocusing method for on-machine measurement of patterns of small holes on freeform surfaces.

    PubMed

    Chen, Xiaomei; Longstaff, Andrew; Fletcher, Simon; Myers, Alan

    2014-04-01

    This paper presents and evaluates an active dual-sensor autofocusing system that combines an optical vision sensor and a tactile probe for autofocusing on arrays of small holes on freeform surfaces. The system has been tested on a two-axis test rig and then integrated onto a three-axis computer numerical control (CNC) milling machine, where the aim is to rapidly and controllably measure the hole position errors while the part is still on the machine. The principle of operation is for the tactile probe to locate the nominal positions of holes, and the optical vision sensor follows to focus and capture the images of the holes. The images are then processed to provide hole position measurement. In this paper, the autofocusing deviations are analyzed. First, the deviations caused by the geometric errors of the axes on which the dual-sensor unit is deployed are estimated to be 11 μm when deployed on a test rig and 7 μm on the CNC machine tool. Subsequently, the autofocusing deviations caused by the interaction of the tactile probe, surface, and small hole are mathematically analyzed and evaluated. The deviations are a result of the tactile probe radius, the curvatures at the positions where small holes are drilled on the freeform surface, and the effect of the position error of the hole on focusing. An example case study is provided for the measurement of a pattern of small holes on an elliptical cylinder on the two machines. The absolute sum of the autofocusing deviations is 118 μm on the test rig and 144 μm on the machine tool. This is much less than the 500 μm depth of field of the optical microscope. Therefore, the method is capable of capturing a group of clear images of the small holes on this workpiece for either implementation.

  3. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation gives an overview of a debugger for computational grid applications. Details are given on NAS parallel tools groups (including parallelization support tools, evaluation of various parallelization strategies, and distributed and aggregated computing), debugger dependencies, scalability, initial implementation, the process grid, and information on Globus.

  4. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  5. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
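
    One low-cost monitoring idea of the kind described, re-running a small random sample of inputs at full precision and raising an alarm when the sampled error drifts past a tolerance, can be sketched as follows. This is a generic Python illustration, not the thesis's OCaml-based tooling; the approximate kernel, sampling rate, and tolerance are invented.

      import random

      def approx_sqrt(x, iters=2):
          # "Approximate" kernel: truncated Newton iteration for sqrt(x)
          g = x / 2 or 1e-12
          for _ in range(iters):
              g = 0.5 * (g + x / g)
          return g

      def monitored(inputs, sample_rate=0.05, tol=1e-3):
          # Sampling-based online quality monitor: spot-check a small
          # fraction of inputs against the precise result.
          errors, outputs = [], []
          for x in inputs:
              y = approx_sqrt(x)
              outputs.append(y)
              if random.random() < sample_rate:
                  exact = x ** 0.5
                  errors.append(abs(y - exact) / exact)
          mean_err = sum(errors) / len(errors) if errors else 0.0
          if mean_err > tol:
              print(f"quality alarm: sampled relative error {mean_err:.2e}")
          return outputs

      random.seed(1)
      _ = monitored([random.uniform(1, 100) for _ in range(1000)])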

  6. Energy Efficiency Challenges of 5G Small Cell Networks.

    PubMed

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-05-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to which computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high-volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks.

  7. Energy Efficiency Challenges of 5G Small Cell Networks

    PubMed Central

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-01-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to which computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high-volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks. PMID:28757670
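
    The Landauer principle invoked above sets a floor on computation energy: erasing one bit at temperature T costs at least kB·T·ln 2 joules. A quick illustrative calculation (the 10^17 bit-operations-per-second baseband load is an invented figure) shows how far a roughly 800-watt massive-MIMO base station sits above that bound, which is why the papers treat computation power as an optimization target rather than a physical limit.

      import math

      KB = 1.380649e-23   # Boltzmann constant, J/K

      def landauer_min_power(bit_ops_per_s, T=300.0):
          # Minimum power to erase bit_ops_per_s bits per second at
          # temperature T: P = N * kB * T * ln 2 (Landauer's principle)
          return bit_ops_per_s * KB * T * math.log(2)

      p_min = landauer_min_power(1e17)     # hypothetical baseband load
      print(f"Landauer bound: {p_min * 1e3:.3f} mW")          # ~0.287 mW
      print(f"gap to an ~800 W base station: {800 / p_min:.1e}x")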

  8. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Parashar, Manu; Lewis, Nancy Jo

    The Real Time System Operations (RTSO) 2006-2007 project focused on two parallel technical tasks: (1) Real-Time Applications of Phasors for Monitoring, Alarming and Control; and (2) Real-Time Voltage Security Assessment (RTVSA) Prototype Tool. The overall goal of the phasor applications project was to accelerate adoption and foster greater use of new, more accurate, time-synchronized phasor measurements by conducting research and prototyping applications on California ISO's phasor platform, the Real-Time Dynamics Monitoring System (RTDMS), that provide previously unavailable information on the dynamic stability of the grid. Feasibility assessment studies were conducted on potential application of this technology for small-signal stability monitoring, validating/improving existing stability nomograms, conducting frequency response analysis, and obtaining real-time sensitivity information on key metrics to assess grid stress. Based on study findings, prototype applications for real-time visualization and alarming, small-signal stability monitoring, measurement-based sensitivity analysis, and frequency response assessment were developed and factory- and field-tested at the California ISO and at BPA. The goal of the RTVSA project was to provide California ISO with a prototype voltage security assessment tool that runs in real time within California ISO's new reliability and congestion management system. CERTS conducted a technical assessment of appropriate algorithms and developed a prototype incorporating state-of-the-art algorithms (such as the continuation power flow, direct method, boundary orbiting method, and hyperplanes) into a framework most suitable for an operations environment. Based on study findings, a functional specification was prepared, which the California ISO has since used to procure a production-quality tool that is now part of a suite of advanced computational tools used by California ISO for reliability and congestion management.

  10. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    EPA Science Inventory

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  11. Computer Simulation Shows the Effect of Communication on Day of Surgery Patient Flow.

    PubMed

    Taaffe, Kevin; Fredendall, Lawrence; Huynh, Nathan; Franklin, Jennifer

    2015-07-01

    To improve patient flow in a surgical environment, practitioners and academicians often use process mapping and simulation as tools to evaluate and recommend changes. We used simulations to help staff visualize the effect of communication and coordination delays that occur on the day of surgery. Perioperative services staff participated in tabletop exercises in which they chose the delays that were most important to eliminate. Using a day-of-surgery computer simulation model, the elimination of delays was tested and the results were shared with the group. This exercise, repeated for multiple groups of staff, provided an understanding not only of the dynamic events taking place but also of how small communication delays can contribute to a significant loss in efficiency and in the ability to provide timely care. Survey results confirmed these findings. Copyright © 2015 AORN, Inc. Published by Elsevier Inc. All rights reserved.
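
    The qualitative point of the exercise can be reproduced with a toy model. The sketch below is not the authors' simulation; the case count, case duration, and delay range are hypothetical, and it only shows how small per-handoff communication delays compound over a surgical day.

    ```python
    import random

    random.seed(1)
    CASES, CASE_MIN = 6, 90            # hypothetical case count and duration (minutes)

    def day_length(max_comm_delay_min: float) -> float:
        """Total OR time for a day of sequential cases with random handoff delays."""
        t = 0.0
        for _ in range(CASES):
            t += random.uniform(0, max_comm_delay_min)  # waiting on a call-back, etc.
            t += CASE_MIN
        return t

    # Minutes recoverable by eliminating delays of up to 15 minutes per handoff.
    print(day_length(15) - day_length(0))
    ```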

  12. Solution x-ray scattering and structure formation in protein dynamics

    NASA Astrophysics Data System (ADS)

    Nasedkin, Alexandr; Davidsson, Jan; Niemi, Antti J.; Peng, Xubiao

    2017-12-01

    We propose a computationally effective approach that builds on Landau mean-field theory in combination with modern nonequilibrium statistical mechanics to model and interpret protein dynamics and structure formation in small- to wide-angle x-ray scattering (S/WAXS) experiments. We develop the methodology by analyzing experimental data for the Engrailed homeodomain protein as an example. We demonstrate how to interpret S/WAXS data qualitatively with good precision over an extended temperature range. We explain experimental observations in terms of protein phase structure, and we make predictions for future experiments and for how to analyze data at different ambient temperatures. We conclude that the approach we propose has the potential to become a highly accurate, computationally effective, and predictive tool for analyzing S/WAXS data. To this end, we compare our results with those obtained previously in an all-atom molecular dynamics simulation.
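
    For readers unfamiliar with the mean-field ingredient, the generic Landau expansion of the free energy in an order parameter φ takes the form below; the paper's specific order parameter and coefficients are not reproduced here.

    $$ F(\varphi, T) \approx F_0(T) + a\,(T - T_c)\,\varphi^{2} + b\,\varphi^{4}, \qquad a, b > 0 $$

    Minimizing F gives φ = 0 above T_c and a nonzero φ below it, which is the generic mechanism by which temperature-dependent phase structure enters such an analysis.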

  13. Youpi: YOUr processing PIpeline

    NASA Astrophysics Data System (ADS)

    Monnerville, Mathias; Sémah, Gregory

    2012-03-01

    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

  14. Drawing the PDB: Protein-Ligand Complexes in Two Dimensions.

    PubMed

    Stierand, Katrin; Rarey, Matthias

    2010-12-09

    The two-dimensional representation of molecules is a popular communication medium in chemistry and the associated scientific fields. Computational methods for drawing small molecules with and without manual intervention are well-established and widely spread in terms of numerous software tools. Concerning the planar depiction of molecular complexes, there is considerably less choice. We developed the software PoseView, which automatically generates two-dimensional diagrams of macromolecular complexes, showing the ligand, the interactions, and the interacting residues. All depicted molecules are drawn on an atomic level as structure diagrams; thus, the output plots are clearly structured and easily readable for the scientist. We tested the performance of PoseView in a large-scale application on nearly all druglike complexes of the PDB (approximately 200,000 complexes); for more than 92% of the complexes considered for drawing, a layout could be computed. In the following, we present the results of this application study.

  15. Signal Processing for Metagenomics: Extracting Information from the Soup

    PubMed Central

    Rosen, Gail L.; Sokhansanj, Bahrad A.; Polikar, Robi; Bruns, Mary Ann; Russell, Jacob; Garbarine, Elaine; Essinger, Steve; Yok, Non

    2009-01-01

    Traditionally, studies in microbial genomics have focused on single genomes from cultured species, thereby limiting their focus to the small percentage of species that can be cultured outside their natural environment. Fortunately, recent advances in high-throughput sequencing and computational analyses have ushered in the new field of metagenomics, which aims to decode the genomes of microbes from natural communities without the need for cultivation. Although metagenomic studies have shed a great deal of insight into bacterial diversity and coding capacity, several computational challenges remain due to the massive size and complexity of metagenomic sequence data. This paper reviews current tools and techniques that address challenges in 1) genomic fragment annotation, 2) phylogenetic reconstruction, 3) functional classification of samples, and 4) interpreting complementary metaproteomics and metametabolomics data. Also surveyed are important applications of metagenomic studies, including microbial forensics and the roles of microbial communities in shaping human health and soil ecology. PMID:20436876

  16. Performance bounds for nonlinear systems with a nonlinear ℒ2-gain property

    NASA Astrophysics Data System (ADS)

    Zhang, Huan; Dower, Peter M.

    2012-09-01

    Nonlinear ℒ2-gain is a finite-gain concept that generalises the notion of conventional (linear) finite ℒ2-gain, admitting the application of ℒ2-gain analysis tools to a broader class of nonlinear systems. The computation of tight comparison function bounds for this nonlinear ℒ2-gain property is important in applications such as small-gain design. This article presents an approximation framework for these comparison function bounds through the formulation and solution of an optimal control problem. Key to the solution of this problem is the lifting of an ℒ2-norm input constraint, which is facilitated via the introduction of an energy saturation operator. This admits the solution of the optimal control problem of interest via dynamic programming and associated numerical methods, leading to the computation of the proposed bounds. Two examples are presented to demonstrate this approach.
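
    One common formalization of the property (a hedged sketch; the article's exact definitions are not reproduced here): a system has nonlinear ℒ2-gain if there exist a nondecreasing comparison function γ with γ(0) = 0 and a bias constant β ≥ 0 such that

    $$ \int_0^T \lVert y(t)\rVert^2\,dt \;\le\; \gamma\!\left(\int_0^T \lVert u(t)\rVert^2\,dt\right) + \beta \quad \text{for all } T \ge 0 . $$

    The conventional linear gain is recovered when γ(s) = \bar{\gamma}\, s, so the nonlinear property strictly enlarges the class of systems the analysis tools can address.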

  17. Neural-network quantum state tomography

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe

    2018-05-01

    The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1-3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities—such as the entanglement entropy—from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6-8].
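
    The paper's reconstruction uses restricted Boltzmann machine wavefunctions over many qubits; the toy sketch below is not that architecture and only illustrates the underlying idea, fitting a parametrized state to measurement statistics by maximum likelihood, on two qubits with simulated Bell-state counts.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated computational-basis frequencies from a Bell-like state.
    freqs = rng.multinomial(10_000, [0.5, 0, 0, 0.5]) / 10_000

    def nll(theta):
        """Negative log-likelihood of the data under amplitudes theta."""
        p = theta**2 / np.sum(theta**2)          # real amplitudes -> probabilities
        return -np.sum(freqs * np.log(p + 1e-12))

    theta = rng.normal(size=4)
    eps, lr = 1e-5, 0.1
    for _ in range(1000):                         # crude finite-difference descent
        grad = np.array([(nll(theta + eps * e) - nll(theta - eps * e)) / (2 * eps)
                         for e in np.eye(4)])
        theta -= lr * grad
    print(np.round(theta**2 / np.sum(theta**2), 3))  # ~[0.5, 0, 0, 0.5]
    ```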

  18. Large-Eddy Simulation of Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Pruett, C. David; Sochacki, James S.

    1999-01-01

    This report summarizes work accomplished under a one-year NASA grant from NASA Langley Research Center (LaRC). The effort culminates three years of NASA-supported research under three consecutive one-year grants. The period of support was April 6, 1998, through April 5, 1999. By request, the grant period was extended at no-cost until October 6, 1999. Its predecessors have been directed toward adapting the numerical tool of large-eddy simulation (LES) to aeroacoustic applications, with particular focus on noise suppression in subsonic round jets. In LES, the filtered Navier-Stokes equations are solved numerically on a relatively coarse computational grid. Residual stresses, generated by scales of motion too small to be resolved on the coarse grid, are modeled. Although most LES incorporate spatial filtering, time-domain filtering affords certain conceptual and computational advantages, particularly for aeroacoustic applications. Consequently, this work has focused on the development of subgrid-scale (SGS) models that incorporate time-domain filters.
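
    A commonly used causal time-domain filter of the kind referred to here is the exponential filter, shown below as a general sketch; the report's specific SGS model details are not reproduced.

    $$ \bar{u}(t) = \frac{1}{\Delta}\int_{-\infty}^{t} e^{-(t-s)/\Delta}\,u(s)\,ds, \qquad \frac{\partial \bar{u}}{\partial t} = \frac{u - \bar{u}}{\Delta} $$

    The equivalent ODE form means the filtered field can be advanced alongside the flow solution at negligible extra cost, one of the computational advantages of time-domain filtering noted above.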

  19. Aerodynamic Design and Computational Analysis of a Spacecraft Cabin Ventilation Fan

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue in a cost-effective way, early attention to fan design, selection, and installation has been recommended. Toward that end, NASA has begun to investigate the potential for small-fan noise reduction through improvements in fan aerodynamic design. Using tools and methodologies similar to those employed by the aircraft engine industry, most notably computational fluid dynamics (CFD) codes, the aerodynamic design of a new cabin ventilation fan has been developed, and its aerodynamic performance has been predicted and analyzed. The design, intended to serve as a baseline for future work, is discussed along with selected CFD results.

  20. Do medical students watch video clips in eLearning and do these facilitate learning?

    PubMed

    Romanov, Kalle; Nevgi, Anne

    2007-06-01

    There is controversial evidence of the impact of individual learning style on students' performance in computer-aided learning. We assessed the association between the use of multimedia materials, such as video clips, and collaborative communication tools with learning outcome among medical students. One hundred and twenty-one third-year medical students attended a course in medical informatics (0.7 credits) consisting of lectures, small group sessions and eLearning material. The eLearning material contained six learning modules with integrated video clips and collaborative learning tools in WebCT. Learning outcome was measured with a course exam. Approximately two-thirds of students (68.6%) viewed two or more videos. Female students were significantly more active video-watchers. No significant associations were found between video-watching and self-test scores or the time used in eLearning. Video-watchers were more active in WebCT; they loaded more pages and more actively participated in discussion forums. Video-watching was associated with a better course grade. Students who watched video clips were more active in using collaborative eLearning tools and achieved higher course grades.

  1. Bridging a Gap: In Search of an Analytical Tool Capturing Teachers' Perceptions of Their Own Teaching

    ERIC Educational Resources Information Center

    Rolandsson, Lennart; Skogh, Inga-Britt; Männikkö Barbutiu, Sirkku

    2017-01-01

    Computing and computers are introduced in school as important examples of technology, sometimes as a subject matter of their own, and sometimes they are used as tools for other subjects. All in all, one might even say that "learning about" computing and computers is part of "learning about" technology. Lately, many countries…

  2. PyNEST: A Convenient Interface to the NEST Simulator.

    PubMed

    Eppler, Jochen Martin; Helias, Moritz; Muller, Eilif; Diesmann, Markus; Gewaltig, Marc-Oliver

    2008-01-01

    The neural simulation tool NEST (http://www.nest-initiative.org) is a simulator for heterogeneous networks of point neurons or neurons with a small number of compartments. It aims at simulations of large neural systems with more than 10^4 neurons and 10^7 to 10^9 synapses. NEST is implemented in C++ and can be used on a large range of architectures from single-core laptops over multi-core desktop computers to super-computers with thousands of processor cores. Python (http://www.python.org) is a modern programming language that has recently received considerable attention in Computational Neuroscience. Python is easy to learn and has many extension modules for scientific computing (e.g. http://www.scipy.org). In this contribution we describe PyNEST, the new user interface to NEST. PyNEST combines NEST's efficient simulation kernel with the simplicity and flexibility of Python. Compared to NEST's native simulation language SLI, PyNEST makes it easier to set up simulations, generate stimuli, and analyze simulation results. We describe how PyNEST connects NEST and Python and how it is implemented. With a number of examples, we illustrate how it is used.
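
    A minimal sketch of the interface (model and device names follow the NEST 2.x era of this paper; newer releases renamed spike_detector to spike_recorder):

    ```python
    import nest  # PyNEST, assuming NEST with its Python bindings is installed

    # Drive one integrate-and-fire neuron with a DC current and record spikes.
    neuron = nest.Create("iaf_psc_alpha")
    nest.SetStatus(neuron, {"I_e": 376.0})   # constant input current (pA)
    recorder = nest.Create("spike_detector")
    nest.Connect(neuron, recorder)
    nest.Simulate(1000.0)                    # simulated time in milliseconds
    print(nest.GetStatus(recorder, "n_events"))
    ```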

  3. PyNEST: A Convenient Interface to the NEST Simulator

    PubMed Central

    Eppler, Jochen Martin; Helias, Moritz; Muller, Eilif; Diesmann, Markus; Gewaltig, Marc-Oliver

    2008-01-01

    The neural simulation tool NEST (http://www.nest-initiative.org) is a simulator for heterogeneous networks of point neurons or neurons with a small number of compartments. It aims at simulations of large neural systems with more than 10^4 neurons and 10^7 to 10^9 synapses. NEST is implemented in C++ and can be used on a large range of architectures from single-core laptops over multi-core desktop computers to super-computers with thousands of processor cores. Python (http://www.python.org) is a modern programming language that has recently received considerable attention in Computational Neuroscience. Python is easy to learn and has many extension modules for scientific computing (e.g. http://www.scipy.org). In this contribution we describe PyNEST, the new user interface to NEST. PyNEST combines NEST's efficient simulation kernel with the simplicity and flexibility of Python. Compared to NEST's native simulation language SLI, PyNEST makes it easier to set up simulations, generate stimuli, and analyze simulation results. We describe how PyNEST connects NEST and Python and how it is implemented. With a number of examples, we illustrate how it is used. PMID:19198667

  4. Integration of SimSET photon history generator in GATE for efficient Monte Carlo simulations of pinhole SPECT.

    PubMed

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J S; Tsui, Benjamin M W

    2008-07-01

    The authors developed and validated an efficient Monte Carlo simulation (MCS) workflow to facilitate small animal pinhole SPECT imaging research. This workflow seamlessly integrates two existing MCS tools: simulation system for emission tomography (SimSET) and GEANT4 application for emission tomography (GATE). Specifically, we retained the strength of GATE in describing complex collimator/detector configurations to meet the anticipated needs for studying advanced pinhole collimation (e.g., multipinhole) geometry, while inserting the fast SimSET photon history generator (PHG) to circumvent the relatively slow GEANT4 MCS code used by GATE in simulating photon interactions inside voxelized phantoms. For validation, data generated from this new SimSET-GATE workflow were compared with those from GATE-only simulations as well as experimental measurements obtained using a commercial small animal pinhole SPECT system. Our results showed excellent agreement (e.g., in system point response functions and energy spectra) between SimSET-GATE and GATE-only simulations, and, more importantly, a significant computational speedup (up to approximately 10-fold) provided by the new workflow. Satisfactory agreement between MCS results and experimental data were also observed. In conclusion, the authors have successfully integrated SimSET photon history generator in GATE for fast and realistic pinhole SPECT simulations, which can facilitate research in, for example, the development and application of quantitative pinhole and multipinhole SPECT for small animal imaging. This integrated simulation tool can also be adapted for studying other preclinical and clinical SPECT techniques.

  5. IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.

    ERIC Educational Resources Information Center

    Sheehan, Mark C.; Williams, James G.

    1987-01-01

    Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)

  6. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.

  7. Computational protein design: validation and possible relevance as a tool for homology searching and fold recognition.

    PubMed

    Schmidt Am Busch, Marcel; Sedano, Audrey; Simonson, Thomas

    2010-05-05

    Protein fold recognition usually relies on a statistical model of each fold; each model is constructed from an ensemble of natural sequences belonging to that fold. A complementary strategy may be to employ sequence ensembles produced by computational protein design. Designed sequences can be more diverse than natural sequences, possibly avoiding some limitations of experimental databases. We explore this strategy for four SCOP families: small Kunitz-type inhibitors (SKIs), interleukin-8 chemokines, PDZ domains, and large caspase catalytic subunits, represented by 43 structures. An automated procedure is used to redesign the 43 proteins. We use the experimental backbones as fixed templates in the folded state and a molecular mechanics model to compute the interaction energies between sidechain and backbone groups. Calculations are done with the Proteins@Home volunteer computing platform. A heuristic algorithm is used to scan the sequence and conformational space, yielding 200,000-300,000 sequences per backbone template. The results confirm and generalize our earlier study of SH2 and SH3 domains. The designed sequences resemble moderately distant, natural homologues of the initial templates; e.g., the SUPERFAMILY profile hidden Markov model library recognizes 85% of the low-energy sequences as native-like. Conversely, Position Specific Scoring Matrices derived from the sequences can be used to detect natural homologues within the SwissProt database: 60% of known PDZ domains are detected and around 90% of known SKIs and chemokines. Energy components and inter-residue correlations are analyzed and ways to improve the method are discussed. For some families, designed sequences can be a useful complement to experimental ones for homologue searching. However, improved tools are needed to extract more information from the designed profiles before the method can be of general use.
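
    The detection step rests on standard PSSM log-odds scoring. The sketch below is a generic illustration with toy sequences, not the study's pipeline: a matrix is built from a set of (designed) family sequences and then used to score candidates, with higher scores suggesting family membership.

    ```python
    import numpy as np

    ALPHABET = "ACDEFGHIKLMNPQRSTVWY"

    def build_pssm(aligned_seqs, pseudocount=1.0):
        """Log-odds PSSM from equal-length aligned sequences, uniform background."""
        length = len(aligned_seqs[0])
        counts = np.full((length, 20), pseudocount)
        for seq in aligned_seqs:
            for pos, aa in enumerate(seq):
                counts[pos, ALPHABET.index(aa)] += 1
        freqs = counts / counts.sum(axis=1, keepdims=True)
        return np.log2(freqs / (1.0 / 20))

    def score(pssm, seq):
        return sum(pssm[pos, ALPHABET.index(aa)] for pos, aa in enumerate(seq))

    pssm = build_pssm(["ACDE", "ACDF", "ACDE"])   # toy designed "family"
    print(score(pssm, "ACDE"), score(pssm, "WWWW"))  # family member vs. outsider
    ```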

  8. Large-deformation modal coordinates for nonrigid vehicle dynamics

    NASA Technical Reports Server (NTRS)

    Likins, P. W.; Fleischer, G. E.

    1972-01-01

    The derivation of minimum-dimension sets of discrete-coordinate and hybrid-coordinate equations of motion of a system consisting of an arbitrary number of hinge-connected rigid bodies assembled in tree topology is presented. These equations are useful for the simulation of dynamical systems that can be idealized as tree-like arrangements of substructures, with each substructure consisting of either a rigid body or a collection of elastically interconnected rigid bodies restricted to small relative rotations at each connection. Thus, some of the substructures represent elastic bodies subjected to small strains or local deformations, but possibly large gross deformations, in the hybrid formulation, distributed coordinates referred to herein as large-deformation modal coordinates, are used for the deformations of these substructures. The equations are in a form suitable for incorporation into one or more computer programs to be used as multipurpose tools in the simulation of spacecraft and other complex electromechanical systems.

  9. Deterministic versus stochastic model of reprogramming: new evidence from cellular barcoding technique

    PubMed Central

    Yunusova, Anastasia M.; Fishman, Veniamin S.; Vasiliev, Gennady V.

    2017-01-01

    Factor-mediated reprogramming of somatic cells towards pluripotency is a low-efficiency process during which only small subsets of cells are successfully reprogrammed. Previous analyses of the determinants of the reprogramming potential are based on average measurements across a large population of cells or on monitoring a relatively small number of single cells with live imaging. Here, we applied lentiviral genetic barcoding, a powerful tool enabling the identification of familial relationships in thousands of cells. High-throughput sequencing of barcodes from successfully reprogrammed cells revealed a significant number of barcodes from related cells. We developed a computer model, according to which the probability of synchronous reprogramming of sister cells is 10–30%. We conclude that reprogramming success is pre-established in some particular cells and, being a heritable trait, can be maintained through cell division. Thus, reprogramming progresses in a deterministic manner, at least at the level of cell lineages. PMID:28446707

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials.

  11. Space spider crane

    NASA Technical Reports Server (NTRS)

    Macconochie, Ian O. (Inventor); Mikulas, Martin M., Jr. (Inventor); Pennington, Jack E. (Inventor); Kinkead, Rebecca L. (Inventor); Bryan, Charles F., Jr. (Inventor)

    1988-01-01

    A space spider crane for the movement, placement, and/or assembly of various components on or in the vicinity of a space structure is described. As permanent space structures are utilized by the space program, a means will be required to transport cargo and perform various repair tasks. A space spider crane comprising a small central body with attached manipulators and legs fulfills this requirement. The manipulators may be equipped with constant pressure gripping end effectors or tools to accomplish various repair tasks. The legs are also equipped with constant pressure gripping end effectors to grip the space structure. Control of the space spider crane may be achieved either by computer software or a remotely situated human operator, who maintains visual contact via television cameras mounted on the space spider crane. One possible walking program consists of a parallel motion walking program whereby the small central body alternately leans forward and backward relative to the end effectors.

  12. Computational Design of Clusters for Catalysis

    NASA Astrophysics Data System (ADS)

    Jimenez-Izal, Elisa; Alexandrova, Anastassia N.

    2018-04-01

    When small clusters are studied in chemical physics or physical chemistry, one perhaps thinks of the fundamental aspects of cluster electronic structure, or precision spectroscopy in ultracold molecular beams. However, small clusters are also of interest in catalysis, where the cold ground state or an isolated cluster may not even be the right starting point. Instead, the big question is: What happens to cluster-based catalysts under real conditions of catalysis, such as high temperature and coverage with reagents? Myriads of metastable cluster states become accessible, the entire system is dynamic, and catalysis may be driven by rare sites present only under those conditions. Activity, selectivity, and stability are highly dependent on size, composition, shape, support, and environment. To probe and master cluster catalysis, sophisticated tools are being developed for precision synthesis, operando measurements, and multiscale modeling. This review intends to tell the messy story of clusters in catalysis.

  13. Partial molar enthalpies and reaction enthalpies from equilibrium molecular dynamics simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnell, Sondre K.

    2014-10-14

    We present a new molecular simulation technique for determining partial molar enthalpies in mixtures of gases and liquids from single simulations, without relying on particle insertions, deletions, or identity changes. The method can also be applied to systems with chemical reactions. We demonstrate our method for binary mixtures of Weeks-Chandler-Andersen particles by comparing with conventional simulation techniques, as well as for a simple model that mimics a chemical reaction. The method considers small subsystems inside a large reservoir (i.e., the simulation box), and uses the construction of Hill to compute properties in the thermodynamic limit from small-scale fluctuations. Results obtained with the new method are in excellent agreement with those from previous methods. Especially for modeling chemical reactions, our method can be a valuable tool for determining reaction enthalpies directly from a single MD simulation.
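
    As a hedged illustration of the kind of relation such small-subsystem sampling exploits (shown here for the energy of a single-component grand-canonical cell; the enthalpy case adds a pressure-volume term, and the paper's multicomponent expressions are more involved):

    $$ \left(\frac{\partial \langle U\rangle}{\partial \langle N\rangle}\right)_{T,V} = \frac{\langle U N\rangle - \langle U\rangle\langle N\rangle}{\langle N^2\rangle - \langle N\rangle^2} $$

    That is, partial molar-type quantities follow from covariances of energy and particle number sampled inside the embedded small cells, with the Hill construction handling the extrapolation to the thermodynamic limit.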

  14. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  15. The Use of Computer Tools in the Design Process of Students’ Architectural Projects. Case Studies in Algeria

    NASA Astrophysics Data System (ADS)

    Saighi, Ouafa; Salah Zerouala, Mohamed

    2017-12-01

    This paper deals with the way computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as case studies to evaluate the impact of such tools on the students' design process. The aim is to inspect this use in depth and to sort out its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at the same institutions. The results mainly show that computer tools are used chiefly to improve the quality of drawing representations and images, seeking observers' satisfaction and hence influencing their decisions. Some teachers are not keen on overuse of the computer during the design phase; they prefer the “traditional” approach. This is the situation Algerian universities currently face, which leads to conflict and disagreement between students and teachers. Meanwhile, there is no doubt that computer tools have effectively contributed to improving the competitive level among students.

  16. The Use of Computer Tools to Support Meaningful Learning

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…

  17. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  18. Moving Object Detection in Heterogeneous Conditions in Embedded Systems.

    PubMed

    Garbo, Alessandro; Quer, Stefano

    2017-07-01

    This paper presents a system for moving object exposure, focusing on pedestrian detection, in external, unfriendly, and heterogeneous environments. The system manipulates and accurately merges information coming from subsequent video frames, making small computational efforts in each single frame. Its main characterizing feature is to combine several well-known movement detection and tracking techniques, and to orchestrate them in a smart way to obtain good results in diversified scenarios. It uses dynamically adjusted thresholds to characterize different regions of interest, and it also adopts techniques to efficiently track movements, and detect and correct false positives. Accuracy and reliability mainly depend on the overall recipe, i.e., on how the software system is designed and implemented, on how the different algorithmic phases communicate information and collaborate with each other, and on how concurrency is organized. The application is specifically designed to work with inexpensive hardware devices, such as off-the-shelf video cameras and small embedded computational units, eventually forming an intelligent urban grid. As a matter of fact, the major contribution of the paper is the presentation of a tool for real-time applications in embedded devices with finite computational (time and memory) resources. We present experimental results on several video sequences (both home-made and publicly available), showing the robustness and accuracy of the overall detection strategy. Comparisons with state-of-the-art strategies show that our application has similar tracking accuracy but much higher frame-per-second rates.
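
    A minimal sketch of one ingredient named above, frame differencing with a data-driven threshold, follows; it assumes OpenCV 4, the input file name is hypothetical, and this is not the authors' full multi-technique pipeline.

    ```python
    import cv2

    cap = cv2.VideoCapture("street.mp4")   # hypothetical input video
    ok, prev = cap.read()                  # assumes the file opened successfully
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)
        # Otsu picks the threshold from the data: one cheap operation per frame.
        _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        movers = [c for c in contours if cv2.contourArea(c) > 500]  # drop noise blobs
        prev = gray
    cap.release()
    ```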

  19. Portable long trace profiler: Concept and solution

    NASA Astrophysics Data System (ADS)

    Qian, Shinan; Takacs, Peter; Sostero, Giovanni; Cocco, Daniele

    2001-08-01

    Since the early development of the penta-prism long trace profiler (LTP) and the in situ LTP, and following the completion of the first in situ distortion profile measurements at Sincrotrone Trieste (ELETTRA) in Italy in 1995, a concept was developed for a compact, portable LTP with the following characteristics: easily installed on synchrotron radiation beam lines, easily carried to different laboratories around the world for measurements and calibration, convenient for use in evaluating the LTP as an in-process tool in the optical workshop, and convenient for temporary installation as required by other special applications. The initial design of a compact LTP optical head was made at ELETTRA in 1995. Since 1997 further efforts to reduce the optical head size and weight, and to improve measurement stability, have been made at Brookhaven National Laboratory. This article introduces the following solutions and accomplishments for the portable LTP: (1) a new design for a compact and very stable optical head, (2) the use of a small detector connected directly to a laptop computer via an enhanced parallel port, with no extra frame grabber interface or control box, (3) a customized small mechanical slide that uses a compact motor with a connector-sized motor controller, and (4) the use of a laptop computer system. These solutions make the portable LTP able to be packed into two laptop-size cases: one for the computer and one for the rest of the system.

  20. Moving Object Detection in Heterogeneous Conditions in Embedded Systems

    PubMed Central

    Garbo, Alessandro

    2017-01-01

    This paper presents a system for moving object exposure, focusing on pedestrian detection, in external, unfriendly, and heterogeneous environments. The system manipulates and accurately merges information coming from subsequent video frames, making small computational efforts in each single frame. Its main characterizing feature is to combine several well-known movement detection and tracking techniques, and to orchestrate them in a smart way to obtain good results in diversified scenarios. It uses dynamically adjusted thresholds to characterize different regions of interest, and it also adopts techniques to efficiently track movements, and detect and correct false positives. Accuracy and reliability mainly depend on the overall recipe, i.e., on how the software system is designed and implemented, on how the different algorithmic phases communicate information and collaborate with each other, and on how concurrency is organized. The application is specifically designed to work with inexpensive hardware devices, such as off-the-shelf video cameras and small embedded computational units, eventually forming an intelligent urban grid. As a matter of fact, the major contribution of the paper is the presentation of a tool for real-time applications in embedded devices with finite computational (time and memory) resources. We present experimental results on several video sequences (both home-made and publicly available), showing the robustness and accuracy of the overall detection strategy. Comparisons with state-of-the-art strategies show that our application has similar tracking accuracy but much higher frame-per-second rates. PMID:28671582

  1. The Unknown Oldowan: ~1.7-Million-Year-Old Standardized Obsidian Small Tools from Garba IV, Melka Kunture, Ethiopia

    PubMed Central

    2015-01-01

    The Oldowan Industrial Complex has long been thought to have been static, with limited internal variability, embracing techno-complexes essentially focused on small-to-medium flake production. The flakes were rarely modified by retouch to produce small tools, which do not show any standardized pattern. Usually, the manufacture of small standardized tools has been interpreted as a more complex behavior emerging with the Acheulean technology. Here we report on the ~1.7 Ma Oldowan assemblages from Garba IVE-F at Melka Kunture in the Ethiopian highland. This industry is structured by technical criteria shared by the other East African Oldowan assemblages. However, there is also evidence of a specific technical process never recorded before, i.e. the systematic production of standardized small pointed tools strictly linked to the obsidian exploitation. Standardization and raw material selection in the manufacture of small tools disappear at Melka Kunture during the Lower Pleistocene Acheulean. This proves that 1) the emergence of a certain degree of standardization in tool-kits does not reflect in itself a major step in cultural evolution; and that 2) the Oldowan knappers, when driven by functional needs and supported by a highly suitable raw material, were occasionally able to develop specific technical solutions. The small tool production at ~1.7 Ma, at a time when the Acheulean was already emerging elsewhere in East Africa, adds to the growing amount of evidence of Oldowan techno-economic variability and flexibility, further challenging the view that early stone knapping was static over hundreds of thousands of years. PMID:26690569

  2. The Unknown Oldowan: ~1.7-Million-Year-Old Standardized Obsidian Small Tools from Garba IV, Melka Kunture, Ethiopia.

    PubMed

    Gallotti, Rosalia; Mussi, Margherita

    2015-01-01

    The Oldowan Industrial Complex has long been thought to have been static, with limited internal variability, embracing techno-complexes essentially focused on small-to-medium flake production. The flakes were rarely modified by retouch to produce small tools, which do not show any standardized pattern. Usually, the manufacture of small standardized tools has been interpreted as a more complex behavior emerging with the Acheulean technology. Here we report on the ~1.7 Ma Oldowan assemblages from Garba IVE-F at Melka Kunture in the Ethiopian highland. This industry is structured by technical criteria shared by the other East African Oldowan assemblages. However, there is also evidence of a specific technical process never recorded before, i.e. the systematic production of standardized small pointed tools strictly linked to the obsidian exploitation. Standardization and raw material selection in the manufacture of small tools disappear at Melka Kunture during the Lower Pleistocene Acheulean. This proves that 1) the emergence of a certain degree of standardization in tool-kits does not reflect in itself a major step in cultural evolution; and that 2) the Oldowan knappers, when driven by functional needs and supported by a highly suitable raw material, were occasionally able to develop specific technical solutions. The small tool production at ~1.7 Ma, at a time when the Acheulean was already emerging elsewhere in East Africa, adds to the growing amount of evidence of Oldowan techno-economic variability and flexibility, further challenging the view that early stone knapping was static over hundreds of thousands of years.

  3. Phosphodiester models for cleavage of nucleic acids

    PubMed Central

    2018-01-01

    Nucleic acids that store and transfer biological information are polymeric diesters of phosphoric acid. Cleavage of the phosphodiester linkages by protein enzymes, nucleases, is one of the underlying biological processes. The remarkable catalytic efficiency of nucleases, together with the ability of ribonucleic acids to serve sometimes as nucleases, has made the cleavage of phosphodiesters a subject of intensive mechanistic studies. In addition to studies of nucleases by pH-rate dependency, X-ray crystallography, amino acid/nucleotide substitution and computational approaches, experimental and theoretical studies with small molecular model compounds still play a role. With small molecules, the importance of various elementary processes, such as proton transfer and metal ion binding, for stabilization of transition states may be elucidated, and systematic variation of the basicity of the entering or departing nucleophile enables determination of the position of the transition state on the reaction coordinate. Such data are important in analyzing enzyme mechanisms based on the synergistic participation of several catalytic entities. Many nucleases are metalloenzymes, and small molecular models offer an excellent tool to construct models for their catalytic centers. The present review intends to be an up-to-date summary of what has been achieved by mechanistic studies with small molecular phosphodiesters. PMID:29719577

  4. Development of pharmacophore models for small molecules targeting RNA: Application to the RNA repeat expansion in myotonic dystrophy type 1.

    PubMed

    Angelbello, Alicia J; González, Àlex L; Rzuczek, Suzanne G; Disney, Matthew D

    2016-12-01

    RNA is an important drug target, but current approaches to identify bioactive small molecules have been engineered primarily for protein targets. Moreover, the identification of small molecules that bind a specific RNA target with sufficient potency remains a challenge. Computer-aided drug design (CADD) and, in particular, ligand-based drug design provide a myriad of tools to identify rapidly new chemical entities for modulating a target based on previous knowledge of active compounds without relying on a ligand complex. Herein we describe pharmacophore virtual screening based on previously reported active molecules that target the toxic RNA that causes myotonic dystrophy type 1 (DM1). DM1-associated defects are caused by sequestration of muscleblind-like 1 protein (MBNL1), an alternative splicing regulator, by expanded CUG repeats (r(CUG)^exp). Several small molecules have been found to disrupt the MBNL1-r(CUG)^exp complex, ameliorating DM1 defects. Our pharmacophore model identified a number of potential lead compounds from which we selected 11 compounds to evaluate. Of the 11 compounds, several improved DM1 defects both in vitro and in cells. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Radio Frequency Mass Gauging of Propellants

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Vaden, Karl R.; Herlacher, Michael D.; Buchanan, David A.; VanDresar, Neil T.

    2007-01-01

    A combined experimental and computer simulation effort was conducted to measure radio frequency (RF) tank resonance modes in a dewar partially filled with liquid oxygen, and compare the measurements with numerical simulations. The goal of the effort was to demonstrate that computer simulations of a tank's electromagnetic eigenmodes can be used to accurately predict ground-based measurements, thereby providing a computational tool for predicting tank modes in a low-gravity environment. Matching the measured resonant frequencies of several tank modes with computer simulations can be used to gauge the amount of liquid in a tank, thus providing a possible method to gauge cryogenic propellant tanks in low-gravity. Using a handheld RF spectrum analyzer and a small antenna in a 46 liter capacity dewar for experimental measurements, we have verified that the four lowest transverse magnetic eigenmodes can be accurately predicted as a function of liquid oxygen fill level using computer simulations. The input to the computer simulations consisted of tank dimensions, and the dielectric constant of the fluid. Without using any adjustable parameters, the calculated and measured frequencies agree such that the liquid oxygen fill level was gauged to within 2 percent full scale uncertainty. These results demonstrate the utility of using electromagnetic simulations to form the basis of an RF mass gauging technology with the power to simulate tank resonance frequencies from arbitrary fluid configurations.
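
    For the idealized case of a closed cylindrical cavity uniformly filled with a dielectric of relative permittivity ε_r, the transverse magnetic resonances have the closed form below, where x_mn is the n-th zero of the Bessel function J_m, R the radius, and L the length. A partially filled tank breaks this uniformity, which is why the paper relies on numerical eigenmode simulation instead.

    $$ f_{mnp} = \frac{c}{2\pi\sqrt{\varepsilon_r}}\;\sqrt{\left(\frac{x_{mn}}{R}\right)^{2} + \left(\frac{p\pi}{L}\right)^{2}} $$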

  6. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  7. Development and validation of a new dynamic computer-controlled model of the human stomach and small intestine.

    PubMed

    Guerra, Aurélie; Denis, Sylvain; le Goff, Olivier; Sicardi, Vincent; François, Olivier; Yao, Anne-Françoise; Garrait, Ghislain; Manzi, Aimé Pacifique; Beyssac, Eric; Alric, Monique; Blanquet-Diot, Stéphanie

    2016-06-01

    For ethical, regulatory, and economic reasons, in vitro human digestion models are increasingly used as an alternative to in vivo assays. This study aims to present the new Engineered Stomach and small INtestine (ESIN) model and its validation for pharmaceutical applications. This dynamic computer-controlled system reproduces, according to in vivo data, the complex physiology of the human stomach and small intestine, including pH, transit times, chyme mixing, digestive secretions, and passive absorption of digestion products. Its innovative design allows a progressive meal intake and the differential gastric emptying of solids and liquids. The pharmaceutical behavior of two model drugs (paracetamol immediate release form and theophylline sustained release tablet) was studied in ESIN during liquid digestion. The results were compared to those found with a classical compendial method (paddle apparatus) and in human volunteers. Paracetamol and theophylline tablets showed similar absorption profiles in ESIN and in healthy subjects. For theophylline, a level A in vitro-in vivo correlation could be established between the results obtained in ESIN and in humans. Interestingly, using a pharmaceutical basket, the swelling and erosion of the theophylline sustained release form was followed during transit throughout ESIN. ESIN emerges as a relevant tool for pharmaceutical studies but once further validated may find many other applications in nutritional, toxicological, and microbiological fields. Biotechnol. Bioeng. 2016;113: 1325-1335. © 2015 Wiley Periodicals, Inc. © 2015 Wiley Periodicals, Inc.

  8. Recent Advances in Conotoxin Classification by Using Machine Learning Methods.

    PubMed

    Dao, Fu-Ying; Yang, Hui; Su, Zhen-Dong; Yang, Wuritu; Wu, Yun; Hui, Ding; Chen, Wei; Tang, Hua; Lin, Hao

    2017-06-25

    Conotoxins are disulfide-rich small peptides that target ion channels and neuronal receptors. Conotoxins have been demonstrated as potent pharmaceuticals in the treatment of a series of diseases, such as Alzheimer's disease, Parkinson's disease, and epilepsy. In addition, conotoxins are also ideal molecular templates for the development of new drug lead compounds and play important roles in neurobiological research as well. Thus, the accurate identification of conotoxin types will provide key clues for biological research and clinical medicine. Generally, conotoxin types are confirmed when their sequence, structure, and function are experimentally validated. However, it is time-consuming and costly to acquire the structure and function information by using biochemical experiments. Therefore, it is important to develop computational tools for efficiently and effectively recognizing conotoxin types based on sequence information. In this work, we reviewed the current progress in computational identification of conotoxins in the following aspects: (i) construction of benchmark datasets; (ii) strategies for extracting sequence features; (iii) feature selection techniques; (iv) machine learning methods for classifying conotoxins; (v) the results obtained by these methods and the published tools; and (vi) future perspectives on conotoxin classification. The paper provides the basis for in-depth study of conotoxins and drug therapy research.
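
    Steps (ii) and (iv) above typically reduce to feature extraction plus a standard classifier. The sketch below is a generic illustration with hypothetical toy sequences and labels, not any specific published tool: amino acid composition features feeding a support vector machine.

    ```python
    from collections import Counter
    from sklearn.svm import SVC

    AAS = "ACDEFGHIKLMNPQRSTVWY"

    def aa_composition(seq: str) -> list:
        """Fraction of each of the 20 amino acids in the sequence."""
        counts = Counter(seq)
        return [counts.get(aa, 0) / len(seq) for aa in AAS]

    # Toy sequences and superfamily labels (hypothetical data, for illustration only).
    train_seqs = ["GCCSDPRCAWRC", "CKGKGAKCSRLMYDCCTGSCRSGKC"]
    train_labels = ["alpha", "omega"]
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit([aa_composition(s) for s in train_seqs], train_labels)
    print(clf.predict([aa_composition("GCCSDPRCNYDHPEIC")]))
    ```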

  9. Globus Online: Climate Data Management for Small Teams

    NASA Astrophysics Data System (ADS)

    Ananthakrishnan, R.; Foster, I.

    2013-12-01

    Large and highly distributed climate data demand new approaches to data organization and lifecycle management. We need, in particular, catalogs that allow researchers to track the location and properties of large numbers of data files, and management tools that allow researchers to update data properties and organization during their research, move data among different locations, and invoke analysis computations on data--all as easily as if they were working with small numbers of files on their desktop computer. Both catalogs and management tools often need to be able to scale to extremely large quantities of data. When developing solutions to these problems, it is important to distinguish between the needs of (a) large communities, for whom the ability to organize published data is crucial (e.g., by implementing formal data publication processes, assigning DOIs, recording definitive metadata, providing for versioning), and (b) individual researchers and small teams, who are more frequently concerned with tracking the diverse data and computations involved in their highly dynamic and iterative research processes. Key requirements in the latter case include automated data registration and metadata extraction, ease of update, close-to-zero management overheads (e.g., no local software install), and flexible, user-managed sharing support, allowing read and write privileges within small groups. We describe here how new capabilities provided by the Globus Online system address the needs of the latter group of climate scientists, providing for the rapid creation and establishment of lightweight individual- or team-specific catalogs; the definition of logical groupings of data elements, called datasets; the evolution of catalogs, dataset definitions, and associated metadata over time, to track changes in data properties and organization as a result of research processes; and the manipulation of data referenced by catalog entries (e.g., replication of a dataset to a remote location for analysis, sharing of a dataset). Its software-as-a-service ('SaaS') architecture means that these capabilities are provided to users over the network, without a need for local software installation. In addition, Globus Online provides well-defined APIs, thus providing a platform that can be leveraged to integrate these capabilities with other portals and applications. We describe early applications of these new Globus Online capabilities to climate science. We focus in particular on applications that demonstrate how Globus Online capabilities complement those of the Earth System Grid Federation (ESGF), the premier system for publication and discovery of large community datasets. ESGF already uses Globus Online mechanisms for data download. We demonstrate methods by which the two systems can be further integrated and harmonized, so that, for example, data collections produced within a small team can be easily published from Globus Online to ESGF for archival storage and broader access--and a Globus Online catalog can be used to organize an individual view of a subset of data held in ESGF.

  10. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

    Background: Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods: Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results: We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651
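
    The scheduling idea behind the cost model can be expressed in a few lines. The sketch below is a hypothetical stand-in for the authors' fitted model: estimate each comparison's runtime from genome sizes, then submit the longest jobs first so the rented cluster drains with a shorter idle tail.

    ```python
    # Toy cost model and job ordering (illustrative constants, not fitted values).
    def estimated_runtime(size_a: float, size_b: float, k: float = 1e-9) -> float:
        """Runtime proxy that grows with the product of the two genome sizes."""
        return k * size_a * size_b

    jobs = [("gA", "gB", 4e6, 12e6), ("gC", "gD", 2e6, 3e6), ("gE", "gF", 9e6, 8e6)]
    ordered = sorted(jobs, key=lambda j: estimated_runtime(j[2], j[3]), reverse=True)
    for name_a, name_b, *_ in ordered:
        print(f"submit {name_a} vs {name_b}")  # stand-in for an EMR job submission
    ```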

  11. ProjectQ: Compiling quantum programs for various backends

    NASA Astrophysics Data System (ADS)

    Haener, Thomas; Steiger, Damian S.; Troyer, Matthias

    In order to control quantum computers beyond the current generation, a high level quantum programming language and optimizing compilers will be essential. Therefore, we have developed ProjectQ - an open source software framework to facilitate implementing and running quantum algorithms both in software and on actual quantum hardware. Here, we introduce the backends available in ProjectQ. This includes a high-performance simulator and emulator to test and debug quantum algorithms, tools for resource estimation, and interfaces to several small-scale quantum devices. We demonstrate the workings of the framework and show how easily it can be further extended to control upcoming quantum hardware.
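    As a concrete illustration of the framework's programming model, the following minimal program, written in the style of the ProjectQ documentation, allocates one qubit on the default simulator backend, applies a Hadamard gate, and measures it:

    ```python
    from projectq import MainEngine
    from projectq.ops import H, Measure

    eng = MainEngine()            # defaults to the high-performance simulator backend
    qubit = eng.allocate_qubit()  # allocate a single qubit
    H | qubit                     # put it into an equal superposition
    Measure | qubit               # measure in the computational basis
    eng.flush()                   # send the (optimized) circuit to the backend
    print(int(qubit))             # prints 0 or 1 with equal probability
    ```

    Swapping the backend passed to MainEngine is what retargets the same program at a resource estimator or at small-scale quantum hardware.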

  12. Simulation Software

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.

  13. Coprocessors for quantum devices

    NASA Astrophysics Data System (ADS)

    Kay, Alastair

    2018-03-01

    Quantum devices, from simple fixed-function tools to the ultimate goal of a universal quantum computer, will require high-quality, frequent repetition of a small set of core operations, such as the preparation of entangled states. These tasks are perfectly suited to realization by a coprocessor or supplementary instruction set, as is common practice in modern CPUs. In this paper, we present two quintessentially quantum coprocessor functions: production of a Greenberger-Horne-Zeilinger state and implementation of optimal universal (asymmetric) quantum cloning. Both are based on the evolution of a fixed Hamiltonian. We introduce a technique for deriving the parameters of these Hamiltonians based on the numerical integration of Toda-like flows.

  14. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

    HiMAP is a three-level parallel middleware that can be interfaced to a large-scale global design environment for code-independent, multidisciplinary analysis using high-fidelity equations. Aerospace technology needs are rapidly changing, and computational tools compatible with the requirements of national programs, such as space transportation, are needed. Conventional computational tools are inadequate for modern aerospace design needs; advanced, modular tools are required, such as those that incorporate the technology of massively parallel processors (MPP).

  15. Computational fluid dynamics: An engineering tool?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D., Jr.

    1982-06-01

    Computational fluid dynamics in general, and time dependent finite difference techniques in particular, are examined from the point of view of direct engineering applications. Examples are given of the supersonic blunt body problem and gasdynamic laser calculations, where such techniques are clearly engineering tools. In addition, Navier-Stokes calculations of chemical laser flows are discussed as an example of a near engineering tool. Finally, calculations of the flowfield in a reciprocating internal combustion engine are offered as a promising future engineering application of computational fluid dynamics.

  16. Interpolation Environment of Tensor Mathematics at the Corpuscular Stage of Computational Experiments in Hydromechanics

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Degtyarev, Alexander; Khramushin, Vasily; Shichkina, Yulia

    2018-02-01

    The stages of direct computational experiments in hydromechanics based on tensor mathematics tools are represented by conditionally independent mathematical models that separate the calculations according to the physical processes involved. The continual stage of numerical modeling is constructed on a small time interval in a stationary grid space; here, the continuity conditions and energy conservation are coordinated. Then, at the subsequent corpuscular stage of the computational experiment, the kinematic parameters of mass centers and the surface stresses at the boundaries of the grid cells are used to model the free unsteady motions of volume cells, which are treated as independent particles. These particles can be subject to vortex and discontinuous interactions when restructuring of free boundaries and internal rheological states takes place. The transition from one stage to another is provided by the interpolation operations of tensor mathematics. This interpolation environment formalizes the use of physical laws for modeling the mechanics of continuous media and provides control of the rheological state and of the conditions for the existence of discontinuous solutions: rigid and free boundaries, vortex layers, and their turbulent or empirical generalizations.

  17. Raspberry Pi in-situ network monitoring system of groundwater flow and temperature integrated with OpenGeoSys

    NASA Astrophysics Data System (ADS)

    Park, Chan-Hee; Lee, Cholwoo

    2016-04-01

    The Raspberry Pi series consists of low-cost computers, smaller than a credit card, to which various operating systems such as Linux and, recently, even Windows 10 have been ported. Thanks to mass production and rapid technology development, the price of the many sensors that can be attached to a Raspberry Pi has been dropping at an increasing speed. The device is therefore an economical choice as a small portable computer for monitoring temporal hydrogeological data in the field. In this study, we present a Raspberry Pi system that measures the flow rate and temperature of groundwater at field sites, stores the measurements in a MySQL database, and produces interactive figures and tables, such as Google Charts online or Bokeh plots offline, for further monitoring and analysis. Since all the data can be monitored over the internet, any computer or mobile device serves as a convenient monitoring tool. The measured data are further integrated with OpenGeoSys, a hydrogeological model that has also been ported to the Raspberry Pi series. This enables onsite hydrogeological modeling fed by temporal sensor data to meet various needs.
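    A minimal sketch of such a logging loop is shown below. This is an illustration only: the sensor-reading functions, credentials, and table layout are hypothetical stand-ins, not details taken from the study.

    ```python
    # Illustrative Raspberry Pi logging loop: read sensors, store to MySQL.
    import time
    import mysql.connector  # assumes the MySQL Connector/Python package is installed

    def read_flow_rate():
        return 0.42   # stand-in for a real flow-sensor driver attached to the Pi

    def read_temperature():
        return 12.7   # stand-in for a real temperature-sensor driver

    conn = mysql.connector.connect(host="localhost", user="pi",
                                   password="secret", database="groundwater")
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS readings (
                     ts TIMESTAMP, flow_lpm DOUBLE, temp_c DOUBLE)""")
    while True:
        cur.execute("INSERT INTO readings VALUES (NOW(), %s, %s)",
                    (read_flow_rate(), read_temperature()))
        conn.commit()
        time.sleep(60)  # one sample per minute
    ```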

  18. 2013 R&D 100 Award: ‘Miniapps’ Bolster High Performance Computing

    ScienceCinema

    Belak, Jim; Richards, David

    2018-06-12

    Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped-down surrogates for complex, full-scale applications that can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. Each miniapp is a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making it more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances that the full-scale application will perform successfully. These miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.

  19. On the isotropic Raman spectrum of Ar2 and how to benchmark ab initio calculations of small atomic clusters: Paradox lost.

    PubMed

    Chrysos, Michael; Dixneuf, Sophie; Rachet, Florent

    2015-07-14

    This is the long-overdue answer to the discrepancies observed between theory and experiment in Ar2 regarding both the isotropic Raman spectrum and the second refractivity virial coefficient, BR [Gaye et al., Phys. Rev. A 55, 3484 (1997)]. At the origin of this progress is the advent (after 1997) of advanced computational methods for weakly interconnected neutral species at close separations. Here, we report agreement between the previously recorded Raman measurements and quantum lineshapes now computed with the use of large-scale CCSD or smartly constructed MP2 induced-polarizability data. By using these measurements as a benchmark tool, we assess the degree of performance of various other ab initio computed data for the mean polarizability α, and we show that an excellent agreement with the most recently measured value of BR is reached. We propose an even more refined model for α, which is a solution of the inverse-scattering problem and whose lineshape matches exactly the measured spectrum over the entire frequency-shift range probed.

  20. Adjoint-based sensitivity analysis of low-order thermoacoustic networks using a wave-based approach

    NASA Astrophysics Data System (ADS)

    Aguilar, José G.; Magri, Luca; Juniper, Matthew P.

    2017-07-01

    Strict pollutant emission regulations are pushing gas turbine manufacturers to develop devices that operate in lean conditions, with the downside that combustion instabilities are more likely to occur. Methods to predict and control unstable modes inside combustion chambers have been developed in the last decades but, in some cases, they are computationally expensive. Sensitivity analysis aided by adjoint methods provides valuable sensitivity information at a low computational cost. This paper introduces adjoint methods and their application in wave-based low order network models, which are used as industrial tools, to predict and control thermoacoustic oscillations. Two thermoacoustic models of interest are analyzed. First, in the zero Mach number limit, a nonlinear eigenvalue problem is derived, and continuous and discrete adjoint methods are used to obtain the sensitivities of the system to small modifications. Sensitivities to base-state modification and feedback devices are presented. Second, a more general case with non-zero Mach number, a moving flame front and choked outlet, is presented. The influence of the entropy waves on the computed sensitivities is shown.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinuesa, Ricardo; Fick, Lambert; Negi, Prabal

    In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 x 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows one to compute mean-velocity components, the Reynolds-stress tensor, as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox supports turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
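    The statistics listed above follow from the Reynolds decomposition u = <u> + u'. The numpy sketch below (an illustration under assumed array shapes, not the Nek5000 toolbox itself) averages over time and one homogeneous direction and then forms the Reynolds-stress tensor and the TKE:

    ```python
    import numpy as np

    # Velocity samples: shape (n_time, n_z, n_y, n_x, 3) for components (u, v, w),
    # with z taken as the homogeneous (streamwise) direction.
    vel = np.random.rand(100, 16, 32, 32, 3)

    mean = vel.mean(axis=(0, 1))                 # average over time and z
    fluct = vel - mean                           # fluctuating part u'
    # Reynolds-stress tensor <u_i' u_j'>, shape (n_y, n_x, 3, 3)
    R = np.einsum("tzyxi,tzyxj->yxij", fluct, fluct) / (vel.shape[0] * vel.shape[1])
    tke = 0.5 * np.trace(R, axis1=-2, axis2=-1)  # turbulent kinetic energy per point
    ```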

  2. Trajectory Tracking of a Planer Parallel Manipulator by Using Computed Force Control Method

    NASA Astrophysics Data System (ADS)

    Bayram, Atilla

    2017-03-01

    Despite their small workspace, parallel manipulators have some advantages over their serial counterparts in terms of speed, acceleration, rigidity, accuracy, manufacturing cost and payload. Accordingly, this type of manipulator can be used in many applications, such as high-speed machine tools, tuning machines for feeding, sensitive cutting, assembly and packaging. This paper presents a special type of planar parallel manipulator with three degrees of freedom. It is constructed as a variable geometry truss, generally known as a planar Stewart platform. The reachable and orientation workspaces are obtained for this manipulator. The inverse kinematic analysis is solved for trajectory tracking subject to redundancy and joint limit avoidance. Then, the dynamics model of the manipulator is established by using the Virtual Work method. Simulations are performed in which the given planar trajectories are followed by using the dynamic equations of the variable geometry truss manipulator and the computed force control method. In the computed force control method, the feedback gain matrices for PD control are tuned either as fixed matrices by trial and error or as variable ones by means of optimization with a genetic algorithm.
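    The computed force (computed torque) law referred to above has the generic form tau = M(q)(qdd_des + Kp e + Kd ed) + C(q, qd) qd + G(q). The sketch below is a generic illustration, with M, C and G standing in for a manipulator dynamics model rather than the paper's actual equations:

    ```python
    import numpy as np

    def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, G, Kp, Kd):
        """tau = M(q) (qdd_des + Kp e + Kd ed) + C(q, qd) qd + G(q)."""
        e = q_des - q                      # joint position error
        ed = qd_des - qd                   # joint velocity error
        v = qdd_des + Kp @ e + Kd @ ed     # PD-corrected reference acceleration
        return M(q) @ v + C(q, qd) @ qd + G(q)

    # Toy 2-DOF usage with placeholder dynamics terms:
    M = lambda q: np.eye(2)
    C = lambda q, qd: np.zeros((2, 2))
    G = lambda q: np.zeros(2)
    Kp, Kd = 100.0 * np.eye(2), 20.0 * np.eye(2)
    tau = computed_torque(np.zeros(2), np.zeros(2),
                          np.array([0.1, -0.2]), np.zeros(2), np.zeros(2),
                          M, C, G, Kp, Kd)
    ```

    Tuning Kp and Kd, whether by trial and error or by a genetic algorithm as in the paper, amounts to choosing these two gain matrices.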

  3. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50^3 domain.

  4. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease the move to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  5. Automatic Differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different AD tools to two medium-size structural analysis problems to generate the sensitivity information typically necessary in an optimization or design situation. The paper concludes with the observation that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available.
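    The forward mode the paper discusses can be demonstrated with dual numbers, which propagate a value and its derivative together through the chain rule. The sketch below is a minimal illustration, not one of the tools assessed in the paper:

    ```python
    class Dual:
        """Number carrying a value and a derivative for forward-mode AD."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (fg)' = f'g + fg'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

    x = Dual(4.0, 1.0)                 # seed the input derivative with 1
    print(f(x).val, f(x).der)          # 57.0 26.0
    ```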

  6. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  7. An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.

    ERIC Educational Resources Information Center

    Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.

    1999-01-01

    Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)

  8. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang; Flapper, Joris; Ke, Jing

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  9. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    ERIC Educational Resources Information Center

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  10. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools into the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  11. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  12. Decision support system development at the Upper Midwest Environmental Sciences Center

    USGS Publications Warehouse

    Fox, Timothy J.; Nelson, J. C.; Rohweder, Jason J.

    2014-01-01

    A Decision Support System (DSS) can be defined in many ways. The working definition used by the U.S. Geological Survey Upper Midwest Environmental Sciences Center (UMESC) is, “A spatially based computer application or data that assists a researcher or manager in making decisions.” This is quite a broad definition—and it needs to be, because the possibilities for types of DSSs are limited only by the user group and the developer’s imagination. There is no one DSS; the types of DSSs are as diverse as the problems they help solve. This diversity requires that DSSs be built in a variety of ways, using the most appropriate methods and tools for the individual application. The skills of potential DSS users vary widely as well, further necessitating multiple approaches to DSS development. Some small, highly trained user groups may want a powerful modeling tool with extensive functionality at the expense of ease of use. Other user groups less familiar with geographic information system (GIS) and spatial data may want an easy-to-use application for a nontechnical audience. UMESC has been developing DSSs for almost 20 years. Our DSS developers offer our partners a wide variety of technical skills and development options, ranging from the most simple Web page or small application to complex modeling application development.

  13. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  14. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (JUN 1995...

  15. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (FEB 2014...

  16. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...

  17. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAY 2013...

  18. Computational Investigations of Noise Suppression in Subsonic Round Jets

    NASA Technical Reports Server (NTRS)

    Pruett, C. David

    1997-01-01

    NASA Grant NAG1-1802, originally submitted in June 1996 as a two-year proposal, was awarded one-year's funding by NASA LaRC for the period 5 Oct., 1996, through 4 Oct., 1997. Because of the unavailability (from IT at NASA ARC) of sufficient supercomputer time in fiscal 1998 to complete the computational goals of the second year of the original proposal (estimated to be at least 400 Cray C-90 CPU hours), those goals have been appropriately amended, and a new proposal has been submitted to LaRC as a follow-on to NAG1-1802. The current report documents the activities and accomplishments on NAG1-1802 during the one-year period from 5 Oct., 1996, through 4 Oct., 1997. NASA Grant NAG1-1802, and its predecessor, NAG1-1772, have been directed toward adapting the numerical tool of Large-Eddy Simulation (LES) to aeroacoustic applications, with particular focus on noise suppression in subsonic round jets. In LES, the filtered Navier-Stokes equations are solved numerically on a relatively coarse computational grid. Residual stresses, generated by scales of motion too small to be resolved on the coarse grid, are modeled. Although most LES incorporate spatial filtering, time-domain filtering affords certain conceptual and computational advantages, particularly for aeroacoustic applications. Consequently, this work has focused on the development of SubGrid-Scale (SGS) models that incorporate time-domain filters. The author is unaware of any previous attempt at purely time-filtered LES; however, Aldama and Dakhoul and Bedford have considered approaches that combine both spatial and temporal filtering. In our view, filtering in both space and time is redundant, because removal of high frequencies effects the removal of small spatial scales and vice versa.
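    The flavor of time-domain filtering can be conveyed by a causal exponential filter, whose discrete form is u_bar[n] = alpha*u[n] + (1 - alpha)*u_bar[n-1] with alpha = dt/Delta for a filter width Delta much larger than the time step. The sketch below is illustrative only and is not the SGS modeling developed under the grant:

    ```python
    import numpy as np

    def time_filter(u, dt, delta):
        """Causal exponential time filter with width delta (delta >> dt)."""
        alpha = dt / delta
        u_bar = np.empty_like(u)
        u_bar[0] = u[0]
        for n in range(1, len(u)):
            u_bar[n] = alpha * u[n] + (1.0 - alpha) * u_bar[n - 1]
        return u_bar

    t = np.linspace(0.0, 1.0, 1001)
    signal = np.sin(2 * np.pi * 5 * t) + 0.2 * np.sin(2 * np.pi * 200 * t)
    smoothed = time_filter(signal, dt=t[1] - t[0], delta=0.02)  # damps the 200 Hz part
    ```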

  19. Crossing the chasm: how to develop weather and climate models for next generation computers?

    NASA Astrophysics Data System (ADS)

    Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon

    2018-05-01

    Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.

  20. C-mii: a tool for plant miRNA and target identification.

    PubMed

    Numnark, Somrak; Mhuantong, Wuttichai; Ingsriswang, Supawadee; Wichadakul, Duangdao

    2012-01-01

    MicroRNAs (miRNAs) have been known to play an important role in several biological processes in both animals and plants. Although several tools for miRNA and target identification are available, the number of tools tailored towards plants is limited, and those that are available have specific functionality, lack graphical user interfaces, and restrict the number of input sequences. Large-scale computational identifications of miRNAs and/or targets of several plants have also been reported. Their methods, however, are only described as flow diagrams, which require programming skills and an understanding of the input and output of the connected programs to reproduce. To overcome these limitations and programming complexities, we proposed C-mii as a ready-made software package for both plant miRNA and target identification. C-mii was designed and implemented based on established computational steps and criteria derived from previous literature with the following distinguishing features. First, the software is easy to install with all-in-one programs and packaged databases. Second, it comes with graphical user interfaces (GUIs) for ease of use. Users can identify plant miRNAs and targets via step-by-step execution, explore the detailed results from each step, filter the results according to proposed constraints in plant miRNA and target biogenesis, and export sequences and structures of interest. Third, it supplies bird's eye views of the identification results with infographics and grouping information. Fourth, in terms of functionality, it extends the standard computational steps of miRNA target identification with miRNA-target folding and GO annotation. Fifth, it provides helper functions for the update of pre-installed databases and automatic recovery. Finally, it supports multi-project and multi-thread management. C-mii constitutes the first complete software package with graphical user interfaces enabling computational identification of both plant miRNA genes and miRNA targets. With the provided functionalities, it can help accelerate the study of plant miRNAs and targets, especially for small and medium plant molecular labs without bioinformaticians. C-mii is freely available at http://www.biotec.or.th/isl/c-mii for both Windows and Ubuntu Linux platforms.

  1. C-mii: a tool for plant miRNA and target identification

    PubMed Central

    2012-01-01

    Background MicroRNAs (miRNAs) have been known to play an important role in several biological processes in both animals and plants. Although several tools for miRNA and target identification are available, the number of tools tailored towards plants is limited, and those that are available have specific functionality, lack graphical user interfaces, and restrict the number of input sequences. Large-scale computational identifications of miRNAs and/or targets of several plants have also been reported. Their methods, however, are only described as flow diagrams, which require programming skills and an understanding of the input and output of the connected programs to reproduce. Results To overcome these limitations and programming complexities, we proposed C-mii as a ready-made software package for both plant miRNA and target identification. C-mii was designed and implemented based on established computational steps and criteria derived from previous literature with the following distinguishing features. First, the software is easy to install with all-in-one programs and packaged databases. Second, it comes with graphical user interfaces (GUIs) for ease of use. Users can identify plant miRNAs and targets via step-by-step execution, explore the detailed results from each step, filter the results according to proposed constraints in plant miRNA and target biogenesis, and export sequences and structures of interest. Third, it supplies bird's eye views of the identification results with infographics and grouping information. Fourth, in terms of functionality, it extends the standard computational steps of miRNA target identification with miRNA-target folding and GO annotation. Fifth, it provides helper functions for the update of pre-installed databases and automatic recovery. Finally, it supports multi-project and multi-thread management. Conclusions C-mii constitutes the first complete software package with graphical user interfaces enabling computational identification of both plant miRNA genes and miRNA targets. With the provided functionalities, it can help accelerate the study of plant miRNAs and targets, especially for small and medium plant molecular labs without bioinformaticians. C-mii is freely available at http://www.biotec.or.th/isl/c-mii for both Windows and Ubuntu Linux platforms. PMID:23281648

  2. Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.

    2000-01-01

    The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many sub-domains called blocks, and solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computations. In environments with computers of different architecture, operating systems, CPU speed, memory size, load, and network speed, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computer environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application will be discussed. Also, the developed algorithms were combined with the load-sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results will be presented on running the NASA-based code ADPAC to demonstrate the developed tools for dynamic load balancing.
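    The core idea, distributing blocks across processors of unequal speed so that estimated completion times stay balanced, can be sketched as follows. The greedy heuristic and the numbers are illustrative, not the tools described in the paper:

    ```python
    def assign_blocks(block_costs, proc_speeds):
        """Place the costliest block on the processor that would finish
        all of its assigned work earliest (cost scaled by 1/speed)."""
        loads = [0.0] * len(proc_speeds)
        placement = [[] for _ in proc_speeds]
        for b, cost in sorted(enumerate(block_costs), key=lambda x: -x[1]):
            p = min(range(len(proc_speeds)),
                    key=lambda i: (loads[i] + cost) / proc_speeds[i])
            loads[p] += cost
            placement[p].append(b)
        return placement

    # Five blocks of varying cost on three processors of varying speed:
    print(assign_blocks([4.0, 8.0, 2.0, 6.0, 5.0], proc_speeds=[1.0, 2.0, 1.5]))
    ```

    A dynamic balancer would rerun such an assignment periodically as measured loads and network conditions change.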

  3. The Data Collector: A Qualitative Research Tool.

    ERIC Educational Resources Information Center

    Handler, Marianne G.; Turner, Sandra V.

    Computer software that is intended to assist the qualitative researcher in the analysis of textual data is relatively new. One such program, the Data Collector, is a HyperCard computer program designed for use on the Macintosh computer. A tool for organizing and analyzing textual data obtained from observations, interviews, surveys, and other…

  4. Integrating Data Base into the Elementary School Science Program.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…

  5. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.

  6. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  7. Accuracy of the adiabatic-impulse approximation for closed and open quantum systems

    NASA Astrophysics Data System (ADS)

    Tomka, Michael; Campos Venuti, Lorenzo; Zanardi, Paolo

    2018-03-01

    We study the adiabatic-impulse approximation (AIA) as a tool to approximate the time evolution of quantum states when driven through a region of small gap. Such small-gap regions are a common situation in adiabatic quantum computing and having reliable approximations is important in this context. The AIA originates from the Kibble-Zurek theory applied to continuous quantum phase transitions. The Kibble-Zurek mechanism was developed to predict the power-law scaling of the defect density across a continuous quantum phase transition. Instead, here we quantify the accuracy of the AIA via the trace norm distance with respect to the exact evolved state. As expected, we find that for short times or fast protocols, the AIA outperforms the simple adiabatic approximation. However, for large times or slow protocols, the situation is actually reversed and the AIA provides a worse approximation. Nevertheless, we found a variation of the AIA that can perform better than the adiabatic one. This counterintuitive modification consists in crossing the region of small gap twice. Our findings are illustrated by several examples of driven closed and open quantum systems.
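    The accuracy measure used above, the trace-norm distance D(ρ, σ) = ½‖ρ − σ‖₁, can be computed for Hermitian density matrices from the eigenvalues of their difference. A short numpy sketch:

    ```python
    import numpy as np

    def trace_distance(rho, sigma):
        """0.5 * sum of absolute eigenvalues of the Hermitian difference."""
        eigs = np.linalg.eigvalsh(rho - sigma)
        return 0.5 * np.sum(np.abs(eigs))

    rho = np.array([[1.0, 0.0], [0.0, 0.0]])    # |0><0|
    sigma = np.array([[0.5, 0.5], [0.5, 0.5]])  # |+><+|
    print(trace_distance(rho, sigma))           # ~0.7071
    ```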

  8. Small Microprocessor for ASIC or FPGA Implementation

    NASA Technical Reports Server (NTRS)

    Kleyner, Igor; Katz, Richard; Blair-Smith, Hugh

    2011-01-01

    A small microprocessor, suitable for use in applications in which high reliability is required, was designed to be implemented in either an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA). The design is based on commercial microprocessor architecture, making it possible to use available software development tools and thereby to implement the microprocessor at relatively low cost. The design features enhancements, including trapping during execution of illegal instructions. The internal structure of the design yields relatively high performance, with a significant decrease, relative to other microprocessors that perform the same functions, in the number of microcycles needed to execute macroinstructions. The problem meant to be solved in designing this microprocessor was to provide a modest level of computational capability in a general-purpose processor while adding as little as possible to the power demand, size, and weight of a system into which the microprocessor would be incorporated. As designed, this microprocessor consumes very little power and occupies only a small portion of a typical modern ASIC or FPGA. The microprocessor operates at a rate of about 4 million instructions per second with clock frequency of 20 MHz.

  9. Integrating computational and chemical biology tools in the discovery of antiangiogenic small molecule ligands of FGF2 derived from endogenous inhibitors

    PubMed Central

    Foglieni, Chiara; Pagano, Katiuscia; Lessi, Marco; Bugatti, Antonella; Moroni, Elisabetta; Pinessi, Denise; Resovi, Andrea; Ribatti, Domenico; Bertini, Sabrina; Ragona, Laura; Bellina, Fabio; Rusnati, Marco; Colombo, Giorgio; Taraboletti, Giulia

    2016-01-01

    The FGFs/FGFRs system is a recognized actionable target for therapeutic approaches aimed at inhibiting tumor growth, angiogenesis, metastasis, and resistance to therapy. We previously identified a non-peptidic compound (SM27) that retains the structural and functional properties of the FGF2-binding sequence of thrombospondin-1 (TSP-1), a major endogenous inhibitor of angiogenesis. Here we identified new small molecule inhibitors of FGF2 based on the initial lead. A similarity-based screening of small molecule libraries, followed by docking calculations and experimental studies, allowed us to select 7 bi-naphthalenic compounds that bound FGF2, inhibiting its binding to both heparan sulfate proteoglycans and FGFR-1. The compounds inhibit FGF2 activity in in vitro and ex vivo models of angiogenesis, with improved potency over SM27. Comparative analysis of the selected hits, complemented by NMR and biochemical analysis of 4 newly synthesized functionalized phenylamino-substituted naphthalenes, allowed us to identify the minimal stereochemical requirements for improving the design of naphthalene sulfonates as FGF2 inhibitors. PMID:27000667

  10. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  11. Comparison of HSPF and PRMS model simulated flows using different temporal and spatial scales in the Black Hills, South Dakota

    USGS Publications Warehouse

    Chalise, D. R.; Haj, Adel E.; Fontaine, T.A.

    2018-01-01

    The Hydrological Simulation Program Fortran (HSPF) [Hydrological Simulation Program Fortran version 12.2 (Computer software). USEPA, Washington, DC] and the Precipitation Runoff Modeling System (PRMS) [Precipitation Runoff Modeling System version 4.0 (Computer software). USGS, Reston, VA] models are semidistributed, deterministic hydrological tools for simulating the impacts of precipitation, land use, and climate on basin hydrology and streamflow. Both models have been applied independently to many watersheds across the United States. This paper reports the statistical results assessing various temporal (daily, monthly, and annual) and spatial (small versus large watershed) scale biases in HSPF and PRMS simulations using two watersheds in the Black Hills, South Dakota. The Nash-Sutcliffe efficiency (NSE), Pearson correlation coefficient (r), and coefficient of determination (R2) statistics for the daily, monthly, and annual flows were used to evaluate the models' performance. Results from the HSPF models showed that HSPF consistently simulated the annual flows for both large and small basins better than the monthly and daily flows, and simulated flows for the small watershed better than flows for the large watershed. In comparison, the PRMS model results show that PRMS simulated the monthly flows for both the large and small watersheds better than the daily and annual flows, and the range of statistical error in the PRMS models was greater than that in the HSPF models. Moreover, it can be concluded that the statistical error in the HSPF and PRMS daily, monthly, and annual flow estimates for watersheds in the Black Hills was influenced by both temporal and spatial scale variability.
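    For reference, the Nash-Sutcliffe efficiency used to score the simulations is NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))², with 1 indicating a perfect fit. A minimal sketch with made-up numbers:

    ```python
    import numpy as np

    def nse(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better than the mean."""
        observed, simulated = np.asarray(observed), np.asarray(simulated)
        return 1.0 - np.sum((observed - simulated) ** 2) / \
                     np.sum((observed - observed.mean()) ** 2)

    print(nse([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # close to 1
    ```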

  12. Analytical prediction with multidimensional computer programs and experimental verification of the performance, at a variety of operating conditions, of two traveling wave tubes with depressed collectors

    NASA Technical Reports Server (NTRS)

    Dayton, J. A., Jr.; Kosmahl, H. G.; Ramins, P.; Stankiewicz, N.

    1979-01-01

    Experimental and analytical results are compared for two high performance, octave bandwidth TWT's that use depressed collectors (MDC's) to improve the efficiency. The computations were carried out with advanced, multidimensional computer programs that are described here in detail. These programs model the electron beam as a series of either disks or rings of charge and follow their multidimensional trajectories from the RF input of the ideal TWT, through the slow wave structure, through the magnetic refocusing system, to their points of impact in the depressed collector. Traveling wave tube performance, collector efficiency, and collector current distribution were computed and the results compared with measurements for a number of TWT-MDC systems. Power conservation and correct accounting of TWT and collector losses were observed. For the TWT's operating at saturation, very good agreement was obtained between the computed and measured collector efficiencies. For a TWT operating 3 and 6 dB below saturation, excellent agreement between computed and measured collector efficiencies was obtained in some cases but only fair agreement in others. However, deviations can largely be explained by small differences in the computed and actual spent beam energy distributions. The analytical tools used here appear to be sufficiently refined to design efficient collectors for this class of TWT. However, for maximum efficiency, some experimental optimization (e.g., collector voltages and aperture sizes) will most likely be required.

  13. A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard.

    PubMed

    Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Vilar-Montesinos, Miguel

    2018-06-02

    Augmented Reality (AR) is one of the key technologies pointed out by Industry 4.0 as a tool for enhancing the next generation of automated and computerized factories. AR can also help shipbuilding operators, since they usually need to interact with information (e.g., product datasheets, instructions, maintenance procedures, quality control forms) that could be handled easily and more efficiently through AR devices. This is the reason why Navantia, one of the 10 largest shipbuilders in the world, is studying the application of AR (among other technologies) in different shipyard environments in a project called "Shipyard 4.0". This article presents Navantia's industrial AR (IAR) architecture, which is based on cloudlets and on the fog computing paradigm. Both technologies are ideal for supporting physically-distributed, low-latency and QoS-aware applications that decrease the network traffic and the computational load of traditional cloud computing systems. The proposed IAR communications architecture is evaluated in real-world scenarios with payload sizes according to demanding Microsoft HoloLens applications and when using a cloud, a cloudlet and a fog computing system. The results show that, in terms of response delay, the fog computing system is the fastest when transferring small payloads (less than 128 KB), while for larger file sizes, the cloudlet solution is faster than the others. Moreover, under high loads (with many concurrent IAR clients), the cloudlet in some cases is more than four times faster than the fog computing system in terms of response delay.

  14. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  15. Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs built with compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs while achieving good performance that exceeds that of some commercial tools.

  16. Computer-Based Indexing on a Small Scale: Bibliography.

    ERIC Educational Resources Information Center

    Douglas, Kimberly; Wismer, Don

    The 131 references on small scale computer-based indexing cited in this bibliography are subdivided as follows: general, general (computer), index structure, microforms, specific systems, KWIC KWAC KWOC, and thesauri. (RAA)

  17. Noninvasive imaging of experimental lung fibrosis.

    PubMed

    Zhou, Yong; Chen, Huaping; Ambalavanan, Namasivayam; Liu, Gang; Antony, Veena B; Ding, Qiang; Nath, Hrudaya; Eary, Janet F; Thannickal, Victor J

    2015-07-01

    Small animal models of lung fibrosis are essential for unraveling the molecular mechanisms underlying human fibrotic lung diseases; additionally, they are useful for preclinical testing of candidate antifibrotic agents. The current end-point measures of experimental lung fibrosis involve labor-intensive histological and biochemical analyses. These measures fail to account for dynamic changes in the disease process in individual animals and are limited by the need for large numbers of animals for longitudinal studies. The emergence of noninvasive imaging technologies provides exciting opportunities to image lung fibrosis in live animals as often as needed and to longitudinally track the efficacy of novel antifibrotic compounds. Data obtained by noninvasive imaging provide complementary information to histological and biochemical measurements. In addition, the use of noninvasive imaging in animal studies reduces animal usage, thus satisfying animal welfare concerns. In this article, we review these new imaging modalities with the potential for evaluation of lung fibrosis in small animal models. Such techniques include micro-computed tomography (micro-CT), magnetic resonance imaging, positron emission tomography (PET), single photon emission computed tomography (SPECT), and multimodal imaging systems including PET/CT and SPECT/CT. It is anticipated that noninvasive imaging will be increasingly used in animal models of fibrosis to gain insights into disease pathogenesis and as preclinical tools to assess drug efficacy.

  18. Integrin-Targeted Hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography for Imaging Tumor Progression and Early Response in Non-Small Cell Lung Cancer.

    PubMed

    Ma, Xiaopeng; Phi Van, Valerie; Kimm, Melanie A; Prakash, Jaya; Kessler, Horst; Kosanke, Katja; Feuchtinger, Annette; Aichler, Michaela; Gupta, Aayush; Rummeny, Ernst J; Eisenblätter, Michel; Siveke, Jens; Walch, Axel K; Braren, Rickmer; Ntziachristos, Vasilis; Wildgruber, Moritz

    2017-01-01

    Integrins play an important role in tumor progression, invasion and metastasis. Therefore we aimed to evaluate a preclinical imaging approach applying αvβ3 integrin targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT) for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC) model. Lewis Lung Carcinomas were grown orthotopically in C57BL/6 J mice and imaged in-vivo using an αvβ3 targeted near-infrared fluorescence (NIRF) probe. αvβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib) for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, the αvβ3-dependent signal decreased significantly compared to non-treated mice already at one week post treatment. αvβ3-targeted imaging might therefore become a promising tool for assessment of early therapy response in the future. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  19. A Novel Algorithm for Determining Contact Area Between a Respirator and a Headform

    PubMed Central

    Lei, Zhipeng; Yang, James; Zhuang, Ziqing

    2016-01-01

    The contact area, as well as the contact pressure, is created when a respiratory protection device (a respirator or surgical mask) contacts a human face. A computer-based algorithm for determining the contact area between a headform and N95 filtering facepiece respirator (FFR) was proposed. Six N95 FFRs were applied to five sizes of standard headforms (large, medium, small, long/narrow, and short/wide) to simulate respirator donning. After the contact simulation between a headform and an N95 FFR was conducted, a contact area was determined by extracting the intersection surfaces of the headform and the N95 FFR. Using computer-aided design tools, a superimposed contact area and an average contact area, which are non-uniform rational basis spline (NURBS) surfaces, were developed for each headform. Experiments that directly measured dimensions of the contact areas between headform prototypes and N95 FFRs were used to validate the simulation results. Headform sizes influenced all contact area dimensions (P < 0.0001), and N95 FFR sizing systems influenced all contact area dimensions (P < 0.05) except the left and right chin regions. The medium headform produced the largest contact area, while the large and small headforms produced the smallest. PMID:24579752

  20. Voss retrieves a small tool from a tool kit in ISS Node 1/Unity

    NASA Image and Video Library

    2001-08-13

    STS105-E-5175 (13 August 2001) --- Astronaut James S. Voss retrieves a small tool from a tool case in the U.S.-built Unity node aboard the International Space Station (ISS). The Expedition Two flight engineer is only days away from returning to Earth following five months aboard the orbital outpost. The image was recorded with a digital still camera.
