Using computer visualizations to help understand how forests change and develop
Brian Orland; Cenk Ursavas
2006-01-01
Probably the first question people ask when they hear about proposed forest management actions to address fire hazard or forest health concerns is "What will the forest look like?" The recent advent of powerful computer visualization tools has provided one means of answering that question. The resultant images can be a powerful tool for communicating the...
A computer tool to support the design of industrial Ethernet.
Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues
2009-04-01
This paper presents a computer tool to support the design and development of an industrial Ethernet network. It verifies the physical layer (cable resistance and capacitance, scan time, and network power supply, including the Power over Ethernet (PoE) concept and wireless links) and the occupation rate (the amount of information transmitted on the network versus the controller's network scan time). These checks are accomplished without a single physical element installed in the network, using simulation alone. The tool's software presents a detailed view of the network to the user, flags possible problems in the network, and offers a highly user-friendly environment.
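The occupation-rate check the abstract describes can be sketched as a one-line calculation; the formula and all numbers below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of an "occupation rate" check: the fraction of one
# controller scan period needed to transmit the cycle's payload. A value
# well below 1.0 suggests the network can keep up with the scan time.

def occupation_rate(payload_bits, bandwidth_bps, scan_time_s):
    """Bits queued per cycle divided by bits the link can carry per cycle."""
    return payload_bits / (bandwidth_bps * scan_time_s)

# Assumed example: 40 devices x 128 bits each, 100 Mbit/s Ethernet, 10 ms scan
rate = occupation_rate(40 * 128, 100e6, 0.010)  # comfortably below 1.0
```

A real tool would add protocol framing overhead and per-device timing, but the core comparison of offered load against scan-time capacity has this shape.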
Computerized power supply analysis: State equation generation and terminal models
NASA Technical Reports Server (NTRS)
Garrett, S. J.
1978-01-01
To aid engineers who design power supply systems, two analysis tools that can be used with the state equation analysis package were developed. These tools include integration routines that start with the description of a power supply in state equation form and yield analytical results. The first tool is a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equations for an electrical network. The state equations developed automatically by this program are used in an algorithm that reduces the number of state variables required to describe the network. In this way a second tool is obtained: the order of the network is reduced and a simpler terminal model results.
Teach Graphic Design Basics with PowerPoint
ERIC Educational Resources Information Center
Lazaros, Edward J.; Spotts, Thomas H.
2007-01-01
While PowerPoint is generally regarded as simply software for creating slide presentations, it includes often overlooked--but powerful--drawing tools. Because it is part of the Microsoft Office package, PowerPoint comes preloaded on many computers and thus is already available in many classrooms. Since most computers are not preloaded with good…
Grid Integration Research | Wind | NREL
Computer-generated simulation of a wind turbine. Wind power plant modeling and simulation: engineers at the National Renewable Energy Laboratory develop the computer-aided engineering tool, FAST, as well as their wind power plant simulation tool, Wind-Plant.
Learning Disabled Students and Computers: A Teacher's Guide Book.
ERIC Educational Resources Information Center
Metzger, Merrianne; And Others
This booklet is provided as a guide to teachers working with learning disabled (LD) students who are interested in using computers as a teaching tool. The computer is presented as a powerful option to enhance educational opportunities for LD children. The author outlines the three main modes in educational computer use (tutor, tool, and tutee) and…
Use of computers in dysmorphology.
Diliberti, J H
1988-01-01
As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly make the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092
Advancing crime scene computer forensics techniques
NASA Astrophysics Data System (ADS)
Hosmer, Chet; Feldman, John; Giordano, Joe
1999-02-01
Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.
NASA Astrophysics Data System (ADS)
Jain, A.
2017-08-01
Computer-based methods can help in the discovery of lead compounds and can eliminate the chemical synthesis and screening of many irrelevant compounds, saving both time and cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing, and storing models of complex molecular structures, and they help interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, together with statistical analysis, is a powerful approach for medicinal chemists seeking to synthesize therapeutically effective drugs with minimal side effects.
Application programs written by using customizing tools of a computer-aided design system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.; Huang, R.; Juricic, D.
1995-12-31
Customizing tools of Computer-Aided Design Systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written using AutoCAD's customizing tools are given in some detail to illustrate their power. One tool uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C. C.
The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities in developing computer-based design and analysis tools. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.
Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit
NASA Technical Reports Server (NTRS)
French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory
2005-01-01
The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development and preliminary test results are positive.
NASA Astrophysics Data System (ADS)
Horodinca, M.
2016-08-01
This paper proposes new results on computer-aided monitoring of transient regimes on machine tools, based on the evolution of the active electrical power absorbed by the electric motor driving the main kinematic chain, together with the evolution of the rotational speed and acceleration of the main shaft. The active power is computed numerically from the instantaneous voltage and current delivered by the electrical power system to the motor. The rotational speed and acceleration of the main shaft are calculated from a sensor signal. Three real-time analog signals are acquired with a very simple computer-assisted setup comprising a voltage transformer, a current transformer, an AC generator used as a rotational speed sensor, a data acquisition system, and a personal computer. Data processing and analysis were done in Matlab. Several transient regimes were investigated, and important conclusions about the advantages of this monitoring technique were drawn. The experimental setup offers further capabilities: supervising the mechanical loading of machine tools during cutting processes, and diagnosing machine-tool condition by frequency-domain analysis of the active electrical power signal.
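The paper's central quantity, active power computed numerically from sampled instantaneous voltage and current, reduces to averaging their sample-by-sample product. A minimal sketch (the waveform parameters below are invented for illustration, not taken from the paper):

```python
# Hedged sketch: active power P = (1/N) * sum(v[n] * i[n]) over sampled
# voltage and current waveforms, as a numerical stand-in for P = V*I*cos(phi).
import math

def active_power(v, i):
    """Mean of the instantaneous power v(t)*i(t) over the samples."""
    return sum(vk * ik for vk, ik in zip(v, i)) / len(v)

# Assumed example: one mains cycle sampled at N points, current lagging by phi
N, Vp, Ip, phi = 1000, 325.0, 10.0, math.pi / 6
v = [Vp * math.sin(2 * math.pi * n / N) for n in range(N)]
c = [Ip * math.sin(2 * math.pi * n / N - phi) for n in range(N)]
P = active_power(v, c)  # close to (Vp * Ip / 2) * cos(phi)
```

During a transient (e.g. spindle start-up), the same average computed over a sliding window would trace the power evolution the paper monitors.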
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
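The kind of loop-nest transformation such a tool automates can be illustrated with a minimal sketch (not S3D's actual code; the array name and sizes are invented): interchanging two loops so the inner loop traverses a row-major array with unit stride.

```python
# Hypothetical illustration of loop interchange, a classic loop-nest
# optimization: both functions compute the same sum over a row-major
# n x n array stored in a flat list, but the second visits elements in
# unit-stride order, the cache-friendly traversal.

def sum_colmajor(a, n):
    # Before interchange: the inner loop jumps by n elements each step
    s = 0.0
    for j in range(n):
        for i in range(n):
            s += a[i * n + j]
    return s

def sum_rowmajor(a, n):
    # After interchange: the inner loop walks consecutive elements
    s = 0.0
    for i in range(n):
        for j in range(n):
            s += a[i * n + j]
    return s

n = 64
a = [float(k) for k in range(n * n)]
s1, s2 = sum_colmajor(a, n), sum_rowmajor(a, n)  # identical results
```

In compiled code operating on real arrays, the interchanged version exploits spatial locality in the cache; Python lists only show the shape of the transformation, not the speedup.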
Water Power Data and Tools | Water Power | NREL
NREL combines computer modeling tools and data with state-of-the-art design and analysis. Resources include the National Wind Technology Center's Information Portal as well as a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools specifically
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
Sub-Second Parallel State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.
This report describes the performance of Pacific Northwest National Laboratory's (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tool. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions and/or to apply automatic or manual corrective control actions; this increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so its robustness can be enhanced by repeating the execution with adaptive adjustments, including removing bad data and/or adjusting different initial conditions, to compute a better estimate within the same time as a traditional state estimator's single estimate. There are other benefits of the sub-second SE: the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements, minimizing the impact of bad measurements, and it provides opportunities to enhance power grid reliability and efficiency.
PSE also can enable other advanced tools that rely on SE outputs and could be used to further improve operators' actions and automated controls to mitigate effects of severe events on the grid. The power grid continues to grow and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will have better performance than traditional, sequential state estimation by utilizing the power of high performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating the increasingly more complex power grid.
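State estimation as benchmarked above is, at its core, a weighted least-squares fit of redundant, noisy measurements to a power system model. A toy single-state linear version (all names and numbers below are invented for illustration, not BPA data) can be sketched as:

```python
# Hedged sketch of the weighted least-squares (WLS) core of state estimation:
# for a linear model z = H*x + e with one state, the estimate is
# x_hat = (H^T W H)^{-1} H^T W z, where W holds inverse measurement variances.

def wls_estimate(h, z, w):
    """Single-state WLS: weighted projection of measurements onto the model."""
    num = sum(hi * wi * zi for hi, zi, wi in zip(h, z, w))
    den = sum(hi * wi * hi for hi, wi in zip(h, w))
    return num / den

h = [1.0, 0.5, 2.0]      # assumed measurement-model coefficients
z = [1.02, 0.49, 2.05]   # noisy measurements of the true state x = 1.0
w = [100.0, 50.0, 100.0] # weights = inverse measurement variances
x_hat = wls_estimate(h, z, w)  # close to 1.0
```

A production estimator solves the same normal equations for thousands of states with a nonlinear AC model iterated to convergence, which is where parallel solvers buy the sub-second execution time.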
The Computer as a Tool for Learning through Reflection. Technical Report No. 376.
ERIC Educational Resources Information Center
Collins, Allan; Brown, John Seely
Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…
The environment power system analysis tool development program
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.
1990-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.
A computer controlled power tool for the servicing of the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Richards, Paul W.; Konkel, Carl; Smith, Chris; Brown, Lee; Wagner, Ken
1996-01-01
The Hubble Space Telescope (HST) Pistol Grip Tool (PGT) is a self-contained, microprocessor-controlled, battery-powered, 3/8-inch-drive hand-held tool. The PGT also functions as a non-powered ratchet wrench. This tool will be used by astronauts during Extravehicular Activity (EVA) to apply torque to the HST and HST Servicing Support Equipment mechanical interfaces and fasteners. Numerous torque, speed, and turn or angle limits are programmed into the PGT for use during various missions. Batteries are replaceable during ground operations, Intravehicular Activities, and EVAs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing, with high-performance graphics interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety analysis studies. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages, and tools have appeared in the last several years.
The Power of Computer-aided Tomography to Investigate Marine Benthic Communities
Utilization of Computer-aided-Tomography (CT) technology is a powerful tool to investigate benthic communities in aquatic systems. In this presentation, we will attempt to summarize our 15 years of experience in developing specific CT methods and applications to marine benthic co...
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
PC Software graphics tool for conceptual design of space/planetary electrical power systems
NASA Technical Reports Server (NTRS)
Truong, Long V.
1995-01-01
This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
Modeling of power electronic systems with EMTP
NASA Technical Reports Server (NTRS)
Tam, Kwa-Sur; Dravid, Narayan V.
1989-01-01
In view of the potential impact of power electronics on power systems, there is a need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components, as well as to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System is described, along with the ElectroMagnetic Transients Program (EMTP), and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.
Computer-Aided Engineering Tools | Water Power | NREL
NREL develops computer-aided engineering tools for water power energy converters that will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and experimentation. Such simulation is critical to accelerating progress in energy programs within the U.S. Department
WINCADRE (COMPUTER-AIDED DATA REVIEW AND EVALUATION)
WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows-based program designed for computer-assisted data validation. WinCADRE is a powerful tool that significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed ...
Using Microsoft PowerPoint as an Astronomical Image Analysis Tool
NASA Astrophysics Data System (ADS)
Beck-Winchatz, Bernhard
2006-12-01
Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
- Variations in the distances to the sun and moon from their angular sizes
- Magnetic declination from images of shadows
- Diameter of the moon from lunar eclipse images
- Sizes of lunar craters
- Orbital radii of the Jovian moons and mass of Jupiter
- Supernova and comet searches
- Expansion rate of the universe from images of distant galaxies
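The first measurement in the list, relating angular size to distance, is simple enough to sketch. The Moon's diameter and the measured angular size below are assumed illustrative values, not data from the abstract:

```python
# Hedged sketch of a distance-from-angular-size estimate using the
# small-angle approximation: theta (radians) ~ diameter / distance.
import math

def distance_from_angular_size(diameter_km, angular_size_deg):
    """Distance implied by a known physical size and a measured angular size."""
    return diameter_km / math.radians(angular_size_deg)

# Moon: diameter ~3474 km, typical measured angular size ~0.518 degrees
d = distance_from_angular_size(3474.0, 0.518)  # roughly the lunar distance
```

In the classroom activity, the angular size itself would come from pixel measurements made on the image inside PowerPoint, calibrated against the image's known field of view.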
Physics Education through Computational Tools: The Case of Geometrical and Physical Optics
ERIC Educational Resources Information Center
Rodríguez, Y.; Santana, A.; Mendoza, L. M.
2013-01-01
Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…
Self-learning computers for surgical planning and prediction of postoperative alignment.
Lafage, Renaud; Pesenti, Sébastien; Lafage, Virginie; Schwab, Frank J
2018-02-01
In past decades, the role of sagittal alignment has been widely demonstrated in the setting of spinal conditions. As several parameters can be affected, identifying the driver of the deformity is the cornerstone of a successful treatment approach. Despite the importance of restoring sagittal alignment for optimizing outcome, this task remains challenging. Self-learning computers and optimized algorithms are of great interest in spine surgery in that they facilitate better planning and prediction of postoperative alignment. Nowadays, computer-assisted tools are part of surgeons' daily practice; however, their use remains time-consuming. NARRATIVE REVIEW AND RESULTS: Computer-assisted methods for the prediction of postoperative alignment consist of a three-step analysis: identification of anatomical landmarks, definition of alignment objectives, and simulation of surgery. Recently, complex rules for the prediction of alignment have been proposed. Even though this kind of work leads to more personalized objectives, the number of parameters involved makes it difficult for clinical use, stressing the importance of developing computer-assisted tools. The evolution of our current technology, including machine learning and other types of advanced algorithms, will provide powerful tools that could be useful in improving surgical outcomes and alignment prediction. These tools can combine different types of advanced technologies, such as image recognition and shape modeling; using these techniques, computer-assisted methods are able to predict spinal shape. The development of powerful computer-assisted methods involves the integration of several sources of information, such as radiographic parameters (X-rays, MRI, CT scans, etc.), demographic information, and unusual non-osseous parameters (muscle quality, proprioception, gait analysis data).
In using a larger set of data, these methods will aim to mimic what is actually done by spine surgeons, leading to real tailor-made solutions. Integrating newer technology can change the current way of planning/simulating surgery. The use of powerful computer-assisted tools that are able to integrate several parameters and learn from experience can change the traditional way of selecting treatment pathways and counseling patients. However, there is still much work to be done to reach a desired level as noted in other orthopedic fields, such as hip surgery. Many of these tools already exist in non-medical fields and their adaptation to spine surgery is of considerable interest.
Savant Genome Browser 2: visualization and analysis for population-scale genomics.
Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael
2012-07-01
High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.
Changing computing paradigms towards power efficiency
Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro
2014-01-01
Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light in the power/energy profile of important applications. PMID:24842033
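The low-/high-precision combination described above is commonly realized as mixed-precision iterative refinement: factor and solve cheaply in low precision, then correct using residuals computed in high precision. A minimal sketch (illustrative only; not the authors' actual kernels or power-profiling tools):

```python
# Mixed-precision iterative refinement for A x = b: the float32 solve is the
# cheap, energy-friendly part; residuals and corrections run in float64.
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve A x = b using a float32 solver refined in float64."""
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                  # residual in high precision
        d = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
        x += d                                         # correct low-precision error
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)) + 50 * np.eye(50)    # well-conditioned test matrix
x_true = rng.standard_normal(50)
b = A @ x_true
x = mixed_precision_solve(A, b)
print(np.max(np.abs(x - x_true)) < 1e-8)               # refined to full precision
```

For well-conditioned systems, each refinement step shrinks the error by roughly the low-precision unit roundoff, so a handful of cheap iterations recovers double-precision accuracy.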
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems accounting for biochemical reactions coupled to contaminant transport is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
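The analytical-versus-numerical tradeoff described above can be illustrated with the simplest biodegradation kinetics, first-order decay dC/dt = -kC. The parameter values below are assumed for illustration; the chapter's models couple reactions to transport and are far richer:

```python
# Analytical screening solution vs. explicit numerical integration for
# first-order contaminant decay, dC/dt = -k C (assumed parameter values).
import math

k, C0, t_end = 0.5, 10.0, 4.0          # 1/day, mg/L, days (illustrative)

# Analytical model: closed form, essentially free to evaluate.
C_analytical = C0 * math.exp(-k * t_end)

# Numerical model: explicit Euler stepping; costs more compute but
# generalizes to kinetics with no closed-form solution.
dt, C = 1e-4, C0
for _ in range(int(t_end / dt)):
    C -= k * C * dt

print(abs(C - C_analytical) < 1e-3)    # the two approaches agree closely
```

The analytical form answers a screening question instantly; the numerical loop is the pattern that scales up, at greater cost, to coupled reactive-transport systems.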
Visualization and Interaction in Research, Teaching, and Scientific Communication
NASA Astrophysics Data System (ADS)
Ammon, C. J.
2017-12-01
Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how a little programming ability can free scientists from the constraints of existing tools and can facilitate the development of a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.
WINCADRE INORGANIC (WINDOWS COMPUTER-AIDED DATA REVIEW AND EVALUATION)
WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows-based program designed for computer-assisted data validation. WinCADRE is a powerful tool that significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed in...
Bhargava, Puneet; Lackey, Amanda E; Dhand, Sabeen; Moshiri, Mariam; Jambhekar, Kedar; Pandey, Tarun
2013-03-01
We are in the midst of an evolving educational revolution. Use of digital devices such as smartphones and tablet computers is rapidly increasing among radiologists, who now regularly use them for medical, technical, and administrative tasks. These devices provide a wide array of new tools to radiologists, allowing for faster, more simplified, and more widespread distribution of educational material. The utility, future potential, and limitations of some of these powerful tools are discussed in this article. Published by Elsevier Inc.
High performance TWT development for the microwave power module
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whaley, D.R.; Armstrong, C.M.; Groshart, G.
1996-12-31
Northrop Grumman's ongoing development of microwave power modules (MPM) provides microwave power at various power levels, frequencies, and bandwidths for a variety of applications. Present-day requirements for the vacuum power booster traveling wave tubes of the microwave power module are becoming increasingly demanding, necessitating further enhancement of tube performance. The MPM development program at Northrop Grumman is designed specifically to meet this need by constructing and testing a series of new tubes aimed at verifying computation and reaching high-efficiency design goals. Tubes under test incorporate several different helix designs, as well as varying electron gun and magnetic confinement configurations. Current efforts also include further development of state-of-the-art TWT modeling and computational methods at Northrop Grumman, incorporating new, more accurate models into existing design tools and developing new tools to be used in all aspects of traveling wave tube design. The current status of the Northrop Grumman MPM TWT development program will be presented.
NASA Technical Reports Server (NTRS)
Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.
1987-01-01
The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how the time and time-averages are treated, as well as on how the space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working on a stairstep hierarchy of mathematical and experimental complexity to build a system of tools, which enable one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.
1987-06-01
to a field of research called Computer-Aided Instruction (CAI). CAI is a powerful methodology for enhancing the overall quality and effectiveness of...provides a very powerful tool for statistical inference, especially when pooling information from different sources is appropriate. Thus, prior...The power of the model lies in its ability to adapt a diagnostic session to the level of knowledge
Bounds on the power of proofs and advice in general physical theories.
Lee, Ciarán M; Hoban, Matty J
2016-06-01
Quantum theory presents us with the tools for computational and communication advantages over classical theory. One approach to uncovering the source of these advantages is to determine how computation and communication power vary as quantum theory is replaced by other operationally defined theories from a broad framework of such theories. Such investigations may reveal some of the key physical features required for powerful computation and communication. In this paper, we investigate how simple physical principles bound the power of two different computational paradigms which combine computation and communication in a non-trivial fashion: computation with advice and interactive proof systems. We show that the existence of non-trivial dynamics in a theory implies a bound on the power of computation with advice. Moreover, we provide an explicit example of a theory with no non-trivial dynamics in which the power of computation with advice is unbounded. Finally, we show that the power of simple interactive proof systems in theories where local measurements suffice for tomography is non-trivially bounded. This result provides a proof that [Formula: see text] is contained in [Formula: see text], which does not make use of any uniquely quantum structure, such as the fact that observables correspond to self-adjoint operators, and thus may be of independent interest.
ERIC Educational Resources Information Center
Joyner, Amy
2003-01-01
Handheld computers provide students tremendous computing and learning power at about a tenth of the cost of a regular computer. Describes the evolution of handhelds; provides some examples of their uses; and cites research indicating they are effective classroom tools that can improve efficiency and instruction. A sidebar lists handheld resources.…
ERIC Educational Resources Information Center
Chien, Tien-Chen
2008-01-01
The computer is not only a powerful technology for managing information and enhancing productivity, but also an efficient tool for education and training. Computer anxiety can be one of the major problems that affect the effectiveness of learning. Through analyzing related literature, this study describes the phenomenon of computer anxiety,…
Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.
Peng, Qian; Duarte, Fernanda; Paton, Robert S
2016-11-07
Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational tool-box in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.
Computational Methods for Stability and Control (COMSAC): The Time Has Come
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.
2005-01-01
Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. General motivation and the backdrop for these efforts are summarized, along with examples of current applications.
COMPUTER TECHNOLOGY AND SOCIAL CHANGE,
This paper presents a discussion of the social, political, economic and psychological problems associated with the rapid growth and development of...public officials and responsible groups is required to increase public understanding of the computer as a powerful tool, to select appropriate
Design Tools for Reconfigurable Hardware in Orbit (RHinO)
NASA Technical Reports Server (NTRS)
French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian
2004-01-01
The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions into the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit, via an integrated design tool-suite aiming to reduce risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.
Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth
2017-09-13
Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.
Airfoil/Wing Flow Control Using Flexible Extended Trailing Edge
2009-02-27
[Figure residue; recoverable captions: (b) power spectra of the drag coefficient; Figure 4, mean velocity profiles for the baseline NACA0012 at 18 and 20 deg angle of attack; (a) fin amplitude and (b) power spectrum of fin amplitude.] Development of Computational Tools: simulations of the time-dependent deformation of...combination of experimental, computational and theoretical methods. Compared with the Gurney flap and a conventional flap, this device enhanced lift at a smaller
Computational protein design-the next generation tool to expand synthetic biology applications.
Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel
2018-05-02
One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.
GAPIT version 2: an enhanced integrated tool for genomic association and prediction
USDA-ARS?s Scientific Manuscript database
Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...
What's New in Software? Computers and the Writing Process: Strategies That Work.
ERIC Educational Resources Information Center
Ellsworth, Nancy J.
1990-01-01
The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)
Simulation techniques in hyperthermia treatment planning
Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC
2013-01-01
Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
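The thermal-simulation core of HTP typically rests on the Pennes bioheat equation. A hedged 1-D explicit finite-difference sketch, with generic soft-tissue parameter assumptions rather than a validated clinical model:

```python
# 1-D Pennes bioheat equation, rho*c*dT/dt = k*d2T/dx2 + w_b*c_b*(T_art - T) + Q,
# stepped with explicit finite differences. All parameters are generic
# soft-tissue assumptions, not a validated clinical model.
import numpy as np

k, rho, c = 0.5, 1050.0, 3600.0    # W/m/K, kg/m^3, J/kg/K (tissue)
w_b, c_b = 2.0, 3600.0             # kg/m^3/s mass perfusion, J/kg/K (blood)
T_art, Q = 37.0, 7e4               # arterial temperature (C), applied heating (W/m^3)

nx, dx, dt = 101, 1e-3, 0.05       # 10 cm of tissue; dt satisfies stability limit
T = np.full(nx, 37.0)              # start at body temperature

for _ in range(int(600 / dt)):     # simulate 10 minutes of heating
    lap = (np.roll(T, 1) - 2*T + np.roll(T, -1)) / dx**2
    heat = Q * (np.abs(np.arange(nx) - nx//2) < 20)   # heated central zone
    T += dt * (k*lap + w_b*c_b*(T_art - T) + heat) / (rho*c)
    T[0] = T[-1] = 37.0            # boundaries held at body temperature

print(41.0 < T[nx//2] < 45.0)      # hyperthermic range reached at the focus
```

Real HTP couples such a thermal solver to electromagnetic or ultrasound power-deposition simulations on patient-specific 3-D anatomy; this sketch shows only the thermal half in one dimension.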
Computational tool for simulation of power and refrigeration cycles
NASA Astrophysics Data System (ADS)
Córdoba Tuta, E.; Reyes Orozco, M.
2016-07-01
Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for simulating power cycles allows modeling the optimal changes for best performance. There is also a boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, with a refrigerant usually serving as the working fluid. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because heat sources from cogeneration vary widely and each case calls for a custom design. In this work we present the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in C++ with a graphical interface built in the multiplatform Qt environment; it runs on Windows and Linux. The tool allows the design of custom power cycles, selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculation of plant efficiency, and identification of the flow fractions in each branch, and finally it generates a highly educational report in PDF format via LaTeX.
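The efficiency calculation at the heart of such a tool can be sketched from first-law enthalpy balances on a simple Rankine cycle. The state-point enthalpies below are illustrative placeholders, not CoolProp output:

```python
# First-law thermal efficiency of a simple Rankine cycle from state-point
# enthalpies. Values are illustrative placeholders for a steam cycle; a real
# tool would obtain them from a property library such as CoolProp.

h = {                      # kJ/kg at the four state points (assumed)
    "pump_in":    192.0,   # saturated liquid leaving the condenser
    "boiler_in":  200.0,   # compressed liquid leaving the pump
    "turbine_in": 3375.0,  # superheated steam leaving the boiler
    "cond_in":    2360.0,  # wet steam leaving the turbine
}

w_turbine = h["turbine_in"] - h["cond_in"]    # specific turbine work
w_pump    = h["boiler_in"] - h["pump_in"]     # specific pump work
q_in      = h["turbine_in"] - h["boiler_in"]  # heat added in the boiler

eta = (w_turbine - w_pump) / q_in             # net work out / heat in
print(round(eta, 3))
```

A refrigeration cycle follows the same bookkeeping with a COP (useful heat moved per unit work) in place of thermal efficiency, which is why one tool can serve both cycle families.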
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses ResidPlots-2, computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another for communicating with users and plotting the residual graphs. The features of the ResidPlots-2 software are…
Biomolecular dynamics by computer analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.
1984-01-01
As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well-developed study of the hydrogen-bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.
The Role of Crop Systems Simulation in Agriculture and Environment
USDA-ARS?s Scientific Manuscript database
Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...
Tools and Techniques for Measuring and Improving Grid Performance
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)
2001-01-01
This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.
TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA
The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...
A tool for modeling concurrent real-time computation
NASA Technical Reports Server (NTRS)
Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.
1990-01-01
Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem-solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.
Next-generation genotype imputation service and methods.
Das, Sayantan; Forer, Lukas; Schönherr, Sebastian; Sidore, Carlo; Locke, Adam E; Kwong, Alan; Vrieze, Scott I; Chew, Emily Y; Levy, Shawn; McGue, Matt; Schlessinger, David; Stambolian, Dwight; Loh, Po-Ru; Iacono, William G; Swaroop, Anand; Scott, Laura J; Cucca, Francesco; Kronenberg, Florian; Boehnke, Michael; Abecasis, Gonçalo R; Fuchsberger, Christian
2016-10-01
Genotype imputation is a key component of genetic association studies, where it increases power, facilitates meta-analysis, and aids interpretation of signals. Genotype imputation is computationally demanding and, with current tools, typically requires access to a high-performance computing cluster and to a reference panel of sequenced genomes. Here we describe improvements to imputation machinery that reduce computational requirements by more than an order of magnitude with no loss of accuracy in comparison to standard imputation tools. We also describe a new web-based service for imputation that facilitates access to new reference panels and greatly improves user experience and productivity.
Mobile computing device as tools for college student education: a case on flashcards application
NASA Astrophysics Data System (ADS)
Kang, Congying
2012-04-01
Traditionally, college students have used flashcards as a tool to memorize large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology have enabled flashcards to be viewed on computers, through tools like Slides and PowerPoint, serving as channels of drill and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices; for example, they use their mobile phones much more intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile-phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flashcards.
A Latency-Tolerant Partitioner for Distributed Computing on the Information Power Grid
NASA Technical Reports Server (NTRS)
Das, Sajal K.; Harvey, Daniel J.; Biwas, Rupak; Kwak, Dochan (Technical Monitor)
2001-01-01
NASA's Information Power Grid (IPG) is an infrastructure designed to harness the power of geographically distributed computers, databases, and human expertise, in order to solve large-scale realistic computational problems. This type of meta-computing environment is necessary to present a unified virtual machine to application developers that hides the intricacies of a highly heterogeneous environment and yet maintains adequate security. In this paper, we present a novel partitioning scheme, called MinEX, that dynamically balances processor workloads while minimizing data movement and runtime communication for applications that are executed in a parallel distributed fashion on the IPG. We also analyze the conditions required for the IPG to be an effective tool for such distributed computations. Our results show that MinEX is a viable load balancer provided the nodes of the IPG are connected by a high-speed asynchronous interconnection network.
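Workload balancing of the kind MinEX performs can be illustrated with a toy greedy partitioner. This is not the MinEX algorithm, which also minimizes data movement and runtime communication; it shows only the load-balance half of the objective:

```python
# Toy greedy partitioner: assign each task to the currently least-loaded
# processor, largest tasks first. Illustrative only; NOT the MinEX scheme.
import heapq

def greedy_partition(task_weights, n_procs):
    """Return {task index: proc id}, balancing total weight per processor."""
    heap = [(0.0, p) for p in range(n_procs)]        # (current load, proc id)
    heapq.heapify(heap)
    assignment = []
    # Placing heavy tasks first gives greedy assignment a better balance.
    for w, i in sorted(((w, i) for i, w in enumerate(task_weights)), reverse=True):
        load, p = heapq.heappop(heap)                # least-loaded processor
        assignment.append((i, p))
        heapq.heappush(heap, (load + w, p))
    return dict(assignment)

weights = [7, 3, 3, 2, 2, 2, 1]                      # hypothetical task costs
assign = greedy_partition(weights, 2)
loads = [sum(weights[i] for i, p in assign.items() if p == q) for q in (0, 1)]
print(loads)                                         # -> [10, 10]
```

A grid-aware partitioner would extend the cost popped from the heap with per-edge communication and migration terms, so that a slightly unbalanced assignment with little data movement can beat a perfectly balanced one.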
Advanced Computational Techniques for Power Tube Design.
1986-07-01
fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs) -- both for engineering design
Memory management in genome-wide association studies
2009-01-01
Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
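The memory-management idea, processing the genotype matrix in bounded chunks rather than loading it whole, can be sketched as follows. Function and variable names here are illustrative, not from the cited tool:

```python
# Chunked streaming over a genotype matrix: peak memory is bounded by the
# chunk size, not the study size. Names are illustrative assumptions.
import numpy as np

def chunked_allele_freqs(genotypes, chunk_size=1000):
    """Per-SNP alternate-allele frequency, touching one chunk at a time."""
    n_snps = genotypes.shape[0]
    freqs = np.empty(n_snps)
    for start in range(0, n_snps, chunk_size):
        block = genotypes[start:start + chunk_size]   # only this slice in RAM
        freqs[start:start + chunk_size] = block.mean(axis=1) / 2.0
    return freqs

rng = np.random.default_rng(1)
G = rng.integers(0, 3, size=(5000, 200))      # 5000 SNPs x 200 samples (toy)
f_chunked = chunked_allele_freqs(G, chunk_size=512)
f_full = G.mean(axis=1) / 2.0                 # reference: all-at-once result
print(np.allclose(f_chunked, f_full))
```

In a real genome-wide study the chunks would come from disk (e.g. a memory-mapped file) rather than an in-memory array, which is what keeps billions of genotypes within a workstation's RAM.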
Sánchez-Álvarez, David; Rodríguez-Pérez, Francisco-Javier
2018-01-01
In this paper, we present a work based on the distribution of computational load among the homogeneous nodes and the Hub/Sink of Wireless Sensor Networks (WSNs). The main contribution of the paper is an early decision support framework that helps WSN designers make decisions about computational load distribution for those WSNs where power consumption is a key issue (by “framework” we mean a support tool for making decisions, in which executive judgment can be included along with the WSN designer's set of mathematical tools; this work shows the need to include load distribution as an integral component of the WSN system when making early decisions regarding energy consumption). The framework takes advantage of the idea that balancing the computational load between sensor nodes and the Hub/Sink can improve energy consumption for the whole WSN, or at least for its battery-powered nodes. The approach is not trivial, and it takes into account related issues such as the required data distribution and the connectivity and availability of the nodes and the Hub/Sink, given their connectivity features and duty-cycle. For a practical demonstration, the proposed framework is applied to an agriculture case study, a sector very relevant in our region. In this kind of rural context, distances, the low costs imposed by vegetable selling prices, and the lack of continuous power supplies can make sensing solutions viable or inviable for the farmers. The proposed framework systematizes and facilitates for WSN designers the complex calculations required, taking into account the most relevant variables regarding power consumption, and avoids full, partial, or prototype implementations and measurements of the different candidate computational load distributions for a specific WSN. PMID:29570645
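The core energy tradeoff behind such a framework can be illustrated with a first-order radio model: a node should process data locally only when computing plus transmitting a summary costs less energy than transmitting the raw readings to the Hub/Sink. The constants and function names below are illustrative assumptions, not values from the paper.

```python
# Sketch of the local-compute vs. send-raw decision for one sensor node.
# Parameter values are typical first-order-model assumptions, not measured.

E_ELEC = 50e-9      # radio electronics energy, J/bit (assumed)
E_AMP = 100e-12     # amplifier energy, J/bit/m^2 (assumed)
E_CPU = 5e-9        # processing energy, J/bit (assumed)

def tx_energy(bits, distance_m):
    """Energy to transmit `bits` over `distance_m` (free-space d^2 model)."""
    return bits * (E_ELEC + E_AMP * distance_m ** 2)

def prefer_local(raw_bits, summary_bits, distance_m):
    """True if compute-then-send-summary costs less than sending raw data."""
    send_raw = tx_energy(raw_bits, distance_m)
    local = raw_bits * E_CPU + tx_energy(summary_bits, distance_m)
    return local < send_raw
```

The decision flips with distance and compression ratio, which is why an early design-time tool, rather than intuition, is useful for placing computation in the network.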
Initiating a Programmatic Assessment Report
ERIC Educational Resources Information Center
Berkaliev, Zaur; Devi, Shavila; Fasshauer, Gregory E.; Hickernell, Fred J.; Kartal, Ozgul; Li, Xiaofan; McCray, Patrick; Whitney, Stephanie; Zawojewski, Judith S.
2014-01-01
In the context of a department of applied mathematics, a program assessment was conducted to assess the departmental goal of enabling undergraduate students to recognize, appreciate, and apply the power of computational tools in solving mathematical problems that cannot be solved by hand or would require extensive and tedious hand computation. A…
Intelligent Instruction by Computer: Theory and Practice.
ERIC Educational Resources Information Center
Farr, Marshall J., Ed.; Psotka, Joseph, Ed.
The essays collected in this volume are concerned with the field of computer-based intelligent instruction. The papers are organized into four groups that address the following topics: particular theoretical approaches (3 titles); the development and improvement of tools and environments (3 titles); the power of well-engineered implementations and…
Developing Simulations in Multi-User Virtual Environments to Enhance Healthcare Education
ERIC Educational Resources Information Center
Rogers, Luke
2011-01-01
Computer-based clinical simulations are a powerful teaching and learning tool because of their ability to expand healthcare students' clinical experience by providing practice-based learning. Despite the benefits of traditional computer-based clinical simulations, there are significant issues that arise when incorporating them into a flexible,…
Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft
NASA Astrophysics Data System (ADS)
Boozer, Charles Maxwell
A multidisciplinary shape optimization tool coupling aerodynamics, structure, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled under the multidisciplinary feasible optimization architecture, aircraft geometry is modified in the optimization of the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.
Contingency Analysis Post-Processing With Advanced Computing and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin
Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
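Post-processing many independent contingency result sets is embarrassingly parallel, which is what makes the approach above attractive. A minimal sketch of the pattern, with invented data shapes and limits (this is not the PNNL implementation), could use a worker pool to scan each contingency's bus voltages for limit violations:

```python
# Sketch: fan contingency outputs across worker processes and collect
# the buses whose post-contingency voltage leaves assumed p.u. limits.

from multiprocessing import Pool

VOLTAGE_LIMITS = (0.95, 1.05)   # per-unit limits, assumed for illustration

def find_violations(contingency):
    """contingency = (name, {bus: voltage}); return (name, violating buses)."""
    name, bus_voltages = contingency
    lo, hi = VOLTAGE_LIMITS
    bad = [bus for bus, v in bus_voltages.items() if not lo <= v <= hi]
    return name, bad

def postprocess(results, workers=4):
    """Scan all contingency outputs in parallel; map name -> violations."""
    with Pool(workers) as pool:
        return dict(pool.map(find_violations, results))
```

The per-contingency scan carries no shared state, so speedup scales with worker count until disk or network I/O dominates.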
Development and Evaluation of LEGUME ID: A ToolBook Multimedia Module.
ERIC Educational Resources Information Center
Hannaway, David B.; And Others
1992-01-01
Describes the development and advantages of LEGUME ID, a multimedia module for agricultural education. LEGUME ID is an example of how teachers, given the opportunity through accessible computer software programs, can create powerful teaching tools. Summarized is a student response to the use of this teacher-produced software program. (MCO)
Exploration and Evaluation of Nanometer Low-power Multi-core VLSI Computer Architectures
2015-03-01
ICC, the Milkyway database was created using the command: milkyway –galaxy –nogui –tcl –log memory.log one.tcl As stated previously, it is...EDA tools. Typically, Synopsys® tools use Milkyway databases, whereas Cadence Design Systems® tools use Layout Exchange Format (LEF) formats. To help
Educate at Penn State: Preparing Beginning Teachers with Powerful Digital Tools
ERIC Educational Resources Information Center
Murray, Orrin T.; Zembal-Saul, Carla
2008-01-01
University based teacher education programs are slowly beginning to catch up to other professional programs that use modern digital tools to prepare students to enter professional fields. This discussion looks at how one teacher education program reached the conclusion that students and faculty would use notebook computers. Frequently referred to…
Blast2GO goes grid: developing a grid-enabled prototype for functional genomics analysis.
Aparicio, G; Götz, S; Conesa, A; Segrelles, D; Blanquer, I; García, J M; Hernandez, V; Robles, M; Talon, M
2006-01-01
The vast amount and complexity of data generated in genomic research imply that new dedicated and powerful computational tools need to be developed to meet their analysis requirements. Blast2GO (B2G) is a bioinformatics tool for Gene Ontology-based DNA or protein sequence annotation and function-based data mining. The application has been developed with the aim of offering an easy-to-use tool for functional genomics research. Typical B2G users are middle-size genomics labs carrying out sequencing, EST, and microarray projects, handling datasets of up to several thousand sequences. In the current version of B2G, the power and analytical potential of both annotation and function data-mining is restricted to the computational power behind each particular installation. In order to offer enhanced computational capacity within this bioinformatics application, a Grid component is being developed. A prototype has been conceived for the particular problem of speeding up the Blast searches to obtain fast results for large datasets. Many efforts have been made in the literature concerning the speeding up of Blast searches, but few of them deal with the use of large heterogeneous production Grid infrastructures. These are the infrastructures that could reach the largest number of resources and the best load balancing for data access. The Grid Service under development will analyse requests based on the number of sequences, splitting them according to the available resources. Lower-level computation will be performed through MPIBLAST. The software architecture is based on the WSRF standard.
Citizens unite for computational immunology!
Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M
2015-07-01
Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.
PlantCV v2: Image analysis software for high-throughput plant phenotyping
Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony
2017-01-01
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
PlantCV v2: Image analysis software for high-throughput plant phenotyping.
Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony
2017-01-01
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.
PlantCV v2: Image analysis software for high-throughput plant phenotyping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.
PlantCV v2: Image analysis software for high-throughput plant phenotyping
Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...
2017-12-01
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.
Computational Exploration of a Protein Receptor Binding Space with Student Proposed Peptide Ligands
ERIC Educational Resources Information Center
King, Matthew D.; Phillips, Paul; Turner, Matthew W.; Katz, Michael; Lew, Sarah; Bradburn, Sarah; Andersen, Tim; McDougal, Owen M.
2016-01-01
Computational molecular docking is a fast and effective "in silico" method for the analysis of binding between a protein receptor model and a ligand. The visualization and manipulation of protein to ligand binding in three-dimensional space represents a powerful tool in the biochemistry curriculum to enhance student learning. The…
Instructional Computer Programs and the Phonological Deficits of Dyslexic Children
ERIC Educational Resources Information Center
Cammarata, Lisa
2006-01-01
The 21st century is a time to contemplate the power of the technological advances that have occurred to date. Computers have become idea engines: tools used for thinking, performing, processing, and instructing people. No one understands or appreciates this phenomenon more than children suffering with dyslexia. These children's ability to learn or…
Library Signage: Applications for the Apple Macintosh and MacPaint.
ERIC Educational Resources Information Center
Diskin, Jill A.; FitzGerald, Patricia
1984-01-01
Describes specific applications of the Macintosh computer at Carnegie-Mellon University Libraries, where MacPaint was used as a flexible, easy to use, and powerful tool to produce informational, instructional, and promotional signage. Profiles of system hardware and software, an evaluation of the computer program MacPaint, and MacPaint signage…
Supporting Abstraction Processes in Problem Solving through Pattern-Oriented Instruction
ERIC Educational Resources Information Center
Muller, Orna; Haberman, Bruria
2008-01-01
Abstraction is a major concept in computer science and serves as a powerful tool in software development. Pattern-oriented instruction (POI) is a pedagogical approach that incorporates patterns in an introductory computer science course in order to structure the learning of algorithmic problem solving. This paper examines abstraction processes in…
Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acharya, Naresh; Baone, Chaitanya; Veda, Santosh
2014-12-31
Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak days, winter peak days, etc. With widespread deployment of renewable generation, controllable loads, energy storage devices, and plug-in hybrid electric vehicles expected in the near future, and greater integration of cyber infrastructure (communications, computation, and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising toward more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.
Real-Time-Simulation of IEEE-5-Bus Network on OPAL-RT-OP4510 Simulator
NASA Astrophysics Data System (ADS)
Atul Bhandakkar, Anjali; Mathew, Lini, Dr.
2018-03-01
Real-time simulator tools incorporate high-performance computing technologies and offer improved performance. They are widely used for the design and improvement of electrical systems. With the advancement of software tools like MATLAB/SIMULINK, with its Real-Time Workshop (RTW) and Real-Time Windows Target (RTWT), real-time simulators are used extensively in many engineering fields, such as industry, education, and research institutions. OPAL-RT-OP4510 is a real-time simulator which is used in both industry and academia. In this paper, the real-time simulation of the IEEE 5-bus network is carried out by means of OPAL-RT-OP4510 with a CRO and other hardware. The performance of the network is observed with the introduction of faults at various locations. The waveforms of voltage, current, and active and reactive power are observed in the MATLAB simulation environment and on the CRO. Also, Load Flow Analysis (LFA) of the IEEE 5-bus network is computed using the MATLAB/Simulink power-gui load flow tool.
McIDAS-V: A Data Analysis and Visualization Tool for Global Satellite Data
NASA Astrophysics Data System (ADS)
Achtor, T. H.; Rink, T. D.
2011-12-01
The Man-computer Interactive Data Access System (McIDAS-V) is a Java-based, open-source, freely available system for scientists, researchers and algorithm developers working with atmospheric data. The McIDAS-V software tools provide powerful new data manipulation and visualization capabilities, including 4-dimensional displays, an abstract data model with integrated metadata, user-defined computation, and a powerful scripting capability. As such, McIDAS-V is a valuable tool for scientists and researchers within the GEO and GEOSS domains. The advancing polar and geostationary orbit environmental satellite missions conducted by several countries will carry advanced instrumentation and systems that will collect and distribute land, ocean, and atmosphere data. These systems provide atmospheric and sea surface temperatures, humidity sounding, cloud and aerosol properties, and numerous other environmental products. This presentation will display and demonstrate some of the capabilities of McIDAS-V to analyze and display high temporal and spectral resolution data using examples from international environmental satellites.
Review of computational fluid dynamics (CFD) researches on nano fluid flow through micro channel
NASA Astrophysics Data System (ADS)
Dewangan, Satish Kumar
2018-05-01
Nanofluids are becoming promising heat transfer fluids due to their improved thermo-physical properties and heat transfer performance. Microchannel heat transfer has potential application in the cooling of high-power-density microchips in CPU systems, micro power systems, and many such miniature thermal systems which need advanced cooling capacity. Use of nanofluids enhances the effectiveness of such systems. Computational Fluid Dynamics (CFD) is a very powerful tool for computational analysis of various physical processes. Its application to flow and heat transfer analysis of nanofluids is catching up very fast. The present research paper gives a brief account of the methodology of CFD and also summarizes its application to nanofluid flow and heat transfer for microchannel cases.
Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozacik, Stephen
Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique traditionally used for identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
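The AS-II algorithm itself is only named in the abstract; the classical computation it accelerates can be sketched directly. Given independent basic events with known probabilities, the top-event probability is evaluated bottom-up through AND/OR gates:

```python
# Sketch of classical fault-tree evaluation (not the AS-II algorithm):
# a tree of AND/OR gates over independent basic events, evaluated bottom-up.

def probability(node):
    """node is ('event', p), ('and', [children]), or ('or', [children])."""
    kind, value = node
    if kind == 'event':
        return value
    child_ps = [probability(c) for c in value]
    if kind == 'and':                 # all children must occur
        p = 1.0
        for cp in child_ps:
            p *= cp
        return p
    # 'or' gate: 1 minus the product of complements (independence assumed)
    q = 1.0
    for cp in child_ps:
        q *= (1.0 - cp)
    return 1.0 - q
```

For example, a top event fed by one basic event of probability 0.1 OR the conjunction of two events (0.2 AND 0.5) has probability 1 - (0.9)(0.9) = 0.19. The combinatorial growth of such trees for real plants is what makes FTA cumbersome and motivates faster algorithms.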
Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources
NASA Astrophysics Data System (ADS)
Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi
2017-01-01
Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, the correlation among different random injection powers (like wind power) brings great difficulties to PPF calculation. Monte Carlo simulation (MCS) and analytical methods are two commonly used approaches to solving PPF. MCS has high accuracy but is very time consuming. An analytical method like the cumulants method (CM) has high computing efficiency, but calculating the cumulants is not convenient when wind power output does not obey any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo simulation method (IMCS) is proposed. The joint empirical distribution is applied to model different wind power outputs. This method combines the advantages of both MCS and analytical methods. It not only has high computing efficiency, but also provides solutions with enough accuracy, which is very suitable for on-line analysis.
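The key idea of sampling from a joint empirical distribution can be sketched simply: drawing whole historical records of the simultaneous wind-farm output vector preserves inter-farm correlation, unlike sampling each farm's marginal distribution independently. This is a simplified illustration of the principle, not the IMCS method itself.

```python
# Sketch: resample whole historical records so that correlation between
# wind farms is carried into the Monte Carlo samples automatically.

import random

def sample_joint(history, n, seed=0):
    """history: list of tuples, one simultaneous output per wind farm."""
    rng = random.Random(seed)
    return [rng.choice(history) for _ in range(n)]
```

Each sampled tuple then feeds one deterministic power flow solution; the empirical approach needs no assumption that wind output follows any named distribution.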
Human-Computer Interaction and Information Management Research Needs
2003-10-01
...hand-held personal digital assistants, networked sensors and actuators, and low-power computers on satellites. ...most complex tools that humans have...calculations using data on external media such as tapes evolved into our multi-functional 21st century systems. More ideas came as networks of computing
Integrating Learning Services in the Cloud: An Approach That Benefits Both Systems and Learning
ERIC Educational Resources Information Center
Gutiérrez-Carreón, Gustavo; Daradoumis, Thanasis; Jorba, Josep
2015-01-01
Currently there is an increasing trend to implement functionalities that allow for the development of applications based on Cloud computing. In education there are high expectations for Learning Management Systems since they can be powerful tools to foster more effective collaboration within a virtual classroom. Tools can also be integrated with…
Science and Technology Review, January-February 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Table of contents: accelerators at Livermore; the B-Factory and the Big Bang; assessing exposure to radiation; next generation of computer storage; and a powerful new tool to detect clandestine nuclear tests.
Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain
NASA Technical Reports Server (NTRS)
Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem
2016-01-01
The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
A Survey of Videodisc Technology.
1985-12-01
store images and the microcomputer is used as an interactive and management tool, makes for a powerful teaching system. General Motors was the first...videodiscs are used for archival storage of documents. IBM uses videodisc in over 180 branch offices, where they are used both as a presentation tool and to...provide reference material. IBM is also currently working on a videodisc project as a direct training tool for maintenance of their computers. A
Applications of automatic differentiation in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.
1994-01-01
Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR, the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
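The chain-rule machinery that ADIFOR generates for FORTRAN can be shown in miniature with forward-mode dual numbers: each value carries its derivative, and every operation propagates both. This is a pedagogical sketch of the principle, not ADIFOR itself.

```python
# Forward-mode AD in miniature: a Dual carries (value, derivative) and
# each overloaded operation applies the corresponding differentiation rule.

import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)   # sum rule
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def d_sin(x):
    """sin with its chain-rule derivative: (sin u)' = cos(u) * u'."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x0):
    """Exact df/dx at x0: seed dx/dx = 1 and read off the propagated dot."""
    return f(Dual(x0, 1.0)).dot
```

Like ADIFOR's generated code, this yields derivatives exact to machine precision, with no step-size tuning as in centered divided differences.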
Smart Phones, a Powerful Tool in the Chemistry Classroom
ERIC Educational Resources Information Center
Williams, Antony J.; Pence, Harry E.
2011-01-01
Cell phones, especially "smart phones", seem to have become ubiquitous. Actually, it is misleading to call many of these devices phones, as they are actually a portable and powerful computer that can be very valuable in the chemistry classroom. Currently, there are three major ways in which smart phones can be used for education. Smart phones…
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.
Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS
NASA Technical Reports Server (NTRS)
Callegari, Andres C.
1990-01-01
This paper addresses the question of how to combine CLIPS with graphics and how to overcome the PC's memory limitations by using the extended memory available in the computer. By adding graphics and extended-memory capabilities, CLIPS can be turned into a complete and powerful system-development tool on the most economical and popular computer platform. New models of PCs have processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest of their resources. CLIPS is a powerful expert-system development tool, but it is not complete without the support of a graphics package for creating user interfaces and general-purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation of the PC is that real-mode addressing restricts CLIPS to only 640 KB of memory, but that problem can be solved by developing a version of CLIPS that uses extended memory. The user then has access to up to 16 MB of memory on 80286-based computers and to practically all available memory (4 GB) on computers that use the 80386 processor. With a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, plus access to the machine's extended memory (with no special hardware needed), users can create more powerful systems at a fraction of the cost on the most popular, portable, and economical platform available: the PC.
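The expert-system core that CLIPS provides is, at heart, a forward-chaining match-fire cycle over facts and rules. The sketch below illustrates that cycle in a few lines of Python; the fact names and the `run` helper are hypothetical and do not reflect the CLIPS API or its Rete matcher.

```python
# Toy forward-chaining inference loop: a rule fires when all the facts in
# its condition are present, asserting its conclusion as a new fact.
# Purely illustrative of the rule/fact cycle; real CLIPS uses defrule
# syntax and an efficient Rete network instead of this naive rescan.
def run(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires, knowledge base grows
                changed = True
    return facts

rules = [({"fan-silent", "power-on"}, "fan-fault"),
         ({"fan-fault"}, "replace-fan")]
print(sorted(run({"power-on", "fan-silent"}, rules)))
```

The naive loop rescans every rule after each firing; the point of a production-system shell like CLIPS is to do this matching incrementally and fast.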
EXTENDING THE REALM OF OPTIMIZATION FOR COMPLEX SYSTEMS: UNCERTAINTY, COMPETITION, AND DYNAMICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shanbhag, Uday V; Basar, Tamer; Meyn, Sean
The research reported here addressed the following topics: the development of analytical and algorithmic tools for distributed computation of Nash equilibria; synchronization in mean-field oscillator games, with an emphasis on learning and efficiency analysis; questions that combine learning and computation, including stochastic and mean-field games; and modeling and control in the context of power markets.
PDAs in Teacher Education: A Case Study Examining Mobile Technology Integration
ERIC Educational Resources Information Center
Franklin, Teresa; Sexton, Colleen; Lu, Young; Ma, Hongyan
2007-01-01
The classroom computer is no longer confined to a box on the desk. Mobile handheld computing devices have evolved into powerful and affordable learning tools. Handheld technologies are changing the way people access and work with information. The use of Personal Digital Assistants (PDAs or handhelds) has been an evolving part of the business world…
SCANIT: centralized digitizing of forest resource maps or photographs
Elliot L. Amidon; E. Joyce Dye
1981-01-01
Spatial data on wildland resource maps and aerial photographs can be analyzed by computer after digitizing. SCANIT is a computerized system for encoding such data in digital form. The system, consisting of a collection of computer programs and subroutines, provides a powerful and versatile tool for a variety of resource analyses. SCANIT also may be converted easily to...
Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool
NASA Astrophysics Data System (ADS)
Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.
1997-12-01
Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to another, and performance often falls short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify at a high level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP makes it possible to combine parallel storage-access routines and sequential image-processing operations efficiently. This paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
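The overlap of data access with processing that CAP expresses can be sketched with a thread-pool pipeline: while one tile is being processed, the pool is already loading the next. This is an illustrative Python analogy, not CAP's specification language; `load` and `process` are stand-in stages.

```python
# Sketch of pipelining storage access with processing: a pool of threads
# loads tiles while the main thread applies the (sequential) image
# operation to tiles that have already arrived.
from concurrent.futures import ThreadPoolExecutor

def load(tile):
    # stands in for a parallel storage-access routine
    return [tile * 10 + i for i in range(4)]

def process(block):
    # stands in for a sequential image-processing operation
    return sum(x * x for x in block)

def pipeline(tiles, workers=4):
    with ThreadPoolExecutor(workers) as pool:
        # map() submits all loads up front, so I/O overlaps processing
        blocks = pool.map(load, tiles)
        return [process(b) for b in blocks]

print(pipeline(range(3)))  # [14, 534, 1854]
```

In a real imaging pipeline the payoff comes when `load` is I/O-bound and `process` is CPU-bound, so the two genuinely run concurrently.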
NASA Astrophysics Data System (ADS)
Wan, Tian
This work is motivated by the lack of a fully coupled computational tool that successfully solves the turbulent chemically reacting Navier-Stokes equations, the electron energy conservation equation and the electric current Poisson equation. In the present work, the abovementioned equations are solved in a fully coupled manner using fully implicit parallel GMRES methods. The system of Navier-Stokes equations is solved using a GMRES method with combined Schwarz and ILU(0) preconditioners. The electron energy equation and the electric current Poisson equation are solved using a GMRES method with combined SOR and Jacobi preconditioners. The fully coupled method has also been implemented successfully in an unstructured solver, US3D, and convergence test results are presented. This new method is shown to be two to five times faster than the original DPLR method. The Poisson solver is validated with analytic test problems. Then, four problems are selected; two of them are computed to explore the possibility of onboard MHD control and power generation, and the other two are simulations of experiments. First, the possibility of onboard reentry shock control by a magnetic field is explored. As part of a previous project, MHD power generation onboard a re-entry vehicle is also simulated. Then, the MHD acceleration experiments conducted at NASA Ames Research Center are simulated. Lastly, the MHD power generation experiments known as the HVEPS project are simulated. For code validation, the scramjet experiments at the University of Queensland are simulated first, and the generator section of the HVEPS test facility is computed next. The main conclusion is that the computational tool is accurate for different types of problems and flow conditions, and its accuracy and efficiency are necessary when the flow complexity increases.
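The role of a preconditioner like the Jacobi option mentioned above can be shown with a minimal pure-Python iteration: each step applies M⁻¹ with M = diag(A) to the residual. This is a simplified Richardson-style sketch of the preconditioning idea, not the paper's GMRES/ILU implementation.

```python
# Jacobi-preconditioned fixed-point iteration for Ax = b:
#   x <- x + M^-1 (b - A x),  with M = diag(A).
# Using the diagonal as a cheap approximate inverse of A is the same idea
# that makes Jacobi attractive as a preconditioner inside GMRES.
def jacobi_preconditioned_solve(A, b, iters=100):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        # residual r = b - A x
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        # apply M^-1: divide each residual component by the diagonal entry
        x = [x[i] + r[i] / A[i][i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # diagonally dominant, so this converges
b = [1.0, 2.0]
x = jacobi_preconditioned_solve(A, b)
print([round(v, 6) for v in x])
```

The exact solution here is x = (1/11, 7/11); for stiff coupled systems like those in the paper, the stronger ILU(0), SOR and Schwarz preconditioners play the role that the diagonal plays in this toy.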
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with the high-performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.
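The "sustainable throughput under different traffic conditions" metric can be illustrated with a toy slotted-channel simulation, where a slot carries data only when exactly one node transmits. This is a deliberately simplified stand-in for the paper's LAN models; the node count and send probability are made-up parameters.

```python
# Toy slotted-ALOHA-style throughput estimate: in each time slot every
# node transmits independently with probability p_send, and the slot
# succeeds only if there is no collision (exactly one sender).
import random

def throughput(n_nodes, p_send, slots=20_000, seed=7):
    rng = random.Random(seed)           # seeded for reproducibility
    ok = 0
    for _ in range(slots):
        senders = sum(rng.random() < p_send for _ in range(n_nodes))
        ok += senders == 1
    return ok / slots                   # fraction of useful slots

print(round(throughput(10, 0.1), 3))
```

The analytic expectation is n·p·(1-p)^(n-1) ≈ 0.387 for these parameters; sweeping `p_send` reproduces the classic offered-load vs. throughput curve that capacity planning relies on.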
Proceedings of the American Power Conference. Volume 58-II
DOE Office of Scientific and Technical Information (OSTI.GOV)
McBride, A.E.
1996-11-01
This book is part 2 of the proceedings of the American Power Conference, Technology for Competition and Globalization, 1996. The topics of the papers include structural plant design; challenges of the global marketplace; thermal hydraulic methods for nuclear power plant safety and operation; decontamination and decommissioning; competitive operations and maintenance; fuel opportunities; cooling; competitive power pricing; operations; transformers; relays; plant controls; training to meet the competitive future; burning technologies; ash and byproducts utilization; advanced systems; computer tools for plant design; globalization of power; power system protection and power quality; life extension; grounding; and transmission line equipment.
Use of Transition Modeling to Enable the Computation of Losses for Variable-Speed Power Turbine
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2012-01-01
To investigate the penalties associated with using a variable-speed power turbine (VSPT) in a rotorcraft capable of vertical takeoff and landing, various analysis tools are required. Such analysis tools must be able to model the flow accurately within the operating envelope of the VSPT. For power turbines, this envelope is characterized by low Reynolds numbers and a wide range of incidence angles, both positive and negative, due to the variation in shaft speed at relatively fixed corrected flows. The flow in the turbine passage is expected to be transitional and separated at high incidence. The turbulence model of Walters and Leylek was implemented in the NASA Glenn-HT code to enable a more accurate analysis of such flows. Two-dimensional heat transfer predictions of flat-plate flow and two- and three-dimensional heat transfer predictions on a turbine blade were performed and are reported herein. Heat transfer computations were performed because heat transfer is a good marker for transition. The final goal is to be able to compute the aerodynamic losses. Armed with the new transition model, total pressure losses for three-dimensional flow of an Energy Efficient Engine (E3) tip-section cascade were computed for a range of incidence angles in anticipation of the experimental data. The results obtained form a loss bucket for the chosen blade.
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
Design of modern electronic systems is a complicated task which demands the use of computer-aided design (CAD) tools. Since many problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve them. We have applied evolutionary computation techniques to ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.
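A genetic algorithm of the kind applied to these ECAD problems can be sketched on a toy objective. The "max-ones" fitness below is a stand-in for a real combinatorial ECAD objective (e.g., a technology-mapping cost); population size, generations and operators are illustrative choices, not the paper's settings.

```python
# Minimal genetic algorithm: truncation selection, one-point crossover,
# single-bit mutation. Fitness is the number of 1-bits in the chromosome.
import random

def ga(bits=16, pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=sum, reverse=True)
        pop = pop[: pop_size // 2]                 # keep the fitter half
        children = []
        while len(pop) + len(children) < pop_size:
            a, b = rng.sample(pop, 2)
            cut = rng.randrange(1, bits)           # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(bits)                # mutate one random bit
            child[i] ^= 1
            children.append(child)
        pop += children
    return max(pop, key=sum)

best = ga()
print(sum(best))
```

For a real ECAD problem only the chromosome encoding and the fitness function change; the selection/crossover/mutation loop stays the same.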
NASA Technical Reports Server (NTRS)
2001-01-01
Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface yet be lightweight, compact, and able to operate under its own power source, allowing Apollo astronauts to collect lunar samples farther away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From their analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology, and the company now sells its rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).
Parallel, distributed and GPU computing technologies in single-particle electron microscopy
Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger
2009-01-01
Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686
Continuous-variable quantum computing on encrypted data.
Marshall, Kevin; Jacobsen, Christian S; Schäfermeier, Clemens; Gehring, Tobias; Weedbrook, Christian; Andersen, Ulrik L
2016-12-14
The ability to perform computations on encrypted data is a powerful tool for protecting a client's privacy, especially in today's era of cloud and distributed computing. In terms of privacy, the best solutions that classical techniques can achieve are unfortunately not unconditionally secure in the sense that they are dependent on a hacker's computational power. Here we theoretically investigate, and experimentally demonstrate with Gaussian displacement and squeezing operations, a quantum solution that achieves the security of a user's privacy using the practical technology of continuous variables. We demonstrate losses of up to 10 km both ways between the client and the server and show that security can still be achieved. Our approach offers a number of practical benefits (from a quantum perspective) that could one day allow the potential widespread adoption of this quantum technology in future cloud-based computing networks.
Computer-aided design of chemicals and chemical mixtures provides a powerful tool to help engineers identify cleaner process designs and more-benign alternatives to toxic industrial solvents. Three software programs are discussed: (1) PARIS II (Program for Assisting the Replaceme...
ERIC Educational Resources Information Center
Weaver, Dave
Science interfacing packages (also known as microcomputer-based laboratories or probeware) generally consist of a set of programs on disks, a user's manual, and hardware that includes one or more sensory devices. Together with a microcomputer, they form a powerful data acquisition and analysis tool. Packages are available for accurately…
Statistical methods and computing for big data.
Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun
2016-01-01
Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on open-source R and its packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.
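The online-updating idea for stream data can be illustrated with Welford's one-pass algorithm for the running mean and variance: each observation updates the statistics in O(1) time and memory, with no need to store the stream. This is a generic sketch of the approach, not the article's variable-selection extension.

```python
# Welford's online algorithm: numerically stable single-pass updates of
# the mean and (sample) variance, suitable for data that arrive as a stream.
class RunningStats:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        # m2 accumulates sum of squared deviations from the running mean
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1)

rs = RunningStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    rs.update(x)
print(rs.mean, rs.variance)
```

The same update-in-place pattern generalizes to regression coefficients, which is what makes online updating attractive when the data exceed memory.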
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
Enhanced Electric Power Transmission by Hybrid Compensation Technique
NASA Astrophysics Data System (ADS)
Palanichamy, C.; Kiu, G. Q.
2015-04-01
In today's competitive environment, new power system engineers are expected to contribute immediately, without years of seasoning through on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming essential to prepare power system engineering graduates for an increasingly quality-minded corporate environment. Achieving this requires better tools for educating and training both power system engineering students and in-service system engineers. As a result of rapid advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52°North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
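A geoprocessing workflow of the kind this stack scripts can be sketched as a chain of Python steps, each taking the previous step's output. The step names and toy cell data below are hypothetical and are not part of the CI-WATER tools; in the real stack, individual steps would be dispatched to distributed or cloud resources.

```python
# Sketch of a geoprocessing workflow as a chain of functions over a small
# dictionary of grid cells (illustrative data, not a real raster format).
def clip_to_watershed(data):
    # keep only cells inside the basin of interest
    return {k: v for k, v in data.items() if v["in_basin"]}

def compute_runoff(data):
    # toy runoff estimate: rainfall depth times cell area
    return {k: v["rain"] * v["area"] for k, v in data.items()}

def run_workflow(data, steps):
    for step in steps:   # each step could be farmed out as a separate job
        data = step(data)
    return data

cells = {"a": {"in_basin": True, "rain": 2.0, "area": 3.0},
         "b": {"in_basin": False, "rain": 9.0, "area": 1.0}}
print(run_workflow(cells, [clip_to_watershed, compute_runoff]))  # {'a': 6.0}
```

Expressing the workflow as a plain list of steps is what lets a web app re-run it with new inputs for each scenario.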
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
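The cooperating-workers idea behind DICE can be sketched with threads that share one address space and split a computation by striding over the data. This is a single-process simulation of the concept; in DICE the sharing would span workstations over a LAN via distributed shared memory rather than threads in one process.

```python
# Thread-based sketch of workers cooperating through shared memory:
# each "workstation" sums a strided slice of the data, writing its
# partial result into a shared list, and the results are combined.
import threading

def parallel_sum(values, workers=4):
    partial = [0] * workers            # shared among all workers

    def work(w):
        partial[w] = sum(values[w::workers])   # stride w, w+workers, ...

    threads = [threading.Thread(target=work, args=(w,))
               for w in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partial)

print(parallel_sum(list(range(1, 101))))  # 5050
```

Giving each worker its own slot in `partial` avoids contention on a single shared accumulator, the same partitioning discipline a distributed shared-memory system would encourage.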
Why Won't You Do What I Want? The Informative Failures of Children and Models
ERIC Educational Resources Information Center
Chatham, Christopher H.; Yerys, Benjamin E.; Munakata, Yuko
2012-01-01
Computational models are powerful tools--too powerful, according to some. We argue that the idea that models can "do anything" is wrong, and we describe how their failures have been informative. We present new work showing surprising diversity in the effects of feedback on children's task-switching, such that some children perseverate despite this…
ERIC Educational Resources Information Center
Assante, Leonard E.; Schrader, Stuart M.
The International Health Communication Hotline (InHealth) represents an attempt to firmly establish, develop and promote a new Communication Studies subdiscipline in the academic and health care arenas via computer networking. If successful, the project will demonstrate the power of computer networking as an agent of change. Health communication…
Computational methods in drug discovery
Leelananda, Sumudu P
2016-01-01
The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341
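The QSAR techniques reviewed above relate biological activity to molecular descriptors; the smallest possible illustration is an ordinary least-squares fit of activity against one hypothetical descriptor. The descriptor and activity values below are made up for illustration; real QSAR models use many descriptors and rigorous validation.

```python
# Toy one-descriptor QSAR: fit activity ~ descriptor by ordinary least
# squares, returning slope and intercept of the fitted line.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

logp = [1.0, 2.0, 3.0, 4.0]        # hypothetical descriptor values
activity = [2.1, 4.0, 6.1, 7.9]    # hypothetical measured activities
slope, intercept = ols(logp, activity)
print(round(slope, 3), round(intercept, 3))
```

The fitted line can then rank untested compounds by predicted activity, which is the basic screening use of a QSAR model.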
Graphics processing units in bioinformatics, computational biology and systems biology.
Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela
2017-09-01
Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides user with powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aerodynamics of aircraft. Intended for use as software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in linear model of system. Designed to provide easy selection of state, control, and observation variables used in particular model. Also provides flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
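The kind of linearization LINEAR performs can be approximated numerically with central differences about a trim point. A minimal sketch follows; the toy pendulum-like model is invented for illustration, not taken from the program.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Numerically linearize x_dot = f(x, u) about (x0, u0),
    returning A = df/dx and B = df/du via central differences."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy nonlinear model (invented): x1' = x2, x2' = -sin(x1) + u
f = lambda x, u: np.array([x[1], -np.sin(x[0]) + u[0]])
A, B = linearize(f, np.array([0.0, 0.0]), np.array([0.0]))
print(A)  # close to [[0, 1], [-1, 0]]
```

The resulting (A, B) pair is exactly what downstream linear stability analysis and control-law design consume.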
LEMON - LHC Era Monitoring for Large-Scale Infrastructures
NASA Astrophysics Data System (ADS)
Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron
2011-12-01
At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires monitoring not only of servers, network equipment and associated software, but also of additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) in order to maintain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large-scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.
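The agent-plus-repository pattern described above can be shown in miniature. This is a toy sketch, not Lemon's actual API; the class, sensor names and values are all invented.

```python
import time

class Agent:
    """Toy monitoring agent: registered sensors are callables returning a
    value; collect() samples every sensor and forwards (timestamp, name,
    value) tuples to a central repository (here just a list)."""
    def __init__(self, repository):
        self.sensors = {}
        self.repository = repository

    def register(self, name, sensor):
        self.sensors[name] = sensor

    def collect(self):
        now = time.time()
        for name, sensor in self.sensors.items():
            self.repository.append((now, name, sensor()))

repo = []
agent = Agent(repo)
agent.register("temperature_c", lambda: 21.5)  # stand-in for a real probe
agent.register("power_w", lambda: 430.0)
agent.collect()
print([(name, value) for _, name, value in repo])
```

Registering a sensor as a plain callable is what makes "rapid development of new sensors" cheap: a new metric is one function, not a new agent.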
Student Research in Computational Astrophysics
NASA Astrophysics Data System (ADS)
Blondin, J. M.
1999-12-01
Computational physics can shorten the long road from freshman physics major to independent research by providing students with powerful tools to deal with the complexities of modern research problems. At North Carolina State University we have introduced dozens of students to astrophysics research using the tools of computational fluid dynamics. We have used several formats for working with students, including the traditional approach of one-on-one mentoring, a more group-oriented format in which several students work together on one or more related projects, and a novel attempt to involve an entire class in a coordinated semester research project. The advantages and disadvantages of these formats will be discussed at length, but the single most important influence has been peer support. Having students work in teams or learn the tools of research together but tackle different problems has led to more positive experiences than a lone student diving into solo research. This work is supported by an NSF CAREER Award.
Information Power Grid (IPG) Tutorial 2003
NASA Technical Reports Server (NTRS)
Meyers, George
2003-01-01
For NASA and the general community today, Grid middleware: a) provides tools to access/use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. IPG provides tools to enable the building of frameworks for applications, and provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.
Stone, John E.; Hynninen, Antti-Pekka; Phillips, James C.; Schulten, Klaus
2017-01-01
All-atom molecular dynamics simulations of biomolecules provide a powerful tool for exploring the structure and dynamics of large protein complexes within realistic cellular environments. Unfortunately, such simulations are extremely demanding in terms of their computational requirements, and they present many challenges in terms of preparation, simulation methodology, and analysis and visualization of results. We describe our early experiences porting the popular molecular dynamics simulation program NAMD and the simulation preparation, analysis, and visualization tool VMD to GPU-accelerated OpenPOWER hardware platforms. We report our experiences with compiler-provided autovectorization and compare with hand-coded vector intrinsics for the POWER8 CPU. We explore the performance benefits obtained from unique POWER8 architectural features such as 8-way SMT and its value for particular molecular modeling tasks. Finally, we evaluate the performance of several GPU-accelerated molecular modeling kernels and relate them to other hardware platforms. PMID:29202130
Solar Power Tower Integrated Layout and Optimization Tool
...methods to reduce the overall computational burden while generating accurate and precise results. These methods have been developed as part of the U.S. Department of Energy (DOE) SunShot Initiative research
OpenStructure: a flexible software framework for computational structural biology.
Biasini, Marco; Mariani, Valerio; Haas, Jürgen; Scheuber, Stefan; Schenk, Andreas D; Schwede, Torsten; Philippsen, Ansgar
2010-10-15
Developers of new methods in computational structural biology are often hampered in their research by incompatible software tools and non-standardized data formats. To address this problem, we have developed OpenStructure as a modular open source platform to provide a powerful, yet flexible general working environment for structural bioinformatics. OpenStructure consists primarily of a set of libraries written in C++ with a cleanly designed application programmer interface. All functionality can be accessed directly in C++ or in a Python layer, meeting both the requirements for high efficiency and ease of use. Powerful selection queries and the notion of entity views to represent these selections greatly facilitate the development and implementation of algorithms on structural data. The modular integration of computational core methods with powerful visualization tools makes OpenStructure an ideal working and development environment. Several applications, such as the latest versions of IPLT and QMean, have been implemented based on OpenStructure-demonstrating its value for the development of next-generation structural biology algorithms. Source code licensed under the GNU lesser general public license and binaries for MacOS X, Linux and Windows are available for download at http://www.openstructure.org. torsten.schwede@unibas.ch Supplementary data are available at Bioinformatics online.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
Software systems for modeling articulated figures
NASA Technical Reports Server (NTRS)
Phillips, Cary B.
1989-01-01
Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet insure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.
NASA Astrophysics Data System (ADS)
Matsypura, Dmytro
In this dissertation, I develop a new theoretical framework for the modeling, pricing analysis, and computation of solutions to electric power supply chains with power generators, suppliers, transmission service providers, and the inclusion of consumer demands. In particular, I advocate the application of finite-dimensional variational inequality theory, projected dynamical systems theory, game theory, network theory, and other tools that have been recently proposed for the modeling and analysis of supply chain networks (cf. Nagurney (2006)) to electric power markets. This dissertation contributes to the extant literature on the modeling, analysis, and solution of supply chain networks, including global supply chains, in general, and electric power supply chains, in particular, in the following ways. It develops a theoretical framework for modeling, pricing analysis, and computation of electric power flows/transactions in electric power systems using the rationale for supply chain analysis. The models developed include both static and dynamic ones. The dissertation also adds a new dimension to the methodology of the theory of projected dynamical systems by proving that, irrespective of the speeds of adjustment, the equilibrium of the system remains the same. Finally, I include alternative fuel suppliers, along with their behavior into the supply chain modeling and analysis framework. This dissertation has strong practical implications. In an era in which technology and globalization, coupled with increasing risk and uncertainty, complicate electricity demand and supply within and between nations, the successful management of electric power systems and pricing become increasingly pressing topics with relevance not only for economic prosperity but also national security. This dissertation addresses such related topics by providing models, pricing tools, and algorithms for decentralized electric power supply chains. 
This dissertation is based heavily on the following coauthored papers: Nagurney, Cruz, and Matsypura (2003), Nagurney and Matsypura (2004, 2005, 2006), Matsypura and Nagurney (2005), Matsypura, Nagurney, and Liu (2006).
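The variational-inequality machinery the dissertation advocates can be illustrated with the basic projection method, iterating x ← P_K(x − τF(x)) until a fixed point. The quadratic cost function and all numbers below are invented for the sketch; convergence here relies on F being strongly monotone.

```python
import numpy as np

def solve_vi(F, project, x0, tau=0.2, tol=1e-8, max_iter=10000):
    """Basic projection method for a variational inequality VI(F, K):
    iterate x <- P_K(x - tau * F(x)) until a fixed point is reached."""
    x = x0
    for _ in range(max_iter):
        x_new = project(x - tau * F(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy electric-power flow example (numbers invented): marginal costs
# F(x) = Qx + c over the feasible set K = nonnegative flows.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-4.0, 1.0])
F = lambda x: Q @ x + c
project = lambda x: np.maximum(x, 0.0)  # projection onto nonnegative orthant
x_star = solve_vi(F, project, np.zeros(2))
print(np.round(x_star, 4))  # equilibrium flow: second link priced out
```

At the computed point the complementarity conditions hold: each positive flow has zero marginal cost, and each zero flow has nonnegative marginal cost.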
Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frambati, S.; Frignani, M.
2012-07-01
We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)
A computer-based specification methodology
NASA Technical Reports Server (NTRS)
Munck, Robert G.
1986-01-01
Standard practices for creating and using system specifications are inadequate for large, advanced-technology systems. A need exists to break away from paper documents in favor of documents that are stored in computers and which are read and otherwise used with the help of computers. An SADT-based system, running on the proposed Space Station data management network, could be a powerful tool for doing much of the required technical work of the Station, including creating and operating the network itself.
Space Shuttle Debris Transport
NASA Technical Reports Server (NTRS)
Gomez, Reynaldo J., III
2010-01-01
This slide presentation reviews the assessment of debris damage to the Space Shuttle, and the use of computation to assist in the space shuttle applications. The presentation reviews the sources of debris, a mechanism for determining the probability of damaging debris impacting the shuttle, tools used, eliminating potential damaging debris sources, the use of computation to assess while inflight damage, and a chart showing the applications that have been used on increasingly powerful computers simulate the shuttle and the debris transport.
Attack-Resistant Trust Metrics
NASA Astrophysics Data System (ADS)
Levien, Raph
The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.
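The flow-based idea behind such metrics can be caricatured in a few lines. This is a much-simplified sketch, not the actual Advogato algorithm (which is formulated as a network flow problem), and the certification graph below is hypothetical.

```python
from collections import deque

def trust_set(graph, seed, capacity):
    """Much-simplified flow-style trust propagation: the seed starts with
    `capacity` units; each reached node keeps one unit (becoming trusted)
    and splits the remainder evenly among the peers it certifies.
    Nodes receiving at least one unit are accepted."""
    received = {seed: float(capacity)}
    trusted = set()
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        units = received.get(node, 0.0)
        if units < 1.0:
            continue                      # not enough capacity reaches it
        trusted.add(node)
        peers = [p for p in graph.get(node, []) if p not in trusted]
        if peers:
            share = (units - 1.0) / len(peers)
            for p in peers:
                if p not in received:
                    queue.append(p)
                received[p] = received.get(p, 0.0) + share
    return trusted

# Hypothetical certification graph: edges point certifier -> certified.
graph = {"seed": ["alice", "bob"], "alice": ["carol"], "bob": [],
         "carol": ["mallory"]}
print(sorted(trust_set(graph, "seed", 5)))
```

Note the attack-resistance flavor: capacity dilutes with distance, so "mallory", certified only at the fringe, receives no units and is excluded no matter how many fake accounts certify each other downstream.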
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. We therefore developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties.
We tested the performance of the platform based on taxi trajectory analysis. Results suggested that GISpark achieves excellent run time performance in spatiotemporal big data applications.
Mobile Centers For Secondary Power Distribution
NASA Technical Reports Server (NTRS)
Mears, Robert L.
1990-01-01
Concept for distribution of 60-Hz ac power in large building devoted to assembly and testing of equipment improves safety, reduces number of outlets and lengthy cables, and readily accommodates frequent changes in operations and configuration. Power from floor recess fed via unobtrusive cable to portable power management center: cart containing variety of outlets and circuit breakers, wheeled to convenient location near equipment to be assembled or tested. Power distribution system presents larger range of operational configurations than fixed installation. Meets tighter standards to feed computers and delicate instruments. Industrial-grade power suitable for power tools and other hardware. Three-phase and single-phase outlets available from each.
New Tools for Estimating and Managing Local/Regional Air Quality Impacts of Prescribed Burns
2013-02-01
...the dryer fuels and horizontal burn configuration would result in a more intense fire and provide emission factors closer to those from the flaming...the EPA web site, the CMAQ Model is a powerful computational tool used by EPA and states for air quality management in that it can be used to design
Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning
NASA Technical Reports Server (NTRS)
Otterstatter, Matthew R.
2005-01-01
The universe is infinitely complex; however, the human mind has a finite capacity. The multitude of possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.
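The rollback (backward-induction) computation at the heart of decision tree analysis is simple to state: take expected values at chance nodes and the best value at decision nodes, from the leaves upward. A minimal sketch follows; the nested-dict node format and the mission-planning numbers are invented for illustration.

```python
def rollback(node):
    """Backward induction on a decision tree given as nested dicts:
    - leaf:     {"value": v}
    - chance:   {"chance": [(prob, subtree), ...]}  -> expected value
    - decision: {"decision": {name: subtree, ...}}  -> best option
    Returns (value, chosen_option_or_None)."""
    if "value" in node:
        return node["value"], None
    if "chance" in node:
        ev = sum(p * rollback(sub)[0] for p, sub in node["chance"])
        return ev, None
    best_name, best_val = None, float("-inf")
    for name, sub in node["decision"].items():
        val, _ = rollback(sub)
        if val > best_val:
            best_name, best_val = name, val
    return best_val, best_name

# Toy mission-planning choice with invented payoffs.
tree = {"decision": {
    "safe_route":  {"value": 60.0},
    "risky_route": {"chance": [(0.7, {"value": 100.0}),
                               (0.3, {"value": 0.0})]},
}}
print(rollback(tree))  # best expected value and the branch that achieves it
```

The spreadsheet macros described above automate exactly this recursion, plus tree construction and presentation, over much larger trees.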
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
Program to determine space vehicle response to wind turbulence
NASA Technical Reports Server (NTRS)
Wilkening, H. D.
1972-01-01
Computer program was developed as prelaunch wind monitoring tool for Saturn 5 vehicle. Program accounts for characteristic wind changes including turbulence power spectral density, wind shear, peak wind velocity, altitude, and wind direction using stored variational statistics.
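The turbulence ingredient of such a wind model can be mimicked by passing white noise through a shaping filter whose output matches a target power spectral density. The first-order filter below is a much-simplified stand-in for the Dryden/von Kármán spectra used in real launch-wind models, and all parameter values are invented.

```python
import math
import random

def turbulence_series(n, sigma, tau, dt, seed=0):
    """Generate a turbulence-like velocity series by passing Gaussian white
    noise through a first-order (Dryden-style) shaping filter:
        u[k+1] = a*u[k] + b*w[k],   a = exp(-dt/tau)
    with b chosen so the stationary output variance equals sigma**2."""
    rng = random.Random(seed)
    a = math.exp(-dt / tau)
    b = sigma * math.sqrt(1.0 - a * a)
    u, out = 0.0, []
    for _ in range(n):
        u = a * u + b * rng.gauss(0.0, 1.0)
        out.append(u)
    return out

series = turbulence_series(n=20000, sigma=2.0, tau=5.0, dt=0.1)
var = sum(x * x for x in series) / len(series)
print(round(var, 2))  # sample variance, statistically close to sigma**2 = 4
```

The time constant `tau` sets where the simulated spectrum rolls off, which is the knob a prelaunch monitoring model would fit to measured wind data.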
Impact of computational structure-based methods on drug discovery.
Reynolds, Charles H
2014-01-01
Structure-based drug design has become an indispensible tool in drug discovery. The emergence of structure-based design is due to gains in structural biology that have provided exponential growth in the number of protein crystal structures, new computational algorithms and approaches for modeling protein-ligand interactions, and the tremendous growth of raw computer power in the last 30 years. Computer modeling and simulation have made major contributions to the discovery of many groundbreaking drugs in recent years. Examples are presented that highlight the evolution of computational structure-based design methodology, and the impact of that methodology on drug discovery.
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
Laboratory for Computer Science Progress Report 19, 1 July 1981-30 June 1982.
1984-05-01
...exploring distributed operating systems and the architecture of single-user powerful computers that are interconnected by communication networks. ...In particular, we expect to experiment with languages, operating systems, and applications that establish the feasibility of distributed
Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.
Pearl, Lisa S; Sprouse, Jon
2015-06-01
Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.
Integration of a neuroimaging processing pipeline into a pan-canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
Computer-aided design of the RF-cavity for a high-power S-band klystron
NASA Astrophysics Data System (ADS)
Kant, D.; Bandyopadhyay, A. K.; Pal, D.; Meena, R.; Nangru, S. C.; Joshi, L. M.
2012-08-01
This article describes the computer-aided design of the RF-cavity for a S-band klystron operating at 2856 MHz. State-of-the-art electromagnetic simulation tools SUPERFISH, CST Microwave studio, HFSS and MAGIC have been used for cavity design. After finalising the geometrical details of the cavity through simulation, it has been fabricated and characterised through cold testing. Detailed results of the computer-aided simulation and cold measurements are presented in this article.
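Before geometry is handed to field solvers such as those named above, a first-cut cavity size typically comes from the ideal pillbox formula for the TM010 mode. The sketch below is that textbook estimate only; real reentrant klystron cavities deviate substantially from it, which is why the simulation and cold-test steps matter.

```python
import math

C = 299_792_458.0          # speed of light, m/s
J01 = 2.404825557695773    # first zero of the Bessel function J0

def tm010_radius(freq_hz):
    """Radius of an ideal pillbox cavity resonant in the TM010 mode:
    f = c * j01 / (2*pi*R), independent of cavity length."""
    return C * J01 / (2.0 * math.pi * freq_hz)

R = tm010_radius(2.856e9)   # the S-band frequency from the abstract
print(round(R * 1000, 2), "mm")  # roughly 40 mm
```

Doubling the frequency halves the radius, which is why S-band cavities are centimetre-scale objects.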
Computational nanomedicine: modeling of nanoparticle-mediated hyperthermal cancer therapy
Kaddi, Chanchala D; Phan, John H; Wang, May D
2016-01-01
Nanoparticle-mediated hyperthermia for cancer therapy is a growing area of cancer nanomedicine because of the potential for localized and targeted destruction of cancer cells. Localized hyperthermal effects are dependent on many factors, including nanoparticle size and shape, excitation wavelength and power, and tissue properties. Computational modeling is an important tool for investigating and optimizing these parameters. In this review, we focus on computational modeling of magnetic and gold nanoparticle-mediated hyperthermia, followed by a discussion of new opportunities and challenges. PMID:23914967
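The kind of parameter exploration described above can be prototyped with a coarse finite-difference heat model. The 1-D sketch below is deliberately minimal: all coefficients are invented, and the perfusion term of the full Pennes bioheat equation is omitted.

```python
# Minimal 1-D explicit finite-difference sketch of heat deposition by a
# nanoparticle-loaded region in tissue (coefficients invented/simplified).
def simulate(n=51, steps=2000, dx=1e-3, dt=0.05,
             alpha=1.4e-7, source_peak=0.02, body_temp=37.0):
    """Return the temperature profile after heating. `source_peak` is the
    temperature-rise rate (K/s) in the nanoparticle-loaded centre region."""
    T = [body_temp] * n
    mid = n // 2
    r = alpha * dt / dx**2          # explicit scheme: needs r < 0.5
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            src = source_peak if abs(i - mid) <= 2 else 0.0
            Tn[i] = T[i] + r * (T[i+1] - 2*T[i] + T[i-1]) + dt * src
        Tn[0], Tn[-1] = body_temp, body_temp   # far tissue acts as heat sink
        T = Tn
    return T

T = simulate()
print(round(max(T), 1))  # peak temperature in the heated region
```

Sweeping `source_peak` (which stands in for nanoparticle concentration and excitation power) against the resulting peak temperature is the simplest version of the optimization problem the review discusses.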
Chiral phosphoric acid catalysis: from numbers to insights.
Maji, Rajat; Mallojjala, Sharath Chandra; Wheeler, Steven E
2018-02-19
Chiral phosphoric acids (CPAs) have emerged as powerful organocatalysts for asymmetric reactions, and applications of computational quantum chemistry have revealed important insights into the activity and selectivity of these catalysts. In this tutorial review, we provide an overview of computational tools at the disposal of computational organic chemists and demonstrate their application to a wide array of CPA catalysed reactions. Predictive models of the stereochemical outcome of these reactions are discussed along with specific examples of representative reactions and an outlook on remaining challenges in this area.
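The standard way such computed barriers map onto predictions is the Boltzmann relation between the activation free-energy difference of the diastereomeric transition states and the enantiomeric excess. A small sketch, with 2.0 kcal/mol as a purely illustrative input:

```python
import math

R_GAS = 8.314462618  # gas constant, J/(mol*K)

def ee_from_ddg(ddg_kcal, temp_k=298.15):
    """Enantiomeric excess predicted from a computed ddG-double-dagger
    (Curtin-Hammett limit): ratio = exp(-ddG/RT), ee = (1-ratio)/(1+ratio)."""
    ddg_j = ddg_kcal * 4184.0
    ratio = math.exp(-ddg_j / (R_GAS * temp_k))
    return (1.0 - ratio) / (1.0 + ratio)

print(round(100 * ee_from_ddg(2.0), 1))  # ~93% ee at room temperature
```

The exponential dependence is why sub-kcal/mol accuracy in the computed transition-state energies is needed for quantitatively useful selectivity predictions.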
NASA Astrophysics Data System (ADS)
Aharonov, Dorit
In the last few years, theoretical study of quantum systems serving as computational devices has achieved tremendous progress. We now have strong theoretical evidence that quantum computers, if built, might be used as a dramatically powerful computational tool, capable of performing tasks which seem intractable for classical computers. This review tells the story of theoretical quantum computation. I leave out the developing topic of experimental realizations of the model, and neglect other closely related topics, namely quantum information and quantum communication. As a result of narrowing the scope of this paper, I hope it has gained the benefit of being an almost self-contained introduction to the exciting field of quantum computation. The review begins with background on theoretical computer science, Turing machines and Boolean circuits. In light of these models, I define quantum computers, and discuss the issue of universal quantum gates. Quantum algorithms, including Shor's factorization algorithm and Grover's algorithm for searching databases, are explained. I will devote much attention to understanding what the origins of the quantum computational power are, and what the limits of this power are. Finally, I describe the recent theoretical results which show that quantum computers maintain their complexity power even in the presence of noise, inaccuracies and finite precision. This question cannot be separated from that of quantum complexity because any realistic model will inevitably be subjected to such inaccuracies. I tried to put all results in their context, asking what the implications to other issues in computer science and physics are. At the end of this review, I make these connections explicit by discussing the possible implications of quantum computation on fundamental physical questions such as the transition from quantum to classical physics.
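Nothing stops a classical computer from simulating small instances of these algorithms directly. A state-vector sketch of Grover's search over four items follows; this is simulation only, and the cost grows exponentially with the number of qubits, which is the whole point of the quantum speedup.

```python
import numpy as np

def grover(n_qubits, marked):
    """Simulate Grover search over N = 2**n_qubits items by direct
    state-vector arithmetic (classical simulation, no hardware)."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition
    iterations = int(np.pi / 4 * np.sqrt(N)) # ~floor(pi/4 * sqrt(N))
    for _ in range(iterations):
        state[marked] *= -1.0                # oracle: flip marked amplitude
        state = 2.0 * state.mean() - state   # inversion about the mean
    return np.abs(state) ** 2                # measurement probabilities

probs = grover(2, marked=3)
print(np.round(probs, 3))  # probability concentrates on the marked item
```

With N = 4 a single Grover iteration suffices, whereas a classical search needs on average more than two queries; the O(sqrt(N)) query count is the advantage Grover's algorithm provides.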
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
NASA Astrophysics Data System (ADS)
Poulet, T.; Veveakis, M.; Paesold, M.; Regenauer-Lieb, K.
2014-12-01
Multiphysics modelling has become an indispensable tool for geoscientists to simulate the complex behaviours observed in their various fields of study, where multiple processes are involved, including thermal, hydraulic, mechanical and chemical (THMC) laws. This modelling activity involves simulations that are computationally expensive, and its soaring uptake is tightly linked to the increasing availability of supercomputing power and easy access to powerful nonlinear solvers such as PETSc (http://www.mcs.anl.gov/petsc/). The Multiphysics Object-Oriented Simulation Environment (MOOSE) is a finite-element, multiphysics framework (http://mooseframework.org) that can harness such computational power and allows scientists to easily develop tightly coupled, fully implicit multiphysics simulations that run automatically in parallel on large clusters. This open-source framework provides a powerful tool for collaboration on numerical modelling activities, and we are contributing to its development with REDBACK (https://github.com/pou036/redback), a module for Rock mEchanics with Dissipative feedBACKs. REDBACK builds on the tensor mechanics finite strain implementation available in MOOSE to provide a THMC simulator whose energetic formulation highlights the importance of all dissipative terms in the coupled system of equations. We show first applications to fully coupled dehydration reactions triggering episodic fluid transfer through shear zones (Alevizos et al., 2014). The dimensionless approach used allows focusing on the critical underlying variables driving the observed behaviours, and this tool is specifically designed to study the material instabilities underpinning geological features such as faulting, folding, boudinage, shearing and fracturing.
REDBACK provides a collaborative and educational tool which captures the physical and mathematical understanding of such material instabilities and provides an easy way to apply this knowledge to realistic scenarios, where the size and complexity of the geometries considered, along with the material parameters distributions, add as many sources of different instabilities. References: Alevizos, S., T. Poulet, and E. Veveakis (2014), J. Geophys. Res., 119, 4558-4582, doi:10.1002/2013JB010070.
ERIC Educational Resources Information Center
TechTrends, 1992
1992-01-01
Reviews new educational technology products, including a microcomputer-based tutoring system, laser barcode reader, video/data projectors, CD-ROM for notebook computers, a system to increase a printer's power, data cartridge storage shell, knowledge-based decision tool, video illustrator, interactive videodiscs, surge protectors, scanner system,…
The AgESGUI geospatial simulation system for environmental model application and evaluation
USDA-ARS?s Scientific Manuscript database
Practical decision making in spatially-distributed environmental assessment and management is increasingly being based on environmental process-based models linked to geographical information systems (GIS). Furthermore, powerful computers and Internet-accessible assessment tools are providing much g...
Clock Agreement Among Parallel Supercomputer Nodes
Jones, Terry R.; Koenig, Gregory A.
2014-04-30
This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers, including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18,000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming applications and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.
A Format for Phylogenetic Placements
Matsen, Frederick A.; Hoffman, Noah G.; Gallagher, Aaron; Stamatakis, Alexandros
2012-01-01
We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement. PMID:22383988
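As an illustration of the kind of document the abstract describes (a lightweight JSON record mapping reads onto numbered tree edges), here is a minimal sketch. The field names follow the published "jplace" conventions, but treat the details as illustrative rather than normative:

```python
import json

# One placement record: read "read_001" attaches to edge 0 of the tree,
# whose edges are labelled with {curly-brace} numbers.
placement_doc = {
    "version": 3,
    "tree": "((A:0.1{0},B:0.2{1}):0.05{2});",
    "fields": ["edge_num", "likelihood", "like_weight_ratio",
               "distal_length", "pendant_length"],
    "placements": [
        {"p": [[0, -1234.5, 0.9, 0.03, 0.01]], "n": ["read_001"]}
    ],
    "metadata": {"invocation": "illustrative example only"},
}

# JSON makes the format language-neutral: any modern language can
# serialize and parse it without a custom parser.
text = json.dumps(placement_doc)
roundtrip = json.loads(text)
```

The extensibility the authors mention comes for free: tools can add their own keys (e.g. under "metadata") without breaking consumers that only read the standard fields.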
The research and application of the power big data
NASA Astrophysics Data System (ADS)
Zhang, Suxiang; Zhang, Dong; Zhang, Yaping; Cao, Jinping; Xu, Huiming
2017-01-01
Facing the growing environmental crisis, improving energy efficiency is an important problem, and power big data is a main supporting tool for realizing demand-side management and response. With the promotion of smart power consumption, distributed clean energy and electric vehicles are seeing wide application; meanwhile, with the continuous development of Internet of Things technology, more applications connect to endpoints in the grid, so that large numbers of electric terminal devices and new energy sources access the smart grid and produce massive, heterogeneous, multi-state electricity data. These data constitute a precious asset for grid enterprises: power big data. How to transform them into valuable knowledge and effective operations is an important problem, and it requires interoperation across the smart grid. In this paper, we investigate various applications of power big data that integrate cloud computing and big data technology, including online monitoring of electricity consumption, short-term power load forecasting, and energy-efficiency analysis. Based on Hadoop, HBase, Hive, etc., we realize ETL and OLAP functions; we also adopt a parallel computing framework for the power load forecasting algorithms and propose a parallel locally weighted linear regression model. We further study an energy-efficiency rating model to comprehensively evaluate the energy consumption of electricity users, which allows users to understand their real-time energy consumption and adjust their electricity behavior to reduce it, providing a basis for decision making. Taking an intelligent industrial park as an example, this paper demonstrates electricity consumption management. In the future, power big data will provide decision-support tools for energy conservation and emissions reduction.
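The parallel locally weighted linear regression model is not spelled out in the abstract; as a hedged sketch of the underlying technique, locally weighted regression solves one small weighted least-squares problem per query point, and the per-query independence is exactly what makes it easy to parallelize across a cluster:

```python
import numpy as np

def lwlr_predict(X, y, x0, tau=0.5):
    # Weight each training point by its distance to the query x0,
    # then solve a single weighted least-squares problem for that query.
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2.0 * tau ** 2))
    Xb = np.hstack([np.ones((len(X), 1)), X])     # intercept column
    x0b = np.concatenate(([1.0], x0))
    A = Xb.T * w                                   # X^T W without forming diag(W)
    theta = np.linalg.solve(A @ Xb, A @ y)
    return float(x0b @ theta)

# Queries are independent, so a parallel implementation simply maps
# lwlr_predict over query points (MapReduce-style on Hadoop, say).
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0                          # toy load curve: y = 2x + 1
pred = lwlr_predict(X, y, np.array([0.5]))
```

The bandwidth tau controls how local the fit is; the toy data and parameter values here are illustrative, not from the paper.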
Sedig, Kamran; Parsons, Paul; Dittmer, Mark; Ola, Oluwakemi
2012-01-01
Public health professionals work with a variety of information sources to carry out their everyday activities. In recent years, interactive computational tools have become deeply embedded in such activities. Unlike the early days of computational tool use, the potential of tools nowadays is not limited to simply providing access to information; rather, they can act as powerful mediators of human-information discourse, enabling rich interaction with public health information. If public health informatics tools are designed and used properly, they can facilitate, enhance, and support the performance of complex cognitive activities that are essential to public health informatics, such as problem solving, forecasting, sense-making, and planning. However, the effective design and evaluation of public health informatics tools requires an understanding of the cognitive and perceptual issues pertaining to how humans work and think with information to perform such activities. This paper draws on research that has examined some of the relevant issues, including interaction design, complex cognition, and visual representations, to offer some human-centered design and evaluation considerations for public health informatics tools.
A new approach to the rationale discovery of polymeric biomaterials
Kohn, Joachim; Welsh, William J.; Knight, Doyle
2007-01-01
This paper attempts to illustrate both the need for new approaches to biomaterials discovery as well as the significant promise inherent in the use of combinatorial and computational design strategies. The key observation of this Leading Opinion Paper is that the biomaterials community has been slow to embrace advanced biomaterials discovery tools such as combinatorial methods, high throughput experimentation, and computational modeling in spite of the significant promise shown by these discovery tools in materials science, medicinal chemistry and the pharmaceutical industry. It seems that the complexity of living cells and their interactions with biomaterials has been a conceptual as well as a practical barrier to the use of advanced discovery tools in biomaterials science. However, with the continued increase in computer power, the goal of predicting the biological response of cells in contact with biomaterials surfaces is within reach. Once combinatorial synthesis, high throughput experimentation, and computational modeling are integrated into the biomaterials discovery process, a significant acceleration is possible in the pace of development of improved medical implants, tissue regeneration scaffolds, and gene/drug delivery systems. PMID:17644176
Higher-order ice-sheet modelling accelerated by multigrid on graphics cards
NASA Astrophysics Data System (ADS)
Brædstrup, Christian; Egholm, David
2013-04-01
Higher-order ice flow modelling is a very computer intensive process owing primarily to the nonlinear influence of the horizontal stress coupling. When applied for simulating long-term glacial landscape evolution, the ice-sheet models must consider very long time series, while both high temporal and spatial resolution is needed to resolve small effects. The use of higher-order and full stokes models have therefore seen very limited usage in this field. However, recent advances in graphics card (GPU) technology for high performance computing have proven extremely efficient in accelerating many large-scale scientific computations. The general purpose GPU (GPGPU) technology is cheap, has a low power consumption and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists working on ice flow models. Our current research focuses on utilising the GPU as a tool in ice-sheet and glacier modelling. To this extent we have implemented the Integrated Second-Order Shallow Ice Approximation (iSOSIA) equations on the device using the finite difference method. To accelerate the computations, the GPU solver uses a non-linear Red-Black Gauss-Seidel iterator coupled with a Full Approximation Scheme (FAS) multigrid setup to further aid convergence. The GPU finite difference implementation provides the inherent parallelization that scales from hundreds to several thousands of cores on newer cards. We demonstrate the efficiency of the GPU multigrid solver using benchmark experiments.
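The red-black Gauss-Seidel idea can be sketched in serial NumPy (standing in for the GPU kernels; this illustrates the coloring scheme, not the iSOSIA implementation): in the 5-point stencil, cells of one color depend only on cells of the other color, so each half-sweep can update all same-colored cells simultaneously.

```python
import numpy as np

def redblack_sweep(u, f, h, color_masks):
    # One red-black Gauss-Seidel sweep of the 5-point Poisson stencil.
    # All cells of one color are mutually independent, so each half-sweep
    # updates them at once -- the property that maps onto GPU threads.
    for mask in color_masks:
        i, j = np.nonzero(mask)
        u[i, j] = 0.25 * (u[i + 1, j] + u[i - 1, j]
                          + u[i, j + 1] + u[i, j - 1] - h * h * f[i, j])

n = 17
h = 1.0 / (n - 1)
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
exact = x**2 + y**2            # quadratic: solves the discrete system exactly
f = np.full((n, n), 4.0)       # laplacian of x^2 + y^2 is 4

u = np.zeros((n, n))           # zero interior guess; boundary pinned to exact
u[0, :], u[-1, :] = exact[0, :], exact[-1, :]
u[:, 0], u[:, -1] = exact[:, 0], exact[:, -1]

ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
interior = np.zeros((n, n), dtype=bool)
interior[1:-1, 1:-1] = True
color_masks = [interior & ((ii + jj) % 2 == c) for c in (0, 1)]

for _ in range(800):
    redblack_sweep(u, f, h, color_masks)
```

The paper pairs this smoother with an FAS multigrid hierarchy for the nonlinear iSOSIA equations; the plain Poisson problem above is only a minimal stand-in.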
Aerospace Power Systems Design and Analysis (APSDA) Tool
NASA Technical Reports Server (NTRS)
Truong, Long V.
1998-01-01
The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. Using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. The tool operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later; a color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.
Energy efficiency analysis and optimization for mobile platforms
NASA Astrophysics Data System (ADS)
Metri, Grace Camille
The introduction of mobile devices changed the landscape of computing. Gradually, these devices are replacing traditional personal computers (PCs) to become the devices of choice for entertainment, connectivity, and productivity. There are currently at least 45.5 million people in the United States who own a mobile device, and that number is expected to increase to 1.5 billion by 2015. Users of mobile devices expect and demand that their devices deliver maximum performance while consuming as little power as possible. However, due to battery size constraints, the amount of energy stored in these devices is limited and is growing by only 5% annually. As a result, this dissertation focuses on energy efficiency analysis and optimization for mobile platforms. We developed SoftPowerMon, a tool that can power-profile Android platforms in order to expose the power consumption behavior of the CPU. We also performed an extensive set of case studies to determine energy inefficiencies of mobile applications. Through these case studies, we were able to propose optimization techniques to increase the energy efficiency of mobile devices, and we proposed guidelines for energy-efficient application development. In addition, we developed BatteryExtender, an adaptive user-guided tool for power management of mobile devices. The tool enables users to extend battery life on demand for a specific duration until a particular task is completed. Moreover, we examined the power consumption of Systems-on-Chip (SoCs) and observed the impact on energy efficiency of offloading tasks from the CPU to specialized custom engines. Based on our case studies, we demonstrate that current software-based power profiling techniques for SoCs can have an error rate close to 12%, which needs to be addressed in order to optimize the energy consumption of the SoC.
Finally, we summarize our contributions and outline possible directions for future research in this field.
B-MIC: An Ultrafast Three-Level Parallel Sequence Aligner Using MIC.
Cui, Yingbo; Liao, Xiangke; Zhu, Xiaoqian; Wang, Bingqiang; Peng, Shaoliang
2016-03-01
Sequence alignment is the central process in sequence analysis, in which raw sequencing data are mapped to a reference genome. The large amount of data generated by NGS is far beyond the processing capabilities of existing alignment tools, and sequence alignment has consequently become the bottleneck of sequence analysis. Intensive computing power is required to address this challenge. Intel recently announced the MIC coprocessor, which can provide massive computing power. Tianhe-2, currently the world's fastest supercomputer, is equipped with three MIC coprocessors in each compute node. A key property of sequence alignment is that different reads are independent. Exploiting this property, we proposed a MIC-oriented three-level parallelization strategy to speed up BWA, a widely used sequence alignment tool, and developed our ultrafast parallel sequence aligner, B-MIC. B-MIC contains three levels of parallelization: first, parallelization of data IO and read alignment through a three-stage parallel pipeline; second, parallelization enabled by MIC coprocessor technology; third, inter-node parallelization implemented with MPI. In this paper, we demonstrate that B-MIC outperforms BWA by combining these techniques on an Inspur NF5280M server and the Tianhe-2 supercomputer. To the best of our knowledge, B-MIC is the first sequence alignment tool to run on the Intel MIC, and it achieves more than fivefold speedup over the original BWA while maintaining alignment precision.
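The first level, a three-stage pipeline over independent reads, can be sketched with threads and queues; this is a schematic illustration of stage overlap, not B-MIC's actual implementation (which additionally offloads to the MIC and distributes across nodes with MPI):

```python
import queue
import threading

def three_stage_pipeline(reads, align):
    # Stage 1 (read IO), stage 2 (alignment), stage 3 (output) run as
    # concurrent threads linked by queues, so the stages overlap on
    # different reads; read independence makes this safe.
    q_align, q_write = queue.Queue(), queue.Queue()
    results = []

    def reader():
        for r in reads:
            q_align.put(r)
        q_align.put(None)                    # end-of-stream sentinel

    def aligner():
        while (r := q_align.get()) is not None:
            q_write.put(align(r))
        q_write.put(None)

    def writer():
        while (a := q_write.get()) is not None:
            results.append(a)

    threads = [threading.Thread(target=fn) for fn in (reader, aligner, writer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Toy "aligner" standing in for a real mapper such as BWA.
aligned = three_stage_pipeline(["ACGT", "TTAG"], lambda r: (r, "mapped"))
```

With a single aligner thread and FIFO queues the output order matches the input order; a production pipeline would run many aligner workers per stage.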
DR2DI: a powerful computational tool for predicting novel drug-disease associations
NASA Astrophysics Data System (ADS)
Lu, Lu; Yu, Hua
2018-05-01
Finding new candidate diseases for known drugs provides an effective route to fast, low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the development of in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we present a novel and powerful computational tool, DR2DI, for accurately uncovering potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI infers unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employs a semi-supervised and global learning algorithm that can uncover the diseases (drugs) associated with both known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI can be validated against existing database records and the literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
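The abstract does not spell out the Regularized Kernel Classifier; a common method matching that name is regularized least squares in a kernel's feature space (kernel ridge), sketched below under that assumption, with an RBF kernel and toy points standing in for the paper's drug/disease similarity kernels:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian similarity between rows of A and rows of B
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def fit_kernel_classifier(X, y, lam=0.1, gamma=1.0):
    # Regularized least squares: alpha = (K + lam*I)^-1 y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def decision_scores(alpha, X_train, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Two toy classes (+1 = associated pair, -1 = not associated)
X = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [2.9, 3.2]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha = fit_kernel_classifier(X, y)
scores = decision_scores(alpha, X, X)
```

In DR2DI the kernel would encode drug-drug and disease-disease similarities built from omics data rather than Euclidean point distances; the structure of the solve is the same.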
Generative Representations for Computer-Automated Evolutionary Design
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2006-01-01
With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
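The idea of a generative representation as an assembly procedure with module reuse can be illustrated by a tiny rewriting system (this toy is not the paper's antenna or 3D-object encoding): a compact rule set is the "genome", and repeated rewriting reuses modules to build a structure much larger than the rules themselves.

```python
def expand(axiom, rules, steps):
    # A generative representation: the rule set is the compact encoding,
    # and each rewriting step reuses the same modules to grow the design.
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

rules = {"A": "AB", "B": "A"}   # two reusable "modules"
design = expand("A", rules, 4)  # structure grows by reuse, not explicit listing
```

Evolution can mutate the small rule set instead of the full design string, which is what lets modularity, regularity and hierarchy emerge and scale.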
Generative Representations for Computer-Automated Design Systems
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2004-01-01
With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations of these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design, and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.
Using parallel computing for the display and simulation of the space debris environment
NASA Astrophysics Data System (ADS)
Möckel, M.; Wiedemann, C.; Flegel, S.; Gelhaus, J.; Vörsmann, P.; Klinkrad, H.; Krag, H.
2011-07-01
Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. 
In this paper the visualization software will be introduced, including a comparison between the serial and the parallel method of orbit propagation. Ways of how to use the benefits of the latter method for space debris simulation will be discussed. An introduction to OpenCL will be given as well as an exemplary algorithm from the field of space debris simulation.
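The per-object analytical propagation described above can be sketched in vectorized NumPy, which mirrors the data-parallel structure that maps onto GPU threads; the circular-orbit model and parameter values here are illustrative simplifications, not the tool's actual propagator:

```python
import numpy as np

MU_EARTH = 398600.4418   # km^3/s^2, Earth's standard gravitational parameter

def propagate(a, theta0, t):
    # Analytical propagation of many circular orbits at once: every object
    # is independent, so the same arithmetic maps one-to-one onto GPU
    # threads (or, as here, onto SIMD lanes of a vectorized CPU routine).
    n = np.sqrt(MU_EARTH / a**3)      # mean motion [rad/s], per object
    return (theta0 + n * t) % (2.0 * np.pi)

a = np.full(100_000, 7000.0)          # 100k debris objects, 7000 km orbits
theta0 = np.linspace(0.0, 2.0 * np.pi, 100_000, endpoint=False)
period = 2.0 * np.pi * np.sqrt(7000.0**3 / MU_EARTH)
theta = propagate(a, theta0, period)  # advance all objects one revolution
```

An OpenCL port of such a kernel is straightforward because there is no inter-object communication, which is exactly the property the paper exploits.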
Using parallel computing for the display and simulation of the space debris environment
NASA Astrophysics Data System (ADS)
Moeckel, Marek; Wiedemann, Carsten; Flegel, Sven Kevin; Gelhaus, Johannes; Klinkrad, Heiner; Krag, Holger; Voersmann, Peter
Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. 
In this paper the visualization software will be introduced, including a comparison between the serial and the parallel method of orbit propagation. Ways of how to use the benefits of the latter method for space debris simulation will be discussed. An introduction of OpenCL will be given as well as an exemplary algorithm from the field of space debris simulation.
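The per-object independence that makes this problem GPU-friendly is easy to illustrate. The sketch below is a toy mean-anomaly propagator in vectorized NumPy, not the Institute's actual analytical algorithm or its OpenCL implementation; the point is the element-wise structure, which is exactly what maps one object per GPU thread.

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def propagate_mean_anomaly(a_km, M0_rad, dt_s):
    """Advance the mean anomaly of many debris objects at once.

    Each object is independent, so the update is a pure element-wise
    operation -- the data-parallel pattern that maps directly onto a GPU.
    """
    n = np.sqrt(MU / a_km**3)            # mean motion, rad/s (one per object)
    return (M0_rad + n * dt_s) % (2 * np.pi)

# 100,000 synthetic objects propagated in a single vectorized call
rng = np.random.default_rng(0)
a = rng.uniform(6800.0, 42000.0, 100_000)    # semi-major axes, km
M0 = rng.uniform(0.0, 2 * np.pi, 100_000)
M1 = propagate_mean_anomaly(a, M0, 60.0)     # propagate all objects by 60 s
```

In OpenCL the same update would become a kernel body executed once per work-item, one object each.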
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.
How DARHT Works - the World's Most Powerful X-ray Machine
None
2018-06-01
The Dual Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory is an essential scientific tool that supports Stockpile Stewardship at the Laboratory. The world's most powerful X-ray machine, it is used to take high-speed images of mock nuclear devices - data that are used to confirm and refine advanced computer codes in assuring the safety, security, and effectiveness of the U.S. nuclear deterrent.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finch, D.R.; Chandler, J.R.; Church, J.P.
1979-01-01
The SHIELD system is a powerful new computational tool for calculation of isotopic inventory, radiation sources, decay heat, and shielding assessment in part of the nuclear fuel cycle. The integrated approach used in this system permits the communication and management of large fields of numbers efficiently, thus allowing the user to address the technical rather than the computer aspects of a problem. Emphasis on graphical outputs permits large fields of resulting numbers to be displayed efficiently.
The Amber Biomolecular Simulation Programs
CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.
2006-01-01
We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636
HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies
NASA Astrophysics Data System (ADS)
De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.
2017-10-01
PanDA - the Production and Distributed Analysis workload management system - was developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available to bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run on a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks, process each chunk on a different node as a separate PALEOMIX input, and finally merge the output files; this closely mirrors how ATLAS processes and simulates its data. Automated job (re)submission and brokering within PanDA dramatically decreased the total walltime. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
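The split/process/merge pattern described above can be sketched independently of PanDA itself. Everything below is illustrative: `analyze_chunk` stands in for one PALEOMIX job, a thread pool stands in for the worker nodes, and the brokering and (re)submission machinery that PanDA provides is not reproduced.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(reads):
    # Stand-in for one PALEOMIX job running on one worker node:
    # here it simply counts the reads in its chunk.
    return len(reads)

def split(seq, n_chunks):
    """Split a list of reads into n_chunks nearly equal chunks."""
    k, m = divmod(len(seq), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        end = start + k + (1 if i < m else 0)
        chunks.append(seq[start:end])
        start = end
    return chunks

reads = [f"read_{i}" for i in range(1000)]   # synthetic input data
chunks = split(reads, 8)                     # one input file per node
with ThreadPoolExecutor(max_workers=4) as pool:
    partial = list(pool.map(analyze_chunk, chunks))
merged = sum(partial)                        # the final merge step
```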
A Software Upgrade of the NASA Aeroheating Code "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce Mathew
2013-01-01
Computational Fluid Dynamics (CFD) is a powerful and versatile tool for simulating the fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an aging tool: running on a user-unfriendly, legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.
Desktop Publishing: A Powerful Tool for Advanced Composition Courses.
ERIC Educational Resources Information Center
Sullivan, Patricia
1988-01-01
Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)
AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, D.; Alfonsi, A.; Talbot, P.
2016-10-01
The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem being addressed, but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much shorter time (microseconds instead of hours/days).
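As a minimal illustration of the surrogate idea, the sketch below replaces a stand-in "expensive" simulation with a polynomial fit trained on a handful of runs. A real RISMC surrogate would be built from actual thermal-hydraulic code output; the smooth toy function and polynomial degree here are assumptions for illustration only.

```python
import numpy as np

def expensive_simulation(x):
    # Stand-in for a long-running simulation code (e.g. one run = hours).
    return np.sin(3 * x) + 0.5 * x

# A small design of training runs of the "real" code...
x_train = np.linspace(0.0, 2.0, 12)
y_train = expensive_simulation(x_train)

# ...replaced by a cheap polynomial surrogate fitted by least squares.
coeffs = np.polyfit(x_train, y_train, deg=6)
surrogate = np.poly1d(coeffs)

# The surrogate can now be queried densely at negligible cost.
x_new = np.linspace(0.0, 2.0, 200)
max_err = np.max(np.abs(surrogate(x_new) - expensive_simulation(x_new)))
```

The same pattern scales to many input dimensions with richer surrogate families (Gaussian processes, polynomial chaos), which is where the real savings in a probabilistic analysis come from.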
Modeling of power transmission and stress grading for corona protection
NASA Astrophysics Data System (ADS)
Zohdi, T. I.; Abali, B. E.
2017-11-01
Electrical high voltage (HV) machines are prone to corona discharges, leading to power losses as well as damage to the insulating layer. Many different techniques are applied for corona protection, and computational methods aid in selecting the best design. In this paper we develop a reduced-order model in 1D estimating the electric field and temperature distribution of a conductor wrapped with different layers, as is usual for HV machines. Many assumptions and simplifications are undertaken for this 1D model; therefore, we compare its results quantitatively to a direct numerical simulation in 3D. Both models are transient and nonlinear, offering the possibility of either a quick 1D estimate or a full 3D computation at correspondingly higher computational cost. Such tools enable understanding, evaluation, and optimization of corona shielding systems for multilayered coils.
In-silico wear prediction for knee replacements--methodology and corroboration.
Strickland, M A; Taylor, M
2009-07-22
The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).
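A schematic of the kind of wear update such models perform is sketched below. This is a toy Archard-type law with a cross-shear-dependent wear factor; the linear form and the constants are purely illustrative assumptions, not the calibrated 'A/A+B' model or any values from the study.

```python
import numpy as np

def archard_wear(pressure_mpa, sliding_mm, cs_ratio, a=1.0e-7, b=2.0e-8):
    """Toy Archard-type wear update with a cross-shear (CS) dependent
    wear factor. The linear form a*CS + b and the constants are purely
    illustrative; calibrated forms and values differ between studies.
    Returns wear depth per element (mm, under the assumed units).
    """
    wear_factor = a * cs_ratio + b            # assumed mm^3 / (N mm) scale
    return wear_factor * pressure_mpa * sliding_mm

# One gait cycle discretized over a small contact patch
pressure = np.full(100, 10.0)     # contact pressure, MPa
sliding = np.full(100, 0.5)       # sliding distance per increment, mm
cs = np.linspace(0.0, 0.5, 100)   # cross-shear ratio along the cycle
depth = archard_wear(pressure, sliding, cs).sum()
```

Because each update is a cheap element-wise operation, thousands of such virtual wear tests can be run in a probabilistic study, which is exactly the use case the abstract identifies.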
A Pythonic Approach for Computational Geosciences and Geo-Data Processing
NASA Astrophysics Data System (ADS)
Morra, G.; Yuen, D. A.; Lee, S. M.
2016-12-01
Computational methods and data analysis play a constantly increasing role in the Earth sciences; however, students and professionals must climb a steep learning curve before reaching a level that allows them to run effective models. Furthermore, the recent arrival of powerful new machine learning tools such as Torch and TensorFlow has opened new possibilities but also created a new realm of complications related to the completely different technology employed. We present here a series of examples entirely written in Python, a language that combines the simplicity of Matlab with the power and speed of compiled languages such as C, and apply them to a wide range of geological processes such as porous media flow, multiphase fluid dynamics, creeping flow and many-fault interaction. We also explore ways in which machine learning can be employed in combination with numerical modelling, from immediately interpreting a large number of modelling results to optimizing a set of modelling parameters to obtain a desired simulation. We show that by using Python, undergraduate and graduate students can learn advanced numerical technologies with a minimum of dedicated effort, which in turn encourages them to develop more numerical tools and progress quickly in their computational abilities. We also show how Python allows combining modelling with machine learning like pieces of LEGO, thereby simplifying the transition towards a new kind of scientific geo-modelling. The conclusion is that Python is an ideal tool for creating an infrastructure for the geosciences that allows users to quickly develop tools, reuse techniques and encourage collaborative efforts to interpret and integrate geo-data in profound new ways.
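As one example of the kind of compact Python model such a course can build on, here is a minimal explicit finite-difference solve of 1D pressure diffusion in a porous medium; the grid size, diffusivity and boundary values are arbitrary choices for illustration, not taken from the paper's examples.

```python
import numpy as np

# Explicit finite differences for dP/dt = D * d2P/dx2 on [0, 1],
# with fixed pressures at both ends (Dirichlet boundaries).
nx, nt = 51, 5000
dx, D = 1.0 / (nx - 1), 1.0
dt = 0.4 * dx**2 / D             # satisfies the explicit stability limit (<= 0.5)

P = np.zeros(nx)
P[0] = 1.0                        # high-pressure boundary on the left
for _ in range(nt):
    # vectorized interior update: the RHS is evaluated before assignment
    P[1:-1] += D * dt / dx**2 * (P[2:] - 2 * P[1:-1] + P[:-2])
    P[0], P[-1] = 1.0, 0.0        # re-impose the boundary values

# After many steps the profile relaxes to the linear steady state.
```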
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Xiao; Blazek, Jonathan A.; McEwen, Joseph E.
Cosmological perturbation theory is a powerful tool to predict the statistics of large-scale structure in the weakly non-linear regime, but even at 1-loop order it results in computationally expensive mode-coupling integrals. Here we present a fast algorithm for computing 1-loop power spectra of quantities that depend on the observer's orientation, thereby generalizing the FAST-PT framework (McEwen et al., 2016) that was originally developed for scalars such as the matter density. This algorithm works for an arbitrary input power spectrum and substantially reduces the time required for numerical evaluation. We apply the algorithm to four examples: intrinsic alignments of galaxies in the tidal torque model; the Ostriker-Vishniac effect; the secondary CMB polarization due to baryon flows; and the 1-loop matter power spectrum in redshift space. Code implementing this algorithm and these applications is publicly available at https://github.com/JoeMcEwen/FAST-PT.
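The speed-up at the heart of FAST-PT comes from replacing O(N²) mode-coupling sums with FFT-based convolutions. The 1D toy below is not the actual FAST-PT decomposition (which works with power-law expansions in log k); it only demonstrates the equivalence on a schematic integral I(k) = Σ_q P(q) P(k−q), with an assumed toy power spectrum.

```python
import numpy as np

# Schematic stand-in for a 1-loop mode-coupling term:
#   I(k) = sum_{q=0..k} P(q) * P(k - q)
# Direct evaluation costs O(N^2); an FFT convolution gives the same
# answer in O(N log N) -- the core idea behind FAST-PT's approach.
N = 256
P = 1.0 / (1.0 + np.arange(N) / 32.0) ** 2   # toy power spectrum on a grid

# O(N^2) brute force
direct = np.array([np.sum(P[: k + 1] * P[k::-1]) for k in range(N)])

# O(N log N): zero-pad to 2N so the circular convolution is linear
fft_based = np.fft.irfft(np.fft.rfft(P, 2 * N) ** 2, 2 * N)[:N]
```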
Computational resources for ribosome profiling: from database to Web server and software.
Wang, Hongwei; Wang, Yan; Xie, Zhi
2017-08-14
Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits not only from the power of ribosome profiling itself but also from the extensive range of computational resources available for it. At present, however, a comprehensive review of these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Driscoll, Brandon; Jaffray, David; Coolens, Catherine
2014-03-01
Purpose: To provide clinicians & researchers participating in multi-centre clinical trials with a central repository for large-volume dynamic imaging data, as well as a set of tools providing end-to-end testing and image analysis standards of practice. Methods: There are three main pieces to the data archiving and analysis system: the PACS server, the data analysis computer(s), and the high-speed networks that connect them. Each clinical trial is anonymized using a customizable anonymizer and is stored on a PACS accessible only by AE title access control. The remote analysis station consists of a single virtual machine per trial running on a powerful PC supporting multiple simultaneous instances. Imaging data management and analysis is performed within ClearCanvas Workstation® using custom-designed plug-ins for kinetic modelling (The DCE-Tool®), quality assurance (The DCE-QA Tool) and RECIST. Results: A framework has been set up currently serving seven clinical trials spanning five hospitals, with three more trials to be added over the next six months. After initial rapid image transfer (+2 MB/s), all data analysis is done server-side, making it robust and rapid. This has provided the ability to perform computationally expensive operations such as voxel-wise kinetic modelling on very large data archives (+20 GB/50k images per patient) remotely with minimal end-user hardware. Conclusions: This system is currently in its proof-of-concept stage but has been used successfully to send and analyze data from remote hospitals. Next steps will involve scaling up the system with a more powerful PACS and multiple high-powered analysis machines, as well as adding real-time review capabilities.
Object-oriented Tools for Distributed Computing
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1993-01-01
Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.
Computational neuropharmacology: dynamical approaches in drug discovery.
Aradi, Ildiko; Erdi, Péter
2006-05-01
Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.
Universal quantum computation with little entanglement.
Van den Nest, Maarten
2013-02-08
We show that universal quantum computation can be achieved in the standard pure-state circuit model while the entanglement entropy of every bipartition is small in each step of the computation. The entanglement entropy required for large-scale quantum computation even tends to zero. Moreover we show that the same conclusion applies to many entanglement measures commonly used in the literature. This includes e.g., the geometric measure, localizable entanglement, multipartite concurrence, squashed entanglement, witness-based measures, and more generally any entanglement measure which is continuous in a certain natural sense. These results demonstrate that many entanglement measures are unsuitable tools to assess the power of quantum computers.
ERIC Educational Resources Information Center
Dehinbo, Johnson
2010-01-01
The use of email utilizes the power of Web 1.0 to enable users to access their email from any computer or mobile device that is connected to the Internet, making email valuable in acquiring and transferring knowledge. But the advent of Web 2.0 and social networking seems to indicate certain limitations of email. The use of social networking seems…
PLAYGROUND: preparing students for the cyber battleground
NASA Astrophysics Data System (ADS)
Nielson, Seth James
2016-12-01
Attempting to educate practitioners of computer security can be difficult if for no other reason than the breadth of knowledge required today. The security profession includes widely diverse subfields including cryptography, network architectures, programming, programming languages, design, coding practices, software testing, pattern recognition, economic analysis, and even human psychology. While an individual may choose to specialize in one of these more narrow elements, there is a pressing need for practitioners that have a solid understanding of the unifying principles of the whole. We created the Playground network simulation tool and used it in the instruction of a network security course to graduate students. This tool was created for three specific purposes. First, it provides simulation sufficiently powerful to permit rigorous study of desired principles while simultaneously reducing or eliminating unnecessary and distracting complexities. Second, it permitted the students to rapidly prototype a suite of security protocols and mechanisms. Finally, with equal rapidity, the students were able to develop attacks against the protocols that they themselves had created. Based on our own observations and student reviews, we believe that these three features combine to create a powerful pedagogical tool that provides students with a significant amount of breadth and intense emotional connection to computer security in a single semester.
NASA Technical Reports Server (NTRS)
Maluf, David A.; Koga, Dennis (Technical Monitor)
2002-01-01
This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include: system software requirements and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.
Modular, Semantics-Based Composition of Biosimulation Models
ERIC Educational Resources Information Center
Neal, Maxwell Lewis
2010-01-01
Biosimulation models are valuable, versatile tools used for hypothesis generation and testing, codification of biological theory, education, and patient-specific modeling. Driven by recent advances in computational power and the accumulation of systems-level experimental data, modelers today are creating models with an unprecedented level of…
Stakeholder Perceptions of ICT Usage across Management Institutes
ERIC Educational Resources Information Center
Goyal, Ela; Purohit, Seema; Bhagat, Manju
2013-01-01
Information and communication technologies (ICT), which include radio, television and newer digital technologies such as computers and the internet, are potentially powerful tools for extending educational opportunities, formal and non-formal, to one and all. They provide opportunities to deploy innovative teaching methodologies and interesting…
Eco-Evo PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models
We synthesize how advances in computational methods and population genomics can be combined within an Ecological-Evolutionary (Eco-Evo) PVA model. Eco-Evo PVA models are powerful new tools for understanding the influence of evolutionary processes on plant and animal population pe...
Lim, Chun Shen; Brown, Chris M
2017-01-01
Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach
Kapetanovic, I.M.
2008-01-01
It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification and hit-to-lead selection, to optimize the absorption, distribution, metabolism, excretion and toxicity profile, and to avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415
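A minimal sketch of one of the approaches listed above, a ligand-based quantitative structure-activity relationship (QSAR), is a linear fit of activity against molecular descriptors. The descriptors, weights and noise level below are entirely synthetic; a real study would use measured activities and computed descriptors such as logP or molecular weight.

```python
import numpy as np

# Toy ligand-based QSAR: activity as a linear function of three
# hypothetical molecular descriptors for 40 synthetic compounds.
rng = np.random.default_rng(42)
X = rng.normal(size=(40, 3))                   # descriptor matrix
true_w = np.array([1.5, -0.7, 0.3])            # "true" descriptor weights
y = X @ true_w + 0.01 * rng.normal(size=40)    # noisy synthetic activities

# Ordinary least-squares fit with an intercept column appended
A = np.hstack([X, np.ones((40, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)      # recovered weights + intercept
```

The fitted weights can then rank or screen new candidate compounds before any synthesis, which is the streamlining role the abstract describes.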
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.
1992-01-01
The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.
A computer network with scada and case tools for on-line process control in greenhouses
NASA Astrophysics Data System (ADS)
Gieling, Th. H.; van Meurs, W. Th. M.; Janssen, H. J. J.
Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.
A computer network with SCADA and case tools for on-line process control in greenhouses.
Gieling, Th. H.; van Meurs, W. Th. M.; Janssen, H. J. J.
1996-01-01
Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.
SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects
NASA Technical Reports Server (NTRS)
Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M
1998-01-01
SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.
Douglas, GP; Deula, RA; Connor, SE
2003-01-01
Computer-based order entry is a powerful tool for enhancing patient care. A pilot project in the pediatric department of the Lilongwe Central Hospital (LCH) in Malawi, Africa has demonstrated that computer-based order entry (COE): 1) can be successfully deployed and adopted in resource-poor settings, 2) can be built, deployed and sustained at relatively low cost and with local resources, and 3) has a greater potential to improve patient care in developing than in developed countries. PMID:14728338
NASA Technical Reports Server (NTRS)
Grantham, C.
1979-01-01
The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in the interactive environment. The user is protected from the idiosyncrasies of the host computer system by such a complete range of capabilities that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk-top calculator, data editor, file manager, and tool invoker.
Implementing WebGL and HTML5 in Macromolecular Visualization and Modern Computer-Aided Drug Design.
Yuan, Shuguang; Chan, H C Stephen; Hu, Zhenquan
2017-06-01
Web browsers have long been recognized as potential platforms for remote macromolecule visualization. However, the difficulty of transferring large-scale data to clients and the lack of native support for hardware-accelerated applications in the local browser have undermined the feasibility of such utilities. With the introduction of WebGL and HTML5 technologies in recent years, it is now possible to exploit the power of a graphics processing unit (GPU) from a browser without any third-party plugin, and many new tools have been developed for biological molecule visualization and modern drug discovery. In contrast to traditional offline tools, WebGL- and HTML5-based tools feature real-time computing, interactive data analysis, and cross-platform operation, facilitating biological research in a more efficient and user-friendly way.
NASA Astrophysics Data System (ADS)
Arkadov, G. V.; Zhukavin, A. P.; Kroshilin, A. E.; Parshikov, I. A.; Solov'ev, S. L.; Shishov, A. V.
2014-10-01
The article describes the "Virtual Digital VVER-Based Nuclear Power Plant" computerized system, comprising a totality of verified initial data (sets of input data for a model intended to describe the behavior of nuclear power plant (NPP) systems in design and emergency modes of operation) and a unified system of new-generation computation codes intended for coordinated computation of the variety of physical processes in the reactor core and NPP equipment. Experiments with the demonstration version of the "Virtual Digital VVER-Based NPP" computerized system have shown that it is possible in principle to set up a unified system of computation codes in a common software environment for carrying out interconnected calculations of various physical phenomena at NPPs constructed according to the standard AES-2006 project. With the full-scale version of the "Virtual Digital VVER-Based NPP" computerized system put into operation, the concerned engineering, design, construction, and operating organizations will have access to all necessary information relating to the NPP power unit project throughout its entire lifecycle. The domestically developed commercial-grade software product, set to operate as an independent application alongside the project, will bring about additional competitive advantages in the modern market of nuclear power technologies.
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. Availability: http://www.maizegenetics.net/GAPIT. Contact: zhiwu.zhang@cornell.edu. Supplementary data are available at Bioinformatics online.
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool, with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies, which vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that couples HPC applications with a web-based visualization tool through a middleware framework that supports data communication between the different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in live mode. Test results validate the effectiveness and usability of the integrated software package.
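The middleware pattern described, applications exchanging data through a common framework, can be sketched in miniature as a publish/subscribe bus; the topic name and payload below are hypothetical, and the real package distributes messages across processes and machines rather than in-process.

```python
class MessageBus:
    """Minimal in-process stand-in for middleware linking HPC apps to a visualization client."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every message on a topic."""
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        """Deliver a payload to every subscriber of the topic."""
        for cb in self.subscribers.get(topic, []):
            cb(payload)

bus = MessageBus()
received = []
# A "web visualization" client listens for live state estimates.
bus.subscribe("grid/state", received.append)
# An "HPC application" publishes a new solution snapshot (hypothetical quantity).
bus.publish("grid/state", {"bus_14_voltage_pu": 1.02})
print(received)
```

The decoupling shown here (publishers never reference subscribers directly) is what lets new applications be attached to such a framework without modifying the existing ones.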
Eruptive event generator based on the Gibson-Low magnetic configuration
NASA Astrophysics Data System (ADS)
Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.
2017-08-01
Coronal mass ejections (CMEs), a type of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become increasingly accessible, the demand grows for user-friendly tools that facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on the Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive, they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet a significant technical barrier remains to leveraging these resources. This barrier inhibits many decision makers, and even trained engineers, from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch jobs to those computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage HTCondor, open-source software for managing computing resources and jobs, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
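The batch-scheduling idea (queue jobs, dispatch them in order, track their status) can be sketched as follows; the class names, job fields, and hydrologic file names are illustrative only and do not reproduce the actual CondorPy API, which delegates execution to HTCondor.

```python
import queue

class ComputeJob:
    """Illustrative job record (hypothetical fields, not the CondorPy interface)."""
    def __init__(self, name, command, inputs):
        self.name, self.command, self.inputs = name, command, inputs
        self.status = "queued"

class Scheduler:
    """Toy batch scheduler: accept jobs into a queue, dispatch them in FIFO order."""
    def __init__(self):
        self.pending = queue.Queue()

    def submit(self, job):
        self.pending.put(job)

    def dispatch_all(self):
        finished = []
        while not self.pending.empty():
            job = self.pending.get()
            job.status = "completed"  # a real scheduler would stage inputs and run job.command remotely
            finished.append(job.name)
        return finished

sched = Scheduler()
for i in range(3):
    sched.submit(ComputeJob(f"hydro-run-{i}", "run_model.sh", [f"basin_{i}.nc"]))
finished = sched.dispatch_all()
print(finished)
```

In the real tools, the dispatch step also handles input/output staging (item 3 above) and status monitoring (item 4), which this sketch reduces to a single field.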
The Exponential Expansion of Simulation: How Simulation has Grown as a Research Tool
2012-09-01
exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently
What Makes a Computer Lab: Use These Ideas to Design Your Own.
ERIC Educational Resources Information Center
Day, C. William; Herlihy, John J.
1989-01-01
Although educators have succeeded in informing the public about the transformative power of electronic and telecommunications tools, they have frequently overlooked physical features of spaces needed to house them. This article discusses workspace, arrangements, flooring, environment, electrical wiring, lighting, acoustics, and security features…
USDA-ARS?s Scientific Manuscript database
Hydrologic models are essential tools for environmental assessment of agricultural non-point source pollution. The automatic calibration of hydrologic models, though efficient, demands significant computational power, which can limit its application. The study objective was to investigate a cost e...
Institute for Sustained Performance, Energy, and Resilience (SuPER)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jagode, Heike; Bosilca, George; Danalis, Anthony
The University of Tennessee (UTK) and University of Texas at El Paso (UTEP) partnership supported the three main thrusts of the SUPER project---performance, energy, and resilience. The UTK-UTEP effort thus helped advance the main goal of SUPER, which was to ensure that DOE's computational scientists can successfully exploit the emerging generation of high performance computing (HPC) systems. This goal is being met by providing application scientists with strategies and tools to productively maximize performance, conserve energy, and attain resilience. The primary vehicle through which UTK provided performance measurement support to SUPER and the larger HPC community is the Performance Application Programming Interface (PAPI). PAPI is an ongoing project that provides a consistent interface and methodology for collecting hardware performance information from various hardware and software components, including most major CPUs, GPUs and accelerators, interconnects, I/O systems, and power interfaces, as well as virtual cloud environments. The PAPI software is widely used for performance modeling of scientific and engineering applications---for example, the HOMME (High Order Methods Modeling Environment) climate code, and the GAMESS and NWChem computational chemistry codes---on DOE supercomputers. PAPI is widely deployed as middleware for use by higher-level profiling, tracing, and sampling tools (e.g., CrayPat, HPCToolkit, Scalasca, Score-P, TAU, Vampir, PerfExpert), making it the de facto standard for hardware counter analysis. PAPI has established itself as fundamental software infrastructure in every application domain (spanning academia, government, and industry), where improving performance can be mission critical.
Ultimately, as more application scientists migrate their applications to HPC platforms, they will benefit from the extended capabilities this grant brought to PAPI to analyze and optimize performance in these environments, whether they use PAPI directly or via third-party performance tools. Capabilities added to PAPI through this grant include support for new architectures such as the latest GPU and Xeon Phi accelerators, and advanced power measurement and management features. Another important topic for the UTK team was providing support for a rich ecosystem of different fault management strategies in the context of parallel computing. Our long-term efforts have been oriented toward proposing flexible strategies and providing building blocks that application developers can use to build the most efficient fault management technique for their application. These efforts span the entire software spectrum, from theoretical models of existing strategies to easily assess their performance, to algorithmic modifications that take advantage of specific mathematical properties for data redundancy, to extensions of widely used programming paradigms that empower application developers to deal with all types of faults. We have also continued our tight collaborations with users to help them adopt these technologies and to ensure their applications always deliver meaningful scientific data. Large supercomputer systems are becoming more and more power- and energy-constrained, and future systems and applications running on them will need to be optimized to run under power caps and/or minimize energy consumption. The UTEP team contributed to the SUPER energy thrust by developing power modeling methodologies and investigating power management strategies. Scalability modeling results showed that some applications can scale better with respect to an increasing power budget than with respect to only the number of processors.
Power management, in particular shifting power to processors on the critical path of an application execution, can reduce perturbation due to system noise and other sources of runtime variability, which are growing problems on large-scale power-constrained computer systems.
Towards a C2 Poly-Visualization Tool: Leveraging the Power of Social-Network Analysis and GIS
2011-06-01
from Magsino.14 AutoMap, a product of CASOS at Carnegie Mellon University, is a text-mining tool that enables the extraction of network data from... enables community leaders to prepare for biological attacks using computational models. BioWar is a CASOS package that combines many factors into a... models, demographically accurate agent models, wind dispersion models, and an error-diagnostic model. Construct, also developed by CASOS, is a
CLAST: CUDA implemented large-scale alignment search tool.
Yano, Masahiro; Mori, Hiroshi; Akiyama, Yutaka; Yamada, Takuji; Kurokawa, Ken
2014-12-11
Metagenomics is a powerful methodology to study microbial communities, but it is highly dependent on nucleotide sequence similarity searching against sequence databases. Metagenomic analyses with next-generation sequencing technologies produce enormous numbers of reads from microbial communities, and many reads are derived from microbes whose genomes have not yet been sequenced, limiting the usefulness of existing sequence similarity search tools. Therefore, there is a clear need for a sequence similarity search tool that can rapidly detect weak similarity in large datasets. We developed CLAST (CUDA implemented large-scale alignment search tool), which enables analyses of millions of reads and thousands of reference genome sequences, and runs on NVIDIA Fermi architecture graphics processing units. CLAST has four main advantages over existing alignment tools. First, CLAST was capable of identifying sequence similarities ~80.8 times faster than BLAST and 9.6 times faster than BLAT. Second, CLAST executes global alignment as the default (local alignment is also an option), enabling CLAST to assign reads to taxonomic and functional groups based on evolutionarily distant nucleotide sequences with high accuracy. Third, CLAST does not need a preprocessed sequence database like Burrows-Wheeler Transform-based tools, and this enables CLAST to incorporate large, frequently updated sequence databases. Fourth, CLAST requires <2 GB of main memory, making it possible to run CLAST on a standard desktop computer or server node. CLAST achieved very high speed (similar to the Burrows-Wheeler Transform-based Bowtie 2 for long reads) and sensitivity (equal to BLAST, BLAT, and FR-HIT) without the need for extensive database preprocessing or a specialized computing platform.
Our results demonstrate that CLAST has the potential to be one of the most powerful and realistic approaches to analyze the massive amount of sequence data from next-generation sequencing technologies.
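Global alignment, CLAST's default mode, scores an end-to-end correspondence between two sequences rather than the best local match. A minimal Needleman-Wunsch scorer illustrates the idea; the scoring parameters are arbitrary, and this plain dynamic program is not CLAST's GPU implementation.

```python
def global_align_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    # End-to-end alignment: leading gaps are penalized, unlike local alignment.
    for i in range(1, rows):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, cols):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]  # score of aligning all of a against all of b

print(global_align_score("GATTACA", "GATCA"))
```

Tools like CLAST accelerate exactly this kind of recurrence (plus traceback and database indexing) across millions of reads on GPU hardware.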
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about identification of the scenarios with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to analysis and verification of the ISS power system are provided.
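The core DoE idea, estimating which parameters matter from a structured set of runs rather than exhaustive simulation, can be sketched with a two-level full factorial design; the converter "response" below is a made-up linear stand-in for a real stability simulation.

```python
from itertools import product

def full_factorial(levels, k):
    """All combinations of k factors, each at the given (coded) levels."""
    return list(product(levels, repeat=k))

def main_effect(runs, responses, factor):
    """Average change in response when one factor moves from its low to its high level."""
    high = [y for x, y in zip(runs, responses) if x[factor] == +1]
    low = [y for x, y in zip(runs, responses) if x[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

# Illustrative screening: a stability margin driven by 3 coded parameters.
runs = full_factorial((-1, +1), 3)          # 2^3 = 8 simulation runs
# Hypothetical responses: factor 0 dominates, factor 1 is weak, factor 2 is inert.
responses = [5 + 3 * x0 + 1 * x1 + 0 * x2 for x0, x1, x2 in runs]
print(main_effect(runs, responses, 0), main_effect(runs, responses, 2))
```

With hundreds of converters, fractional factorial designs shrink the run count further while still resolving the main effects, which is the reduction the paper relies on.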
Epstein, Joshua M.; Pankajakshan, Ramesh; Hammond, Ross A.
2011-01-01
We introduce a novel hybrid of two fields—Computational Fluid Dynamics (CFD) and Agent-Based Modeling (ABM)—as a powerful new technique for urban evacuation planning. CFD is a predominant technique for modeling airborne transport of contaminants, while ABM is a powerful approach for modeling social dynamics in populations of adaptive individuals. The hybrid CFD-ABM method is capable of simulating how large, spatially-distributed populations might respond to a physically realistic contaminant plume. We demonstrate the overall feasibility of CFD-ABM evacuation design, using the case of a hypothetical aerosol release in Los Angeles to explore potential effectiveness of various policy regimes. We conclude by arguing that this new approach can be powerfully applied to arbitrary population centers, offering an unprecedented preparedness and catastrophic event response tool. PMID:21687788
Surface Traps in Colloidal Quantum Dots: A Combined Experimental and Theoretical Perspective.
Giansante, Carlo; Infante, Ivan
2017-10-19
Surface traps are ubiquitous in nanoscopic semiconductor materials. Understanding their atomistic origin and manipulating them chemically are of capital importance for designing defect-free colloidal quantum dots and making a leap forward in the development of efficient optoelectronic devices. Recent advances in computing power have established computational chemistry as a powerful tool for accurately describing complex chemical species, and it has now become conceivable to model colloidal quantum dots with realistic sizes and shapes. In this Perspective, we combine the knowledge gathered from recent experimental findings with computation of quantum dot electronic structures. We analyze three different systems, namely CdSe, PbS, and CsPbI3, as benchmark semiconductor nanocrystals, showing how different types of trap states can form at their surface. In addition, we suggest experimental healing of such traps according to their chemical origin and nanocrystal composition.
Advanced techniques in reliability model representation and solution
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Nicol, David M.
1992-01-01
The current trend in flight control system design is toward increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused on the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that takes as input a graphical object-oriented block diagram of the system and uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
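A drastically reduced sketch of the Markov-modeling idea behind these tools: a single repairable unit with failure rate lam and repair rate mu, whose failed-state probability is integrated numerically. The real tools handle semi-Markov models with many interdependent states; the rates below are invented.

```python
import math

def unreliability(lam, mu, t, steps=100_000):
    """Probability that a repairable unit is failed at time t, from the two-state
    Markov model dP_fail/dt = lam*(1 - P_fail) - mu*P_fail, via forward Euler."""
    dt = t / steps
    p_fail = 0.0
    for _ in range(steps):
        p_fail += (lam * (1.0 - p_fail) - mu * p_fail) * dt
    return p_fail

# With no repair (mu = 0) the result should approach the closed form 1 - exp(-lam*t).
print(abs(unreliability(1e-3, 0.0, 1000.0) - (1.0 - math.exp(-1.0))) < 1e-3)
```

Tools such as SURE/ASSURE replace this scalar ODE with large (semi-)Markov state spaces generated automatically from the system's block diagram, which is why parallel solution becomes necessary.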
Google-Earth Based Visualizations for Environmental Flows and Pollutant Dispersion in Urban Areas
Liu, Daoming; Kenjeres, Sasa
2017-01-01
In the present study, we address the development and application of an efficient tool for conversion of results obtained by an integrated computational fluid dynamics (CFD) and computational reaction dynamics (CRD) approach, and their visualization in Google Earth. We focus on results typical of environmental fluid mechanics studies at a city scale, which include characteristic wind flow patterns and dispersion of reactive scalars. This is achieved by developing a code based on the Java language, which converts the typical four-dimensional structure (spatial and temporal dependency) of the data results into the Keyhole Markup Language (KML) format. The visualization techniques most often used are revisited and implemented into the conversion tool. The potential of the tool is demonstrated in a case study of smog formation due to intense traffic emission in Rotterdam (The Netherlands). It is shown that Google Earth can provide a computationally efficient and user-friendly means of data representation. This feature can be very useful for visualization of pollution at street level, which is of great importance to city residents. Various meteorological and traffic emission scenarios can be easily visualized and analyzed, providing a powerful, user-friendly tool for traffic regulation and urban climate adaptation. PMID:28257078
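The conversion step, wrapping model output in KML so Google Earth can render it, reduces to emitting Placemark elements inside a KML document; the coordinates and concentration value below are invented (the paper's tool is written in Java and handles full time-dependent 3-D fields).

```python
def to_kml_placemark(name, lon, lat, value):
    """Wrap one scalar sample (e.g. a pollutant concentration) as a KML Placemark."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>concentration: {value} ug/m3</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"  # KML order is lon,lat,alt
        "</Placemark>"
    )

def to_kml_document(placemarks):
    """Assemble placemarks into a complete KML document string."""
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            + "".join(placemarks) + "</Document></kml>")

doc = to_kml_document([to_kml_placemark("Rotterdam-sample-01", 4.47, 51.92, 38.5)])
print(doc[:60])
```

Saving `doc` to a `.kml` file and opening it in Google Earth would place the sample as a clickable point; gridded fields and animation use further KML elements (overlays, TimeSpan) in the same spirit.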
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawson, M.; Yu, Y. H.; Nelessen, A.
2014-05-01
Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins Equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the use of high performance computing resources necessitated by high-fidelity methods, such as Navier-Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the device is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to herein as instantaneous buoyancy and Froude-Krylov forces) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into WEC-Sim. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
Analysis of outcomes in radiation oncology: An integrated computational platform
Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.
2009-01-01
Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan, including dose, contour, and image data, to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
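One standard radiobiological model a package like item (4) would compute is the generalized equivalent uniform dose (gEUD); a minimal version follows, with illustrative voxel doses and tissue parameter (the abstract does not specify which models the platform implements).

```python
def gEUD(doses, a):
    """Generalized equivalent uniform dose (Gy) over a list of voxel doses.
    a is the tissue-specific parameter: a = 1 gives the mean dose, while
    large negative a emphasizes cold spots, as appropriate for target volumes."""
    return (sum(d**a for d in doses) / len(doses)) ** (1.0 / a)

# Sanity checks: a uniform distribution has gEUD equal to that dose for any a,
# and a = 1 reduces to the arithmetic mean dose.
print(gEUD([60.0, 60.0, 60.0], a=-10), gEUD([60.0, 50.0], a=1))
```

In practice the voxel doses come from the dose grid restricted to a structure's contour, which is exactly the data items (1)-(3) of the platform make available.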
Perspectives on the Future of CFD
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2000-01-01
This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time, CFD has progressed along with computing power. Numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are now routinely computed, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources have shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.
High-Performance Data Analysis Tools for Sun-Earth Connection Missions
NASA Technical Reports Server (NTRS)
Messmer, Peter
2011-01-01
The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms require access to greater computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms often have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters and graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists to solve demanding data analysis problems in IDL that previously required specialized software, and it allows those problems to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and for a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution times for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters.
The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
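The task-farm pattern TaskDL manages, many independent tasks with little inter-task communication, maps naturally onto a worker pool. This Python sketch stands in for the IDL tool, with threads in place of cluster nodes and a trivial per-task computation as a placeholder for real image analysis.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_frame(frame_id):
    """Stand-in for an independent per-item analysis step (e.g. processing one image)."""
    return frame_id, frame_id ** 2  # placeholder result keyed by task id

# Task farm: tasks are independent, so they can be dispatched to any free worker
# in any order; results are gathered once at the end.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(analyze_frame, range(8)))
print(results[7])
```

Because no task depends on another's output, this pattern scales to as many workers as are available, which is what makes it a good fit for opportunistically sized clusters.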
The OSG open facility: A sharing ecosystem
Jayatilaka, B.; Levshina, T.; Rynge, M.; ...
2015-12-23
The Open Science Grid (OSG) ties together individual experiments' computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and has increased delivery of Distributed High Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites, primarily by harvesting and organizing the temporarily unused capacity (i.e., opportunistic cycles) from the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to make sure that this computing power is used with maximal efficiency. Furthermore, we believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues on expanding this service.
Laser Powered Launch Vehicle Performance Analyses
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Wang, Ten-See (Technical Monitor)
2001-01-01
The purpose of this study is to establish the technical foundation for modeling the physics of laser-powered pulse detonation phenomena. Laser-powered propulsion systems involve complex fluid dynamics, thermodynamics, and radiative transfer processes. Successful prediction of the performance of laser-powered launch vehicle concepts depends on sophisticated models that reflect the underlying flow physics, including laser ray tracing and focusing, inverse Bremsstrahlung (IB) effects, finite-rate air chemistry, thermal non-equilibrium, plasma radiation, and detonation wave propagation. The proposed work will extend the baseline numerical model into an efficient design analysis tool. The proposed model is suitable for 3-D analysis using parallel computing methods.
A cyber infrastructure for the SKA Telescope Manager
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. P.; Carvalho, Bruno; Maia, Dalmiro; Gupta, Yashwant; Natarajan, Swaminathan; Le Roux, Gerhard; Swart, Paul
2016-07-01
The Square Kilometre Array Telescope Manager (SKA TM) will be responsible for assisting the SKA Operations and Observation Management, carrying out system diagnosis and collecting Monitoring and Control data from the SKA subsystems and components. To provide adequate compute resources, scalability, operation continuity and high availability, as well as strict Quality of Service, the TM cyber-infrastructure (embodied in the Local Infrastructure - LINFRA) consists of COTS hardware and infrastructural software (for example: server monitoring software, host operating system, virtualization software, device firmware), providing a specially tailored Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) solution. The TM infrastructure provides services in the form of computational power, software defined networking, power, storage abstractions, and high level, state of the art IaaS and PaaS management interfaces. This cyber platform will be tailored to each of the two SKA Phase 1 telescope instances (SKA_MID in South Africa and SKA_LOW in Australia), each presenting different computational and storage infrastructures and conditioned by location. This cyber platform will provide a compute model enabling TM to manage the deployment and execution of its multiple components (observation scheduler, proposal submission tools, M&C components, forensic tools, several databases, etc.). In this sense, the TM LINFRA is primarily focused on the provision of isolated instances, mostly resorting to virtualization technologies, while defaulting to bare hardware if specifically required due to performance, security, availability, or other requirements.
Navigation within the heart and vessels in clinical practice.
Beyar, Rafael
2010-02-01
The field of interventional cardiology has developed at an unprecedented pace on account of the visual and imaging power provided by constantly improving biomedical technologies. Transcatheter-based technology is now routinely used for coronary revascularization and noncoronary interventions using balloon angioplasty, stents, and many other devices. In the early days of interventional practice, the operating physician had to manually navigate catheters and devices under fluoroscopic imaging and was exposed to radiation, with its concomitant necessity for wearing heavy lead aprons for protection. Until recently, very little has changed in the way procedures have been carried out in the catheterization laboratory. The technological capacity to remotely manipulate devices, using robotic arms and computational tools, has been developed for surgery and other medical procedures. This has brought to practice the powerful combination of the abilities afforded by imaging, navigational tools, and remote control manipulation. This review covers recent developments in navigational tools for catheter positioning, electromagnetic mapping, magnetic resonance imaging (MRI)-based cardiac electrophysiological interventions, and navigation tools through coronary arteries.
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems, including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.
Google Scholar and the Continuing Education Literature
ERIC Educational Resources Information Center
Howland, Jared L.; Howell, Scott; Wright, Thomas C.; Dickson, Cody
2009-01-01
The recent introduction of Google Scholar has renewed hope that someday a powerful research tool will bring continuing education literature more quickly, freely, and completely to one's computer. The authors suggest that using Google Scholar with other traditional search methods will narrow the research gap between what is discoverable and…
Challenging the Context: Perception, Polity, and Power.
ERIC Educational Resources Information Center
Hartfield, Ronne
1994-01-01
"Contextual areas" employ models, replicas, artwork, art materials, tools, interpretive panels, and interactive computer installations to help visitors explore the historical and cultural context of 6 of 12 works of art at the "Art Inside Out" exhibition in the Kraft General Foods Education Center of the Art Institute of Chicago. (MDH)
How Digital Scaffolds in Games Direct Problem-Solving Behaviors
ERIC Educational Resources Information Center
Sun, Chuen-Tsai; Wang, Dai-Yi; Chan, Hui-Ling
2011-01-01
Digital systems offer computational power and instant feedback. Game designers are using these features to create scaffolding tools to reduce player frustration. However, researchers are finding some unexpected effects of scaffolding on strategy development and problem-solving behaviors. We used a digital Sudoku game named "Professor Sudoku" to…
The Electronic Biology Classroom: Implementation and Student Opinion.
ERIC Educational Resources Information Center
Davis, Mark S.
This paper describes a method for teaching introductory biology using a multimedia approach. This methodology aimed to increase student participation, promote independent learning, and enhance computer literacy. Five multimedia tools were used to teach the course. PowerPoint slide shows were used to present lecture material; videodiscs displayed…
Homology Modeling and Molecular Docking for the Science Curriculum
ERIC Educational Resources Information Center
McDougal, Owen M.; Cornia, Nic; Sambasivarao, S. V.; Remm, Andrew; Mallory, Chris; Oxford, Julia Thom; Maupin, C. Mark; Andersen, Tim
2014-01-01
DockoMatic 2.0 is a powerful open source software program (downloadable from sourceforge.net) that allows users to utilize a readily accessible computational tool to explore biomolecules and their interactions. This manuscript describes a practical tutorial for use in the undergraduate curriculum that introduces students to macromolecular…
ERIC Educational Resources Information Center
Thomas, David A.; Li, Qing
2008-01-01
The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…
Computer Simulation and New Ways of Creating Matched-Guise Techniques
ERIC Educational Resources Information Center
Connor, Robert T.
2008-01-01
Matched-guise experiments have passed their 40th year as a powerful attitudinal research tool, and they are becoming more relevant and useful as technology is applied to language research. Combining the specificity of conversation analysis with the generalizability of social psychology research, technological innovations allow the measurement of…
Not Just for Computation: Basic Calculators Can Advance the Process Standards
ERIC Educational Resources Information Center
Moss, Laura J.; Grover, Barbara W.
2007-01-01
Simple nongraphing calculators can be powerful tools to enhance students' conceptual understanding of mathematics concepts. Students have opportunities to develop (1) a broad repertoire of problem-solving strategies by observing multiple solution strategies; (2) respect for other students' abilities and ways of thinking about mathematics; (3) the…
ERIC Educational Resources Information Center
López, Víctor; Pintó, Roser
2017-01-01
Computer simulations are often considered effective educational tools, since their visual and communicative power enable students to better understand physical systems and phenomena. However, previous studies have found that when students read visual representations some reading difficulties can arise, especially when these are complex or dynamic…
Teachers' Implementation of a Game-Based Biotechnology Curriculum
ERIC Educational Resources Information Center
Eastwood, Jennifer L.; Sadler, Troy D.
2013-01-01
Research in education suggests that computer games can serve as powerful learning environments, however, teachers perceive many obstacles to using games as teaching tools. In this study, we examine three science teachers' implementation and perceptions of a curriculum unit incorporating the game, Mission Biotech (MBt) and a set of supporting…
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
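The cost argument above can be made concrete with a toy linear example (this is an illustrative sketch, not NASA Langley's CFD machinery): for a state equation A u = b(p) and an output J = cᵀu, one adjoint solve Aᵀλ = c yields the sensitivity of J with respect to every parameter at once, whereas finite differences would need one extra solve per parameter.

```python
import numpy as np

# Hypothetical toy problem: state u solves A u = b(p) with b(p) = B @ p,
# and the output of interest is J(u) = c^T u. The adjoint method recovers
# dJ/dp for ALL parameters with a single extra linear solve.
rng = np.random.default_rng(0)
n, n_params = 50, 200
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = rng.standard_normal((n, n_params))   # db/dp_k = B[:, k]
c = rng.standard_normal(n)
p = rng.standard_normal(n_params)

# Forward solve and output
u = np.linalg.solve(A, B @ p)
J = c @ u

# Adjoint solve: A^T lam = c, then dJ/dp_k = lam^T (db/dp_k)
lam = np.linalg.solve(A.T, c)
grad_adjoint = B.T @ lam                 # all 200 sensitivities, one solve

# Spot-check one component against a finite difference
k, eps = 3, 1e-6
p2 = p.copy()
p2[k] += eps
fd = (c @ np.linalg.solve(A, B @ p2) - J) / eps
```

With 200 parameters, finite differencing would cost 200 additional solves; the adjoint costs one, which is the scaling advantage the abstract describes.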
The Computer as a Tool for Learning
Starkweather, John A.
1986-01-01
Experimenters from the beginning recognized the advantages computers might offer in medical education. Several medical schools have gained experience in such programs in automated instruction. Television images and graphic display combined with computer control and user interaction are effective for teaching problem solving. The National Board of Medical Examiners has developed patient-case simulation for examining clinical skills, and the National Library of Medicine has experimented with combining media. Advances from the field of artificial intelligence and the availability of increasingly powerful microcomputers at lower cost will aid further development. Computers will likely affect existing educational methods, adding new capabilities to laboratory exercises, to self-assessment and to continuing education. PMID:3544511
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-09-01
ADEPT Project: Georgia Tech is creating compact, low-profile power adapters and power bricks using materials and tools adapted from other industries and from grid-scale power applications. Adapters and bricks convert electrical energy into useable power for many types of electronic devices, including laptop computers and mobile phones. These converters are often called wall warts because they are big, bulky, and sometimes cover up an adjacent wall socket that could be used to power another electronic device. The magnetic components traditionally used to make adapters and bricks have reached their limits; they can't be made any smaller without sacrificing performance. Georgia Tech is taking a cue from grid-scale power converters that use iron alloys as magnetic cores. These low-cost alloys can handle more power than other materials, but the iron must be stacked in insulated plates to maximize energy efficiency. In order to create compact, low-profile power adapters and bricks, these stacked iron plates must be extremely thin, only hundreds of nanometers in thickness, in fact. To make plates this thin, Georgia Tech is using manufacturing tools used in microelectromechanics and other small-scale industries.
Unified Performance and Power Modeling of Scientific Workloads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.
2013-11-17
It is expected that scientific applications executing on future large-scale HPC must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that will allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.
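A unified model of this kind can be sketched in miniature (the functional forms and coefficients below are assumptions for illustration, not the paper's calibrated Nek-Bone model): runtime for one bulk-synchronous step is decomposed into computation, point-to-point exchange, and a collective, and energy follows from runtime and node power.

```python
import math

# Illustrative analytical model: one timestep on p nodes, each doing
# n_local floating-point operations, one halo exchange, and one allreduce.
def step_time(n_local, p, flops_rate, bw, latency, msg_bytes):
    t_comp = n_local / flops_rate                  # local computation
    t_p2p = latency + msg_bytes / bw               # point-to-point exchange
    t_coll = latency * math.ceil(math.log2(p))     # tree-based collective
    return t_comp + t_p2p + t_coll

def step_energy(t, p, power_per_node_w):
    return t * p * power_per_node_w                # joules across all nodes

# Assumed figures: 1 GFLOP of local work, 64 nodes at 100 GFLOP/s,
# 10 GB/s links, 2 us latency, 0.8 MB messages, 300 W per node
t = step_time(1e9, 64, 1e11, 1e10, 2e-6, 8e5)
e = step_energy(t, 64, 300.0)
```

Coupling the two expressions is what lets a designer trade, say, fewer faster nodes against more slower ones under a fixed power budget.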
SAVANT: Solar Array Verification and Analysis Tool Demonstrated
NASA Technical Reports Server (NTRS)
Chock, Ricaurte
2000-01-01
The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
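The beginning-of-life versus end-of-life trade the abstract mentions reduces to simple compounding arithmetic (this sketch uses an assumed constant yearly degradation fraction; it is not SAVANT's physics-based model):

```python
# Hypothetical sizing arithmetic: given an end-of-life (EOL) power
# requirement and an assumed fractional power loss per year from
# radiation/contamination, find the beginning-of-life (BOL) power
# the array must deliver.
def required_bol_power(eol_requirement_w, annual_degradation, mission_years):
    """P_BOL = P_EOL / (1 - d)**t for a constant yearly degradation fraction d."""
    return eol_requirement_w / (1.0 - annual_degradation) ** mission_years

# Example: 5 kW needed after 10 years at an assumed 2.5 %/yr degradation
bol = required_bol_power(5000.0, 0.025, 10)
margin = bol / 5000.0 - 1.0   # overdesign fraction relative to the EOL requirement
```

A tool that predicts the degradation rate accurately, rather than forcing a conservative guess, directly shrinks that overdesign margin.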
Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.
Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E
2018-04-15
Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation: https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. Contact: darrell.hurt@nih.gov.
NASA Technical Reports Server (NTRS)
Kovarik, Madeline
1993-01-01
Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet, this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact, teachers and educators. Simply throwing technology at an educator and expecting them to use it as an effective tool is not the answer. This paper provides a background into the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.
Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging
Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.
2015-01-01
Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of CAD, and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing the advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality tool that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288
Automatic Estimation of the Radiological Inventory for the Dismantling of Nuclear Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Bermejo, R.; Felipe, A.; Gutierrez, S.
The estimation of the radiological inventory of nuclear facilities to be dismantled is a process that includes information related to the physical inventory of the whole plant and the radiological survey. The radiological inventory for all the components and civil structures of the plant can be estimated with mathematical models using a statistical approach. A computer application has been developed in order to obtain the radiological inventory in an automatic way. Results: A computer application has been developed that is able to estimate the radiological inventory from the radiological measurements or the characterization program. This computer application includes the statistical functions needed for the estimation of central tendency and variability, e.g., mean, median, variance, confidence intervals, variation coefficients, etc. This computer application is a necessary tool for estimating the radiological inventory of a nuclear facility, and it is a powerful tool for decision making in future sampling surveys.
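The statistical summary the abstract lists can be sketched in a few lines (a minimal illustration with hypothetical readings, not the cited application's code): central tendency, variability, and a t-based confidence interval for a set of survey measurements.

```python
import math
import statistics

def summarize(measurements, t_crit=2.262):
    """Summary statistics; t_crit defaults to the two-sided 95 % value for 9 d.o.f."""
    n = len(measurements)
    mean = statistics.fmean(measurements)
    median = statistics.median(measurements)
    sd = statistics.stdev(measurements)      # sample standard deviation
    half_width = t_crit * sd / math.sqrt(n)  # half-width of the 95 % CI on the mean
    return {"mean": mean, "median": median, "cv": sd / mean,
            "ci95": (mean - half_width, mean + half_width)}

# Ten hypothetical surface-activity readings (Bq/cm^2)
stats = summarize([1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.4, 1.0, 1.2, 1.1])
```

The coefficient of variation and interval width are exactly the quantities a dismantling planner would use to decide whether further sampling is worthwhile.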
AA9int: SNP Interaction Pattern Search Using Non-Hierarchical Additive Model Set.
Lin, Hui-Yi; Huang, Po-Yu; Chen, Dung-Tsa; Tung, Heng-Yuan; Sellers, Thomas A; Pow-Sang, Julio; Eeles, Rosalind; Easton, Doug; Kote-Jarai, Zsofia; Amin Al Olama, Ali; Benlloch, Sara; Muir, Kenneth; Giles, Graham G; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher A; Schleutker, Johanna; Nordestgaard, Børge G; Travis, Ruth C; Hamdy, Freddie; Neal, David E; Pashayan, Nora; Khaw, Kay-Tee; Stanford, Janet L; Blot, William J; Thibodeau, Stephen N; Maier, Christiane; Kibel, Adam S; Cybulski, Cezary; Cannon-Albright, Lisa; Brenner, Hermann; Kaneva, Radka; Batra, Jyotsna; Teixeira, Manuel R; Pandha, Hardev; Lu, Yong-Jie; Park, Jong Y
2018-06-07
The use of single nucleotide polymorphism (SNP) interactions to predict complex diseases has received more attention during the past decade, but related statistical methods are still immature. We previously proposed the SNP Interaction Pattern Identifier (SIPI) approach to evaluate 45 SNP interaction patterns. SIPI is statistically powerful but suffers from a large computation burden. For large-scale studies, it is necessary to use a powerful and computation-efficient method. The objective of this study is to develop an evidence-based mini-version of SIPI as a screening tool or for solitary use, and to evaluate the impact of inheritance mode and model structure on detecting SNP-SNP interactions. We tested two candidate approaches: the 'Five-Full' and 'AA9int' methods. The Five-Full approach is composed of the five full interaction models considering three inheritance modes (additive, dominant and recessive). The AA9int approach is composed of nine interaction models by considering non-hierarchical model structure and the additive mode. Our simulation results show that AA9int has similar statistical power compared to SIPI and is superior to the Five-Full approach, and that the impact of the non-hierarchical model structure is greater than that of the inheritance mode in detecting SNP-SNP interactions. In summary, AA9int is recommended as a powerful tool to be used either alone or as the screening stage of a two-stage approach (AA9int+SIPI) for detecting SNP-SNP interactions in large-scale studies. The 'AA9int' and 'parAA9int' functions (standard and parallel computing versions) are added in the SIPI R package, which is freely available at https://linhuiyi.github.io/LinHY_Software/. hlin1@lsuhsc.edu. Supplementary data are available at Bioinformatics online.
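The three inheritance modes mentioned above are standard SNP genotype recodings, which can be sketched directly (this illustrates the codings generically; it is not code from the SIPI/AA9int R package, which models interactions in a regression framework):

```python
# A genotype g is the count of minor alleles (0, 1, or 2);
# each inheritance mode maps it to a model covariate.
def code_genotype(g, mode):
    if mode == "additive":    # allele count enters the model linearly
        return g
    if mode == "dominant":    # any copy of the minor allele counts
        return 1 if g >= 1 else 0
    if mode == "recessive":   # only homozygous minor counts
        return 1 if g == 2 else 0
    raise ValueError(mode)

# An interaction covariate for a SNP pair is the product of the two codings,
# e.g. dominant x recessive:
pair = code_genotype(1, "dominant") * code_genotype(2, "recessive")
```

Enumerating such coding combinations, with and without the corresponding main-effect terms, is what produces the families of interaction models the abstract compares.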
Counter Action Procedure Generation in an Emergency Situation of Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Gofuku, A.
2018-02-01
Lessons learned from the Fukushima Daiichi accident revealed various weak points in the design and operation of nuclear power plants at the time, although there were many resilient activities made by the plant staff in a difficult work environment. In order to reinforce the measures to make nuclear power plants more resilient, improvement of hardware and improvement of education and training of nuclear personnel are considered. In addition, considering the advancement of computer technology and artificial intelligence, developing software tools to support the activities of plant staff is a promising way forward. This paper focuses on software tools to support the operations of human operators and introduces the concept of an intelligent operator support system called a co-operator. This paper also describes a counter-operation generation technique that the authors are studying as a core component of the co-operator.
A case study for cloud based high throughput analysis of NGS data using the globus genomics system
Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...
2015-01-01
Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the "Globus Genomics" system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.
Compressive sensing scalp EEG signals: implementations and practical performance.
Abdulghani, Amir M; Casson, Alexander J; Rodriguez-Villegas, Esther
2012-11-01
Highly miniaturised, wearable computing and communication systems allow unobtrusive, convenient and long term monitoring of a range of physiological parameters. For long term operation from the physically smallest batteries, the average power consumption of a wearable device must be very low. It is well known that the overall power consumption of these devices can be reduced by the inclusion of low power consumption, real-time compression of the raw physiological data in the wearable device itself. Compressive sensing is a new paradigm for providing data compression: it has shown significant promise in fields such as MRI; and is potentially suitable for use in wearable computing systems as the compression process required in the wearable device has a low computational complexity. However, the practical performance very much depends on the characteristics of the signal being sensed. As such the utility of the technique cannot be extrapolated from one application to another. Long term electroencephalography (EEG) is a fundamental tool for the investigation of neurological disorders and is increasingly used in many non-medical applications, such as brain-computer interfaces. This article investigates in detail the practical performance of different implementations of the compressive sensing theory when applied to scalp EEG signals.
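The low-complexity sensing step the abstract highlights can be illustrated with a toy example (assumptions: an exactly sparse signal and an identity sparsifying basis; real EEG is only approximately sparse and needs a wavelet or Gabor dictionary, and this is not the paper's implementation). The wearable side is just one matrix multiply; the heavy reconstruction, here a few orthogonal-matching-pursuit iterations, runs off-body.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 128, 48, 4                     # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x                               # the only data the wearable transmits

# Orthogonal matching pursuit: greedily pick the column most correlated
# with the residual, then re-fit on the selected support.
support, residual = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(Phi.T @ residual))))
    sub = Phi[:, support]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    residual = y - sub @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

The asymmetry is the point: the sensor performs m·n multiply-adds per window, while all iterative work lands on the receiving computer.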
3-d finite element model development for biomechanics: a software demonstration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollerbach, K.; Hollister, A.M.; Ashby, E.
1997-03-01
Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.
NASA Astrophysics Data System (ADS)
Mueller, David S.
2013-04-01
Selection of the appropriate extrapolation methods for computing the discharge in the unmeasured top and bottom parts of a moving-boat acoustic Doppler current profiler (ADCP) streamflow measurement is critical to the total discharge computation. The software tool, extrap, combines normalized velocity profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers' software.
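The power velocity distribution law mentioned above can be sketched as follows (an illustrative reconstruction, not USGS extrap code): fit u(z) = a·z^b to the measured portion of a normalized profile, then integrate the fitted curve over the unmeasured top and bottom fractions of the depth.

```python
import numpy as np

def fit_power_exponent(z, u):
    """Least-squares fit of log(u) = log(a) + b*log(z); z is normalized height above bed."""
    b, log_a = np.polyfit(np.log(z), np.log(u), 1)
    return np.exp(log_a), b

def unmeasured_discharge(a, b, z_lo, z_hi):
    """Integral of a*z**b over [z_lo, z_hi]: unit-width discharge in an unmeasured layer."""
    return a * (z_hi ** (b + 1) - z_lo ** (b + 1)) / (b + 1)

# Synthetic profile following the classic 1/6-power law, measured on z = 0.1-0.9
z = np.linspace(0.1, 0.9, 30)
a, b = fit_power_exponent(z, 1.2 * z ** (1 / 6))
q_bottom = unmeasured_discharge(a, b, 0.0, 0.1)   # below the ADCP's measured range
q_top = unmeasured_discharge(a, b, 0.9, 1.0)      # above it
```

Fitting the exponent to the composite of all transects, rather than assuming 1/6 everywhere, is the core idea the abstract attributes to extrap.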
NASA Technical Reports Server (NTRS)
Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.
1981-01-01
Small solar thermal power systems (up to 10 MWe in size) were tested. The solar thermal power plant ranking study was performed to aid experiment planning and to support decisions on the selection of the most appropriate technological approach. Cost and performance were determined for given insolation conditions using the Solar Energy Simulation computer code (SESII). This model optimizes the size of the collector field and energy storage subsystem for given engine generator and energy transport characteristics. The development of the simulation tool, its operation, and the results achieved from the analysis are discussed.
Electronics Environmental Benefits Calculator
The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase, use and disposal of electronics. The EEBC estimates the environmental and economic benefits of: purchasing Electronic Product Environmental Assessment Tool (EPEAT)-registered products; enabling power management features on computers and monitors above default percentages; extending the life of equipment beyond baseline values; reusing computers, monitors and cell phones; and recycling computers, monitors, cell phones and loads of mixed electronic products. The EEBC may be downloaded as a Microsoft Excel spreadsheet. See https://www.federalelectronicschallenge.net/resources/bencalc.htm for more details.
An interactive computer code for calculation of gas-phase chemical equilibrium (EQLBRM)
NASA Technical Reports Server (NTRS)
Pratt, B. S.; Pratt, D. T.
1984-01-01
A user-friendly, menu-driven, interactive computer program known as EQLBRM, which calculates the adiabatic equilibrium temperature and product composition resulting from the combustion of hydrocarbon fuels with air at specified constant pressure and enthalpy, is discussed. The program was developed primarily as an instructional tool to be run on small computers, allowing the user to economically and efficiently explore the effects of varying fuel type, air/fuel ratio, inlet air and/or fuel temperature, and operating pressure on the performance of continuous combustion devices such as gas turbine combustors, Stirling engine burners, and power generation furnaces.
A fast ultrasonic simulation tool based on massively parallel implementations
NASA Astrophysics Data System (ADS)
Lambert, Jason; Rougeron, Gilles; Lacassagne, Lionel; Chatillon, Sylvain
2014-02-01
This paper presents an optimized ultrasonic inspection simulation tool for CIVA, which takes advantage of the power of massively parallel architectures: graphics processing units (GPUs) and multi-core general-purpose processors (GPPs). This tool is based on the classical approach used in CIVA: the interaction model is based on Kirchhoff, and the ultrasonic field around the defect is computed by the pencil method. The model has been adapted and parallelized for both architectures. At this stage, the configurations addressed by the tool are: multi- and mono-element probes, planar specimens made of simple isotropic materials, and planar rectangular defects or side-drilled holes of small diameter. Validations of the model accuracy and performance measurements are presented.
Multi-objective reverse logistics model for integrated computer waste management.
Ahluwalia, Poonam Khanijo; Nema, Arvind K
2006-12-01
This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
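The weighted-sum tradeoff between cost and risk can be illustrated with a toy version of such a facility-selection model. The paper's actual formulation is an integer linear program; the brute-force enumeration, facility names, capacities, and coefficients below are invented for the sketch.

```python
from itertools import product

# Hypothetical facilities: name -> (fixed cost, unit cost, unit risk, capacity).
FACILITIES = {
    "recycle": (100.0, 2.0, 1.0, 60.0),
    "treat":   (80.0,  3.0, 2.0, 50.0),
    "dispose": (40.0,  1.0, 5.0, 100.0),
}
WASTE = 90.0  # total waste quantity to allocate

def evaluate(open_set, weight):
    """Cost and risk of allocating all waste to the open facilities, or None if infeasible."""
    if sum(FACILITIES[f][3] for f in open_set) < WASTE:
        return None
    cost = sum(FACILITIES[f][0] for f in open_set)  # fixed cost of every open facility
    risk = 0.0
    # Fill facilities in order of weighted unit score (greedy allocation).
    ranked = sorted(open_set,
                    key=lambda f: weight * FACILITIES[f][1] + (1 - weight) * FACILITIES[f][2])
    left = WASTE
    for f in ranked:
        _, unit_cost, unit_risk, cap = FACILITIES[f]
        x = min(left, cap)
        cost += unit_cost * x
        risk += unit_risk * x
        left -= x
    return cost, risk

def best_config(weight):
    """Enumerate open/closed decisions and minimize weight*cost + (1-weight)*risk."""
    best = None
    for mask in product([0, 1], repeat=len(FACILITIES)):
        open_set = [f for f, m in zip(FACILITIES, mask) if m]
        res = evaluate(open_set, weight) if open_set else None
        if res is None:
            continue
        score = weight * res[0] + (1 - weight) * res[1]
        if best is None or score < best[0]:
            best = (score, open_set, res)
    return best
```

Sweeping `weight` from 1 (cost only) toward 0 (risk only) traces the cost-risk tradeoff the abstract describes: the cheap but risky disposal route gives way to recycling and treatment.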
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
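The direct collocation idea can be seen in miniature with trapezoidal defect constraints on a double integrator (position, velocity, and an acceleration control). This sketch only builds and checks the defects that an NLP solver would drive to zero; it is our illustration, not the authors' OpenSim/MATLAB framework.

```python
def trapezoidal_defects(pos, vel, u, h):
    """Defects x[k+1] - x[k] - (h/2) * (f_k + f_{k+1}) for the dynamics
    pos' = vel, vel' = u. All defects must vanish on a feasible trajectory."""
    d = []
    for k in range(len(u) - 1):
        d.append(pos[k + 1] - pos[k] - h / 2.0 * (vel[k] + vel[k + 1]))
        d.append(vel[k + 1] - vel[k] - h / 2.0 * (u[k] + u[k + 1]))
    return d

# The analytic constant-acceleration trajectory satisfies the defects exactly,
# because the trapezoid rule is exact for integrands that are linear in time.
h, n = 0.1, 11
t = [k * h for k in range(n)]
u = [1.0] * n                    # constant unit acceleration
vel = [tk for tk in t]           # v(t) = t
pos = [tk ** 2 / 2.0 for tk in t]  # x(t) = t^2 / 2
defects = trapezoidal_defects(pos, vel, u, h)
```

In a full transcription, the states and controls at every node become decision variables and the defects become equality constraints; their banded structure is exactly the constraint-Jacobian sparsity that, per the abstract, gives IPOPT its performance advantage.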
Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems
NASA Technical Reports Server (NTRS)
Holda, Julie
2004-01-01
The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions, and it supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage, which will be used to set up a computer simulation of a space electric power system (EPS); the outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and RiskOptimizer software and utilized its capabilities for the Electric Power System (EPS) model, and I also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate preliminary cost factors into the model. A final task is to develop a cross-platform solution of this framework.
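A Monte Carlo assessment of this kind amounts to sampling technology figures of merit from assumed distributions and propagating them to a system-level metric. The technologies, specific-power numbers, and triangular distributions below are invented for illustration and are not from the GRC model.

```python
import random

random.seed(42)  # reproducible run

# Hypothetical technologies: name -> (low, mode, high) specific power in W/kg.
TECHS = {
    "solar_array": (60.0, 80.0, 110.0),
    "battery":     (90.0, 130.0, 160.0),
}
LOAD_W = 5000.0  # power the EPS must deliver

def sample_system_mass():
    """One Monte Carlo draw of total EPS mass (kg)."""
    mass = 0.0
    for low, mode, high in TECHS.values():
        specific_power = random.triangular(low, high, mode)
        mass += LOAD_W / specific_power
    return mass

samples = sorted(sample_system_mass() for _ in range(10_000))
p10, p90 = samples[1_000], samples[9_000]
mean = sum(samples) / len(samples)
```

The spread between the 10th and 90th percentiles is the kind of probabilistic output the abstract describes: it flags which technology uncertainties dominate the system-level risk.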
EMTP; A powerful tool for analyzing power system transients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, W.; Cotcher, D.; Ruiu, D.
1990-07-01
This paper reports on the electromagnetic transients program (EMTP), a general purpose computer program for simulating high-speed transient effects in electric power systems. The program features an extremely wide variety of modeling capabilities encompassing electromagnetic and electromechanical oscillations ranging in duration from microseconds to seconds. Examples of its use include switching and lightning surge analysis, insulation coordination, shaft torsional oscillations, ferroresonance, and HVDC converter control and operation. In the late 1960s Hermann Dommel developed the EMTP at Bonneville Power Administration (BPA), which considered the program to be the digital computer replacement for the transient network analyzer. The program initially comprised about 5000 lines of code, and was useful primarily for transmission line switching studies. As more uses for the program became apparent, BPA coordinated many improvements to the program. As the program grew in versatility and in size, it likewise became more unwieldy and difficult to use. One had to be an EMTP aficionado to take advantage of its capabilities.
The multi-disciplinary design study: A life cycle cost algorithm
NASA Technical Reports Server (NTRS)
Harding, R. R.; Pichi, F. J.
1988-01-01
The approach and results of a Life Cycle Cost (LCC) analysis of the Space Station Solar Dynamic Power Subsystem (SDPS) including gimbal pointing and power output performance are documented. The Multi-Discipline Design Tool (MDDT) computer program developed during the 1986 study has been modified to include the design, performance, and cost algorithms for the SDPS as described. As with the Space Station structural and control subsystems, the LCC of the SDPS can be computed within the MDDT program as a function of the engineering design variables. Two simple examples of MDDT's capability to evaluate cost sensitivity and design based on LCC are included. MDDT was designed to accept NASA's IMAT computer program data as input so that IMAT's detailed structural and controls design capability can be assessed with expected system LCC as computed by MDDT. No changes to IMAT were required. Detailed knowledge of IMAT is not required to perform the LCC analyses as the interface with IMAT is noninteractive.
RighTime: A real time clock correcting program for MS-DOS-based computer systems
NASA Technical Reports Server (NTRS)
Becker, G. Thomas
1993-01-01
A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
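The core of such a correction is a learned linear drift rate. The sketch below is our illustration of that idea with two accurate time sets; the function names and numbers are ours, not RighTime's.

```python
def learn_drift(t1, err1, t2, err2):
    """Drift rate (seconds of error per elapsed second) learned from two
    accurate time sets; err is (clock reading - true time) at true time t."""
    return (err2 - err1) / (t2 - t1)

def correct(reading, t_ref, err_ref, rate):
    """Remove the error accumulated since the reference time set."""
    return reading - err_ref - rate * (reading - t_ref)

# A clock that gains 2 seconds per day, i.e. roughly 23 ppm fast.
rate = learn_drift(0.0, 0.0, 86_400.0, 2.0)
raw = 43_201.0   # reading at true time 43,200 s, with half the daily drift accrued
fixed = correct(raw, 0.0, 0.0, rate)
```

The residual error is second-order small because the rate is applied to the (slightly wrong) reading rather than the unknown true time, which is why sub-ppm accuracy is achievable from occasional time sets alone.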
Computational Tools and Algorithms for Designing Customized Synthetic Genes
Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris
2014-01-01
Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
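The first-generation, single-objective approach the review describes (codon-usage optimization) reduces to picking the most frequent codon per residue, while objectives such as unique restriction sites require scanning the resulting sequence. The tiny usage table below is illustrative, not a real organism's.

```python
# Illustrative codon-usage fractions for three amino acids (made-up numbers).
USAGE = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.57, "TTC": 0.43},
}

def codon_optimize(protein):
    """Single-objective design: choose the most frequent codon for every residue."""
    return "".join(max(USAGE[aa], key=USAGE[aa].get) for aa in protein)

def has_site(dna, site):
    """A second objective (e.g. restriction-site avoidance) is a simple scan here;
    jointly satisfying many such objectives is what makes real gene design hard."""
    return site in dna

seq = codon_optimize("MKF")
```

Greedy per-residue choices like this can violate the other objectives, which is why the multi-objective tools surveyed in the review resort to heuristics over the combinatorial design space.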
Intelligent control system based on ARM for lithography tool
NASA Astrophysics Data System (ADS)
Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan
2014-08-01
The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles the complex algorithms and human-computer interaction and communicates with the MCU via a serial port; the MCU controls motors, electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses an S5PV210 as its processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment provides a polished, easy-to-use UI that makes control more user-friendly, and implements remote control and debugging, pushing product video information over the network. As a result, it is convenient for the equipment vendor to provide technical support to users. Finally, compared with a traditional lithography tool, this design removes the PC, using hardware resources efficiently and reducing cost and volume. Introducing an embedded OS and Internet-of-Things concepts into the design of lithography tools may be a development trend.
EPS/ECLSS consumables analyses for the Spacelab 1 flight
NASA Technical Reports Server (NTRS)
Steines, G. J.; Pipher, M. D.
1976-01-01
The results of electrical power system (EPS) and environmental control/life support system (ECLSS) consumables analyses for the Spacelab 1 mission are presented. The analyses were performed to assess the capability of the orbiter systems to support the proposed mission and to establish the various nonpropulsive consumables requirements. The EPS analysis was performed using the shuttle electrical power system (SEPS) analysis computer program. The ECLSS analysis was performed using the shuttle environmental consumables requirements evaluation tool (SECRET) program.
Parallel algorithm for computation of second-order sequential best rotations
NASA Astrophysics Data System (ADS)
Redif, Soydan; Kasap, Server
2013-12-01
Algorithms for computing an approximate polynomial matrix eigenvalue decomposition of para-Hermitian systems have emerged as a powerful, generic signal processing tool. A technique that has shown much success in this regard is the sequential best rotation (SBR2) algorithm. Proposed is a scheme for parallelising SBR2 with a view to exploiting the modern architectural features and inherent parallelism of field-programmable gate array (FPGA) technology. Experiments show that the proposed scheme can achieve low execution times while requiring minimal FPGA resources.
NASA Technical Reports Server (NTRS)
Peredo, James P.
1988-01-01
Like many large companies, Ames relies heavily on its computing power to get work done, and, finding the IBM PC a reliable tool, uses it for many of the same types of functions as other companies. Presentation and clarification needs demand much of graphics packages; programming and text editing call for simpler, more powerful packages. The storage space needed by NASA's scientists and users for the monumental amounts of data that Ames must keep demands database packages that are both large and easy to use. Access to the Micom Switching Network combines the power of the IBM PC with the capabilities of other computers and mainframes and allows users to communicate electronically. These four primary capabilities of the PC are vital to the needs of NASA's users and help to continue and support the vast amount of work done by NASA employees.
Surface Traps in Colloidal Quantum Dots: A Combined Experimental and Theoretical Perspective
2017-01-01
Surface traps are ubiquitous in nanoscopic semiconductor materials. Understanding their atomistic origin and manipulating them chemically are of capital importance for designing defect-free colloidal quantum dots and making a leap forward in the development of efficient optoelectronic devices. Recent advances in computing power have established computational chemistry as a powerful tool for accurately describing complex chemical species, and it has now become conceivable to model colloidal quantum dots with realistic sizes and shapes. In this Perspective, we combine the knowledge gathered in recent experimental findings with the computation of quantum dot electronic structures. We analyze three different systems, namely CdSe, PbS, and CsPbI3, as benchmark semiconductor nanocrystals, showing how different types of trap states can form at their surface. In addition, we suggest experimental healing of such traps according to their chemical origin and nanocrystal composition. PMID:28972763
Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás
2017-01-01
Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its infancy. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, and power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as a mechanism overlapping the intra-host DVFS technique. PMID:28085932
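The rationale behind DVFS can be summarized with the classic CMOS dynamic power relation P = C_eff * V^2 * f. The capacitance, voltage, and frequency values below are illustrative only and are not taken from the simulator described above.

```python
def dynamic_power(c_eff, volts, hertz):
    """Classic CMOS dynamic power: P = C_eff * V^2 * f (watts)."""
    return c_eff * volts ** 2 * hertz

def task_energy(c_eff, volts, hertz, cycles):
    """Energy to run a fixed-cycle task: power times runtime (cycles / f)."""
    return dynamic_power(c_eff, volts, hertz) * (cycles / hertz)

# Scaling a 1e9-cycle task from 2 GHz at 1.2 V down to 1 GHz at 0.9 V.
full = task_energy(1e-9, 1.2, 2.0e9, 1e9)    # joules at full speed
scaled = task_energy(1e-9, 0.9, 1.0e9, 1e9)  # joules after DVFS
```

Note that frequency cancels in the energy: it is C_eff * V^2 * cycles, so the saving comes from the lower voltage that a lower frequency permits. That is exactly the knob a DVFS governor trades against runtime and quality of service.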
Integrated solar energy system optimization
NASA Astrophysics Data System (ADS)
Young, S. K.
1982-11-01
The computer program SYSOPT, intended as a tool for optimizing the subsystem sizing, performance, and economics of integrated wind and solar energy systems, is presented. The modular structure of the methodology additionally allows simulations when the solar subsystems are combined with conventional technologies, e.g., a utility grid. Hourly energy/mass flow balances are computed for interconnection points, yielding optimized sizing and time-dependent operation of various subsystems. The program requires meteorological data, such as insolation, diurnal and seasonal variations, and wind speed at the hub height of a wind turbine, all of which can be taken from simulations like the TRNSYS program. Examples are provided for optimization of a solar-powered (wind turbine and parabolic trough-Rankine generator) desalinization plant, and a design analysis for a solar powered greenhouse.
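The hourly energy balance at the heart of such a tool can be caricatured in a few lines. The capacity factors and sizes below are invented; a real SYSOPT run would use a full year of meteorological data of the kind described above.

```python
def unmet_load(solar_kw, wind_kw, batt_kwh, load_kw, hours):
    """March through hourly (solar, wind) capacity factors, charging or
    discharging storage, and return total unserved energy (kWh)."""
    soc, unmet = 0.0, 0.0
    for sun, breeze in hours:
        net = solar_kw * sun + wind_kw * breeze - load_kw
        if net >= 0:
            soc = min(batt_kwh, soc + net)   # charge storage, spilling any excess
        else:
            draw = min(soc, -net)            # discharge what storage can cover
            soc -= draw
            unmet += -net - draw
    return unmet

# Four illustrative hours of (solar, wind) capacity factors.
HOURS = [(0.0, 0.6), (0.8, 0.2), (0.9, 0.1), (0.1, 0.5)]
```

An optimizer like SYSOPT wraps a search over `solar_kw`, `wind_kw`, and `batt_kwh` around such a balance, trading subsystem sizes against cost and unserved energy.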
Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles (SAE 2013-01-1470)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...
ERIC Educational Resources Information Center
Ocal, Mehmet Fatih
2017-01-01
Integrating the properties of computer algebra systems and dynamic geometry environments, Geogebra became an effective and powerful tool for teaching and learning mathematics. One of the reasons that teachers use Geogebra in mathematics classrooms is to make students learn mathematics meaningfully and conceptually. From this perspective, the…
The Virtual Lecture Hall: Utilisation, Effectiveness and Student Perceptions
ERIC Educational Resources Information Center
Cramer, Kenneth M.; Collins, Kandice R.; Snider, Don; Fawcett, Graham
2007-01-01
We introduce the Virtual Lecture Hall (VLH), an instructional computer-based platform for delivering Microsoft PowerPoint slides threaded with audio clips for later review. A total of 839 male and female university students enrolled in an introductory psychology class had access to review class lectures via the VLH. This tool was…
USDA-ARS?s Scientific Manuscript database
The temptation to include model parameters and high resolution input data together with the availability of powerful optimization and uncertainty analysis algorithms has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...
Stochastic Estimation and Control of Queues Within a Computer Network
2009-03-01
NS-2 is a network simulator developed at UC Berkeley and is a well-known, free, and powerful network simulation tool.
The Miller Motivation Scale: A New Counselling and Research Tool.
ERIC Educational Resources Information Center
Miller, Harold J.
The Miller Motivation Scale is a 160-item, computer-scored scale. It was developed to quickly and easily measure and display the motivational profile of the client. It has eight subscales. Five subscales measure encouragement, self-fulfillment and social interest. They are called Creative, Innovative, Productive, Cooperative, and Power. Three…
ERIC Educational Resources Information Center
Ardiel, Evan L.; Giles, Andrew C.; Yu, Alex J.; Lindsay, Theodore H.; Lockery, Shawn R.; Rankin, Catharine H.
2016-01-01
Habituation is a highly conserved phenomenon that remains poorly understood at the molecular level. Invertebrate model systems, like "Caenorhabditis elegans," can be a powerful tool for investigating this fundamental process. Here we established a high-throughput learning assay that used real-time computer vision software for behavioral…
Scripting for Collaborative Search Computer-Supported Classroom Activities
ERIC Educational Resources Information Center
Verdugo, Renato; Barros, Leonardo; Albornoz, Daniela; Nussbaum, Miguel; McFarlane, Angela
2014-01-01
Searching online is one of the most powerful resources today's students have for accessing information. Searching in groups is a daily practice across multiple contexts; however, the tools we use for searching online do not enable collaborative practices, and traditional search models assume a single user navigating online alone. This paper…
Earth Science Learning in SMALLab: A Design Experiment for Mixed Reality
ERIC Educational Resources Information Center
Birchfield, David; Megowan-Romanowicz, Colleen
2009-01-01
Conversational technologies such as email, chat rooms, and blogs have made the transition from novel communication technologies to powerful tools for learning. Currently virtual worlds are undergoing the same transition. We argue that the next wave of innovation is at the level of the computer interface, and that mixed-reality environments offer…
Factors Affecting Pre-Service TESOL Teachers' Attitudes towards Using CD-ROM Dictionary
ERIC Educational Resources Information Center
Issa, Jinan Hatem; Jamil, Hazri
2011-01-01
Rapid technological advances in communication technologies and computational power are altering the nature of knowledge, skills, talents and the know-how of individuals. A CD-ROM dictionary is an interesting and effective teaching tool, which captures pre-service teachers' interest and does much more than just translate, especially with the…
Effects of Computer-Based Visual Representation on Mathematics Learning and Cognitive Load
ERIC Educational Resources Information Center
Yung, Hsin I.; Paas, Fred
2015-01-01
Visual representation has been recognized as a powerful learning tool in many learning domains. Based on the assumption that visual representations can support deeper understanding, we examined the effects of visual representations on learning performance and cognitive load in the domain of mathematics. An experimental condition with visual…
Parallel Implementation of MAFFT on CUDA-Enabled Graphics Hardware.
Zhu, Xiangyuan; Li, Kenli; Salah, Ahmad; Shi, Lin; Li, Keqin
2015-01-01
Multiple sequence alignment (MSA) constitutes an extremely powerful tool for many biological applications including phylogenetic tree estimation, secondary structure prediction, and critical residue identification. However, aligning large biological sequences with popular tools such as MAFFT requires long runtimes on sequential architectures. Due to the ever-increasing sizes of sequence databases, there is increasing demand to accelerate this task. In this paper, we demonstrate how graphics processing units (GPUs), powered by the compute unified device architecture (CUDA), can be used as an efficient computational platform to accelerate the MAFFT algorithm. To fully exploit the GPU's capabilities for accelerating MAFFT, we have optimized the sequence data organization to eliminate the bandwidth bottleneck of memory access, designed a memory allocation and reuse strategy to make full use of the limited memory of GPUs, proposed a new modified-run-length encoding (MRLE) scheme to reduce memory consumption, and used high-performance shared memory to speed up I/O operations. Our implementation, tested on three NVIDIA GPUs, achieves a speedup of up to 11.28x on a Tesla K20m GPU compared to the sequential MAFFT 7.015.
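Plain run-length encoding conveys the idea behind the MRLE scheme mentioned above; the modified variant actually used in the GPU implementation differs, so treat this as the baseline concept only.

```python
def rle_encode(seq):
    """Collapse runs of identical residues/gaps into (symbol, count) pairs."""
    runs = []
    i = 0
    while i < len(seq):
        j = i
        while j < len(seq) and seq[j] == seq[i]:
            j += 1
        runs.append((seq[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Invert the encoding to recover the original row."""
    return "".join(sym * count for sym, count in runs)

# Gap-heavy alignment rows compress well, cutting memory traffic on the GPU.
row = "AAAA--CCG---TT"
runs = rle_encode(row)
```

For alignment matrices dominated by gap runs, storing counts instead of repeated symbols is what reduces the memory consumption the abstract refers to.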
Education review: applied medical informatics--informatics in medical education.
Naeymi-Rad, F; Trace, D; Moidu, K; Carmony, L; Booden, T
1994-05-01
The importance of informatics training within a health sciences program is well recognized and is being implemented on an increasing scale. At Chicago Medical School (CMS), the Informatics program incorporates information technology at every stage of medical education. First-year students are offered an elective in computer topics that concentrates on basic computer literacy. Second-year students learn information management skills such as data entry and information retrieval. For example, during the Introduction to Clinical Medicine course, the student is exposed to the Intelligent Medical Record-Entry (IMR-E), allowing the student to enter and organize information gathered from patient encounters. In the third year, students in the Internal Medicine rotation at Norwalk Hospital use Macintosh PowerBooks to enter and manage their patients' data. Patient data gathered by the student are stored on a local server in Norwalk Hospital. In the final year, we teach students the role of informatics in clinical decision making. The present senior class at CMS has been exposed to the power of medical informatics tools for several years. The use of these informatics tools at the point of care is stressed.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works only for single-accuracy experiments.
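The EQI and EQIE formulas are not reproduced in the abstract; for context, the standard expected improvement (EI) criterion that they extend can be sketched for a Gaussian-process surrogate as follows (a minimal illustration, not the authors' code):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Standard EI for minimization: E[max(f_best - Y, 0)] where the
    surrogate predicts Y ~ N(mu, sigma^2) at a candidate input."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # phi(z)
    return (f_best - mu) * cdf + sigma * pdf

# A candidate predicted at the current best but with high uncertainty
# still has positive EI, which is what drives exploration:
assert expected_improvement(mu=1.0, sigma=0.5, f_best=1.0) > 0.0
```

EQI and EQIE additionally weigh the accuracy level (and its cost) of each candidate run, which is what the sequential multi-fidelity scheme needs.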
Software Engineering for Scientific Computer Simulations
NASA Astrophysics Data System (ADS)
Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.
2004-11-01
Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works only for single-accuracy experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu Henry; Tate, Zeb; Abhyankar, Shrirang
The power grid has been evolving over the last 120 years, but it is seeing more changes in this decade and the next than it has seen over the past century. In particular, the widespread deployment of intermittent renewable generation, smart loads and devices, hierarchical and distributed control technologies, phasor measurement units, energy storage, and widespread usage of electric vehicles will require fundamental changes in methods and tools for the operation and planning of the power grid. The resulting new dynamic and stochastic behaviors will demand the inclusion of more complexity in modeling the power grid. Solving such complex models in the traditional computing environment will be a major challenge. Along with the increasing complexity of power system models, the increasing complexity of smart grid data further adds to the prevailing challenges. As the number of smart sensors and meters in the power grid increases by multiple orders of magnitude, so do the volume and speed of the data. The information infrastructure will need to change drastically to support the exchange of enormous amounts of data, as smart grid applications will need the capability to collect, assimilate, analyze, and process the data to meet real-time grid functions. High performance computing (HPC) holds the promise to enhance these functions, but it is a great resource that has not been fully explored and adopted for the power grid domain.
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the required download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second of these challenges, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight and easy to install, to have very few dependencies, and to be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
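A minimal sketch of the two transformations described above, time-slice to time-series conversion and climatological averaging, on assumed toy in-memory data (the variable names and shapes are illustrative; the actual tools stream NetCDF files in parallel):

```python
import numpy as np

# Hypothetical time-slice layout: one record per time step, each holding
# every variable on a small 2x2 grid. Time-series layout instead holds one
# variable across all time steps.
slices = [{"T": np.full((2, 2), float(t)), "U": np.full((2, 2), t * 0.1)}
          for t in range(12)]

def to_time_series(slices, var):
    """Stack one variable out of per-time-step records into a (time, ...) array."""
    return np.stack([s[var] for s in slices], axis=0)

def monthly_climatology(series, months_per_year=12):
    """Average each calendar month across years: (time, ...) -> (12, ...)."""
    years = series.shape[0] // months_per_year
    trimmed = series[: years * months_per_year]
    return trimmed.reshape(years, months_per_year, *series.shape[1:]).mean(axis=0)

T = to_time_series(slices, "T")   # shape (12, 2, 2): one variable, all steps
clim = monthly_climatology(T)     # with a single year, this equals T itself
```

The conversion step is what reduces later download bandwidth: an analyst fetching one variable pulls a single contiguous series instead of every time-slice file.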
Current And Future Directions Of Lens Design Software
NASA Astrophysics Data System (ADS)
Gustafson, Darryl E.
1983-10-01
The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.
Instrumentation, performance visualization, and debugging tools for multiprocessors
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.
1991-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging, and tuning parallel programs becomes intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etmektzoglou, A; Mishra, P; Svatos, M
Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open-source spreadsheet-based trajectory generation tool has been developed, tested, and made freely available. The computing power inherent in a spreadsheet environment, plus additional functions programmed into the tool, insulates users from the underlying schema tedium and allows easy calculation, parameterization, graphical visualization, validation, and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing, and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement, and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended-SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource for experimenting with families of complex geometric trajectories on a TrueBeam linac.
It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine-readable scripts without programming knowledge. As an open-source initiative, it also enables researcher collaboration on future developments. I am a full-time employee at Varian Medical Systems, Palo Alto, California.
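The Developer Mode XML schema is not reproduced here; a hypothetical sketch of how a parameterized ellipse trajectory might be sampled before being serialized to a script (the axis names, units, and parameter names below are illustrative only, not Varian's API):

```python
import math

def ellipse_trajectory(a_deg, b_deg, n_points):
    """Sample axis-position pairs along a parameterized ellipse.
    a_deg and b_deg are the semi-axes in degrees; two variables are enough
    to generate a whole family of such trajectories."""
    points = []
    for i in range(n_points):
        t = 2.0 * math.pi * i / (n_points - 1)  # parameter sweeps a full turn
        points.append({"gantry": a_deg * math.cos(t),
                       "couch": b_deg * math.sin(t)})
    return points

# Five control points along a 30-by-10 degree ellipse:
path = ellipse_trajectory(a_deg=30.0, b_deg=10.0, n_points=5)
```

This is the spreadsheet idea in miniature: a few cell-like parameters (`a_deg`, `b_deg`, `n_points`) expand analytically into a full list of control points ready for script generation.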
Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA
2008-01-01
Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. 
Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776
Benefit-cost methodology study with example application of the use of wind generators
NASA Technical Reports Server (NTRS)
Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.
1975-01-01
An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
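The plant factor defined above (ratio of average output power to rated power) is a one-line computation; a minimal sketch with hypothetical hourly data, not taken from the study:

```python
def plant_factor(power_series_kw, rated_kw):
    """Plant (capacity) factor: mean delivered power divided by rated power."""
    return sum(power_series_kw) / len(power_series_kw) / rated_kw

# Hypothetical hourly output of a 100 kW wind turbine over six hours:
hourly_kw = [0.0, 20.0, 60.0, 100.0, 80.0, 100.0]
pf = plant_factor(hourly_kw, rated_kw=100.0)  # 0.6, matching the best annual
                                              # values the study reports
```

In the study this average is taken over a year of wind-speed-distribution-driven output rather than a handful of hours, but the ratio is the same.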
BCILAB: a platform for brain-computer interface development
NASA Astrophysics Data System (ADS)
Kothe, Christian Andreas; Makeig, Scott
2013-10-01
Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.
Cazzaniga, Paolo; Nobile, Marco S.; Besozzi, Daniela; Bellini, Matteo; Mauri, Giancarlo
2014-01-01
The introduction of general-purpose Graphics Processing Units (GPUs) is boosting scientific applications in Bioinformatics, Systems Biology, and Computational Biology. In these fields, the use of high-performance computing solutions is motivated by the need to perform large numbers of in silico analyses to study the behavior of biological systems under different conditions, which requires computing power that usually exceeds the capability of standard desktop computers. In this work we present coagSODA, a CUDA-powered computational tool that was purposely developed for the analysis of a large mechanistic model of the blood coagulation cascade (BCC), defined according to both mass-action kinetics and Hill functions. coagSODA allows the execution of parallel simulations of the dynamics of the BCC by automatically deriving the system of ordinary differential equations and then exploiting the numerical integration algorithm LSODA. We present the biological results achieved with a massive exploration of perturbed conditions of the BCC, carried out with one-dimensional and two-dimensional parameter sweep analyses, and show that GPU-accelerated parallel simulations of this model can increase the computational performance up to a 181× speedup compared to the corresponding sequential simulations. PMID:25025072
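coagSODA couples GPU parallelism with the LSODA integrator; as a small CPU-side illustration, SciPy's `odeint` wraps the same LSODA algorithm and can integrate a toy mass-action system (this is a generic sketch, not the blood coagulation model):

```python
import numpy as np
from scipy.integrate import odeint  # odeint wraps the LSODA integrator

def dydt(y, t, k):
    """Toy mass-action system A -> B with rate constant k:
    dA/dt = -k*A, dB/dt = +k*A."""
    a, b = y
    return [-k * a, k * a]

t = np.linspace(0.0, 5.0, 50)
sol = odeint(dydt, [1.0, 0.0], t, args=(0.8,))  # shape (50, 2)

# Mass conservation: A + B stays constant along the trajectory.
assert np.allclose(sol.sum(axis=1), 1.0)
```

A parameter sweep in the spirit of the paper would re-run this integration over a grid of `k` values; the GPU version does those runs concurrently after deriving the ODE system automatically from the reaction network.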
Bio and health informatics meets cloud : BioVLab as an example.
Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun
2013-01-01
The exponential increase of genomic data brought by the advent of next- or third-generation sequencing (NGS) technologies and the dramatic drop in sequencing cost have turned biological and medical sciences into data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of data transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources, which is a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing in handling ever-increasing biological data. As data increase in size, many research organizations start to experience a lack of computing power, which becomes a major hurdle in achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security, and extensibility of features. We then discuss issues and limitations of current cloud systems and conclude with a suggestion of a biological cloud environment concept, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.
A CFD-informed quasi-steady model of flapping wing aerodynamics.
Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J
2015-11-01
Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems.
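The least-squares fitting step described above, relating wing kinematics to aerodynamic force through proportional coefficients, can be illustrated with synthetic data (the three kinematic basis terms and coefficient values here are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "CFD" samples: 200 instants, each with 3 hypothetical kinematic
# basis terms (e.g. translational, rotational, and added-mass contributions).
kinematics = rng.normal(size=(200, 3))
true_coeffs = np.array([1.8, -0.4, 0.9])
forces = kinematics @ true_coeffs  # noise-free forces generated from the model

# Least-squares fit of the quasi-steady proportional coefficients:
fitted, *_ = np.linalg.lstsq(kinematics, forces, rcond=None)
assert np.allclose(fitted, true_coeffs)
```

With noise-free data the fit recovers the generating coefficients exactly; in the paper the "forces" come from high-fidelity CFD, so the fitted coefficients fold in effects such as induced downwash that a blade element model misses.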
A CFD-informed quasi-steady model of flapping wing aerodynamics
Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J.
2016-01-01
Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems. PMID:27346891
How Not To Drown in Data: A Guide for Biomaterial Engineers.
Vasilevich, Aliaksei S; Carlier, Aurélie; de Boer, Jan; Singh, Shantanu
2017-08-01
High-throughput assays that produce hundreds of measurements per sample are powerful tools for quantifying cell-material interactions. With advances in automation and miniaturization in material fabrication, hundreds of biomaterial samples can be rapidly produced, which can then be characterized using these assays. However, the resulting deluge of data can be overwhelming. To the rescue are computational methods that are well suited to these problems. Machine learning techniques provide a vast array of tools to make predictions about cell-material interactions and to find patterns in cellular responses. Computational simulations allow researchers to pose and test hypotheses and perform experiments in silico. This review describes approaches from these two domains that can be brought to bear on the problem of analyzing biomaterial screening data. Copyright © 2017 Elsevier Ltd. All rights reserved.
The AstroVR Collaboratory, an On-line Multi-User Environment for Research in Astrophysics
NASA Astrophysics Data System (ADS)
van Buren, D.; Curtis, P.; Nichols, D. A.; Brundage, M.
We describe our experiment with an on-line collaborative environment where users share the execution of programs and communicate via audio, video, and typed text. Collaborative environments represent the next step in computer-mediated conferencing, combining powerful compute engines, data persistence, shared applications, and teleconferencing tools. As proof of concept, we have implemented a shared image analysis tool, allowing geographically distant users to analyze FITS images together. We anticipate that AstroVR (http://astrovr.ipac.caltech.edu:8888) and similar systems will become an important part of collaborative work in the next decade, including applications in remote observing, spacecraft operations, and on-line meetings, as well as day-to-day research activities. The technology is generic and promises to find uses in business, medicine, government, and education.
Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation
Biggs, Matthew B.; Papin, Jason A.
2013-01-01
Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.
Biggs, Matthew B; Papin, Jason A
2013-01-01
Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
The Internet and managed care: a new wave of innovation.
Goldsmith, J
2000-01-01
Managed care firms have been under siege in the political system and the marketplace for the past few years. The rise of the Internet has brought into being powerful new electronic tools for automating administrative and financial processes in health insurance. These tools may enable new firms or employers to create custom-designed networks connecting their workers and providers, bypassing health plans altogether. Alternatively, health plans may use these tools to create a new consumer-focused business model. While some disintermediation of managed care plans may occur, the barriers to adoption of Internet tools by established plans are quite low. Network computing may provide important leverage for health plans not only to retain their franchises but also to improve their profitability and customer service.
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
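As a rough illustration of the pattern the JMS abstract describes, a web layer queuing jobs for a cluster resource manager and reporting their status, the toy class below is a hypothetical in-memory sketch, not the JMS API or a real batch system interface:

```python
import uuid
from collections import deque

class MiniJobManager:
    """Toy stand-in for a workflow manager that accepts multi-stage jobs
    from a web interface and hands them to a batch system. Illustrative
    only; JMS wraps a real cluster resource manager."""

    def __init__(self):
        self.queue = deque()
        self.status = {}

    def submit(self, script, stages=1):
        """Queue a job and return its identifier."""
        job_id = str(uuid.uuid4())
        self.queue.append((job_id, script, stages))
        self.status[job_id] = "queued"
        return job_id

    def run_next(self):
        """Execute the oldest queued job, stage by stage."""
        job_id, script, stages = self.queue.popleft()
        self.status[job_id] = "running"
        results = [f"stage {i}: ran {script}" for i in range(stages)]
        self.status[job_id] = "complete"
        return job_id, results
```

A real implementation would submit each stage to the resource manager and poll for completion; the state transitions (queued, running, complete) are the part this sketch preserves.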
Single Cell Genomics: Approaches and Utility in Immunology
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-01-01
Single cell genomics offers powerful tools for studying lymphocytes, making it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data, discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102
High-resolution PET [Positron Emission Tomography] for Medical Science Studies
DOE R&D Accomplishments Database
Budinger, T. F.; Derenzo, S. E.; Huesman, R. H.; Jagust, W. J.; Valk, P. E.
1989-09-01
One of the unexpected fruits of basic physics research and the computer revolution is the noninvasive imaging power available to today's physician. Technologies that were strictly the province of research scientists only a decade or two ago now serve as the foundations for such standard diagnostic tools as x-ray computer tomography (CT), magnetic resonance imaging (MRI), magnetic resonance spectroscopy (MRS), ultrasound, single photon emission computed tomography (SPECT), and positron emission tomography (PET). Furthermore, prompted by the needs of both the practicing physician and the clinical researcher, efforts to improve these technologies continue. This booklet endeavors to describe the advantages of achieving high resolution in PET imaging.
Cloud computing for energy management in smart grid - an application survey
NASA Astrophysics Data System (ADS)
Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed
2016-03-01
The smart grid is an emerging energy system in which information technology, tools, and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for smart grid.
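The economic power dispatch the authors propose rests on the classical equal-incremental-cost result: for quadratic generator costs C_i(P) = a_i + b_i*P + c_i*P^2 and no generator limits, the optimum equalizes marginal costs across units. The sketch below is the textbook closed-form solution, not the paper's cloud-based model:

```python
def economic_dispatch(units, demand):
    """Equal-incremental-cost dispatch for quadratic cost curves
    C_i(P) = a_i + b_i*P + c_i*P^2, ignoring generator limits.
    units: list of (a, b, c) tuples. Returns (lambda, [P_i]).
    At the optimum, b_i + 2*c_i*P_i = lambda for every unit."""
    inv = sum(1.0 / (2.0 * c) for _, _, c in units)
    lam = (demand + sum(b / (2.0 * c) for _, b, c in units)) / inv
    outputs = [(lam - b) / (2.0 * c) for _, b, c in units]
    return lam, outputs
```

With generator limits or losses the closed form no longer applies and an iterative (lambda-iteration or optimization-based) solver is used instead.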
NASA Technical Reports Server (NTRS)
Bruce, E. A.
1980-01-01
The software developed by the IPAD project, a new and very powerful tool for the implementation of integrated Computer Aided Design (CAD) systems in the aerospace engineering community, is discussed. The IPAD software is a tool and, as such, can be well applied or misapplied in any particular environment. The many benefits of an integrated CAD system are well documented, but there are few such systems in existence, especially in the mechanical engineering disciplines, and therefore little available experience to guide the implementor.
NASA Enterprise Visual Analysis
NASA Technical Reports Server (NTRS)
Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck
2007-01-01
NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.
Novel Robotic Tools for Piping Inspection and Repair
2015-01-14
was selected due to its small size and peripheral capability. The SoM measures 50 mm x 44 mm. The SoM processor is an ARM Cortex-A8 running at 720 MHz ... designing an embedded computing system from scratch. The SoM is a single integrated module which contains the processor, RAM, power management, and ...
Fidget with Widgets: CNC Activity Introduces the Flatbed Router
ERIC Educational Resources Information Center
Tryon, Daniel V.
2006-01-01
The computer numerical control (CNC) flatbed router is a powerful tool and a must-have piece of equipment for any technology education program in which students will produce a product--whether it involves Manufacturing, Materials Processing, or any of the vast array of Project Lead the Way courses. This article describes an activity--producing a…
ERIC Educational Resources Information Center
Piatt, Carley; Coret, Marian; Choi, Michael; Volden, Joanne; Bisanz, Jeffrey
2016-01-01
Tablet computers (tablets) are positioned to be powerful, innovative, effective, and motivating research and assessment tools. We addressed two questions critical for evaluating the appropriateness of using tablets to study number-line estimation, a skill associated with math achievement and argued to be central to numerical cognition. First, is…
Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...
Introduction to Classical Density Functional Theory by a Computational Experiment
ERIC Educational Resources Information Center
Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel
2014-01-01
We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…
E-Books Plus: Role of Interactive Visuals in Exploration of Mathematical Information and E-Learning
ERIC Educational Resources Information Center
Rowhani, Sonja; Sedig, Kamran
2005-01-01
E-books promise to become a widespread delivery mechanism for educational resources. However, current e-books do not take full advantage of the power of computing tools. In particular, interaction with the content is often reduced to navigation through the information. This article investigates how adding interactive visuals to an e-book…
ERIC Educational Resources Information Center
Quintana, Maclovia; Morales, Alfonso
2015-01-01
Computer-mediated communications, in particular listservs, can be powerful tools for creating social change--namely, shifting our food system to a more healthy, just, and localised model. They do this by creating the conditions--collaborations, interaction, self-reflection, and personal empowerment--that cultivate distributed leadership. In this…
CT Imaging, Data Reduction, and Visualization of Hardwood Logs
Daniel L. Schmoldt
1996-01-01
Computed tomography (CT) is a mathematical technique that, combined with noninvasive scanning such as x-ray imaging, has become a powerful tool to nondestructively test materials prior to use or to evaluate materials prior to processing. In the current context, hardwood lumber processing can benefit greatly by knowing what a log looks like prior to initial breakdown....
Sensor-Free or Sensor-Full: A Comparison of Data Modalities in Multi-Channel Affect Detection
ERIC Educational Resources Information Center
Paquette, Luc; Rowe, Jonathan; Baker, Ryan; Mott, Bradford; Lester, James; DeFalco, Jeanine; Brawner, Keith; Sottilare, Robert; Georgoulas, Vasiliki
2016-01-01
Computational models that automatically detect learners' affective states are powerful tools for investigating the interplay of affect and learning. Over the past decade, affect detectors--which recognize learners' affective states at run-time using behavior logs and sensor data--have advanced substantially across a range of K-12 and postsecondary…
Recapturing Technology for Education: Keeping Tomorrow in Today's Classrooms
ERIC Educational Resources Information Center
Gura, Mark; Percy, Bernard
2005-01-01
Despite significant investment of funds, time, and effort in bringing computers, the Internet, and related technologies into the classrooms, educators have turned their backs on these new power tools of the intellect. School is the last remaining institution to keep 21st-century technology at arm's length. How can technology be used to enrich and…
An EMTP system level model of the PMAD DC test bed
NASA Technical Reports Server (NTRS)
Dravid, Narayan V.; Kacpura, Thomas J.; Tam, Kwa-Sur
1991-01-01
A power management and distribution direct current (PMAD DC) test bed was set up at the NASA Lewis Research Center to investigate Space Station Freedom Electric Power Systems issues. Efficiency of test bed operation significantly improves with a computer simulation model of the test bed as an adjunct tool of investigation. Such a model is developed using the Electromagnetic Transients Program (EMTP) and is available to the test bed developers and experimenters. The computer model is assembled on a modular basis. Device models of different types can be incorporated into the system model with only a few lines of code. A library of the various model types is created for this purpose. Simulation results and corresponding test bed results are presented to demonstrate model validity.
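State-equation device modeling of the kind the EMTP test bed model relies on can be illustrated with the simplest possible device, an RC branch obeying dv/dt = (V_in - v)/(RC). The sketch below integrates that single state equation with forward Euler; it is illustrative only, since EMTP itself builds trapezoidal companion models for each device:

```python
def simulate_rc(v0, vin, R, C, dt, steps):
    """Forward-Euler integration of the RC state equation
    dv/dt = (vin - v) / (R*C). Returns the voltage trace,
    which rises exponentially from v0 toward vin."""
    v = v0
    trace = [v]
    tau = R * C
    for _ in range(steps):
        v += dt * (vin - v) / tau
        trace.append(v)
    return trace
```

A modular simulator stacks many such device equations into one state vector; the "few lines of code per device" claim in the abstract corresponds to registering each device's contribution to that vector.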
Providing security for automated process control systems at hydropower engineering facilities
NASA Astrophysics Data System (ADS)
Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.
2016-12-01
This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats arising when SCADA is integrated with the unified enterprise information system.
VisRseq: R-based visual framework for analysis of sequencing data.
Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M
2015-01-01
Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Kai; Qi, Junjian; Kang, Wei
2016-08-01
Growing penetration of intermittent resources such as renewable generation increases the risk of instability in a power grid. This paper introduces the concept of observability and its computational algorithms for a power grid monitored by the wide-area measurement system (WAMS) based on synchrophasors, e.g. phasor measurement units (PMUs). The goal is to estimate real-time states of generators, especially for potentially unstable trajectories, information that is critical for the detection of rotor angle instability of the grid. The paper studies the number and siting of synchrophasors in a power grid so that the state of the system can be accurately estimated in the presence of instability. An unscented Kalman filter (UKF) is adopted as a tool to estimate the dynamic states that are not directly measured by synchrophasors. The theory and its computational algorithms are illustrated in detail using a 9-bus 3-generator power system model and then tested on a 140-bus 48-generator Northeast Power Coordinating Council power grid model. Case studies on those two systems demonstrate the performance of the proposed approach using a limited number of synchrophasors for dynamic state estimation for stability assessment and its robustness against moderate inaccuracies in model parameters.
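The UKF adopted in the paper propagates state estimates through nonlinear generator dynamics using deterministically chosen sigma points rather than linearization. Its core building block, the unscented transform, can be sketched in the scalar case as follows; this is a one-dimensional illustration, not the paper's multi-machine implementation:

```python
import math

def unscented_transform_1d(mean, var, f, alpha=1.0, beta=2.0, kappa=2.0):
    """Scalar unscented transform: propagate a mean/variance pair
    through a nonlinear function f via three sigma points (n = 1).
    For a linear f the result is exact."""
    n = 1
    lam = alpha**2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigma = [mean, mean + spread, mean - spread]
    wm = [lam / (n + lam), 1.0 / (2 * (n + lam)), 1.0 / (2 * (n + lam))]
    wc = list(wm)
    wc[0] += 1.0 - alpha**2 + beta  # standard covariance-weight correction
    y = [f(s) for s in sigma]
    y_mean = sum(w * yi for w, yi in zip(wm, y))
    y_var = sum(w * (yi - y_mean)**2 for w, yi in zip(wc, y))
    return y_mean, y_var
```

A full UKF alternates this transform between a predict step (through the dynamics) and an update step (through the synchrophasor measurement model).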
COMPUTATIONAL METHODOLOGIES for REAL-SPACE STRUCTURAL REFINEMENT of LARGE MACROMOLECULAR COMPLEXES
Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus
2017-01-01
The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875
Towards quantum chemistry on a quantum computer.
Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G
2010-02-01
Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.
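The "complete energy spectrum" reported in the abstract has a simple classical reference point: for a two-level system such as H2 in a minimal basis, the spectrum follows in closed form from a 2x2 Hamiltonian. The matrix elements in the sketch below are illustrative placeholders, not actual H2 integrals:

```python
import math

def symmetric_2x2_spectrum(h11, h22, h12):
    """Exact eigenvalues of the real symmetric Hamiltonian
    [[h11, h12], [h12, h22]], returned as (ground, excited).
    This is the classical calculation a small quantum-chemistry
    experiment can be checked against."""
    avg = 0.5 * (h11 + h22)
    rad = math.sqrt((0.5 * (h11 - h22))**2 + h12**2)
    return avg - rad, avg + rad
```

The exponential-cost wall the abstract describes appears only as the basis grows: an n-orbital problem needs a Hamiltonian of dimension exponential in n, which is exactly what a quantum computer is meant to sidestep.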
A computer method for schedule processing and quick-time updating.
NASA Technical Reports Server (NTRS)
Mccoy, W. H.
1972-01-01
A schedule analysis program is presented which can be used to process any schedule with continuous flow and with no loops. Although generally thought of as a management tool, it has applicability to such extremes as music composition and computer program efficiency analysis. Other possibilities for its use include the determination of electrical power usage during some operation such as spacecraft checkout, and the determination of impact envelopes for the purpose of scheduling payloads in launch processing. At the core of the described computer method is an algorithm which computes the position of each activity bar on the output waterfall chart. The algorithm is basically a maximal-path computation which gives to each node in the schedule network the maximal path from the initial node to the given node.
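The maximal-path computation described above is the standard longest-path recursion over a topologically ordered acyclic activity network; a minimal sketch (generic algorithm, not the 1972 program's code):

```python
from collections import defaultdict, deque

def maximal_paths(edges, start):
    """Longest (maximal) path length from `start` to every node of an
    acyclic activity network, processed in topological order; this is
    the quantity that positions each activity bar on a waterfall chart.
    edges: list of (u, v, duration)."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    nodes = {start}
    for u, v, w in edges:
        succ[u].append((v, w))
        indeg[v] += 1
        nodes.update((u, v))
    order = deque(n for n in nodes if indeg[n] == 0)
    dist = {n: float("-inf") for n in nodes}  # -inf: unreachable from start
    dist[start] = 0
    while order:
        u = order.popleft()
        for v, w in succ[u]:
            if dist[u] != float("-inf") and dist[u] + w > dist[v]:
                dist[v] = dist[u] + w
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)
    return dist
```

The no-loops restriction in the abstract is what makes this single topological pass sufficient; a cycle would make "maximal path" undefined.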
CFD and Neutron codes coupling on a computational platform
NASA Astrophysics Data System (ADS)
Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.
2017-01-01
In this work we investigate the thermal-hydraulics behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated and the new temperature field is computed. The exchange of data between the two codes is made possible by their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The numerical libraries MEDmem, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to another. The two problems are driven by a common supervisor that can access the computational fields of both systems. In every time step, the temperature field is extracted from the CFD problem and set into the neutron problem. After this iteration the new power peak factor is projected back into the CFD problem and the next time step can be computed. Several computational examples, where both neutron and thermal-hydraulics quantities are parametrized, are finally reported in this work.
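The data exchange described above amounts to a Picard (fixed-point) iteration between the two solvers. The schematic below uses toy feedback closures standing in for the CFD and neutronics codes; it illustrates the coupling loop only, not the SALOME/MEDmem API:

```python
def couple(power0, flux_solver, thermal_solver, tol=1e-8, max_iter=100):
    """Schematic Picard iteration between a 'neutronics' solve (power
    from temperature) and a 'thermal' solve (temperature from power).
    Converges when successive power iterates agree to within tol."""
    power = power0
    for _ in range(max_iter):
        temp = thermal_solver(power)       # CFD step: T from P
        new_power = flux_solver(temp)      # neutronics step: P from T
        if abs(new_power - power) < tol:
            return new_power, temp
        power = new_power
    raise RuntimeError("coupling did not converge")
```

The negative temperature feedback of a PWR (power falls as temperature rises) is what makes this iteration a contraction in the toy test below.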
Garrett, Daniel S; Gronenborn, Angela M; Clore, G Marius
2011-12-01
The Contour Approach to Peak Picking was developed to aid in the analysis and interpretation of multidimensional NMR spectra of large biomolecules. In essence, it comprises an interactive graphics software tool to computationally select resonance positions in heteronuclear 3- and 4D spectra. Copyright © 2011. Published by Elsevier Inc.
1987-01-01
... after the MYCIN expert system. Host computer: PC+ is available on both symbolic and numeric computers; it operates on the IBM PC AT and TI Bus-Pro (IBM PC ...). Suppose that the database contains 100 motors, and in only one case does a lightweight motor produce more power than heavier units ... every decision point takes time ... ART 2.0; in the bargain it consumes 10 times less storage. ART 3.0 reduces the comparison ...
Renaud, Patrice; Joyal, Christian; Stoleru, Serge; Goyette, Mathieu; Weiskopf, Nikolaus; Birbaumer, Niels
2011-01-01
This chapter proposes a prospective view on using a real-time functional magnetic imaging (rt-fMRI) brain-computer interface (BCI) application as a new treatment for pedophilia. Neurofeedback mediated by interactive virtual stimuli is presented as the key process in this new BCI application. Results on the diagnostic discriminant power of virtual characters depicting sexual stimuli relevant to pedophilia are given. Finally, practical and ethical implications are briefly addressed. Copyright © 2011 Elsevier B.V. All rights reserved.
Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study
NASA Technical Reports Server (NTRS)
Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Sdhuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.
2011-01-01
A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV) to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault-tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.
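One of the voting variants such a study weighs can be illustrated with a majority voter over redundant flight-computer channel outputs. This is a generic sketch of the technique, not the study's design:

```python
from collections import Counter

def majority_vote(channels):
    """Majority voter over redundant channel outputs: returns the
    value more than half the channels agree on, plus the agreeing
    count, and raises if no strict majority exists (a detected,
    uncorrectable fault)."""
    value, count = Counter(channels).most_common(1)[0]
    if count <= len(channels) // 2:
        raise ValueError("no majority: channels disagree")
    return value, count
```

With three channels this is classic triple modular redundancy: any single faulty channel is outvoted, while a two-way or three-way split is flagged rather than masked.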
TEMPEST: A computer code for three-dimensional analysis of transient fluid dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fort, J.A.
TEMPEST (Transient Energy Momentum and Pressure Equations Solutions in Three dimensions) is a powerful tool for solving engineering problems in nuclear energy, waste processing, chemical processing, and environmental restoration because it performs three-dimensional, time-dependent computational fluid dynamics and heat transfer analysis. It is a family of codes with two primary versions: an N-Version (available to the public) and a T-Version (not currently available to the public). This handout discusses its capabilities, applications, numerical algorithms, development status, and availability and assistance.
Using artificial intelligence to control fluid flow computations
NASA Technical Reports Server (NTRS)
Gelsey, Andrew
1992-01-01
Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
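The run-inspect-rerun loop the abstract wants to automate can be sketched as a search driver wrapped around a black-box simulation. Here a simple bisection tunes one input until the simulated output hits a target, assuming a monotonic response; this is a toy driver, not the author's AI method:

```python
def tune_input(simulate, target, lo, hi, tol=1e-6):
    """Automated parameter search: repeatedly run `simulate` and
    bisect the input interval [lo, hi] until the output matches
    `target`. Assumes the response is monotonically increasing
    and that the target is bracketed by simulate(lo), simulate(hi)."""
    f_lo = simulate(lo) - target
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        f_mid = simulate(mid) - target
        if abs(f_mid) < tol:
            return mid
        if (f_lo < 0) == (f_mid < 0):
            lo, f_lo = mid, f_mid   # target still above: move lower bound
        else:
            hi = mid                # target below: move upper bound
    return 0.5 * (lo + hi)
```

The AI work described goes further than this: it must also judge whether a simulation's output "makes sense" before trusting it, which a blind bisection cannot do.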
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.
2009-12-01
LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. 
The system also encourages users to explore data processing methods and the variations in algorithm parameters since all of the processing is done remotely and numerous jobs can be submitted in sequence. The web-based software also eliminates the need for users to deal with the hassles and costs associated with software installation and licensing while providing adequate disk space for storage and personal job archival capability. Although currently limited to data access and DEM generation tasks, the OpenTopography system is modular in design and can be modified to accommodate new processing tools as they become available. We are currently exploring implementation of higher-level DEM analysis tasks in OpenTopography, since such processing is often computationally intensive and thus lends itself to utilization of cyberinfrastructure. Products derived from OpenTopography processing are available in a variety of formats ranging from simple Google Earth visualizations of LIDAR-derived hillshades to various GIS-compatible grid formats. To serve community users less interested in data processing, OpenTopography also hosts 1 km^2 digital elevation model tiles as well as Google Earth image overlays for a synoptic view of the data.
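A minimal version of the DEM generation step described above is binned gridding: collect all LIDAR returns falling in each square cell and average their elevations. This sketches one simple method only, not the portal's actual algorithms:

```python
from collections import defaultdict

def grid_dem(points, cell):
    """Bin (x, y, z) point-cloud returns into square cells of side
    `cell` and average the elevations per cell.
    Returns {(col, row): mean_z}; empty cells are simply absent."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        sums[key] += z
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}
```

Production DEM pipelines typically add interpolation (e.g. inverse-distance weighting or splines) to fill empty cells and smooth cell-edge artifacts; the binning step above is the part that parallelizes trivially across compute resources.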
Analysis and simulation tools for solar array power systems
NASA Astrophysics Data System (ADS)
Pongratananukul, Nattorn
This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general purpose circuit simulators has been developed based on the modeling of individual solar cells. Hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as the evaluation of the impact of environmental conditions. A second developed tool provides a co-simulation platform with the capability to verify the performance of an actual digital controller implemented in programmable hardware such as a DSP processor, while the entire solar array including the DC-DC power converter is modeled in software algorithms running on a computer. This "virtual plant" allows developing and debugging code for the digital controller, and also to improve the control algorithm. One important task in solar arrays is to track the maximum power point on the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm, before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we have designed and tested, the control algorithm is implemented on a single digital signal processor. In each of the channels the maximum power point is tracked individually. 
In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. At the end, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
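A common tracking algorithm of the kind the dissertation targets is perturb-and-observe: step the operating voltage, keep the direction if power rose, reverse it otherwise. The sketch below assumes a toy single-peak P(V) curve and is illustrative, not the dissertation's DSP implementation:

```python
def perturb_and_observe(power_at, v_start, dv, iterations):
    """Perturb-and-observe maximum power point tracking. `power_at`
    maps operating voltage to delivered power (the array's P-V curve).
    The tracker climbs to the peak, then oscillates around it with
    amplitude set by the step size dv."""
    v = v_start
    p = power_at(v)
    step = dv
    for _ in range(iterations):
        v_new = v + step
        p_new = power_at(v_new)
        if p_new < p:
            step = -step   # power fell: reverse the perturbation
        v, p = v_new, p_new
    return v
```

The steady-state oscillation around the peak is the classic drawback of perturb-and-observe, and one reason a co-simulation platform is valuable for tuning the step size before hardware deployment.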
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Live broadcast of laparoscopic surgery to handheld computers.
Gandsas, A; McIntire, K; Park, A
2004-06-01
Thanks to advances in computing power and miniaturization technology, portable electronic devices are now being used to assist physicians with various applications that extend far beyond Web browsing or sending e-mail. Handheld computers are used for electronic medical records, billing, and coding, and to enable convenient access to electronic journals for reference purposes. The results of diagnostic investigations, such as laboratory results, study reports, and still radiographic pictures, can also be downloaded into portable devices for later viewing. Handheld computer technology, combined with wireless protocols and streaming video technology, has the added potential to become a powerful educational tool for medical students and residents. The purpose of this study was to assess the feasibility of transferring multimedia data in real time to a handheld computer via a wireless network and displaying them on the computer screens of clients at remote locations. A laparoscopic splenectomy was broadcast live to eight handheld computers simultaneously through our institution's wireless network. All eight viewers were able to view the procedure and to hear the surgeon's comments throughout the entire operation. Handheld computer technology can play a key role in surgical education by delivering information to surgical residents or students when they are geographically distant from the actual event. Validation of this new technology through clinical research is still needed to determine whether resident physicians or medical students can benefit from the use of handheld computers.
Comparison of ISS Power System Telemetry with Analytically Derived Data for Shadowed Cases
NASA Technical Reports Server (NTRS)
Fincannon, H. James
2002-01-01
Accurate International Space Station (ISS) power prediction requires the quantification of solar array shadowing. Prior papers have discussed the NASA Glenn Research Center (GRC) ISS power system tool SPACE (System Power Analysis for Capability Evaluation) and its integrated shadowing algorithms. On-orbit telemetry has become available that permits the correlation of theoretical shadowing predictions with actual data. This paper documents the comparison of a shadowing metric (total solar array current) as derived from SPACE predictions and on-orbit flight telemetry data for representative significant shadowing cases. Images from flight video recordings and the SPACE computer program graphical output are used to illustrate the comparison. The accuracy of the SPACE shadowing capability is demonstrated for the cases examined.
An 'electronic' extramural course in epidemiology and medical statistics.
Ostbye, T
1989-03-01
This article describes an extramural university course in epidemiology and medical statistics taught using a computer conferencing system, microcomputers, and data communications. Computer conferencing was shown to be a powerful, yet quite easily mastered, vehicle for distance education. It allows health personnel who are unable to attend regular classes due to geographical or time constraints to take part in an interactive learning environment at low cost. This overcomes part of the intellectual and social isolation associated with traditional correspondence courses. The teaching of epidemiology and medical statistics is well suited to computer conferencing, even if the asynchronicity of the medium makes discussion of the most complex statistical concepts a little cumbersome. Computer conferencing may also prove to be a useful tool for teaching other medical and health-related subjects.
Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.
Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao
2018-02-01
Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore the toxicity of various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-disciplinary field that utilizes computational power and algorithms to examine the toxicology of biological systems, has become attractive to scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular in computational toxicology for understanding the interactions between biological systems and chemicals. In this paper, we review MD simulation methods, the protocol for running MD simulations, and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for the execution of MD simulations. Published by Elsevier Ltd.
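At the core of every MD simulation reviewed here is a time integrator; velocity Verlet is the standard choice. The following is a hedged, minimal sketch on a one-dimensional harmonic bond (force constant `k` and all values invented for illustration), not a protocol from the review itself.

```python
# Velocity Verlet integration of Newton's equations for a 1-D harmonic bond.

def velocity_verlet(x, v, force, m=1.0, dt=0.01, steps=1000):
    """Return the trajectory of positions under the given force function."""
    traj = [x]
    f = force(x)
    for _ in range(steps):
        v += 0.5 * dt * f / m          # half-kick
        x += dt * v                    # drift
        f = force(x)                   # recompute force at new position
        v += 0.5 * dt * f / m          # half-kick
        traj.append(x)
    return traj

k = 4.0                                # illustrative spring constant
traj = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x)
```

Because velocity Verlet is symplectic, the oscillation amplitude stays bounded near its initial value over long runs, which is why it (and close relatives like leapfrog) underpins production MD codes.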
Use of parallel computing for analyzing big data in EEG studies of ambiguous perception
NASA Astrophysics Data System (ADS)
Maksimenko, Vladimir A.; Grubov, Vadim V.; Kirsanov, Daniil V.
2018-02-01
The problem of interaction between humans and machine systems through neuro-interfaces (or brain-computer interfaces) is an urgent task that requires the analysis of large amounts of neurophysiological EEG data. In the present paper we consider the methods of parallel computing as one of the most powerful tools for processing experimental data in real time with respect to the multichannel structure of EEG. In this context we demonstrate the application of parallel computing to the estimation of the spectral properties of multichannel EEG signals associated with visual perception. Using the CUDA C library we run a wavelet-based algorithm on GPUs and show the possibility of detecting specific patterns in a multichannel set of EEG data in real time.
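The wavelet-based spectral estimation described above can be sketched on the CPU with a complex Morlet wavelet; this hedged NumPy version illustrates the per-channel computation that the paper parallelizes with CUDA C on GPUs. The synthetic 10-Hz "alpha rhythm" and all constants are invented for the example.

```python
# Time-resolved spectral power of a signal via complex Morlet wavelet convolution.
import numpy as np

def morlet_power(signal, fs, freq, n_cycles=6.0):
    """Power of `signal` at `freq` Hz, estimated by Morlet-wavelet convolution."""
    sigma_t = n_cycles / (2 * np.pi * freq)            # wavelet time spread
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))
    wavelet /= np.abs(wavelet).sum()                   # normalize gain
    analytic = np.convolve(signal, wavelet, mode="same")
    return np.abs(analytic) ** 2

fs = 250.0                                  # typical EEG sampling rate (assumed)
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)            # synthetic 10-Hz alpha rhythm
power_10 = morlet_power(eeg, fs, freq=10.0)
power_25 = morlet_power(eeg, fs, freq=25.0)
```

On a GPU, each channel-frequency pair of this computation is independent, which is what makes the multichannel EEG analysis embarrassingly parallel.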
A simple computer-based measurement and analysis system of pulmonary auscultation sounds.
Polat, Hüseyin; Güler, Inan
2004-12-01
Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study, a computer-based system has been designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system presents the following features: it is able to digitally record lung sounds captured with an electronic stethoscope plugged into a sound card on a portable computer, display the lung sound waveform for auscultation sites, record the lung sound in ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display its time-expanded waveform, compute the fast Fourier transform (FFT), and display the power spectrum and spectrogram.
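The FFT power spectrum step at the end of that analysis chain can be sketched in a few lines of NumPy. This is a hedged stand-in for the DasyLAB processing, applied to a synthetic tone rather than real auscultation data; the 400-Hz "wheeze" and 8-kHz sampling rate are assumptions for the example.

```python
# One-sided FFT power spectrum of an audio signal.
import numpy as np

def power_spectrum(x, fs):
    """Return (frequencies, power) for a real-valued signal sampled at fs Hz."""
    n = len(x)
    spectrum = np.fft.rfft(x * np.hanning(n))    # window to reduce spectral leakage
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, np.abs(spectrum) ** 2

fs = 8000.0                                      # sound-card sampling rate (assumed)
t = np.arange(0, 1, 1 / fs)
sound = (np.sin(2 * np.pi * 400 * t)             # synthetic 400-Hz "wheeze"
         + 0.1 * np.random.default_rng(1).normal(size=t.size))
freqs, power = power_spectrum(sound, fs)
peak_hz = freqs[np.argmax(power)]
```

A spectrogram, as displayed by the described system, is the same computation repeated over short overlapping windows of the recording.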
Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.
2016-01-01
The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
NASA Technical Reports Server (NTRS)
1984-01-01
The plant comprises a solar pond electric power generation subsystem, an electric power transformer and switch yard, a large solar pond, a water treatment plant, and numerous storage and evaporation ponds. Because a solar pond stores thermal energy over a long period of time, plant operation at any point in time is dependent upon past operation and future perceived generation plans. This time, or past-history, factor introduces a new dimension into the design process. The design optimization of a plant must go beyond examination of operational state points and consider the seasonal variations in solar input, solar pond energy storage, and the desired annual plant duty-cycle profile. Models or design tools will be required to optimize a plant design. These models should be developed to include a proper but not excessive level of detail. The model should be targeted to a specific objective and not conceived as a do-everything analysis tool, i.e., system design and not gradient-zone stability.
Fast and Epsilon-Optimal Discretized Pursuit Learning Automata.
Zhang, JunQi; Wang, Cheng; Zhou, MengChu
2015-10-01
Learning automata (LA) are powerful tools for reinforcement learning. A discretized pursuit LA is the most popular one among them. During each iteration, its operation consists of three basic phases: 1) selecting the next action; 2) finding the optimal estimated action; and 3) updating the state probability. However, when the number of actions is large, the learning becomes extremely slow because there are too many updates to be made at each iteration. The increased updates come mostly from phases 1 and 3. A new fast discretized pursuit LA with assured ε-optimality is proposed that performs both phases 1 and 3 with computational complexity independent of the number of actions. Apart from its low computational complexity, it achieves faster convergence than the classical one when operating in stationary environments. This work can promote the application of LA in large-scale-action areas that require efficient reinforcement learning tools with assured ε-optimality, fast convergence, and low computational complexity per iteration.
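The three phases of the classical discretized pursuit automaton (the baseline this paper accelerates) can be sketched directly. This is a hedged illustration: the environment's reward probabilities, the resolution, and the iteration count are all invented for the example, and the paper's fast variant restructures phases 1 and 3 to avoid the per-action updates shown here.

```python
# Classical discretized pursuit learning automaton against a stationary
# Bernoulli environment with illustrative reward probabilities.
import random

def discretized_pursuit(reward_probs, resolution=1000, iterations=20000, seed=7):
    rng = random.Random(seed)
    r = len(reward_probs)
    delta = 1.0 / (r * resolution)          # discretized probability step
    p = [1.0 / r] * r                       # action-probability vector
    wins = [1] * r                          # optimistic initial reward estimates
    counts = [1] * r
    for _ in range(iterations):
        a = rng.choices(range(r), weights=p)[0]              # phase 1: select action
        counts[a] += 1
        wins[a] += rng.random() < reward_probs[a]            # environment response
        best = max(range(r), key=lambda i: wins[i] / counts[i])  # phase 2: best estimate
        for i in range(r):                                   # phase 3: pursue the best
            if i != best:
                p[i] = max(p[i] - delta, 0.0)
        p[best] = 1.0 - sum(p[i] for i in range(r) if i != best)
    return p

p = discretized_pursuit([0.2, 0.45, 0.8, 0.5])
```

Note that phase 3 touches every action each iteration, which is exactly the O(r) cost per iteration that becomes prohibitive for large action sets.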
MIGS-GPU: Microarray Image Gridding and Segmentation on the GPU.
Katsigiannis, Stamos; Zacharia, Eleni; Maroulis, Dimitris
2017-05-01
Complementary DNA (cDNA) microarray is a powerful tool for simultaneously studying the expression level of thousands of genes. Nevertheless, the analysis of microarray images remains an arduous and challenging task due to the poor quality of the images that often suffer from noise, artifacts, and uneven background. In this study, the MIGS-GPU [Microarray Image Gridding and Segmentation on Graphics Processing Unit (GPU)] software for gridding and segmenting microarray images is presented. MIGS-GPU's computations are performed on the GPU by means of the compute unified device architecture (CUDA) in order to achieve fast performance and increase the utilization of available system resources. Evaluation on both real and synthetic cDNA microarray images showed that MIGS-GPU provides better performance than state-of-the-art alternatives, while the proposed GPU implementation achieves significantly lower computational times compared to the respective CPU approaches. Consequently, MIGS-GPU can be an advantageous and useful tool for biomedical laboratories, offering a user-friendly interface that requires minimum input in order to run.
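Gridding, the first of MIGS-GPU's two stages, amounts to locating the rows and columns of spots in the image. A common CPU baseline, and a hedged stand-in for the GPU approach described above, projects image intensity onto each axis and finds the peaks; the synthetic 3x3 spot image and the threshold are invented for the example.

```python
# Microarray gridding by axis projection on a synthetic spot image.
import numpy as np

def grid_positions(profile, threshold=0.5):
    """Centers of contiguous above-threshold runs in a 1-D intensity projection."""
    mask = profile > threshold * profile.max()
    centers, start = [], None
    for i, on in enumerate(mask):
        if on and start is None:
            start = i                          # run begins
        elif not on and start is not None:
            centers.append((start + i - 1) // 2)   # run ended at i-1
            start = None
    if start is not None:
        centers.append((start + len(mask) - 1) // 2)
    return centers

# Synthetic image: bright 5x5 spots on a 60x60 background, every 20 pixels.
img = np.zeros((60, 60))
for r in (10, 30, 50):
    for c in (10, 30, 50):
        img[r - 2:r + 3, c - 2:c + 3] = 1.0

rows = grid_positions(img.sum(axis=1))         # project onto rows
cols = grid_positions(img.sum(axis=0))         # project onto columns
```

Real microarray images add noise, artifacts, and uneven background, which is why MIGS-GPU's gridding and segmentation are considerably more sophisticated than this projection heuristic.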
A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software
NASA Astrophysics Data System (ADS)
Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.
2017-10-01
Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
Autonomous control systems: applications to remote sensing and image processing
NASA Astrophysics Data System (ADS)
Jamshidi, Mohammad
2001-11-01
One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, integrated manufacturing plants, the Hubble Telescope, and the International Space Station. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms, and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of a soft computing control architecture can become nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and the enhancement of analog and digital images.
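Of the soft-computing methodologies listed, fuzzy logic is the easiest to show in miniature. The following hedged sketch is a toy Mamdani-style rule base with triangular memberships and centroid defuzzification; the rules, ranges, and the `fuzzy_speed` mapping are invented for illustration and are not the paper's controllers.

```python
# Minimal fuzzy inference: map a tracking error in [-1, 1] to a correction.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(error):
    """Rules: negative error -> positive correction, and vice versa."""
    neg = tri(error, -2.0, -1.0, 0.0)      # degree to which error is "negative"
    zero = tri(error, -1.0, 0.0, 1.0)      # ... "about zero"
    pos = tri(error, 0.0, 1.0, 2.0)        # ... "positive"
    # Centroid of singleton rule outputs (+1, 0, -1) weighted by firing strength.
    num = neg * 1.0 + zero * 0.0 + pos * -1.0
    den = neg + zero + pos
    return num / den if den else 0.0
```

The appeal for complex systems is that such rules encode operator knowledge directly, without requiring an explicit plant model.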
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
NASA Technical Reports Server (NTRS)
Rubbert, P. E.
1978-01-01
The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From his viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, which is then doubled when a design capability is added. The user must be able to see, understand, and interpret the results calculated. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds-number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high-altitude, long-endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight, and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft, and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified genetic algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally efficient tools in the present design-optimization framework.
This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
González-Suárez, Ana; Berjano, Enrique; Guerra, Jose M.; Gerardo-Giorda, Luca
2016-01-01
Radiofrequency catheter ablation (RFCA) is a routine treatment for cardiac arrhythmias. During RFCA, the electrode-tissue interface temperature should be kept below 80°C to avoid thrombus formation. Open-irrigated electrodes facilitate power delivery while keeping low temperatures around the catheter. No computational model of an open-irrigated electrode in endocardial RFCA accounting for both the saline irrigation flow and the blood motion in the cardiac chamber had previously been proposed. We present the first computational model including both effects at once. The model has been validated against existing experimental results. Computational results showed that the surface lesion width and blood temperature are affected by both the electrode design and the irrigation flow rate. Smaller surface lesion widths and lower blood temperatures are obtained with a higher irrigation flow rate, while the lesion depth is not affected by changing the irrigation flow rate. Larger lesions are obtained with increasing power and electrode-tissue contact, and also when the electrode is placed horizontally. Overall, the computational findings are in close agreement with previous experimental results, providing an excellent tool for future catheter research. PMID:26938638
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the decision-making process surrounding life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced-order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
Multiscale modeling of mucosal immune responses
2015-01-01
Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue-level cell-cell interactions, was developed to illustrate the capabilities, power, and scope of ENISI MSM. Background: Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are in greater need. Biological systems are inherently multiscale, from molecules to tissues and from nano-seconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. Implementation: An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells, as well as proteins; furthermore, performance matching between the scales is addressed.
Conclusion: We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787
Multiscale modeling of mucosal immune responses.
Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep
2015-01-01
Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are in greater need. Biological systems are inherently multiscale, from molecules to tissues and from nano-seconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells, as well as proteins; furthermore, performance matching between the scales is addressed. We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI.
A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions was developed to illustrate the capabilities, power and scope of ENISI MSM.
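Of the modeling layers ENISI integrates, the ODE piece is the simplest to illustrate. The following hedged sketch is a toy two-population model of naive CD4+ T cells differentiating into effector cells, integrated by forward Euler; the rate constants and initial populations are invented for the example and are not from the ENISI model.

```python
# Toy ODE layer: dN/dt = -k_diff*N,  dE/dt = k_diff*N - k_death*E,
# integrated by forward Euler (illustrative rates, not ENISI's).

def differentiate(naive=1000.0, effector=0.0, k_diff=0.1, k_death=0.05,
                  dt=0.01, steps=10000):
    """Return (naive, effector) populations after steps*dt time units."""
    for _ in range(steps):
        d_naive = -k_diff * naive
        d_eff = k_diff * naive - k_death * effector
        naive += dt * d_naive
        effector += dt * d_eff
    return naive, effector

naive, effector = differentiate()
```

In a multiscale simulator like ENISI, output of an ODE layer such as this (e.g. effector cell counts) would feed the agent-based tissue layer, where cell-cell interactions and lesion formation are resolved.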
NASA Technical Reports Server (NTRS)
2005-01-01
Topics include: Thin-Film Resistance Heat-Flux Sensors; Circuit Indicates that Voice-Recording Disks are Nearly Full; Optical Sensing of Combustion Instabilities in Gas Turbines; Crane-Load Contact Sensor; Hexagonal and Pentagonal Fractal Multiband Antennas; Multifunctional Logic Gate Controlled by Temperature; Multifunctional Logic Gate Controlled by Supply Voltage; Power Divider for Waveforms Rich in Harmonics; SCB Quantum Computers Using iSWAP and 1-Qubit Rotations; CSAM Metrology Software Tool; Update on Rover Sequencing and Visualization Program; Selecting Data from a Star Catalog; Rotating Desk for Collaboration by Two Computer Programmers; Variable-Pressure Washer; Magnetically Attached Multifunction Maintenance Rover; Improvements in Fabrication of Sand/Binder Cores for Casting; Solid Freeform Fabrication of Composite-Material Objects; Efficient Computational Model of Hysteresis; Gauges for Highly Precise Metrology of a Compound Mirror; Improved Electrolytic Hydrogen Peroxide Generator; High-Power Fiber Lasers Using Photonic Band Gap Materials; Ontology-Driven Information Integration; Quantifying Traversability of Terrain for a Mobile Robot; More About Arc-Welding Process for Making Carbon Nanotubes; Controlling Laser Spot Size in Outer Space; and Software-Reconfigurable Processors for Spacecraft.
For Kids, by Kids: Our City Podcast
ERIC Educational Resources Information Center
Vincent, Tony; van't Hooft, Mark
2007-01-01
In this article, the authors discuss podcasting and provide ways on how to create podcasts. A podcast is an audio or video file that is posted on the web that can easily be cataloged and automatically downloaded to a computer or mobile device capable of playing back audio or video files. Podcasting is a powerful tool for educators to get students…
Students' Informal Inference about the Binomial Distribution of "Bunny Hops": A Dialogic Perspective
ERIC Educational Resources Information Center
Kazak, Sibel; Fujita, Taro; Wegerif, Rupert
2016-01-01
The study explores the development of 11-year-old students' informal inference about random bunny hops through student talk and use of computer simulation tools. Our aim in this paper is to draw on dialogic theory to explain how students make shifts in perspective, from intuition-based reasoning to more powerful, formal ways of using probabilistic…
Not Scotch, but Rum: The Scope and Diffusion of the Scottish Presence in the Published Record
ERIC Educational Resources Information Center
Lavoie, Brian
2013-01-01
Big data sets and powerful computing capacity have transformed scholarly inquiry across many disciplines. While the impact of data-intensive research methodologies is perhaps most distinct in the natural and social sciences, the humanities have also benefited from these new analytical tools. While full-text data is necessary to study topics such…
Web 2.0 Technologies for Effective Knowledge Management in Organizations: A Qualitative Analysis
ERIC Educational Resources Information Center
Nath, Anupam Kumar
2012-01-01
A new generation of Internet-based collaborative tools, commonly known as Web 2.0, has increased in popularity, availability, and power in the last few years (Kane and Fichman, 2009). Web 2.0 is a set of Internet-based applications that harness network effects by facilitating collaborative and participative computing (O'Reilly, 2006).…
Visualizing Economic Development with ArcGIS Explorer
ERIC Educational Resources Information Center
Webster, Megan L.; Milson, Andrew J.
2011-01-01
Numerous educators have noted that Geographic Information Systems (GIS) is a powerful tool for social studies teaching and learning. Yet the use of GIS has been hampered by issues such as the cost of the software and the management of large spatial data files. One trend that shows great promise for GIS in education is the move to cloud computing.…
Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...
New Visions of Reality: Multimedia and Education.
ERIC Educational Resources Information Center
Ambron, Sueann
1986-01-01
Multimedia is a powerful tool that will change both the way we look at knowledge and our vision of reality, as well as our educational system and the business world. Multimedia as used here refers to the innovation of mixing text, audio, and video through the use of a computer. Not only will there be new products emerging from multimedia uses, but…
ERIC Educational Resources Information Center
Abramovich, S.
2014-01-01
The availability of sophisticated computer programs such as "Wolfram Alpha" has made many problems found in the secondary mathematics curriculum somewhat obsolete for they can be easily solved by the software. Against this background, an interplay between the power of a modern tool of technology and educational constraints it presents is…
ERIC Educational Resources Information Center
Pantzare, Anna Lind
2012-01-01
Calculators with computer algebra systems (CAS) are powerful tools when working with equations and algebraic expressions in mathematics. When calculators are allowed to be used during assessments but are not available or provided to every student, they may cause bias. The CAS calculators may also have an impact on the trustworthiness of results.…
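The exact, non-decimal answers that distinguish CAS calculators from plain numeric ones can be imitated with Python's standard fractions module. A minimal sketch (not tied to any particular calculator model), restricted to quadratics with rational roots:

```python
import math
from fractions import Fraction

def solve_quadratic(a, b, c):
    """Solve a*x^2 + b*x + c = 0 exactly (no decimal rounding), mimicking the
    exact answers a CAS gives; handles only rational-root cases here."""
    a, b, c = Fraction(a), Fraction(b), Fraction(c)
    disc = b * b - 4 * a * c
    if disc < 0:
        return []                       # no real roots
    root = Fraction(math.isqrt(disc.numerator), math.isqrt(disc.denominator))
    if root * root != disc:
        raise ValueError("irrational roots; a real CAS would return surds")
    return sorted({(-b + s * root) / (2 * a) for s in (1, -1)})

roots = solve_quadratic(1, -5, 6)       # x^2 - 5x + 6 = (x - 2)(x - 3)
```

The point of the sketch is the exact arithmetic: a student with this tool gets 2 and 3, not 1.9999999, which is exactly the kind of answer-quality gap that can bias assessments when only some students have CAS access.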
Francisco Rodríguez y Silva; Juan Ramón Molina Martínez; Miguel Ángel Herrera Machuca; Jesús Mª Rodríguez Leal
2013-01-01
Progress made in recent years in fire science, particularly as applied to forest fire protection, coupled with the increased power offered by mathematical processors integrated into computers, has led to important developments in the field of dynamic and static simulation of forest fires. Furthermore, and similarly, econometric models applied to economic...
J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech
2000-01-01
Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...
Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González
2016-01-01
Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840
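The non-intrusive propagation methods reviewed here treat the simulation as a black box: sample uncertain inputs, run the model, and summarize the outputs. A minimal Monte Carlo sketch in standard Python (the stand-in model, parameter names, and distributions are invented for illustration):

```python
import random
import statistics

def propagate_uncertainty(model, param_dists, n_samples=5000, seed=0):
    """Non-intrusive Monte Carlo propagation: draw uncertain inputs from
    Gaussian distributions, run the black-box model, summarize the output."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        params = {name: rng.gauss(mu, sigma) for name, (mu, sigma) in param_dists.items()}
        outputs.append(model(**params))
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical stand-in for a finite element quantity of interest:
# displacement under load, u = F / k, with uncertain force and stiffness.
mean_u, std_u = propagate_uncertainty(
    lambda F, k: F / k,
    {"F": (10.0, 0.5), "k": (200.0, 10.0)},
)
```

The same loop works unchanged around a real finite element solver, which is precisely why non-intrusive methods are popular: the simulation code needs no modification.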
Parameter Estimation for a Turbulent Buoyant Jet Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Christopher, Jason D.; Wimer, Nicholas T.; Hayden, Torrey R. S.; Lapointe, Caelan; Grooms, Ian; Rieker, Gregory B.; Hamlington, Peter E.
2016-11-01
Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown model parameters in numerical simulations of real-world engineering systems. In this presentation, we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a simulation with known boundary conditions and problem parameters. Using spatially-sparse temperature statistics from the 2D buoyant jet truth simulation, we show that the ABC method provides accurate predictions of the true jet inflow temperature. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for engineering fluid dynamics research.
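The core of ABC rejection sampling can be sketched in standard Python. This toy version (invented here, not the authors' buoyant-jet code) infers an unknown "inflow temperature" from a single sparse, noisy summary statistic:

```python
import random
import statistics

def abc_rejection(truth_stat, simulate, prior_sample, tol, n_draws=20_000, seed=0):
    """ABC by rejection: keep prior draws whose simulated summary statistic
    lands within `tol` of the observed (truth) statistic."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - truth_stat) < tol:
            accepted.append(theta)
    return accepted

# Toy stand-in for the jet-inflow problem: unknown inflow temperature theta,
# summary statistic = mean of 20 noisy point "measurements".
def simulate(theta, rng):
    return statistics.mean(rng.gauss(theta, 2.0) for _ in range(20))

truth = simulate(300.0, random.Random(42))      # sparse "truth" data
posterior = abc_rejection(truth, simulate,
                          prior_sample=lambda rng: rng.uniform(280, 320), tol=1.0)
est = statistics.mean(posterior)
```

The accepted draws approximate the posterior over the unknown parameter; no likelihood function is ever written down, which is what makes ABC attractive when the forward model is an expensive simulation.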
DNA-binding specificity prediction with FoldX.
Nadra, Alejandro D; Serrano, Luis; Alibés, Andreu
2011-01-01
With the advent of Synthetic Biology, a field between basic science and applied engineering, new computational tools are needed to help scientists reach their design goals while optimizing resources. In this chapter, we present a simple and powerful method to either determine the DNA specificity of a wild-type protein or design new specificities by using the protein design algorithm FoldX. The only basic requirement is having a good-resolution structure of the complex. Protein-DNA interaction design may aid the development of new parts designed to be orthogonal, decoupled, and precise in their targets. Further, it could help to fine-tune systems in terms of specificity, discrimination, and binding constants. In the age of newly developed devices and invented systems, computer-aided engineering promises to be an invaluable tool. Copyright © 2011 Elsevier Inc. All rights reserved.
Shared Memory Parallelism for 3D Cartesian Discrete Ordinates Solver
NASA Astrophysics Data System (ADS)
Moustafa, Salli; Dutka-Malen, Ivan; Plagne, Laurent; Ponçot, Angélique; Ramet, Pierre
2014-06-01
This paper describes the design and the performance of DOMINO, a 3D Cartesian SN solver that implements two nested levels of parallelism (multicore+SIMD) on shared memory computation nodes. DOMINO is written in C++, a multi-paradigm programming language that enables the use of powerful and generic parallel programming tools such as Intel TBB and Eigen. These two libraries allow us to combine multi-thread parallelism with vector operations in an efficient and yet portable way. As a result, DOMINO can exploit the full power of modern multi-core processors and is able to tackle very large simulations that usually require large HPC clusters, using a single computing node. For example, DOMINO solves a 3D full core PWR eigenvalue problem involving 26 energy groups, 288 angular directions (S16), 46 × 10⁶ spatial cells and 1 × 10¹² DoFs within 11 hours on a single 32-core SMP node. This represents a sustained performance of 235 GFlops and 40.74% of the SMP node peak performance for the DOMINO sweep implementation. The very high Flops/Watt ratio of DOMINO makes it a very interesting building block for a future many-nodes nuclear simulation tool.
Canbay, Ferhat; Levent, Vecdi Emre; Serbes, Gorkem; Ugurdag, H. Fatih; Goren, Sezer
2016-01-01
The authors aimed to develop an application for producing different architectures to implement dual tree complex wavelet transform (DTCWT) having near shift-invariance property. To obtain a low-cost and portable solution for implementing the DTCWT in multi-channel real-time applications, various embedded-system approaches are realised. For comparison, the DTCWT was implemented in C language on a personal computer and on a PIC microcontroller. However, in the former approach portability and in the latter desired speed performance properties cannot be achieved. Hence, implementation of the DTCWT on a reconfigurable platform such as field programmable gate array, which provides portable, low-cost, low-power, and high-performance computing, is considered as the most feasible solution. At first, they used the system generator DSP design tool of Xilinx for algorithm design. However, the design implemented by using such tools is not optimised in terms of area and power. To overcome all these drawbacks mentioned above, they implemented the DTCWT algorithm by using Verilog Hardware Description Language, which has its own difficulties. To overcome these difficulties, simplify the usage of proposed algorithms and the adaptation procedures, a code generator program that can produce different architectures is proposed. PMID:27733925
NASA Astrophysics Data System (ADS)
Hassan, A. H.; Fluke, C. J.; Barnes, D. G.
2012-09-01
Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with a goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
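The chunked, map-reduce style processing such a framework relies on can be illustrated for the histogram task the abstract mentions. A minimal standard-Python sketch (not the framework's actual GPU code) that never holds the full dataset in memory at once:

```python
import random

def chunked_histogram(chunks, bin_edges):
    """Build a histogram incrementally from data chunks, so the full dataset
    never has to fit in one machine's memory (a simple map-reduce pattern)."""
    counts = [0] * (len(bin_edges) - 1)
    lo, hi = bin_edges[0], bin_edges[-1]
    for chunk in chunks:                 # each chunk could come from disk/network
        for x in chunk:
            if lo <= x < hi:
                for i in range(len(counts)):   # linear scan: fine for few bins
                    if bin_edges[i] <= x < bin_edges[i + 1]:
                        counts[i] += 1
                        break
    return counts

rng = random.Random(0)
stream = ([rng.uniform(0, 10) for _ in range(1000)] for _ in range(5))  # 5 chunks
hist = chunked_histogram(stream, bin_edges=[0, 2.5, 5, 7.5, 10])
```

Because per-chunk counts simply add, the same reduction parallelizes naturally across GPUs or cluster nodes: each worker histograms its chunks and the partial counts are summed.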
Human eye haptics-based multimedia.
Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron
2014-01-01
Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue, or part being explored. Haptics increases the sense of interaction with virtual objects, improving user experience in a more realistic manner. Common tools for studying the eye are books, illustrations, and assembly models; more recently, these are being complemented with mobile apps whose 3D capabilities, computing power, and user base are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye from up close and within, with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions on improving haptic feedback and navigation.
Finite element analysis of hysteresis effects in piezoelectric transducers
NASA Astrophysics Data System (ADS)
Simkovics, Reinhard; Landes, Hermann; Kaltenbacher, Manfred; Hoffelner, Johann; Lerch, Reinhard
2000-06-01
The design of ultrasonic transducers for high power applications, e.g. in medical therapy or production engineering, calls for effective computer-aided design tools to analyze the occurring nonlinear effects. In this paper the finite-element-boundary-element package CAPA is presented, which allows modeling of different types of electromechanical sensors and actuators. These transducers are based on various physical coupling effects, such as piezoelectricity or magneto-mechanical interactions. Their computer modeling requires the numerical solution of a multifield problem, such as coupled electric-mechanical fields or magnetic-mechanical fields as well as coupled mechanical-acoustic fields. With the reported software environment we are able to compute the dynamic behavior of electromechanical sensors and actuators by taking into account geometric nonlinearities, nonlinear wave propagation and ferroelectric as well as magnetic material nonlinearities. After a short introduction to the basic theory of the numerical calculation schemes, two practical examples demonstrate the applicability of the numerical simulation tool. As a first example, an ultrasonic thickness-mode transducer consisting of a piezoceramic material used for high power ultrasound production is examined. Due to ferroelectric hysteresis, higher order harmonics can be detected in the actuator's input current. Also, in the case of electrical and mechanical prestressing, a resonance frequency shift occurs, caused by ferroelectric hysteresis and nonlinear dependencies of the material coefficients on electric field and mechanical stresses. As a second example, a power ultrasound transducer used in HIFU therapy (high intensity focused ultrasound) is presented. Due to the compressibility of and losses in the propagating fluid, nonlinear shock wave generation can be observed. For both examples good agreement between numerical simulation and experimental data has been achieved.
Toward an Improvement of the Analysis of Neural Coding.
Alegre-Cortés, Javier; Soto-Sánchez, Cristina; Albarracín, Ana L; Farfán, Fernando D; Val-Calvo, Mikel; Ferrandez, José M; Fernandez, Eduardo
2017-01-01
Machine learning and artificial intelligence have strong roots in principles of neural computation. Some examples are the structure of the first perceptron, inspired by the retina, neuroprosthetics based on ganglion cell recordings, or Hopfield networks. In addition, machine learning provides a powerful set of tools to analyze neural data, which has already proved its efficacy in fields of research as distant as speech recognition, behavioral state classification, or LFP recordings. However, despite the huge technological advances of recent years in dimensionality reduction, pattern selection, and clustering of neural data, there has not been a proportional development of the analytical tools used for Time-Frequency (T-F) analysis in neuroscience. Bearing this in mind, we make the case for using non-linear, non-stationary tools, EMD algorithms in particular, for the transformation of oscillatory neural data (EEG, EMG, spike oscillations…) into the T-F domain prior to its analysis with machine learning tools. We argue that, to achieve meaningful conclusions, the transformed data we analyze must be as faithful as possible to the original recording, so that distortions forced on the data by restrictions of the T-F computation do not carry over into the results of the machine learning analysis. Moreover, bioinspired computation such as brain-machine interfaces may be enriched by a more precise definition of neuronal coding in which the non-linearities of the neuronal dynamics are considered.
A framework for interactive visualization of digital medical images.
Koehring, Andrew; Foo, Jung Leng; Miyano, Go; Lobe, Thom; Winer, Eliot
2008-10-01
The visualization of medical images obtained from scanning techniques such as computed tomography and magnetic resonance imaging is a well-researched field. However, advanced tools and methods to manipulate these data for surgical planning and other tasks have not seen widespread use among medical professionals. Radiologists have begun using more advanced visualization packages on desktop computer systems, but most physicians continue to work with basic two-dimensional grayscale images or do not work directly with the data at all. In addition, new display technologies that are in use in other fields have yet to be fully applied in medicine. It is our estimation that usability is the key factor keeping this new technology from being more widely used by the medical community at large. Therefore, we have developed a software and hardware framework that not only makes use of advanced visualization techniques but also features powerful, yet simple-to-use, interfaces. A virtual reality system was created to display volume-rendered medical models in three dimensions. It was designed to run in many configurations, from a large cluster of machines powering a multiwalled display down to a single desktop computer. An augmented reality system was also created for, literally, hands-on interaction when viewing models of medical data. Last, a desktop application was designed to provide a simple visualization tool which can be run on nearly any computer at a user's disposal. This research is directed toward improving the capabilities of medical professionals in the tasks of preoperative planning, surgical training, diagnostic assistance, and patient education.
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
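The surrogate idea is independent of RAVEN or RELAP-7 and can be sketched in standard Python: a handful of "expensive" runs are tabulated once, and later queries are answered by interpolation in microseconds. The quadratic stand-in model and design points below are invented for illustration:

```python
from bisect import bisect_left

def build_surrogate(expensive_model, xs):
    """Tabulate the expensive code at a few sorted design points, then answer
    later queries by linear interpolation instead of re-running the code."""
    ys = [expensive_model(x) for x in xs]          # the only costly runs
    def surrogate(x):
        i = min(max(bisect_left(xs, x), 1), len(xs) - 1)
        x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return surrogate

def expensive(p):
    # Hypothetical stand-in for a simulation code: peak temperature vs. power.
    return 550 + 0.8 * p + 0.002 * p * p

surrogate = build_surrogate(expensive, xs=[0, 25, 50, 75, 100])
approx = surrogate(60)      # instant
exact = expensive(60)       # would be hours/days for a real code
```

Real surrogate techniques (polynomial chaos, Gaussian processes, reduced basis methods) are far more sophisticated, but the trade they make is the same: a few full-code runs buy a cheap approximation usable in thousands of PRA samples.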
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2012-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB).
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.
NASA Technical Reports Server (NTRS)
Ferraro, R.; Some, R.
2002-01-01
The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE) induced faults so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single board computer configuration in several space environments.
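The first-order arithmetic behind such predictions, per-device upset rate as ground-test cross-section times environment particle flux, can be sketched in standard Python. The device names, cross-sections, and flux value below are hypothetical illustrations, not REE data:

```python
def upset_rate_per_day(devices, flux):
    """First-order SEU rate estimate: per-device rate = measured ground-test
    cross-section (cm^2/device) x environment flux (particles/cm^2/day)."""
    return {name: sigma * flux for name, sigma in devices.items()}

# Hypothetical cross-sections from ground testing, and an assumed orbital flux.
devices = {"cpu": 2e-7, "dram": 5e-6, "fpga": 1e-6}   # cm^2 per device
rates = upset_rate_per_day(devices, flux=1.0e5)        # particles/cm^2/day
total = sum(rates.values())                            # board-level upsets/day
```

Real methodologies fold in energy-dependent cross-section curves, shielding, and environment spectra, but the board-level rate is still built up from per-part rates in essentially this way.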
NASA Astrophysics Data System (ADS)
Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.
2017-03-01
Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the prediction of the RFA difficult. A monitoring tool which can be personalized for a given patient during the intervention would be helpful to achieve a complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations are performed on two cadaveric bovine livers, and we achieve an error of 2.2 °C on average between the computed and the thermistor temperatures, and 1.4 °C and 2.7 °C on average between the temperatures computed and monitored by US during the ablation at two different time points (t = 240 s and t = 900 s).
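The forward model at the heart of such simulations is a heat equation with a source at the probe. A deliberately simplified 1D explicit finite-difference sketch in standard Python (the grid, material constants, and source strength are invented, and perfusion and vessel cooling are omitted):

```python
def simulate_heating(n=21, dx=1e-3, dt=0.05, steps=4800, alpha=1.4e-7, q=0.5):
    """Explicit finite-difference sketch of 1D tissue heating: diffusion plus
    a constant heat source at the probe node, from a 37 C baseline.
    alpha: thermal diffusivity (m^2/s); q: source strength (C/s) at the tip."""
    T = [37.0] * n
    r = alpha * dt / dx ** 2        # explicit-scheme stability needs r <= 0.5
    assert r <= 0.5
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):   # boundaries held at body temperature
            Tn[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
        Tn[n // 2] += q * dt        # heat deposited at the ablation tip
        T = Tn
    return T

T = simulate_heating()              # temperature profile after 240 s
```

A patient-specific monitoring model layers blood perfusion (the Pennes bioheat term), temperature-dependent conductivity, and a necrosis criterion on top of this diffusion core, then assimilates the intra-operative thermometer or US temperatures to correct it.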
Single-Cell Genomics: Approaches and Utility in Immunology.
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-02-01
Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.
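A typical first computational step on single-cell RNA-seq counts is library-size normalization, which removes per-cell sequencing-depth differences before any downstream analysis. A minimal standard-Python sketch with a toy count matrix (real pipelines also log-transform, select variable genes, and cluster):

```python
def normalize_counts(counts, scale=10_000):
    """Library-size normalization: rescale each cell's gene counts so every
    cell sums to the same total, making cells comparable despite depth."""
    normed = []
    for cell in counts:
        total = sum(cell)
        normed.append([c * scale / total for c in cell])
    return normed

cells = [[10, 0, 90], [1, 1, 2]]    # toy count matrix: rows = cells, cols = genes
normed = normalize_counts(cells)
```

After this step, differences between cells reflect expression composition rather than how deeply each cell happened to be sequenced, which is a prerequisite for the clustering and trajectory methods used to resolve rare and intermediate cell states.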
New insights into faster computation of uncertainties
NASA Astrophysics Data System (ADS)
Bhattacharya, Atreyee
2012-11-01
Heavy computation power, lengthy simulations, and an exhaustive number of model runs—often these seem like the only statistical tools that scientists have at their disposal when computing uncertainties associated with predictions, particularly in cases of environmental processes such as groundwater movement. However, calculation of uncertainties need not be as lengthy, a new study shows. Comparing two approaches—the classical Bayesian “credible interval” and a less commonly used regression-based “confidence interval” method—Lu et al. show that for many practical purposes both methods provide similar estimates of uncertainties. The advantage of the regression method is that it demands 10-1000 model runs, whereas the classical Bayesian approach requires 10,000 to millions of model runs.
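The contrast in cost can be made concrete with a toy parameter-estimation problem: a closed-form regression confidence interval obtained from a single fit, versus a resampling interval built from thousands of refits. This standard-Python sketch is not Lu et al.'s groundwater model; it uses a bootstrap as the many-model-runs stand-in:

```python
import random
import statistics

def slope_ci_regression(xs, ys):
    """Regression-based interval: one fit plus a closed-form standard error,
    i.e. very few 'model runs'."""
    n = len(xs)
    xbar, ybar = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(xs, ys)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope - 2 * se, slope + 2 * se

def slope_ci_bootstrap(xs, ys, n_boot=2000, seed=0):
    """Sampling-based interval: refit under thousands of resamples, the
    many-model-runs analogue."""
    rng = random.Random(seed)
    n, slopes = len(xs), []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        xbar, ybar = statistics.fmean(bx), statistics.fmean(by)
        sxx = sum((x - xbar) ** 2 for x in bx) or 1e-12
        slopes.append(sum((x - xbar) * (y - ybar) for x, y in zip(bx, by)) / sxx)
    slopes.sort()
    return slopes[int(0.025 * n_boot)], slopes[int(0.975 * n_boot)]

rng = random.Random(1)
xs = [i / 4 for i in range(40)]
ys = [2.0 * x + rng.gauss(0, 0.5) for x in xs]   # true slope = 2.0
lo1, hi1 = slope_ci_regression(xs, ys)           # 1 fit
lo2, hi2 = slope_ci_bootstrap(xs, ys)            # 2000 fits
```

Both intervals come out very similar here, mirroring the study's finding, while the regression route needs orders of magnitude fewer model evaluations.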
I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison
NASA Technical Reports Server (NTRS)
Somawardhana, Ruwan
2011-01-01
CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob; Yan, Jerry C. (Technical Monitor)
2000-01-01
The creation of parameter study suites has recently become a more challenging problem as the parameter studies have now become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are now seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers great resource opportunity but at the expense of great difficulty of use. We present an approach to this problem which stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.
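The bookkeeping at the core of parameter study creation, expanding named parameter axes into the full cross-product of cases, can be sketched in standard Python (the axis names and values below are hypothetical, not from the tool described):

```python
from itertools import product

def parameter_study(**axes):
    """Expand named parameter axes into the full cross-product of cases,
    each case a dict ready to hand to a job-submission layer."""
    names = list(axes)
    return [dict(zip(names, values)) for values in product(*axes.values())]

cases = parameter_study(
    mach=[0.5, 0.7, 0.9],
    angle_of_attack=[0, 2, 4, 6],
    grid=["coarse", "fine"],
)
```

Multi-tiered studies chain this expansion: the outputs of one stage's cases become an axis of the next stage, which is why case counts, and the appeal of automated grid submission, grow so quickly.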
Design and development of a solar powered mobile laboratory
NASA Astrophysics Data System (ADS)
Jiao, L.; Simon, A.; Barrera, H.; Acharya, V.; Repke, W.
2016-08-01
This paper describes the design and development of a solar powered mobile laboratory (SPML) system. The SPML provides a mobile platform that schools, universities, and communities can use to give students and staff access to laboratory environments where dedicated laboratories are not available. The lab includes equipment like 3D printers, computers, and soldering stations. The primary power source of the system is solar PV which allows the laboratory to be operated in places where the grid power is not readily available or not sufficient to power all the equipment. The main system components include PV panels, junction box, battery, charge controller, and inverter. Not only is it used to teach students and staff how to use the lab equipment, but it is also a great tool to educate the public about solar PV technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias
2016-08-11
This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.
NASA Technical Reports Server (NTRS)
Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick;
2001-01-01
A series of NASA presentations for the Supercomputing 2001 conference are summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) Debakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.
Energy loss analysis of an integrated space power distribution system
NASA Technical Reports Server (NTRS)
Kankam, M. D.; Ribeiro, P. F.
1992-01-01
The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting, as included here, for the effects of the various parameters on system performance can constitute part of a planning tool for a space power distribution system.
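The dominant trend such studies examine, resistive loss falling as distribution voltage rises, follows directly from the I²R law. A minimal sketch (hypothetical feeder values; this is not the DSAS load-flow model, which handles full networks and load compositions):

```python
def line_loss_kw(load_kw: float, voltage_v: float,
                 resistance_ohm: float, power_factor: float = 1.0) -> float:
    """I^2 * R transmission loss for a single feeder, in kW."""
    current_a = load_kw * 1e3 / (voltage_v * power_factor)
    return current_a**2 * resistance_ohm / 1e3

# The same 10 kW load served at two candidate distribution voltages
# over a feeder with 0.1 ohm resistance:
loss_120v = line_loss_kw(10.0, 120.0, 0.1)
loss_480v = line_loss_kw(10.0, 480.0, 0.1)
```

Quadrupling the voltage quarters the current and therefore cuts the resistive loss by a factor of sixteen, which is why distribution voltage level dominates the energy-loss comparison.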
Quantum computing on encrypted data
NASA Astrophysics Data System (ADS)
Fisher, K. A. G.; Broadbent, A.; Shalm, L. K.; Yan, Z.; Lavoie, J.; Prevedel, R.; Jennewein, T.; Resch, K. J.
2014-01-01
The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.
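The algebra behind such schemes is the single-qubit "quantum one-time pad": the client hides a qubit with random Pauli operators, and gate-dependent key updates let the server apply gates without ever learning the state. A numpy sketch of this idea (an idealized illustration of the principle, not the paper's photonic implementation):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def pauli_power(p, k):
    return np.linalg.matrix_power(p, k)

def encrypt(psi, a, b):
    """Client hides |psi> by applying X^a Z^b for secret key bits (a, b)."""
    return pauli_power(X, a) @ pauli_power(Z, b) @ psi

def decrypt(psi_enc, a, b):
    """Inverse of encrypt: (X^a Z^b)^-1 = Z^b X^a."""
    return pauli_power(Z, b) @ pauli_power(X, a) @ psi_enc

psi = np.array([0.6, 0.8], dtype=complex)  # an arbitrary qubit state
a, b = 1, 1                                # secret key bits

# The server applies H to the ciphertext without learning psi; since
# H X^a Z^b = Z^a X^b H (up to phase), the client decrypts with the
# updated key (a, b) -> (b, a).
served = H @ encrypt(psi, a, b)
recovered = decrypt(served, b, a)          # equals H|psi> up to global phase
```

To an eavesdropper without (a, b), the ciphertext averaged over keys is the maximally mixed state, which is what makes the pad information-theoretically hiding.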
Computational Science at the Argonne Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Romero, Nichols
2014-03-01
The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
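The split-submit-merge pattern Condor-COPASI automates can be sketched with Python's standard library standing in for the Condor pool (the `simulate` function here is a hypothetical toy model run, not COPASI, and threads stand in for pool nodes):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(rate):
    """Stand-in for one model run: relax a concentration toward `rate`."""
    conc = 1.0
    for _ in range(100):
        conc += 0.01 * (rate - conc)
    return conc

def run_scan(params, chunk_size=4, workers=4):
    """Split a parameter scan into chunks, run chunks in parallel, merge."""
    chunks = [params[i:i + chunk_size]
              for i in range(0, len(params), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda chunk: [simulate(p) for p in chunk], chunks)
    # pool.map preserves chunk order, so results come back in scan order.
    return [result for part in parts for result in part]

results = run_scan([0.1 * k for k in range(20)])
```

The key property, which the real tool provides transparently, is that the user submits one scan and receives one ordered result set, with the chunking and scheduling hidden.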
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
Parallel Calculations in LS-DYNA
NASA Astrophysics Data System (ADS)
Vartanovich Mkrtychev, Oleg; Aleksandrovich Reshetov, Andrey
2017-11-01
Nowadays, structural mechanics exhibits a trend towards numeric solutions being found for increasingly extensive and detailed tasks, which requires that the capacities of computing systems be enhanced. Such enhancement can be achieved by different means. For example, if the computing system is a single workstation, its components (CPU, memory, etc.) can be replaced or extended. In essence, such modification eventually entails replacement of the entire workstation: replacement of certain components necessitates exchange of others (faster CPUs and memory devices require buses with higher throughput, etc.). Special consideration must be given to the capabilities of modern video cards. They constitute powerful computing systems capable of running data processing in parallel, and, interestingly, the tools originally designed to render high-performance graphics can be applied to problems not immediately related to graphics (CUDA, OpenCL, shaders, etc.). However, not all software suites utilize video cards' capacities. Another way to increase the capacity of a computing system is to implement a cluster architecture: to add cluster nodes (workstations) and to increase the network communication speed between the nodes. The advantage of this approach is extensive growth, by which a quite powerful system can be obtained from nodes that are individually not particularly powerful; moreover, separate nodes may possess different capacities. This paper considers the use of a clustered computing system for solving problems of structural mechanics with LS-DYNA software. To establish a range of dependencies, a mere 2-node cluster has proven sufficient.
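Amdahl's law makes precise why combining modest nodes pays off only while the serial fraction of a run stays small; a minimal sketch (illustrative fractions, not LS-DYNA measurements):

```python
def amdahl_speedup(parallel_fraction: float, n_nodes: int) -> float:
    """Upper bound on speedup when only part of the run parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_nodes)

# Suppose 95% of a solver run parallelizes across cluster nodes:
two_nodes = amdahl_speedup(0.95, 2)      # close to the ideal 2x
limit = amdahl_speedup(0.95, 10**6)      # can never exceed 1/0.05 = 20x
```

This is one reason a 2-node cluster already reveals the relevant scaling dependencies: the marginal benefit of each added node is largest at small node counts.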
Medical imaging and registration in computer assisted surgery.
Simon, D A; Lavallée, S
1998-09-01
Imaging, sensing, and computing technologies that are being introduced to aid in the planning and execution of surgical procedures are providing orthopaedic surgeons with a powerful new set of tools for improving clinical accuracy, reliability, and patient outcomes while reducing costs and operating times. Current computer assisted surgery systems typically include a measurement process for collecting patient specific medical data, a decision making process for generating a surgical plan, a registration process for aligning the surgical plan to the patient, and an action process for accurately achieving the goals specified in the plan. Some of the key concepts in computer assisted surgery applied to orthopaedics, with a focus on the basic framework and underlying technologies, are outlined. In addition, technical challenges and future trends in the field are discussed.
Computer Aided Drug Design: Success and Limitations.
Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho
2016-01-01
Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.
Mathematical and Computational Challenges in Population Biology and Ecosystems Science
NASA Technical Reports Server (NTRS)
Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.
1997-01-01
Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues-understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales-cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.
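The classic Lotka-Volterra predator-prey system is a standard example of the population dynamics such tools address; a minimal forward-Euler sketch (textbook parameter values chosen for illustration, not taken from the paper):

```python
def lotka_volterra(prey0, pred0, steps=20000, dt=0.001,
                   alpha=1.0, beta=0.1, delta=0.075, gamma=1.5):
    """Forward-Euler integration of the classic predator-prey model:
    d(prey)/dt = (alpha - beta*pred) * prey
    d(pred)/dt = (delta*prey - gamma) * pred
    """
    prey, pred = prey0, pred0
    trajectory = [(prey, pred)]
    for _ in range(steps):
        d_prey = (alpha - beta * pred) * prey
        d_pred = (delta * prey - gamma) * pred
        prey += dt * d_prey
        pred += dt * d_pred
        trajectory.append((prey, pred))
    return trajectory

traj = lotka_volterra(10.0, 5.0)
```

Even this two-species caricature exhibits the coupled oscillations that make scaling from individuals to ensembles hard; real ecosystem models add spatial structure and heterogeneity on top of such dynamics.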
A strip chart recorder pattern recognition tool kit for Shuttle operations
NASA Technical Reports Server (NTRS)
Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.
1993-01-01
During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
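The cross-correlation technique the prototype evaluated can be sketched as matched filtering of a step-edge template against telemetry (simulated signal values, not actual Shuttle data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated current telemetry: a device activation adds a 2 A step
# at sample 300 on top of measurement noise.
signal = rng.normal(0.0, 0.05, 1000)
signal[300:] += 2.0

# Matched step-edge template: -1 before the edge, +1 after it.
half = 20
template = np.concatenate([-np.ones(half), np.ones(half)])

# Cross-correlate and take the peak as the detected activation time.
corr = np.correlate(signal, template, mode="valid")
detected = int(np.argmax(corr)) + half     # edge position in samples
```

The correlation peak localizes the pen-trace step that a controller would otherwise spot by eye; neural-network and statistical detectors from the tool kit would slot into the same detect-and-localize role.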
Advanced computations in plasma physics
NASA Astrophysics Data System (ADS)
Tang, W. M.
2002-05-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papp, G.C.
1991-03-01
In this paper, general equations for the asynchronous squirrel-cage motor which include the influence of space harmonics and mutual slotting are derived by using, among other techniques, the power-invariant symmetrical component transformation and a time-dependent transformation with which, under certain circumstances, the rotor-position angle can be removed from the coefficient matrix. The developed models, implemented in a machine-independent computer program, form powerful tools with which the influence of space harmonics in relation to the geometric data of specific motors can be analyzed for steady-state and transient performance.
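The power-invariant symmetrical component transformation is the unitary, 1/sqrt(3)-scaled variant of Fortescue's transformation; because it is unitary, voltages and currents transform with the same matrix and power is preserved. A numpy sketch (row-ordering conventions vary between texts; zero, positive, negative sequence is assumed here):

```python
import numpy as np

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator

# Power-invariant symmetrical component transformation matrix.
T = np.array([[1, 1,    1],
              [1, a,    a**2],
              [1, a**2, a]]) / np.sqrt(3)

# A balanced three-phase set in which phase b lags phase a by 120
# degrees maps entirely onto the positive-sequence axis.
v_abc = np.array([1.0, a**2, a])
v_012 = T @ v_abc
```

The unitarity check below is exactly the power-invariance property: `T` preserves inner products, so `v* . i` computed in phase quantities equals the same product in sequence quantities.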
Using the FORTH Language to Develop an ICU Data Acquisition System
Goldberg, Arthur; SooHoo, Spencer L.; Koerner, Spencer K.; Chang, Robert S. Y.
1980-01-01
This paper describes a powerful programming tool that should be considered as an alternative to the more conventional programming languages now in use for developing medical computer systems. Forth provides instantaneous response to user commands, rapid program execution and tremendous programming versatility. An operating system and a language in one carefully designed unit, Forth is well suited for developing data acquisition systems and for interfacing computers to other instruments. We present some of the general features of Forth and describe its use in implementing a data collection system for a Respiratory Intensive Care Unit (RICU).
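Forth's defining traits, a data stack and whitespace-separated words, can be sketched in a few lines of Python (a toy interpreter for illustration only, not the RICU system's code; the word set shown is a minimal assumption):

```python
def forth_eval(source, stack=None):
    """Evaluate a tiny Forth-like word stream against a data stack."""
    stack = [] if stack is None else stack
    for token in source.split():
        if token == "+":
            stack.append(stack.pop() + stack.pop())
        elif token == "*":
            stack.append(stack.pop() * stack.pop())
        elif token == "-":
            top = stack.pop()
            stack.append(stack.pop() - top)
        elif token == "dup":
            stack.append(stack[-1])       # duplicate top of stack
        elif token == "swap":
            stack[-1], stack[-2] = stack[-2], stack[-1]
        elif token == "drop":
            stack.pop()
        else:
            stack.append(int(token))      # literals are pushed
    return stack
```

For example, `forth_eval("2 3 + dup *")` squares the sum of 2 and 3, leaving 25 on the stack; real Forth extends exactly this loop with user-defined words, which is what makes it compact enough for instrument interfacing.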
Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting
NASA Astrophysics Data System (ADS)
Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng
Product technology maturity forecasting is vital for any enterprise that wants to seize chances for innovation and remain competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in product technology maturity assessment and the limits of its application are discussed. Applying text mining and patent analysis technologies, this paper proposes a computer-aided approach to product technology maturity forecasting that can overcome the shortcomings of current methods.
NREL Software Aids Offshore Wind Turbine Designs (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-10-01
NREL researchers are supporting offshore wind power development with computer models that allow detailed analyses of both fixed and floating offshore wind turbines. While existing computer-aided engineering (CAE) models can simulate the conditions and stresses that a land-based wind turbine experiences over its lifetime, offshore turbines require the additional considerations of variations in water depth, soil type, and wind and wave severity, which also necessitate the use of a variety of support-structure types. NREL's core wind CAE tool, FAST, models the additional effects of incident waves, sea currents, and the foundation dynamics of the support structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pesaran, A.; Wierzbicki, T.; Sahraei, E.
The EV Everywhere Grand Challenge aims to produce plug-in electric vehicles as affordable and convenient for the American family as gasoline-powered vehicles by 2022. Among the requirements set by the challenge, electric vehicles must be as safe as conventional vehicles, and EV batteries must not lead to unsafe situations under abuse conditions. NREL's project started in October 2013, based on a proposal in response to the January 2013 DOE VTO FOA, with the goal of developing computer aided engineering tools to accelerate the development of safer lithium ion batteries.
Research in Computer Forensics
2002-06-01
systems and how they can aid in the recovery of digital evidence in a forensic analysis. Exposures to hacking techniques and tools in CS3675—Internet...cryptography, access control, authentication, biometrics, actions to be taken during an attack and case studies of hacking and information warfare. 11...chat, surfing, instant messaging and hacking with powerful access control and filter capabilities. The monitor can operate in a Prevention mode to
ERIC Educational Resources Information Center
Murugaiah, Puvaneswary
2016-01-01
In computer-assisted language learning (CALL), technological tools are often used both as an end and as a means to an end (Levy & Stockwell, 2006). Microsoft PowerPoint is an example of the latter as it is commonly used in oral presentations in classrooms. However, many student presentations are often boring as students generally read from…
Mineral resource of the month: cobalt
Shedd, Kim B.
2009-01-01
Cobalt is a metal used in numerous commercial, industrial and military applications. On a global basis, the leading use of cobalt is in rechargeable lithium-ion, nickel-cadmium and nickel-metal hydride battery electrodes. Cobalt use has grown rapidly since the early 1990s, with the development of new battery technologies and an increase in demand for portable electronics such as cell phones, laptop computers and cordless power tools.
Proceedings of the Workshop on software tools for distributed intelligent control systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herget, C.J.
1990-09-01
The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.
MAGMA: Generalized Gene-Set Analysis of GWAS Data
de Leeuw, Christiaan A.; Mooij, Joris M.; Heskes, Tom; Posthuma, Danielle
2015-01-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn’s Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn’s Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn’s Disease data was found to be considerably faster as well. PMID:25885710
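The multiple-regression core of the gene analysis can be sketched with ordinary least squares on simulated data (a simplified illustration: MAGMA additionally projects out linkage disequilibrium among markers and converts the statistic to a p-value via the F distribution; all data and effect sizes below are made up):

```python
import numpy as np

def gene_f_stat(genotypes, phenotype):
    """Joint F-statistic for all SNPs in a gene via multiple regression."""
    n, m = genotypes.shape
    X = np.column_stack([np.ones(n), genotypes])      # intercept + SNPs
    beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
    rss_full = np.sum((phenotype - X @ beta) ** 2)
    rss_null = np.sum((phenotype - phenotype.mean()) ** 2)
    # Compare the full model against the intercept-only null model.
    return ((rss_null - rss_full) / m) / (rss_full / (n - m - 1))

rng = np.random.default_rng(2)
G = rng.binomial(2, 0.3, size=(200, 5)).astype(float)  # 5 SNPs, 200 people
y_assoc = G[:, 0] + rng.normal(0.0, 1.0, 200)   # first SNP drives the trait
y_null = rng.normal(0.0, 1.0, 200)              # no association

f_assoc = gene_f_stat(G, y_assoc)
f_null = gene_f_stat(G, y_null)
```

Testing all of a gene's markers jointly in one regression is what lets multi-marker associations surface even when no single marker reaches significance, and it avoids permutation entirely.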
Quinn, T. Alexander; Kohl, Peter
2013-01-01
Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215
NASA Astrophysics Data System (ADS)
Gruska, Jozef
2012-06-01
One of the most basic tasks in quantum information processing, communication and security (QIPCC) research, theoretically deep and practically important, is to find bounds on how important inherently quantum resources really are for speeding up computations. This area of research is bringing a variety of results that imply, often in a very unexpected and counter-intuitive way, that: (a) surprisingly large classes of quantum circuits and algorithms can be efficiently simulated on classical computers; (b) the border line between quantum processes that can and cannot be efficiently simulated on classical computers is often surprisingly thin; (c) the addition of a seemingly very simple resource or tool often enormously increases the power of available quantum tools. These discoveries have also shed new light on our understanding of quantum phenomena and quantum physics, and on the potential of their often mysterious-looking inherently quantum phenomena. The paper motivates and surveys research and its outcomes in the area of de-quantisation, and in particular presents various approaches, and their outcomes, concerning efficient classical simulations of various families of quantum circuits and algorithms. To motivate this area of research, some outcomes in the area of de-randomization of classical randomized computations are also presented.
Teaching and Learning Physics in a 1:1 Laptop School
NASA Astrophysics Data System (ADS)
Zucker, Andrew A.; Hug, Sarah T.
2008-12-01
1:1 laptop programs, in which every student is provided with a personal computer to use during the school year, permit increased and routine use of powerful, user-friendly computer-based tools. Growing numbers of 1:1 programs are reshaping the roles of teachers and learners in science classrooms. At the Denver School of Science and Technology, a public charter high school where a large percentage of students come from low-income families, 1:1 laptops are used often by teachers and students. This article describes the school's use of laptops, the Internet, and related digital tools, especially for teaching and learning physics. The data are from teacher and student surveys, interviews, classroom observations, and document analyses. Physics students and teachers use an interactive digital textbook; Internet-based simulations (some developed by a Nobel Prize winner); word processors; digital drop boxes; email; formative electronic assessments; computer-based and stand-alone graphing calculators; probes and associated software; and digital video cameras to explore hypotheses, collaborate, engage in scientific inquiry, and to identify strengths and weaknesses of students' understanding of physics. Technology provides students at DSST with high-quality tools to explore scientific concepts and the experiences of teachers and students illustrate effective uses of digital technology for high school physics.
Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin
2018-05-25
Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.
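The kind of executable logical model SCNS reconstructs can be illustrated with a toy synchronous Boolean network. The gene names and update rules below are invented for illustration (loosely inspired by well-known haematopoietic regulators) and are not taken from the paper:

```python
# Hypothetical 3-gene Boolean network: each rule computes a gene's next
# state from the current cell state, as in the models SCNS synthesizes.
rules = {
    "Gata1": lambda s: s["Gata1"] or s["Fli1"],       # self-activation or Fli1 input
    "Fli1":  lambda s: s["Gata1"] and not s["Pu1"],   # activated by Gata1, repressed by Pu.1
    "Pu1":   lambda s: not s["Gata1"],                # repressed by Gata1
}

def step(state):
    """Apply every rule synchronously to obtain the next cell state."""
    return {gene: rule(state) for gene, rule in rules.items()}

# An "early" state is a fixed point of these rules...
early = {"Gata1": False, "Fli1": False, "Pu1": True}
print(step(early) == early)  # → True

# ...while perturbing Gata1 on (an in-silico gene perturbation) drives
# the network toward a different lineage-like state two steps later.
perturbed = step(step({"Gata1": True, "Fli1": False, "Pu1": True}))
print(perturbed)  # → {'Gata1': True, 'Fli1': True, 'Pu1': False}
```

Because the model is executable, predictions like the perturbation outcome above can be read off directly by simulation.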
NASA Astrophysics Data System (ADS)
Friedrich, J.
1999-08-01
As lecturers, our main concern and goal is to develop more attractive and efficient ways of communicating up-to-date scientific knowledge to our students and to facilitate an in-depth understanding of physical phenomena. Computer-based instruction is very promising for helping both teachers and learners in their difficult task, which involves complex cognitive psychological processes. This complexity is reflected in high demands on the design and implementation methods used to create computer-assisted learning (CAL) programs. Due to their concepts, flexibility, maintainability and extended library resources, object-oriented modeling techniques are very suitable for producing this type of pedagogical tool. Computational fluid dynamics (CFD) not only enjoys growing importance in today's research, but is also very powerful for teaching and learning fluid dynamics. For this purpose, an educational PC program for university level called 'CFDLab 1.1' for Windows™ was developed with an interactive graphical user interface (GUI) for multitasking and point-and-click operations. It uses the dual reciprocity boundary element method as a versatile numerical scheme; thanks to its simple pre- and post-processing, it can handle a variety of relevant two-dimensional governing equations on personal computers, including the 2D Laplace, Poisson, diffusion and transient convection-diffusion equations.
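CFDLab's solver is a dual reciprocity boundary element method; as a much simpler classroom-style illustration of numerically solving the 2D Laplace equation mentioned above, here is plain Jacobi relaxation on a grid (an elementary finite-difference scheme, deliberately different from and far simpler than the program's actual method):

```python
def solve_laplace(grid, iters=500):
    """Jacobi relaxation: repeatedly replace each interior point by the
    average of its four neighbours; boundary values stay fixed."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(iters):
        new = [row[:] for row in grid]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                    + grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid

# Square plate: top edge held at 100 degrees, the other edges at 0.
n = 6
plate = [[100.0 if i == 0 else 0.0 for _ in range(n)] for i in range(n)]
solution = solve_laplace(plate)
# Interior temperatures settle strictly between the boundary extremes.
print(0.0 < solution[2][2] < 100.0)  # → True
```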
Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.
2010-01-01
In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918
Gasca, Fernando; Richter, Lars; Schweikard, Achim
2010-01-01
Transcranial Magnetic Stimulation (TMS) in the rat is a powerful tool for investigating brain function. However, the state-of-the-art experiments are considerably limited because the stimulation usually affects undesired anatomical structures. A simulation of a conductive shield plate placed between the coil stimulator and the rat brain during TMS is presented. The Finite Element (FE) method is used to obtain the 3D electric field distribution on a four-layer rat head model. The simulations show that the shield plate with a circular window can improve the focalization of stimulation, as quantitatively seen by computing the three-dimensional half power region (HPR). Focalization with the shield plate showed a clear compromise with the attenuation of the induced field. The results suggest that the shield plate can work as a helpful tool for conducting TMS rat experiments on specific targets.
Distributed data mining on grids: services, tools, and applications.
Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo
2004-12-01
Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.
[The virtual university in medicine. Context, concepts, specifications, users' manual].
Duvauferrier, R; Séka, L P; Rolland, Y; Rambeau, M; Le Beux, P; Morcet, N
1998-09-01
The widespread use of Web servers, with the emergence of interactive functions and the possibility of credit card payment via the Internet, together with the requirement for continuing education and the consequent need for a computer to link into the health care network, has prompted the development of a virtual university scheme on the Internet. The Virtual University of Radiology is not only a computer-assisted teaching tool with a set of attractive features, but also a powerful engine allowing the organization, distribution and control of the medical knowledge available on the Web server. The scheme provides patient access to general information, a secretary's office for enrollment and the Virtual University itself, with its library, image database, a forum for subspecialties and clinical case reports, an evaluation module and various guides and help tools for diagnosis, prescription and indexing. Currently the Virtual University of Radiology offers diagnostic imaging, but it can also be used by other specialties and for general practice.
Cardiac magnetic resonance imaging and computed tomography in ischemic cardiomyopathy: an update*
Assunção, Fernanda Boldrini; de Oliveira, Diogo Costa Leandro; Souza, Vitor Frauches; Nacif, Marcelo Souto
2016-01-01
Ischemic cardiomyopathy is one of the major health problems worldwide, representing a significant part of mortality in the general population nowadays. Cardiac magnetic resonance imaging (CMRI) and cardiac computed tomography (CCT) are noninvasive imaging methods that serve as useful tools in the diagnosis of coronary artery disease and may also help in screening individuals with risk factors for developing this illness. Technological developments of CMRI and CCT have contributed to the rise of several clinical indications of these imaging methods complementarily to other investigation methods, particularly in cases where they are inconclusive. In terms of accuracy, CMRI and CCT are similar to the other imaging methods, with few absolute contraindications and minimal risks of adverse side-effects. This fact strengthens these methods as powerful and safe tools in the management of patients. The present study is aimed at describing the role played by CMRI and CCT in the diagnosis of ischemic cardiomyopathies. PMID:26929458
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz
This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
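A minimal PSO sketch shows the search mechanism the paper employs; this is a generic textbook implementation on a toy quadratic objective standing in for the MEC torque-density model, not the authors' actual tool, and all coefficients are common illustrative defaults:

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm optimization: minimize `objective` over a box."""
    random.seed(seed)
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    gbest = pbest[pbest_val.index(min(pbest_val))][:]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward swarm best
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest

# Toy stand-in objective with its minimum at (1, 2).
best = pso(lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2, [(-5, 5), (-5, 5)])
print(best)  # close to [1.0, 2.0]
```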
Anatomical medial surfaces with efficient resolution of branches singularities.
Gil, Debora; Vera, Sergio; Borràs, Agnés; Andaluz, Albert; González Ballester, Miguel A
2017-01-01
Medial surfaces are powerful tools for shape description, but their use has been limited due to the sensitivity of existing methods to branching artifacts. Medial branching artifacts are associated with perturbations of the object boundary rather than with geometric features. Such instability is a main obstacle to confident application in shape recognition and description. Medial branches correspond to singularities of the medial surface and, thus, are problematic for existing morphological and energy-based algorithms. In this paper, we use algebraic geometry concepts in an energy-based approach to compute a medial surface presenting a stable branching topology. We also present an efficient GPU-CPU implementation using standard image processing tools. We demonstrate the method's computational efficiency and quality on a custom-made synthetic database. Finally, we present some results on a medical imaging application for localization of abdominal pathologies. Copyright © 2016 Elsevier B.V. All rights reserved.
Smith, Daniel G A; Burns, Lori A; Sirianni, Dominic A; Nascimento, Daniel R; Kumar, Ashutosh; James, Andrew M; Schriber, Jeffrey B; Zhang, Tianyuan; Zhang, Boyi; Abbott, Adam S; Berquist, Eric J; Lechner, Marvin H; Cunha, Leonardo A; Heide, Alexander G; Waldrop, Jonathan M; Takeshita, Tyler Y; Alenaizan, Asem; Neuhauser, Daniel; King, Rollin A; Simmonett, Andrew C; Turney, Justin M; Schaefer, Henry F; Evangelista, Francesco A; DePrince, A Eugene; Crawford, T Daniel; Patkowski, Konrad; Sherrill, C David
2018-06-11
Psi4NumPy demonstrates the use of efficient computational kernels from the open-source Psi4 program through the popular NumPy library for linear algebra in Python to facilitate the rapid development of clear, understandable Python computer code for new quantum chemical methods, while maintaining a relatively low execution time. Using these tools, reference implementations have been created for a number of methods, including self-consistent field (SCF), SCF response, many-body perturbation theory, coupled-cluster theory, configuration interaction, and symmetry-adapted perturbation theory. Furthermore, several reference codes have been integrated into Jupyter notebooks, allowing background, underlying theory, and formula information to be associated with the implementation. Psi4NumPy tools and associated reference implementations can lower the barrier for future development of quantum chemistry methods. These implementations also demonstrate the power of the hybrid C++/Python programming approach employed by the Psi4 program.
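The Psi4NumPy style of expressing quantum-chemistry kernels through NumPy can be sketched as follows. The two-electron integral tensor here is a random stand-in (real integrals would come from Psi4 itself), so this is an illustration of the contraction pattern rather than the project's code:

```python
import numpy as np

n = 4                                     # number of basis functions
rng = np.random.default_rng(0)
I = rng.random((n, n, n, n))              # stand-in two-electron integrals (pq|rs)
D = rng.random((n, n))
D = (D + D.T) / 2                         # density matrices are symmetric

# Coulomb matrix J_pq = sum_rs (pq|rs) D_rs, as one readable einsum call
J = np.einsum("pqrs,rs->pq", I, D)

# The same contraction via explicit loops, to show what einsum replaces
J_loops = np.zeros((n, n))
for p in range(n):
    for q in range(n):
        for r in range(n):
            for s in range(n):
                J_loops[p, q] += I[p, q, r, s] * D[r, s]

print(np.allclose(J, J_loops))  # → True
```

Writing the tensor contraction as a single `einsum` keeps the code close to the formula in the theory while delegating the work to an efficient compiled kernel, which is the clarity/performance trade-off the paper describes.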
Physics education through computational tools: the case of geometrical and physical optics
NASA Astrophysics Data System (ADS)
Rodríguez, Y.; Santana, A.; Mendoza, L. M.
2013-09-01
Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we show the pedagogic strategies implemented for the courses in geometrical and physical optics for students of optometry: the use of the GeoGebra software for the geometrical optics class, and the employment of new in-house software for the physical optics class, written in the high-level programming language Python, together with the corresponding activities developed for each of these applets.
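A small example in the spirit of the Python-based optics materials described above (illustrative only, not the authors' applets): Snell's law of refraction and the thin-lens equation, two staples of a geometrical optics course.

```python
import math

def snell(n1, n2, theta1_deg):
    """Refraction angle in degrees, or None past the critical angle."""
    s = n1 / n2 * math.sin(math.radians(theta1_deg))
    if abs(s) > 1:
        return None  # total internal reflection
    return math.degrees(math.asin(s))

def thin_lens_image(f, d_obj):
    """Image distance from the thin-lens equation 1/f = 1/d_obj + 1/d_img."""
    return 1.0 / (1.0 / f - 1.0 / d_obj)

print(round(snell(1.0, 1.5, 30.0), 2))   # air -> glass bends toward the normal
print(thin_lens_image(10.0, 30.0))       # object at 30 cm, focal length 10 cm → 15.0
print(snell(1.5, 1.0, 60.0))             # glass -> air beyond critical angle → None
```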
Galaxy CloudMan: delivering cloud compute clusters
2010-01-01
Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983
An evaluation of software tools for the design and development of cockpit displays
NASA Technical Reports Server (NTRS)
Ellis, Thomas D., Jr.
1993-01-01
The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.
Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields
NASA Astrophysics Data System (ADS)
Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo
The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool to describe the evolution of the interaction of these objects in our Galaxy. In this work we present a new project referred to as Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this website the user can make use of the existing numerical simulations from the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.
Scoria: a Python module for manipulating 3D molecular data.
Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D
2017-09-18
Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. Users can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages. Rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/.
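The contact analysis behind the FootPrint idea can be illustrated in dependency-free Python, matching Scoria's no-dependency spirit. This is a toy sketch with made-up coordinates, not Scoria's actual API:

```python
import math

# Two hypothetical chains as lists of (x, y, z) atom coordinates.
chain_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (8.0, 0.0, 0.0)]
chain_b = [(0.5, 0.5, 0.0), (9.0, 1.0, 0.0)]

def contacts(atoms_a, atoms_b, cutoff=2.0):
    """Count, per atom in chain A, how many chain-B atoms lie within cutoff."""
    counts = []
    for a in atoms_a:
        counts.append(sum(1 for b in atoms_b if math.dist(a, b) <= cutoff))
    return counts

print(contacts(chain_a, chain_b))  # → [1, 1, 1]
```

In a real FootPrint-style analysis, these per-atom counts would be accumulated over every frame of a trajectory and mapped onto atom colors.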
FITSManager: Management of Personal Astronomical Data
NASA Astrophysics Data System (ADS)
Cui, Chenzhou; Fan, Dongwei; Zhao, Yongheng; Kembhavi, Ajit; He, Boliang; Cao, Zihuang; Li, Jian; Nandrekar, Deoyani
2011-07-01
With the increase of personal storage capacity, it is easy to find hundreds to thousands of FITS files on the personal computer of an astrophysicist. Because the Flexible Image Transport System (FITS) is a professional data format initiated by astronomers and used mainly within that small community, few data management toolkits for FITS files exist. Astronomers need a powerful tool to help them manage their local astronomical data. Although the Virtual Observatory (VO) is a network-oriented astronomical research environment, its applications and related technologies provide useful solutions to enhance the management and utilization of astronomical data hosted on an astronomer's personal computer. FITSManager is such a tool, providing astronomers efficient management and utilization of their local data and bringing the VO to astronomers in a seamless and transparent way. FITSManager provides a rich set of functions for FITS file management, such as thumbnails, previews, type-dependent icons, header keyword indexing and search, and collaboration with other tools and online services. The development of FITSManager is an effort to fill the gap between the management and analysis of astronomical data.
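FITS headers are sequences of 80-character "cards" of the form `KEYWORD = value / comment`. A minimal sketch of the kind of header keyword indexing FITSManager performs (illustrative only; it ignores FITS subtleties such as continued strings and comment-section cards):

```python
def parse_cards(header_text):
    """Parse simple FITS header cards into a keyword -> value dict."""
    index = {}
    for card in header_text.splitlines():
        if "=" not in card:
            continue  # COMMENT / HISTORY / END cards carry no '='
        keyword, rest = card.split("=", 1)
        value = rest.split("/", 1)[0].strip()  # drop the trailing comment
        index[keyword.strip()] = value.strip("' ")
    return index

header = "\n".join([
    "SIMPLE  =                    T / conforms to FITS standard",
    "NAXIS   =                    2 / number of axes",
    "OBJECT  = 'M31     '           / observed object",
])
idx = parse_cards(header)
print(idx["OBJECT"])  # → M31
```

An index like this, built once over every file on disk, is what makes keyword search across thousands of local FITS files fast.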
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
CFD research and systems in Kawasaki Heavy Industries and its future prospects
NASA Astrophysics Data System (ADS)
Hiraoka, Koichi
1990-09-01
The KHI Computational Fluid Dynamics (CFD) system is composed of a VP100 computer and 2-D and 3-D Euler and/or Navier-Stokes (NS) analysis software. For KHI, this system has become a very powerful aerodynamic tool together with the Kawasaki 1 m Transonic Wind Tunnel. The 2-D Euler/NS software, developed in-house, is fully automated, requires no special skill, and was successfully applied to the design of the YXX high lift devices and the SST supersonic inlet, among others. The 3-D Euler/NS software, developed under joint research with NAL, has an interactively operated multi-block grid generator and can effectively generate grids around complex airplane shapes. Due to main memory size limitations, 3-D analyses of relatively simple shapes, such as the SST wing-body, were computed in-house on the VP100, whereas more detailed 3-D analyses, such as those of ASUKA and HOPE, were computed under KHI-NAL joint research on the NAL VP400, which is 10 times more powerful than the VP100. These analysis results correlate very well with experimental results. However, the present CFD system is less productive than the wind tunnel and has limitations in applicability.
Membrane proteins structures: A review on computational modeling tools.
Almeida, Jose G; Preto, Antonio J; Koukos, Panagiotis I; Bonvin, Alexandre M J J; Moreira, Irina S
2017-10-01
Membrane proteins (MPs) play diverse and important functions in living organisms. They constitute 20% to 30% of the known bacterial, archaean and eukaryotic organisms' genomes. In humans, their importance is emphasized as they represent 50% of all known drug targets. Nevertheless, experimental determination of their three-dimensional (3D) structure has proven to be both time consuming and rather expensive, which has led to the development of computational algorithms to complement the available experimental methods and provide valuable insights. This review highlights the importance of membrane proteins and how computational methods are capable of overcoming challenges associated with their experimental characterization. It covers various MP structural aspects, such as lipid interactions, allostery, and structure prediction, based on methods such as Molecular Dynamics (MD) and Machine-Learning (ML). Recent developments in algorithms, tools and hybrid approaches, together with the increase in both computational resources and the amount of available data have resulted in increasingly powerful and trustworthy approaches to model MPs. Even though MPs are elementary and important in nature, the determination of their 3D structure has proven to be a challenging endeavor. Computational methods provide a reliable alternative to experimental methods. In this review, we focus on computational techniques to determine the 3D structure of MP and characterize their binding interfaces. We also summarize the most relevant databases and software programs available for the study of MPs. Copyright © 2017 Elsevier B.V. All rights reserved.
Extensible Computational Chemistry Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-09
ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.
Expert systems for space power supply - Design, analysis, and evaluation
NASA Technical Reports Server (NTRS)
Cooper, Ralph S.; Thomson, M. Kemer; Hoshor, Alan
1987-01-01
The feasibility of applying expert systems to the conceptual design, analysis, and evaluation of space power supplies in particular, and complex systems in general is evaluated. To do this, the space power supply design process and its associated knowledge base were analyzed and characterized in a form suitable for computer emulation of a human expert. The existing expert system tools and the results achieved with them were evaluated to assess their applicability to power system design. Some new concepts for combining program architectures (modular expert systems and algorithms) with information about the domain were applied to create a 'deep' system for handling the complex design problem. NOVICE, a code to solve a simplified version of a scoping study of a wide variety of power supply types for a broad range of missions, has been developed, programmed, and tested as a concrete feasibility demonstration.
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
Educational Utilization of Microsoft Powerpoint for Oral and Maxillofacial Cancer Presentations.
Carvalho, Francisco Samuel Rodrigues; Chaves, Filipe Nobre; Soares, Eduardo Costa Studart; Pereira, Karuza Maria Alves; Ribeiro, Thyciana Rodrigues; Fonteles, Cristiane Sa Roriz; Costa, Fabio Wildson Gurgel
2016-01-01
Electronic presentations have become useful tools for surgeons, other clinicians and patients, facilitating medical and legal support and scientific research. Microsoft® PowerPoint is far and away the most commonly used computer-based presentation package. Setting up surgical clinical cases with PowerPoint makes it easy to register and follow patients for discussion of treatment plans or for scientific presentations. It facilitates communication between professionals, supervision of clinical cases, and teaching. It is often useful to create a template to standardize the presentation, which the software offers through the slide master. The purpose of this paper was to show a simple and practical method for creating a Microsoft® PowerPoint template for use in presentations concerning oral and maxillofacial cancer.
Continuum Electrostatics Approaches to Calculating pKas and Ems in Proteins
Gunner, MR; Baker, Nathan A.
2017-01-01
Proteins change their charge state through protonation and redox reactions as well as through binding charged ligands. The free energy of these reactions are dominated by solvation and electrostatic energies and modulated by protein conformational relaxation in response to the ionization state changes. Although computational methods for calculating these interactions can provide very powerful tools for predicting protein charge states, they include several critical approximations of which users should be aware. This chapter discusses the strengths, weaknesses, and approximations of popular computational methods for predicting charge states and understanding their underlying electrostatic interactions. The goal of this chapter is to inform users about applications and potential caveats of these methods as well as outline directions for future theoretical and computational research. PMID:27497160
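The bridge between the electrostatic energies these continuum methods compute and the charge states they predict is the standard relation ΔΔG = 2.303·RT·ΔpKa. A minimal sketch of that conversion (the 1.36 kcal/mol example value is illustrative, not taken from the chapter):

```python
R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # temperature, K

def pka_shift(ddG_kcal):
    """Convert an electrostatic protonation free-energy difference
    (protein vs. model compound, in kcal/mol) into a pKa shift via
    ddG = 2.303 * R * T * dpKa. Sign conventions vary by method."""
    return ddG_kcal / (2.303 * R * T)

# About 1.36 kcal/mol corresponds to roughly one pKa unit at 298 K
print(round(pka_shift(1.36), 2))
```

The same relation applies to redox midpoint potentials (Ems) with the appropriate Nernst factor in place of 2.303·RT.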
Computations of unsteady multistage compressor flows in a workstation environment
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen L.
1992-01-01
High-end graphics workstations are becoming a necessary tool in the computational fluid dynamics environment. In addition to their graphic capabilities, workstations of the latest generation have powerful floating-point-operation capabilities. As workstations become common, they could provide valuable computing time for such applications as turbomachinery flow calculations. This report discusses the issues involved in implementing an unsteady, viscous multistage-turbomachinery code (STAGE-2) on workstations. It then describes work in which the workstation version of STAGE-2 was used to study the effects of axial-gap spacing on the time-averaged and unsteady flow within a 2 1/2-stage compressor. The results included time-averaged surface pressures, time-averaged pressure contours, standard deviation of pressure contours, pressure amplitudes, and force polar plots.
Efficient Bayesian mixed model analysis increases association power in large cohorts
Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L
2014-01-01
Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN^2) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633
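The O(MN)-per-iteration cost comes from never forming the N x N covariance matrix: any product V·v with V = (sigma_g^2/M)·XXᵀ + sigma_e^2·I can be computed as two matrix-vector products against the genotype matrix, which is enough for an iterative solver such as conjugate gradient. A toy numpy sketch of that structural trick (random stand-in data; this is not BOLT-LMM's actual solver):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 500, 2000                      # samples, SNPs (toy sizes)
X = rng.standard_normal((N, M))       # standardized genotypes (hypothetical)
y = rng.standard_normal(N)
sigma_g2, sigma_e2 = 0.5, 0.5         # variance components (illustrative)

def mv(v):
    # (sigma_g2/M) * X X^T v + sigma_e2 * v, via two O(MN) products;
    # the N x N genetic relationship matrix is never materialized
    return sigma_g2 / M * (X @ (X.T @ v)) + sigma_e2 * v

def conj_grad(b, tol=1e-8, max_iter=200):
    """Standard conjugate gradient on the implicit SPD matrix V."""
    x = np.zeros_like(b)
    r = b - mv(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = mv(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

v = conj_grad(y)   # V^{-1} y, computed in O(MN) work per iteration
```

Each iteration costs two X-matvecs, so a small, fixed number of iterations gives the O(MN) total the abstract describes, versus O(MN^2) for methods that form the relationship matrix explicitly.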
Development of a computer-assisted learning software package on dental traumatology.
Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C
1998-10-01
The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantifies the differences using maximum amplitude and power spectral density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
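The two comparison metrics named in the abstract, maximum amplitude and power spectral density, can be sketched for a pair of waveforms; the decaying sinusoids below are hypothetical stand-ins for a measured and a simulated ultrasonic A-scan:

```python
import numpy as np

fs = 1e6                               # sampling rate, Hz (illustrative)
t = np.arange(0, 1e-3, 1 / fs)

# Hypothetical stand-ins for measured vs. simulated signals
measured = np.sin(2 * np.pi * 50e3 * t) * np.exp(-t * 3e3)
simulated = 0.95 * np.sin(2 * np.pi * 50e3 * t) * np.exp(-t * 3.2e3)

def max_amplitude_diff_db(a, b):
    """Difference in peak amplitude between two signals, in dB."""
    return 20 * np.log10(np.max(np.abs(a)) / np.max(np.abs(b)))

def psd(x):
    """One-sided power spectral density via the periodogram."""
    X = np.fft.rfft(x)
    return (np.abs(X) ** 2) / (fs * len(x))

print(max_amplitude_diff_db(measured, simulated))
```

Comparing the PSD arrays bin-by-bin (or integrating over a frequency band of interest) then quantifies how well the model reproduces the measured spectral content, not just the peak.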
Using the iPlant collaborative discovery environment.
Oliver, Shannon L; Lenards, Andrew J; Barthelson, Roger A; Merchant, Nirav; McKay, Sheldon J
2013-06-01
The iPlant Collaborative is an academic consortium whose mission is to develop an informatics and social infrastructure to address the "grand challenges" in plant biology. Its cyberinfrastructure supports the computational needs of the research community and facilitates solving major challenges in plant science. The Discovery Environment provides a powerful and rich graphical interface to the iPlant Collaborative cyberinfrastructure by creating an accessible virtual workbench that enables all levels of expertise, ranging from students to traditional biology researchers and computational experts, to explore, analyze, and share their data. By providing access to iPlant's robust data-management system and high-performance computing resources, the Discovery Environment also creates a unified space in which researchers can access scalable tools. Researchers can use available Applications (Apps) to execute analyses on their data, as well as customize or integrate their own tools to better meet the specific needs of their research. These Apps can also be used in workflows that automate more complicated analyses. This module describes how to use the main features of the Discovery Environment, using bioinformatics workflows for high-throughput sequence data as examples. © 2013 by John Wiley & Sons, Inc.
Numerical methods on some structured matrix algebra problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessup, E.R.
1996-06-01
This proposal concerned the design, analysis, and implementation of serial and parallel algorithms for certain structured matrix algebra problems. It emphasized large order problems and so focused on methods that can be implemented efficiently on distributed-memory MIMD multiprocessors. Such machines supply the computing power and extensive memory demanded by the large order problems. We proposed to examine three classes of matrix algebra problems: the symmetric and nonsymmetric eigenvalue problems (especially the tridiagonal cases) and the solution of linear systems with specially structured coefficient matrices. As all of these are of practical interest, a major goal of this work was to translate our research in linear algebra into useful tools for use by the computational scientists interested in these and related applications. Thus, in addition to software specific to the linear algebra problems, we proposed to produce a programming paradigm and library to aid in the design and implementation of programs for distributed-memory MIMD computers. We now report on our progress on each of the problems and on the programming tools.
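For the symmetric tridiagonal eigenvalue problems mentioned above, a serial baseline is easy to validate against a case with a closed-form answer. A numpy sketch using the 1-D discrete Laplacian, whose eigenvalues are known exactly:

```python
import numpy as np

def tridiag(d, e):
    """Build a symmetric tridiagonal matrix from diagonal d and
    off-diagonal e (len(e) == len(d) - 1)."""
    return np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

# Classic test case: tridiag(2, -1) has eigenvalues
# 2 - 2*cos(k*pi/(n+1)) for k = 1..n
n = 8
T = tridiag(np.full(n, 2.0), np.full(n - 1, -1.0))
computed = np.sort(np.linalg.eigvalsh(T))
exact = 2 - 2 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
print(np.allclose(computed, np.sort(exact)))  # True
```

Parallel solvers for the same problem (e.g., divide-and-conquer on distributed-memory machines) are typically checked against exactly this kind of analytically known spectrum.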
ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.
Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu
2015-01-01
In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size.
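The combinatorial core of virtual library enumeration can be sketched without a chemistry toolkit; the scaffold and substituent strings below are purely hypothetical stand-ins for the SMILES handling a tool like ChemScreener performs:

```python
from itertools import product

# Hypothetical scaffolds with two numbered attachment points, and a
# small substituent set; real tools apply chemistry-aware rules.
scaffolds = ["core-A([R1])([R2])", "core-B([R1])([R2])"]
substituents = ["H", "CH3", "OCH3", "Cl", "F"]

def enumerate_library(scaffolds, substituents):
    """Expand every scaffold with every combination of substituents
    at its two attachment points (a purely combinatorial sketch)."""
    library = []
    for core in scaffolds:
        for r1, r2 in product(substituents, repeat=2):
            library.append(core.replace("[R1]", r1).replace("[R2]", r2))
    return library

lib = enumerate_library(scaffolds, substituents)
print(len(lib))  # 2 scaffolds x 5 x 5 substituent pairs = 50
```

The combinatorial explosion is evident: 55 scaffolds with richer substituent sets and more attachment points reach the 118-million-compound scale cited in the abstract, which is why the enumeration is distributed across compute nodes.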
Supercomputers Ready for Use as Discovery Machines for Neuroscience
Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus
2012-01-01
NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience. PMID:23129998
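A toy version of the kind of memory-consumption model the abstract refers to shows how per-node memory bounds the feasible network size; all coefficients below are hypothetical, not NEST's measured values:

```python
def memory_per_node(total_neurons, synapses_per_neuron, nodes,
                    b_neuron=1000.0, b_synapse=48.0, b_overhead=2e9):
    """Toy per-node memory model (in bytes) for a distributed network
    simulation: neurons and their incoming synapses are split evenly
    across compute nodes on top of a fixed per-node overhead. All
    coefficients here are hypothetical, not NEST's measured values."""
    local_neurons = total_neurons / nodes
    local_synapses = total_neurons * synapses_per_neuron / nodes
    return b_overhead + local_neurons * b_neuron + local_synapses * b_synapse

# A 10^8-neuron network with 10^4 synapses per neuron, at three machine sizes:
for nodes in (1024, 8192, 65536):
    gb = memory_per_node(1e8, 1e4, nodes) / 1e9
    print(f"{nodes} nodes: {gb:.1f} GB per node")
```

Under any such model the synapse term dominates (10^12 objects), so fitting the network is a question of how many nodes share that term, which is exactly what a "maximum filling" analysis makes precise.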
TOPICAL REVIEW: Advances and challenges in computational plasma science
NASA Astrophysics Data System (ADS)
Tang, W. M.; Chan, V. S.
2005-02-01
Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.
Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei
2015-01-01
Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. 
To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601
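At its core, the peak-vs-region co-localization that PAPST performs reduces to intersecting sets of genomic intervals. A minimal sketch with hypothetical coordinates:

```python
def overlaps(a, b):
    """True if two half-open genomic intervals (start, end) intersect."""
    return a[0] < b[1] and b[0] < a[1]

def co_localized(peaks_a, peaks_b):
    """Peaks in A that overlap at least one peak in B. A simple O(n*m)
    scan; production tools use sorted sweeps or interval trees."""
    return [a for a in peaks_a if any(overlaps(a, b) for b in peaks_b)]

# Hypothetical TF peaks and epigenetic-mark regions on one chromosome
tf_peaks = [(100, 200), (500, 600), (900, 950)]
em_regions = [(150, 400), (920, 1000)]
print(co_localized(tf_peaks, em_regions))  # [(100, 200), (900, 950)]
```

Gene-centric analysis is the same operation with gene bodies (or promoter windows) as the second interval set, which is why a single tool can integrate both views.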
CAESY - COMPUTER AIDED ENGINEERING SYSTEM
NASA Technical Reports Server (NTRS)
Wette, M. R.
1994-01-01
Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools including the desire to make complex system modeling design and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means to evaluate methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user defined ones. Support for matrix calculations is provided in the same manner as MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in C-language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
Wells, I G; Cartwright, R Y; Farnan, L P
1993-12-15
The computing strategy in our laboratories evolved from research in Artificial Intelligence, and is based on powerful software tools running on high performance desktop computers with a graphical user interface. This allows most tasks to be regarded as design problems rather than implementation projects, and both rapid prototyping and an object-oriented approach to be employed during the in-house development and enhancement of the laboratory information systems. The practical application of this strategy is discussed, with particular reference to the system designer, the laboratory user and the laboratory customer. Routine operation covers five departments, and the systems are stable, flexible and well accepted by the users. Client-server computing, currently undergoing final trials, is seen as the key to further development, and this approach to Pathology computing has considerable potential for the future.
Examining Trust, Forgiveness and Regret as Computational Concepts
NASA Astrophysics Data System (ADS)
Marsh, Stephen; Briggs, Pamela
The study of trust has advanced tremendously in recent years, to the extent that the goal of a more unified formalisation of the concept is becoming feasible. To that end, we have begun to examine the closely related concepts of regret and forgiveness and their relationship to trust and its siblings. The resultant formalisation allows computational tractability in, for instance, artificial agents. Moreover, regret and forgiveness, when allied to trust, are very powerful tools in the Ambient Intelligence (AmI) security area, especially where Human Computer Interaction and concrete human understanding are key. This paper introduces the concepts of regret and forgiveness, exploring them from social psychological as well as a computational viewpoint, and presents an extension to Marsh's original trust formalisation that takes them into account. It discusses and explores work in the AmI environment, and further potential applications.
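The qualitative dynamics described above (regret after a betrayal lowering trust, forgiveness gradually restoring it) can be made computationally tractable with a simple update rule. The sketch below is purely illustrative; both the functional form and the coefficients are hypothetical and are not Marsh's formalisation:

```python
def update_trust(trust, outcome, regret_weight=0.5, forgiveness_rate=0.1):
    """Toy trust update in [-1, 1]: a negative outcome reduces trust in
    proportion to the regret it causes; positive outcomes let forgiveness
    slowly restore trust toward full trust. Hypothetical form and
    coefficients, not Marsh's actual formalisation."""
    if outcome < 0:
        regret = regret_weight * abs(outcome)
        trust = max(-1.0, trust - regret)
    else:
        trust = min(1.0, trust + forgiveness_rate * (1.0 - trust))
    return trust

t = 0.8
t = update_trust(t, -1.0)   # a betrayal knocks trust down
for _ in range(5):          # five good interactions partially restore it
    t = update_trust(t, 1.0)
print(round(t, 3))
```

The asymmetry (a single betrayal costs more than a single good interaction restores) is the kind of property such a formalisation lets an artificial agent reason about explicitly.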
Computational Failure Modeling of Lower Extremities
2012-01-01
bone fracture, ligament tear, and muscle rupture. While these injuries may seem well-defined through medical imaging, the process of injury and the... to vehicles from improvised explosives cause severe injuries to the lower extremities, including bone fracture, ligament tear, and muscle rupture. ...modeling offers a powerful tool to explore the insult-to-injury process with high resolution. When studying a complex dynamic process such as this, it is
ERIC Educational Resources Information Center
Bickford, J. H., III
2010-01-01
This paper is based on three beliefs. First, technology can engage and challenge students' thinking. Second, technology can assist students in creating quality work. Finally, computer-generated student-work can be used as educational tools in productive ways that other student-work cannot. This article suggests new ways to use old technologies to…
Advanced Computation in Plasma Physics
NASA Astrophysics Data System (ADS)
Tang, William
2001-10-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop MPPs to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.
Cartwright, Hugh M
2008-01-01
Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data or the deconvolution of spectra to highly practical applications, such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.
HRLSim: a high performance spiking neural network simulator for GPGPU clusters.
Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan
2014-02-01
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
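HRLSim itself targets clusters of GPGPUs; as a minimal, hedged illustration of the kind of model such simulators run, the sketch below simulates uncoupled leaky integrate-and-fire (LIF) neurons in plain Python/NumPy. All parameter values are illustrative and are not taken from HRLSim.

```python
import numpy as np

def simulate_lif(n_neurons=100, n_steps=1000, dt=1e-3, seed=0):
    """Simulate uncoupled leaky integrate-and-fire neurons with noisy input.

    Membrane dynamics: dV/dt = (-(V - V_rest) + R*I) / tau;
    a neuron spikes and resets when V crosses V_thresh.
    """
    rng = np.random.default_rng(seed)
    tau, v_rest, v_reset, v_thresh, r = 20e-3, -70e-3, -75e-3, -50e-3, 1e8
    v = np.full(n_neurons, v_rest)
    spikes = np.zeros((n_steps, n_neurons), dtype=bool)
    for t in range(n_steps):
        # Constant drive plus white noise (amperes); illustrative values.
        i_in = 2.4e-10 + 4e-11 * rng.standard_normal(n_neurons)
        v += dt * (-(v - v_rest) + r * i_in) / tau
        fired = v >= v_thresh
        spikes[t] = fired
        v[fired] = v_reset  # reset after spiking
    return spikes

spikes = simulate_lif()
# Mean firing rate in Hz: 100 neurons simulated for 1 second.
rate = spikes.sum() / (100 * 1.0)
```

A production simulator like HRLSim distributes such state updates across GPUs and exchanges spike events between cluster nodes; the serial loop above only shows the underlying neuron model.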
DNS and Embedded DNS as Tools for Investigating Unsteady Heat Transfer Phenomena in Turbines
NASA Technical Reports Server (NTRS)
vonTerzi, Dominic; Bauer, H.-J.
2010-01-01
DNS is a powerful tool with high potential for investigating unsteady heat transfer and fluid flow phenomena, in particular for cases involving transition to turbulence and/or large coherent structures. DNS of idealized configurations related to turbomachinery components is already possible; for more realistic configurations and the inclusion of more effects, reduction of computational cost is a key issue (e.g., hybrid methods). The approach pursued here is embedded DNS (segregated coupling of DNS with LES and/or RANS), which is an enabling technology for many studies. Pre-transitional heat transfer and trailing-edge cutback film cooling are good candidates for (embedded) DNS studies.
Bayesian learning for spatial filtering in an EEG-based brain-computer interface.
Zhang, Haihong; Yang, Huijuan; Guan, Cuntai
2013-07-01
Spatial filtering for EEG feature extraction and classification is an important tool in brain-computer interfaces. However, there is generally no established theory that links spatial filtering directly to Bayes classification error. To address this issue, this paper proposes and studies a Bayesian analysis theory for spatial filtering in relation to Bayes error. Following the maximum entropy principle, we introduce a gamma probability model for describing single-trial EEG power features. We then formulate and analyze the theoretical relationship between Bayes classification error and the so-called Rayleigh quotient, which is a function of spatial filters and basically measures the ratio in power features between two classes. This paper also reports our extensive study that examines the theory and its use in classification, using three publicly available EEG data sets and state-of-the-art spatial filtering techniques and various classifiers. Specifically, we validate the positive relationship between Bayes error and Rayleigh quotient in real EEG power features. Finally, we demonstrate that the Bayes error can be practically reduced by applying a new spatial filter with lower Rayleigh quotient.
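For a spatial filter w and two class covariance matrices S1 and S2, the Rayleigh quotient discussed above is J(w) = (wᵀS1w)/(wᵀS2w), and minimizing it reduces to a generalized eigenvalue problem, as in common spatial patterns (CSP). A minimal sketch on synthetic two-class data (the data and dimensions are illustrative, not from the paper's EEG sets):

```python
import numpy as np
from scipy.linalg import eigh

def min_rayleigh_filter(sigma1, sigma2):
    """Return the spatial filter minimizing w^T S1 w / w^T S2 w.

    Solved as the generalized eigenvalue problem S1 w = lambda S2 w,
    keeping the eigenvector with the smallest eigenvalue.
    """
    vals, vecs = eigh(sigma1, sigma2)  # eigenvalues in ascending order
    return vecs[:, 0], vals[0]

rng = np.random.default_rng(1)
n_ch, n_samp = 8, 5000
# Synthetic two-class "EEG": class 2 has extra variance along channel 0.
x1 = rng.standard_normal((n_ch, n_samp))
mix = np.eye(n_ch)
mix[0, 0] = 3.0
x2 = mix @ rng.standard_normal((n_ch, n_samp))
s1, s2 = np.cov(x1), np.cov(x2)
w, quotient = min_rayleigh_filter(s1, s2)
```

By construction, no other filter can achieve a smaller quotient than the returned eigenvector, which is the property the paper relates to Bayes error.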
VLSI Implementation of Fault Tolerance Multiplier based on Reversible Logic Gate
NASA Astrophysics Data System (ADS)
Ahmad, Nabihah; Hakimi Mokhtar, Ahmad; Othman, Nurmiza binti; Fhong Soon, Chin; Rahman, Ab Al Hadi Ab
2017-08-01
The multiplier is an essential component in digital systems such as digital signal processing, microprocessors, and quantum computing, and is widely used in arithmetic units. Because of the multiplier's complexity, the tendency for errors is very high. This paper aims to design a 2×2-bit fault-tolerant multiplier based on reversible logic gates with low power consumption and high performance. The design has been implemented in 90 nm Complementary Metal Oxide Semiconductor (CMOS) technology using Synopsys Electronic Design Automation (EDA) tools. The multiplier architecture is implemented using reversible logic gates: the fault-tolerant multiplier combines three reversible gates, the Double Feynman gate (F2G), the New Fault Tolerant (NFT) gate, and the Islam gate (IG), with an area of 160 μm × 420.3 μm (approximately 67,250 μm²). The design achieves a low power consumption of 122.85 μW and a propagation delay of 16.99 ns. The proposed fault-tolerant multiplier achieves low power consumption and high performance, making it suitable for modern computing applications thanks to its fault-tolerance capability.
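Reversible gates map input vectors to output vectors bijectively, so no information is lost. As a hedged sketch, the Feynman gate (A, B) → (A, A⊕B) and the Double Feynman gate F2G (A, B, C) → (A, A⊕B, A⊕C), as defined in the standard reversible-logic literature, can be checked for reversibility by exhaustive truth-table enumeration (the paper's full multiplier netlist is not reproduced here):

```python
from itertools import product

def feynman(a, b):
    """Feynman (CNOT) gate: (A, B) -> (A, A xor B)."""
    return (a, a ^ b)

def double_feynman(a, b, c):
    """Double Feynman gate F2G: (A, B, C) -> (A, A xor B, A xor C)."""
    return (a, a ^ b, a ^ c)

def is_reversible(gate, width):
    """A gate is reversible iff its truth table is a bijection:
    every distinct input produces a distinct output."""
    outputs = {gate(*bits) for bits in product((0, 1), repeat=width)}
    return len(outputs) == 2 ** width

rev_feynman = is_reversible(feynman, 2)
rev_f2g = is_reversible(double_feynman, 3)
```

Both gates are also self-inverse: applying either twice returns the original inputs, which is what makes fault detection by output comparison possible in fault-tolerant reversible designs.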
NASA Astrophysics Data System (ADS)
Moon, Hongsik
What is the impact of multicore CPUs and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs of the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit; it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced.
This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
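As a rough illustration of the FLOPS metric the dissertation questions, one can time a dense matrix multiply, which performs about 2·n³ floating-point operations. This is a crude single-node estimate only; real benchmarks such as LINPACK control for far more factors, which is precisely the dissertation's point about FLOPS alone being insufficient.

```python
import time
import numpy as np

def estimate_gflops(n=512, seed=0):
    """Estimate sustained GFLOPS from one n x n matrix multiply.

    A dense n x n matrix product performs roughly 2 * n**3
    floating-point operations (n multiplies and ~n adds per entry).
    """
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal((n, n))
    t0 = time.perf_counter()
    c = a @ b
    elapsed = time.perf_counter() - t0
    flops = 2.0 * n ** 3
    return flops / elapsed / 1e9, c

gflops, _ = estimate_gflops()
```

The measured figure depends heavily on cache behavior, memory bandwidth, and the BLAS library behind `@`, illustrating why parameters beyond raw FLOPS matter when benchmarking parallel codes.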
Energy Consumption Management of Virtual Cloud Computing Platform
NASA Astrophysics Data System (ADS)
Li, Lin
2017-11-01
To manage the energy consumption of virtual cloud computing platforms, the energy consumption behavior of both virtual machines and the cloud computing platform itself must be understood in depth; only then can the problems facing energy consumption management be solved. The key problem is the high energy consumption of data centers, which creates a pressing need for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in everyday life, work, and production because of their strengths and many advantages; they are developing rapidly and offer very high resource utilization rates. The presence of virtualization and cloud computing technologies is therefore essential in the constantly developing information age. This paper summarizes, explains, and further analyzes the energy consumption management questions of the virtual cloud computing platform, giving a clearer understanding of energy consumption management on such platforms and offering help to many aspects of life and work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhleman, T.; Dempsey, P.
Although reduced activity has left its mark on engineering budgets and many projects have been delayed, industry remains committed to research and development. This year's emphasis is offshore, where new-generation semi-submersibles are under construction for Arctic waters and where equipment technology is reaching maturity. Improved tubulars, such as new process-forged drill pipe, special-alloy corrosion-resistant pipe, and new tool joint designs, are finding eager markets both on and offshore. And back in the office, microcomputers, a curiosity a few years ago, are making significant advances in improving drilling and production operations. Specific examples of this new technology include: two high-tech, high-risk floaters; a hard-rock sidewall coring tool; a new torque-resistant tool joint; two improved riser connection systems; a breakthrough in drill pipe manufacturing; and a power-packed portable drilling computer.
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported in building the full system, including the extensions of integrating large databases with the system, known as Scotty. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational data base management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.
H(2)- and H(infinity)-design tools for linear time-invariant systems
NASA Technical Reports Server (NTRS)
Ly, Uy-Loi
1989-01-01
Recent advances in optimal control have brought design techniques based on optimization of H(2) and H(infinity) norm criteria closer to being attractive alternatives to single-loop design methods for linear time-invariant systems. Significant steps forward in this technology are a deeper understanding of the performance and robustness issues of these design procedures and means to perform design trade-offs. However, acceptance of the technology is hindered by the lack of convenient design tools for exercising these powerful multivariable techniques while still allowing single-loop design formulation. Presented is a unique computer tool for designing arbitrary low-order linear time-invariant controllers that encompasses both performance and robustness issues via the familiar H(2) and H(infinity) norm optimization. Application to disturbance-rejection design for a commercial transport is demonstrated.
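For intuition about the H(2) criterion mentioned above: for a stable LTI system ẋ = Ax + Bw, z = Cx, the H(2) norm satisfies ‖G‖₂² = trace(C P Cᵀ), where P is the controllability Gramian solving A P + P Aᵀ + B Bᵀ = 0. A minimal sketch using SciPy (this is a generic norm computation, not the paper's design tool):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def h2_norm(a, b, c):
    """H2 norm of G(s) = C (sI - A)^{-1} B for a stable (Hurwitz) A.

    Computes the controllability Gramian P from the Lyapunov equation
    A P + P A^T + B B^T = 0, then returns sqrt(trace(C P C^T)).
    """
    p = solve_continuous_lyapunov(a, -b @ b.T)
    return float(np.sqrt(np.trace(c @ p @ c.T)))

# First-order example G(s) = 1/(s + 1), whose H2 norm is 1/sqrt(2).
a = np.array([[-1.0]])
b = np.array([[1.0]])
c = np.array([[1.0]])
norm = h2_norm(a, b, c)
```

For the scalar example, P = 1/2 solves -P - P + 1 = 0, giving ‖G‖₂ = 1/√2, matching the frequency-domain integral.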
NASA Astrophysics Data System (ADS)
Wessel, Paul; Luis, Joaquim F.
2017-02-01
The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.
ITEP: an integrated toolkit for exploration of microbial pan-genomes.
Benedict, Matthew N; Henriksen, James R; Metcalf, William W; Whitaker, Rachel J; Price, Nathan D
2014-01-03
Comparative genomics is a powerful approach for studying variation in physiological traits as well as the evolution and ecology of microorganisms. Recent technological advances have enabled sequencing large numbers of related genomes in a single project, requiring computational tools for their integrated analysis. In particular, accurate annotations and identification of gene presence and absence are critical for understanding and modeling the cellular physiology of newly sequenced genomes. Although many tools are available to compare the gene contents of related genomes, new tools are necessary to enable close examination and curation of protein families from large numbers of closely related organisms, to integrate curation with the analysis of gain and loss, and to generate metabolic networks linking the annotations to observed phenotypes. We have developed ITEP, an Integrated Toolkit for Exploration of microbial Pan-genomes, to curate protein families, compute similarities to externally-defined domains, analyze gene gain and loss, and generate draft metabolic networks from one or more curated reference network reconstructions in groups of related microbial species among which the combination of core and variable genes constitutes their "pan-genomes". The ITEP toolkit consists of: (1) a series of modular command-line scripts for identification, comparison, curation, and analysis of protein families and their distribution across many genomes; (2) a set of Python libraries for programmatic access to the same data; and (3) pre-packaged scripts to perform common analysis workflows on a collection of genomes. ITEP's capabilities include de novo protein family prediction, ortholog detection, analysis of functional domains, identification of core and variable genes and gene regions, sequence alignments and tree generation, annotation curation, and the integration of cross-genome analysis and metabolic networks for study of metabolic network evolution.
ITEP is a powerful, flexible toolkit for generation and curation of protein families. ITEP's modular design allows for straightforward extension as analysis methods and tools evolve. By integrating comparative genomics with the development of draft metabolic networks, ITEP harnesses the power of comparative genomics to build confidence in links between genotype and phenotype and helps disambiguate gene annotations when they are evaluated in both evolutionary and metabolic network contexts.
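One core idea in such toolkits, partitioning gene families into core genes (present in every genome) and variable genes (present in only some), can be sketched with plain Python sets. The data below are toy placeholders; ITEP's own pipeline operates on clustered protein families, not on this simplified input.

```python
def partition_pangenome(genomes):
    """Split gene families into core (in all genomes) and variable (in some).

    `genomes` maps a genome name to the set of gene family identifiers
    detected in that genome.
    """
    families = set().union(*genomes.values())          # the pan-genome
    core = set.intersection(*genomes.values())         # shared by all
    variable = families - core                         # everything else
    return core, variable

# Toy presence/absence data for three hypothetical strains.
genomes = {
    "strainA": {"dnaA", "gyrB", "recA", "mcrA"},
    "strainB": {"dnaA", "gyrB", "recA", "nifH"},
    "strainC": {"dnaA", "gyrB", "recA"},
}
core, variable = partition_pangenome(genomes)
```

Real pan-genome analysis must first solve the harder problem of deciding which genes belong to the same family across genomes, which is where ITEP's clustering and curation tools come in.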
The 3D widgets for exploratory scientific visualization
NASA Technical Reports Server (NTRS)
Herndon, Kenneth P.; Meyer, Tom
1995-01-01
Computational fluid dynamics (CFD) techniques are used to simulate flows of fluids like air or water around such objects as airplanes and automobiles. These techniques usually generate very large amounts of numerical data which are difficult to understand without using graphical scientific visualization techniques. There are a number of commercial scientific visualization applications available today which allow scientists to control visualization tools via textual and/or 2D user interfaces. However, these user interfaces are often difficult to use. We believe that 3D direct-manipulation techniques for interactively controlling visualization tools will provide opportunities for powerful and useful interfaces with which scientists can more effectively explore their datasets. A few systems have been developed which use these techniques. In this paper, we will present a variety of 3D interaction techniques for manipulating parameters of visualization tools used to explore CFD datasets, and discuss in detail various techniques for positioning tools in a 3D scene.
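A typical example of such a widget is a probe tool: the widget's 3D position in the scene is mapped to a data value by interpolating the CFD grid. As a hedged sketch (not the paper's implementation), trilinear interpolation of a scalar field at a fractional grid position looks like this:

```python
import numpy as np

def probe_sample(field, pos):
    """Trilinearly interpolate scalar `field` (nx, ny, nz) at fractional
    index `pos`, as a probe widget would when dragged through the volume."""
    idx = np.floor(pos).astype(int)
    idx = np.clip(idx, 0, np.array(field.shape) - 2)  # stay inside the grid
    f = pos - idx                                     # fractional offsets
    x0, y0, z0 = idx
    c = field[x0:x0 + 2, y0:y0 + 2, z0:z0 + 2]        # surrounding 2x2x2 cell
    # Collapse one axis at a time: x, then y, then z.
    c = c[0] * (1 - f[0]) + c[1] * f[0]
    c = c[0] * (1 - f[1]) + c[1] * f[1]
    return c[0] * (1 - f[2]) + c[1] * f[2]

# A linear test field (value = x + 2y + 3z) is reproduced exactly
# by trilinear interpolation.
nx = ny = nz = 8
x, y, z = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                      indexing="ij")
field = (x + 2 * y + 3 * z).astype(float)
value = probe_sample(field, np.array([1.5, 2.25, 3.0]))
```

In an interactive system this sampling runs on every drag event, which is why direct-manipulation probes give scientists immediate feedback as they move through the dataset.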
Parallel Harmony Search Based Distributed Energy Resource Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceylan, Oguzhan; Liu, Guodong; Tomsovic, Kevin
2015-01-01
This paper presents a harmony search based parallel optimization algorithm to minimize voltage deviations in three-phase unbalanced electrical distribution systems and to maximize the active power outputs of distributed energy resources (DR). The main contribution is to reduce the adverse impacts on the voltage profile as photovoltaic (PV) output or electric vehicle (EV) charging changes throughout a day. The IEEE 123-bus distribution test system is modified by adding DRs and EVs under different load profiles. The simulation results show that by using parallel computing techniques, heuristic methods may be used as an alternative optimization tool in electrical power distribution system operation.
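Harmony search, the metaheuristic used above, keeps a memory of candidate solutions and improvises new ones by mixing memorized values, pitch adjustments, and random draws. A minimal serial sketch minimizing a toy test function follows; all parameters are illustrative, and the paper's parallel, three-phase power-flow version is far more involved.

```python
import random

def harmony_search(objective, bounds, memory_size=20, iters=2000,
                   hmcr=0.9, par=0.3, bandwidth=0.05, seed=42):
    """Minimize `objective` over box `bounds` with basic harmony search.

    hmcr: probability of drawing a coordinate from harmony memory.
    par:  probability of pitch-adjusting a memorized coordinate.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(memory_size)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                x = memory[rng.randrange(memory_size)][d]
                if rng.random() < par:  # pitch adjustment
                    x += bandwidth * (hi - lo) * rng.uniform(-1, 1)
            else:
                x = rng.uniform(lo, hi)  # random improvisation
            new.append(min(max(x, lo), hi))
        score = objective(new)
        # Replace the worst harmony if the new one is better.
        worst = max(range(memory_size), key=scores.__getitem__)
        if score < scores[worst]:
            memory[worst], scores[worst] = new, score
    best = min(range(memory_size), key=scores.__getitem__)
    return memory[best], scores[best]

# Toy objective: sphere function, minimum 0 at the origin.
best_x, best_f = harmony_search(lambda x: sum(v * v for v in x),
                                bounds=[(-5, 5)] * 3)
```

In the paper's setting the objective would instead be a penalized voltage-deviation measure evaluated by an unbalanced power-flow solve, and the candidate evaluations are what gets parallelized.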
Breast Imaging in the Era of Big Data: Structured Reporting and Data Mining.
Margolies, Laurie R; Pandey, Gaurav; Horowitz, Eliot R; Mendelson, David S
2016-02-01
The purpose of this article is to describe structured reporting and the development of large databases for use in data mining in breast imaging. The results of millions of breast imaging examinations are reported with structured tools based on the BI-RADS lexicon. Much of this data is stored in accessible media. Robust computing power creates great opportunity for data scientists and breast imagers to collaborate to improve breast cancer detection and optimize screening algorithms. Data mining can create knowledge, but the questions asked and their complexity require extremely powerful and agile databases. New data technologies can facilitate outcomes research and precision medicine.
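As a toy illustration of why structured reporting enables mining, the sketch below tallies assessment categories from structured report records. The field names and records are hypothetical, modeled loosely on the BI-RADS lexicon; a real system would query a reporting database.

```python
from collections import Counter

# Hypothetical structured report records (illustrative only).
reports = [
    {"patient_id": 1, "modality": "mammography", "birads": 1},
    {"patient_id": 2, "modality": "mammography", "birads": 2},
    {"patient_id": 3, "modality": "ultrasound",  "birads": 4},
    {"patient_id": 4, "modality": "mammography", "birads": 0},
    {"patient_id": 5, "modality": "mammography", "birads": 4},
]

def assessment_distribution(records):
    """Count reports per BI-RADS assessment category."""
    return Counter(r["birads"] for r in records)

def workup_rate(records):
    """Fraction of reports flagged for further workup
    (BI-RADS 0, 4, or 5 in this toy scheme)."""
    flagged = sum(1 for r in records if r["birads"] in (0, 4, 5))
    return flagged / len(records)

dist = assessment_distribution(reports)
rate = workup_rate(reports)
```

Because every report stores the assessment as a coded field rather than free text, such aggregate queries run trivially at the scale of millions of examinations, which is the point the article makes about structured reporting.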
NASA Technical Reports Server (NTRS)
1979-01-01
The machinery pictured is a set of Turbodyne steam turbines which power a sugar mill at Belle Glade, Florida. A NASA-developed computer program called NASTRAN aided development of these and other turbines manufactured by Turbodyne Corporation's Steam Turbine Division, Wellsville, New York. An acronym for NASA Structural Analysis Program, NASTRAN is a predictive tool which advises development teams how a structural design will perform under service use conditions. Turbodyne uses NASTRAN to analyze the dynamic behavior of steam turbine components, achieving substantial savings in development costs. One of the most widely used spinoffs, NASTRAN is made available to private industry through NASA's Computer Software Management Information Center (COSMIC) at the University of Georgia.