Sample records for sophisticated computer simulation

  1. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.
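
    The level-4 approach above reduces to closed-form estimates once the honeycomb core is smeared into an equivalent homogeneous medium. A minimal sketch of that idea follows, using textbook thin-wall relations for a regular hexagonal core; the function name, inputs, and formulas are illustrative assumptions, not the simplified equations actually derived in the report.

      import math

      def honeycomb_effective_properties(E_s, rho_s, t, l):
          """Approximate a regular hexagonal honeycomb core as an equivalent
          homogeneous medium (illustrative textbook relations, not the report's).

          E_s, rho_s : cell-wall material modulus (Pa) and density (kg/m^3)
          t, l       : cell-wall thickness and length (m)
          """
          rel_density = (2.0 / math.sqrt(3.0)) * (t / l)  # solid fraction of the core
          rho_eff = rho_s * rel_density                   # smeared core density
          E3_eff = E_s * rel_density                      # out-of-plane (through-thickness) modulus
          return rel_density, rho_eff, E3_eff

      # Example: aluminium cell walls with t/l = 0.02
      print(honeycomb_effective_properties(E_s=70e9, rho_s=2700.0, t=0.1e-3, l=5e-3))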

  2. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

    The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well known simplified models are compared and modified to bring the rate of depolarization and action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat and the subsequent traveling of the spiral wave inside the simulated tissue are studied.
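
    A minimal sketch of the method-of-lines idea for the propagation stage with a simplified model: space is discretized first, and the resulting coupled ODEs are advanced in time. The two-variable FitzHugh-Nagumo-type kinetics and all parameter values below are illustrative assumptions, not the authors' modified model.

      import numpy as np

      # Method of lines: discretize space, then integrate the resulting ODE system in time.
      nx = ny = 100
      dx, dt = 0.5, 0.01
      D, a, eps, beta = 1.0, 0.1, 0.02, 0.5

      v = np.zeros((ny, nx))   # fast (action-potential-like) variable
      w = np.zeros((ny, nx))   # slow recovery variable
      v[:, :5] = 1.0           # stimulus along the left edge (premature-beat-like)

      def laplacian(f):
          # 5-point stencil with no-flux boundaries via edge padding
          fp = np.pad(f, 1, mode="edge")
          return (fp[:-2, 1:-1] + fp[2:, 1:-1] + fp[1:-1, :-2] + fp[1:-1, 2:] - 4 * f) / dx**2

      for _ in range(5000):
          dv = D * laplacian(v) + v * (1 - v) * (v - a) - w
          dw = eps * (beta * v - w)
          v += dt * dv
          w += dt * dw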

  3. Railroads and the Environment : Estimation of Fuel Consumption in Rail Transportation : Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  4. A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].

    DOT National Transportation Integrated Search

    2013-01-01

    The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...

  5. Evaluating coastal and river valley communities evacuation network performance using macroscopic productivity.

    DOT National Transportation Integrated Search

    2017-06-30

    The ever-increasing processing speed and computational power of computers and simulation systems have led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...

  6. Computers for Interactive Learning.

    ERIC Educational Resources Information Center

    Grabowski, Barbara; Aggen, William

    1984-01-01

    Analyzes features of computer-based interactive video including sophisticated answer judging, diagnostic feedback, simulation, animation, audible tones, touch sensitive screen, function keys, and video enhancements, and matches these to the characteristics and pedagogical styles of learners. The learner characteristics discussed include internal…

  7. Adaptive Language Games with Robots

    NASA Astrophysics Data System (ADS)

    Steels, Luc

    2010-11-01

    This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade, going from simple simulations of lexicon formation with animal-like cybernetic robots to sophisticated grammatical experiments with humanoid robots.

  8. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.
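
    The stochastic assignment of passenger characteristics can be illustrated with a toy Monte Carlo loop: each passenger draws a walking speed, reaction delay, and distance, then queues at the least-busy exit. The distributions and the one-second exit service time are invented for illustration and are not FIREVAC's experimentally derived inputs.

      import random

      def simulate_evacuation(n_passengers=150, n_exits=4, seed=0):
          """Toy Monte Carlo evacuation: each passenger gets a stochastically
          assigned speed and reaction delay (illustrative distributions),
          then queues at the least-busy of n_exits."""
          rng = random.Random(seed)
          exit_free_at = [0.0] * n_exits            # time at which each exit is next free
          exit_times = []
          for _ in range(n_passengers):
              speed = max(0.3, rng.gauss(1.2, 0.3))     # walking speed, m/s
              delay = rng.uniform(0.0, 5.0)             # reaction time, s
              distance = rng.uniform(2.0, 25.0)         # distance to the chosen exit, m
              e = min(range(n_exits), key=lambda i: exit_free_at[i])
              arrival = delay + distance / speed
              start = max(arrival, exit_free_at[e])     # wait if the exit is congested
              exit_free_at[e] = start + 1.0             # ~1 s per passenger through an exit
              exit_times.append(start + 1.0)
          return max(exit_times)

      print(f"last passenger out after {simulate_evacuation():.1f} s")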

  9. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
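
    The Jacobian-free Newton-Krylov idea at the heart of MOOSE's solver can be sketched in a few lines: the Jacobian-vector product needed by the Krylov solver is approximated by a finite difference of the residual, so the Jacobian is never assembled. The toy nonlinear system and SciPy calls below are illustrative only; they are not MOOSE's C++ API.

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      def residual(u):
          # Toy coupled nonlinear system: 1D diffusion with a nonlinear source term.
          r = np.empty_like(u)
          r[0], r[-1] = u[0], u[-1] - 1.0                    # Dirichlet ends
          r[1:-1] = u[:-2] - 2 * u[1:-1] + u[2:] + 0.1 * np.exp(u[1:-1])
          return r

      def jfnk_solve(u, tol=1e-10, eps=1e-7):
          for _ in range(20):
              r = residual(u)
              if np.linalg.norm(r) < tol:
                  break
              # Jacobian-vector product via finite differences: J v ~ (F(u + eps v) - F(u)) / eps
              Jv = LinearOperator(
                  (u.size, u.size),
                  matvec=lambda v: (residual(u + eps * v) - r) / eps,
              )
              du, _ = gmres(Jv, -r)   # Krylov solve for the Newton update, no explicit Jacobian
              u = u + du
          return u

      print(jfnk_solve(np.linspace(0.0, 1.0, 50)))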

  10. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
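
    The hybrid pattern the study compares, message passing between processes and shared-memory threading within each process, looks roughly like the sketch below, written with mpi4py and a thread pool rather than the authors' MPI/OpenMP cardiac code; Python's GIL limits the real threading gain here, so this only illustrates the structure. The script name in the comment is hypothetical.

      # Hybrid parallelism sketch: MPI between processes, threads within each process.
      # Run with, e.g.:  mpiexec -n 4 python hybrid_demo.py
      from concurrent.futures import ThreadPoolExecutor

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Each MPI rank owns a slab of "cells"; threads update chunks of that slab.
      local = np.random.default_rng(rank).random(1_000_000)

      def update(chunk):
          # Stand-in for an expensive per-cell membrane-model evaluation.
          return np.sum(np.sqrt(chunk))

      with ThreadPoolExecutor(max_workers=4) as pool:
          chunks = np.array_split(local, 4)
          local_sum = sum(pool.map(update, chunks))

      total = comm.allreduce(local_sum, op=MPI.SUM)   # distributed-memory reduction
      if rank == 0:
          print("global result:", total)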

  11. ALFIL: A Crowd Simulation Serious Game for Massive Evacuation Training and Awareness

    ERIC Educational Resources Information Center

    García-García, César; Fernández-Robles, José Luis; Larios-Rosillo, Victor; Luga, Hervé

    2012-01-01

    This article presents the current development of a serious game for the simulation of massive evacuations. The purpose of this project is to promote self-protection through awareness of the procedures and different possible scenarios during the evacuation of a massive event. Sophisticated behaviors require massive computational power and it has…

  12. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  13. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU, multicore and GPU and space adaptivity, multicore and GPU and space adaptivity and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, i.e., complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
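
    One ingredient of the space-time adaptive algorithm, adaptive time stepping, can be sketched by step doubling on a single-cell model: a full step is compared against two half steps, and the step size is grown or shrunk to hold the local error near a tolerance. The FitzHugh-Nagumo surrogate and the thresholds below are illustrative assumptions, not the paper's ionic model or GPU kernels.

      import numpy as np

      def rhs(y):
          # Simple excitable-cell surrogate (FitzHugh-Nagumo), not the paper's ionic model.
          v, w = y
          return np.array([v - v**3 / 3 - w + 0.5, 0.08 * (v + 0.7 - 0.8 * w)])

      def euler(y, dt):
          return y + dt * rhs(y)

      def adaptive_step(y, dt, tol=1e-4):
          """Step doubling: compare one full step against two half steps and
          grow/shrink dt to keep the local error near tol."""
          big = euler(y, dt)
          small = euler(euler(y, dt / 2), dt / 2)
          err = np.max(np.abs(big - small))
          if err > tol:
              return y, dt / 2, False               # reject and retry with a smaller step
          dt_next = min(dt * 1.5, 0.5) if err < tol / 4 else dt
          return small, dt_next, True

      t, y, dt = 0.0, np.array([-1.0, 1.0]), 0.01
      while t < 100.0:
          y_new, dt_next, accepted = adaptive_step(y, dt)
          if accepted:
              y, t = y_new, t + dt                  # advance by the step actually taken
          dt = dt_next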

  14. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.
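
    The discrete-event paradigm the paper relies on reduces to a time-ordered event queue: pop the earliest event, process it, and possibly schedule new events. The sketch below models packets contending for a single output port with a fixed switching delay; OMNEST is a full modeling framework, so this only shows the underlying loop, with invented timing values.

      import heapq
      import random

      def run_des(n_packets=20, switch_delay=2e-6, seed=1):
          """Minimal discrete-event loop: pop the earliest event, process it,
          possibly schedule new events. Times are in seconds (illustrative)."""
          rng = random.Random(seed)
          events = []                  # heap of (time, sequence, kind, packet_id)
          seq = 0
          t = 0.0
          for pid in range(n_packets):
              t += rng.expovariate(1e5)                        # Poisson arrivals, ~100k pkt/s
              heapq.heappush(events, (t, seq, "arrival", pid)); seq += 1

          port_free_at = 0.0
          departures = {}
          while events:
              now, _, kind, pid = heapq.heappop(events)
              if kind == "arrival":
                  start = max(now, port_free_at)               # queue if the output port is busy
                  port_free_at = start + switch_delay
                  heapq.heappush(events, (port_free_at, seq, "departure", pid)); seq += 1
              else:
                  departures[pid] = now
          return departures

      print(run_des())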

  15. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  16. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, the need for sophisticated computer analytics is required to keep track of the number and complex interplay among the molecular abnormalities. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  17. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  18. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  19. Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism

    EPA Science Inventory

    A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...

  20. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  1. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.

  2. Simulating motivated cognition

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach from the perspective of information processing in the human brain is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.

  3. Teaching Strategies for Using Projected Images to Develop Conceptual Understanding: Exploring Discussion Practices in Computer Simulation and Static Image-Based Lessons

    ERIC Educational Resources Information Center

    Price, Norman T.

    2013-01-01

    The availability and sophistication of visual display images, such as simulations, for use in science classrooms have increased exponentially; however, it can be difficult for teachers to use these images to encourage and engage active student thinking. There is a need to describe flexible discussion strategies that use visual media to engage active…

  4. The Matter Simulation (R)evolution

    PubMed Central

    2018-01-01

    To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime. PMID:29532014

  5. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
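
    The biologically inspired MSEIR part of the proposed model tracks hosts through immune (M), susceptible (S), exposed (E), infectious (I), and recovered (R) compartments. A plain Euler integration of those compartments is sketched below with placeholder rate constants; the jump-diffusion component the authors couple to it is omitted.

      import numpy as np

      def mseir_step(state, dt, delta=0.05, beta=0.6, sigma=0.2, gamma=0.1):
          """One Euler step of an MSEIR compartment model (fractions of the host
          population). Rate constants are placeholders, not fitted botnet data."""
          M, S, E, I, R = state
          N = M + S + E + I + R
          dM = -delta * M                        # loss of "passive immunity" (e.g. stale defenses)
          dS = delta * M - beta * S * I / N      # susceptible hosts contacted by infectious ones
          dE = beta * S * I / N - sigma * E      # compromised but not yet active bots
          dI = sigma * E - gamma * I             # active bots
          dR = gamma * I                         # cleaned or hardened hosts
          return state + dt * np.array([dM, dS, dE, dI, dR])

      state = np.array([0.10, 0.89, 0.0, 0.01, 0.0])
      for _ in range(2000):
          state = mseir_step(state, dt=0.1)
      print(dict(zip("MSEIR", state.round(3))))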

  6. The design and implementation of CRT displays in the TCV real-time simulation

    NASA Technical Reports Server (NTRS)

    Leavitt, J. B.; Tariq, S. I.; Steinmetz, G. G.

    1975-01-01

    The design and application of computer graphics to the Terminal Configured Vehicle (TCV) program were described. A Boeing 737-100 series aircraft was modified with a second flight deck and several computers installed in the passenger cabin. One of the elements in support of the TCV program is a sophisticated simulation system developed to duplicate the operation of the aft flight deck. This facility consists of an aft flight deck simulator, equipped with realistic flight instrumentation, a CDC 6600 computer, and an Adage graphics terminal; this terminal presents to the simulator pilot displays similar to those used on the aircraft with equivalent man-machine interactions. These two displays form the primary flight instrumentation for the pilot and are dynamic images depicting critical flight information. The graphics terminal is a high speed interactive refresh-type graphics system. To support the cockpit display, two remote CRT's were wired in parallel with two of the Adage scopes.

  7. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include model parameters and high resolution input data together with the availability of powerful optimization and uncertainty analysis algorithms has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  8. The RCM: A Resource Management and Program Budgeting Approach for State and Local Educational Agencies.

    ERIC Educational Resources Information Center

    Chambers, Jay G.; Parrish, Thomas B.

    The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…

  9. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straight-forward, largely automatic, type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.

  10. Development and application of numerical techniques for general-relativistic magnetohydrodynamics simulations of black hole accretion

    NASA Astrophysics Data System (ADS)

    White, Christopher Joseph

    We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.

  11. Application of foam-extend on turbulent fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Rege, K.; Hjertager, B. H.

    2017-12-01

    Turbulent flow around flexible structures is likely to induce structural vibrations which may eventually lead to fatigue failure. In order to assess the fatigue life of these structures, it is necessary to take into account not only the action of the flow on the structure but also the influence of the vibrating structure on the fluid flow. This is achieved by performing fluid-structure interaction (FSI) simulations. In this work, we have investigated the capability of an FSI toolkit for the finite volume computational fluid dynamics software foam-extend to simulate turbulence-induced vibrations of a flexible structure. A large-eddy simulation (LES) turbulence model has been applied to a basic FSI problem of a flexible wall which is placed in a confined, turbulent flow. This problem was simulated for 2.32 seconds. This short simulation required over 200 computation hours, using 20 processor cores. This shows that the simulation of FSI with LES is possible, but also computationally demanding. In order to make turbulent FSI simulations with foam-extend more applicable, more sophisticated turbulence models and/or faster FSI iteration schemes should be applied.

  12. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  13. A description of the thruster attitude control simulation and its application to the HEAO-C study

    NASA Technical Reports Server (NTRS)

    Brandon, L. B.

    1971-01-01

    During the design and evaluation of a reaction control system (RCS), it is desirable to have a digital computer program simulating vehicle dynamics, disturbance torques, control torques, and RCS logic. The thruster attitude control simulation (TACS) is just such a computer program. The TACS is a relatively sophisticated digital computer program that includes all the major parameters involved in the attitude control of a vehicle using an RCS for control. It includes the effects of gravity gradient torques and HEAO-C aerodynamic torques so that realistic runs can be made in the areas of fuel consumption and engine actuation rates. Also, the program is general enough that any engine configuration and logic scheme can be implemented in a reasonable amount of time. The results of the application of the TACS in the HEAO-C study are included.

  14. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the dynamic, acoustic, high-pressure, high-rotational-speed, and other load components, using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.

  15. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low speed flow. This paper is intended to review some of the successes made possible by advances in computational technologies during the same period, and discuss some of the current challenges.

  16. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  17. Natural three-qubit interactions in one-way quantum computing

    NASA Astrophysics Data System (ADS)

    Tame, M. S.; Paternostro, M.; Kim, M. S.; Vedral, V.

    2006-02-01

    We address the effects of natural three-qubit interactions on the computational power of one-way quantum computation. A benefit of using more sophisticated entanglement structures is the ability to construct compact and economic simulations of quantum algorithms with limited resources. We show that the features of our study are embodied by suitably prepared optical lattices, where effective three-spin interactions have been theoretically demonstrated. We use this to provide a compact construction for the Toffoli gate. Information flow and two-qubit interactions are also outlined, together with a brief analysis of relevant sources of imperfection.
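
    To make the target of the "compact construction" concrete, the Toffoli (CCNOT) gate acts on the three-qubit computational basis as the permutation matrix below. This is the standard gate definition, not the paper's cluster-state construction in an optical lattice.

      import numpy as np

      # Toffoli (CCNOT): flips the target qubit iff both control qubits are |1>.
      toffoli = np.eye(8)
      toffoli[[6, 7], [6, 7]] = 0
      toffoli[6, 7] = toffoli[7, 6] = 1

      # |110> -> |111>
      state = np.zeros(8); state[0b110] = 1.0
      print(np.nonzero(toffoli @ state)[0])   # [7], i.e. the basis state |111>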

  18. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game-engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game-engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  19. An expert system for municipal solid waste management simulation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, M.C.; Chang, N.B.

    1996-12-31

    Optimization techniques were usually used to model the complicated metropolitan solid waste management system to search for the best dynamic combination of waste recycling, facility siting, and system operation, where sophisticated and well-defined interrelationships are required in the modeling process. This paper, in contrast, applied the Concurrent Object-Oriented Simulation (COOS), a new simulation software construction method, to bridge the gap between the physical system and its computer representation. A case study of the Kaohsiung solid waste management system in Taiwan is presented to illustrate the analytical methodology of COOS and its implementation in the creation of an expert system.

  20. Classical and all-floating FETI methods for the simulation of arterial tissues

    PubMed Central

    Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf

    2015-01-01

    High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in the understanding of the function of cardiovascular components in health and disease. However, the computational effort to handle fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility to derive a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery which is – as most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has not only advantages concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case where we apply a well-known anisotropic model to the realistic geometry of an artery. Although the FETI methods have a great applicability on artery simulations we will also discuss some limitations concerning the dependence on material parameters. PMID:26751957

  1. Simulation Of Seawater Intrusion With 2D And 3D Models: Nauru Island Case Study

    NASA Astrophysics Data System (ADS)

    Ghassemi, F.; Jakeman, A. J.; Jacobson, G.; Howard, K. W. F.

    1996-03-01

    With the advent of large computing capacities during the past few decades, sophisticated models have been developed for the simulation of seawater intrusion in coastal and island aquifers. Currently, several models are commercially available for the simulation of this problem. This paper describes the mathematical basis and application of the SUTRA and HST3D models to simulate seawater intrusion in Nauru Island, in the central Pacific Ocean. A comparison of the performance and limitations of these two models in simulating a real problem indicates that three-dimensional simulation of seawater intrusion with the HST3D model has the major advantage of being able to specify natural boundary conditions as well as pumping stresses. However, HST3D requires a small grid size and short time steps in order to maintain numerical stability and accuracy. These requirements lead to solution of a large set of linear equations that requires the availability of powerful computing facilities in terms of memory and computing speed. Combined results of the two simulation models indicate a safe pumping rate of 400 m3/d for the aquifer on Nauru Island, where additional fresh water is presently needed for the rehabilitation of mined-out land.

  2. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain a documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision in expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e., the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automated updating of significant portions of the documentation for better software change control.

  3. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  4. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    PubMed

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops. This makes the program behave flexibly to the user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given, where the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who only program occasionally, there is the possibility of building the computer simulation, together with the flexible interactive simulation algorithm for clinical pharmacological teaching in the field of PK/PD models.
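
    The LabVIEW program itself is graphical, but the underlying 2-compartment PK model it visualizes can be written as a small ODE system. A sketch with SciPy follows; the rate constants and oral-dosing structure are illustrative assumptions, not values taken from the article or its morphine model.

      import numpy as np
      from scipy.integrate import odeint

      def two_compartment(y, t, ka, k12, k21, ke):
          """Oral dose: gut -> central <-> peripheral, with elimination from the
          central compartment. Rate constants are illustrative, not drug-specific."""
          gut, central, peripheral = y
          dgut = -ka * gut
          dcentral = ka * gut - (k12 + ke) * central + k21 * peripheral
          dperipheral = k12 * central - k21 * peripheral
          return [dgut, dcentral, dperipheral]

      t = np.linspace(0, 24, 200)                        # hours
      y = odeint(two_compartment, [100.0, 0.0, 0.0], t,  # 100 mg dose starting in the gut
                 args=(1.0, 0.3, 0.2, 0.15))
      print("peak central-compartment amount:", y[:, 1].max().round(1), "mg")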

  5. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED... STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  6. Computational Modeling of Cultural Dimensions in Adversary Organizations

    DTIC Science & Technology

    2010-01-01

    Nodes”, In the Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, 1993. [8] Pearl, J. Probabilistic Reasoning in...the artificial life simulations; in contrast, models with only a few agents typically employ quite sophisticated cognitive agents capable of...decisions as to how to allocate scarce ISR assets (two Unmanned Air Systems, UAS) among the two Red activities while at the same

  7. The "Virtual ChemLab" Project: A Realistic and Sophisticated Simulation of Organic Synthesis and Organic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard

    2005-01-01

    A set of sophisticated and realistic laboratory simulations is created for use in freshman- and sophomore-level chemistry classes and laboratories called 'Virtual ChemLab'. A detailed assessment of student responses is provided and the simulation's pedagogical utility is described using the organic simulation.

  8. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, together with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models that reflect the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of the unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the "matlab-simulink" environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors), the central nervous system, and executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the operator's information load level in ergatic systems is proposed; it allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.

  9. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
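
    The basic CLS pattern, a deterministic load expression whose coefficients are treated as random variables and sampled by Monte Carlo, can be sketched as below. The load expression, distributions, and numbers are invented for illustration; the actual CLS code draws its load information from its knowledge base.

      import numpy as np

      rng = np.random.default_rng(42)

      def turbine_blade_load(speed, pressure, c1, c2):
          # Stand-in deterministic load model: centrifugal plus pressure terms.
          return c1 * speed**2 + c2 * pressure

      # Coefficients and operating conditions as random variables (illustrative values).
      n = 100_000
      c1 = rng.normal(3.0e-4, 2.0e-5, n)
      c2 = rng.normal(1.2, 0.1, n)
      speed = rng.normal(3.5e4, 1.0e3, n)               # rpm
      pressure = rng.lognormal(np.log(30.0), 0.08, n)   # MPa

      loads = turbine_blade_load(speed, pressure, c1, c2)
      print("mean load:", loads.mean().round(1),
            " 99.9th percentile:", np.percentile(loads, 99.9).round(1))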

  10. Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems

    NASA Astrophysics Data System (ADS)

    Dogan, Firat; Atilgan, Yasemin

    Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The previous decade's specialized software and sophisticated supercomputer environments now serve individual needs inside “tiny smart boxes” for reasonable prices. However, resistance to learning new computerized environments, insufficient training, and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, all aspects of former and current developments in surgery planning and simulation related tools are presented, and future directions and expectations for better electronic health care systems are investigated.

  11. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet insure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  12. Computer simulation studies of the growth of strained layers by molecular-beam epitaxy

    NASA Astrophysics Data System (ADS)

    Faux, D. A.; Gaynor, G.; Carson, C. L.; Hall, C. K.; Bernholc, J.

    1990-08-01

    Two new types of discrete-space Monte Carlo computer simulation are presented for the modeling of the early stages of strained-layer growth by molecular-beam epitaxy. The simulations are more economical on computer resources than continuous-space Monte Carlo or molecular dynamics. Each model is applied to the study of growth onto a substrate in two dimensions with use of Lennard-Jones interatomic potentials. Up to seven layers are deposited for a variety of lattice mismatches, temperatures, and growth rates. Both simulations give similar results. At small lattice mismatches (≲4%) the growth is in registry with the substrate, while at high mismatches (≳6%) the growth is incommensurate with the substrate. At intermediate mismatches, a transition from registered to incommensurate growth is observed which commences at the top of the crystal and propagates down to the first layer. Faster growth rates are seen to inhibit this transition. The growth mode is van der Merwe (layer-by-layer) at 2% lattice mismatch, but at larger mismatches Volmer-Weber (island) growth is preferred. The Monte Carlo simulations are assessed in the light of these results and the ease with which they can be extended to three dimensions and to more sophisticated potentials is discussed.
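
    Two core ingredients of such Monte Carlo growth simulations, the Lennard-Jones pair energy and the Metropolis acceptance test for a trial move, are sketched below. The lattice bookkeeping, deposition events, and mismatch handling of the two models described in the paper are omitted, and the numbers are illustrative.

      import math
      import random

      def lj(r, epsilon=1.0, sigma=1.0):
          """Lennard-Jones pair potential."""
          sr6 = (sigma / r) ** 6
          return 4.0 * epsilon * (sr6 * sr6 - sr6)

      def metropolis_accept(dE, kT):
          """Standard Metropolis criterion used to decide whether a trial move
          (e.g. hopping an adatom to a neighbouring site) is taken."""
          return dE <= 0.0 or random.random() < math.exp(-dE / kT)

      # Energy cost of stretching a bond from the LJ minimum (~1.12 sigma) to 1.5 sigma
      dE = lj(1.5) - lj(2.0 ** (1 / 6))
      accepted = sum(metropolis_accept(dE, kT=0.2) for _ in range(10_000))
      print(f"dE = {dE:.2f} eps, acceptance rate ~ {accepted / 10_000:.3f}")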

  13. Development and Application of a Numerical Framework for Improving Building Foundation Heat Transfer Calculations

    NASA Astrophysics Data System (ADS)

    Kruis, Nathanael J. F.

    Heat transfer from building foundations varies significantly in all three spatial dimensions and has important dynamic effects at all timescales, from one hour to several years. With the additional consideration of moisture transport, ground freezing, evapotranspiration, and other physical phenomena, the estimation of foundation heat transfer becomes increasingly sophisticated and computationally intensive to the point where accuracy must be compromised for reasonable computation time. The tools currently available to calculate foundation heat transfer are often either too limited in their capabilities to draw meaningful conclusions or too sophisticated to use in common practice. This work presents Kiva, a new foundation heat transfer computational framework. Kiva provides a flexible environment for testing different numerical schemes, initialization methods, spatial and temporal discretizations, and geometric approximations. Comparisons within this framework provide insight into the balance of computation speed and accuracy relative to highly detailed reference solutions. The accuracy and computational performance of six finite difference numerical schemes are verified against established IEA BESTEST test cases for slab-on-grade heat conduction. Of the schemes tested, the Alternating Direction Implicit (ADI) scheme demonstrates the best balance between accuracy, performance, and numerical stability. Kiva features four approaches to initializing soil temperatures for an annual simulation. A new accelerated initialization approach is shown to significantly reduce the required years of presimulation. Methods of approximating three-dimensional heat transfer within a representative two-dimensional context further improve computational performance. A new approximation called the boundary layer adjustment method is shown to improve accuracy over other established methods with a negligible increase in computation time. This method accounts for the reduced heat transfer from concave foundation shapes, which has not been adequately addressed to date. Within the Kiva framework, three-dimensional heat transfer that can require several days to simulate is approximated in two dimensions in a matter of seconds while maintaining a mean absolute deviation within 3%.
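
    For orientation, the snippet below sketches the simplest member of the family of schemes such a framework compares: an explicit (FTCS) finite-difference update of two-dimensional ground heat conduction with crude Dirichlet boundaries. It is a hedged illustration only; the grid, properties, and boundary handling are assumptions and do not reproduce Kiva or its ADI solver.

        import numpy as np

        def ftcs_step(T, dx, dt, alpha, T_slab, T_deep):
            """One explicit (FTCS) step of 2-D conduction dT/dt = alpha * laplacian(T).

            T      : 2-D array of soil temperatures [K]
            alpha  : thermal diffusivity [m^2/s]
            T_slab : fixed temperature imposed along the top row (under-slab boundary)
            T_deep : fixed deep-ground temperature imposed along the bottom row
            Stability of this explicit scheme requires dt <= dx**2 / (4 * alpha).
            """
            lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                   np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
            T_new = T + alpha * dt * lap
            # Re-impose the simple Dirichlet boundaries used in this sketch.
            T_new[0, :] = T_slab
            T_new[-1, :] = T_deep
            T_new[:, 0] = T[:, 0]      # side columns held at their previous values (crude)
            T_new[:, -1] = T[:, -1]
            return T_new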

  14. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  15. An analytical study of electric vehicle handling dynamics

    NASA Technical Reports Server (NTRS)

    Greene, J. E.; Segal, D. J.

    1979-01-01

    Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EV's were defined for various battery and powertrain packaging approaches applied to a range of weight distribution and inertial properties which characterize a generic class of EV's. Computer simulations of structured maneuvers were performed for predicting handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.

  16. Towards constructing multi-bit binary adder based on Belousov-Zhabotinsky reaction

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Mao; Wong, Ieong; Chou, Meng-Ta; Zhao, Xin

    2012-04-01

    It has been proposed that spatially extended excitable media can perform a wide range of computational operations, from image processing, to path planning, to logical and arithmetic computations. Experimental realizations of chemical logical and arithmetic computation have so far been mainly concerned with single, simple logical functions. In this study, based on the Belousov-Zhabotinsky reaction, we performed simulations toward the realization of a more complex operation, the binary adder. Combining some of the existing functional structures that have been verified experimentally, we designed a planar geometrical binary adder chemical device. Through numerical simulations, we first demonstrated that the device can implement the function of a single-bit full binary adder. We then show that the binary adder units can be further extended in the plane and coupled together to realize a two-bit, or even multi-bit, binary adder. The realization of chemical adders can guide the construction of other sophisticated arithmetic functions, ultimately leading to the implementation of a chemical computer and other intelligent systems.
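
    The target logic of such a device is the ordinary Boolean full adder, which can serve as a software reference for the truth table the simulated chemical gates must reproduce. The sketch below is that generic reference logic, not the reaction-diffusion model used in the paper.

        def full_adder(a, b, cin):
            """Single-bit full adder: returns (sum, carry_out)."""
            s = a ^ b ^ cin
            cout = (a & b) | (cin & (a ^ b))
            return s, cout

        def ripple_adder(a_bits, b_bits):
            """Multi-bit adder built by chaining full-adder units, least significant bit first."""
            carry, out = 0, []
            for a, b in zip(a_bits, b_bits):
                s, carry = full_adder(a, b, carry)
                out.append(s)
            return out + [carry]

        # Example: 0b11 + 0b01 = 0b100 (bits listed LSB first)
        print(ripple_adder([1, 1], [1, 0]))   # -> [0, 0, 1]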

  17. Recent advances in modeling languages for pathway maps and computable biological networks.

    PubMed

    Slater, Ted

    2014-02-01

    As our theories of systems biology grow more sophisticated, the models we use to represent them become larger and more complex. Languages used to represent these models must have the expressivity and flexibility required to support high-resolution annotation, and must provide for simulation and analysis that are sophisticated enough to allow researchers to master their data in the proper context. These languages also need to facilitate model sharing and collaboration, which is currently best done by using uniform data structures (such as graphs) and language standards. In this brief review, we discuss three of the most recent systems biology modeling languages to appear: BEL, PySB and BCML, and examine how they meet these needs. Copyright © 2014 Elsevier Ltd. All rights reserved.
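
    Independently of the specific syntax of BEL, PySB or BCML, the shared idea of a uniform graph data structure for pathway models can be sketched with a generic directed graph. The nodes, edge annotations, and the use of the networkx package below are illustrative assumptions, not the syntax of any of the reviewed languages.

        import networkx as nx

        # Illustrative pathway fragment: nodes are biological entities, edges carry
        # the relationship type as an annotation (names are invented for the example).
        pathway = nx.DiGraph()
        pathway.add_edge("EGF", "EGFR", relation="activates")
        pathway.add_edge("EGFR", "RAS", relation="activates")
        pathway.add_edge("RAS", "ERK", relation="activates")
        pathway.add_edge("DUSP6", "ERK", relation="inhibits")

        # A uniform graph structure makes simple analyses and model sharing straightforward.
        for src, dst, data in pathway.edges(data=True):
            print(f"{src} -[{data['relation']}]-> {dst}")
        print("Upstream regulators of ERK:", list(pathway.predecessors("ERK")))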

  18. Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2014-01-01

    Coarse-grained (CG) modeling is a well-established simulation approach for gaining insight into long-timescale protein folding events at reasonable computational cost. Depending on the design of a CG model, simulation protocols range from highly case-specific ones, which require user-defined assumptions about the folding scenario, to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe a framework protocol for simulating the long-term dynamics of globular proteins using the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated against experimental data for protein folding model systems; the predictions agreed well with the experimental results.

  19. Design of a monitor and simulation terminal (master) for space station telerobotics and telescience

    NASA Technical Reports Server (NTRS)

    Lopez, L.; Konkel, C.; Harmon, P.; King, S.

    1989-01-01

    Based on Space Station and planetary spacecraft communication time delays and bandwidth limitations, it will be necessary to develop an intelligent, general purpose ground monitor terminal capable of sophisticated data display and control of on-orbit facilities and remote spacecraft. The basic elements that make up a Monitor and Simulation Terminal (MASTER) include computer overlay video, data compression, forward simulation, mission resource optimization and high level robotic control. Hardware and software elements of a MASTER are being assembled for testbed use. Applications of Neural Networks (NNs) to some key functions of a MASTER are also discussed. These functions are overlay graphics adjustment, object correlation and kinematic-dynamic characterization of the manipulator.

  20. MorphoHawk: Geometric-based Software for Manufacturing and More

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Arterburn

    2001-04-01

    Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery is already the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences, creating an unparalleled accumulation of data for associating images with life experiences, emotions, and knowledge. Computers are powerful, rapid, and tireless, but still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting ‘what we see with what we know.’

  1. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread on a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency since this is meant to be an engineering code. Also, the use of sophisticated turbulence models was not pursued (a simple Smagorinsky type model can be implemented if deemed appropriate) because, if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) is presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  2. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is now used in a wide range of fields spanning human and social studies, computer games, films, and interactive multimedia. Authoring complex and subtle facial expressions remains challenging and fraught with problems, and most animations produced with general-purpose computer animation techniques are limited in both production quality and quantity. Although computing power, facial understanding, software sophistication, and new face-centric methods continue to grow, many of the emerging approaches are still immature. This paper therefore surveys and categorizes current and emerging facial animation work to define the recent state of the field, its observed bottlenecks, and developing techniques. It further presents a real-time simulation model of human worry and howling, with detailed discussion of the perception of astonishment, sorrow, annoyance, and panic.

  3. Preliminary performance analysis of an interplanetary navigation system using asteroid based beacons

    NASA Technical Reports Server (NTRS)

    Jee, J. Rodney; Khatib, Ahmad R.; Muellerschoen, Ronald J.; Williams, Bobby G.; Vincent, Mark A.

    1988-01-01

    A futuristic interplanetary navigation system using transmitters placed on selected asteroids is introduced. This network of space beacons is seen as a needed alternative to the overly burdened Deep Space Network. Covariance analyses on the potential performance of these space beacons located on a candidate constellation of eight real asteroids are initiated. Simplified analytic calculations are performed to determine limiting accuracies attainable with the network for geometric positioning. More sophisticated computer simulations are also performed to determine potential accuracies using long arcs of range and Doppler data from the beacons. The results from these computations show promise for this navigation system.

  4. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues.

    PubMed

    Farisco, Michele; Kotaleski, Jeanette H; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain's operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs.

  5. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues

    PubMed Central

    Farisco, Michele; Kotaleski, Jeanette H.; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain’s operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs. PMID:29740372

  6. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large, specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should serve as a tool for designing better aerospace vehicles while reducing development costs, by running Navier-Stokes solution algorithms and by allowing less sophisticated but still complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; the computer should therefore handle many remote terminals efficiently. The capability of transferring data to and from other computers also needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  7. Economic analysis of linking operating room scheduling and hospital material management information systems for just-in-time inventory control.

    PubMed

    Epstein, R H; Dexter, F

    2000-08-01

    Operating room (OR) scheduling information systems can decrease perioperative labor costs. Material management information systems can decrease perioperative inventory costs. We used computer simulation to investigate whether using the OR schedule to trigger purchasing of perioperative supplies is likely to further decrease perioperative inventory costs, as compared with using sophisticated, stand-alone material management inventory control. Although we designed the simulations to favor financially linking the information systems, we found that this strategy would be expected to decrease inventory costs substantively only for items of high price ($1000 each) and volume (>1000 used each year). Because expensive items typically have different models and sizes, each of which is used by a hospital less often than this, for almost all items there will be no benefit to making daily adjustments to the order volume based on booked cases. We conclude that, in a hospital with a sophisticated material management information system, OR managers will probably achieve greater cost reductions from focusing on negotiating less expensive purchase prices for items than on trying to link the OR information system with the hospital's material management information system to achieve just-in-time inventory control.

  8. Study on the application of NASA energy management techniques for control of a terrestrial solar water heating system

    NASA Technical Reports Server (NTRS)

    Swanson, T. D.; Ollendorf, S.

    1979-01-01

    This paper addresses the potential for enhanced solar system performance through sophisticated control of the collector loop flow rate. Computer simulations utilizing the TRNSYS solar energy program were performed to study the relative effect on system performance of eight specific control algorithms. Six of these control algorithms are of the proportional type: two are concave exponentials, two are simple linear functions, and two are convex exponentials. These six functions are typical of what might be expected from future, more advanced, controllers. The other two algorithms are of the on/off type and are thus typical of existing control devices. Results of extensive computer simulations utilizing actual weather data indicate that proportional control does not significantly improve system performance. However, it is shown that thermal stratification in the liquid storage tank may significantly improve performance.
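
    As a rough illustration of the two controller families compared, the sketch below contrasts an on/off (hysteresis) pump controller with a linear proportional controller that scales collector-loop flow with the collector-to-tank temperature difference. The thresholds and gains are invented for the example and are not the TRNSYS control algorithms studied.

        def on_off_flow(dT, flow_max, on_at=8.0, off_at=2.0, running=False):
            """Hysteresis (bang-bang) pump control on the collector-minus-tank dT [K]."""
            if dT >= on_at:
                running = True
            elif dT <= off_at:
                running = False
            return (flow_max if running else 0.0), running

        def proportional_flow(dT, flow_max, dT_full=10.0):
            """Linear proportional control: flow ramps from 0 to flow_max over dT_full kelvin."""
            frac = min(max(dT / dT_full, 0.0), 1.0)
            return frac * flow_max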

  9. Interoperability and complementarity of simulation tools for beamline design in the OASYS environment

    NASA Astrophysics Data System (ADS)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    In the coming years, most of the major synchrotron radiation facilities around the world will upgrade to 4th-generation Diffraction Limited Storage Rings using multi-bend-achromat technology. Moreover, several Free Electron Lasers are ready to operate or nearing completion. These events represent a huge challenge for the optics physicists responsible for designing and calculating optical systems capable of exploiting the revolutionary characteristics of the new photon beams. Reliable and robust beamline design is nowadays based on sophisticated computer simulations that are only possible by combining different simulation tools. The OASYS (OrAnge SYnchrotron Suite) suite drives several simulation tools, providing new mechanisms of interoperability and communication within the same software environment. OASYS has been used successfully in the conceptual design of many beamlines and optical systems for the ESRF and Elettra-Sincrotrone Trieste upgrades. Some examples are presented showing comparisons and benchmarking of simulations against calculated and experimental data.

  10. Knowledge-based computer systems for radiotherapy planning.

    PubMed

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  11. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  12. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  13. [A new concept in digestive surgery: the computer assisted surgical procedure, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1997-11-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system which will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by the means of computer sciences and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second reason is to allow for planning and surgical simulation that could be compared with the detailed flight-plan for a commercial jet pilot. The third lies in the fact that virtual reality is an integrated part of the concept of computer assisted surgical procedure. The project consists of a sophisticated simulator which has to include five requirements: visual fidelity, interactivity, physical properties, physiological properties, sensory input and output. In this report we will describe how to get a realistic 3D model of the liver from bi-dimensional 2D medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection is also described, as are force feedback and real-time interaction.

  14. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  15. Real-time simulation of ultrasound refraction phenomena using ray-trace based wavefront construction method.

    PubMed

    Szostek, Kamil; Piórkowski, Adam

    2016-10-01

    Ultrasound (US) imaging is one of the most popular techniques used in clinical diagnosis, mainly due to lack of adverse effects on patients and the simplicity of US equipment. However, the characteristics of the medium cause US imaging to imprecisely reconstruct examined tissues. The artifacts are the results of wave phenomena, i.e. diffraction or refraction, and should be recognized during examination to avoid misinterpretation of an US image. Currently, US training is based on teaching materials and simulators, and ultrasound simulation has become an active research area in medical computer science. Many US simulators are limited by the complexity of the wave phenomena, which leads to sophisticated, computationally intensive calculations that make it difficult for systems to operate in real time. To achieve the required frame rate, the vast majority of simulators simplify the treatment of wave diffraction and refraction. The following paper proposes a solution for an ultrasound simulator based on methods known in geophysics. To improve simulation quality, a wavefront construction method was adapted which takes into account the refraction phenomena. This technique uses ray tracing and velocity averaging to construct wavefronts in the simulation. Instead of a geological medium, real CT scans are applied. This approach can produce more realistic projections of pathological findings and is also capable of providing real-time simulation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
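
    The refraction step at the heart of such ray-trace wavefront construction can be illustrated with a 2D vector form of Snell's law for acoustics, where the bending is set by the ratio of sound speeds across an interface. The sketch below is a generic version of that step under assumed inputs; the paper's velocity averaging over CT data is not reproduced.

        import numpy as np

        def refract(direction, normal, c1, c2):
            """Refract a unit ray direction at an interface (2-D acoustic Snell's law).

            c1, c2 : sound speeds on the incident and transmitted sides.
            Returns the transmitted unit direction, or None on total internal reflection.
            """
            d = direction / np.linalg.norm(direction)
            n = normal / np.linalg.norm(normal)
            cos_i = -np.dot(d, n)
            if cos_i < 0:            # flip the normal so it opposes the incident ray
                n, cos_i = -n, -cos_i
            eta = c2 / c1            # sin(theta_t) = (c2/c1) * sin(theta_i)
            k = 1.0 - eta**2 * (1.0 - cos_i**2)
            if k < 0.0:
                return None          # total internal reflection
            return eta * d + (eta * cos_i - np.sqrt(k)) * n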

  16. WE-D-303-00: Computational Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) Discuss the developments and applications of computational phantoms; (3) Know the promises and limitations of computational phantoms in solving complex problems.

  17. Development of hardware accelerator for molecular dynamics simulations: a computation board that calculates nonbonded interactions in cooperation with fast multipole method.

    PubMed

    Amisaki, Takashi; Toyoda, Shinjiro; Miyagawa, Hiroh; Kitamura, Kunihiro

    2003-04-15

    Evaluation of long-range Coulombic interactions still represents a bottleneck in the molecular dynamics (MD) simulations of biological macromolecules. Despite the advent of sophisticated fast algorithms, such as the fast multipole method (FMM), accurate simulations still demand a great amount of computation time due to the accuracy/speed trade-off inherently involved in these algorithms. Unless higher order multipole expansions, which are extremely expensive to evaluate, are employed, a large amount of the execution time is still spent in directly calculating particle-particle interactions within the nearby region of each particle. To reduce this execution time for pair interactions, we developed a computation unit (board), called MD-Engine II, that calculates nonbonded pairwise interactions using specially designed hardware. Four custom arithmetic processors and a processor for memory manipulation ("particle processor") are mounted on the computation board. The arithmetic processors are responsible for calculation of the pair interactions. The particle processor plays a central role in realizing efficient cooperation with the FMM. The results of a series of 50-ps MD simulations of a protein-water system (50,764 atoms) indicated that a more stringent setting of accuracy in FMM computation, compared with those previously reported, was required for accurate simulations over long time periods. Such a level of accuracy was efficiently achieved using the cooperative calculations of the FMM and MD-Engine II. On an Alpha 21264 PC, the FMM computation at a moderate but tolerable level of accuracy was accelerated by a factor of 16.0 using three boards. At a high level of accuracy, the cooperative calculation achieved a 22.7-fold acceleration over the corresponding conventional FMM calculation. In the cooperative calculations of the FMM and MD-Engine II, it was possible to achieve more accurate computation at a comparable execution time by incorporating larger nearby regions. Copyright 2003 Wiley Periodicals, Inc. J Comput Chem 24: 582-592, 2003
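
    Conceptually, the near-field work that such a board accelerates is a direct double loop over particle pairs inside a cutoff. The deliberately naive sketch below shows that particle-particle Coulomb term in plain Python (unit-free, with the Coulomb constant set to 1); it is an illustration of the workload, not of the MD-Engine II hardware or the FMM far field.

        import numpy as np

        def direct_coulomb(positions, charges, cutoff):
            """Direct particle-particle Coulomb energy and forces within a cutoff.

            positions : (N, 3) array of coordinates, charges : (N,) array;
            illustrative unit system with the Coulomb constant equal to 1.
            """
            n = len(charges)
            energy = 0.0
            forces = np.zeros_like(positions)
            for i in range(n - 1):
                for j in range(i + 1, n):
                    rij = positions[i] - positions[j]
                    r = np.linalg.norm(rij)
                    if r >= cutoff:
                        continue
                    e = charges[i] * charges[j] / r
                    f = e / r**2 * rij      # force on i from j: q_i q_j r_vec / r^3
                    energy += e
                    forces[i] += f
                    forces[j] -= f
            return energy, forces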

  18. NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual

    NASA Technical Reports Server (NTRS)

    Henninger, R. H. (Editor)

    1975-01-01

    NECAP is a sophisticated building design and energy analysis tool that embodies the latest ASHRAE state-of-the-art techniques for performing thermal load calculations and energy usage predictions. It is a set of six individual computer programs which include: response factor program, data verification program, thermal load analysis program, variable temperature program, system and equipment simulation program, and owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.

  19. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  20. Tools and procedures for visualization of proteins and other biomolecules.

    PubMed

    Pan, Lurong; Aller, Stephen G

    2015-04-01

    Proteins, peptides, and nucleic acids are biomolecules that drive biological processes in living organisms. An enormous amount of structural data for a large number of these biomolecules has been described with atomic precision in the form of structural "snapshots" that are freely available in public repositories. These snapshots can help explain how the biomolecules function, the nature of interactions between multi-molecular complexes, and even how small-molecule drugs can modulate the biomolecules for clinical benefits. Furthermore, these structural snapshots serve as inputs for sophisticated computer simulations to turn the biomolecules into moving, "breathing" molecular machines for understanding their dynamic properties in real-time computer simulations. In order for the researcher to take advantage of such a wealth of structural data, it is necessary to gain competency in the use of computer molecular visualization tools for exploring the structures and visualizing three-dimensional spatial representations. Here, we present protocols for using two common visualization tools--the Web-based Jmol and the stand-alone PyMOL package--as well as a few examples of other popular tools. Copyright © 2015 John Wiley & Sons, Inc.

  1. Toward the design of alkynylimidazole fluorophores: computational and experimental characterization of spectroscopic features in solution and in poly(methyl methacrylate).

    PubMed

    Barone, Vincenzo; Bellina, Fabio; Biczysko, Malgorzata; Bloino, Julien; Fornaro, Teresa; Latouche, Camille; Lessi, Marco; Marianetti, Giulia; Minei, Pierpaolo; Panattoni, Alessandro; Pucci, Andrea

    2015-10-28

    The possibilities offered by organic fluorophores in the preparation of advanced plastic materials have been increased by designing novel alkynylimidazole dyes, featuring different push and pull groups. This new family of fluorescent dyes was synthesized by means of a one-pot sequential bromination-alkynylation of the heteroaromatic core, and their optical properties were investigated in tetrahydrofuran and in poly(methyl methacrylate). An efficient in silico pre-screening scheme was devised, consisting of a step-by-step procedure that simulates electronic spectra with both simple vertical-energy and more sophisticated vibronic approaches. Such an approach was also extended to efficiently simulate one-photon absorption and emission spectra of the dyes in the polymer environment for their potential application in luminescent solar concentrators. Besides the specific applications of this novel material, the integration of computational and experimental techniques reported here provides an efficient protocol that can be applied to make a selection among similar dye candidates, which constitute the essential responsive part of those fluorescent plastic materials.

  2. Do dichromats see colours in this way? Assessing simulation tools without colorimetric measurements.

    PubMed

    Lillo Jover, Julio A; Álvaro Llorente, Leticia; Moreira Villegas, Humberto; Melnikova, Anna

    2016-11-01

    Simulcheck evaluates Colour Simulation Tools (CSTs), which transform colours to mimic those seen by people with colour vision deficiencies. Two CSTs (Variantor and Coblis) were used to determine whether the standard Simulcheck version (direct measurement based, DMB) can be replaced by another (based on RGB values) that does not require sophisticated measurement instruments. Ten normal trichromats performed the two psychophysical tasks included in the Simulcheck method. The Pseudoachromatic Stimuli Identification task provided the h_uv (hue angle) values of the pseudoachromatic stimuli: colours seen as red or green by normal trichromats but as grey by colour-deficient people. The Minimum Achromatic Contrast task was used to compute the L_R (relative luminance) values of the pseudoachromatic stimuli. The Simulcheck DMB version showed that Variantor was accurate in simulating protanopia, but neither Variantor nor Coblis was accurate in simulating deuteranopia. The Simulcheck RGB version provided accurate h_uv values, so this variable can be adequately estimated without a colorimeter, an expensive and uncommon apparatus. In contrast, the inaccuracy of the L_R estimates provided by the Simulcheck RGB version makes it advisable to compute this variable from measurements made with a photometer, a cheap and widely available apparatus.
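
    The hue-angle computation depends on the exact colorimetric pipeline used by Simulcheck, but the relative-luminance side can be illustrated with the standard sRGB-to-relative-luminance conversion (gamma linearization followed by the Rec. 709 weighting). The sketch below is that generic conversion, offered as an assumption-laden stand-in rather than the authors' procedure.

        def srgb_to_linear(c):
            """Undo the sRGB transfer function for one channel value in [0, 1]."""
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        def relative_luminance(r, g, b):
            """Relative luminance Y of an sRGB colour (channels given in 0-255)."""
            rl, gl, bl = (srgb_to_linear(v / 255.0) for v in (r, g, b))
            return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

        # Example: a mid grey versus a saturated red
        print(relative_luminance(128, 128, 128))   # ~0.22
        print(relative_luminance(255, 0, 0))       # ~0.21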

  3. Forecasting techno-social systems: how physics and computing help to fight off global pandemics

    NASA Astrophysics Data System (ADS)

    Vespignani, Alessandro

    2010-03-01

    The crucial issue when planning adequate public health interventions to mitigate the spread and impact of epidemics is risk evaluation and forecast. This amounts to anticipating where, when, and how strongly an epidemic will strike. In the last decade, advances in computer performance, data acquisition, statistical physics, and complex networks theory have allowed the generation of sophisticated simulations on supercomputer infrastructures to anticipate the spreading pattern of a pandemic. For the first time we are in a position to generate real-time forecasts of epidemic spreading. I will review the history of the current H1N1 pandemic, the major roadblocks the community has faced in its containment and mitigation, and how physics and computing provide predictive tools that help us to battle epidemics.

  4. Numerical simulation of the modulation transfer function (MTF) in infrared focal plane arrays: simulation methodology and MTF optimization

    NASA Astrophysics Data System (ADS)

    Schuster, J.

    2018-02-01

    Military requirements demand single and dual-color infrared (IR) imaging systems with both high resolution and sharp contrast. To quantify the performance of these imaging systems, a key measure of performance, the modulation transfer function (MTF), describes how well an optical system reproduces an object's contrast in the image plane at different spatial frequencies. At the center of an IR imaging system is the focal plane array (FPA). IR FPAs are hybrid structures consisting of a semiconductor detector pixel array, typically fabricated from HgCdTe, InGaAs or III-V superlattice materials, hybridized with heat/pressure to a silicon read-out integrated circuit (ROIC) with indium bumps on each pixel providing the mechanical and electrical connection. Due to the growing sophistication of the pixel arrays in these FPAs, sophisticated modeling techniques are required to predict, understand, and benchmark the pixel array MTF that contributes to the total imaging system MTF. To model the pixel array MTF, computationally exhaustive 2D and 3D numerical simulation approaches are required to correctly account for complex architectures and effects such as lateral diffusion from the pixel corners. It is paramount to accurately model the lateral diffusion (pixel crosstalk) as it can become the dominant mechanism limiting the detector MTF if not properly mitigated. Once the detector MTF has been simulated, it is directly decomposed into its constituent contributions to reveal exactly what is limiting the total detector MTF, providing a path for optimization. An overview of the MTF will be given and the simulation approach will be discussed in detail, along with how different simulation parameters affect the MTF calculation. Finally, MTF optimization strategies (crosstalk mitigation) will be discussed.
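
    As a point of reference for what a detector-level MTF curve looks like, the sketch below evaluates the geometric MTF of an ideal square pixel aperture, which is a sinc function of spatial frequency, up to the Nyquist frequency for an assumed pixel pitch. This is a textbook-style illustration, not the 2D/3D device simulation described in the abstract.

        import numpy as np

        def pixel_aperture_mtf(pitch_um, fill=1.0, n=200):
            """Geometric MTF of an ideal square pixel aperture.

            pitch_um : pixel pitch in micrometres; fill : linear fill factor (0-1].
            Returns spatial frequencies [cycles/mm] up to Nyquist and the |sinc| MTF.
            """
            width_mm = fill * pitch_um * 1e-3          # active aperture width in mm
            f_nyquist = 1.0 / (2.0 * pitch_um * 1e-3)  # Nyquist frequency in cycles/mm
            f = np.linspace(0.0, f_nyquist, n)
            mtf = np.abs(np.sinc(f * width_mm))        # numpy sinc is sin(pi x)/(pi x)
            return f, mtf

        f, mtf = pixel_aperture_mtf(pitch_um=12.0)
        print(f"MTF at Nyquist: {mtf[-1]:.3f}")        # ~0.64 for a full-fill pixel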

  5. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  6. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    PubMed Central

    2011-01-01

    Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355

  7. Pyramidal neurovision architecture for vision machines

    NASA Astrophysics Data System (ADS)

    Gupta, Madan M.; Knopf, George K.

    1993-08-01

    The vision system employed by an intelligent robot must be active; active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture that is based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than the architecture of the biological visual pathway, it does retain some essential features such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, whereupon each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both the parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
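
    The converging multiresolution idea behind such a pyramidal architecture can be illustrated with a plain image pyramid in which each level is a 2x2 block average of the one below. The sketch below shows only that reduction step and none of the neural processing layers of the proposed system.

        import numpy as np

        def build_pyramid(image, levels=4):
            """Simple image pyramid: each level is a 2x2 block average of the previous one."""
            pyramid = [image.astype(float)]
            for _ in range(levels - 1):
                prev = pyramid[-1]
                h, w = (prev.shape[0] // 2) * 2, (prev.shape[1] // 2) * 2
                prev = prev[:h, :w]                     # crop to even dimensions
                pyramid.append(prev.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
            return pyramid

        # Example: four levels of a random 64x64 "image"
        levels = build_pyramid(np.random.rand(64, 64))
        print([lvl.shape for lvl in levels])            # [(64, 64), (32, 32), (16, 16), (8, 8)]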

  8. Simulation of cold magnetized plasmas with the 3D electromagnetic software CST Microwave Studio®

    NASA Astrophysics Data System (ADS)

    Louche, Fabrice; Křivská, Alena; Messiaen, André; Wauters, Tom

    2017-10-01

    Detailed designs of ICRF antennas were made possible by the development of sophisticated commercial 3D codes like CST Microwave Studio® (MWS). This program allows for very detailed geometries of the radiating structures, but considered only simple materials, such as equivalent isotropic dielectrics, to simulate the reflection and refraction of RF waves at the vacuum/plasma interface. The code was nevertheless used intensively, notably for computing the coupling properties of the ITER ICRF antenna. Until recently it was not possible to simulate gyrotropic media like magnetized plasmas, but recent improvements have allowed programming of any material described by a general dielectric and/or diamagnetic tensor. A Visual Basic macro was developed to exploit this feature and was tested for the specific case of a monochromatic plane wave propagating longitudinally with respect to the magnetic field direction. For specific cases the exact solution can be expressed in 1D as the sum of two circularly polarized waves connected by a reflection coefficient that can be analytically computed. Solutions for stratified media can also be derived. This allows for a direct comparison with MWS results. The agreement is excellent, but accurate simulations for realistic geometries require large memory resources that could significantly restrict the possibility of simulating cold plasmas to small-scale machines.

  9. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula, despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, with the danger that the inherent pitfalls of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) built from carefully chosen elementary ingredients of simulation technology (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume method, and the discontinuous Galerkin method. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
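
    The elementary ingredient such training notebooks typically start from can be written in a few lines: a second-order finite-difference time stepping of the 1D acoustic wave equation. The sketch below mirrors that style under assumed grid and velocity values; it is not taken from any specific seismo-live notebook.

        import numpy as np

        # 1-D acoustic wave equation u_tt = c^2 u_xx, second-order finite differences.
        nx, dx, c = 500, 1.0, 3000.0            # grid points, spacing [m], velocity [m/s]
        dt = 0.8 * dx / c                       # time step respecting the CFL limit
        u_prev, u, u_next = (np.zeros(nx) for _ in range(3))
        u[nx // 2] = 1.0                        # crude initial displacement "source"

        for _ in range(1000):
            lap = np.zeros(nx)
            lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
            u_next = 2.0 * u - u_prev + (c * dt) ** 2 * lap
            u_prev, u = u, u_next               # advance the time levels

        print("max |u| after 1000 steps:", np.abs(u).max())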

  10. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    PubMed

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of the user, who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that an up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of the computer in real time requires development of appropriate hardware and software to connect the medical instrumentarium with the computer and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.

  11. Structure and dynamics of aqueous solutions from PBE-based first-principles molecular dynamics simulations.

    PubMed

    Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y; Schwegler, Eric

    2016-10-21

    Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions due to a good balance between the accuracy, computational expense, and the applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially "correct" for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. To address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na+, K+, and Cl- ions. We show that simulations at 390-400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. Our results suggest that an elevated temperature around 390-400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.
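
    Statements about water structure in such studies usually rest on radial distribution functions computed from the trajectories. The sketch below is a generic single-frame g(r) estimator for a cubic periodic box, assuming coordinates are already available as a NumPy array; it is standard post-processing, not part of the FPMD method itself.

        import numpy as np

        def rdf(positions, box, r_max, n_bins=100):
            """Radial distribution function g(r) for one frame in a cubic periodic box."""
            n = len(positions)
            rho = n / box**3
            bins = np.linspace(0.0, r_max, n_bins + 1)
            hist = np.zeros(n_bins)
            for i in range(n - 1):
                d = positions[i + 1:] - positions[i]
                d -= box * np.round(d / box)              # minimum-image convention
                r = np.linalg.norm(d, axis=1)
                hist += np.histogram(r[r < r_max], bins=bins)[0]
            shell_vol = 4.0 / 3.0 * np.pi * (bins[1:]**3 - bins[:-1]**3)
            ideal = rho * shell_vol * n / 2.0             # expected ideal-gas pair counts
            return 0.5 * (bins[1:] + bins[:-1]), hist / ideal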

  12. 1D-3D hybrid modeling-from multi-compartment models to full resolution models in space and time.

    PubMed

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated, methods-spanning simulation approaches could, depending on the network size, establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) coupled to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D and 3D simulations, we present a geometry, membrane potential, and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc or hoc format, are mapped to full surface and volume representations of the neuron, and computational data from 1D simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data based on general purpose 1D simulators can be directly coupled to the emerging field of fully resolved, highly detailed 3D modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics.
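
    As a rough illustration of the coupling step, the sketch below maps 1D compartment potentials onto the vertices of a 3D surface mesh by nearest-neighbour lookup so that they can serve as boundary values for a 3D solver. The function name, the toy data, and the nearest-neighbour rule are assumptions made for illustration; this is not the authors' mapping framework.

    ```python
    # Toy nearest-neighbour mapping of 1D compartment potentials onto a 3D surface
    # mesh; illustrative of the 1D/3D coupling idea only.
    import numpy as np

    def map_1d_to_3d(compartment_centers, compartment_potentials, surface_vertices):
        """Assign to each surface vertex the potential of the closest 1D compartment."""
        # pairwise distances, shape (n_vertices, n_compartments)
        d = np.linalg.norm(surface_vertices[:, None, :] - compartment_centers[None, :, :], axis=2)
        nearest = np.argmin(d, axis=1)
        return compartment_potentials[nearest]       # Dirichlet values for the 3D solver

    # toy data: three compartments along a "dendrite" and a handful of mesh vertices
    centers = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
    potentials = np.array([-65.0, -50.0, -40.0])     # mV, from one 1D time step
    vertices = np.random.default_rng(0).random((20, 3)) * 10.0
    boundary_values = map_1d_to_3d(centers, potentials, vertices)
    ```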

  13. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system helps users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  14. Development of a Standalone Thermal Wellbore Simulator

    NASA Astrophysics Data System (ADS)

    Xiong, Wanqiang

    With the continuous development of various sophisticated wells in the petroleum industry, wellbore modeling and simulation have received increasing attention. Especially in unconventional oil and gas recovery processes, there is a growing demand for more accurate wellbore modeling. Despite notable advancements in wellbore modeling, none of the existing wellbore simulators has been as successful as reservoir simulators such as Eclipse and CMG's, and further research on issues such as accurate heat-loss modeling and multi-tubing wellbore modeling is needed. A series of mathematical equations, including the main governing equations, auxiliary equations, PVT equations, thermodynamic equations, drift-flux model equations, and wellbore heat-loss calculation equations, are collected and screened from publications. Based on these modeling equations, workflows for wellbore simulation and software development are proposed. Research is conducted on the key steps of developing a wellbore simulator: discretization, the grid system, the solution method, the linear equation solver, and the computer language. A standalone thermal wellbore simulator is developed using standard C++. This wellbore simulator can simulate single-phase injection and production, two-phase steam injection, and two-phase oil and water production. By implementing a multi-part scheme which divides a wellbore with a sophisticated configuration into several relatively simple simulation running units, this simulator can handle different complex wellbores: wellbores with multistage casings, horizontal wells, multilateral wells, and double tubing. In pursuit of improved accuracy of heat-loss calculations to the surrounding formations, a semi-numerical method is proposed and a series of FLUENT simulations have been conducted in this study. This semi-numerical method extends the 2D formation heat-transfer simulation to include the casing wall and cement and adopts new correlations regressed in this study. Meanwhile, a correlation for handling heat transfer in a double-tubing annulus is regressed. This work initiates research on heat transfer in a double-tubing wellbore system. A series of validation and test runs are performed for hot water injection, steam injection, real field data, a horizontal well, and a double-tubing well, as well as a comparison with the Ramey method. The program in this study also performs well in matching measured field data and in simulating horizontal and double-tubing wells.
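
    For context, the classical Ramey-type estimate of radial heat loss from the wellbore to the formation, the baseline that such simulators refine, fits in a few lines. The Python sketch below uses the familiar long-time approximation of the transient heat-conduction time function; the parameter values are illustrative, and the correlations regressed in this study are not reproduced.

    ```python
    # Simplified Ramey-type estimate of radial heat loss per unit wellbore length.
    # Illustrative baseline only; not the correlations developed in this study.
    import math

    def ramey_time_function(alpha, t, r_wb):
        """Long-time approximation of the dimensionless transient heat-conduction function."""
        return math.log(2.0 * math.sqrt(alpha * t) / r_wb) - 0.29

    def heat_loss_per_length(T_fluid, T_formation, U_to, r_to, k_e, f_t):
        """Heat loss per unit length (W/m) through the completion and into the formation."""
        resistance = (1.0 / (2.0 * math.pi)) * (1.0 / (r_to * U_to) + f_t / k_e)
        return (T_fluid - T_formation) / resistance

    # illustrative values for a 30-day hot-water/steam injection period
    alpha = 1.0e-6                      # formation thermal diffusivity, m^2/s
    t = 30.0 * 24.0 * 3600.0            # elapsed injection time, s
    r_wb, r_to = 0.15, 0.05             # wellbore and tubing outer radii, m
    k_e = 2.0                           # formation thermal conductivity, W/(m K)
    U_to = 50.0                         # overall heat-transfer coefficient, W/(m^2 K)

    f_t = ramey_time_function(alpha, t, r_wb)
    q = heat_loss_per_length(250.0, 60.0, U_to, r_to, k_e, f_t)
    print(f"f(t) = {f_t:.2f}, heat loss = {q:.0f} W/m")
    ```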

  15. Need for evaluative methodologies in land use, regional resource and waste management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croke, E. J.

    The transfer of planning methodology from the research community to the practitioner very frequently takes the form of analytical and evaluative techniques and procedures. In the end, these become operational in the form of data acquisition, management and display systems, computational schemes that are codified in the form of manuals and handbooks, and computer simulation models. The complexity of the socioeconomic and physical processes that govern environmental resource and waste management has reinforced the need for computer-assisted, scientifically sophisticated planning models that are fully operational, dependent on an attainable data base, and accessible in terms of the resources normally available to practitioners of regional resource management, waste management, and land use planning. A variety of models and procedures that attempt to meet one or more of the needs of these practitioners are discussed.

  16. Internal force corrections with machine learning for quantum mechanics/molecular mechanics simulations.

    PubMed

    Wu, Jingheng; Shen, Lin; Yang, Weitao

    2017-10-28

    Ab initio quantum mechanics/molecular mechanics (QM/MM) molecular dynamics simulation is a useful tool for calculating thermodynamic properties such as the potential of mean force for chemical reactions, but it is intensely time-consuming. In this paper, we developed a new method using an internal force correction for low-level semiempirical QM/MM molecular dynamics sampling with a predefined reaction coordinate. As a correction term, the internal force was predicted with a machine learning scheme, which provides a sophisticated force field, and was added to the atomic forces on the reaction-coordinate-related atoms at each integration step. We applied this method to two reactions in aqueous solution and reproduced potentials of mean force at the ab initio QM/MM level. The saving in computational cost is about 2 orders of magnitude. The present work reveals great potential for machine learning in QM/MM simulations to study complex chemical processes.
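
    Schematically, the correction enters the integrator as an extra force on the reaction-coordinate atoms. The sketch below shows that idea with a velocity-Verlet step; the descriptor, the placeholder model, and all names are assumptions for illustration and do not reproduce the authors' implementation.

    ```python
    # Sketch of the force-correction idea: at each step a machine-learned model adds a
    # correction to the low-level forces on the reaction-coordinate atoms.
    import numpy as np

    class DummyModel:
        """Stand-in for a trained regressor mapping descriptors to force corrections."""
        def predict(self, features):
            return np.zeros(features.size)

    def total_forces(pos, lowlevel_forces, ml_model, rc_atoms):
        """Low-level QM/MM forces plus ML-predicted corrections on selected atoms."""
        f = lowlevel_forces(pos)                          # inexpensive semiempirical forces
        features = pos[rc_atoms].ravel()                  # toy descriptor: raw coordinates
        f[rc_atoms] += ml_model.predict(features).reshape(len(rc_atoms), 3)
        return f

    def velocity_verlet_step(pos, vel, masses, dt, forces):
        """One velocity-Verlet step driven by the corrected forces."""
        f_old = forces(pos)
        pos = pos + vel * dt + 0.5 * f_old / masses[:, None] * dt ** 2
        vel = vel + 0.5 * (f_old + forces(pos)) / masses[:, None] * dt
        return pos, vel

    # toy usage: two atoms with a harmonic restoring force; the correction is a no-op here
    harmonic = lambda p: -1.0 * p
    pos = np.array([[0.1, 0.0, 0.0], [1.0, 0.0, 0.0]])
    vel, masses = np.zeros_like(pos), np.ones(2)
    corrected = lambda p: total_forces(p, harmonic, DummyModel(), rc_atoms=np.array([0]))
    pos, vel = velocity_verlet_step(pos, vel, masses, 0.01, corrected)
    ```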

  17. Reducing the Time and Cost of Testing Engines

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future where designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations will permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for its use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.

  18. An analog retina model for detecting dim moving objects against a bright moving background

    NASA Technical Reports Server (NTRS)

    Searfus, R. M.; Colvin, M. E.; Eeckman, F. H.; Teeters, J. L.; Axelrod, T. S.

    1991-01-01

    We are interested in applications that require the ability to track a dim target against a bright, moving background. Since the target signal will be less than or comparable to the variations in the background signal intensity, sophisticated techniques must be employed to detect the target. We present an analog retina model that adapts to the motion of the background in order to enhance targets that have a velocity difference with respect to the background. Computer simulation results and our preliminary concept of an analog 'Z' focal plane implementation are also presented.
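
    The underlying principle, enhancing whatever moves differently from the background, can be illustrated digitally with a shift-and-subtract step. The sketch below is a toy version of that principle only; it does not reproduce the analog retina model or the 'Z' focal plane architecture.

    ```python
    # Toy background-motion compensation: shift the previous frame by the estimated
    # background velocity and subtract, so only relative motion survives.
    import numpy as np

    def enhance_moving_target(prev_frame, curr_frame, bg_shift):
        """Suppress a uniformly translating background by shift-and-subtract."""
        dy, dx = bg_shift
        compensated = np.roll(np.roll(prev_frame, dy, axis=0), dx, axis=1)
        return curr_frame - compensated          # residual highlights velocity differences

    rng = np.random.default_rng(0)
    background = rng.normal(100.0, 5.0, size=(64, 64))
    frame0 = background.copy()
    frame1 = np.roll(background, 1, axis=1)      # background drifts one pixel to the right
    frame1[32, 40] += 3.0                        # dim target, below the background variation
    residual = enhance_moving_target(frame0, frame1, bg_shift=(0, 1))
    ```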

  19. Scaling of data communications for an advanced supercomputer network

    NASA Technical Reports Server (NTRS)

    Levin, E.; Eaton, C. K.; Young, Bruce

    1986-01-01

    The goal of NASA's Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations and by remote communication to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. The implications of a projected 20-fold increase in processing power on the data communications requirements are described.

  20. 1D-3D hybrid modeling—from multi-compartment models to full resolution models in space and time

    PubMed Central

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M.; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated, methods-spanning simulation approaches could, depending on the network size, establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) coupled to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D and 3D simulations, we present a geometry, membrane potential, and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc or hoc format, are mapped to full surface and volume representations of the neuron, and computational data from 1D simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data based on general purpose 1D simulators can be directly coupled to the emerging field of fully resolved, highly detailed 3D modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics. PMID:25120463

  1. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
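
    The product-of-conditional-probabilities idea can be sketched for a scalar limit state in standard normal space. The example below is a generic subset simulation sketch with a simplified joint accept/reject step (the original algorithm applies modified Metropolis-Hastings componentwise); it does not include the mapping of Gillespie trajectories to standard normal variables described in the paper.

    ```python
    # Generic subset simulation sketch: estimate the rare probability P(g(U) > b)
    # as a product of conditional probabilities estimated by MCMC.
    import numpy as np

    def subset_simulation(g, dim, b, n=1000, p0=0.1, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.standard_normal((n, dim))
        y = np.array([g(x) for x in u])
        prob = 1.0
        for _ in range(50):                            # safety cap on the number of levels
            thresh = np.quantile(y, 1.0 - p0)          # intermediate threshold
            if thresh >= b:                            # final level reached
                return prob * np.mean(y > b)
            prob *= p0
            seeds = u[y > thresh]                      # samples already in the next subset
            per_chain = int(np.ceil(n / len(seeds)))
            samples, values = [], []
            for s in seeds:                            # grow one Markov chain per seed
                x, gx = s.copy(), g(s)
                for _ in range(per_chain):
                    cand = x + 0.5 * rng.standard_normal(dim)
                    # accept w.r.t. the standard normal density (joint, for brevity)
                    if rng.random() < min(1.0, np.exp(0.5 * (x @ x - cand @ cand))):
                        gc = g(cand)
                        if gc > thresh:                # reject moves that leave the subset
                            x, gx = cand, gc
                    samples.append(x.copy()); values.append(gx)
            u, y = np.array(samples), np.array(values)
        return prob

    # toy check: P(U1 > 4) for a standard normal variable is about 3.2e-5
    estimate = subset_simulation(lambda x: x[0], dim=2, b=4.0, n=2000)
    ```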

  2. Detection of feigned mental disorders on the personality assessment inventory: a discriminant analysis.

    PubMed

    Rogers, R; Sewell, K W; Morey, L C; Ustad, K L

    1996-12-01

    Psychological assessment with multiscale inventories is largely dependent on the honesty and forthrightness of those persons evaluated. We investigated the effectiveness of the Personality Assessment Inventory (PAI) in detecting participants feigning three specific disorders: schizophrenia, major depression, and generalized anxiety disorder. With a simulation design, we tested the PAI validity scales on 166 naive (undergraduates with minimal preparation) and 80 sophisticated (doctoral psychology students with 1 week preparation) participants. We compared their results to persons with the designated disorders: schizophrenia (n = 45), major depression (n = 136), and generalized anxiety disorder (n = 40). Although moderately effective with naive simulators, the validity scales evidenced only modest positive predictive power with their sophisticated counterparts. Therefore, we performed a two-stage discriminant analysis that yielded a moderately high hit rate (> 80%) that was maintained in the cross-validation sample, irrespective of the feigned disorder or the sophistication of the simulators.

  3. GPU-based Space Situational Awareness Simulation utilising Parallelism for Enhanced Multi-sensor Management

    NASA Astrophysics Data System (ADS)

    Hobson, T.; Clarkson, V.

    2012-09-01

    As a result of continual space activity since the 1950s, there are now a large number of man-made Resident Space Objects (RSOs) orbiting the Earth. Because of the large number of items and their relative speeds, the possibility of destructive collisions involving important space assets is now of significant concern to users and operators of space-borne technologies. As a result, a growing number of international agencies are researching methods for improving techniques to maintain Space Situational Awareness (SSA). Computer simulation is a method commonly used by many countries to validate competing methodologies prior to full-scale adoption. The use of supercomputing and/or reduced-scale testing is often necessary to effectively simulate such a complex problem on today's computers. Recently, the authors presented a simulation aimed at reducing the computational burden by selecting the minimum level of fidelity necessary for contrasting methodologies and by utilising multi-core CPU parallelism for increased computational efficiency. The resulting simulation runs on a single PC while maintaining the ability to effectively evaluate competing methodologies. Nonetheless, the ability to control the scale and expand upon the computational demands of the sensor management system is limited. In this paper, we examine the advantages of increasing the parallelism of the simulation by means of General Purpose computing on Graphics Processing Units (GPGPU). As many sub-processes pertaining to SSA management are independent, we demonstrate how parallelisation via GPGPU has the potential to significantly enhance not only research into techniques for maintaining SSA, but also the level of sophistication of existing space surveillance sensors and sensor management systems. Nonetheless, the use of GPGPU imposes certain limitations and adds to the implementation complexity, both of which require consideration to achieve an effective system. We discuss these challenges and how they can be overcome. We further describe an application of the parallelised system where visibility prediction is used to enhance sensor management. This facilitates significant improvement in maximum catalogue error when RSOs become temporarily unobservable. The objective is to demonstrate the enhanced scalability and increased computational capability of the system.

  4. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature geometry and mesh handling external software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates scripting, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
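
    The kinetic Monte Carlo machinery underneath such solvers is the stochastic simulation algorithm; RDME codes extend it with diffusive jump events between mesh voxels. The sketch below is a minimal well-mixed Gillespie direct-method example for a single bimolecular reaction, purely for illustration, and does not use URDME's actual interface.

    ```python
    # Minimal well-mixed Gillespie (direct method) SSA for the reaction A + B -> C.
    import numpy as np

    def ssa_direct(x0, stoich, propensities, t_end, seed=0):
        rng = np.random.default_rng(seed)
        t, x = 0.0, np.array(x0, dtype=float)
        times, states = [t], [x.copy()]
        while t < t_end:
            a = propensities(x)
            a0 = a.sum()
            if a0 <= 0.0:
                break                                   # no reaction can fire anymore
            t += rng.exponential(1.0 / a0)              # time to the next event
            j = rng.choice(len(a), p=a / a0)            # which reaction fires
            x = x + stoich[j]
            times.append(t); states.append(x.copy())
        return np.array(times), np.array(states)

    # species order [A, B, C]; one bimolecular channel A + B -> C with rate constant k
    k = 1e-3
    stoich = np.array([[-1, -1, +1]])
    propensities = lambda x: np.array([k * x[0] * x[1]])
    times, states = ssa_direct([100, 80, 0], stoich, propensities, t_end=50.0)
    ```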

  5. WE-D-303-01: Development and Application of Digital Human Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segars, P.

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) Discuss the developments and applications of computational phantoms; (3) Know the promises and limitations of computational phantoms in solving complex problems.

  6. Multi-core and GPU accelerated simulation of a radial star target imaged with equivalent t-number circular and Gaussian pupils

    NASA Astrophysics Data System (ADS)

    Greynolds, Alan W.

    2013-09-01

    Results from the GelOE optical engineering software are presented for the through-focus, monochromatic coherent and polychromatic incoherent imaging of a radial "star" target for equivalent t-number circular and Gaussian pupils. The FFT-based simulations are carried out using OpenMP threading on a multi-core desktop computer, with and without the aid of a many-core NVIDIA GPU accessing its cuFFT library. It is found that a custom FFT optimized for the 12-core host has similar performance to a simply implemented 256-core GPU FFT. A more sophisticated version of the latter but tuned to reduce overhead on a 448-core GPU is 20 to 28 times faster than a basic FFT implementation running on one CPU core.
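
    The incoherent branch of such a calculation reduces to a pupil-to-PSF FFT followed by an FFT-based convolution with the target. The NumPy sketch below illustrates that pipeline for a defocused circular pupil and a radial star target; the array sizes, defocus value, and target are arbitrary, and it is neither the GelOE implementation nor GPU-accelerated.

    ```python
    # FFT-based incoherent imaging sketch: defocused circular pupil -> PSF -> image.
    import numpy as np

    n = 512
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    R, TH = np.hypot(X, Y), np.arctan2(Y, X)

    star = (np.cos(24 * TH) > 0).astype(float) * (R < 0.9)     # radial "star" target

    rho = R / 0.5                                              # normalized pupil radius
    pupil = (rho <= 1.0).astype(complex)                       # circular aperture
    w020 = 2.0                                                 # defocus, in waves
    pupil *= np.exp(1j * 2.0 * np.pi * w020 * rho ** 2)        # quadratic defocus phase

    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2     # incoherent PSF
    psf /= psf.sum()

    # incoherent image = object intensity convolved with the PSF (circular, via FFTs)
    image = np.real(np.fft.ifft2(np.fft.fft2(star) * np.fft.fft2(np.fft.ifftshift(psf))))
    ```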

  7. Computational dynamics of soft machines

    NASA Astrophysics Data System (ADS)

    Hu, Haiyan; Tian, Qiang; Liu, Cheng

    2017-06-01

    A soft machine is a mechanical system made of soft materials that completes sophisticated missions, such as handling a fragile object or crawling around a narrow tunnel corner, under low-cost control and actuation. Hence, soft machines have raised great challenges for computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimension, with an emphasis on the efficient computation of the nonlinear elastic force vector of the finite elements. The validation of the proposed approaches is given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.

  8. Multi-dimensional Core-Collapse Supernova Simulations with Neutrino Transport

    NASA Astrophysics Data System (ADS)

    Pan, Kuo-Chuan; Liebendörfer, Matthias; Hempel, Matthias; Thielemann, Friedrich-Karl

    We present multi-dimensional core-collapse supernova simulations using the Isotropic Diffusion Source Approximation (IDSA) for the neutrino transport and a modified potential for general relativity in two different supernova codes: FLASH and ELEPHANT. Due to the complexity of the core-collapse supernova explosion mechanism, simulations require not only high-performance computers and the exploitation of GPUs, but also sophisticated approximations to capture the essential microphysics. We demonstrate that the IDSA is an elegant and efficient neutrino radiation transfer scheme, which is portable to multiple hydrodynamics codes and fast enough to investigate long-term evolutions in two and three dimensions. Simulations with a 40 solar mass progenitor are presented in both FLASH (1D and 2D) and ELEPHANT (3D) as an extreme test condition. It is found that the black hole formation time is delayed in multiple dimensions and we argue that the strong standing accretion shock instability before black hole formation will lead to strong gravitational waves.

  9. Deep Part Load Flow Analysis in a Francis Model turbine by means of two-phase unsteady flow simulations

    NASA Astrophysics Data System (ADS)

    Conrad, Philipp; Weber, Wilhelm; Jung, Alexander

    2017-04-01

    Hydropower plants are indispensable for stabilizing the grid by reacting quickly to changes in energy demand. However, an extension of the operating range towards high and deep part load conditions without fatigue of the hydraulic components is desirable to increase their flexibility. In this paper a model-sized Francis turbine at low discharge operating conditions (Q/QBEP = 0.27) is analyzed by means of computational fluid dynamics (CFD). Unsteady two-phase simulations for two Thoma-number conditions are conducted. Stochastic pressure oscillations, observed on the test rig at low discharge, require sophisticated numerical models together with small time steps, large grid sizes, and long simulation times to be captured. In this paper the BSL-EARSM (Explicit Algebraic Reynolds Stress) model was applied as a compromise between scale-resolving and two-equation turbulence models with respect to computational effort and accuracy. Simulation results are compared to pressure measurements, showing reasonable agreement in resolving the frequency spectra and amplitudes. Inner blade vortices were predicted successfully in shape and size. Surface streamlines in the blade-to-blade view are presented, giving insights into the formation of the inner blade vortices. The acquired time-dependent pressure fields can be used for quasi-static structural analysis (FEA) for fatigue calculations in the future.

  10. Anthropomorphic thorax phantom for cardio-respiratory motion simulation in tomographic imaging

    NASA Astrophysics Data System (ADS)

    Bolwin, Konstantin; Czekalla, Björn; Frohwein, Lynn J.; Büther, Florian; Schäfers, Klaus P.

    2018-02-01

    Patient motion during medical imaging using techniques such as computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), or single emission computed tomography (SPECT) is well known to degrade images, leading to blurring effects or severe artifacts. Motion correction methods try to overcome these degrading effects. However, they need to be validated under realistic conditions. In this work, a sophisticated anthropomorphic thorax phantom is presented that combines several aspects of a simulator for cardio-respiratory motion. The phantom allows us to simulate various types of cardio-respiratory motions inside a human-like thorax, including features such as inflatable lungs, beating left ventricular myocardium, respiration-induced motion of the left ventricle, moving lung lesions, and moving coronary artery plaques. The phantom is constructed to be MR-compatible. This means that we can not only perform studies in PET, SPECT and CT, but also inside an MRI system. The technical features of the anthropomorphic thorax phantom Wilhelm are presented with regard to simulating motion effects in hybrid emission tomography and radiotherapy. This is supplemented by a study on the detectability of small coronary plaque lesions in PET/CT under the influence of cardio-respiratory motion, and a study on the accuracy of left ventricular blood volumes.

  11. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase as well as the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.

  12. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 1018 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.

  13. High accuracy mantle convection simulation through modern numerical methods - II: realistic models and problems

    NASA Astrophysics Data System (ADS)

    Heister, Timo; Dannberg, Juliane; Gassmöller, Rene; Bangerth, Wolfgang

    2017-08-01

    Computations have helped elucidate the dynamics of Earth's mantle for several decades already. The numerical methods that underlie these simulations have greatly evolved within this time span, and today include dynamically changing and adaptively refined meshes, sophisticated and efficient solvers, and parallelization to large clusters of computers. At the same time, many of the methods - discussed in detail in a previous paper in this series - were developed and tested primarily using model problems that lack many of the complexities that are common to the realistic models our community wants to solve today. With several years of experience solving complex and realistic models, we here revisit some of the algorithm designs of the earlier paper and discuss the incorporation of more complex physics. In particular, we re-consider time stepping and mesh refinement algorithms, evaluate approaches to incorporate compressibility, and discuss dealing with strongly varying material coefficients, latent heat, and how to track chemical compositions and heterogeneities. Taken together and implemented in a high-performance, massively parallel code, the techniques discussed in this paper then allow for high resolution, 3-D, compressible, global mantle convection simulations with phase transitions, strongly temperature dependent viscosity and realistic material properties based on mineral physics data.

  14. Multi-scale structural analysis of gas diffusion layers

    NASA Astrophysics Data System (ADS)

    Göbel, Martin; Godehardt, Michael; Schladitz, Katja

    2017-07-01

    The macroscopic properties of materials are strongly determined by their microstructure. Here, the transport properties of gas diffusion layers (GDL) for fuel cells are considered. In order to simulate flow and thermal properties, detailed microstructural information is essential. 3D images obtained by high-resolution computed tomography using synchrotron radiation and by scanning electron microscopy (SEM) combined with focused ion beam (FIB) serial slicing were used. A recent method for the reconstruction of porous structures from FIB-SEM images and sophisticated morphological image transformations were applied to segment the solid structural components. The essential algorithmic steps for segmenting the different components in the tomographic data sets are described and discussed. In this paper, two types of GDL are considered, based on a non-woven substrate layer and a paper substrate layer, respectively. More than three components are separated within the synchrotron radiation computed tomography data: the fiber system, the polytetrafluoroethylene (PTFE) binder/impregnation, the micro porous layer (MPL), inclusions within the latter, and the pore space are segmented. The use of the 3D structure data derived in this way in different simulation applications is demonstrated. Simulations of macroscopic properties such as thermal conductivity, depending on the flooding state of the GDL, are possible.

  15. Catastrophic Disruption of Asteroids: First Simulations with Explicit Formation of Spinning Rigid and Semi-rigid Aggregates

    NASA Astrophysics Data System (ADS)

    Michel, Patrick; Richardson, D. C.

    2007-10-01

    We have made major improvements in simulations of asteroid disruption by explicitly computing aggregate formation during the gravitational reaccumulation of small fragments, allowing us to obtain information on their spins and shapes. First results will be presented, taking as examples asteroid families that we reproduced successfully with previous, less sophisticated simulations. In recent years, we have successfully simulated the formation of asteroid families using an SPH hydrocode to compute the fragmentation following the impact of a projectile on the parent body, and the N-body code pkdgrav to compute the mutual interactions of the fragments. We found that fragments generated by the disruption of a km-size asteroid can have large enough masses to be attracted by each other during their ejection. Consequently, many reaccumulations take place. Eventually most large fragments correspond to gravitational aggregates formed by reaccumulation of smaller ones. Moreover, formation of satellites occurs around the largest and other big remnants. In these previous simulations, when fragments reaccumulate, they merge into a single sphere whose mass is the sum of their masses. Thus, no information is obtained on the actual shape of the aggregates, their spin, ... For the first time, we have now simulated the disruption of a family parent body by explicitly computing the formation of aggregates, along with the above-mentioned properties. Once formed, these aggregates can interact and/or collide with each other and break up during their evolution. We will present these first simulations and their possible implications for the properties of asteroids generated by disruption. Results can, for instance, be compared with data provided by the Japanese space mission Hayabusa on the asteroid Itokawa, a body now understood to be a reaccumulated fragment from a larger parent body. Acknowledgments: PM and DCR acknowledge support from the French Programme National de Planétologie and grants NSF AST0307549 & AST0708110.

  16. Load Balancing Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearce, Olga Tkachyshyn

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
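
    As a simple illustration of the kind of fast, greedy strategy mentioned above, the sketch below assigns measured per-task costs to processors with the longest-processing-time-first heuristic. It is a generic baseline for intuition only, not the model or the algorithms developed in the dissertation.

    ```python
    # Longest-processing-time-first (LPT) greedy assignment of measured task costs.
    import heapq

    def lpt_assign(task_costs, n_procs):
        """Assign each task to the currently least-loaded processor, heaviest first."""
        heap = [(0.0, p) for p in range(n_procs)]        # (current load, processor id)
        heapq.heapify(heap)
        assignment = {p: [] for p in range(n_procs)}
        for task, cost in sorted(enumerate(task_costs), key=lambda tc: -tc[1]):
            load, p = heapq.heappop(heap)
            assignment[p].append(task)
            heapq.heappush(heap, (load + cost, p))
        return assignment

    # example: per-task costs measured during the previous time step
    costs = [9.0, 1.0, 1.0, 8.0, 2.0, 2.0, 7.0, 3.0]
    print(lpt_assign(costs, n_procs=3))
    ```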

  17. Structure and dynamics of aqueous solutions from PBE-based first-principles molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y.

    Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions due to a good balance between the accuracy, computational expense, and the applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially “correct” for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. In order to address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na+, K+, and Cl- ions. We show that simulations at 390–400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. These results suggest that an elevated temperature around 390–400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.

  18. Structure and dynamics of aqueous solutions from PBE-based first-principles molecular dynamics simulations

    DOE PAGES

    Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y.; ...

    2016-10-17

    Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions due to a good balance between the accuracy, computational expense, and the applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially “correct” for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. In order to address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na+, K+, and Cl- ions. We show that simulations at 390–400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. These results suggest that an elevated temperature around 390–400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.

  19. Numerical Simulation of Shock/Detonation-Deformable-Particle Interaction with Constrained Interface Reinitialization

    NASA Astrophysics Data System (ADS)

    Zhang, Ju; Jackson, Thomas; Balachandar, Sivaramakrishnan

    2015-06-01

    We will develop a computational model built upon our verified and validated in-house SDT code to provide an improved description of multiphase blast wave dynamics in which solid particles are considered deformable and can even undergo phase transitions. Our SDT computational framework includes a reactive compressible flow solver with sophisticated material interface tracking capability and realistic equations of state (EOS), such as the Mie-Gruneisen EOS, for multiphase flow modeling. The behavior of the diffuse interface models of Shukla et al. (2010) and Tiwari et al. (2013) at different shock impedance ratios will first be examined and characterized. The recent constrained interface reinitialization of Shukla (2014) will then be developed to examine whether the conservation properties can be improved. This work was supported in part by the U.S. Department of Energy and by the Defense Threat Reduction Agency.

  20. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  1. Memory Network For Distributed Data Processors

    NASA Technical Reports Server (NTRS)

    Bolen, David; Jensen, Dean; Millard, ED; Robinson, Dave; Scanlon, George

    1992-01-01

    The Universal Memory Network (UMN) is a modular, digital data-communication system enabling computers with differing bus architectures to share 32-bit-wide data between locations up to 3 km apart with less than one millisecond of latency. It makes it possible to design sophisticated real-time and near-real-time data-processing systems without data-transfer "bottlenecks". This enterprise network permits transmission of a volume of data equivalent to an encyclopedia each second. Facilities benefiting from the Universal Memory Network include telemetry stations, simulation facilities, power plants, and large laboratories, or any facility sharing very large volumes of data. The main hub of the UMN is a reflection center that includes smaller hubs called Shared Memory Interfaces.

  2. VPython: Writing Real-time 3D Physics Programs

    NASA Astrophysics Data System (ADS)

    Chabay, Ruth

    2001-06-01

    VPython (http://cil.andrew.cmu.edu/projects/visual) combines the Python programming language with an innovative 3D graphics module called Visual, developed by David Scherer. Designed to make 3D physics simulations accessible to novice programmers, VPython allows the programmer to write a purely computational program without any graphics code, and produces an interactive, real-time 3D graphical display. In a program, 3D objects are created and their positions modified by computational algorithms. Running in a separate thread, the Visual module monitors the positions of these objects and renders them many times per second. Using the mouse, one can zoom and rotate to navigate through the scene. After one hour of instruction, students in an introductory physics course at Carnegie Mellon University, including those who have never programmed before, write programs in VPython to model the behavior of physical systems and to visualize fields in 3D. The Numeric array processing module allows the construction of more sophisticated simulations and models as well. VPython is free and open source. The Visual module is based on OpenGL, and runs on Windows, Linux, and Macintosh.
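
    A minimal example in the style described, purely computational code with the rendering handled by the graphics module, might look like the sketch below. It is written against the current vpython package for illustration; the original Visual module used a slightly different import syntax.

    ```python
    # Bouncing ball: the loop only updates positions; VPython renders the 3D scene.
    from vpython import sphere, box, vector, rate, color

    floor = box(pos=vector(0, 0, 0), size=vector(4, 0.1, 4))
    ball = sphere(pos=vector(0, 2, 0), radius=0.2, color=color.red, make_trail=True)

    v = vector(0, 0, 0)              # velocity
    g = vector(0, -9.8, 0)           # gravitational acceleration
    dt = 0.005

    while True:
        rate(200)                    # limit the loop to 200 iterations per second
        v = v + g * dt
        ball.pos = ball.pos + v * dt
        if ball.pos.y < floor.pos.y + ball.radius:
            v.y = -abs(v.y)          # bounce off the floor
    ```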

  3. Large calculation of the flow over a hypersonic vehicle using a GPU

    NASA Astrophysics Data System (ADS)

    Elsen, Erich; LeGresley, Patrick; Darve, Eric

    2008-12-01

    Graphics processing units are capable of impressive computing performance, up to 518 Gflops peak. Various groups have been using these processors for general purpose computing; most efforts have focused on demonstrating relatively basic calculations, e.g. numerical linear algebra, or physical simulations for visualization purposes with limited accuracy. This paper describes the simulation of a hypersonic vehicle configuration with detailed geometry and accurate boundary conditions using the compressible Euler equations. To the authors' knowledge, this is the most sophisticated calculation of this kind in terms of complexity of the geometry, the physical model, the numerical methods employed, and the accuracy of the solution. The Navier-Stokes Stanford University Solver (NSSUS) was used for this purpose. NSSUS is a multi-block structured code with a provably stable and accurate numerical discretization which uses a vertex-based finite-difference method. A multi-grid scheme is used to accelerate the solution of the system. Based on a comparison of the Intel Core 2 Duo and NVIDIA 8800GTX, speed-ups of over 40× were demonstrated for simple test geometries and 20× for complex geometries.

  4. Study of aircraft centered navigation, guidance, and traffic situation system concept for terminal area operation

    NASA Technical Reports Server (NTRS)

    Anderson, W. W.; Will, R. W.; Grantham, C.

    1972-01-01

    A concept for automating the control of air traffic in the terminal area in which the primary man-machine interface is the cockpit is described. The ground and airborne inputs required for implementing this concept are discussed. Digital data link requirements of 10,000 bits per second are explained. A particular implementation of this concept including a sequencing and separation algorithm which generates flight paths and implements a natural order landing sequence is presented. Onboard computer/display avionics utilizing a traffic situation display is described. A preliminary simulation of this concept has been developed which includes a simple, efficient sequencing algorithm and a complete aircraft dynamics model. This simulated jet transport was flown through automated terminal-area traffic situations by pilots using relatively sophisticated displays, and pilot performance and observations are discussed.

  5. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment is reconstructed by computer simulations. In this study, by refining the source term of radioactive materials discharged into the atmosphere and modifying the atmospheric transport, dispersion and deposition model (ATDM), the atmospheric dispersion simulation of radioactive materials is improved. A database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is then developed from the output of the simulation. This database is used in other studies for dose assessment by coupling it with the behavioral patterns of evacuees from the FDNPS accident. With the improvement of the ATDM simulation to use a new meteorological model and a sophisticated deposition scheme, the ATDM simulations reproduced the 137Cs and 131I deposition patterns well. For better reproducibility of the dispersion processes, further refinement of the source term was carried out by optimizing it against the improved ATDM simulation using new monitoring data.

  6. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species

    PubMed Central

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien

    2017-01-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities. PMID:29112973

  7. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    PubMed

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  8. [Anesthesia simulators and training devices].

    PubMed

    Hartmannsgruber, M; Good, M; Carovano, R; Lampotang, S; Gravenstein, J S

    1993-07-01

    Simulators and training devices are used extensively by educators in 'high-tech' occupations, especially those requiring an understanding of complex systems and co-ordinated psychomotor skills. Because of advances in computer technology, anaesthetised patients can now be realistically simulated. This paper describes several training devices and a simulator currently employed in the training of anaesthesia personnel at the University of Florida. The Gainesville Anesthesia Simulator (GAS) comprises a patient mannequin, an anaesthesia gas machine, and a full set of normally operating monitoring instruments. The patient can breathe spontaneously and has audible heart and breath sounds and palpable pulses. The mannequin contains a sophisticated lung model that consumes and eliminates gas according to physiological principles. Interconnected computers controlling the physical signs of the mannequin enable the presentation of a multitude of clinical signs. In addition, the anaesthesia machine, which is functionally intact, has hidden fault activators that challenge the user to correct equipment malfunctions. Concealed sensors monitor the users' actions and responses. A robust data acquisition and control system and a user-friendly scripting language for programming simulation scenarios are key features of GAS and make the system applicable to the training of both beginning residents and experienced practitioners. GAS enhances clinical education in anaesthesia by providing a non-threatening environment that fosters learning by doing. Exercises with the simulator are supported by sessions on a number of training devices, which present theoretical and practical interactive courses on the anaesthesia machine and on monitors. An extensive system, for example, introduces the student to the physics and clinical application of transoesophageal echocardiography.

  9. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  10. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles.

    PubMed

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny

    2017-10-21

    Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
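
    As a toy illustration of the clustering idea, the sketch below groups the states of an invented five-state transition matrix into metastable clusters from the sign structure of its slowly decaying eigenvector; full PCCA+ as used in practice relies on a simplex construction rather than this crude sign-based split, and the matrix values here are made up.

      import numpy as np

      # Hypothetical transition-count matrix between 5 discrete states
      # observed along an (accelerated) MD trajectory.
      counts = np.array([[90, 10,  0,  0,  0],
                         [12, 85,  3,  0,  0],
                         [ 0,  4, 80, 15,  1],
                         [ 0,  0, 14, 84,  2],
                         [ 0,  0,  2,  3, 95]], dtype=float)
      T = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

      # Eigenvalues near 1 indicate slowly mixing (metastable) groups of states;
      # a gap after the second eigenvalue suggests two clusters here.
      evals, evecs = np.linalg.eig(T)
      order = np.argsort(-evals.real)

      # Crude PCCA-like assignment: the sign structure of the second right
      # eigenvector splits the states into two metastable groups.
      slow = evecs[:, order[1]].real
      labels = (slow > 0).astype(int)
      print("cluster labels per state:", labels)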

  11. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles

    NASA Astrophysics Data System (ADS)

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F.; Perez, Danny

    2017-10-01

    Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.

  12. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on the finite difference time domain/particle-in-cell (FDTD/PIC) method in a Lorentz-boosted coordinate system which is able to perform a full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength and the undulator period to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make the full-wave simulation feasible using available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.

  13. Numerical simulation of the interaction of biological cells with an ice front during freezing

    NASA Astrophysics Data System (ADS)

    Carin, M.; Jaeger, M.

    2001-12-01

    The goal of this study is a better understanding of the interaction between cells and a solidification front during a cryopreservation process. This freezing technique is commonly used to preserve biological material for long periods at low temperatures. However, the biophysical mechanisms of cell injury during freezing are difficult to understand because a cell is a very sophisticated microstructure interacting with its environment. We have developed a finite element model to simulate the response of cells to an advancing solidification front. A special front-tracking technique is used to compute the motion of the cell membrane and the ice front during freezing. The model solves the conductive heat transfer equation and the diffusion equation of a solute on a domain containing three phases: one or more cells, the extra-cellular solution and the growing ice. The solid phase, growing from a binary salt solution, rejects the solute into the liquid phase and increases the solute gradient around the cell, which induces shrinkage of the cell. The model is used to simulate the engulfment of a single cell, modelling a red blood cell, by an advancing solidification front that is initially either planar or non-planar. We compare the incorporation of a cell with that of a solid particle.

  14. RF Wave Simulation Using the MFEM Open Source FEM Package

    NASA Astrophysics Data System (ADS)

    Stillerman, J.; Shiraiwa, S.; Bonoli, P. T.; Wright, J. C.; Green, D. L.; Kolev, T.

    2016-10-01

    A new plasma wave simulation environment based on the finite element method is presented. MFEM, a scalable open-source FEM library, is used as the basis for this capability. MFEM allows for assembling an FEM matrix of arbitrarily high order in a parallel computing environment. A 3D frequency-domain RF physics layer was implemented using a python wrapper for MFEM, and a cold collisional plasma model was ported. This physics layer allows the plasma RF wave simulation model to be defined without user knowledge of the FEM weak-form formulation. A graphical user interface is built on πScope, a python-based scientific workbench, such that a user can build a model definition file interactively. Benchmark cases have been ported to this new environment, with results consistent with those obtained using COMSOL Multiphysics, GENRAY, and the TORIC/TORLH spectral solvers. This work is a first step in bringing the sophisticated computational tool suite that MFEM provides (e.g., adaptive mesh refinement, solver suite, element types) to bear on the linear plasma-wave interaction problem and within more complicated integrated workflows, such as coupling with a core spectral solver or incorporating additional physics such as an RF sheath potential model or kinetic effects. USDoE Awards DE-FC02-99ER54512, DE-FC02-01ER54648.

  15. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  16. Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications

    NASA Technical Reports Server (NTRS)

    Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet

    2004-01-01

    The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are comprised of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes applied independently to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).

  17. Core-Collapse Supernovae Explored by Multi-D Boltzmann Hydrodynamic Simulations

    NASA Astrophysics Data System (ADS)

    Sumiyoshi, Kohsuke; Nagakura, Hiroki; Iwakami, Wakana; Furusawa, Shun; Matsufuru, Hideo; Imakura, Akira; Yamada, Shoichi

    We report the latest results of numerical simulations of core-collapse supernovae obtained by solving multi-D neutrino-radiation hydrodynamics with the Boltzmann equations. One of the longstanding issues in the explosion mechanism of supernovae has been the uncertainty in multi-D approximations of the neutrino transfer, such as the diffusion approximation and the ray-by-ray method. The neutrino transfer is essential, together with 2D/3D hydrodynamical instabilities, for evaluating the neutrino heating behind the shock wave that drives successful explosions and for predicting the neutrino burst signals. We tackled this difficult problem by utilizing our solver of the 6D Boltzmann equation for neutrinos in 3D space and 3D neutrino momentum space, coupled with multi-D hydrodynamics and including special and general relativistic extensions. We have performed a set of 2D core-collapse simulations of 11 M⊙ and 15 M⊙ stars on the K computer in Japan, following the long-term evolution over 400 ms after bounce, to reveal the outcome of full Boltzmann hydrodynamic simulations with a sophisticated multi-nuclear-species equation of state and updated rates for electron captures on nuclei.

  18. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  19. The Next Computer Revolution.

    ERIC Educational Resources Information Center

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  20. Application of a new model for groundwater age distributions: Modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin, California

    USGS Publications Warehouse

    Ginn, T.R.; Woolfenden, L.

    2002-01-01

    A project for modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin aquifer in California is discussed. The Rialto-Colton aquifer has been divided into four primary and significant flowpaths following the general direction of groundwater flow from NW to SE. The introductory investigation includes sophisticated chemical reaction modeling with highly simplified flow path simulation. A comprehensive reactive transport model with the established set of geochemical reactions over the whole aquifer will also be developed to treat both reactions and transport realistically. This will be accomplished using HBGC123D implemented with an isotopic calculation step to compute the Carbon-14 (C14) and stable Carbon-13 (C13) contents of the water. Computed carbon contents will also be calibrated against the measured carbon contents to assess the amount of imported recharge to the Linden pond.
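
    The isotopic calculation step ultimately rests on the radioactive decay law; the minimal sketch below converts a measured 14C content into an apparent age using the conventional 5730-year half-life. It is a standard textbook relation, not the HBGC123D implementation, and it ignores the geochemical corrections that a reactive transport model would supply.

      import math

      HALF_LIFE_C14 = 5730.0                      # years (conventional value)

      def apparent_age(pmc_measured, pmc_initial=100.0):
          # Apparent radiocarbon age in years from 14C content in percent
          # modern carbon, with no correction for carbonate dissolution etc.
          return (HALF_LIFE_C14 / math.log(2.0)) * math.log(pmc_initial / pmc_measured)

      print(f"{apparent_age(60.0):.0f} years")    # e.g. 60 pmc -> roughly 4200 years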

  1. The STD/MHD codes - Comparison of analyses with experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc. [for MHD generator flows

    NASA Technical Reports Server (NTRS)

    Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.

    1981-01-01

    Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.

  2. V/STOLAND digital avionics system for XV-15 tilt rotor

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1980-01-01

    A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display and data acquisition capabilities for performing terminal area navigation, guidance and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.

  3. Performance Enhancement Strategies for Multi-Block Overset Grid CFD Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement strategies on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the roles of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Details of a sophisticated graph partitioning technique for grid grouping are also provided. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.
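
    As a toy illustration of the grid-grouping idea, the sketch below assigns overset grid blocks to processor groups with a greedy "largest block to least-loaded group" rule; the block sizes are invented, and the paper's actual approach uses a sophisticated graph partitioner that also accounts for inter-grid communication.

      import heapq

      # Invented per-block work estimates (e.g., grid-point counts) for 10 overset blocks.
      block_work = [120_000, 95_000, 310_000, 40_000, 220_000,
                    75_000, 180_000, 60_000, 150_000, 90_000]
      n_groups = 4

      # Greedy assignment: always give the next-largest block to the least-loaded group.
      heap = [(0, g) for g in range(n_groups)]          # (current load, group id)
      heapq.heapify(heap)
      groups = {g: [] for g in range(n_groups)}
      for block, work in sorted(enumerate(block_work), key=lambda kv: -kv[1]):
          load, g = heapq.heappop(heap)
          groups[g].append(block)
          heapq.heappush(heap, (load + work, g))

      for g, blocks in groups.items():
          print(f"group {g}: blocks {blocks}, load {sum(block_work[b] for b in blocks)}")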

  4. Slime mold solves maze in one pass, assisted by gradient of chemo-attractants.

    PubMed

    Adamatzky, Andrew

    2012-06-01

    Plasmodium of Physarum polycephalum is a large cell, visible by the unaided eye, which exhibits sophisticated patterns of foraging behaviour. The plasmodium's behaviour is well interpreted in terms of computation, where the data are spatially extended configurations of nutrients and obstacles, and the results of computation are the networks of protoplasmic tubes formed by the plasmodium. In laboratory experiments and numerical simulation we show that if plasmodium of P. polycephalum is inoculated in a maze's peripheral channel and an oat flake (a source of attractants) is placed in the maze's central chamber, then the plasmodium grows toward the target oat flake and connects it with the site of original inoculation via a pronounced protoplasmic tube. The protoplasmic tube represents a path in the maze. The plasmodium solves the maze in one pass because it is assisted by a gradient of chemo-attractants propagating from the target oat flake.

  5. Emulating weak localization using a solid-state quantum circuit.

    PubMed

    Chen, Yu; Roushan, P; Sank, D; Neill, C; Lucero, Erik; Mariantoni, Matteo; Barends, R; Chiaro, B; Kelly, J; Megrant, A; Mutus, J Y; O'Malley, P J J; Vainsencher, A; Wenner, J; White, T C; Yin, Yi; Cleland, A N; Martinis, John M

    2014-10-14

    Quantum interference is one of the most fundamental physical effects found in nature. Recent advances in quantum computing now employ interference as a fundamental resource for computation and control. Quantum interference also lies at the heart of sophisticated condensed matter phenomena such as Anderson localization, phenomena that are difficult to reproduce in numerical simulations. Here, employing a multiple-element superconducting quantum circuit, with which we manipulate a single microwave photon, we demonstrate that we can emulate the basic effects of weak localization. By engineering the control sequence, we are able to reproduce the well-known negative magnetoresistance of weak localization as well as its temperature dependence. Furthermore, we can use our circuit to continuously tune the level of disorder, a parameter that is not readily accessible in mesoscopic systems. Demonstrating a high level of control, our experiment shows the potential for employing superconducting quantum circuits as emulators for complex quantum phenomena.

  6. How to differentiate collective variables in free energy codes: Computer-algebra code generation and automatic differentiation

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni

    2018-07-01

    The proper choice of collective variables (CVs) is central to biased-sampling free energy reconstruction methods in molecular dynamics simulations. The PLUMED 2 library, for instance, provides several sophisticated CV choices, implemented in a C++ framework; however, developing new CVs is still time consuming due to the need to provide code for the analytical derivatives of all functions with respect to atomic coordinates. We present two solutions to this problem, namely (a) symbolic differentiation and code generation, and (b) automatic code differentiation, in both cases leveraging open-source libraries (SymPy and Stan Math, respectively). The two approaches are demonstrated and discussed in detail implementing a realistic example CV, the local radius of curvature of a polymer. Users may use the code as a template to streamline the implementation of their own CVs using high-level constructs and automatic gradient computation.
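
    As an illustration of the symbolic route, the following minimal SymPy sketch derives and emits C code for the analytical gradient of a simple interatomic-distance collective variable; the symbol names are illustrative, and this is not the code produced by the authors' tool.

      import sympy as sp

      # Coordinates of two atoms as symbols.
      x1, y1, z1, x2, y2, z2 = sp.symbols('x1 y1 z1 x2 y2 z2', real=True)

      # A simple collective variable: the interatomic distance.
      cv = sp.sqrt((x2 - x1)**2 + (y2 - y1)**2 + (z2 - z1)**2)

      # Analytical derivatives with respect to every atomic coordinate,
      # the quantities a biasing code needs in order to apply forces.
      grads = [sp.simplify(sp.diff(cv, s)) for s in (x1, y1, z1, x2, y2, z2)]

      # Emit compilable C expressions for the CV and one of its derivatives.
      print(sp.ccode(cv))
      print(sp.ccode(grads[0]))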

  7. Quantifying chemical uncertainties in simulations of the ISM

    NASA Astrophysics Data System (ADS)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions for which improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.
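
    A common way to make such an uncertainty estimate concrete is Monte Carlo propagation: each rate coefficient is perturbed within an assumed log-normal uncertainty and a model observable is re-evaluated. The sketch below uses invented rate values, a toy steady-state "network", and an assumed 0.3 dex spread; it is a generic illustration, not the specific method described in the talk.

      import numpy as np

      rng = np.random.default_rng(0)

      def equilibrium_abundance(k_form, k_dest):
          # Toy steady-state abundance of a species formed at rate k_form and
          # destroyed at rate k_dest (stand-in for a full chemical network).
          return k_form / (k_form + k_dest)

      k_form0, k_dest0 = 1e-9, 3e-10     # nominal rate coefficients (illustrative values)
      sigma_dex = 0.3                    # assumed log-normal uncertainty of 0.3 dex

      samples = []
      for _ in range(10000):
          kf = k_form0 * 10 ** rng.normal(0.0, sigma_dex)
          kd = k_dest0 * 10 ** rng.normal(0.0, sigma_dex)
          samples.append(equilibrium_abundance(kf, kd))

      samples = np.array(samples)
      print("median abundance:", np.median(samples))
      print("68% interval:", np.percentile(samples, [16, 84]))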

  8. Simulation tools for particle-based reaction-diffusion dynamics in continuous space

    PubMed Central

    2014-01-01

    Particle-based reaction-diffusion algorithms facilitate the modeling of the diffusional motion of individual molecules and the reactions between them in cellular environments. A physically realistic model, depending on the system at hand and the questions asked, would require different levels of modeling detail such as particle diffusion, geometrical confinement, particle volume exclusion or particle-particle interaction potentials. Higher levels of detail usually correspond to a larger number of parameters and higher computational cost. Certain systems, however, require these investments to be modeled adequately. Here we present a review of the current field of particle-based reaction-diffusion software packages operating on continuous space. Four nested levels of modeling detail are identified that capture an increasing amount of detail. Their applicability to different biological questions is discussed, ranging from straight diffusion simulations to sophisticated and expensive models that bridge toward coarse-grained molecular dynamics. PMID:25737778

  9. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    The increase in computational power and the better understanding of mathematical and physical systems have resulted in an increasing number of ocean models. Long ago, modelers were like a secret organization who recognized each other by using codes and languages that only a select group of people was able to understand. Access to computational systems was limited: on one hand, equipment and computing time were expensive and restricted; on the other hand, they required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 1980s. This availability of resources has resulted in much wider access to all kinds of models. Today computer speed, computing time and algorithms do not seem to be a problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution, from one office to the next, there are different models for the same phenomena developed by different researchers. The results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the internet. Every year more sophisticated models are constructed. The Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while keeping the model properties, which makes it a very useful tool for cutting the computation down to what "small" computational systems can handle, making sophisticated models available to a wider community.
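
    A minimal sketch of the Proper Orthogonal Decomposition itself, computed as the singular value decomposition of a synthetic snapshot matrix; the data here are invented (two sinusoidal modes plus noise) and the code is not taken from any particular ocean model.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic data: 500 spatial points sampled at 200 times, dominated by two modes.
      x = np.linspace(0, 2 * np.pi, 500)
      t = np.linspace(0, 10, 200)
      snapshots = (np.outer(np.sin(x), np.cos(2 * t))
                   + 0.3 * np.outer(np.cos(3 * x), np.sin(5 * t))
                   + 0.01 * rng.standard_normal((500, 200)))

      # POD = SVD of the mean-removed snapshot matrix.
      mean = snapshots.mean(axis=1, keepdims=True)
      U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

      energy = s**2 / np.sum(s**2)
      r = int(np.searchsorted(np.cumsum(energy), 0.99)) + 1   # modes for 99% of the variance
      print("modes retained:", r)

      # Reduced-order reconstruction using only the leading r modes.
      reduced = mean + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
      print("relative error:", np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots))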

  10. Advanced construction management for lunar base construction - Surface operations planner

    NASA Technical Reports Server (NTRS)

    Kehoe, Robert P.

    1992-01-01

    The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.

  11. Interactive mission planning for a Space Shuttle flight experiment - A case history

    NASA Technical Reports Server (NTRS)

    Harris, H. M.

    1986-01-01

    Scientific experiments which use the Space Shuttle as a platform require the development of new operations techniques for the command and control of the instrument. Principal among these is the ability to simulate the complex maneuvers of the orbiter's path realistically. Computer generated graphics provide a window into the actual and predicted performance of the instrument and allow sophisticated control of the instrument under varying conditions. In October of 1984 the Shuttle carried a synthetic aperture radar built by JPL for the purpose of recording images of the Earth's surface. The mission deviated from planned operation in almost every conceivable way and provided an exacting test bed for concepts of interactive mission planning.

  12. A digitally implemented preambleless demodulator for maritime and mobile data communications

    NASA Astrophysics Data System (ADS)

    Chalmers, Harvey; Shenoy, Ajit; Verahrami, Farhad B.

    The hardware design and software algorithms for a low-bit-rate, low-cost, all-digital preambleless demodulator are described. The demodulator operates under severe high-noise conditions, fast Doppler frequency shifts, large frequency offsets, and multipath fading. Sophisticated algorithms, including a fast Fourier transform (FFT)-based burst acquisition algorithm, a cycle-slip resistant carrier phase tracker, an innovative Doppler tracker, and a fast acquisition symbol synchronizer, were developed and extensively simulated for reliable burst reception. The compact digital signal processor (DSP)-based demodulator hardware uses a unique personal computer test interface for downloading test data files. The demodulator test results demonstrate a near-ideal performance within 0.2 dB of theory.
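
    The following NumPy sketch illustrates the basic idea behind FFT-based burst acquisition, namely estimating a large carrier frequency offset from the spectral peak of the received burst; the sample rate, offset, signal model and noise level are all invented, and this is not the DSP firmware described above.

      import numpy as np

      rng = np.random.default_rng(2)
      fs = 48_000.0                       # sample rate, Hz (illustrative)
      f_offset = 1_234.0                  # unknown carrier offset to recover, Hz
      n = 4096

      # Received burst modeled as an unmodulated carrier segment plus complex noise
      # (a real demodulator works on a modulated, preambleless burst).
      t = np.arange(n) / fs
      rx = np.exp(2j * np.pi * f_offset * t) + 0.5 * (rng.standard_normal(n)
                                                      + 1j * rng.standard_normal(n))

      # Coarse acquisition: the FFT peak gives the frequency offset to within one bin.
      spectrum = np.fft.fftshift(np.fft.fft(rx))
      freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / fs))
      estimate = freqs[np.argmax(np.abs(spectrum))]
      print(f"estimated offset: {estimate:.1f} Hz (true {f_offset} Hz)")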

  13. Goals and Objectives for Computing in the Associated Colleges of the St. Lawrence Valley.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    A forecast of the computing requirements of the Associated Colleges of the St. Lawrence Valley, an analysis of their needs, and specifications for a joint computer system are presented. Problems encountered included the lack of resources and computer sophistication at the member schools and a dearth of experience with long-term computer consortium…

  14. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation.

    PubMed

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud

    2008-12-01

    Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
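
    For readers unfamiliar with ABC, the following minimal rejection-sampling sketch shows the core idea of keeping prior draws whose simulated summary statistics fall close to the observed ones; it uses a toy Gaussian model with invented numbers rather than DIY ABC's population-genetic scenarios.

      import numpy as np

      rng = np.random.default_rng(3)

      # "Observed" data: 50 draws from a Normal with unknown mean.
      true_mu = 1.7
      observed = rng.normal(true_mu, 1.0, size=50)
      obs_summary = observed.mean()                 # summary statistic

      def simulate(mu):
          return rng.normal(mu, 1.0, size=50).mean()

      # ABC rejection: accept prior draws whose simulated summary is close to the data.
      n_draws, tolerance = 100_000, 0.05
      prior = rng.uniform(-5.0, 5.0, size=n_draws)  # flat prior on the mean
      accepted = np.array([mu for mu in prior
                           if abs(simulate(mu) - obs_summary) < tolerance])

      print("posterior mean ~", accepted.mean(), "from", accepted.size, "accepted draws")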

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petiteau, Antoine; Auger, Gerard; Halloin, Hubert

    A new LISA simulator (LISACode) is presented. Its ambition is to achieve a new degree of sophistication, allowing it to map, as closely as possible, the impact of the different subsystems on the measurements. LISACode is not a detailed simulator at the engineering level but rather a tool whose purpose is to bridge the gap between the basic principles of LISA and a future, sophisticated end-to-end simulator. This is achieved by introducing, in a realistic manner, most of the ingredients that will influence LISA's sensitivity as well as the application of TDI combinations. Many user-defined parameters allow the code to study different configurations of LISA, thus helping to finalize the definition of the detector. Another important use of LISACode is in generating time-series for data analysis developments.

  16. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper posed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…

  17. Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.

    ERIC Educational Resources Information Center

    McDonough, Kristin

    The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…

  18. An Instructional Simulation for Organizational Communication.

    ERIC Educational Resources Information Center

    Pacanowsky, Michael; Farace, Richard V.

    Ineffective communication in an organization is costly. This paper examines one of the many approaches to solving this problem--increasing employee awareness of communication by increasing employee communication skills and sophistication. Simulation games are an effective means of improving employee awareness. The simulation provides a common…

  19. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from large amounts of data and build these complex models. Compounding this problem are the difficulty of knowledge transfer and retention and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore, RAVEN, which was initially focused on serving as the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. To accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user identify the behavioral patterns of Nuclear Power Plants (NPPs). In this paper we present the GUI implementation and its current capability status. We also introduce the support vector machine algorithms and present our evaluation of their potential to increase the accuracy and reduce the computational cost of PRA analysis. In this evaluation we refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.
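
    As a rough illustration of how a support vector machine can act as a cheap surrogate for a limit surface, the scikit-learn sketch below classifies whether a simulated transient exceeds a safety limit from two input parameters; the inputs, limit value and response model are invented and do not represent RAVEN's implementation.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(4)

      # Synthetic "simulation" results: peak temperature as a function of two inputs.
      X = rng.uniform(0.0, 1.0, size=(500, 2))          # e.g. power level, valve delay
      peak_temp = 900 + 400 * X[:, 0] + 250 * X[:, 1] ** 2 + 20 * rng.standard_normal(500)
      y = (peak_temp > 1200).astype(int)                # 1 = limit exceeded (failure)

      # Train an SVM classifier as a cheap surrogate for the limit surface.
      clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)

      # The surrogate can now screen new input points without running the full code.
      candidates = rng.uniform(0.0, 1.0, size=(5, 2))
      print(clf.predict(candidates))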

  20. Learning and evolution in bacterial taxis: an operational amplifier circuit modeling the computational dynamics of the prokaryotic 'two component system' protein network.

    PubMed

    Di Paola, Vieri; Marijuán, Pedro C; Lahoz-Beltra, Rafael

    2004-01-01

    Adaptive behavior in unicellular organisms (i.e., bacteria) depends on highly organized networks of proteins governing purposefully the myriad of molecular processes occurring within the cellular system. For instance, bacteria are able to explore the environment within which they develop by utilizing the motility of their flagellar system as well as a sophisticated biochemical navigation system that samples the environmental conditions surrounding the cell, searching for nutrients or moving away from toxic substances or dangerous physical conditions. In this paper we discuss how proteins of the intervening signal transduction network could be modeled as artificial neurons, simulating the dynamical aspects of the bacterial taxis. The model is based on the assumption that, in some important aspects, proteins can be considered as processing elements or McCulloch-Pitts artificial neurons that transfer and process information from the bacterium's membrane surface to the flagellar motor. This simulation of bacterial taxis has been carried out on a hardware realization of a McCulloch-Pitts artificial neuron using an operational amplifier. Based on the behavior of the operational amplifier we produce a model of the interaction between CheY and FliM, elements of the prokaryotic two component system controlling chemotaxis, as well as a simulation of learning and evolution processes in bacterial taxis. On the one side, our simulation results indicate that, computationally, these protein 'switches' are similar to McCulloch-Pitts artificial neurons, suggesting a bridge between evolution and learning in dynamical systems at cellular and molecular levels and the evolutive hardware approach. On the other side, important protein 'tactilizing' properties are not tapped by the model, and this suggests further complexity steps to explore in the approach to biological molecular computing.
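
    A minimal McCulloch-Pitts threshold unit in Python, included only to make the modeling analogy concrete; the weights, threshold and attractant/repellent encoding are invented and do not correspond to the authors' operational-amplifier circuit or to measured CheY-FliM parameters.

      def mcculloch_pitts(inputs, weights, threshold):
          # Fire (1) if the weighted sum of inputs reaches the threshold, else 0.
          total = sum(w * x for w, x in zip(weights, inputs))
          return 1 if total >= threshold else 0

      # Toy "chemotaxis" decision: attractant suppresses, repellent promotes tumbling.
      attractant, repellent = 1, 0          # binary signals sampled at the membrane
      tumble = mcculloch_pitts([attractant, repellent], weights=[-2, 3], threshold=1)
      print("tumble" if tumble else "run")  # attractant present, no repellent -> run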

  1. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation in chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  2. Protonation States in molecular dynamics simulations of peptide folding and binding.

    PubMed

    Ben-Shimon, Avraham; Shalev, Deborah E; Niv, Masha Y

    2013-01-01

    Peptides are important signaling modules, acting both as individual hormones and as parts of larger molecules, mediating their protein-protein interactions. Many peptidic and peptidomimetic drugs have reached the marketplace and opportunities for peptide-based drug discovery are on the rise. pH-dependent behavior of peptides is well documented in the context of misfolding diseases and peptide translocation. Changes in the protonation states of peptide residues often have a crucial effect on a peptide's structure, dynamics and function, which may be exploited for biotechnological applications. The current review surveys the increasing levels of sophistication in the treatment of protonation states in computational studies involving peptides. Specifically we describe I) the common practice of assigning a single protonation state and using it throughout the dynamic simulation, II) approaches that consider multiple protonation states and compare computed observables to experimental ones, III) constant pH molecular dynamics methods that couple changes in protonation states with conformational dynamics "on the fly". Applications of conformational dynamics treatment of peptides in the context of binding, folding and interactions with the membrane are presented, illustrating the growing body of work in this field and highlighting the importance of careful handling of protonation states of peptidic residues.
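
    The fixed-protonation practice in (I) is usually grounded in the Henderson-Hasselbalch relation; the small sketch below evaluates the protonated fraction of a titratable site at a few pH values. It is a standard textbook relation with an assumed histidine-like pKa of 6.0, not a method taken from the review.

      def protonated_fraction(pH, pKa):
          # Henderson-Hasselbalch: pH = pKa + log10([A-]/[HA]), so the protonated
          # fraction of an acidic site is 1 / (1 + 10**(pH - pKa)).
          return 1.0 / (1.0 + 10.0 ** (pH - pKa))

      # A histidine-like side chain (assumed model pKa of 6.0, for illustration only).
      for pH in (5.0, 6.0, 7.4):
          print(f"pH {pH}: {protonated_fraction(pH, 6.0):.2f} protonated")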

  3. Early MIMD experience on the CRAY X-MP

    NASA Astrophysics Data System (ADS)

    Rhoades, Clifford E.; Stevens, K. G.

    1985-07-01

    This paper describes some early experience with converting four physics simulation programs to the CRAY X-MP, a current Multiple Instruction, Multiple Data (MIMD) computer consisting of two processors, each with an architecture similar to that of the CRAY-1. As a multi-processor, the CRAY X-MP together with the high speed Solid-state Storage Device (SSD) is an ideal machine upon which to study MIMD algorithms for solving the equations of mathematical physics, because it is fast enough to run real problems. The computer programs used in this study are all FORTRAN versions of original production codes. They range in sophistication from a one-dimensional numerical simulation of collisionless plasma to a two-dimensional hydrodynamics code with heat flow to a couple of three-dimensional fluid dynamics codes with varying degrees of viscous modeling. Early research with a dual processor configuration has shown speed-ups ranging from 1.55 to 1.98. It has been observed that a few simple extensions to FORTRAN allow a typical programmer to achieve a remarkable level of efficiency. These extensions involve the concept of memory local to a concurrent subprogram and memory common to all concurrent subprograms.

  4. On the kinematics of scalar iso-surfaces in turbulent flow

    NASA Astrophysics Data System (ADS)

    Blakeley, Brandon C.; Riley, James J.; Storti, Duane W.; Wang, Weirong

    2017-11-01

    The behavior of scalar iso-surfaces in turbulent flows is of fundamental interest and importance in a number of problems, e.g., the stoichiometric surface in non-premixed reactions and the turbulent/non-turbulent interface in localized turbulent shear flows. Of particular interest here is the behavior of the average surface area per unit volume, Σ. We report on the use of direct numerical simulations and sophisticated surface tracking techniques to directly compute Σ and model its evolution. We consider two different scalar configurations in decaying, isotropic turbulence: in the first, the iso-surface is initially homogeneous and isotropic in space; in the second, the iso-surface is initially planar. A novel method of computing integral properties from regularly-sampled values of a scalar function is leveraged to provide accurate estimates of Σ. Guided by simulation results, modeling is introduced from two perspectives. The first approach models the various terms in the evolution equation for Σ, while the second uses Rice's theorem to model Σ directly. In particular, the two principal effects on the evolution of Σ, i.e., the growth of surface area due to local surface stretching and its ultimate decay due to molecular destruction, are addressed.
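
    For reference, the surface density of the iso-surface φ = φ₀ is commonly defined through the fine-grained relation below (written in LaTeX), where the overline denotes an ensemble average, δ is the Dirac delta, and P_φ is the one-point PDF of the scalar; this is the standard definition rather than necessarily the exact estimator used in this work:

      \Sigma(\phi_0) \;=\; \overline{\,\lvert\nabla\phi\rvert\,\delta(\phi-\phi_0)\,}
                    \;=\; \overline{\,\lvert\nabla\phi\rvert \;\big|\; \phi=\phi_0\,}\; P_\phi(\phi_0)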

  5. Implementation and use of direct-flow connections in a coupled ground-water and surface-water model

    USGS Publications Warehouse

    Swain, Eric D.

    1994-01-01

    The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.

  6. Ground-water models as a management tool in Florida

    USGS Publications Warehouse

    Hutchinson, C.B.

    1984-01-01

    Highly sophisticated computer models provide powerful tools for analyzing historic data and for simulating future water levels, water movement, and water chemistry under stressed conditions throughout the ground-water system in Florida. Models that simulate the movement of heat and subsidence of land in response to aquifer pumping also have potential for application to hydrologic problems in the State. Florida, with 20 ground-water modeling studies reported since 1972, has applied computer modeling techniques to a variety of water-resources problems. Models in Florida generally have been used to provide insight into problems of water supply, contamination, and impact on the environment. The model applications range from site-specific studies, such as estimating contamination by wastewater injection at St. Petersburg, to a regional model of the entire State that may be used to assess broad-scale environmental impact of water-resources development. Recently, ground-water models have been used as management tools by the State regulatory authority to permit or deny development of water resources. As modeling precision, knowledge, and confidence increase, the use of ground-water models will shift more and more toward regulation of development and enforcement of environmental laws. (USGS)

  7. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  8. Experience with a sophisticated computer based authoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, P.R.

    1984-04-01

    In the November 1982 issue of ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.

  9. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    PubMed Central

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system based on a social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  10. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    PubMed

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system based on a social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  11. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  12. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  13. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    PubMed

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Airbreathing Propulsion System Analysis Using Multithreaded Parallel Processing

    NASA Technical Reports Server (NTRS)

    Schunk, Richard Gregory; Chung, T. J.; Rodriguez, Pete (Technical Monitor)

    2000-01-01

    In this paper, parallel processing is used to analyze the mixing and combustion behavior of hypersonic flow. Preliminary work for a sonic transverse hydrogen jet injected from a slot into a Mach 4 airstream in a two-dimensional duct combustor has been completed [Moon and Chung, 1996]. Our aim is to extend this work to a three-dimensional domain using multithreaded domain decomposition parallel processing based on the flowfield-dependent variation theory. Numerical simulations of chemically reacting flows are difficult because of the strong interactions between the turbulent hydrodynamic and chemical processes. The algorithm must provide an accurate representation of the flowfield, since unphysical flowfield calculations will lead to the faulty loss or creation of species mass fraction, or even premature ignition, which in turn alters the flowfield information. Another difficulty arises from the disparity in time scales between the flowfield and chemical reactions, which may require the use of finite rate chemistry. The situation is more complex when there is also a disparity in the length scales involved in turbulence. In order to cope with these complicated physical phenomena, it is our plan to utilize the flowfield-dependent variation theory mentioned above, facilitated by large eddy simulation. Undoubtedly, the proposed computation requires the most sophisticated computational strategies. The multithreaded domain decomposition parallel processing will be necessary in order to reduce both computational time and storage. Without special treatments involved in computer engineering, our attempt to analyze airbreathing combustion appears to be difficult, if not impossible.
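
    The abstract describes multithreaded domain decomposition only at a high level. The sketch below illustrates the general pattern on a toy 1-D diffusion problem, not the authors' flowfield-dependent variation scheme: the grid is split into subdomains with one-cell halos, each piece is advanced in its own worker thread, and the results are reassembled. Grid size, coefficients, and the explicit update rule are illustrative assumptions.

      import numpy as np
      from concurrent.futures import ThreadPoolExecutor

      def advance(u_halo, r):
          """Explicit diffusion update of interior cells; first/last entries are halo cells."""
          return u_halo[1:-1] + r * (u_halo[2:] - 2.0 * u_halo[1:-1] + u_halo[:-2])

      def decomposed_step(u, n_sub, r=0.2):
          """One time step via domain decomposition: each subdomain updates in its own thread."""
          padded = np.concatenate(([u[0]], u, [u[-1]]))            # ghost cells at the outer boundary
          chunks = np.array_split(np.arange(u.size), n_sub)        # index blocks of the subdomains
          pieces = [padded[idx[0]:idx[-1] + 3] for idx in chunks]  # each block plus its one-cell halo
          with ThreadPoolExecutor(max_workers=n_sub) as pool:
              parts = pool.map(lambda p: advance(p, r), pieces)
          return np.concatenate(list(parts))

      u = np.zeros(100)
      u[45:55] = 1.0                     # initial hot spot
      for _ in range(200):
          u = decomposed_step(u, n_sub=4)
      print(float(u.max()))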

  15. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  16. A case for spiking neural network simulation based on configurable multiple-FPGA systems.

    PubMed

    Yang, Shufan; Wu, Qiang; Li, Renfa

    2011-09-01

    Recent neuropsychological research has begun to reveal that neurons encode information in the timing of spikes. Spiking neural network simulations are a flexible and powerful method for investigating the behaviour of neuronal systems. Software simulation of spiking neural networks cannot rapidly generate output spikes for large-scale networks. An alternative approach, hardware implementation of such a system, makes it possible to generate independent spikes precisely and to output spike waves simultaneously in real time, provided that the spiking neural network can take full advantage of the inherent parallelism of the hardware. In this work we introduce a configurable FPGA-oriented hardware platform for spiking neural network simulation. We aim to use this platform to combine the speed of dedicated hardware with the programmability of software, so that neuroscientists can put together sophisticated computational experiments with their own models. A feed-forward hierarchical network is developed as a case study to describe the operation of biological neural systems (such as orientation selectivity in visual cortex) and computational models of such systems. This model demonstrates how a feed-forward neural network constructs the circuitry required for orientation selectivity and provides a platform for reaching a deeper understanding of the primate visual system. In the future, larger-scale models based on this framework can be used to replicate the actual architecture of visual cortex, leading to more detailed predictions and insights into visual perception phenomena.
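
    As a point of reference for what such a hardware platform accelerates, here is a minimal software sketch of a leaky integrate-and-fire (LIF) feed-forward layer in NumPy. All parameter values (weights, time constants, input rates) are illustrative assumptions, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n_in, n_out, steps, dt = 50, 10, 200, 1.0      # sizes and time step (ms)
      w = rng.normal(0.0, 0.3, size=(n_out, n_in))   # synaptic weights (assumed)
      tau, v_thresh, v_reset = 20.0, 1.0, 0.0        # membrane constants (assumed)

      v = np.zeros(n_out)
      spike_times = [[] for _ in range(n_out)]
      for t in range(steps):
          in_spikes = rng.random(n_in) < 0.05        # Poisson-like input spike train
          i_syn = w @ in_spikes                      # synaptic input current
          v += dt / tau * (-v) + i_syn               # leaky integration of the membrane potential
          fired = v >= v_thresh
          v[fired] = v_reset                         # reset neurons that spiked
          for j in np.where(fired)[0]:
              spike_times[j].append(t * dt)

      print([len(s) for s in spike_times])           # output spike counts per neuron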

  17. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations: first, to estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process; second, to explore the influence of landscape discretization and parameterization from multiple datasets and user decisions; third, to employ several numerical solvers for the integration of the governing ordinary differential equations and to study their effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty are compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and require less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
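
    To make the solver-comparison dimension of such an ensemble concrete, the sketch below runs an explicit and an implicit integrator from SciPy on a toy linear-reservoir ODE and reports the step count and final state. The equation, rate constants, and tolerances are illustrative assumptions, not the catchment model used in the study.

      import numpy as np
      from scipy.integrate import solve_ivp

      def linear_reservoir(t, s, k=5.0, inflow=1.0):
          """Toy storage dynamics ds/dt = inflow - k*s (a simple linear reservoir)."""
          return inflow - k * s

      for method in ("RK45", "BDF"):      # explicit Runge-Kutta vs implicit backward differentiation
          sol = solve_ivp(linear_reservoir, (0.0, 10.0), [0.0], method=method, rtol=1e-6)
          print(method, "steps:", sol.t.size, "final storage:", float(sol.y[0, -1]))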

  18. Learning Performance with Interactive Simulations in Medical Education: Lessons Learned from Results of Learning Complex Physiological Models with the HAEMOdynamics SIMulator

    ERIC Educational Resources Information Center

    Holzinger, Andreas; Kickmeier-Rust, Michael D.; Wassertheurer, Sigi; Hessinger, Michael

    2009-01-01

    Objective: Since simulations are often accepted uncritically, with excessive emphasis being placed on technological sophistication at the expense of underlying psychological and educational theories, we evaluated the learning performance of simulation software, in order to gain insight into the proper use of simulations for application in medical…

  19. Evolution of Computational Toxicology-from Primitive ...

    EPA Pesticide Factsheets

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016, on the Evolution of Computational Toxicology - from Primitive Beginnings to Sophisticated Application.

  20. NMSim web server: integrated approach for normal mode-based geometric simulations of biologically relevant conformational transitions in proteins.

    PubMed

    Krüger, Dennis M; Ahmed, Aqeel; Gohlke, Holger

    2012-07-01

    The NMSim web server implements a three-step approach for multiscale modeling of protein conformational changes. First, the protein structure is coarse-grained using the FIRST software. Second, a rigid cluster normal-mode analysis provides low-frequency normal modes. Third, these modes are used to extend the recently introduced idea of constrained geometric simulations by biasing backbone motions of the protein, whereas side chain motions are biased toward favorable rotamer states (NMSim). The generated structures are iteratively corrected regarding steric clashes and stereochemical constraint violations. The approach allows performing three simulation types: unbiased exploration of conformational space; pathway generation by a targeted simulation; and radius of gyration-guided simulation. On a data set of proteins with experimentally observed conformational changes, the NMSim approach has been shown to be a computationally efficient alternative to molecular dynamics simulations for conformational sampling of proteins. The generated conformations and pathways of conformational transitions can serve as input to docking approaches or more sophisticated sampling techniques. The web server output is a trajectory of generated conformations, Jmol representations of the coarse-graining and a subset of the trajectory and data plots of structural analyses. The NMSim webserver, accessible at http://www.nmsim.de, is free and open to all users with no login requirement.

  1. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  2. A performance comparison of scalar, vector, and concurrent vector computers including supercomputers for modeling transport of reactive contaminants in groundwater

    NASA Astrophysics Data System (ADS)

    Tripathi, Vijay S.; Yeh, G. T.

    1993-06-01

    Sophisticated and highly computation-intensive models of transport of reactive contaminants in groundwater have been developed in recent years. Application of such models to real-world contaminant transport problems, e.g., simulation of groundwater transport of 10-15 chemically reactive elements (e.g., toxic metals) and relevant complexes and minerals in two and three dimensions over a distance of several hundred meters, requires high-performance computers including supercomputers. Although not widely recognized as such, the computational complexity and demand of these models compare with well-known computation-intensive applications including weather forecasting and quantum chemical calculations. A survey of the performance of a variety of available hardware, as measured by the run times for a reactive transport model HYDROGEOCHEM, showed that while supercomputers provide the fastest execution times for such problems, relatively low-cost reduced instruction set computer (RISC) based scalar computers provide the best performance-to-price ratio. Because supercomputers like the Cray X-MP are inherently multiuser resources, often the RISC computers also provide much better turnaround times. Furthermore, RISC-based workstations provide the best platforms for "visualization" of groundwater flow and contaminant plumes. The most notable result, however, is that current workstations costing less than $10,000 provide performance within a factor of 5 of a Cray X-MP.

  3. Simple stochastic simulation.

    PubMed

    Schilstra, Maria J; Martin, Stephen R

    2009-01-01

    Stochastic simulations may be used to describe changes with time of a reaction system in a way that explicitly accounts for the fact that molecules show a significant degree of randomness in their dynamic behavior. The stochastic approach is almost invariably used when small numbers of molecules or molecular assemblies are involved because this randomness leads to significant deviations from the predictions of the conventional deterministic (or continuous) approach to the simulation of biochemical kinetics. Advances in computational methods over the three decades that have elapsed since the publication of Daniel Gillespie's seminal paper in 1977 (J. Phys. Chem. 81, 2340-2361) have allowed researchers to produce highly sophisticated models of complex biological systems. However, these models are frequently highly specific for the particular application and their description often involves mathematical treatments inaccessible to the nonspecialist. For anyone completely new to the field, applying such techniques in their own work might seem at first sight to be a rather intimidating prospect. However, the fundamental principles underlying the approach are in essence rather simple, and the aim of this article is to provide an entry point to the field for a newcomer. It focuses mainly on these general principles, both kinetic and computational, which tend to be not particularly well covered in specialist literature, and shows that interesting information may even be obtained using very simple operations in a conventional spreadsheet.
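
    In that spirit, the following is a minimal sketch of Gillespie's direct stochastic simulation algorithm for a toy reversible reaction A <-> B. The rate constants, initial copy numbers, and stopping time are illustrative assumptions, not values from the article.

      import numpy as np

      rng = np.random.default_rng(1)
      k_f, k_r = 0.1, 0.05                          # forward and reverse rate constants (assumed)
      a, b, t, t_end = 100, 0, 0.0, 50.0
      trajectory = [(t, a, b)]

      while t < t_end:
          rates = np.array([k_f * a, k_r * b])      # propensities of A->B and B->A
          total = rates.sum()
          if total == 0.0:
              break                                 # no more reactions can fire
          t += rng.exponential(1.0 / total)         # waiting time to the next reaction
          if rng.random() < rates[0] / total:       # choose which reaction fires
              a, b = a - 1, b + 1
          else:
              a, b = a + 1, b - 1
          trajectory.append((t, a, b))

      print(trajectory[-1])                         # final time and copy numbers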

  4. Alternative methods for the median lethal dose (LD(50)) test: the up-and-down procedure for acute oral toxicity.

    PubMed

    Rispin, Amy; Farrar, David; Margosches, Elizabeth; Gupta, Kailash; Stitzel, Katherine; Carr, Gregory; Greene, Michael; Meyer, William; McCall, Deborah

    2002-01-01

    The authors have developed an improved version of the up-and-down procedure (UDP) as one of the replacements for the traditional acute oral toxicity test formerly used by the Organisation for Economic Co-operation and Development member nations to characterize industrial chemicals, pesticides, and their mixtures. This method improves the performance of acute testing for applications that use the median lethal dose (classic LD50) test while achieving significant reductions in animal use. It uses sequential dosing, together with sophisticated computer-assisted computational methods during the execution and calculation phases of the test. Staircase design, a form of sequential test design, can be applied to acute toxicity testing with its binary experimental endpoints (yes/no outcomes). The improved UDP provides a point estimate of the LD50 and approximate confidence intervals in addition to observed toxic signs for the substance tested. It does not provide information about the dose-response curve. Computer simulation was used to test performance of the UDP without the need for additional laboratory validation.
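
    The staircase logic of sequential up-and-down dosing can be illustrated with a small simulation against an assumed log-logistic dose-response curve. The starting dose, dose-progression factor, "true" LD50, and slope below are made-up illustration values, not the guideline defaults or the authors' simulation settings.

      import numpy as np

      rng = np.random.default_rng(2)
      true_ld50, slope = 200.0, 4.0                        # assumed "true" animal response

      def dies(dose):
          p = 1.0 / (1.0 + (true_ld50 / dose) ** slope)    # probability of death at this dose
          return rng.random() < p

      dose, factor, outcomes = 175.0, 10 ** 0.5, []        # start dose and dose-progression factor
      for _ in range(10):                                  # sequential dosing, one animal at a time
          death = dies(dose)
          outcomes.append((round(dose, 1), death))
          dose = dose / factor if death else dose * factor # step down after death, up after survival

      print(outcomes)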

  5. Towards a complex systems approach in sports injury research: simulating running-related injury development with agent-based modelling.

    PubMed

    Hulme, Adam; Thompson, Jason; Nielsen, Rasmus Oestergaard; Read, Gemma J M; Salmon, Paul M

    2018-06-18

    There have been recent calls for the application of the complex systems approach in sports injury research. However, beyond theoretical description and static models of complexity, little progress has been made towards formalising this approach in a way that is practical for sports injury scientists and clinicians. Therefore, our objective was to use a computational modelling method and develop a dynamic simulation in sports injury research. Agent-based modelling (ABM) was used to model the occurrence of sports injury in a synthetic athlete population. The ABM was developed based on sports injury causal frameworks and was applied in the context of distance running-related injury (RRI). Using the acute:chronic workload ratio (ACWR), we simulated the dynamic relationship between changes in weekly running distance and RRI through the manipulation of various 'athlete management tools'. The findings confirmed that building weekly running distances over time, even within the reported ACWR 'sweet spot', will eventually result in RRI as athletes reach and surpass their individual physical workload limits. Introducing training-related error into the simulation and the modelling of a 'hard ceiling' dynamic resulted in a higher RRI incidence proportion across the population at higher absolute workloads. The presented simulation offers a practical starting point to further apply more sophisticated computational models that can account for the complex nature of sports injury aetiology. Alongside traditional forms of scientific inquiry, the use of ABM and other simulation-based techniques could be considered as a complementary and alternative methodological approach in sports injury research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
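
    The acute:chronic workload ratio itself is easy to compute. The sketch below uses one common convention (acute load = latest week, chronic load = mean of the preceding four weeks) and an assumed 0.8-1.3 "sweet spot"; the weekly distances are illustrative and none of the agent-based injury mechanics of the paper are reproduced.

      import numpy as np

      weekly_km = np.array([20, 22, 25, 27, 30, 34, 40, 48])   # illustrative training build-up

      def acwr(history):
          """Acute load = latest week; chronic load = mean of the preceding four weeks."""
          return history[-1] / history[-5:-1].mean()

      for week in range(5, len(weekly_km) + 1):
          ratio = acwr(weekly_km[:week])
          flag = "ok" if 0.8 <= ratio <= 1.3 else "elevated risk"
          print(f"week {week}: ACWR = {ratio:.2f} ({flag})")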

  6. FDTD Modeling of LEMP Propagation in the Earth-Ionosphere Waveguide With Emphasis on Realistic Representation of Lightning Source

    NASA Astrophysics Data System (ADS)

    Tran, Thang H.; Baba, Yoshihiro; Somu, Vijaya B.; Rakov, Vladimir A.

    2017-12-01

    The finite difference time domain (FDTD) method in the 2-D cylindrical coordinate system was used to compute the nearly full-frequency-bandwidth vertical electric field and azimuthal magnetic field waveforms produced on the ground surface by lightning return strokes. The lightning source was represented by the modified transmission-line model with linear current decay with height, which was implemented in the FDTD computations as an appropriate vertical phased-current-source array. The conductivity of atmosphere was assumed to increase exponentially with height, with different conductivity profiles being used for daytime and nighttime conditions. The fields were computed at distances ranging from 50 to 500 km. Sky waves (reflections from the ionosphere) were identified in computed waveforms and used for estimation of apparent ionospheric reflection heights. It was found that our model reproduces reasonably well the daytime electric field waveforms measured at different distances and simulated (using a more sophisticated propagation model) by Qin et al. (2017). Sensitivity of model predictions to changes in the parameters of atmospheric conductivity profile, as well as influences of the lightning source characteristics (current waveshape parameters, return-stroke speed, and channel length) and ground conductivity were examined.
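
    For readers unfamiliar with the FDTD method, the sketch below shows its bare-bones 1-D form: leapfrog updates of staggered electric and magnetic fields with a soft Gaussian source, in normalized units. It only illustrates the kind of scheme that the paper scales up to a 2-D cylindrical Earth-ionosphere model; grid size, Courant number, and source are assumptions.

      import numpy as np

      n_cells, n_steps, src = 400, 800, 100
      courant = 0.5                          # c*dt/dz, below the 1-D stability limit of 1
      ez = np.zeros(n_cells)                 # electric field
      hy = np.zeros(n_cells - 1)             # magnetic field, staggered half a cell

      for n in range(n_steps):
          hy += courant * (ez[1:] - ez[:-1])                   # update H from the curl of E
          ez[1:-1] += courant * (hy[1:] - hy[:-1])             # update E from the curl of H
          ez[src] += np.exp(-0.5 * ((n - 60) / 15.0) ** 2)     # soft Gaussian source

      print(float(np.abs(ez).max()))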

  7. Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks

    PubMed Central

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    Background: The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. Methodology/Principal Findings: To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~50%) and latency of the optimized engineered gene networks. Conclusions/Significance: Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems. PMID:21850228

  8. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    PubMed

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.

  9. Numerical simulations for quantitative analysis of electrostatic interaction between atomic force microscopy probe and an embedded electrode within a thin dielectric: meshing optimization, sensitivity to potential distribution and impact of cantilever contribution

    NASA Astrophysics Data System (ADS)

    Azib, M.; Baudoin, F.; Binaud, N.; Villeneuve-Faure, C.; Bugarin, F.; Segonds, S.; Teyssedre, G.

    2018-04-01

    Recent experimental results demonstrated that an electrostatic force distance curve (EFDC) can be used for space charge probing in thin dielectric layers. A main advantage of the method is claimed to be its sensitivity to charge localization, which, however, needs to be substantiated by numerical simulations. In this paper, we have developed a model which permits us to compute an EFDC accurately by using the most sophisticated and accurate geometry for the atomic force microscopy probe. To avoid simplifications and in order to reproduce experimental conditions, the EFDC has been simulated for a system constituted of a polarized electrode embedded in a thin dielectric layer (SiNx). The individual contributions of forces on the tip and on the cantilever have been analyzed separately to account for possible artefacts. The EFDC sensitivity to potential distribution is studied through the change in electrode shape, namely the width and the depth. Finally, the numerical results have been compared with experimental data.

  10. Effect of task-related extracerebral circulation on diffuse optical tomography: experimental data and simulations on the forehead.

    PubMed

    Näsi, Tiina; Mäki, Hanna; Hiltunen, Petri; Heiskala, Juha; Nissilä, Ilkka; Kotilahti, Kalle; Ilmoniemi, Risto J

    2013-03-01

    The effect of task-related extracerebral circulatory changes on diffuse optical tomography (DOT) of brain activation was evaluated using experimental data from 14 healthy human subjects and computer simulations. Total hemoglobin responses to weekday-recitation, verbal-fluency, and hand-motor tasks were measured with a high-density optode grid placed on the forehead. The tasks caused varying levels of mental and physical stress, eliciting extracerebral circulatory changes that the reconstruction algorithm was unable to fully distinguish from cerebral hemodynamic changes, resulting in artifacts in the brain activation images. Crosstalk between intra- and extracranial layers was confirmed by the simulations. The extracerebral effects were attenuated by superficial signal regression and depended to some extent on the heart rate, thus allowing identification of hemodynamic changes related to brain activation during the verbal-fluency task. During the hand-motor task, the extracerebral component was stronger, making the separation less clear. DOT provides a tool for distinguishing extracerebral components from signals of cerebral origin. Especially in the case of strong task-related extracerebral circulatory changes, however, sophisticated reconstruction methods are needed to eliminate crosstalk artifacts.

  11. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  12. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  13. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  14. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  15. Use of Computer-Assisted Technologies (CAT) to Enhance Social, Communicative, and Language Development in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Ploog, Bertram O.; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J.

    2013-01-01

    Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer…

  16. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to peer with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  17. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  18. From naive to sophisticated behavior in multiagents-based financial market models

    NASA Astrophysics Data System (ADS)

    Mansilla, R.

    2000-09-01

    The behavior of the physical complexity and the mutual information function of the outcome of a model of heterogeneous, inductively rational agents inspired by the El Farol Bar problem and the Minority Game is studied. The first quantity is a measure rooted in the Kolmogorov-Chaitin theory and the second is a measure related to Shannon's information entropy. Extensive computer simulations were done, as a result of which an ansatz of the form C(l) = l^α is proposed for the physical complexity, and the dependence of the exponent α on the parameters of the model is established. The accuracy of our results and their relationship to the behavior of the mutual information function as a measure of the time correlation of the agents' choices are discussed.
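
    Fitting the exponent of such a power-law ansatz is a one-line regression in log-log space. The sketch below does this on synthetic data (not output of the Minority Game model); the true exponent and noise level are illustrative assumptions.

      import numpy as np

      l = np.arange(1, 64)
      alpha_true = 0.7
      noise = np.exp(np.random.default_rng(3).normal(0.0, 0.02, l.size))
      complexity = l ** alpha_true * noise                 # synthetic C(l) ~ l**alpha with scatter

      # log C = alpha * log l + const, so a degree-1 fit in log-log space recovers alpha
      alpha_hat, intercept = np.polyfit(np.log(l), np.log(complexity), 1)
      print(f"estimated alpha = {alpha_hat:.3f}")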

  19. DYNA3D: A computer code for crashworthiness engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallquist, J.O.; Benson, D.J.

    1986-09-01

    A finite element program with crashworthiness applications has been developed at LLNL. DYNA3D, an explicit, fully vectorized, finite deformation structural dynamics program, has four capabilities that are critical for the efficient and realistic modeling of crash phenomena: (1) fully optimized nonlinear solid, shell, and beam elements for representing a structure; (2) a broad range of constitutive models for simulating material behavior; (3) sophisticated contact algorithms for impact interactions; and (4) a rigid body capability to represent the bodies away from the impact region at a greatly reduced cost without sacrificing accuracy in the momentum calculations. Basic methodologies of the program are briefly presented along with several crashworthiness calculations. Efficiencies of the Hughes-Liu and Belytschko-Tsay shell formulations are considered.

  20. Formal treatment of astronomical images with spatially variable PSF

    NASA Astrophysics Data System (ADS)

    Sánchez, B. O.; Domínguez, M. J.; Lares, M.

    2017-10-01

    We present a python implementation of a method for PSF determination in the context of optimal subtraction of astronomical images. We introduce an expansion of the spatially variant point spread function (PSF) in terms of the Karhunen Loève basis. The advantage of this approach is that the basis is able to naturally adapt to the data, instead of imposing a fixed ad-hoc analytic form. Simulated image reconstruction was analyzed, by using the measured PSF, with good agreement in terms of sky background level between the reconstructed and original images. The technique is simple enough to be implemented on more sophisticated image subtraction methods, since it improves its results without extra computational cost in a spatially variant PSF environment.
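
    The core of such an approach can be sketched by building a Karhunen-Loève (PCA) basis from a stack of PSF stamps with an SVD, so that a spatially varying PSF is written as a mean plus a few data-driven modes. The stamps below are synthetic Gaussians with a drifting width standing in for real stars; this is an assumed toy setup, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(4)
      size, n_stars = 21, 60
      yy, xx = np.mgrid[:size, :size] - size // 2

      stamps = []
      for _ in range(n_stars):
          sigma = 1.5 + 0.8 * rng.random()                     # PSF width varies across the field
          psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
          stamps.append((psf / psf.sum()).ravel())
      stamps = np.array(stamps)

      mean = stamps.mean(axis=0)
      u, s, vt = np.linalg.svd(stamps - mean, full_matrices=False)
      kl_basis = vt[:3]                                        # first few Karhunen-Loeve modes

      # any stamp is approximated as the mean plus its projections onto the retained modes
      coeffs = (stamps[0] - mean) @ kl_basis.T
      recon = mean + coeffs @ kl_basis
      print("reconstruction error:", float(np.abs(recon - stamps[0]).max()))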

  1. Airburst-Generated Tsunamis

    NASA Astrophysics Data System (ADS)

    Berger, Marsha; Goodman, Jonathan

    2018-04-01

    This paper examines the question of whether smaller asteroids that burst in the air over water can generate tsunamis that could pose a threat to distant locations. Such airburst-generated tsunamis are qualitatively different than the more frequently studied earthquake-generated tsunamis, and differ as well from tsunamis generated by asteroids that strike the ocean. Numerical simulations are presented using the shallow water equations in several settings, demonstrating very little tsunami threat from this scenario. A model problem with an explicit solution that demonstrates and explains the same phenomena found in the computations is analyzed. We discuss the question of whether compressibility and dispersion are important effects that should be included, and show results from a more sophisticated model problem using the linearized Euler equations that begins to address this.
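
    To indicate what a shallow-water simulation of this kind involves at its simplest, the sketch below takes a single first-order Lax-Friedrichs scheme for the 1-D shallow water equations with a localized surface depression standing in for the airburst forcing. It is a toy illustration of the governing equations only, not the adaptive solver or the forcing model used in the paper; depth, grid, and depression size are assumptions.

      import numpy as np

      g, dx, n = 9.81, 100.0, 400
      h = np.full(n, 100.0)                        # water depth (m) over a flat 100 m deep ocean
      h[190:210] -= 1.0                            # initial surface depression from the "airburst"
      hu = np.zeros(n)                             # momentum h*u

      def flux(h, hu):
          u = hu / h
          return np.array([hu, hu * u + 0.5 * g * h**2])

      q = np.array([h, hu])
      for _ in range(300):
          dt = 0.4 * dx / np.max(np.abs(q[1] / q[0]) + np.sqrt(g * q[0]))   # CFL-limited step
          f = flux(q[0], q[1])
          q_new = q.copy()
          q_new[:, 1:-1] = 0.5 * (q[:, 2:] + q[:, :-2]) - 0.5 * dt / dx * (f[:, 2:] - f[:, :-2])
          q = q_new                                # boundary cells kept fixed for simplicity

      print("max surface deviation (m):", float(np.abs(q[0] - 100.0).max()))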

  2. Artificial life and the Chinese room argument.

    PubMed

    Anderson, David; Copeland, B Jack

    2002-01-01

    "Strong artificial life" refers to the thesis that a sufficiently sophisticated computer simulation of a life form is a life form in its own right. Can John Searle's Chinese room argument [12]-originally intended by him to show that the thesis he dubs "strong AI" is false-be deployed against strong ALife? We have often encountered the suggestion that it can be (even in print; see Harnad [8]). We do our best to transfer the argument from the domain of AI to that of ALife. We do so in order to show once and for all that the Chinese room argument proves nothing about ALife. There may indeed be powerful philosophical objections to the thesis of strong ALife, but the Chinese room argument is not among them.

  3. Evolution of the INMARSAT aeronautical system: Service, system, and business considerations

    NASA Technical Reports Server (NTRS)

    Sengupta, Jay R.

    1995-01-01

    A market-driven approach was adopted to develop enhancements to the Inmarsat-Aeronautical system, to address the requirements of potential new market segments. An evolutionary approach and a well differentiated product/service portfolio were required to minimize system upgrade costs and to maximize market penetration, respectively. The evolved system definition serves to minimize equipment cost/size/mass for short/medium range aircraft, by reducing the antenna gain requirement and relaxing the performance requirements for non-safety-related communications. A validation program involving simulation, laboratory tests, over-satellite tests and flight trials is being conducted to confirm the system definition. Extensive market research has been conducted to determine user requirements and to quantify market demand for future Inmarsat Aero-1 AES, using sophisticated computer assisted survey techniques.

  4. Simulations and Experiments of Dynamic Granular Compaction in Non-ideal Geometries

    NASA Astrophysics Data System (ADS)

    Homel, Michael; Herbold, Eric; Lind, John; Crum, Ryan; Hurley, Ryan; Akin, Minta; Pagan, Darren; LLNL Team

    2017-06-01

    Accurately describing the dynamic compaction of granular materials is a persistent challenge in computational mechanics. Using a synchrotron x-ray source we have obtained detailed imaging of the evolving compaction front in synthetic olivine powder impacted at 300-600 m/s. To facilitate imaging, a non-traditional sample geometry is used, producing multiple load paths within the sample. We demonstrate that (i) commonly used models for porous compaction may produce inaccurate results for complex loading, even if the 1-D, uniaxial-strain compaction response is reasonable, and (ii) the experimental results can be used along with simulations to determine parameters for sophisticated constitutive models that more accurately describe the strength, softening, bulking, and poroelastic response. Effects of experimental geometry and alternative configurations are discussed. Our understanding of the material response is further enhanced using mesoscale simulations that allow us to relate the mechanisms of grain fracture, contact, and comminution to the macroscale continuum response. Numerical considerations in both continuum and mesoscale simulations are described. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LDRD#16-ERD-010. LLNL-ABS-725113.

  5. MHD modeling of a DIII-D low-torque QH-mode discharge and comparison to observations

    NASA Astrophysics Data System (ADS)

    King, J. R.; Kruger, S. E.; Burrell, K. H.; Chen, X.; Garofalo, A. M.; Groebner, R. J.; Olofsson, K. E. J.; Pankin, A. Y.; Snyder, P. B.

    2017-05-01

    Extended-MHD modeling of DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)] quiescent H-mode (QH-mode) discharges with nonlinear NIMROD [C. R. Sovinec et al., J. Comput. Phys. 195, 355 (2004)] simulations saturates into a turbulent state but does not saturate when the steady-state flow inferred from measurements is not included. This is consistent with the experimental observations of the quiescent regime on DIII-D. The simulation with flow develops into a saturated turbulent state where the nϕ = 1 and 2 toroidal modes become dominant through an inverse cascade. Each mode in the range of nϕ = 1-5 is dominant at a different time. Consistent with experimental observations during QH-mode, the simulated state leads to large particle transport relative to the thermal transport. Analysis shows that the amplitude and phase of the density and temperature perturbations differ, resulting in greater fluctuation-induced convective particle transport relative to the convective thermal transport. Comparison to magnetic-coil measurements shows that rotation frequencies differ between the simulation and experiment, which indicates that more sophisticated extended-MHD two-fluid modeling is required.

  6. PhET Interactive Simulations: Transformative Tools for Teaching Chemistry

    ERIC Educational Resources Information Center

    Moore, Emily B.; Chamberlain, Julia M.; Parson, Robert; Perkins, Katherine K.

    2014-01-01

    Developing fluency across symbolic-, macroscopic-, and particulate-level representations is central to learning chemistry. Within the chemistry education community, animations and simulations that support multi-representational fluency are considered critical. With advances in the accessibility and sophistication of technology,…

  7. High-Performance Computing Act of 1991. Report of the Senate Committee on Commerce, Science, and Transportation on S. 272. Senate, 102d Congress, 1st Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.

    This report discusses Senate Bill no. 272, which provides for a coordinated federal research and development program to ensure continued U.S. leadership in high-performance computing. High performance computing is defined as representing the leading edge of technological advancement in computing, i.e., the most sophisticated computer chips, the…

  8. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation

    PubMed Central

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597
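
    The rejection-sampling core of approximate Bayesian computation, which DIY ABC wraps in a full scenario-comparison workflow, fits in a few lines. In the sketch below the "model" is a toy normal sample and the summary statistic is its mean; the prior, tolerance, and data are illustrative assumptions, not the microsatellite machinery of the program.

      import numpy as np

      rng = np.random.default_rng(5)
      observed = rng.normal(2.0, 1.0, size=50)           # stand-in for field data with unknown mean
      obs_summary = observed.mean()

      accepted = []
      for _ in range(20000):
          theta = rng.uniform(-5.0, 5.0)                 # draw a candidate parameter from the prior
          simulated = rng.normal(theta, 1.0, size=50)    # simulate data under that parameter
          if abs(simulated.mean() - obs_summary) < 0.1:  # keep it if the summaries are close enough
              accepted.append(theta)

      accepted = np.array(accepted)
      print("posterior mean ~", accepted.mean(), "from", accepted.size, "accepted draws")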

  9. Magnetosphere Modeling: From Cartoons to Simulations

    NASA Astrophysics Data System (ADS)

    Gombosi, T. I.

    2017-12-01

    Over the last half a century physics-based global computer simulations became a bridge between experiment and basic theory and now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems, to global MHD to MHD-PIC and discuss the role of state-of-the-art models in forecasting space weather.

  10. Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model

    NASA Astrophysics Data System (ADS)

    Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr

    2017-10-01

    Issues concerning the advanced numerical analysis of concrete building structures in sophisticated computing systems currently require the involvement of nonlinear mechanics tools. The efforts to design safer, more durable and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly presents another step towards approximating the real behaviour of concrete building structures within computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models within computer simulations often becomes problematic because these material models frequently contain parameters (material constants) whose values are difficult to obtain, yet obtaining correct parameter values is essential to ensure the proper function of the material model. One possibility that permits a successful solution of this problem is the use of optimization algorithms for inverse material parameter identification. Parameter identification goes hand in hand with experimental investigation: it seeks the parameter values of the material model for which the results of the computer simulation best approximate the experimental data. This paper is focused on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model. Within this paper, the material parameters of the model are identified on the basis of interaction between nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter of which take the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading, which may be further used for research involving dynamic and high-speed tensile loading. Based on the obtained results it can be concluded that this goal has been reached.
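
    The structure of optimization-based inverse identification can be sketched with a cheap analytic surrogate standing in for the nonlinear finite element run. The toy "material law", its parameters, and the synthetic load-extension curve below are assumptions for illustration; the actual study calibrates the Continuous Surface Cap Model against measured data.

      import numpy as np
      from scipy.optimize import minimize

      extension = np.linspace(0.0, 1.0, 50)
      true_params = (3.0, 5.0)                                   # "unknown" stiffness and softening rate
      measured = true_params[0] * extension * np.exp(-true_params[1] * extension)

      def simulate(params, x):
          """Surrogate for the FE simulation: load predicted for given material parameters."""
          k, s = params
          return k * x * np.exp(-s * x)

      def misfit(params):
          """Objective: squared distance between simulated and measured load-extension curves."""
          return np.sum((simulate(params, extension) - measured) ** 2)

      result = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")
      print("identified parameters:", result.x)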

  11. Progress in Computational Electron-Molecule Collisions

    NASA Astrophysics Data System (ADS)

    Rescigno, T. N.

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  12. Applications of the microdosimetric function implemented in the macroscopic particle transport simulation code PHITS.

    PubMed

    Sato, Tatsuhiko; Watanabe, Ritsuko; Sihver, Lembit; Niita, Koji

    2012-01-01

    Microdosimetric quantities such as lineal energy are generally considered to be better indices than linear energy transfer (LET) for expressing the relative biological effectiveness (RBE) of high charge and energy particles. To calculate their probability densities (PD) in macroscopic matter, it is necessary to integrate microdosimetric tools such as track-structure simulation codes with macroscopic particle transport simulation codes. As an integration approach, the mathematical model for calculating the PD of microdosimetric quantities developed based on track-structure simulations was incorporated into the macroscopic particle transport simulation code PHITS (Particle and Heavy Ion Transport code System). The improved PHITS enables the PD in macroscopic matter to be calculated within a reasonable computation time, while taking their stochastic nature into account. The microdosimetric function of PHITS was applied to biological dose estimation for charged-particle therapy and risk estimation for astronauts. The former application was performed in combination with the microdosimetric kinetic model, while the latter employed the radiation quality factor expressed as a function of lineal energy. Owing to the unique features of the microdosimetric function, the improved PHITS has the potential to establish more sophisticated systems for radiological protection in space as well as for the treatment planning of charged-particle therapy.

  13. Regional-Scale Modeling at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Adler, R.; Baker, D.; Braun, S.; Chou, M.-D.; Jasinski, M. F.; Jia, Y.; Kakar, R.; Karyampudi, M.; Lang, S.

    2003-01-01

    Over the past decade, the Goddard Mesoscale Modeling and Dynamics Group has used a popular regional scale model, MM5, to study precipitation processes. Our group is making contributions to the MM5 by incorporating the following physical and numerical packages: improved Goddard cloud processes, a land processes model (Parameterization for Land-Atmosphere-Cloud Exchange - PLACE), efficient but sophisticated radiative processes, conservation of hydrometeor mass (water budget), four-dimensional data assimilation for rainfall, and better computational methods for trace gas transport. At NASA Goddard, the MM5 has been used to study: (1) the impact of initial conditions, assimilation of satellite-derived rainfall, and cumulus parameterizations on rapidly intensifying oceanic cyclones, hurricanes and typhoons, (2) the dynamic and thermodynamic processes associated with the development of narrow cold frontal rainbands, (3) regional climate and water cycles, (4) the impact of vertical transport by clouds and lightning on trace gas distribution/production associated with South and North American mesoscale convective systems, (5) the development of a westerly wind burst (WWB) that occurred during the TOGA COARE and the diurnal variation of precipitation in the tropics, (6) a Florida sea breeze convective event and a Mid-US flood event using a sophisticated land surface model, (7) the influence of soil heterogeneity on land surface energy balance in the southwest GCIP region, (8) explicit simulations (with 1.33 to 4 km horizontal resolution) of hurricanes Bob (1991) and Bonnie (1998), (9) a heavy precipitation event over Taiwan, and (10) to make real-time forecasts for a major NASA field program. In this paper, the modifications and simulated cases will be described and discussed.

  14. Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.

    ERIC Educational Resources Information Center

    Strenglein, Denise

    1980-01-01

    It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…

  15. New Ways of Using Computers in Language Teaching. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Boswood, Tim, Ed.

    A collection of classroom approaches and activities using computers for language learning is presented. Some require sophisticated installations, but most do not, and most use software readily available on most workplace computer systems. The activities were chosen because they use sound language learning strategies. The book is divided into five…

  16. Web Service Model for Plasma Simulations with Automatic Post Processing and Generation of Visual Diagnostics*

    NASA Astrophysics Data System (ADS)

    Exby, J.; Busby, R.; Dimitrov, D. A.; Bruhwiler, D.; Cary, J. R.

    2003-10-01

    We present our design and initial implementation of a web service model for running particle-in-cell (PIC) codes remotely from a web browser interface. PIC codes have grown significantly in complexity and now often require parallel execution on multiprocessor computers, which in turn requires sophisticated post-processing and data analysis. A significant amount of time and effort is required for a physicist to develop all the necessary skills, at the expense of actually doing research. Moreover, parameter studies with a computationally intensive code justify the systematic management of results with an efficient way to communicate them among a group of remotely located collaborators. Our initial implementation uses the OOPIC Pro code [1], Linux, Apache, MySQL, Python, and PHP. The Interactive Data Language is used for visualization. [1] D.L. Bruhwiler et al., Phys. Rev. ST-AB 4, 101302 (2001). * This work is supported by DOE grant # DE-FG02-03ER83857 and by Tech-X Corp. ** Also University of Colorado.

  17. Data communication requirements for the advanced NAS network

    NASA Technical Reports Server (NTRS)

    Levin, Eugene; Eaton, C. K.; Young, Bruce

    1986-01-01

    The goal of the Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations, and by remote communications to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. In the 1987/1988 time period it is anticipated that a computer with 4 times the processing speed of a Cray 2 will be obtained and by 1990 an additional supercomputer with 16 times the speed of the Cray 2. The implications of this 20-fold increase in processing power on the data communications requirements are described. The analysis was based on models of the projected workload and system architecture. The results are presented together with the estimates of their sensitivity to assumptions inherent in the models.

  18. Home, Hearth and Computing.

    ERIC Educational Resources Information Center

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  19. Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows

    DTIC Science & Technology

    2015-06-01

    …sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well… Manna, M., "Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows," AIAA Journal, Vol. 32, No. 2, Feb. 1994.

  20. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  1. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  2. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  3. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  4. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  5. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  6. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  7. The Impact of a Simulation and Problem-Based Learning Design Project on Student Learning and Teamwork Skills. CSE Technical Report.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.

    This study examined a civil engineering capstone course that embedded a sophisticated simulation-based task within instruction. Students (n=28) were required to conduct a hazardous waste site investigation using simulation software designed specifically for the course, Interactive Site Investigation Software (ISIS). The software simulated…

  8. Towards a Comprehensive Computational Simulation System for Turbomachinery

    NASA Technical Reports Server (NTRS)

    Shih, Ming-Hsin

    1994-01-01

    The objective of this work is to develop algorithms associated with a comprehensive computational simulation system for turbomachinery flow fields. This development is accomplished in a modular fashion. These modules include grid generation, visualization, network, simulation, toolbox, and flow modules. An interactive grid generation module is customized to facilitate the grid generation process associated with complicated turbomachinery configurations. With its user-friendly graphical user interface, the user may interactively manipulate the default settings to obtain a quality grid within a fraction of the time that is usually required for building a grid about the same geometry with a general-purpose grid generation code. Non-Uniform Rational B-Spline formulations are utilized in the algorithm to maintain geometry fidelity while redistributing grid points on the solid surfaces. Bezier curve formulation is used to allow interactive construction of inner boundaries. It is also utilized to allow interactive point distribution. Cascade surfaces are transformed from three-dimensional surfaces of revolution into two-dimensional parametric planes for easy manipulation. Such a transformation allows these manipulated plane grids to be mapped to surfaces of revolution by any generatrix definition. A sophisticated visualization module is developed to allow visualization of both grid and flow solution, steady or unsteady. A network module is built to allow data transfer in a heterogeneous environment. A flow module is integrated into this system, using an existing turbomachinery flow code. A simulation module is developed to combine the network, flow, and visualization modules to achieve near real-time flow simulation about turbomachinery geometries. A toolbox module is developed to support the overall task. A batch version of the grid generation module is developed to allow portability and has been extended to allow dynamic grid generation for pitch-changing turbomachinery configurations. Various applications with different characteristics are presented to demonstrate the success of this system.
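
    A minimal Python sketch of the Bezier-based point distribution mentioned above is given below: grid points are clustered along a boundary by evaluating a cubic Bezier curve. The control points and point count are illustrative assumptions, not values from the actual module.

        # Hypothetical sketch of Bezier-based point redistribution along a grid
        # boundary; control points are chosen only to illustrate clustering.
        import numpy as np

        def bezier_points(p0, p1, p2, p3, n=21):
            """Evaluate a cubic Bezier curve at n parameter values in [0, 1]."""
            t = np.linspace(0.0, 1.0, n)[:, None]
            return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
                    + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

        # Control points picked to cluster points near one end of the boundary.
        pts = bezier_points(np.array([0.0, 0.0]), np.array([0.1, 0.4]),
                            np.array([0.6, 0.9]), np.array([1.0, 1.0]))
        print(pts[:3])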

  9. Assessment of input uncertainty by seasonally categorized latent variables using SWAT

    USDA-ARS?s Scientific Manuscript database

    Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...

  10. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    NASA Astrophysics Data System (ADS)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
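
    To make the cohesion-softening ingredient of such interface models concrete, the short Python sketch below evaluates a simple linear softening traction law. The tensile strength and critical opening used here are illustrative assumptions, not the constitutive model of the paper.

        # Illustrative sketch (not the paper's formulation): a linear cohesion-softening
        # traction law t(w) = ft * (1 - w / wc) for crack opening w, clipped at zero.
        def cohesive_traction(w, ft=3.0e6, wc=1.0e-4):
            """Normal traction (Pa) as a function of crack opening w (m)."""
            if w <= 0.0:
                return ft           # no opening yet: full tensile strength
            if w >= wc:
                return 0.0          # fully softened, traction-free crack
            return ft * (1.0 - w / wc)

        print(cohesive_traction(5.0e-5))  # halfway through softening -> 1.5e6 Pa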

  11. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  12. Improving Undergraduate Computer Instruction: Experiments and Strategies

    ERIC Educational Resources Information Center

    Kalman, Howard K.; Ellis, Maureen L.

    2007-01-01

    Today, undergraduate students enter college with increasingly more sophisticated computer skills compared to their counterparts of 20 years ago. However, many instructors are still using traditional instructional strategies to teach this new generation. This research study discusses a number of strategies that were employed to teach a…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoehler, M; McCallen, D; Noble, C

    The analysis, and subsequent retrofit, of concrete arch bridges during recent years has relied heavily on the use of computational simulation. For seismic analysis in particular, computer simulation, typically utilizing linear approximations of structural behavior, has become standard practice. This report presents the results of a comprehensive study of the significance of model sophistication (i.e. linear vs. nonlinear) and pertinent modeling assumptions on the dynamic response of concrete arch bridges. The study uses the Bixby Creek Bridge, located in California, as a case study. In addition to presenting general recommendations for analysis of this class of structures, this report providesmore » an independent evaluation of the proposed seismic retrofit for the Bixby Creek Bridge. Results from the study clearly illustrate a reduction of displacement drifts and redistribution of member forces brought on by the inclusion of material nonlinearity. The analyses demonstrate that accurate modeling of expansion joints, for the Bixby Creek Bridge in particular, is critical to achieve representative modal and transient behavior. The inclusion of near-field displacement pulses in ground motion records was shown to significantly increase demand on the relatively softer, longer period Bixby Creek Bridge arch. Stiffer, shorter period arches, however, are more likely susceptible to variable support motions arising from the canyon topography typical for this class of bridges.« less

  14. Optimizing radiotherapy protocols using computer automata to model tumour cell death as a function of oxygen diffusion processes.

    PubMed

    Paul-Gilloteaux, Perrine; Potiron, Vincent; Delpon, Grégory; Supiot, Stéphane; Chiavassa, Sophie; Paris, François; Costes, Sylvain V

    2017-05-23

    The concept of hypofractionation is gaining momentum in radiation oncology centres, enabled by recent advances in radiotherapy apparatus. The gain in efficacy of this innovative treatment must be defined. We present a computer model based on translational murine data for in silico testing and optimization of various radiotherapy protocols with respect to tumour resistance and the microenvironment heterogeneity. This model combines automata approaches with image processing algorithms to simulate the cellular response of tumours exposed to ionizing radiation, modelling the alteration of oxygen permeabilization in blood vessels against repeated doses, and introducing mitotic catastrophe (as opposed to arbitrarily delayed cell death) as a means of modelling radiation-induced cell death. Published data describing cell death in vitro as well as tumour oxygenation in vivo are used to inform parameters. Our model is validated by comparing simulations to in vivo data obtained from the radiation treatment of mice transplanted with human prostate tumours. We then predict the efficacy of untested hypofractionation protocols, hypothesizing that tumour control can be optimized by adjusting daily radiation dosage as a function of the degree of hypoxia in the tumour environment. Further biological refinement of this tool will permit the rapid development of more sophisticated strategies for radiotherapy.
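
    The Python sketch below illustrates, under stated assumptions, the kind of automaton update such a model performs: each cell survives a dose fraction with a linear-quadratic probability modulated by local oxygen tension. The parameter values, grid, and oxygen-enhancement form are placeholders, not the published model.

        # Hedged sketch of one automaton update step: cells survive a radiation
        # fraction with a linear-quadratic probability, attenuated under hypoxia
        # via an oxygen enhancement ratio (OER). All values are illustrative.
        import numpy as np

        rng = np.random.default_rng(0)

        def survival_prob(dose, p_o2, alpha=0.3, beta=0.03, oer_max=3.0, k=3.0):
            """Survival probability for one fraction of `dose` Gy at oxygen tension p_o2 (mmHg)."""
            oer = 1.0 + (oer_max - 1.0) * p_o2 / (p_o2 + k)   # 1 (anoxic) .. oer_max (well oxygenated)
            d_eff = dose * oer / oer_max                       # effective dose scaled by oxygenation
            return np.exp(-(alpha * d_eff + beta * d_eff ** 2))

        cells = np.ones((50, 50), dtype=bool)                # True = viable cell
        oxygen = rng.uniform(0.5, 40.0, size=cells.shape)    # mmHg, heterogeneous microenvironment
        survive = rng.random(cells.shape) < survival_prob(2.0, oxygen)
        cells &= survive
        print("surviving fraction:", cells.mean())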

  15. Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques

    NASA Astrophysics Data System (ADS)

    Elliott, Louie C.

    This dissertation reports on the application of numerical optimization techniques to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
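
    The "numerical derivatives with complex variables" mentioned above refers to the complex-step technique; the minimal Python sketch below (with an arbitrary test function, not one from the dissertation) contrasts it with a central finite difference.

        # Complex-step derivative f'(x) ~ Im(f(x + ih)) / h avoids the subtractive
        # cancellation that limits finite differences, so h can be made tiny.
        import numpy as np

        def f(x):
            return np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

        x, h = 1.5, 1e-20
        complex_step = np.imag(f(x + 1j * h)) / h            # accurate even for tiny h
        finite_diff = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6     # limited by cancellation error
        print(complex_step, finite_diff)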

  16. Using hypermedia to develop an intelligent tutorial/diagnostic system for the Space Shuttle Main Engine Controller Lab

    NASA Technical Reports Server (NTRS)

    Oreilly, Daniel; Williams, Robert; Yarborough, Kevin

    1988-01-01

    This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capable of isolating lab failures at least to the major lab component. The system was implemented using Hypercard, which is a hypermedia program running on Apple Macintosh computers. Hypercard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic Hypercard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter written in the Hypercard language (Hypertalk) to implement the backward-chaining, rule-based logic commonly found in diagnostic systems using Prolog. Some of the advantages of Hypercard in developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its ability as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on card size, the lack of color and grey-scale graphics support, and the lack of selectable fonts for text fields.

  17. Computational discovery of metal-organic frameworks with high gas deliverable capacity

    NASA Astrophysics Data System (ADS)

    Bao, Yi

    Metal-organic frameworks (MOFs) are a rapidly emerging class of nanoporous materials with largely tunable chemistry and diverse applications in gas storage, gas purification, catalysis, sensing and drug delivery. Efforts have been made to develop new MOFs with desirable properties both experimentally and computationally for decades. To guide experimental synthesis, we here develop a computational methodology to explore MOFs with high gas deliverable capacity. This de novo design procedure applies known chemical reactions, considers synthesizability and geometric requirements of organic linkers, and efficiently evolves a population of MOFs to optimize a desirable property. We identify 48 MOFs with higher methane deliverable capacity at a 65-5.8 bar condition than the MOF-5 reference in nine networks. In a more comprehensive work, we predict two sets of MOFs with high methane deliverable capacity at a 65-5.8 bar loading-delivery condition or a 35-5.8 bar loading-delivery condition. We also optimize a set of MOFs with high methane accessible internal surface area to investigate the relationship between deliverable capacities and internal surface area. This methodology can be extended to MOFs with multiple types of linkers and multiple SBUs. Flexible MOFs may allow for sophisticated heat management strategies and also provide higher gas deliverable capacity than rigid frameworks. We investigate flexible MOFs, such as MIL-53 families, and Fe(bdp) and Co(bdp) analogs, to understand the structural phase transition of frameworks and the resulting influence on heat of adsorption. Challenges of simulating a system with a flexible host structure and incoming guest molecules are discussed. Preliminary results from isotherm simulation using the hybrid MC/MD simulation scheme on MIL-53(Cr) are presented. Suggestions for proceeding to understand the free energy profile of flexible MOFs are provided.

  18. Passive motion paradigm: an alternative to optimal control.

    PubMed

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the "degrees of freedom (DoFs) problem," the common core of production, observation, reasoning, and learning of "actions." OCT, directly derived from engineering design techniques of control systems, quantifies task goals as "cost functions" and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative "softer" approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that "animates" the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints "at runtime," hence solving the "DoFs problem" without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not only to shape motor output during action execution but also to provide the self with information on the feasibility, consequence, understanding and meaning of "potential actions." In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures.
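
    The force-field idea at the core of PMP can be sketched in a few lines of Python: a virtual force field attracts the hand of a planar two-link arm toward a goal, and joint motion follows the Jacobian transpose, with no explicit inverse kinematics or cost function. The link lengths, gains, and iteration count are illustrative assumptions.

        # Hedged sketch of a PMP-like update for a planar 2-link arm.
        import numpy as np

        L1, L2, K, dt = 0.3, 0.25, 5.0, 0.01

        def hand(q):
            return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                             L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

        def jacobian(q):
            s1, c1 = np.sin(q[0]), np.cos(q[0])
            s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
            return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                             [ L1 * c1 + L2 * c12,  L2 * c12]])

        q, goal = np.array([0.3, 0.5]), np.array([0.2, 0.35])
        for _ in range(2000):
            force = K * (goal - hand(q))          # attractor force field induced by the goal
            q = q + dt * jacobian(q).T @ force    # "animate the body schema" via J^T, not J^-1
        print(hand(q), goal)                      # hand converges toward the goal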

  19. High-Fidelity Roadway Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit

    2010-01-01

    Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated by professional artists manually using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, reducing the tedious manual editing needed for road modeling dramatically. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not the geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals, and as a result, the generated roads can support not only general applications such as games and simulations in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, as well as the civil engineering road design principles involved. The proposed method can be used in many applications that have stringent requirements on high precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.

  20. The Multi-dimensional Character of Core-collapse Supernovae

    DOE PAGES

    Hix, W. R.; Lentz, E. J.; Bruenn, S. W.; ...

    2016-03-01

    Core-collapse supernovae, the culmination of massive stellar evolution, are spectacular astronomical events and the principal actors in the story of our elemental origins. Our understanding of these events, while still incomplete, centers around a neutrino-driven central engine that is highly hydrodynamically unstable. Increasingly sophisticated simulations reveal a shock that stalls for hundreds of milliseconds before reviving. Though brought back to life by neutrino heating, the development of the supernova explosion is inextricably linked to multi-dimensional fluid flows. In this paper, the outcomes of three-dimensional simulations that include sophisticated nuclear physics and spectral neutrino transport are juxtaposed to learn about the nature of the three-dimensional fluid flow that shapes the explosion. Comparison is also made between the results of simulations in spherical symmetry from several groups, to give ourselves confidence in the understanding derived from this juxtaposition.

  1. Student Thinking Processes. The Influence of Immediate Computer Access on Students' Thinking. First- and Second-Year Findings. ACOT Report #3.

    ERIC Educational Resources Information Center

    Tierney, Robert J.

    This 2-year longitudinal study explored whether computers promote more sophisticated thinking, and examined how students' thinking changes as they become experienced computer users. The first-year study examined the thinking process of four ninth-grade Apple Classrooms of Tomorrow (ACOT) students. The second-year study continued following these…

  2. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  3. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    NASA Astrophysics Data System (ADS)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  4. Molecular dynamics-driven drug discovery: leaping forward with confidence.

    PubMed

    Ganesan, Aravindhan; Coote, Michelle L; Barakat, Khaled

    2017-02-01

    Given the significant time and financial costs of developing a commercial drug, it remains important to constantly reform the drug discovery pipeline with novel technologies that can narrow the candidates down to the most promising lead compounds for clinical testing. The past decade has witnessed tremendous growth in computational capabilities that enable in silico approaches to expedite drug discovery processes. Molecular dynamics (MD) has become a particularly important tool in drug design and discovery. From classical MD methods to more sophisticated hybrid classical/quantum mechanical (QM) approaches, MD simulations are now able to offer extraordinary insights into ligand-receptor interactions. In this review, we discuss how the applications of MD approaches are significantly transforming current drug discovery and development efforts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Failover in Cellular Automata

    NASA Astrophysics Data System (ADS)

    Kumar, Shailesh; Rao, Shrisha

    This paper studies a phenomenon called failover, and shows that this phenomenon (in particular, stateless failover) can be modeled by Game of Life cellular automata. This is the first time that this sophisticated real-life system behavior has been modeled in abstract terms. A cellular automata (CA) configuration is constructed that exhibits emergent failover. The configuration is based on standard Game of Life rules. Gliders and glider-guns form the core messaging structure in the configuration. The blinker is represented as the basic computational unit, and it is shown how it can be recreated in case of a failure. Stateless failover using the primary-backup mechanism is demonstrated. The details of the CA components used in the configuration and its working are described, and a simulation of the complete configuration is also presented.
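
    For reference, the standard Game of Life update rule on which the configuration is built can be written compactly in Python; the glider-gun failover layout itself is not reproduced here, only the rule and the blinker mentioned above.

        # Minimal synchronous Game of Life update on a toroidal grid.
        import numpy as np

        def life_step(grid):
            """One update of Conway's Game of Life (wrap-around boundaries)."""
            neighbours = sum(np.roll(np.roll(grid, dx, 0), dy, 1)
                             for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                             if (dx, dy) != (0, 0))
            return (neighbours == 3) | (grid & (neighbours == 2))

        blinker = np.zeros((5, 5), dtype=bool)
        blinker[2, 1:4] = True                          # the "basic computational unit" above
        print(life_step(life_step(blinker))[2, 1:4])    # period 2: back to original after two steps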

  6. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    NASA Astrophysics Data System (ADS)

    Endah Saraswati, Teguh; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory.

  7. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted Matlab as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  8. Electronic Networking as an Avenue of Enhanced Professional Interchange.

    ERIC Educational Resources Information Center

    Ratcliff, James L.

    Electronic networking is communication between two or more people that involves one or more telecommunications media. There is electronic networking software available for most computers, including IBM, Apple, and Radio Shack personal computers. Depending upon the sophistication of the hardware and software used, individuals and groups can…

  9. Teaching for CAD Expertise

    ERIC Educational Resources Information Center

    Chester, Ivan

    2007-01-01

    CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…

  10. Computer Aided Tests.

    ERIC Educational Resources Information Center

    Steinke, Elisabeth

    An approach to using the computer to assemble German tests is described. The purposes of the system would be: (1) an expansion of the bilingual lexical memory bank to list and store idioms of all degrees of difficulty, with frequency data and with complete and sophisticated retrieval possibility for assembly; (2) the creation of an…

  11. Towards real-time communication between in vivo neurophysiological data sources and simulator-based brain biomimetic models.

    PubMed

    Lee, Giljae; Matsunaga, Andréa; Dura-Bernal, Salvador; Zhang, Wenjie; Lytton, William W; Francis, Joseph T; Fortes, José Ab

    2014-11-01

    Development of more sophisticated implantable brain-machine interfaces (BMIs) will require both interpretation of the neurophysiological data being measured and subsequent determination of signals to be delivered back to the brain. Computational models are at the heart of a BMI and therefore an essential tool in both of these processes. One approach is to utilize brain biomimetic models (BMMs) to develop and instantiate these algorithms. These then must be connected as hybrid systems in order to interface the BMM with in vivo data acquisition devices and prosthetic devices. The combined system then provides a test bed for neuroprosthetic rehabilitative solutions and medical devices for the repair and enhancement of the damaged brain. We propose here a computer network-based design for this purpose, detailing its internal modules and data flows. We describe a prototype implementation of the design, enabling interaction between the Plexon Multichannel Acquisition Processor (MAP) server, a commercial tool to collect signals from microelectrodes implanted in a live subject, and a BMM, a NEURON-based model of sensorimotor cortex capable of controlling a virtual arm. The prototype implementation supports an online mode for real-time simulations, as well as an offline mode for data analysis and simulations without real-time constraints, and provides binning operations to discretize continuous input to the BMM and filtering operations for dealing with noise. Evaluation demonstrated that the implementation successfully delivered monkey spiking activity to the BMM through LAN environments, respecting real-time constraints.
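
    A minimal Python sketch of the binning operation described above is given below; the bin width, channel layout, and spike times are illustrative assumptions rather than details of the prototype.

        # Hedged sketch: spike timestamps per channel are discretized into
        # fixed-width bins before being fed to the brain biomimetic model.
        import numpy as np

        def bin_spikes(spike_times_by_channel, t_start, t_end, bin_ms=50.0):
            """Return an array of shape (n_channels, n_bins) of spike counts."""
            edges = np.arange(t_start, t_end + bin_ms / 1000.0, bin_ms / 1000.0)
            return np.array([np.histogram(times, bins=edges)[0]
                             for times in spike_times_by_channel])

        spikes = [np.array([0.012, 0.030, 0.180]),   # channel 0 spike times (s)
                  np.array([0.055, 0.090, 0.091])]   # channel 1
        print(bin_spikes(spikes, 0.0, 0.2))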

  12. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    NASA Astrophysics Data System (ADS)

    Chen, Goong; Wang, Yi-Ching; Perronnet, Alain; Gu, Cong; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O.

    2017-03-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not likely to be correct. We discuss the effects of terrain on pulverization using the information from the recovered flight-data-recorder and show our forensics and assessments of what may have happened during the final moments of the crash. Finally, we point out that our study has potential for being made into real-time flight crash simulators to help the study of crashworthiness and survivability for future aviation safety. Some forward-looking statements are also made.

  13. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    Author: Blake M. Wanier. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models … as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.

  14. Toward a molecular programming language for algorithmic self-assembly

    NASA Astrophysics Data System (ADS)

    Patitz, Matthew John

    Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.

  15. Aircrew Training Devices: Utility and Utilization of Advanced Instructional Features (Phase IV--Summary Report).

    ERIC Educational Resources Information Center

    Polzella, Donald J.; And Others

    Modern aircrew training devices (ATDs) are equipped with sophisticated hardware and software capabilities, known as advanced instructional features (AIFs), that permit a simulator instructor to prepare briefings, manage training, vary task difficulty/fidelity, monitor performance, and provide feedback for flight simulation training missions. The…

  16. Application of large-scale, multi-resolution watershed modeling framework using the Hydrologic and Water Quality System (HAWQS)

    USDA-ARS?s Scientific Manuscript database

    In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...

  17. A structure adapted multipole method for electrostatic interactions in protein dynamics

    NASA Astrophysics Data System (ADS)

    Niedermeier, Christoph; Tavan, Paul

    1994-07-01

    We present an algorithm for rapid approximate evaluation of electrostatic interactions in molecular dynamics simulations of proteins. Traditional algorithms require computational work of the order O(N²) for a system of N particles. Truncation methods, which try to avoid that effort, entail intolerably large errors in forces, energies and other observables. Hierarchical multipole expansion algorithms, which can account for the electrostatics to numerical accuracy, scale with O(N log N) or even with O(N) if they are augmented by a sophisticated scheme for summing up forces. To further reduce the computational effort, we propose an algorithm that also uses a hierarchical multipole scheme but considers only the first two multipole moments (i.e., charges and dipoles). Our strategy is based on the consideration that numerical accuracy may not be necessary to reproduce protein dynamics with sufficient correctness. As opposed to previous methods, our scheme for hierarchical decomposition is adjusted to structural and dynamical features of the particular protein considered rather than chosen rigidly as a cubic grid. As compared to truncation methods, we manage to reduce errors in the computation of electrostatic forces by a factor of 10 with only marginal additional effort.
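
    The low-order multipole idea (keeping only charges and dipoles) can be illustrated with a short Python sketch that compares the exact potential of a small charge cluster at a distant point with its monopole-plus-dipole approximation; the cluster, observation point, and units are arbitrary test data, not the paper's system.

        # Far-field potential of a charge cluster from its total charge (monopole)
        # and dipole moment only, instead of summing over every particle.
        # Coulomb constant omitted purely for illustration.
        import numpy as np

        rng = np.random.default_rng(1)
        pos = rng.normal(0.0, 0.5, size=(20, 3))       # cluster of 20 charges near the origin
        q = rng.uniform(-1.0, 1.0, size=20)

        r_obs = np.array([10.0, 0.0, 0.0])              # distant observation point

        exact = np.sum(q / np.linalg.norm(r_obs - pos, axis=1))

        center = pos.mean(axis=0)
        Q = q.sum()                                     # monopole moment
        p = ((pos - center) * q[:, None]).sum(axis=0)   # dipole moment about the centre
        d = r_obs - center
        approx = Q / np.linalg.norm(d) + p @ d / np.linalg.norm(d) ** 3

        print(exact, approx)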

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  19. A framework using cluster-based hybrid network architecture for collaborative virtual surgery.

    PubMed

    Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann

    2009-12-01

    Research on collaborative virtual environments (CVEs) opens the opportunity for simulating the cooperative work in surgical operations. It is however a challenging task to implement a high performance collaborative surgical simulation system because of the difficulty in maintaining state consistency with minimum network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and sliding window technique. The robustness of the framework is guaranteed by the failure detection chain which enables smooth transition when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments have been conducted to evaluate the system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.
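
    As a rough illustration of the sliding-window acknowledgment scheme (not the paper's implementation), the Python sketch below tracks per-cluster acknowledgments and advances the window only when the oldest update has been acknowledged by every cluster; window size and cluster names are assumptions.

        # Hedged sketch of reliable-multicast bookkeeping with a sliding window.
        class SlidingWindowSender:
            def __init__(self, window_size, clusters):
                self.window_size = window_size
                self.clusters = set(clusters)
                self.base = 0                       # lowest unacknowledged sequence number
                self.next_seq = 0
                self.pending = {}                   # seq -> set of clusters still to ack

            def can_send(self):
                return self.next_seq < self.base + self.window_size

            def send(self):
                assert self.can_send()
                seq = self.next_seq
                self.pending[seq] = set(self.clusters)
                self.next_seq += 1
                return seq                          # multicast of state update `seq` goes here

            def ack(self, seq, cluster):
                self.pending.get(seq, set()).discard(cluster)
                while self.base in self.pending and not self.pending[self.base]:
                    del self.pending[self.base]     # fully acknowledged: slide the window
                    self.base += 1

        s = SlidingWindowSender(4, ["clusterA", "clusterB"])
        seq = s.send()
        s.ack(seq, "clusterA"); s.ack(seq, "clusterB")
        print(s.base)                               # -> 1, the window has advanced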

  20. Numerical Benchmark of 3D Ground Motion Simulation in the Alpine valley of Grenoble, France.

    NASA Astrophysics Data System (ADS)

    Tsuno, S.; Chaljub, E.; Cornou, C.; Bard, P.

    2006-12-01

    Thanks to the use of sophisticated numerical methods and to access to increasing computational resources, our predictions of strong ground motion are becoming more and more realistic and need to be carefully compared. We report our effort to benchmark numerical methods of ground motion simulation in the case of the valley of Grenoble in the French Alps. The Grenoble valley is typical of a moderate seismicity area where strong site effects occur. The benchmark consisted of computing the seismic response of the 'Y'-shaped Grenoble valley to (i) two local earthquakes (Ml<=3) for which recordings were available, and (ii) two local hypothetical events (Mw=6) occurring on the so-called Belledonne Border Fault (BBF) [1]. A free-style prediction was also proposed, in which participants were allowed to vary the source and/or the model parameters and were asked to provide the resulting uncertainty in their estimation of ground motion. We received a total of 18 contributions from 14 different groups; 7 of these use 3D methods, among which 3 could handle surface topography; the other half comprises predictions based upon 1D (2 contributions), 2D (4 contributions), and empirical Green's function (EGF) (3 contributions) methods. The maximum frequency analysed ranged between 2.5 Hz for 3D calculations and 40 Hz for EGF predictions. We present a detailed comparison of the different predictions using raw indicators (e.g. peak values of ground velocity and acceleration, Fourier spectra, site over reference spectral ratios, ...) as well as sophisticated misfit criteria based upon previous works [2,3]. We further discuss the variability in estimating the importance of particular effects such as non-linear rheology or surface topography. References: [1] Thouvenot F. et al., The Belledonne Border Fault: identification of an active seismic strike-slip fault in the western Alps, Geophys. J. Int., 155 (1), p. 174-192, 2003. [2] Anderson J., Quantitative measure of the goodness-of-fit of synthetic seismograms, proceedings of the 13th World Conference on Earthquake Engineering, Vancouver, paper #243, 2004. [3] Kristekova M. et al., Misfit Criteria for Quantitative Comparison of Seismograms, Bull. Seism. Soc. Am., in press, 2006.
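
    To make the "raw indicators" concrete, the Python sketch below computes a peak-ground-velocity ratio and a normalized L2 misfit between two synthetic traces; the traces and the specific indicators are illustrative stand-ins, not the benchmark's actual criteria.

        # Simple comparison indicators between a reference and a simulated trace.
        import numpy as np

        t = np.linspace(0.0, 10.0, 1000)
        obs = np.sin(2 * np.pi * 1.00 * t) * np.exp(-0.3 * t)        # stand-in "recorded" trace
        syn = 0.9 * np.sin(2 * np.pi * 1.05 * t) * np.exp(-0.3 * t)  # stand-in "simulated" trace

        pgv_ratio = np.max(np.abs(syn)) / np.max(np.abs(obs))        # peak-amplitude ratio
        misfit = np.sqrt(np.sum((syn - obs) ** 2) / np.sum(obs ** 2))  # normalized L2 misfit
        print(f"PGV ratio: {pgv_ratio:.2f}, normalized L2 misfit: {misfit:.2f}")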

  1. Acceleration of atmospheric Cherenkov telescope signal processing to real-time speed with the Auto-Pipe design system

    NASA Astrophysics Data System (ADS)

    Tyson, Eric J.; Buckley, James; Franklin, Mark A.; Chamberlain, Roger D.

    2008-10-01

    The imaging atmospheric Cherenkov technique for high-energy gamma-ray astronomy is emerging as an important new technique for studying the high energy universe. Current experiments have data rates of ≈20TB/year and duty cycles of about 10%. In the future, more sensitive experiments may produce up to 1000 TB/year. The data analysis task for these experiments requires keeping up with this data rate in close to real-time. Such data analysis is a classic example of a streaming application with very high performance requirements. This class of application often benefits greatly from the use of non-traditional approaches for computation including using special purpose hardware (FPGAs and ASICs), or sophisticated parallel processing techniques. However, designing, debugging, and deploying to these architectures is difficult and thus they are not widely used by the astrophysics community. This paper presents the Auto-Pipe design toolset that has been developed to address many of the difficulties in taking advantage of complex streaming computer architectures for such applications. Auto-Pipe incorporates a high-level coordination language, functional and performance simulation tools, and the ability to deploy applications to sophisticated architectures. Using the Auto-Pipe toolset, we have implemented the front-end portion of an imaging Cherenkov data analysis application, suitable for real-time or offline analysis. The application operates on data from the VERITAS experiment, and shows how Auto-Pipe can greatly ease performance optimization and application deployment of a wide variety of platforms. We demonstrate a performance improvement over a traditional software approach of 32x using an FPGA solution and 3.6x using a multiprocessor based solution.

  2. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    USGS Publications Warehouse

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

    The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high speed computers, distributed communications, distributed data bases, high technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  3. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…

  4. Evaluation of Imagine Learning English, a Computer-Assisted Instruction of Language and Literacy for Kindergarten Students

    ERIC Educational Resources Information Center

    Longberg, Pauline Oliphant

    2012-01-01

    As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…

  5. Microcomputer Based Computer-Assisted Learning System: CASTLE.

    ERIC Educational Resources Information Center

    Garraway, R. W. T.

    The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…

  6. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  7. Results from teleoperated free-flying spacecraft simulations in the Martin Marietta space operations simulator lab

    NASA Technical Reports Server (NTRS)

    Hartley, Craig S.

    1990-01-01

    To augment the capabilities of the Space Transportation System, NASA has funded studies and developed programs aimed at developing reusable, remotely piloted spacecraft and satellite servicing systems capable of delivering, retrieving, and servicing payloads at altitudes and inclinations beyond the reach of the present Shuttle Orbiters. Since the mid 1970's, researchers at the Martin Marietta Astronautics Group Space Operations Simulation (SOS) Laboratory have been engaged in investigations of remotely piloted and supervised autonomous spacecraft operations. These investigations were based on high fidelity, real-time simulations and have covered a wide range of human factors issues related to controllability. Among these are: (1) mission conditions, including thruster plume impingements and signal time delays; (2) vehicle performance variables, including control authority, control harmony, minimum impulse, and cross coupling of accelerations; (3) maneuvering task requirements such as target distance and dynamics; (4) control parameters including various control modes and rate/displacement deadbands; and (5) display parameters involving camera placement and function, visual aids, and presentation of operational feedback from the spacecraft. This presentation includes a brief description of the capabilities of the SOS Lab to simulate real-time free-flyer operations using live video, advanced technology ground and on-orbit workstations, and sophisticated computer models of on-orbit spacecraft behavior. Sample results from human factors studies in the five categories cited above are provided.

  8. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  9. Cloud computing can simplify HIT infrastructure management.

    PubMed

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  10. Fidelity of Simulation and Transfer of Training: A Review of the Problem.

    ERIC Educational Resources Information Center

    Gerathewohl, Siegfried J.

    The document is concerned with the several kinds of flight simulators available today which are valuable tools for research, training, and proficiency measurement. They range from simple trainer type devices useful for learning specific tasks, to very sophisticated ground based facilities and aircraft used for crew training under simulated…

  11. First of all: Do not harm! Use of simulation for the training of regional anaesthesia techniques: Which skills can be trained without the patient as substitute for a mannequin.

    PubMed

    Sujatta, Susanne

    2015-03-01

    The character of clinical skills training is always influenced by technical improvements and cultural changes. Over the last years, two trends have changed the traditional apprenticeship-style training in regional anaesthesia: firstly, the development of ultrasound-guided regional anaesthesia, and secondly, the reduced acceptance of using patients as mannequins for invasive techniques. Against this background, simulation techniques are explored, ranging from simple low-fidelity part-task trainers for needle-handling skills to highly sophisticated virtual reality models. This review discusses the full range of available options along with their benefits and drawbacks. The task in clinical practice will be to choose the right level of sophistication for the desired approach and trainee level. However, the transfer of simulated skills to clinical practice has not been evaluated, and it remains to be proven whether simulation-trained skills can, as a last consequence, reduce the risk to patients. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.
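
    As an editorial illustration of the surrogate idea behind FluxNet (a fast emulator standing in for a slower radiative transfer calculation), the hedged Python sketch below fits a small neural network to a made-up flux formula; it assumes scikit-learn is available and does not reflect FluxNet's actual architecture or training data.

      # Hedged sketch: train a small neural network to emulate a slower "reference"
      # flux calculation. The toy flux formula and parameter ranges are invented
      # purely for illustration; scikit-learn is assumed to be installed.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      def reference_flux(sza_deg, cloud_tau, albedo):
          """Stand-in for an expensive plane-parallel calculation (toy formula)."""
          mu0 = np.clip(np.cos(np.radians(sza_deg)), 0.0, None)
          return 1361.0 * mu0 * np.exp(-0.1 * cloud_tau) * (1.0 - 0.5 * albedo)

      def sample_states(n):
          return np.column_stack([rng.uniform(0, 85, n),       # solar zenith angle [deg]
                                  rng.uniform(0, 30, n),       # cloud optical depth
                                  rng.uniform(0.05, 0.9, n)])  # surface albedo

      X_train = sample_states(5000)
      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
      net.fit(X_train, reference_flux(*X_train.T))

      X_test = sample_states(1000)
      err = np.abs(net.predict(X_test) - reference_flux(*X_test.T)).mean()
      print(f"mean absolute emulation error: {err:.1f} W/m^2")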

  13. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  14. Consequence modeling using the fire dynamics simulator.

    PubMed

    Ryder, Noah L; Sutula, Jason A; Schemel, Christopher F; Hamer, Andrew J; Van Brunt, Vincent

    2004-11-11

    The use of Computational Fluid Dynamics (CFD) and in particular Large Eddy Simulation (LES) codes to model fires provides an efficient tool for the prediction of large-scale effects that include plume characteristics, combustion product dispersion, and heat effects to adjacent objects. This paper illustrates the strengths of the Fire Dynamics Simulator (FDS), an LES code developed by the National Institute of Standards and Technology (NIST), through several small and large-scale validation runs and process safety applications. The paper presents two fire experiments: a small room fire and a large (15 m diameter) pool fire. The model results are compared to experimental data and demonstrate good agreement between the models and data. The validation work is then extended to demonstrate applicability to process safety concerns by detailing a model of a tank farm fire and a model of the ignition of a gaseous fuel in a confined space. In this simulation, a room was filled with propane, given time to disperse, and was then ignited. The model yields accurate results of the dispersion of the gas throughout the space. This information can be used to determine flammability and explosive limits in a space and can be used in subsequent models to determine the pressure and temperature waves that would result from an explosion. The model dispersion results were compared to an experiment performed by Factory Mutual. Using the above examples, this paper demonstrates that FDS is ideally suited to build realistic models of process geometries in which large-scale explosion and fire failure risks can be evaluated, with several distinct advantages over more traditional CFD codes. Namely, transient solutions to fire and explosion growth can be produced with less sophisticated hardware (lower cost) than needed for traditional CFD codes (PC-type computer versus UNIX workstation) and can be solved for longer time histories (on the order of hundreds of seconds of computed time) with minimal computer resources and model run time. Additionally, results can be analyzed, viewed, and tabulated during and following a model run within a PC environment. There are some tradeoffs, however, as rapid computations on PCs may require a sacrifice in the grid resolution or in the sub-grid modeling, depending on the size of the geometry modeled.
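
    As a small, hedged post-processing illustration of the flammability point above (not part of FDS itself), the Python sketch below flags which cells of a computed fuel-concentration field lie within the flammable range, using approximate propane limits of roughly 2.1-9.5 vol%; the concentration field is random stand-in data.

      # Hedged illustration: check a (stand-in) propane concentration field against
      # approximate lower/upper flammability limits of about 2.1-9.5 vol%.
      import numpy as np

      LFL, UFL = 0.021, 0.095   # approximate propane flammability limits (volume fraction)

      rng = np.random.default_rng(3)
      concentration = rng.uniform(0.0, 0.12, size=(20, 20, 10))  # toy 3D dispersion result

      flammable = (concentration >= LFL) & (concentration <= UFL)
      print(f"{flammable.sum()} of {flammable.size} cells "
            f"({100.0 * flammable.mean():.1f}%) are within the flammable range")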

  15. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  16. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  17. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  18. Large-Signal Klystron Simulations Using KLSC

    NASA Astrophysics Data System (ADS)

    Carlsten, B. E.; Ferguson, P.

    1997-05-01

    We describe a new, 2-1/2 dimensional, klystron-simulation code, KLSC. This code has a sophisticated input cavity model for calculating the klystron gain with arbitrary input cavity matching and tuning, and is capable of modeling coupled output cavities. We will discuss the input and output cavity models, and present simulation results from a high-power, S-band design. We will use these results to explore tuning issues with coupled output cavities.

  19. MCViNE- An object oriented Monte Carlo neutron ray tracing simulation package

    DOE PAGES

    Lin, J. Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; ...

    2015-11-28

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. As a result, with simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
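
    To make the recursive multiple-scattering idea concrete, the hedged sketch below is a toy, one-dimensional Monte Carlo ray trace in which an object-oriented scatterer may recursively re-scatter a neutron before it reaches a detector; the class names and physics are illustrative only and do not reflect MCViNE's actual API.

      # Toy illustration (not MCViNE's API): an object-oriented Monte Carlo ray
      # trace in which a scatterer recursively re-scatters a neutron, mirroring
      # the recursive multiple-scattering idea described above.
      import random
      from dataclasses import dataclass

      @dataclass
      class Neutron:
          weight: float      # statistical weight of the ray
          direction: float   # 1D direction (+1 or -1) for simplicity

      class Scatterer:
          """A slab that scatters with probability p_scatter and otherwise transmits."""
          def __init__(self, p_scatter, max_order=3):
              self.p_scatter = p_scatter
              self.max_order = max_order

          def interact(self, neutron, order=0):
              """Recursively follow scattering orders up to max_order."""
              if order >= self.max_order or random.random() > self.p_scatter:
                  return neutron  # transmitted (or scattering chain truncated)
              # isotropic 1D scatter: pick a new direction, attenuate the weight
              scattered = Neutron(weight=neutron.weight * 0.9,
                                  direction=random.choice([-1.0, 1.0]))
              return self.interact(scattered, order + 1)   # multiple scattering

      class Detector:
          def __init__(self):
              self.counts = 0.0
          def record(self, neutron):
              if neutron.direction > 0:   # only forward-going neutrons are detected
                  self.counts += neutron.weight

      if __name__ == "__main__":
          random.seed(0)
          sample, detector = Scatterer(p_scatter=0.4), Detector()
          for _ in range(100_000):
              detector.record(sample.interact(Neutron(weight=1.0, direction=1.0)))
          print(f"detected intensity: {detector.counts:.1f}")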

  20. Development of a New System for Transport Simulation and Analysis at General Atomics

    NASA Astrophysics Data System (ADS)

    St. John, H. E.; Peng, Q.; Freeman, J.; Crotinger, J.

    1997-11-01

    General Atomics has begun a long term program to improve all aspects of experimental data analysis related to DIII--D. The objective is to make local and visiting physicists as productive as possible, with only a small investment in training, by developing intuitive, sophisticated interfaces to existing and newly created computer programs. Here we describe our initial work and results of a pilot project in this program. The pilot project is a collaborative effort between LLNL and GA that will ultimately result in the merger of Corsica and ONETWO (and selected modules from other codes) into a new advanced transport code system. The initial goal is to produce a graphical user interface to the transport code ONETWO that will couple to a programmable (steerable) front end designed for the transport system. This will be an object-oriented scheme written primarily in Python. The programmable application will integrate existing C, C++, and Fortran methods in a single computational paradigm. Its most important feature is the use of plug-in physics modules, which will allow a high degree of customization.
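
    The plug-in physics-module idea can be sketched as below; this is a generic, hypothetical Python registry pattern written for illustration, not the GA/LLNL code, and all module names and profiles are placeholders.

      # Hedged sketch (hypothetical names, not the GA/LLNL code): a minimal plug-in
      # registry of the kind a steerable Python front end could use to swap physics
      # modules in a transport run without changing the driver.
      PHYSICS_MODULES = {}

      def physics_module(name):
          """Decorator that registers a physics model under a lookup name."""
          def register(cls):
              PHYSICS_MODULES[name] = cls
              return cls
          return register

      @physics_module("neoclassical")
      class NeoclassicalTransport:
          def diffusivity(self, rho):
              return 0.1 + 0.0 * rho          # placeholder profile [m^2/s]

      @physics_module("gyro-bohm")
      class GyroBohmTransport:
          def diffusivity(self, rho):
              return 0.5 * rho ** 2           # placeholder profile [m^2/s]

      def run_transport(model_name, radii):
          model = PHYSICS_MODULES[model_name]()   # plug-in selected at run time
          return [model.diffusivity(r) for r in radii]

      print(run_transport("gyro-bohm", [0.0, 0.5, 1.0]))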

  1. seismo-live: Training in Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, Heiner; Krischer, Lion; van Driel, Martin; Tape, Carl

    2017-04-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation and data processing technologies in research projects. At the same time, well-engineered community codes make it easy to obtain results, with the attendant danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, an example shows how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth Science courses on compute-intensive research areas. We present recent developments and new features.
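
    As an illustration of the kind of notebook exercise such a platform hosts (written for this summary, not taken from seismo-live), the short cell below solves the 1D acoustic wave equation with second-order finite differences in pure NumPy.

      # Illustrative notebook-style cell: a minimal finite-difference solver for the
      # 1D acoustic wave equation, the kind of forward-modeling exercise that
      # interactive seismology notebooks typically walk through.
      import numpy as np

      nx, nt = 500, 1000                 # grid points in space and time
      dx, dt, c = 1.0, 2.0e-4, 3000.0    # spacing [m], time step [s], velocity [m/s]
      assert c * dt / dx <= 1.0          # CFL stability condition

      p = np.zeros(nx)                   # pressure at time n
      p_old = np.zeros(nx)               # pressure at time n-1
      src = 250                          # source grid index

      for it in range(nt):
          # second-order centered differences in space and time
          d2p = np.zeros(nx)
          d2p[1:-1] = p[2:] - 2.0 * p[1:-1] + p[:-2]
          p_new = 2.0 * p - p_old + (c * dt / dx) ** 2 * d2p
          # Ricker-like source time function injected at one grid point
          t = it * dt
          arg = (np.pi * 10.0 * (t - 0.05)) ** 2
          p_new[src] += (1.0 - 2.0 * arg) * np.exp(-arg) * dt ** 2
          p_old, p = p, p_new

      print("max |p| after %d steps: %.3e" % (nt, np.abs(p).max()))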

  2. A normal mode-based geometric simulation approach for exploring biologically relevant conformational transitions in proteins.

    PubMed

    Ahmed, Aqeel; Rippmann, Friedrich; Barnickel, Gerhard; Gohlke, Holger

    2011-07-25

    A three-step approach for multiscale modeling of protein conformational changes is presented that incorporates information about preferred directions of protein motions into a geometric simulation algorithm. The first two steps are based on a rigid cluster normal-mode analysis (RCNMA). Low-frequency normal modes are used in the third step (NMSim) to extend the recently introduced idea of constrained geometric simulations of diffusive motions in proteins by biasing backbone motions of the protein, whereas side-chain motions are biased toward favorable rotamer states. The generated structures are iteratively corrected regarding steric clashes and stereochemical constraint violations. The approach allows performing three simulation types: unbiased exploration of conformational space; pathway generation by a targeted simulation; and radius of gyration-guided simulation. When applied to a data set of proteins with experimentally observed conformational changes, conformational variabilities are reproduced very well for 4 out of 5 proteins that show domain motions, with correlation coefficients r > 0.70 and as high as r = 0.92 in the case of adenylate kinase. In 7 out of 8 cases, NMSim simulations starting from unbound structures are able to sample conformations that are similar (root-mean-square deviation = 1.0-3.1 Å) to ligand bound conformations. An NMSim generated pathway of conformational change of adenylate kinase correctly describes the sequence of domain closing. The NMSim approach is a computationally efficient alternative to molecular dynamics simulations for conformational sampling of proteins. The generated conformations and pathways of conformational transitions can serve as input to docking approaches or as starting points for more sophisticated sampling techniques.
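
    The basic ingredient of normal-mode-biased sampling can be illustrated with the hedged toy below (not the authors' RCNMA/NMSim code): build a harmonic elastic-network Hessian for a one-dimensional bead chain, diagonalize it, and displace the structure along its softest non-trivial mode.

      # Toy sketch: diagonalize the Hessian of a chain of identical springs and
      # displace the beads along the lowest-frequency non-trivial normal mode,
      # i.e. the ingredient that biases geometric sampling toward soft motions.
      import numpy as np

      n_beads, k_spring = 20, 1.0
      coords = np.arange(n_beads, dtype=float)          # beads on a line, spacing 1.0

      # Hessian of nearest-neighbor springs (1D, so an n x n matrix)
      H = np.zeros((n_beads, n_beads))
      for i in range(n_beads - 1):
          H[i, i] += k_spring
          H[i + 1, i + 1] += k_spring
          H[i, i + 1] -= k_spring
          H[i + 1, i] -= k_spring

      eigvals, eigvecs = np.linalg.eigh(H)
      mode = eigvecs[:, 1]           # index 0 is the zero-frequency translation mode
      displaced = coords + 0.5 * mode   # structure biased along a soft mode

      print("lowest non-zero eigenvalue:", round(eigvals[1], 4))
      print("rms displacement:", round(float(np.sqrt(np.mean((displaced - coords) ** 2))), 4))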

  3. Machine learning of frustrated classical spin models. I. Principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ce; Zhai, Hui

    2017-10-01

    This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by classical Monte Carlo simulations of the XY model on frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
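
    For intuition, the hedged sketch below applies principal component analysis to mock spin configurations; it uses simple Ising-like ±1 spins (ordered versus random) rather than the frustrated XY model of the paper, only to show that the leading component separates the two phases.

      # Hedged sketch: PCA on mock spin configurations. Ordered ("low T") samples are
      # nearly uniform +/-1 rows, disordered ("high T") samples are random spins; the
      # leading principal component distinguishes the two groups.
      import numpy as np

      rng = np.random.default_rng(0)
      n_sites, n_samples = 100, 200

      low_T = np.tile(rng.choice([-1.0, 1.0], size=(n_samples // 2, 1)), (1, n_sites))
      low_T += 0.1 * rng.standard_normal(low_T.shape)                    # nearly ordered
      high_T = rng.choice([-1.0, 1.0], size=(n_samples // 2, n_sites))   # random states
      X = np.vstack([low_T, high_T])

      # PCA by hand: center the data and diagonalize the covariance matrix
      Xc = X - X.mean(axis=0)
      cov = Xc.T @ Xc / (len(X) - 1)
      eigvals, eigvecs = np.linalg.eigh(cov)
      pc1 = Xc @ eigvecs[:, -1]        # projection on the largest-variance direction

      print("mean |PC1|, ordered phase :", np.abs(pc1[: n_samples // 2]).mean().round(2))
      print("mean |PC1|, disordered    :", np.abs(pc1[n_samples // 2:]).mean().round(2))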

  4. Wargaming and interactive color graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bly, S.; Buzzell, C.; Smith, G.

    1980-08-04

    JANUS is a two-sided interactive color graphic simulation in which human commanders can direct their forces, each trying to accomplish their mission. This competitive synthetic battlefield is used to explore the range of human ingenuity under conditions of incomplete information about enemy strength and deployment. Each player can react to new situations by planning new unit movements, using conventional and nuclear weapons, or modifying unit objectives. Conventional direct fire among tanks, infantry fighting vehicles, helicopters, and other units is automated subject to constraints of target acquisition, reload rate, range, suppression, etc. Artillery and missile indirect fire systems deliver conventional munitions, smoke, and nuclear weapons. Players use reconnaissance units, helicopters, or fixed wing aircraft to search for enemy unit locations. Counter-battery radars acquire enemy artillery. The JANUS simulation at LLL has demonstrated the value of the computer as a sophisticated blackboard. A small dedicated minicomputer is adequate for detailed calculations, and may be preferable to sharing a more powerful machine. Real-time color interactive graphics are essential to allow realistic command decision inputs. Competitive human-versus-human synthetic experiences are intense and well-remembered. 2 figures.

  5. Flat-plate solar array project. Volume 8: Project analysis and integration

    NASA Technical Reports Server (NTRS)

    Mcguire, P.; Henry, P.

    1986-01-01

    Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized, the PA&I planning function and its relation to project direction are discussed, and the principal analytical models developed by PA&I for its analysis and assessment activities are reviewed.

  6. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  7. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  8. Effect of two viscosity models on lethality estimation in sterilization of liquid canned foods.

    PubMed

    Calderón-Alvarado, M P; Alvarado-Orozco, J M; Herrera-Hernández, E C; Martínez-González, G M; Miranda-López, R; Jiménez-Islas, H

    2016-09-01

    A numerical study on 2D natural convection in cylindrical cavities during the sterilization of liquid foods was performed. The mathematical model was based on momentum and energy balances and predicts both the heating dynamics of the slowest heating zone (SHZ) and the lethal rate achieved in homogeneous liquid canned foods. Two sophistication levels were proposed in viscosity modelling: 1) a constant average viscosity and 2) an Arrhenius-type model to include the effect of temperature on viscosity. The remaining thermodynamic properties were kept constant. The governing equations were spatially discretized via orthogonal collocation (OC) with a mesh size of 25 × 25. Computational simulations were performed using proximate and thermodynamic data for carrot-orange soup, broccoli-cheddar soup, tomato puree, and cream-style corn. Flow patterns, isothermals, heating dynamics of the SHZ, and the sterilization rate achieved for the cases studied were compared for both viscosity models. The dynamics of the coldest point and the lethal rate F0 in all food fluids studied were approximately equal in both cases, although the second sophistication level is closer to physical behavior. The model's accuracy compared favorably with the reported sterilization time for cream-style corn packed in a 303 × 406 can, predicting 66 min versus an experimental time of 68 min at a retort temperature of 121.1 ℃. © The Author(s) 2016.
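
    The two viscosity treatments can be written down compactly as below; the reference viscosity and activation energy in this hedged sketch are placeholder values, not the properties of any of the food fluids studied.

      # Illustrative sketch of the two viscosity treatments compared above; the
      # activation energy and reference viscosity are made-up placeholder values.
      import numpy as np

      R = 8.314            # J/(mol K)
      mu_ref = 0.05        # Pa.s at the reference temperature (assumed value)
      T_ref = 293.15       # K
      E_a = 2.0e4          # J/mol, assumed activation energy

      def mu_constant(T):
          """Sophistication level 1: a single average viscosity."""
          return np.full_like(np.asarray(T, dtype=float), mu_ref)

      def mu_arrhenius(T):
          """Sophistication level 2: Arrhenius-type dependence on temperature."""
          return mu_ref * np.exp(E_a / R * (1.0 / T - 1.0 / T_ref))

      T = np.linspace(293.15, 394.25, 5)   # 20 C up to the 121.1 C retort temperature
      for Ti, m1, m2 in zip(T, mu_constant(T), mu_arrhenius(T)):
          print(f"T = {Ti - 273.15:6.1f} C   constant: {m1:.4f} Pa.s   Arrhenius: {m2:.4f} Pa.s")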

  9. Interfacing Simulations with Training Content

    DTIC Science & Technology

    2006-09-01

    a panelist at numerous international training and elearning conferences, ADL Plugfests and IMS Global Learning Consortium Open Technical Forums. Dr...communication technologies has enabled higher quality learning to be made available through increasingly sophisticated modes of presentation. Traditional...However, learning is a comprehensive process which does not simply consist of the transmission and learning of content. While simulations offer the

  10. That Elusive, Eclectic Thing Called Thermal Environment: What a Board Should Know About It

    ERIC Educational Resources Information Center

    Schutte, Frederick

    1970-01-01

    Discussion of proper thermal environment for protection of sophisticated educational equipment such as computer and data-processing machines, magnetic tapes, closed-circuit television and video tape communications systems.

  11. School Architecture: New Activities Dictate New Designs.

    ERIC Educational Resources Information Center

    Hill, Robert

    1984-01-01

    Changing educational requirements have led to many school building design developments in recent years, including technologically sophisticated music and computer rooms, large school kitchens, and Title IX-mandated equal facilities available for both sexes. (MLF)

  12. The origins of computer weather prediction and climate modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Peter

    2008-03-20

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  13. The origins of computer weather prediction and climate modeling

    NASA Astrophysics Data System (ADS)

    Lynch, Peter

    2008-03-01

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  14. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  15. Integrating the iPod Touch in K-12 Education: Visions and Vices

    ERIC Educational Resources Information Center

    Banister, Savilla

    2010-01-01

    Advocates of ubiquitous computing have long been documenting classroom benefits of one-to-one ratios of students to handheld or laptop computers. The recent sophisticated capabilities of the iPod Touch, iPhone, and iPad have encouraged further speculation on exactly how K-12 teaching and learning might be energized by such devices. This paper…

  16. Realistic computer network simulation for network intrusion detection dataset generation

    NASA Astrophysics Data System (ADS)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  17. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  18. Computation of repetitions and regularities of biologically weighted sequences.

    PubMed

    Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K

    2006-01-01

    Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
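
    To fix ideas, the hedged sketch below shows naive pattern matching in a weighted sequence, where each position holds a probability distribution over symbols and the pattern is reported wherever its occurrence probability meets a threshold 1/k; it illustrates the problem statement, not the authors' algorithms.

      # Hedged illustration (not the authors' algorithm): naive pattern matching in a
      # weighted sequence, reporting the pattern wherever its occurrence probability
      # is at least 1/k.
      def weighted_matches(weighted_seq, pattern, k):
          """Return start positions where the pattern occurs with probability >= 1/k."""
          threshold, m, hits = 1.0 / k, len(pattern), []
          for start in range(len(weighted_seq) - m + 1):
              prob = 1.0
              for offset, symbol in enumerate(pattern):
                  prob *= weighted_seq[start + offset].get(symbol, 0.0)
                  if prob < threshold:       # early exit once the probability drops too low
                      break
              if prob >= threshold:
                  hits.append(start)
          return hits

      # A short weighted DNA sequence: each position maps symbols to probabilities.
      seq = [
          {"A": 1.0},
          {"C": 0.5, "T": 0.5},
          {"G": 0.9, "A": 0.1},
          {"A": 1.0},
          {"C": 0.5, "T": 0.5},
      ]
      print(weighted_matches(seq, "ACG", k=4))   # [0]: probability 0.45 >= 1/4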

  19. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to address the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first of its kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  20. Global 7 km mesh nonhydrostatic Model Intercomparison Project for improving TYphoon forecast (TYMIP-G7): experimental design and preliminary results

    NASA Astrophysics Data System (ADS)

    Nakano, Masuo; Wada, Akiyoshi; Sawada, Masahiro; Yoshimura, Hiromasa; Onishi, Ryo; Kawahara, Shintaro; Sasaki, Wataru; Nasuno, Tomoe; Yamaguchi, Munehiko; Iriguchi, Takeshi; Sugi, Masato; Takeuchi, Yoshiaki

    2017-03-01

    Recent advances in high-performance computers facilitate operational numerical weather prediction by global hydrostatic atmospheric models with horizontal resolutions of ˜ 10 km. Given further advances in such computers and the fact that the hydrostatic balance approximation becomes invalid for spatial scales < 10 km, the development of global nonhydrostatic models with high accuracy is urgently required. The Global 7 km mesh nonhydrostatic Model Intercomparison Project for improving TYphoon forecast (TYMIP-G7) is designed to understand and statistically quantify the advantages of high-resolution nonhydrostatic global atmospheric models to improve tropical cyclone (TC) prediction. A total of 137 sets of 5-day simulations using three next-generation nonhydrostatic global models with horizontal resolutions of 7 km and a conventional hydrostatic global model with a horizontal resolution of 20 km were run on the Earth Simulator. The three 7 km mesh nonhydrostatic models are the nonhydrostatic global spectral atmospheric Double Fourier Series Model (DFSM), the Multi-Scale Simulator for the Geoenvironment (MSSG) and the Nonhydrostatic ICosahedral Atmospheric Model (NICAM). The 20 km mesh hydrostatic model is the operational Global Spectral Model (GSM) of the Japan Meteorological Agency. Compared with the 20 km mesh GSM, the 7 km mesh models reduce systematic errors in the TC track, intensity and wind radii predictions. The benefits of the multi-model ensemble method were confirmed for the 7 km mesh nonhydrostatic global models. While the three 7 km mesh models reproduce the typical axisymmetric mean inner-core structure, including the primary and secondary circulations, the simulated TC structures and their intensities in each case are very different for each model. In addition, the simulated track is not consistently better than that of the 20 km mesh GSM. These results suggest that the development of more sophisticated initialization techniques and model physics is needed to further improve the TC prediction.

  1. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Yier

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project try to address all these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  2. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  3. A non-local mixing-length theory able to compute core overshooting

    NASA Astrophysics Data System (ADS)

    Gabriel, M.; Belkacem, K.

    2018-04-01

    Turbulent convection is certainly one of the most important and thorny issues in stellar physics. Our deficient knowledge of this crucial physical process introduces a fairly large uncertainty concerning the internal structure and evolution of stars. A striking example is overshoot at the edge of convective cores. Indeed, nearly all stellar evolutionary codes treat the overshooting zones in a very approximative way that considers both their extent and the profile of the temperature gradient as free parameters. There are only a few sophisticated theories of stellar convection such as Reynolds stress approaches, but they also require the adjustment of a non-negligible number of free parameters. We present here a theory, based on the plume theory as well as on the mean-field equations, but without relying on the usual Taylor's closure hypothesis. It leads us to a set of eight differential equations plus a few algebraic ones. Our theory is essentially a non-mixing length theory. It enables us to compute the temperature gradient in a shrinking convective core and its overshooting zone. The case of an expanding convective core is also discussed, though more briefly. Numerical simulations have improved quickly during recent years, enabling us to foresee that they will probably soon provide a model of convection adapted to the computation of 1D stellar models.

  4. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy the verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a protocol of verification for the measurement-only blind quantum computing.

  5. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  6. Transmission loss optimization in acoustic sandwich panels

    NASA Astrophysics Data System (ADS)

    Makris, S. E.; Dym, C. L.; MacGregor Smith, J.

    1986-06-01

    Considering the sound transmission loss (TL) of a sandwich panel as the single objective, different optimization techniques are examined and a sophisticated computer program is used to find the optimum TL. Also, for one of the possible case studies such as core optimization, closed-form expressions are given between TL and the core-design variables for different sets of skins. The significance of these functional relationships lies in the fact that the panel designer can bypass the necessity of using a sophisticated software package in order to assess explicitly the dependence of the TL on core thickness and density.

  7. Artificial Exo-Society Modeling: a New Tool for SETI Research

    NASA Astrophysics Data System (ADS)

    Gardner, James N.

    2002-01-01

    One of the newest fields of complexity research is artificial society modeling. Methodologically related to artificial life research, artificial society modeling utilizes agent-based computer simulation tools like SWARM and SUGARSCAPE developed by the Santa Fe Institute, Los Alamos National Laboratory and the Brookings Institution in an effort to introduce an unprecedented degree of rigor and quantitative sophistication into social science research. The broad aim of artificial society modeling is to begin the development of a more unified social science that embeds cultural evolutionary processes in a computational environment that simulates demographics, the transmission of culture, conflict, economics, disease, the emergence of groups and coadaptation with an environment in a bottom-up fashion. When an artificial society computer model is run, artificial societal patterns emerge from the interaction of autonomous software agents (the "inhabitants" of the artificial society). Artificial society modeling invites the interpretation of society as a distributed computational system and the interpretation of social dynamics as a specialized category of computation. Artificial society modeling techniques offer the potential of computational simulation of hypothetical alien societies in much the same way that artificial life modeling techniques offer the potential to model hypothetical exobiological phenomena. NASA recently announced its intention to begin exploring the possibility of including artificial life research within the broad portfolio of scientific fields comprised by the interdisciplinary astrobiology research endeavor. It may be appropriate for SETI researchers to likewise commence an exploration of the possible inclusion of artificial exo-society modeling within the SETI research endeavor. Artificial exo-society modeling might be particularly useful in a post-detection environment by (1) coherently organizing the set of data points derived from a detected ETI signal, (2) mapping trends in the data points over time (assuming receipt of an extended ETI signal), and (3) projecting such trends forward to derive alternative cultural evolutionary scenarios for the exo-society under analysis. The latter exercise might be particularly useful to compensate for the inevitable time lag between generation of an ETI signal and receipt of an ETI signal on Earth. For this reason, such an exercise might be a helpful adjunct to the decisional process contemplated by Paragraph 9 of the Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence.

  8. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.

  10. Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?

    PubMed

    Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian

    2016-09-01

    The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainties. TCP and NTCP were computed, based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% lead to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80 Gy EQD2. TCP is robust to dosimetric uncertainties when dose prescription is in the flatter region of the dose-response curve at doses >75 Gy. For OARs, improved clinical outcome is expected by reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
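
    The simulation idea can be illustrated with the hedged sketch below, which samples systematic and per-fraction random dose errors for a virtual cohort and evaluates a logistic dose-response model at the delivered dose; the D50, slope and prescription values are placeholders, not the study's fitted parameters.

      # Hedged sketch: sample systematic and random (per-fraction) dose errors for a
      # virtual cohort and evaluate a logistic dose-response curve at the delivered
      # dose. D50, gamma and the prescription are illustrative placeholders only.
      import numpy as np

      rng = np.random.default_rng(42)

      def response_probability(dose, d50=70.0, gamma=2.0):
          """Logistic dose-response curve with normalized slope gamma at D50."""
          return 1.0 / (1.0 + np.exp(4.0 * gamma * (1.0 - dose / d50)))

      def simulate_cohort(planned_dose, sys_sd, rand_sd, n_fractions=4, n_patients=10000):
          # one systematic offset per patient, one random error per BT fraction
          systematic = rng.normal(0.0, sys_sd, n_patients)
          random_err = rng.normal(0.0, rand_sd, (n_patients, n_fractions)).mean(axis=1)
          delivered = planned_dose * (1.0 + systematic + random_err)
          return response_probability(delivered).mean()

      planned = 85.0   # Gy EQD2, illustrative prescription
      print("no uncertainty      :", round(float(response_probability(planned)), 3))
      print("5% systematic       :", round(float(simulate_cohort(planned, 0.05, 0.00)), 3))
      print("10% random/fraction :", round(float(simulate_cohort(planned, 0.00, 0.10)), 3))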

  11. Adapting NBODY4 with a GRAPE-6a Supercomputer for Web Access, Using NBodyLab

    NASA Astrophysics Data System (ADS)

    Johnson, V.; Aarseth, S.

    2006-07-01

    A demonstration site has been developed by the authors that enables researchers and students to experiment with the capabilities and performance of NBODY4 running on a GRAPE-6a over the web. NBODY4 is a sophisticated open-source N-body code for high accuracy simulations of dense stellar systems (Aarseth 2003). In 2004, NBODY4 was successfully tested with a GRAPE-6a, yielding an unprecedented low-cost tool for astrophysical research. The GRAPE-6a is a supercomputer card developed by astrophysicists to accelerate high accuracy N-body simulations with a cluster or a desktop PC (Fukushige et al. 2005, Makino & Taiji 1998). The GRAPE-6a card became commercially available in 2004, runs at 125 Gflops peak, has a standard PCI interface and costs less than $10,000. Researchers running the widely used NBODY6 (which does not require GRAPE hardware) can compare their own PC or laptop performance with simulations run on http://www.NbodyLab.org. Such comparisons may help justify acquisition of a GRAPE-6a. For workgroups such as university physics or astronomy departments, the demonstration site may be replicated or serve as a model for a shared computing resource. The site was constructed using an NBodyLab server-side framework.
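
    The hedged toy below shows the O(N^2) direct-summation force kernel that GRAPE boards accelerate in hardware, wrapped in a few leapfrog steps; NBODY4 itself employs far more sophisticated machinery (such as a fourth-order Hermite integrator and regularization of close encounters) that this illustration omits.

      # Toy sketch of the O(N^2) direct-summation force kernel (softened, G = 1)
      # that GRAPE hardware accelerates, advanced with a simple leapfrog scheme.
      import numpy as np

      def accelerations(pos, mass, softening=1.0e-3):
          """Pairwise gravitational accelerations by direct summation."""
          n = len(mass)
          acc = np.zeros_like(pos)
          for i in range(n):
              dr = pos - pos[i]                                  # vectors to all particles
              dist3 = (np.sum(dr * dr, axis=1) + softening ** 2) ** 1.5
              dist3[i] = np.inf                                  # skip self-interaction
              acc[i] = np.sum((mass / dist3)[:, None] * dr, axis=0)
          return acc

      rng = np.random.default_rng(1)
      n = 256
      pos = rng.standard_normal((n, 3))
      vel = np.zeros((n, 3))
      mass = np.full(n, 1.0 / n)

      dt = 1.0e-3
      for _ in range(10):                                        # a few leapfrog (KDK) steps
          vel += 0.5 * dt * accelerations(pos, mass)
          pos += dt * vel
          vel += 0.5 * dt * accelerations(pos, mass)

      momentum = np.linalg.norm((mass[:, None] * vel).sum(axis=0))
      print("total momentum (should stay ~0):", momentum)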

  12. Modeling interactions of agriculture and groundwater nitrate contaminants: application of The STICS-Eau-Dyssée coupled models over the Seine River Basin

    NASA Astrophysics Data System (ADS)

    Tavakoly, A. A.; Habets, F.; Saleh, F.; Yang, Z. L.

    2017-12-01

    Human activities such as the cultivation of N-fixing crops, burning of fossil fuels, discharging of industrial and domestic effluents, and extensive usage of fertilizers have recently accelerated the nitrogen loading to watersheds worldwide. Increasing nitrate concentration in surface water and groundwater is a major concern in watersheds with extensive agricultural activities. Nutrient enrichment is one of the major environmental problems in the French coastal zone. To understand and predict interactions between agriculture, surface water and groundwater nitrate contaminants, this study presents a modeling framework that couples the agronomic STICS model with Eau-Dyssée, a distributed hydrologic modeling system, to simulate groundwater-surface water interaction. The coupled system is implemented on the Seine River Basin with an area of 88,000 km2 to compute daily nitrate contaminants. Representing a sophisticated hydrosystem with several aquifers and including the megalopolis of Paris, the Seine River Basin is well-known as one of the most productive agricultural areas in France. The STICS-Eau-Dyssée framework is evaluated for a long-term simulation covering 39 years (1971-2010). Model results show that simulated nitrate depends strongly on the inflow produced by surface and subsurface waters. Daily simulation shows that the model captures the seasonal variation of observations and that the overall long-term simulation of nitrate contamination is satisfactory at the regional scale.

  13. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  14. Simple deterministic models and applications. Comment on "Coupled disease-behavior dynamics on complex networks: A review" by Z. Wang et al.

    NASA Astrophysics Data System (ADS)

    Yang, Hyun Mo

    2015-12-01

    Currently, discrete models are widely accepted thanks to access to computers with huge storage capacity and high-performance processors, together with the easy implementation of algorithms, which allows increasingly sophisticated models to be developed and simulated. Wang et al. [7] present a review of dynamics in complex networks, focusing on the interaction between disease dynamics and human behavioral and social dynamics. In an extensive review of human behavior in response to disease dynamics, the authors briefly describe the complex dynamics found in the literature: well-mixed population networks, where spatial structure can be neglected, and other networks considering heterogeneity in spatially distributed populations. As controlling mechanisms are implemented, such as social distancing due to 'social contagion', quarantine, non-pharmaceutical interventions and vaccination, adaptive behavior can occur in the human population, which can be easily taken into account in the dynamics formulated by networked populations.
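
    As a toy illustration of the kind of networked disease dynamics the review discusses (not a model from the paper itself), the sketch below runs a discrete-time SIR process on a random contact network and crudely mimics adaptive behavior by lowering the transmission probability once prevalence exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sir(n=500, p_edge=0.02, beta=0.06, gamma=0.1,
                 awareness_threshold=0.05, awareness_factor=0.5, steps=200):
    adj = rng.random((n, n)) < p_edge            # random (Erdos-Renyi-like) contacts
    adj = np.triu(adj, 1); adj = adj | adj.T     # symmetric, no self-loops
    state = np.zeros(n, dtype=int)               # 0 = S, 1 = I, 2 = R
    state[rng.choice(n, 5, replace=False)] = 1   # seed a few infections
    prevalence = []
    for _ in range(steps):
        infected = state == 1
        prev = infected.mean()
        prevalence.append(prev)
        # Adaptive behavior: reduce transmission when prevalence is high.
        b = beta * (awareness_factor if prev > awareness_threshold else 1.0)
        exposure = adj[:, infected].sum(axis=1)          # infected neighbours per node
        p_inf = 1.0 - (1.0 - b) ** exposure
        new_inf = (state == 0) & (rng.random(n) < p_inf)
        new_rec = infected & (rng.random(n) < gamma)
        state[new_inf] = 1
        state[new_rec] = 2
    return prevalence

print("peak prevalence:", max(simulate_sir()))
```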

  15. Image size invariant visual cryptography for general access structures subject to display quality constraints.

    PubMed

    Lee, Kai-Hui; Chiu, Pei-Ling

    2013-10-01

    Conventional visual cryptography (VC) suffers from a pixel-expansion problem, or an uncontrollable display quality problem for recovered images, and lacks a general approach to construct visual secret sharing schemes for general access structures. We propose a general and systematic approach to address these issues without sophisticated codebook design. This approach can be used for binary secret images in non-computer-aided decryption environments. To avoid pixel expansion, we design a set of column vectors to encrypt secret pixels rather than using the conventional VC-based approach. We begin by formulating a mathematical model for the VC construction problem to find the column vectors for the optimal VC construction, after which we develop a simulated-annealing-based algorithm to solve the problem. The experimental results show that the display quality of the recovered image is superior to that reported in previous work.
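
    The construction relies on a simulated-annealing search; a generic SA skeleton of the kind that could drive such a search is sketched below (the toy cost function and bit-flip neighbourhood are stand-ins, not the authors' VC objective).

```python
import math
import random

random.seed(0)

def anneal(initial, cost, neighbour, t0=1.0, cooling=0.995, steps=20000):
    """Generic simulated annealing: minimise cost() over candidate solutions."""
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Accept improvements always, and worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling
    return best, best_cost

# Stand-in problem: pick a 0/1 column vector whose weight is as close to 4 as possible.
def toy_cost(v):
    return abs(sum(v) - 4)

def toy_neighbour(v):
    w = list(v)
    w[random.randrange(len(w))] ^= 1    # flip one bit
    return w

print(anneal([0] * 16, toy_cost, toy_neighbour))
```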

  16. Workshop on Models for Plasma Spectroscopy

    NASA Astrophysics Data System (ADS)

    1993-09-01

    A meeting was held at St. Johns College, Oxford from Monday 27th to Thursday 30th of September 1993 to bring together a group of physicists working on computational modelling of plasma spectroscopy. The group came from the UK, France, Israel and the USA. The meeting was organized by myself, Dr. Steven Rose of RAL and Dr. R.W. Lee of LLNL. It was funded by the U.S. European Office of Aerospace Research and Development and by LLNL. The meeting grew out of a wish by a group of core participants to make available to practicing plasma physicists (particularly those engaged in the design and analysis of experiments) sophisticated numerical models of plasma physics. Additional plasma physicists attended the meeting in Oxford by invitation. These were experimentalists and users of plasma physics simulation codes whose input to the meeting was to advise the core group as to what was really needed.

  17. Hot Corrosion Test Facility at the NASA Lewis Special Projects Laboratory

    NASA Technical Reports Server (NTRS)

    Robinson, Raymond C.; Cuy, Michael D.

    1994-01-01

    The Hot Corrosion Test Facility (HCTF) at the NASA Lewis Special Projects Laboratory (SPL) is a high-velocity, pressurized burner rig currently used to evaluate the environmental durability of advanced ceramic materials such as SiC and Si3N4. The HCTF uses laboratory service air which is preheated, mixed with jet fuel, and ignited to simulate the conditions of a gas turbine engine. Air, fuel, and water systems are computer-controlled to maintain test conditions which include maximum air flows of 250 kg/hr (550 lbm/hr), pressures of 100-600 kPa (1-6 atm), and gas temperatures exceeding 1500 C (2732 F). The HCTF provides a relatively inexpensive, yet sophisticated means for researchers to study the high-temperature oxidation of advanced materials, and the injection of a salt solution provides the added capability of conducting hot corrosion studies.

  18. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  19. A Physiological Signal Transmission Model to be Used for Specific Diagnosis of Cochlear Impairments

    NASA Astrophysics Data System (ADS)

    Saremi, Amin; Stenfelt, Stefan

    2011-11-01

    Many of the sophisticated characteristics of the human auditory system are attributed to the cochlea. Also, most patients with a hearing loss suffer from impairments that originate in the cochlea (sensorineural loss). Despite this, today's clinical diagnosis methods do not probe the specific origins of such cochlear lesions. The aim of this research is to introduce a physiological signal transmission model to be clinically used as a tool for diagnosis of cochlear losses. This model enables simulation of the different bio-mechano-electrical processes which occur in the auditory organ of Corti inside the cochlea. What makes this model different from many available computational models is its fidelity to physiology, since the ultimate goal is to model each single physiological phenomenon. This includes passive BM vibration, outer hair cell functions such as nonlinear mechanoelectrical transduction (MET) and active amplification by the somatic motor, as well as vibration-to-neural conversion at the inner hair cells.

  20. A Cartesian grid approach with hierarchical refinement for compressible flows

    NASA Technical Reports Server (NTRS)

    Quirk, James J.

    1994-01-01

    Many numerical studies of flows that involve complex geometries are limited by the difficulties in generating suitable grids. We present a Cartesian boundary scheme for two-dimensional, compressible flows that is unfettered by the need to generate a computational grid and so it may be used, routinely, even for the most awkward of geometries. In essence, an arbitrary-shaped body is allowed to blank out some region of a background Cartesian mesh and the resultant cut-cells are singled out for special treatment. This is done within a finite-volume framework and so, in principle, any explicit flux-based integration scheme can take advantage of this method for enforcing solid boundary conditions. For best effect, the present Cartesian boundary scheme has been combined with a sophisticated, local mesh refinement scheme, and a number of examples are shown in order to demonstrate the efficacy of the combined algorithm for simulations of shock interaction phenomena.
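
    A minimal sketch of the cell-classification step in such a Cartesian boundary scheme is shown below; it uses a circular body described by a signed-distance function, which is an assumption for illustration, not the paper's geometry handling.

```python
import numpy as np

def classify_cells(nx=16, ny=16, xmax=1.0, ymax=1.0,
                   centre=(0.5, 0.5), radius=0.3):
    """Label each Cartesian cell as 'fluid', 'solid' or 'cut' from corner signs."""
    x = np.linspace(0.0, xmax, nx + 1)
    y = np.linspace(0.0, ymax, ny + 1)
    # Signed distance to the embedded circular body at every cell corner.
    phi = np.hypot(x[:, None] - centre[0], y[None, :] - centre[1]) - radius
    labels = np.empty((nx, ny), dtype=object)
    for i in range(nx):
        for j in range(ny):
            corners = phi[i:i + 2, j:j + 2]
            if (corners > 0).all():
                labels[i, j] = "fluid"      # entirely outside the body
            elif (corners < 0).all():
                labels[i, j] = "solid"      # blanked out by the body
            else:
                labels[i, j] = "cut"        # needs special flux treatment
    return labels

labels = classify_cells()
print("cut cells:", int((labels == "cut").sum()))
```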

  1. Pendulums, Pedagogy, and Matter: Lessons from the Editing of Newton's Principia

    NASA Astrophysics Data System (ADS)

    Biener, Zvi; Smeenk, Chris

    Teaching Newtonian physics involves the replacement of students' ideas about physical situations with precise concepts appropriate for mathematical applications. This paper focuses on the concepts of 'matter' and 'mass'. We suggest that students, like some pre-Newtonian scientists we examine, use these terms in a way that conflicts with their Newtonian meaning. Specifically, 'matter' and 'mass' indicate to them the sorts of things that are tangible, bulky, and take up space. In Newtonian mechanics, however, the terms are defined by Newton's Second Law: 'mass' is simply a measure of the acceleration generated by an impressed force. We examine the relationship between these conceptions as it was discussed by Newton and his editor, Roger Cotes, when analyzing a series of pendulum experiments. We suggest that these experiments, as well as more sophisticated computer simulations, can be used in the classroom to sufficiently differentiate the colloquial and precise meaning of these terms.

  2. JSPAM: A restricted three-body code for simulating interacting galaxies

    NASA Astrophysics Data System (ADS)

    Wallin, J. F.; Holincheck, A. J.; Harvey, A.

    2016-07-01

    Restricted three-body codes have a proven ability to recreate much of the disturbed morphology of actual interacting galaxies. As more sophisticated n-body models were developed and computer speed increased, restricted three-body codes fell out of favor. However, their supporting role for performing wide searches of parameter space when fitting orbits to real systems demonstrates a continuing need for their use. Here we present the model and algorithm used in the JSPAM code. A precursor of this code was originally described in 1990, and was called SPAM. We have recently updated the software with an alternate potential and a treatment of dynamical friction to more closely mimic the results from n-body tree codes. The code is released publicly for use under the terms of the Academic Free License ("AFL") v. 3.0 and has been added to the Astrophysics Source Code Library.
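
    The essence of a restricted three-body code of this kind is that test particles feel the gravity of the two galaxy centres but not of each other. A minimal leapfrog sketch under that assumption is given below (Keplerian point-mass potentials only, with the primaries held fixed; it includes none of JSPAM's modified potentials or dynamical friction).

```python
import numpy as np

G = 1.0

def accel(r, primaries):
    """Acceleration on a massless test particle from the two point masses."""
    a = np.zeros(2)
    for m, rp in primaries:
        d = rp - r
        a += G * m * d / np.linalg.norm(d) ** 3
    return a

def step(r, v, primaries, dt=1e-3):
    """One kick-drift-kick leapfrog step for a test particle."""
    v = v + 0.5 * dt * accel(r, primaries)
    r = r + dt * v
    v = v + 0.5 * dt * accel(r, primaries)
    return r, v

# Example: particle on a circular orbit around primary 1, perturbed by primary 2.
primaries = [(1.0, np.array([0.0, 0.0])), (0.3, np.array([5.0, 0.0]))]
r, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
for _ in range(1000):
    r, v = step(r, v, primaries)
print("final position:", r)
```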

  3. Development of a CFD code for casting simulation

    NASA Technical Reports Server (NTRS)

    Murph, Jesse E.

    1992-01-01

    The task of developing a computational fluid dynamics (CFD) code to accurately model the mold filling phase of a casting operation was accomplished in a systematic manner. First the state-of-the-art was determined through a literature search, a code search, and participation with casting industry personnel involved in consortium startups. From this material and inputs from industry personnel, an evaluation of the currently available codes was made. It was determined that a few of the codes already contained sophisticated CFD algorithms and further validation of one of these codes could preclude the development of a new CFD code for this purpose. With industry concurrence, ProCAST was chosen for further evaluation. Two benchmark cases were used to evaluate the code's performance using a Silicon Graphics Personal Iris system. The results of these limited evaluations (because of machine and time constraints) are presented along with discussions of possible improvements and recommendations for further evaluation.

  4. Debating complexity in modeling

    USGS Publications Warehouse

    Hunt, Randall J.; Zheng, Chunmiao

    1999-01-01

    As scientists trying to understand the natural world, how should our effort be apportioned? We know that the natural world is characterized by complex and interrelated processes. Yet do we need to explicitly incorporate these intricacies to perform the tasks we are charged with? In this era of expanding computer power and development of sophisticated preprocessors and postprocessors, are bigger machines making better models? Put another way, do we understand the natural world better now with all these advancements in our simulation ability? Today the public's patience for long-term projects producing indeterminate results is wearing thin. This increases pressure on the investigator to use the appropriate technology efficiently. On the other hand, bringing scientific results into the legal arena opens up a new dimension to the issue: to the layperson, a tool that includes more of the complexity known to exist in the real world is expected to provide the more scientifically valid answer.

  5. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  6. Multi-frequency tapping-mode atomic force microscopy beyond three eigenmodes in ambient air

    PubMed Central

    An, Sangmin; Long, Christian J

    2014-01-01

    Summary: We present an exploratory study of multimodal tapping-mode atomic force microscopy driving more than three cantilever eigenmodes. We present tetramodal (4-eigenmode) imaging experiments conducted on a thin polytetrafluoroethylene (PTFE) film and computational simulations of pentamodal (5-eigenmode) cantilever dynamics and spectroscopy, focusing on the case of large amplitude ratios between the fundamental eigenmode and the higher eigenmodes. We discuss the dynamic complexities of the tip response in time and frequency space, as well as the average amplitude and phase response. We also illustrate typical images and spectroscopy curves and provide a very brief description of the observed contrast. Overall, our findings are promising in that they help to open the door to increasing sophistication and greater versatility in multi-frequency AFM through the incorporation of a larger number of driven eigenmodes, and in highlighting specific future research opportunities. PMID:25383276

  7. Passive Motion Paradigm: An Alternative to Optimal Control

    PubMed Central

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the “degrees of freedom (DoFs) problem,” the common core of production, observation, reasoning, and learning of “actions.” OCT, directly derived from engineering design techniques for control systems, quantifies task goals as “cost functions” and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative, “softer” approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that “animates” the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints “at runtime,” hence solving the “DoFs problem” without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not restricted to shaping motor output during action execution but also provides the self with information on the feasibility, consequence, understanding and meaning of “potential actions.” In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures. PMID:22207846
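
    The core of the PMP idea, animating a kinematic body schema with a goal-induced force field and mapping it to joint space through the Jacobian transpose rather than through explicit kinematic inversion, can be sketched for a planar 2-link arm. Gains, link lengths, and the simple admittance-style update below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

L1, L2 = 0.3, 0.25          # link lengths (m), illustrative

def forward(q):
    """End-effector position of a planar 2-link arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def pmp_reach(q, goal, k_field=10.0, admittance=1.0, dt=0.01, steps=2000):
    """Relax the body schema in the attractor field induced by the goal."""
    for _ in range(steps):
        force = k_field * (goal - forward(q))    # goal-induced force field
        torque = jacobian(q).T @ force           # no kinematic inversion needed
        q = q + dt * admittance * torque         # passive, admittance-like update
    return q

q = pmp_reach(np.array([0.3, 0.5]), goal=np.array([0.35, 0.30]))
print("final end-effector:", forward(q), "joint angles:", q)
```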

  8. Computational Models Used to Assess US Tobacco Control Policies.

    PubMed

    Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C

    2017-11-01

    Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and maximize overall population-level health benefits by considering the real-world context in which tobacco control interventions are implemented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Integrative multicellular biological modeling: a case study of 3D epidermal development using GPU algorithms

    PubMed Central

    2010-01-01

    Background: Simulation of sophisticated biological models requires considerable computational power. These models typically integrate together numerous biological phenomena such as spatially-explicit heterogeneous cells, cell-cell interactions, cell-environment interactions and intracellular gene networks. The recent advent of programming for graphical processing units (GPU) opens up the possibility of developing more integrative, detailed and predictive biological models while at the same time decreasing the computational cost to simulate those models. Results: We construct a 3D model of epidermal development and provide a set of GPU algorithms that executes significantly faster than sequential central processing unit (CPU) code. We provide a parallel implementation of the subcellular element method for individual cells residing in a lattice-free spatial environment. Each cell in our epidermal model includes an internal gene network, which integrates cellular interaction of Notch signaling together with environmental interaction of basement membrane adhesion, to specify cellular state and behaviors such as growth and division. We take a pedagogical approach to describing how modeling methods are efficiently implemented on the GPU including memory layout of data structures and functional decomposition. We discuss various programmatic issues and provide a set of design guidelines for GPU programming that are instructive to avoid common pitfalls as well as to extract performance from the GPU architecture. Conclusions: We demonstrate that GPU algorithms represent a significant technological advance for the simulation of complex biological models. We further demonstrate with our epidermal model that the integration of multiple complex modeling methods for heterogeneous multicellular biological processes is both feasible and computationally tractable using this new technology. We hope that the provided algorithms and source code will be a starting point for modelers to develop their own GPU implementations, and encourage others to implement their modeling methods on the GPU and to make that code available to the wider community. PMID:20696053

  10. Decadal predictions of Southern Ocean sea ice : testing different initialization methods with an Earth-system Model of Intermediate Complexity

    NASA Astrophysics Data System (ADS)

    Zunz, Violette; Goosse, Hugues; Dubinkina, Svetlana

    2013-04-01

    The sea ice extent in the Southern Ocean has increased since 1979 but the causes of this expansion have not been firmly identified. In particular, the contribution of internal variability and external forcing to this positive trend has not been fully established. In this region, the lack of observations and the overestimation of internal variability of the sea ice by contemporary General Circulation Models (GCMs) make it difficult to understand the behaviour of the sea ice. Nevertheless, if its evolution is governed by the internal variability of the system and if this internal variability is in some way predictable, a suitable initialization method should lead to simulation results that better fit reality. Current GCM decadal predictions are generally initialized through a nudging towards some observed fields. This relatively simple method does not seem to be appropriate for the initialization of sea ice in the Southern Ocean. The present study aims at identifying an initialization method that could improve the quality of the predictions of Southern Ocean sea ice at decadal timescales. We use LOVECLIM, an Earth-system Model of Intermediate Complexity that allows us to perform, within a reasonable computational time, the large amount of simulations required to test systematically different initialization procedures. These involve three data assimilation methods: a nudging, a particle filter and an efficient particle filter. In a first step, simulations are performed in an idealized framework, i.e. data from a reference simulation of LOVECLIM are used instead of observations, hereinafter called pseudo-observations. In this configuration, the internal variability of the model obviously agrees with the one of the pseudo-observations. This allows us to get rid of the issues related to the overestimation of the internal variability by models compared to the observed one. This way, we can work out a suitable methodology to assess the efficiency of the initialization procedures tested. It also allows us to determine the upper limit of improvement that can be expected if more sophisticated initialization methods are used in decadal prediction simulations and if models have an internal variability agreeing with the observed one. Furthermore, since pseudo-observations are available everywhere at any time step, we also analyse the differences between simulations initialized with a complete dataset of pseudo-observations and the ones for which pseudo-observation data are not assimilated everywhere. In a second step, simulations are performed in a realistic framework, i.e. through the use of actual available observations. The same data assimilation methods are tested in order to check if more sophisticated methods can improve the reliability and the accuracy of decadal prediction simulations, even if they are performed with models that overestimate the internal variability of the sea ice extent in the Southern Ocean.
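
    As a schematic illustration of the kind of particle-filter initialization compared in the study (a generic bootstrap filter step, not LOVECLIM's implementation), each ensemble member is weighted by its fit to the (pseudo-)observation and the ensemble is then resampled:

```python
import numpy as np

rng = np.random.default_rng(2)

def particle_filter_step(ensemble, observation, obs_error_sd):
    """Weight ensemble members by likelihood of the observation, then resample."""
    innovations = observation - ensemble                 # member-observation misfits
    log_w = -0.5 * (innovations / obs_error_sd) ** 2     # Gaussian log-likelihoods
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()
    idx = rng.choice(len(ensemble), size=len(ensemble), p=weights)
    return ensemble[idx]                                 # equal-weight resampled ensemble

# Toy example: 50 members of simulated sea-ice extent, one observed value.
ensemble = rng.normal(loc=11.0, scale=1.5, size=50)      # 10^6 km^2, illustrative
obs = 12.1
print("ensemble mean before/after:",
      ensemble.mean(), particle_filter_step(ensemble, obs, obs_error_sd=0.5).mean())
```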

  11. Numerical simulation of gender differences in a long-term microgravity exposure

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse and simulate gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment that might jeopardize a long-term mission is also evaluated. Computer simulations are becoming a promising line of research, as physiological models become more and more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow for better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore, the predictions of computer simulations are of major importance in this field. Our approach is based on a previous model developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows the introduction of perturbations such as physical exercise or countermeasures. The implementation is based on a complex electrical-like model of this control system, using inexpensive software development frameworks, and has been tested and validated with the available experimental data. Gender differences have been implemented for this specific work as an adjustment of a number of parameters that are included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying from Earth-based to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of physical aerobic exercise and also thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Initial results are compatible with the existing data and provide unique information regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions. More experimental work is needed to adjust some parameters of the model. This work may be seen as another contribution to a better understanding of the underlying processes involved in the adaptation of both women and men to long-term microgravity.

  12. On the role of numerical simulations in studies of reduced gravity-induced physiological effects in humans. Results from NELME.

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    Computer simulations are becoming a promising line of research, as physiological models become more and more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow for better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore, the predictions of computer simulations are of major importance in this field. Our approach is based on a previous model developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows the introduction of perturbations such as physical exercise or countermeasures. The implementation is based on a complex electrical-like model of this control system, using inexpensive development frameworks, and has been tested and validated with the available experimental data. The objective of this work is to analyse and simulate long-term effects and gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment that might jeopardize a long-term mission is also evaluated. Gender differences have been implemented for this specific work as an adjustment of a number of parameters that are included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying continuously from Earth-based to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of physical aerobic exercise and thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Different scenarios, such as a long-term mission to the Moon or Mars, are evaluated, including countermeasures such as aerobic exercise. Initial results are compatible with the existing data and provide useful insights regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions.

  13. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    The technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or... As user interfaces have become more sophisticated, they have become harder to develop.

  14. Use of Microcomputers and Personal Computers in Pacing

    PubMed Central

    Sasmor, L.; Tarjan, P.; Mumford, V.; Smith, E.

    1983-01-01

    This paper describes the evolution from the early discrete-circuit pacemakers of the past to the sophisticated microprocessor-based pacemakers of today. The necessary computerized supporting instrumentation is also described. Technological and economic reasons for this evolution are discussed.

  15. Toward a New Voice

    ERIC Educational Resources Information Center

    Murphy, Patti

    2007-01-01

    Frequently linked to sophisticated speech communication devices resembling laptop computers, augmentative and alternative communication (AAC) encompasses a spectrum of tools and strategies ranging from pointing, writing, gestures, and facial expressions to sign language, manual alphabet boards, picture symbols, and photographs used to convey…

  16. V/STOL AND digital avionics system for UH-1H

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1978-01-01

    A hardware and software system for the Bell UH-1H helicopter was developed that provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal area navigation, guidance and control research. Two Sperry 1819B general purpose digital computers were used. One contains the development software that performs all the specified system flight computations. The second computer is available to NASA for experimental programs that run simultaneously with the other computer programs and which may, at the push of a button, replace selected computer computations. Other features that provide research flexibility include keyboard selectable gains and parameters and software generated alphanumeric and CRT displays.

  17. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  18. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  19. Practical aspects of modeling aircraft dynamics from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1984-01-01

    The purpose of parameter estimation, a subset of system identification, is to estimate the coefficients (such as stability and control derivatives) of the aircraft differential equations of motion from sampled measured dynamic responses. In the past, the primary reason for estimating stability and control derivatives from flight tests was to make comparisons with wind tunnel estimates. As aircraft became more complex, and as flight envelopes were expanded to include flight regimes that were not well understood, new requirements for the derivative estimates evolved. For many years, the flight determined derivatives were used in simulations to aid in flight planning and in pilot training. The simulations were particularly important in research flight test programs in which an envelope expansion into new flight regimes was required. Parameter estimation techniques for estimating stability and control derivatives from flight data became more sophisticated to support the flight test programs. As knowledge of these new flight regimes increased, more complex aircraft were flown. Much of this increased complexity was in sophisticated flight control systems. The design and refinement of the control system required higher fidelity simulations than were previously required.
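
    A much-simplified flavor of this kind of parameter estimation, regressing measured dynamic responses onto the terms of an assumed linear model to recover stability-and-control-type derivatives, is sketched below. This is an equation-error least-squares sketch with a made-up short-period-like model, not the output-error methods typically used on real flight data.

```python
import numpy as np

rng = np.random.default_rng(3)

# "True" short-period-like model:  q_dot = M_q * q + M_alpha * alpha + M_de * de
M_q, M_alpha, M_de = -1.2, -4.0, -6.5
dt, n = 0.02, 1500

t = np.arange(n) * dt
de = 0.05 * np.sign(np.sin(0.8 * t))            # doublet-like control input
q = np.zeros(n); alpha = np.zeros(n)
for k in range(n - 1):                          # crude forward-Euler "flight data"
    q_dot = M_q * q[k] + M_alpha * alpha[k] + M_de * de[k]
    q[k + 1] = q[k] + dt * q_dot
    alpha[k + 1] = alpha[k] + dt * q[k]         # toy kinematic coupling

q_meas = q + 0.002 * rng.standard_normal(n)     # add measurement noise

# Equation-error estimation: finite-difference q_dot, then linear least squares.
q_dot_meas = np.gradient(q_meas, dt)
X = np.column_stack([q_meas, alpha, de])
coef, *_ = np.linalg.lstsq(X, q_dot_meas, rcond=None)
print("estimated (M_q, M_alpha, M_de):", coef)
```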

  20. Examination of mitral regurgitation with a goat heart model for the development of intelligent artificial papillary muscle.

    PubMed

    Shiraishi, Y; Yambe, T; Yoshizawa, M; Hashimoto, H; Yamada, A; Miura, H; Hashem, M; Kitano, T; Shiga, T; Homma, D

    2012-01-01

    Annuloplasty for functional mitral or tricuspid regurgitation has been used for surgical restoration of valvular diseases. However, these major techniques may sometimes be ineffective because of chamber dilation and valve tethering. We have been developing a sophisticated intelligent artificial papillary muscle (PM) using an anisotropic shape memory alloy fiber for an alternative surgical reconstruction of the continuity of the mitral structural apparatus and the left ventricular myocardium. This study quantitatively examined mitral regurgitation with respect to the reduction in PM tension, using an originally developed ventricular simulator with isolated goat hearts, in support of the sophisticated artificial PM. Aortic and mitral valves with left ventricular free wall portions of isolated goat hearts (n=9) were secured on an elastic plastic membrane and statically pressurized, which led to valvular leaflet-papillary muscle positional change and central mitral regurgitation. PMs were connected to a load cell, and the relationship between regurgitation and PM tension was measured. Then we connected the left ventricular specimen model to our hydraulic ventricular simulator and achieved hemodynamic simulation with controlled PM tension.

  1. Utilization of the k-space Computational Method to Design an Intracavitary Transrectal Ultrasound Phased Array Applicator for Hyperthermia Treatment of Prostate Cancer

    NASA Astrophysics Data System (ADS)

    Al-Bataineh, Osama M.; Collins, Christopher M.; Sparrow, Victor W.; Keolian, Robert M.; Smith, Nadine Barrie

    2006-05-01

    This research utilizes the k-space computational method to design an intracavitary probe for hyperthermia treatment of prostate cancer. A three-dimensional (3D) photographical prostate model, utilizing imaging data from the Visible Human Project®, was the basis for inhomogeneous acoustical model development. The acoustical model accounted for sound speed, density, and absorption variations. The k-space computational method was used to simulate ultrasound wave propagation of the designed phased array through the acoustical model. To ensure the uniformity and spread of the pressure in the length of the array, and the steering and focusing capability in the width of the array, the equal-sized elements of the phased array were 1 × 14 mm. The anatomical measurements of the prostate were used to predict the final phased array specifications (4 × 20 planar array, 1.2 MHz, element size = 1 × 14 mm, array size = 56 × 20 mm). Good agreement between the exposimetry and the k-space results was achieved. As an example, the -3 dB distances of the focal volume differed by 9.1% in the propagation direction between the k-space prostate simulation and the exposimetry results. Temperature simulations indicated that the rectal wall temperature was elevated less than 2°C during hyperthermia treatment. The steering and focusing ability of the designed probe, in both azimuth and propagation directions, was found to span the entire prostate volume with minimal grating lobes (-10 dB reduction from the main lobe) and minimal heat damage to the rectal wall. Evaluations of the probe included ex vivo and in vivo controlled experiments to deliver the required thermal dose to the targeted tissue. With a desired temperature plateau of 43.0°C, the MRI temperature results at the steady state were 42.9 ± 0.38°C and 43.1 ± 0.80°C for ex vivo and in vivo experiments, respectively. Unlike conventional computational methods, the k-space method provides a powerful tool to predict pressure wavefield and temperature rise in sophisticated, large-scale, 3D, inhomogeneous and coarse-grid models.
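
    Independent of the k-space solver itself, the basic geometric focusing rule for such a phased array, delaying each element so all wavefronts arrive at the focus simultaneously, can be sketched as follows (element pitch, focal depth, and sound speed below are illustrative, not the probe's final specification):

```python
import numpy as np

def focusing_delays(n_elements=20, pitch_mm=1.0, focus_mm=(0.0, 40.0),
                    c_mm_per_us=1.5):
    """Per-element time delays (microseconds) to focus a linear phased array."""
    # Element centres along the array axis, centred on zero.
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch_mm
    # Path length from each element to the focal point.
    d = np.hypot(focus_mm[0] - x, focus_mm[1])
    # Delay the closer elements so every contribution arrives together.
    return (d.max() - d) / c_mm_per_us

print("delays (us):", np.round(focusing_delays(), 3))
```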

  2. Towards high-resolution mantle convection simulations

    NASA Astrophysics Data System (ADS)

    Höink, T.; Richards, M. A.; Lenardic, A.

    2009-12-01

    The motion of tectonic plates at the Earth’s surface, earthquakes, most forms of volcanism, the growth and evolution of continents, and the volatile fluxes that govern the composition and evolution of the oceans and atmosphere are all controlled by the process of solid-state thermal convection in the Earth’s rocky mantle, with perhaps a minor contribution from convection in the iron core. Similar processes govern the evolution of other planetary objects such as Mars, Venus, Titan, and Europa, all of which might conceivably shed light on the origin and evolution of life on Earth. Modeling and understanding this complicated dynamical system is one of the true “grand challenges” of Earth and planetary science. In the past three decades much progress towards understanding the dynamics of mantle convection has been made, with the increasing aid of computational modeling. Numerical sophistication has evolved significantly, and a small number of independent codes have been successfully employed. Computational power continues to increase dramatically, and with it the ability to resolve increasingly finer fluid mechanical structures. Yet perhaps the most often cited limitation in publications based on numerical modeling is still computing power, because resolving thermal boundary layers within the convecting mantle (e.g., lithospheric plates) requires a spatial resolution of ~10 km. At present, the largest supercomputing facilities still barely approach the power to resolve this length scale in mantle convection simulations that include the physics necessary to model plate-like behavior. Our goal is to use supercomputing facilities to perform 3D spherical mantle convection simulations that include the ingredients for plate-like behavior, i.e. strongly temperature- and stress-dependent viscosity, at Earth-like convective vigor with a global resolution of order 10 km. In order to qualify to use such facilities, it is also necessary to demonstrate good parallel efficiency. Here we will present two kinds of results: (1) scaling properties of the community code CitcomS on DOE/NERSC's supercomputer Franklin for up to ~ 6000 processors, and (2) preliminary simulations that illustrate the role of a low-viscosity asthenosphere in plate-like behavior in mantle convection.

  3. Initial inclusion of thermodynamic considerations in Kayenta.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brannon, Rebecca Moss; Bishop, Joseph E.; Fuller, Timothy J.

    A persistent challenge in simulating damage of natural geological materials, as well as rock-like engineered materials, is the development of efficient and accurate constitutive models. The common feature of these brittle and quasi-brittle materials is the presence of flaws such as porosity and networks of microcracks. The desired models need to be able to predict the material responses over a wide range of porosities and strain rates. Kayenta (formerly called the Sandia GeoModel) is a unified general-purpose constitutive model that strikes a balance between first-principles micromechanics and phenomenological or semi-empirical modeling strategies. However, despite its sophistication and ability to reduce to several classical plasticity theories, Kayenta is incapable of modeling deformation of ductile materials in which deformation is dominated by dislocation generation and movement, which can lead to significant heating. This stems from Kayenta's roots as a geological model, where heating due to inelastic deformation is often neglected or presumed to be incorporated implicitly through the elastic moduli. The sophistication of Kayenta and its large set of extensive features, however, make Kayenta an attractive candidate model to which thermal effects can be added. This report outlines the initial work in doing just that, extending the capabilities of Kayenta to include deformation of ductile materials, for which thermal effects cannot be neglected. Thermal effects are included based on an assumption of adiabatic loading by computing the bulk and thermal responses of the material with the Kerley Mie-Grueneisen equation of state and adjusting the yield surface according to the updated thermal state. This new version of Kayenta, referred to as Thermo-Kayenta throughout this report, is capable of reducing to classical Johnson-Cook plasticity in special case single element simulations and has been used to obtain reasonable results in more complicated Taylor impact simulations in LS-Dyna. Despite these successes, however, Thermo-Kayenta requires additional refinement for it to be consistent in the thermodynamic sense and for it to be considered superior to other, more mature thermoplastic models. The initial thermal development, results, and required refinements are all detailed in the following report.
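
    The adiabatic-heating assumption described above can be illustrated with a very reduced sketch: a fraction of the plastic work increment is converted to heat, and the temperature (and, in the full model, the equation of state and yield surface) is updated accordingly. The material values and Taylor-Quinney factor below are generic placeholders, not Thermo-Kayenta parameters.

```python
def adiabatic_temperature_rise(flow_stress_pa, plastic_strain_increment,
                               density=7850.0, specific_heat=450.0,
                               taylor_quinney=0.9):
    """Temperature increment from plastic work under adiabatic conditions (K)."""
    plastic_work = flow_stress_pa * plastic_strain_increment   # J per m^3
    return taylor_quinney * plastic_work / (density * specific_heat)

# Example: steel-like values, 10% plastic strain at 800 MPa flow stress.
temperature = 293.0
for _ in range(10):                                # ten 1% strain increments
    temperature += adiabatic_temperature_rise(800e6, 0.01)
print("temperature after 10%% plastic strain: %.1f K" % temperature)
```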

  4. Real-time Java simulations of multiple interference dielectric filters

    NASA Astrophysics Data System (ADS)

    Kireev, Alexandre N.; Martin, Olivier J. F.

    2008-12-01

    An interactive Java applet for real-time simulation and visualization of the transmittance properties of multiple interference dielectric filters is presented. The most commonly used interference filters as well as the state-of-the-art ones are embedded in this platform-independent applet which can serve research and education purposes. The Transmittance applet can be freely downloaded from the site http://cpc.cs.qub.ac.uk. Program summary. Program title: Transmittance. Catalogue identifier: AEBQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBQ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5778. No. of bytes in distributed program, including test data, etc.: 90 474. Distribution format: tar.gz. Programming language: Java. Computer: Developed on PC-Pentium platform. Operating system: Any Java-enabled OS; applet was tested on Windows ME, XP, Sun Solaris, Mac OS. RAM: Variable. Classification: 18. Nature of problem: Sophisticated wavelength-selective multiple interference filters can include some tens or even hundreds of dielectric layers. The spectral response of such a stack is not obvious. On the other hand, there is a strong demand from application designers and students to get a quick insight into the properties of a given filter. Solution method: A Java applet was developed for the computation and the visualization of the transmittance of multilayer interference filters. It is simple to use and the embedded filter library can serve educational purposes. Also, its ability to handle complex structures will be appreciated as a useful research and development tool. Running time: Real-time simulations.
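
    The physics behind such an applet, the normal-incidence transmittance of a dielectric multilayer computed via the characteristic (transfer) matrix method, can be sketched in a few lines. The quarter-wave high/low stack below is an illustrative filter of our own choosing; this is not the applet's code.

```python
import numpy as np

def transmittance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=1.52):
    """Normal-incidence transmittance of a dielectric stack (characteristic matrices)."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2.0 * np.pi * n * d / wavelength          # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_sub])
    return 4.0 * n_in * n_sub / abs(n_in * B + C) ** 2

# Quarter-wave stack (HL)^5 designed at 550 nm: a simple reflector.
design = 550.0
n_stack = [2.35, 1.46] * 5
d_stack = [design / (4.0 * n) for n in n_stack]
for wl in (450.0, 550.0, 650.0):
    print(wl, "nm  T = %.3f" % transmittance(n_stack, d_stack, wl))
```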

  5. Using genetic information while protecting the privacy of the soul.

    PubMed

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  6. Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts

    DOT National Transportation Integrated Search

    1978-03-01

    The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...

  7. At Home in the Cell.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1999-01-01

    Argues that biologists' understanding of the cell has become richer over the past 30 years. Describes how genetic engineering and sophisticated computer technology have provided an increased knowledge of genes, gene products, components of cells, and the structure and function of proteins. (CCM)

  8. The High-Tech Surge. Focus on Careers.

    ERIC Educational Resources Information Center

    Vo, Chuong-Dai Hong

    1996-01-01

    The computer industry is growing at a phenomenal rate as technology advances and prices fall, stimulating unprecedented demand from business, government, and individuals. Higher levels of education will be the key to securing employment as organizations increasingly rely on sophisticated technology. (Author)

  9. Chandelier: Picturing Potential

    ERIC Educational Resources Information Center

    Tebbs, Trevor J.

    2014-01-01

    The author--artist, scientist, educator, and visual-spatial thinker--describes the genesis of, and provides insight into, an innovative, strength-based, visually dynamic computer-aided communication system called Chandelier©. This system is the powerful combination of a sophisticated, user-friendly software program and an organizational…

  10. Design and function of biomimetic multilayer water purification membranes

    PubMed Central

    Ling, Shengjie; Qin, Zhao; Huang, Wenwen; Cao, Sufeng; Kaplan, David L.; Buehler, Markus J.

    2017-01-01

    Multilayer architectures in water purification membranes enable increased water throughput, high filter efficiency, and high molecular loading capacity. However, the preparation of membranes with well-organized multilayer structures, starting from the nanoscale to maximize filtration efficiency, remains a challenge. We report a complete strategy to fully realize a novel biomaterial-based multilayer nanoporous membrane via the integration of computational simulation and experimental fabrication. Our comparative computational simulations, based on coarse-grained models of protein nanofibrils and mineral plates, reveal that the multilayer structure can only form with weak interactions between nanofibrils and mineral plates. We demonstrate experimentally that silk nanofibril (SNF) and hydroxyapatite (HAP) can be used to fabricate highly ordered multilayer membranes with nanoporous features by combining protein self-assembly and in situ biomineralization. The production is optimized to be a simple and highly repeatable process that does not require sophisticated equipment and is suitable for scaled production of low-cost water purification membranes. These membranes not only show ultrafast water penetration but also exhibit broad utility and high efficiency of removal and even reuse (in some cases) of contaminants, including heavy metal ions, dyes, proteins, and other nanoparticles in water. Our biomimetic design and synthesis of these functional SNF/HAP materials have established a paradigm that could lead to the large-scale, low-cost production of multilayer materials with broad spectrum and efficiency for water purification, with applications in wastewater treatment, biomedicine, food industry, and the life sciences. PMID:28435877

  11. Design and function of biomimetic multilayer water purification membranes.

    PubMed

    Ling, Shengjie; Qin, Zhao; Huang, Wenwen; Cao, Sufeng; Kaplan, David L; Buehler, Markus J

    2017-04-01

    Multilayer architectures in water purification membranes enable increased water throughput, high filter efficiency, and high molecular loading capacity. However, the preparation of membranes with well-organized multilayer structures, starting from the nanoscale to maximize filtration efficiency, remains a challenge. We report a complete strategy to fully realize a novel biomaterial-based multilayer nanoporous membrane via the integration of computational simulation and experimental fabrication. Our comparative computational simulations, based on coarse-grained models of protein nanofibrils and mineral plates, reveal that the multilayer structure can only form with weak interactions between nanofibrils and mineral plates. We demonstrate experimentally that silk nanofibril (SNF) and hydroxyapatite (HAP) can be used to fabricate highly ordered multilayer membranes with nanoporous features by combining protein self-assembly and in situ biomineralization. The production is optimized to be a simple and highly repeatable process that does not require sophisticated equipment and is suitable for scaled production of low-cost water purification membranes. These membranes not only show ultrafast water penetration but also exhibit broad utility and high efficiency of removal and even reuse (in some cases) of contaminants, including heavy metal ions, dyes, proteins, and other nanoparticles in water. Our biomimetic design and synthesis of these functional SNF/HAP materials have established a paradigm that could lead to the large-scale, low-cost production of multilayer materials with broad spectrum and efficiency for water purification, with applications in wastewater treatment, biomedicine, food industry, and the life sciences.

  12. Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)

    NASA Technical Reports Server (NTRS)

    Clark, Robert; Cottter, Paul; Michalopoulos, Constantine

    2013-01-01

    This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (Program "IMPACT2"). The program was developed after the Columbia accident in order to assess quickly WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from one sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program with prediction of projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution. A typical model of foam, or other projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of hours needed by the commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known.
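
    The flavor of coupling a target vibration mode to a projectile through a contact spring can be sketched as below. This is a drastically reduced stand-in for IMPACT2's multi-mode model, with a single mode and made-up parameter values.

```python
def impact_response(m_proj=0.5, v0=300.0, k_contact=2.0e6,
                    m_modal=10.0, k_modal=5.0e7, c_modal=200.0,
                    dt=1e-7, steps=200000):
    """Explicit integration of a projectile hitting one target vibration mode."""
    x_p, v_p = 0.0, v0          # projectile position / velocity
    x_t, v_t = 0.0, 0.0         # modal coordinate / velocity of the target
    peak_force = 0.0
    for _ in range(steps):
        pen = x_p - x_t
        f_contact = k_contact * pen if pen > 0.0 else 0.0   # compression only
        peak_force = max(peak_force, f_contact)
        a_p = -f_contact / m_proj
        a_t = (f_contact - c_modal * v_t - k_modal * x_t) / m_modal
        v_p += dt * a_p; x_p += dt * v_p                    # semi-implicit Euler
        v_t += dt * a_t; x_t += dt * v_t
    return peak_force

print("peak contact force (N): %.0f" % impact_response())
```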

  13. The r-Java 2.0 code: nuclear physics

    NASA Astrophysics Data System (ADS)

    Kostka, M.; Koning, N.; Shand, Z.; Ouyed, R.; Jaikumar, P.

    2014-08-01

    Aims: We present r-Java 2.0, a nucleosynthesis code for open use that performs r-process calculations, along with a suite of other analysis tools. Methods: Equipped with a straightforward graphical user interface, r-Java 2.0 is capable of simulating nuclear statistical equilibrium (NSE), calculating r-process abundances for a wide range of input parameters and astrophysical environments, computing the mass fragmentation from neutron-induced fission and studying individual nucleosynthesis processes. Results: In this paper we discuss enhancements to this version of r-Java, especially the ability to solve the full reaction network. The sophisticated fission methodology incorporated in r-Java 2.0, which includes three fission channels (beta-delayed, neutron-induced, and spontaneous fission) along with computation of the mass fragmentation, is compared to the upper limit on mass fission approximation. The effects of including beta-delayed neutron emission on r-process yield are studied. The role of Coulomb interactions in NSE abundances is shown to be significant, supporting previous findings. A comparative analysis was undertaken during the development of r-Java 2.0 whereby we reproduced the results found in the literature from three other r-process codes. This code is capable of simulating the physical environment of the high-entropy wind around a proto-neutron star, the ejecta from a neutron star merger, or the relativistic ejecta from a quark nova. Likewise the users of r-Java 2.0 are given the freedom to define a custom environment. This software provides a platform for comparing proposed r-process sites.
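
    As a toy illustration of the ODE structure that a full reaction network solver handles, the sketch below integrates a three-species beta-decay chain with assumed decay constants. A real r-process network such as the one in r-Java couples thousands of species through captures, photodisintegration, beta decay, and fission; only the coupled-abundance form is shown here.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy decay chain A -> B -> C with hypothetical decay constants (1/s).
    lam_a, lam_b = 2.0, 0.5

    def rhs(t, y):
        ya, yb, yc = y
        return [-lam_a * ya,
                lam_a * ya - lam_b * yb,
                lam_b * yb]

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0])
    ya, yb, yc = sol.y[:, -1]
    print(f"final abundances: A={ya:.3f}, B={yb:.3f}, C={yc:.3f} (sum={ya+yb+yc:.3f})")
    ```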

  14. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
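
    The sketch below is not the authors' three-step procedure; it only illustrates the underlying idea of a ranking "potential" inferred from conflict outcomes. A least-squares fit of potential differences to smoothed empirical log-odds is applied to a toy win matrix, and individuals are ordered by the fitted potential.

    ```python
    import numpy as np

    # Toy pairwise win matrix: wins[i, j] = conflicts i won against j (hypothetical data).
    wins = np.array([
        [0, 8, 6, 7],
        [2, 0, 5, 6],
        [1, 3, 0, 4],
        [1, 2, 3, 0],
    ], dtype=float)

    n = wins.shape[0]
    rows, targets = [], []
    for i in range(n):
        for j in range(i + 1, n):
            total = wins[i, j] + wins[j, i]
            if total == 0:
                continue                                   # no observed conflicts for this dyad
            p_ij = (wins[i, j] + 0.5) / (total + 1.0)      # smoothed win probability
            row = np.zeros(n)
            row[i], row[j] = 1.0, -1.0
            rows.append(row)
            targets.append(np.log(p_ij / (1 - p_ij)))      # empirical log-odds

    A = np.vstack(rows + [np.ones((1, n))])                # last row pins the mean potential to zero
    b = np.array(targets + [0.0])
    potential, *_ = np.linalg.lstsq(A, b, rcond=None)
    order = np.argsort(-potential)
    print("estimated rank order (highest first):", order, "potentials:", np.round(potential, 2))
    ```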

  15. When Machines Think: Radiology's Next Frontier.

    PubMed

    Dreyer, Keith J; Geis, J Raymond

    2017-12-01

    Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. Large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. To show how these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.

  16. Adding a solar-radiance function to the Hošek-Wilkie skylight model.

    PubMed

    Hošek, Lukáš; Wilkie, Alexander

    2013-01-01

    One prerequisite for realistic renderings of outdoor scenes is the proper capturing of the sky's appearance. Currently, an explicit simulation of light scattering in the atmosphere isn't computationally feasible, and won't be in the foreseeable future. Captured luminance patterns have proven their usefulness in practice but can't meet all user needs. To fill this capability gap, computer graphics technology has employed analytical models of sky-dome luminance patterns for more than two decades. For technical reasons, such models deal with only the sky dome's appearance, though, and exclude the solar disc. The widely used model proposed by Arcot Preetham and colleagues employed a separately derived analytical formula for adding a solar emitter of suitable radiant intensity. Although this yields reasonable results, the formula is derived in a manner that doesn't exactly match the conditions in their sky-dome model. But the more sophisticated a skylight model is and the more subtly it can represent different conditions, the more the solar radiance should exactly match the skylight's conditions. Toward that end, researchers propose a solar-radiance function that exactly matches a recently published high-quality analytical skylight model.

  17. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  18. Scalable splitting algorithms for big-data interferometric imaging in the SKA era

    NASA Astrophysics Data System (ADS)

    Onose, Alexandru; Carrillo, Rafael E.; Repetti, Audrey; McEwen, Jason D.; Thiran, Jean-Philippe; Pesquet, Jean-Christophe; Wiaux, Yves

    2016-11-01

    In the context of next-generation radio telescopes, like the Square Kilometre Array (SKA), the efficient processing of large-scale data sets is extremely important. Convex optimization tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimization algorithmic structures able to solve the convex optimization tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularization function, in particular, the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big data, they employ parallel and distributed computations to achieve scalability, in terms of memory and computational requirements. One of them also exploits randomization, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
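
    To make the forward-backward idea concrete, the sketch below runs a basic ISTA-style proximal iteration on an ℓ1-regularized least-squares problem, min_x 0.5*||y - Phi x||^2 + lam*||x||_1. The operator and data are random stand-ins, not radio-interferometric measurements, and the plain (non-distributed) iteration shown here is only the building block that the paper's algorithms parallelize.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m, n, k = 80, 200, 10
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)           # toy measurement operator
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    y = Phi @ x_true + 0.01 * rng.standard_normal(m)

    lam = 0.01
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2                 # 1 / Lipschitz constant of the gradient

    def soft_threshold(v, t):
        """Proximal operator of t*||.||_1 (the 'backward' step)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x = np.zeros(n)
    for _ in range(300):
        grad = Phi.T @ (Phi @ x - y)                         # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)      # backward (proximal) step

    print("relative reconstruction error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```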

  19. Visualization of particle interactions in granular media.

    PubMed

    Meier, Holger A; Schlemmer, Michael; Wagner, Christian; Kerren, Andreas; Hagen, Hans; Kuhl, Ellen; Steinmann, Paul

    2008-01-01

    Interaction between particles in so-called granular media, such as soil and sand, plays an important role in the context of geomechanical phenomena and numerous industrial applications. A two-scale homogenization approach based on a micro and a macro scale level is briefly introduced in this paper. Computation of granular material in such a way gives a deeper insight into the context of discontinuous materials and at the same time reduces the computational costs. However, the description and the understanding of the phenomena in granular materials are not yet satisfactory. A sophisticated problem-specific visualization technique would significantly help to illustrate failure phenomena on the microscopic level. As our main contribution, we present a novel 2D approach for the visualization of simulation data, based on the above outlined homogenization technique. Our visualization tool supports visualization on the micro scale level as well as on the macro scale level. The tool shows both aspects closely arranged in the form of multiple coordinated views to give users the possibility to analyze the particle behavior effectively. A novel type of interactive rose diagrams was developed to represent the dynamic contact networks on the micro scale level in a condensed and efficient way.

  20. Microgravity

    NASA Image and Video Library

    1999-05-26

    Looking for a faster computer? How about an optical computer that processes data streams simultaneously and works with the speed of light? In space, NASA researchers have formed optical thin films. By turning these thin films into very fast optical computer components, scientists could improve computer tasks, such as pattern recognition. Dr. Hossin Abdeldayem, physicist at NASA/Marshall Space Flight Center (MSFC) in Huntsville, AL, is working with lasers as part of an optical system for pattern recognition. These systems can be used for automated fingerprinting, photographic scanning and the development of sophisticated artificial intelligence systems that can learn and evolve. Photo credit: NASA/Marshall Space Flight Center (MSFC)

  1. Implementing multiresolution models and families of models: from entity-level simulation to desktop stochastic models and "repro" models

    NASA Astrophysics Data System (ADS)

    McEver, Jimmie; Davis, Paul K.; Bigelow, James H.

    2000-06-01

    We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated Analytica™ environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.

  2. Prediction of the Lorentz Force Detuning and pressure sensitivity for a Pillbox cavity

    DOE PAGES

    Parise, M.

    2018-05-18

    The Lorentz Force Detuning (LFD) and the pressure sensitivity are two critical concerns during the design of a Superconducting Radio Frequency (SRF) cavity resonator. The mechanical deformation of the bare Niobium cavity walls, due to the electromagnetic fields and fluctuation of the external pressure in the Helium bath, can dynamically and statically detune the frequency of the cavity and can cause beam phase errors. The frequency shift can be compensated by additional RF power, which is required to maintain the accelerating gradient, or by sophisticated tuning mechanisms and control-compensation algorithms. Passive stiffening is one of the simplest and most effective tools that can be used during the early design phase, capable of satisfying the Radio Frequency (RF) requisites. This approach requires several multiphysics simulations as well as a deep mechanical and RF knowledge of the phenomena involved. In this paper, a new numerical model for a pillbox cavity is presented that can predict the frequency shifts caused by the LFD and external pressure. This method greatly reduces the computational effort needed to meet the RF requirements and to keep track of the frequency shifts without using time-consuming multiphysics simulations.
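
    A back-of-the-envelope sketch (not the paper's numerical model) of why small wall deformations detune such a cavity: the TM010 resonant frequency of an ideal pillbox depends only on its radius, f = c * j01 / (2*pi*R), so a small radial deformation dR shifts the frequency by roughly df/f = -dR/R. The radius and deformation below are hypothetical.

    ```python
    import math

    c = 299_792_458.0          # speed of light, m/s
    j01 = 2.404825557695773    # first zero of the Bessel function J0

    def pillbox_tm010_freq(radius_m: float) -> float:
        """TM010 resonant frequency of an ideal pillbox cavity of given radius."""
        return c * j01 / (2 * math.pi * radius_m)

    R = 0.0885                 # cavity radius in metres (hypothetical, ~1.3 GHz scale)
    dR = 50e-9                 # assumed radial wall deformation of 50 nm
    f0 = pillbox_tm010_freq(R)
    df = pillbox_tm010_freq(R + dR) - f0
    print(f"f0 = {f0/1e9:.4f} GHz, frequency shift = {df:.1f} Hz for dR = {dR*1e9:.0f} nm")
    ```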

  3. Measuring water fluxes in forests: The need for integrative platforms of analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Eric J.

    To understand the importance of analytical tools such as those provided by Berdanier et al. (2016) in this issue of Tree Physiology, one must understand both the grand challenges facing Earth system modelers, as well as the minutia of engaging in ecophysiological research in the field. It is between these two extremes of scale that many ecologists struggle to translate empirical research into useful conclusions that guide our understanding of how ecosystems currently function and how they are likely to change in the future. Likewise, modelers struggle to build complexity into their models that match this sophisticated understanding of how ecosystems function, so that necessary simplifications required by large scales do not themselves change the conclusions drawn from these simulations. As both monitoring technology and computational power increase, along with the continual effort in both empirical and modeling research, the gap between the scale of Earth system models and ecological observations continually closes. In addition, this creates a need for platforms of model–data interaction that incorporate uncertainties in both simulations and observations when scaling from one to the other, moving beyond simple comparisons of monthly or annual sums and means.

  4. Measuring water fluxes in forests: The need for integrative platforms of analysis

    DOE PAGES

    Ward, Eric J.

    2016-08-09

    To understand the importance of analytical tools such as those provided by Berdanier et al. (2016) in this issue of Tree Physiology, one must understand both the grand challenges facing Earth system modelers, as well as the minutia of engaging in ecophysiological research in the field. It is between these two extremes of scale that many ecologists struggle to translate empirical research into useful conclusions that guide our understanding of how ecosystems currently function and how they are likely to change in the future. Likewise, modelers struggle to build complexity into their models that match this sophisticated understanding of how ecosystems function, so that necessary simplifications required by large scales do not themselves change the conclusions drawn from these simulations. As both monitoring technology and computational power increase, along with the continual effort in both empirical and modeling research, the gap between the scale of Earth system models and ecological observations continually closes. In addition, this creates a need for platforms of model–data interaction that incorporate uncertainties in both simulations and observations when scaling from one to the other, moving beyond simple comparisons of monthly or annual sums and means.

  5. Exploring the simulation requirements for virtual regional anesthesia training

    NASA Astrophysics Data System (ADS)

    Charissis, V.; Zimmer, C. R.; Sakellariou, S.; Chan, W.

    2010-01-01

    This paper presents an investigation towards the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate Virtual Reality (VR) augmenting educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve-blocking techniques. The system is designed to operate in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relation of different human parts without any physical constraints. Furthermore, the proposed system aims to assist trainee anaesthetists in building a mental, three-dimensional map of the anatomical elements and their depicted relationship to the ultrasound imaging that is used for navigation of the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representation of real objects, and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface and presents a tentative plan for future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.

  6. Prediction of the Lorentz Force Detuning and pressure sensitivity for a Pillbox cavity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parise, M.

    The Lorentz Force Detuning (LFD) and the pressure sensitivity are two critical concerns during the design of a Superconducting Radio Frequency (SRF) cavity resonator. The mechanical deformation of the bare Niobium cavity walls, due to the electromagnetic fields and fluctuation of the external pressure in the Helium bath, can dynamically and statically detune the frequency of the cavity and can cause beam phase errors. The frequency shift can be compensated by additional RF power, which is required to maintain the accelerating gradient, or by sophisticated tuning mechanisms and control-compensation algorithms. Passive stiffening is one of the simplest and most effective tools that can be used during the early design phase, capable of satisfying the Radio Frequency (RF) requisites. This approach requires several multiphysics simulations as well as a deep mechanical and RF knowledge of the phenomena involved. In this paper, a new numerical model for a pillbox cavity is presented that can predict the frequency shifts caused by the LFD and external pressure. This method greatly reduces the computational effort needed to meet the RF requirements and to keep track of the frequency shifts without using time-consuming multiphysics simulations.

  7. Prediction of the Lorentz Force Detuning and pressure sensitivity for a Pillbox cavity

    NASA Astrophysics Data System (ADS)

    Parise, M.

    2018-05-01

    The Lorentz Force Detuning (LFD) and the pressure sensitivity are two critical concerns during the design of a Superconducting Radio Frequency (SRF) cavity resonator. The mechanical deformation of the bare Niobium cavity walls, due to the electromagnetic fields and fluctuation of the external pressure in the Helium bath, can dynamically and statically detune the frequency of the cavity and can cause beam phase errors. The frequency shift can be compensated by additional RF power, which is required to maintain the accelerating gradient, or by sophisticated tuning mechanisms and control-compensation algorithms. Passive stiffening is one of the simplest and most effective tools that can be used during the early design phase, capable of satisfying the Radio Frequency (RF) requisites. This approach requires several multiphysics simulations as well as a deep mechanical and RF knowledge of the phenomena involved. In this paper, a new numerical model for a pillbox cavity is presented that can predict the frequency shifts caused by the LFD and external pressure. This method greatly reduces the computational effort needed to meet the RF requirements and to keep track of the frequency shifts without using time-consuming multiphysics simulations.

  8. Prediction of the Lorentz Force Detuning and Pressure Sensitivity for a Pillbox Cavity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parise, M.

    2018-04-23

    The Lorentz Force Detuning (LFD) and the pressure sensitivity are two critical concerns during the design of a Superconducting Radio Frequency (SRF) cavity resonator. The mechanical deformation of the bare Niobium cavity walls, due to the electromagnetic fields and fluctuation of the external pressure in the Helium bath, can dynamically and statically detune the frequency of the cavity and can cause beam phase errors. The frequency shift can be compensated by additional RF power, which is required to maintain the accelerating gradient, or by sophisticated tuning mechanisms and control-compensation algorithms. Passive stiffening is one of the simplest and most effective tools that can be used during the early design phase, capable of satisfying the Radio Frequency (RF) requisites. This approach requires several multiphysics simulations as well as a deep mechanical and RF knowledge of the phenomena involved. In this paper, a new numerical model for a pillbox cavity is presented that can predict the frequency shifts caused by the LFD and external pressure. This method greatly reduces the computational effort needed to meet the RF requirements and to keep track of the frequency shifts without using time-consuming multiphysics simulations.

  9. Load Balancing Integrated Least Slack Time-Based Appliance Scheduling for Smart Home Energy Management.

    PubMed

    Silva, Bhagya Nathali; Khan, Murad; Han, Kijun

    2018-02-25

    The emergence of smart devices and smart appliances has highly favored the realization of the smart home concept. Modern smart home systems handle a wide range of user requirements. Energy management and energy conservation are in the spotlight when deploying sophisticated smart homes. However, the performance of energy management systems is highly influenced by user behaviors and adopted energy management approaches. Appliance scheduling is widely accepted as an effective mechanism to manage domestic energy consumption. Hence, we propose a smart home energy management system that reduces unnecessary energy consumption by integrating an automated switching-off system with a load-balancing and appliance-scheduling algorithm. The load-balancing scheme acts according to defined constraints such that the cumulative energy consumption of the household is kept below the defined maximum threshold. The scheduling of appliances adheres to the least slack time (LST) algorithm while considering user comfort during scheduling. The performance of the proposed scheme has been evaluated against an existing energy management scheme through computer simulation. The simulation results reveal a significant improvement gained through the proposed LST-based energy management scheme in terms of cost of energy, along with reduced domestic energy consumption facilitated by the automated switching-off mechanism.
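
    An illustrative least-slack-time sketch under a household power cap, loosely following the idea in the abstract: slack = deadline - current_time - remaining_runtime, the schedulable appliance with the least slack runs first, and appliances are added only while the cumulative load stays below the threshold. The appliance data and the cap are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Appliance:
        name: str
        power_kw: float
        remaining_h: int     # remaining runtime in one-hour slots
        deadline_h: int      # hour by which it must have finished

    appliances = [
        Appliance("washer",       0.8, 2, 8),
        Appliance("dishwasher",   1.2, 1, 6),
        Appliance("water_heater", 2.0, 3, 10),
        Appliance("ev_charger",   3.0, 4, 12),
    ]
    POWER_CAP_KW = 5.0   # assumed household threshold

    for hour in range(12):
        pending = [a for a in appliances if a.remaining_h > 0]
        if not pending:
            break
        # least slack first
        pending.sort(key=lambda a: a.deadline_h - hour - a.remaining_h)
        load, running = 0.0, []
        for a in pending:
            if load + a.power_kw <= POWER_CAP_KW:       # respect the cumulative threshold
                load += a.power_kw
                running.append(a.name)
                a.remaining_h -= 1
        print(f"hour {hour:2d}: load {load:.1f} kW, running {running}")
    ```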

  10. A Pipeline for Constructing a Catalog of Multi-method Models of Interacting Galaxies

    NASA Astrophysics Data System (ADS)

    Holincheck, Anthony

    Galaxies represent a fundamental unit of matter for describing the large-scale structure of the universe. Mutual interactions are among the major processes affecting the formation and evolution of galaxies. These interactions include gravitational tidal distortion, mass transfer, and even mergers. In any hierarchical model, mergers are the key mechanism in galaxy formation and evolution. Computer simulations of interacting galaxies have evolved in the last four decades from simple restricted three-body algorithms to full n-body gravity models. These codes often included sophisticated physical mechanisms such as gas dynamics, supernova feedback, and central black holes. As the level of complexity, and perhaps realism, increases, so does the amount of computational resources needed. These advanced simulations are often used in parameter studies of interactions. They are usually only employed in an ad hoc fashion to recreate the dynamical history of specific sets of interacting galaxies. These specific models are often created with only a few dozen or at most a few hundred sets of simulation parameters being attempted. This dissertation presents a prototype pipeline for modeling specific pairs of interacting galaxies in bulk. The process begins with a simple image of the current disturbed morphology and an estimate of distance to the system and mass of the galaxies. With the use of an updated restricted three-body simulation code and the help of Citizen Scientists, the pipeline is able to sample hundreds of thousands of points in parameter space for each system. Through the use of a convenient interface and innovative scoring algorithm, the pipeline aids researchers in identifying the best set of simulation parameters. This dissertation demonstrates a successful recreation of the disturbed morphologies of 62 pairs of interacting galaxies. The pipeline also provides for examining the level of convergence and uniqueness of the dynamical properties of each system. By creating a population of models for actual systems, the current research is able to compare simulation-based and observational values on a larger scale than previous efforts. Several potential relationships between star formation rate and dynamical time since closest approach are presented.
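
    A minimal restricted three-body sketch in the spirit of the fast simulation code the pipeline relies on: two point-mass "galaxies" move under their mutual gravity while massless test particles (a disc of stars around the primary) respond to both masses. Units are arbitrary (G = 1) and the masses, encounter orbit, and disc radius are hypothetical choices for illustration only.

    ```python
    import numpy as np

    G, dt, steps = 1.0, 0.01, 4000
    m1, m2 = 1.0, 0.3

    # primary at the origin; companion starts far away on an inbound orbit
    r1 = np.zeros(2); v1 = np.zeros(2)
    r2 = np.array([15.0, 6.0]); v2 = np.array([-0.45, 0.0])

    # ring of massless test particles on circular orbits around the primary
    n_p, rad = 200, 3.0
    ang = np.linspace(0, 2 * np.pi, n_p, endpoint=False)
    pts = rad * np.stack([np.cos(ang), np.sin(ang)], axis=1)
    vels = np.sqrt(G * m1 / rad) * np.stack([-np.sin(ang), np.cos(ang)], axis=1)

    def acc(pos, centre, mass):
        """Point-mass gravitational acceleration toward `centre` (softened slightly)."""
        d = centre - pos
        return G * mass * d / (np.linalg.norm(d, axis=-1, keepdims=True) ** 3 + 1e-9)

    for _ in range(steps):
        # advance the two galaxy centres (mutual gravity only)
        a1, a2 = acc(r1, r2, m2), acc(r2, r1, m1)
        v1 += a1 * dt; v2 += a2 * dt
        r1 += v1 * dt; r2 += v2 * dt
        # advance the massless particles in the combined field
        vels += (acc(pts, r1, m1) + acc(pts, r2, m2)) * dt
        pts += vels * dt

    # crude "disturbance" measure: spread of particle distances from the primary
    d = np.linalg.norm(pts - r1, axis=1)
    print(f"particle radii after encounter: median {np.median(d):.2f}, max {np.max(d):.2f} (initial {rad})")
    ```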

  11. Biomolecular computing systems: principles, progress and potential.

    PubMed

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  12. High-Tech Conservation: Information-Age Tools Have Revolutionized the Work of Ecologists.

    ERIC Educational Resources Information Center

    Chiles, James R.

    1992-01-01

    Describes a new direction for conservation efforts influenced by the advance of the information age and the introduction of many technologically sophisticated information collecting devices. Devices include microscopic computer chips, miniature electronic components, and Earth-observation satellites. (MCO)

  13. Using Visual Basic to Teach Programming for Geographers.

    ERIC Educational Resources Information Center

    Slocum, Terry A.; Yoder, Stephen C.

    1996-01-01

    Outlines reasons why computer programming should be taught to geographers. These include experience using macro (scripting) languages and sophisticated visualization software, and developing a deeper understanding of general hardware and software capabilities. Discusses the distinct advantages and few disadvantages of the programming language…

  14. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  15. Resistance Is Futile

    ERIC Educational Resources Information Center

    O'Hanlon, Charlene

    2009-01-01

    How odd it should seem that even today, with interactive whiteboards, content management systems, wireless broadband, handhelds, and every sort of sophisticated computing device penetrating and improving the classroom experience for students, David Roh, general manager for Follett Digital Resources can still say, "There are hundreds of…

  16. Phase Distribution Phenomena for Simulated Microgravity Conditions: Experimental Work

    NASA Technical Reports Server (NTRS)

    Singhal, Maneesh; Bonetto, Fabian J.; Lahey, R. T., Jr.

    1996-01-01

    This report summarizes the work accomplished at Rensselaer to study phase distribution phenomenon under simulated microgravity conditions. Our group at Rensselaer has been able to develop sophisticated analytical models to predict phase distribution in two-phase flows under a variety of conditions. These models are based on physics and data obtained from carefully controlled experiments that are being conducted here. These experiments also serve to verify the models developed.

  17. Phase Distribution Phenomena for Simulated Microgravity Conditions: Experimental Work

    NASA Technical Reports Server (NTRS)

    Singhal, Maneesh; Bonetto, Fabian J.; Lahey, R. T., Jr.

    1996-01-01

    This report summarizes the work accomplished at Rensselaer to study phase distribution phenomenon under simulated microgravity conditions. Our group at Rensselaer has been able to develop sophisticated analytical models to predict phase distribution in two-phase flows under a variety of conditions. These models are based on physics and data obtained from carefully controlled experiments that are being conducted here. These experiments also serve to verify the models developed.

  18. Wireless Interconnects for Intra-chip & Inter-chip Transmission

    NASA Astrophysics Data System (ADS)

    Narde, Rounak Singh

    With the emergence of the Internet of Things and the information revolution, the demand for high-performance computing systems is increasing. The copper interconnects inside computing chips have evolved into a sophisticated network of interconnects known as a Network on Chip (NoC), comprising routers, switches and repeaters, just like computer networks. When a network on chip is implemented on a large scale, as in Multicore Multichip (MCMC) systems for High Performance Computing (HPC), the length of interconnects increases, and so do problems such as power dissipation, interconnect delays, clock synchronization and electrical noise. In this thesis, wireless interconnects are chosen as the substitute for wired copper interconnects. Wireless interconnects offer easy integration with CMOS fabrication and chip packaging. Using wireless interconnects working in the unlicensed mm-wave band (57-64 GHz), high data rates of Gbps can be achieved. This thesis presents a study of transmission between zigzag antennas as wireless interconnects for MCMC systems and 3D ICs. For MCMC systems, a four-chip, 16-core model is analyzed with only four wireless interconnects in three configurations with different antenna orientations and locations. Return loss and transmission coefficients are simulated in ANSYS HFSS. Moreover, wireless interconnects are designed, fabricated and tested on a 6'' silicon wafer with a resistivity of 55 Ω-cm using a basic standard CMOS process. The wireless interconnects are designed to work at 30 GHz using ANSYS HFSS. The fabricated antennas resonate around 20 GHz with a return loss of less than -10 dB. The transmission coefficients between antenna pairs within a 20 mm x 20 mm silicon die are found to vary between -45 dB and -55 dB. Furthermore, the wireless interconnect approach is extended to 3D ICs, with the wireless interconnects implemented as zigzag antennas, and different configurations of antenna orientations and coolants are analyzed. The return loss and transmission coefficients are simulated using ANSYS HFSS.

  19. Adiabatic quantum computing with spin qubits hosted by molecules.

    PubMed

    Yamamoto, Satoru; Nakazawa, Shigeaki; Sugisaki, Kenji; Sato, Kazunobu; Toyota, Kazuo; Shiomi, Daisuke; Takui, Takeji

    2015-01-28

    A molecular spin quantum computer (MSQC) requires electron spin qubits, which pulse-based electron spin/magnetic resonance (ESR/MR) techniques can afford to manipulate for implementing quantum gate operations in open shell molecular entities. Importantly, nuclear spins, which are topologically connected, particularly in organic molecular spin systems, are client qubits, while electron spins play a role of bus qubits. Here, we introduce the implementation for an adiabatic quantum algorithm, suggesting the possible utilization of molecular spins with optimized spin structures for MSQCs. We exemplify the utilization of an adiabatic factorization problem of 21, compared with the corresponding nuclear magnetic resonance (NMR) case. Two molecular spins are selected: one is a molecular spin composed of three exchange-coupled electrons as electron-only qubits and the other an electron-bus qubit with two client nuclear spin qubits. Their electronic spin structures are well characterized in terms of the quantum mechanical behaviour in the spin Hamiltonian. The implementation of adiabatic quantum computing/computation (AQC) has, for the first time, been achieved by establishing ESR/MR pulse sequences for effective spin Hamiltonians in a fully controlled manner of spin manipulation. The established pulse sequences have been compared with the NMR experiments and show much faster CPU times, corresponding to the interaction strength between the spins. Significant differences are shown in rotational operations and pulse intervals for ESR/MR operations. As a result, we suggest the advantages and possible utilization of the time-evolution based AQC approach for molecular spin quantum computers and molecular spin quantum simulators underlain by sophisticated ESR/MR pulsed spin technology.

  20. The CSIRO Mk3L climate system model v1.0 coupled to the CABLE land surface scheme v1.4b: evaluation of the control climatology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, Jiafu; Phipps, S.J.; Pitman, A.J.

    The CSIRO Mk3L climate system model, a reduced-resolution coupled general circulation model, has previously been described in this journal. The model is configured for millennium scale or multiple century scale simulations. This paper reports the impact of replacing the relatively simple land surface scheme that is the default parameterisation in Mk3L with a sophisticated land surface model that simulates the terrestrial energy, water and carbon balance in a physically and biologically consistent way. An evaluation of the new model's near-surface climatology highlights strengths and weaknesses, but overall the atmospheric variables, including the near-surface air temperature and precipitation, are simulated well. The impact of the more sophisticated land surface model on existing variables is relatively small, but generally positive. More significantly, the new land surface scheme allows an examination of surface carbon-related quantities including net primary productivity, which adds significantly to the capacity of Mk3L. Overall, results demonstrate that this reduced-resolution climate model is a good foundation for exploring long time scale phenomena. The addition of the more sophisticated land surface model enables an exploration of important Earth System questions including land cover change and abrupt changes in terrestrial carbon storage.

  1. Efficient Reservoir Simulation with Cubic Plus Association and Cross-Association Equation of State for Multicomponent Three-Phase Compressible Flow with Applications in CO2 Storage and Methane Leakage

    NASA Astrophysics Data System (ADS)

    Moortgat, J.

    2017-12-01

    We present novel simulation tools to model multiphase multicomponent flow and transport in porous media for mixtures that contain non-polar hydrocarbons, self-associating polar water, and cross-associating molecules like methane, ethane, unsaturated hydrocarbons, CO2 and H2S. Such mixtures often occur when CO2 is injected and stored in saline aquifers, or when methane is leaking into groundwater. To accurately predict the species transfer between aqueous, gaseous and oleic phases, and the subsequent change in phase properties, the self- and cross-associating behavior of molecules needs to be taken into account, particularly at the typical temperatures and pressures in deep formations. The Cubic-Plus-Association equation-of-state (EOS) has been demonstrated to be highly accurate for such problems but its excessive computational cost has prevented widespread use in reservoir simulators. We discuss the thermodynamical framework and develop sophisticated numerical algorithms that allow reservoir simulations with efficiencies comparable to a simple cubic EOS. This approach improves our predictive powers for highly nonlinear fluid behavior related to geological carbon sequestration, such as density driven flow and natural convection (solubility trapping), evaporation of water into the CO2-rich gas phase, and competitive dissolution-evaporation when CO2 is injected in, e.g., methane saturated aquifers. Several examples demonstrate the accuracy and robustness of this EOS framework for complex applications.
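
    For orientation, the sketch below solves only the cubic "backbone" that CPA-type equations of state build on: the Peng-Robinson EOS solved for the compressibility factor Z. The association and cross-association contributions that distinguish CPA are deliberately omitted, and the fluid constants are textbook CO2 critical properties used purely as an example.

    ```python
    import numpy as np

    R = 8.314462618                             # J/(mol K)
    Tc, Pc, omega = 304.13, 7.377e6, 0.225      # CO2 critical temperature, pressure, acentric factor
    T, P = 313.15, 1.0e7                        # state point: 40 C, 100 bar

    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)

    # Peng-Robinson cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    real_roots = sorted(r.real for r in roots if abs(r.imag) < 1e-10 and r.real > B)

    print("compressibility factor(s) Z:", [round(z, 4) for z in real_roots])
    print("molar density (mol/m^3):", [round(P / (z * R * T), 1) for z in real_roots])
    ```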

  2. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    NASA Astrophysics Data System (ADS)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of size distribution of rock fragments have been developed. In this study, a blast fragmentation Monte Carlo-based simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact rock and joint properties, the type and properties of explosives and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, the corresponding costs and the overall economics of open pit mines and rock quarries.
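
    A Monte Carlo sketch built around a commonly quoted form of the Kuz-Ram relations (mean size X50 = A * K^-0.8 * Q^(1/6) * (115/RWS)^(19/20) with a Rosin-Rammler size distribution). The parameter values, their uncertainty ranges, the uniformity index, and the exact exponents should be checked against the paper; they are placeholders meant only to show how sampling uncertain inputs yields a distribution of fragmentation forecasts.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_trials = 10_000

    A   = rng.normal(7.0, 1.0, n_trials)     # rock factor (uncertain, assumed range)
    K   = rng.normal(0.55, 0.05, n_trials)   # powder factor, kg/m^3 (uncertain, assumed range)
    Q   = 120.0                              # explosive mass per hole, kg (assumed)
    RWS = 100.0                              # relative weight strength vs ANFO (assumed)
    n_u = 1.4                                # Rosin-Rammler uniformity index (assumed)
    screen_cm = 50.0                         # screen size of interest

    x50 = A * K**-0.8 * Q**(1 / 6) * (115.0 / RWS)**(19 / 20)      # mean fragment size, cm
    passing = 1.0 - np.exp(-0.693 * (screen_cm / x50)**n_u)        # fraction passing the screen

    print(f"X50: median {np.median(x50):.1f} cm "
          f"(5-95%: {np.percentile(x50, 5):.1f}-{np.percentile(x50, 95):.1f} cm)")
    print(f"Fraction passing {screen_cm:.0f} cm: median {np.median(passing):.2f}")
    ```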

  3. A YEAR-LONG MM5 EVALUATION USING A MODEL EVALUATION TOOLKIT

    EPA Science Inventory

    Air quality modeling has expanded in both sophistication and application over the past decade. Meteorological and air quality modeling tools are being used for research, forecasting, and regulatory related emission control strategies. Results from air quality simulations have far...

  4. Verification, Validation, and Accreditation Challenges of Distributed Simulation for Space Exploration Technology

    NASA Technical Reports Server (NTRS)

    Thomas, Danny; Hartway, Bobby; Hale, Joe

    2006-01-01

    Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.

  5. Simplified galaxy formation with mesh-less hydrodynamics

    NASA Astrophysics Data System (ADS)

    Lupi, Alessandro; Volonteri, Marta; Silk, Joseph

    2017-09-01

    Numerical simulations have become a necessary tool to describe the complex interactions among the different processes involved in galaxy formation and evolution, unfeasible via an analytic approach. The last decade has seen a great effort by the scientific community in improving the sub-grid physics modelling and the numerical techniques used to make numerical simulations more predictive. Although the recently publicly available code gizmo has proven to be successful in reproducing galaxy properties when coupled with the model of the MUFASA simulations and the more sophisticated prescriptions of the Feedback In Realistic Environment (FIRE) set-up, it has not been tested yet using delayed cooling supernova feedback, which still represent a reasonable approach for large cosmological simulations, for which detailed sub-grid models are prohibitive. In order to limit the computational cost and to be able to resolve the disc structure in the galaxies we perform a suite of zoom-in cosmological simulations with rather low resolution centred around a sub-L* galaxy with a halo mass of 3 × 10^11 M⊙ at z = 0, to investigate the ability of this simple model, coupled with the new hydrodynamic method of gizmo, to reproduce observed galaxy scaling relations (stellar to halo mass, stellar and baryonic Tully-Fisher, stellar mass-metallicity and mass-size). We find that the results are in good agreement with the main scaling relations, except for the total stellar mass, larger than that predicted by the abundance matching technique, and the effective sizes for the most massive galaxies in the sample, which are too small.

  6. Pumping Optimization Model for Pump and Treat Systems - 15091

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, S.; Ivarson, Kristine A.; Karanovic, M.

    2015-01-15

    Pump and Treat systems are being utilized to remediate contaminated groundwater in the Hanford 100 Areas adjacent to the Columbia River in Eastern Washington. Design of the systems was supported by a three-dimensional (3D) fate and transport model. This model provided sophisticated simulation capabilities but requires many hours to calculate results for each simulation considered. Many simulations are required to optimize system performance, so a two-dimensional (2D) model was created to reduce run time. The 2D model was developed as an equivalent-property version of the 3D model that derives boundary conditions and aquifer properties from the 3D model. It produces predictions that are very close to the 3D model predictions, allowing it to be used for comparative remedy analyses. Any potential system modifications identified by using the 2D version are verified for use by running the 3D model to confirm performance. The 2D model was incorporated into a comprehensive analysis system (the Pumping Optimization Model, POM) to simplify analysis of multiple simulations. It allows rapid turnaround by utilizing a graphical user interface that (1) allows operators to create hypothetical scenarios for system operation, (2) feeds the input to the 2D fate and transport model, and (3) displays the scenario results to evaluate performance improvement. All of the above is accomplished within the user interface. Complex analyses can be completed within a few hours and multiple simulations can be compared side-by-side. The POM utilizes standard office computing equipment and established groundwater modeling software.

  7. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  8. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  9. Suicide Prevention in a Treatment Setting.

    ERIC Educational Resources Information Center

    Litman, Robert E.

    1995-01-01

    The author anticipates that sophisticated interactive computer programs will be effective in improving screening and case finding of the suicidal and that they will become invaluable in improving training for primary care providers and outpatient mental health workers. Additionally, improved communication networks will help maintain continuity of…

  10. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The computed dose rate was integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs). ...

  11. Simplified jet-A kinetic mechanism for combustor application

    NASA Technical Reports Server (NTRS)

    Lee, Chi-Ming; Kundu, Krishna; Ghorashi, Bahman

    1993-01-01

    Successful modeling of combustion and emissions in gas turbine engine combustors requires an adequate description of the reaction mechanism. For hydrocarbon oxidation, detailed mechanisms are only available for the simplest types of hydrocarbons such as methane, ethane, acetylene, and propane. These detailed mechanisms contain a large number of chemical species participating simultaneously in many elementary kinetic steps. Current computational fluid dynamic (CFD) models must include fuel vaporization, fuel-air mixing, chemical reactions, and complicated boundary geometries. To simulate these conditions, a very sophisticated computer model is needed, which requires large computer memory capacity and long run times. Therefore, gas turbine combustion modeling has frequently been simplified by using global reaction mechanisms, which can predict only the quantities of interest: heat release rates, flame temperature, and emissions. Jet fuels are wide-boiling-range hydrocarbons with ranges extending through those of gasoline and kerosene. These fuels are chemically complex, often containing more than 300 components. Jet fuel typically can be characterized as containing 70 vol pct paraffin compounds and 25 vol pct aromatic compounds. A five-step Jet-A fuel mechanism, which involves pyrolysis and subsequent oxidation of paraffin and aromatic compounds, is presented here. This mechanism is verified by comparison with experimental Jet-A ignition delay times and species concentrations obtained from flame-tube experiments. This five-step mechanism appears to be better than the current one- and two-step mechanisms.
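
    The sketch below shows how a global (reduced) mechanism is typically used once its rate constants are known: a single-step rate w = A*[F]^a*[O2]^b*exp(-Ea/(R*T)) consumes fuel and releases heat in a constant-volume reactor until thermal runaway. Every parameter here is a generic placeholder, not a value from the five-step Jet-A mechanism described in the abstract.

    ```python
    import math

    R_u, A_pre, Ea = 8.314, 2.0e6, 1.25e5    # gas constant, pre-exponential, activation energy (assumed)
    a, b = 0.25, 1.5                          # assumed reaction orders in fuel and O2
    q, cv = 5.0e6, 33.0                       # heat release per mol fuel, molar heat capacity (assumed)
    T, fuel, o2, total = 1000.0, 1.5, 25.0, 110.0   # K and mol/m^3 (roughly 10 atm of fuel + air)

    dt, t, T0 = 1e-7, 0.0, 1000.0
    while T < T0 + 400.0 and t < 5e-3 and fuel > 0.0:
        w = A_pre * fuel**a * o2**b * math.exp(-Ea / (R_u * T))   # mol fuel / m^3 / s
        fuel = max(fuel - w * dt, 0.0)
        o2 = max(o2 - 17.0 * w * dt, 0.0)     # assumed stoichiometric O2 consumption
        T += q * w * dt / (cv * total)        # adiabatic temperature rise
        t += dt

    print(f"ignition delay (400 K rise criterion) ~ {t*1e3:.2f} ms, T = {T:.0f} K")
    ```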

  12. A Wigner Monte Carlo approach to density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.

    2014-08-01

    In order to simulate quantum N-body systems, stationary and time-dependent density functional theories rely on the capacity of calculating the single-electron wave-functions of a system from which one obtains the total electron density (Kohn–Sham systems). In this paper, we introduce the use of the Wigner Monte Carlo method in ab-initio calculations. This approach allows time-dependent simulations of chemical systems in the presence of reflective and absorbing boundary conditions. It also enables an intuitive comprehension of chemical systems in terms of the Wigner formalism based on the concept of phase-space. Finally, being based on a Monte Carlo method, it scales very well on parallel machines, paving the way towards the time-dependent simulation of very complex molecules. A validation is performed by studying the electron distribution of three different systems, a Lithium atom, a Boron atom and a hydrogenic molecule. For the sake of simplicity, we start from initial conditions not too far from equilibrium and show that the systems reach a stationary regime, as expected (although no restriction is imposed on the choice of the initial conditions). We also show a good agreement with the standard density functional theory for the hydrogenic molecule. These results demonstrate that the combination of the Wigner Monte Carlo method and Kohn–Sham systems provides a reliable computational tool which could, eventually, be applied to more sophisticated problems.

  13. PUBLISHING SPILL IMPACT MAPS OVER THE WEB

    EPA Science Inventory

    This paper discusses the implementation of a web-based map publishing technology within a USEPA GIS laboratory. A sophisticated spill travel prediction model for the Ohio River has been installed within the GIS laboratory, and is used by personnel from the NRMRL. The spill simul...

  14. Role of Atmospheric Chemistry in the Climate Impacts of Stratospheric Volcanic Injections

    NASA Technical Reports Server (NTRS)

    Legrande, Allegra N.; Tsigaridis, Kostas; Bauer, Susanne E.

    2016-01-01

    The climate impact of a volcanic eruption is known to be dependent on the size, location and timing of the eruption. However, the chemistry and composition of the volcanic plume also control its impact on climate. It is not just sulfur dioxide gas, but also the coincident emissions of water, halogens and ash that influence the radiative and climate forcing of an eruption. Improvements in the capability of models to capture aerosol microphysics, and the inclusion of chemistry and aerosol microphysics modules in Earth system models, allow us to evaluate the interaction of composition and chemistry within volcanic plumes in a new way. These modeling efforts also illustrate the role of water vapor in controlling the chemical evolution, and hence climate impacts, of the plume. A growing realization of the importance of the chemical composition of volcanic plumes is leading to a more sophisticated and realistic representation of volcanic forcing in climate simulations, which in turn aids in reconciling simulations and proxy reconstructions of the climate impacts of past volcanic eruptions. More sophisticated simulations are expected to help, eventually, with predictions of the impact on the Earth system of any future large volcanic eruptions.

  15. Interactive Physical Simulation of Catheter Motion within Major Vessel Structures and Cavities for ASD/VSD Treatment

    NASA Astrophysics Data System (ADS)

    Becherer, Nico; Hesser, Jürgen; Kornmesser, Ulrike; Schranz, Dietmar; Männer, Reinhard

    2007-03-01

    Simulation systems are becoming increasingly essential in medical education. Capturing the physical behaviour of the real world requires sophisticated modelling of instruments within the virtual environment. Most models currently used are not capable of user-interactive simulation because of the cost of computing the complex underlying analytical equations. Alternatives are often based on simplified mass-spring systems, which deliver high update rates at the cost of less realistic motion. In addition, most techniques are limited to narrow and tubular vessel structures or restrict shape alterations to two degrees of freedom, not allowing instrument deformations like torsion. In contrast, our approach combines high update rates with highly realistic motion and can in addition be applied to arbitrary structures like vessels or cavities (e.g. atrium, ventricle) without limiting the degrees of freedom. Based on energy minimization, bending energies and vessel structures are considered as linear elastic elements; energies are evaluated at regularly spaced points on the instrument, while the distance of the points is fixed, i.e. we simulate an articulated structure of joints with fixed connections between them. Arbitrary tissue structures are modeled through adaptive distance fields and are connected by nodes via an undirected graph system. The instrument points are linked to nodes by a system of rules. Energy minimization uses a quasi-Newton method without preconditioning; gradients are estimated using a combination of analytical and numerical terms. Results show high quality in motion simulation when compared to a phantom model. The approach is also robust and fast. Simulating an instrument with 100 joints runs at 100 Hz on a 3 GHz PC.
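
    A minimal sketch of the energy-minimization idea in the abstract: the instrument is an articulated chain of fixed-length segments parameterized by joint angles, its bending energy is quadratic in the angle changes, and a soft penalty keeps every joint inside a circular "vessel". The geometry, stiffnesses, vessel shape, and the use of SciPy's BFGS routine as a stand-in quasi-Newton minimizer are all hypothetical illustrations, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    N_SEG, SEG_LEN = 20, 0.5       # number of joints and fixed segment length (assumed)
    K_BEND, K_WALL = 1.0, 50.0     # bending stiffness and wall-penalty weight (assumed)
    R_VESSEL = 2.0                 # radius of the circular vessel/cavity (assumed)
    CENTRE = np.array([0.0, 1.5])  # centre of the vessel (assumed)

    def joint_positions(angles):
        """Accumulate fixed-length segments; each angle is relative to the previous segment."""
        headings = np.cumsum(angles)
        steps = SEG_LEN * np.stack([np.cos(headings), np.sin(headings)], axis=1)
        return np.cumsum(steps, axis=0)

    def energy(angles):
        bend = K_BEND * np.sum(np.diff(angles) ** 2)                 # linear-elastic bending energy
        pts = joint_positions(angles)
        r = np.linalg.norm(pts - CENTRE, axis=1)                     # distance from vessel centre
        wall = K_WALL * np.sum(np.maximum(r - R_VESSEL, 0.0) ** 2)   # penalty outside the wall
        return bend + wall

    theta0 = np.full(N_SEG, 0.15)                  # initial insertion configuration (assumed)
    res = minimize(energy, theta0, method="BFGS")  # quasi-Newton minimization
    print("final energy:", round(res.fun, 4), "converged:", res.success)
    ```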

  16. Coupled Multi-physical Simulations for the Assessment of Nuclear Waste Repository Concepts: Modeling, Software Development and Simulation

    NASA Astrophysics Data System (ADS)

    Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.

    2016-12-01

    As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the geological candidate formations rock salt, clay stone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences on the elastic response as well as on primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in clay stone. These processes are also considered in the near field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, over verification and validation, to repository-scale application simulations using methods of high performance computing.

  17. Skylab fluid mechanics simulations: Oscillation, rotation, collision and coalescence of water droplets under low-gravity environment

    NASA Technical Reports Server (NTRS)

    Vaughan, O. H., Jr.; Hung, R. J.

    1975-01-01

    Skylab 4 crew members performed a series of demonstrations showing the oscillation, rotation, and collision and coalescence of water droplets, which simulate various physical models of fluids in a low-gravity environment. The results from the Skylab demonstrations provide information on, and illustrate the potential of, an orbiting space-oriented research laboratory for the study of more sophisticated fluid mechanics experiments. Experiments and results are discussed.

  18. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid 1940's, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).

  19. A HyperCard Program for Business German.

    ERIC Educational Resources Information Center

    Paulsell, Patricia R.

    Although the use of computer-assisted language instruction software has been mainly limited to grammatical/syntactical drills, the increasing number of language professionals with programming skills is leading to the development of more sophisticated language education programs. This report describes the generation of such a program using the…

  20. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programing software, to create their own inventions. One…

  1. Education in the Information Age.

    ERIC Educational Resources Information Center

    Hay, Lee

    1983-01-01

    This essay considers the revolutionized education of a projected future of cheap and sophisticated technology. Predictions include a redefinition of literacy and basic skills and a restructuring of educational delivery employing computers to dispense information in order to free teachers to work directly with students on cognitive development.…

  2. Research and Development in Natural Language Understanding as Part of the Strategic Computing Program.

    DTIC Science & Technology

    1987-04-01

    BBN is developing a series of increasingly sophisticated natural language understanding systems which will serve as an integrated interface…

  3. 12 Math Rules That Expire in the Middle Grades

    ERIC Educational Resources Information Center

    Karp, Karen S.; Bush, Sarah B.; Dougherty, Barbara J.

    2015-01-01

    Many rules taught in mathematics classrooms "expire" when students develop knowledge that is more sophisticated, such as using new number systems. For example, in elementary grades, students are sometimes taught that "addition makes bigger" or "subtraction makes smaller" when learning to compute with whole numbers,…

  4. Symbionic Technology and Education. Report 83-02.

    ERIC Educational Resources Information Center

    Cartwright, Glenn F.

    Research findings indicate that major breakthroughs in education will have to occur through direct cortical intervention, using either chemical or electronic means. It will eventually be possible to build sophisticated intelligence amplifiers that will be internal extensions of our brains, significantly more powerful than present day computers,…

  5. Assessing Higher Order Thinking in Video Games

    ERIC Educational Resources Information Center

    Rice, John

    2007-01-01

    Computer video games have become highly interesting to educators and researchers since their sophistication has improved considerably over the last decade. Studies indicate simple video games touting educational benefits are common in classrooms. However, a need for identifying truly useful games for educational purposes exists. This article…

  6. STAF: A Powerful and Sophisticated CAI System.

    ERIC Educational Resources Information Center

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  7. Interactive Video-Based Industrial Training in Basic Electronics.

    ERIC Educational Resources Information Center

    Mirkin, Barry

    The Wisconsin Foundation for Vocational, Technical, and Adult Education is currently involved in the development, implementation, and distribution of a sophisticated interactive computer and video learning system. Designed to offer trainees an open entry and open exit opportunity to pace themselves through a comprehensive competency-based,…

  8. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  9. Numerical Propulsion System Simulation (NPSS): An Award Winning Propulsion System Simulation Tool

    NASA Technical Reports Server (NTRS)

    Stauber, Laurel J.; Naiman, Cynthia G.

    2002-01-01

    The Numerical Propulsion System Simulation (NPSS) is a full propulsion system simulation tool used by aerospace engineers to predict and analyze the aerothermodynamic behavior of commercial jet aircraft, military applications, and space transportation. The NPSS framework was developed to support aerospace, but other applications are already leveraging the initial capabilities, such as aviation safety, ground-based power, and alternative energy conversion devices such as fuel cells. By using the framework and developing the necessary components, future applications that NPSS could support include nuclear power, water treatment, biomedicine, chemical processing, and marine propulsion. NPSS will dramatically reduce the time, effort, and expense necessary to design and test jet engines. It accomplishes that by generating sophisticated computer simulations of an aerospace object or system, thus enabling engineers to "test" various design options without having to conduct costly, time-consuming real-life tests. The ultimate goal of NPSS is to create a numerical "test cell" that enables engineers to create complete engine simulations overnight on cost-effective computing platforms. Using NPSS, engine designers will be able to analyze different parts of the engine simultaneously, perform different types of analysis simultaneously (e.g., aerodynamic and structural), and perform analysis in a more efficient and less costly manner. NPSS will cut the development time of a new engine in half, from 10 years to 5 years. And NPSS will have a similar effect on the cost of development: new jet engines will cost about a billion dollars to develop rather than two billion. NPSS is also being applied to the development of space transportation technologies, and it is expected that similar efficiencies and cost savings will result. Advancements of NPSS in fiscal year 2001 included enhancing the NPSS Developer's Kit to easily integrate external components of varying fidelities, providing the initial Visual-Based Syntax (VBS) capability, and developing additional capabilities to support space transportation. NPSS was supported under NASA's High Performance Computing and Communications Program. Through the NASA/Industry Cooperative Effort agreement, NASA Glenn and its industry and Government partners are developing NPSS. The NPSS team consists of propulsion experts and software engineers from GE Aircraft Engines, Pratt & Whitney, The Boeing Company, Honeywell, Rolls-Royce Corporation, Williams International, Teledyne Continental Motors, Arnold Engineering Development Center, Wright Patterson Air Force Base, and the NASA Glenn Research Center. Glenn is leading the way in developing NPSS--a method for solving complex design problems that's faster, better, and cheaper.

  10. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  11. Control technology for future aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    Zeller, J. R.; Szuch, J. R.; Merrill, W. C.; Lehtinen, B.; Soeder, J. F.

    1984-01-01

    The need for a more sophisticated engine control system is discussed. Improvements in thrust-to-weight ratio demand the manipulation of more control inputs. New technological solutions to the engine control problem are being applied. The digital electronic engine control (DEEC) system is a step in the evolution toward full digital electronic engine control. Technology issues are addressed to ensure growing confidence in sophisticated electronic controls for aircraft turbine engines. The need for a control system architecture that permits propulsion controls to be functionally integrated with other aircraft systems is established. Areas of technology studied include: (1) control design methodology; (2) improved modeling and simulation methods; and (3) implementation technologies. Objectives, results and future thrusts are summarized.

  12. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.
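
    For reference, the conservative full-potential equation solved by codes of this type can be written, in one common nondimensional form (which may differ in detail from the scaling used in TAIR), as

        \nabla \cdot \left( \rho\, \nabla \phi \right) = 0, \qquad
        \rho = \left[ 1 + \frac{\gamma - 1}{2}\, M_\infty^{2}
               \left( 1 - \left| \nabla \phi \right|^{2} \right) \right]^{\frac{1}{\gamma - 1}},

    where \phi is the velocity potential scaled by the freestream speed, \rho the density scaled by its freestream value, M_\infty the freestream Mach number and \gamma the ratio of specific heats.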

  13. Improving Access to Data While Protecting Confidentiality: Prospects for the Future.

    ERIC Educational Resources Information Center

    Duncan, George T.; Pearson, Robert W.

    Providing researchers, especially those in the social sciences, with access to publicly collected microdata furthers research while advancing public policy goals in a democratic society. However, while technological improvements have eased remote access to these databases and enabled computer-using researchers to perform sophisticated statistical…

  14. Available for the Apple II: FIRM: Florida InteRactive Modeler.

    ERIC Educational Resources Information Center

    Levy, C. Michael; And Others

    1983-01-01

    The Apple II microcomputer program described allows instructors with minimal programming experience to construct computer models of psychological phenomena for students to investigate. Use of these models eliminates the need to maintain, house, or breed animals or to purchase sophisticated laboratory equipment. Several content models are also described,…

  15. MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)

    EPA Science Inventory

    A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...

  16. Analysis of an Anti-Phishing Lab Activity

    ERIC Educational Resources Information Center

    Werner, Laurie A.; Courte, Jill

    2010-01-01

    Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…

  17. Artificial Intelligence Applications in Special Education: How Feasible? Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.; Ferrara, Joseph M.

    The research project investigated whether expert system tools have become sophisticated enough to be applied efficiently to problems in special education. (Expert systems are a development of artificial intelligence that combines the computer's capacity for storing specialized knowledge with a general set of rules intended to replicate the…

  18. Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.

    ERIC Educational Resources Information Center

    Ng, Raymond

    Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…

  19. CBES--An Efficient Implementation of the Coursewriter Language.

    ERIC Educational Resources Information Center

    Franks, Edward W.

    An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…

  20. Detecting Satisficing in Online Surveys

    ERIC Educational Resources Information Center

    Salifu, Shani

    2012-01-01

    The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…

  1. A Performance Support Tool for Cisco Training Program Managers

    ERIC Educational Resources Information Center

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  2. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  3. Robotics for Computer Scientists: What's the Big Idea?

    ERIC Educational Resources Information Center

    Touretzky, David S.

    2013-01-01

    Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…

  4. Teaching Conversations with the XDS Sigma 7. Systems Description.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Mosmann, Charles

    Some computers permit conventional programing languages to be extended by the use of macro-instructions, a sophisticated programing tool which is especially useful in writing instructional dialogs. Macro-instructions (or "macro's") are complex commands defined in terms of the machine language or other macro-instructions. Like terms in…

  5. Data management in the mission data system

    NASA Technical Reports Server (NTRS)

    Wagner, David A.

    2005-01-01

    As spacecraft evolve from simple embedded devices into more sophisticated computing platforms with complex behaviors, it is increasingly necessary to model and manage the flow of data, and to provide uniform models for managing data that promote adaptability yet pay heed to the physical limitations of the embedded and space environments.

  6. Technology Acceptance in Social Work Education: Implications for the Field Practicum

    ERIC Educational Resources Information Center

    Colvin, Alex Don; Bullock, Angela N.

    2014-01-01

    The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…

  7. Models and Methodologies for Multimedia Courseware Production.

    ERIC Educational Resources Information Center

    Barker, Philip; Giller, Susan

    Many new technologies are now available for delivering and/or providing access to computer-based learning (CBL) materials. These technologies vary in sophistication in many important ways, depending upon the bandwidth that they provide, the interactivity that they offer and the types of end-user connectivity that they support. Invariably,…

  8. Mantle circulation models with variational data assimilation: Inferring past mantle flow and structure from plate motion histories and seismic tomography

    NASA Astrophysics Data System (ADS)

    Bunge, Hans-Peter

    2002-08-01

    Earth's mantle overturns itself about once every 200 million years (myrs). Prima facie evidence for this overturn is the motion of tectonic plates at the surface of the Earth driving the geologic activity of our planet. Supporting evidence also comes from seismic tomograms of the Earth's interior that reveal the convective currents in remarkable clarity. Much has been learned about the physics of solid state mantle convection over the past two decades, aided primarily by sophisticated computer simulations. Such simulations are reaching the threshold of fully resolving the convective system globally. In this talk we will review recent progress in mantle dynamics studies. We will then turn our attention to the fundamental question of whether it is possible to explicitly reconstruct mantle flow back in time. This is a classic problem of history matching, amenable to control theory and data assimilation. The technical advances that make such an approach feasible are dramatically increasing compute resources, represented for example through Beowulf clusters, and new observational initiatives, represented for example through the US-Array effort that should lead to an order-of-magnitude improvement in our ability to resolve Earth structure seismically below North America. In fact, new observational constraints on deep Earth structure illustrate the growing importance of improving our data assimilation skills in deep Earth models. We will explore data assimilation through high resolution global adjoint models of mantle circulation and conclude that it is feasible to reconstruct mantle flow back in time for at least the past 100 myrs.

  9. The future of simulation technologies for complex cardiovascular procedures.

    PubMed

    Cates, Christopher U; Gallagher, Anthony G

    2012-09-01

    Changing work practices and the evolution of more complex interventions in cardiovascular medicine are forcing a paradigm shift in the way doctors are trained. Implantable cardioverter defibrillator (ICD), transcatheter aortic valve implantation (TAVI), carotid artery stenting (CAS), and acute stroke intervention procedures are forcing these changes at a faster pace than in other disciplines. As a consequence, cardiovascular medicine has had to develop a sophisticated understanding of precisely what is meant by 'training' and 'skill'. An evolving conclusion is that procedure training on a virtual reality (VR) simulator presents a viable current solution. These simulations should characterize the important performance characteristics of procedural skill that have metrics derived and defined from, and then benchmarked to experienced operators (i.e. level of proficiency). Simulation training is optimal with metric-based feedback, particularly formative trainee error assessments, proximate to their performance. In prospective, randomized studies, learners who trained to a benchmarked proficiency level on the simulator performed significantly better than learners who were traditionally trained. In addition, cardiovascular medicine now has available the most sophisticated virtual reality simulators in medicine and these have been used for the roll-out of interventions such as CAS in the USA and globally with cardiovascular society and industry partnered training programmes. The Food and Drug Administration has advocated the use of VR simulation as part of the approval of new devices and the American Board of Internal Medicine has adopted simulation as part of its maintenance of certification. Simulation is rapidly becoming a mainstay of cardiovascular education, training, certification, and the safe adoption of new technology. If cardiovascular medicine is to continue to lead in the adoption and integration of simulation, then, it must take a proactive position in the development of metric-based simulation curriculum, adoption of proficiency benchmarking definitions, and then resolve to commit resources so as to continue to lead this revolution in physician training.

  10. Lunar Simulation in the Lunar Dust Adhesion Bell Jar

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Sechkar, Edward A.

    2007-01-01

    The Lunar Dust Adhesion Bell Jar has been assembled at the NASA Glenn Research Center to provide a high fidelity lunar simulation facility to test the interactions of lunar dust and lunar dust simulant with candidate aerospace materials and coatings. It has a sophisticated design which enables it to treat dust in a way that will remove adsorbed gases and create a chemically reactive surface. It can simulate the vacuum, thermal, and radiation environments of the Moon, including proximate areas of illuminated heat and extremely cold shadow. It is expected to be a valuable tool in the development of dust repellant and cleaning technologies for lunar surface systems.

  11. Kinetic Modeling of Radiative Turbulence in Relativistic Astrophysical Plasmas: Particle Acceleration and High-Energy Flares

    NASA Astrophysics Data System (ADS)

    Wise, John

    In the near future, next-generation telescopes, covering most of the electromagnetic spectrum, will provide a view into the very earliest stages of galaxy formation. To accurately interpret these future observations, accurate and high-resolution simulations of the first stars and galaxies are vital. This proposal is centered on the formation of the first galaxies in the Universe and their observational signatures in preparation for these future observatories. This proposal has two overall goals: 1. To simulate the formation and evolution of a statistically significant sample of galaxies during the first billion years of the Universe, including all relevant astrophysics while resolving individual molecular clouds, in various cosmological environments. These simulations will utilize a sophisticated physical model of star and black hole formation and feedback, including radiation transport and magnetic fields, which will lead to the most realistic and resolved predictions for the early universe; 2. To predict the observational features of the first galaxies throughout the electromagnetic spectrum, allowing for optimal extraction of galaxy and dark matter halo properties from their photometry, imaging, and spectra; The proposed research plan addresses a timely and relevant issue to theoretically prepare for the interpretation of future observations of the first galaxies in the Universe. A suite of adaptive mesh refinement simulations will be used to follow the formation and evolution of thousands of galaxies observable with the James Webb Space Telescope (JWST) that will be launched during the second year of this project. The simulations will have also tracked the formation and death of over 100,000 massive metal-free stars. Currently, there is a gap of two orders of magnitude in stellar mass between the smallest observed z > 6 galaxy and the largest simulated galaxy from "first principles", capturing its entire star formation history. This project will eliminate this gap between simulations and observations of the first galaxies, providing predictions for next-generation observations coming online throughout the next decade. The proposed activities present the graduate students involved in the project with opportunities to gain expertise in numerical algorithms, high performance computing, and software engineering. With this experience, the students will be in a powerful position to face the challenging job market. The computational tools produced by this project will be made freely available and incorporated into their respective frameworks to preserve their sustainability.

  12. Diffuse-Interface Capturing Methods for Compressible Two-Phase Flows

    NASA Astrophysics Data System (ADS)

    Saurel, Richard; Pantano, Carlos

    2018-01-01

    Simulation of compressible flows became a routine activity with the appearance of shock-/contact-capturing methods. These methods can determine all waves, particularly discontinuous ones. However, additional difficulties may appear in two-phase and multimaterial flows due to the abrupt variation of thermodynamic properties across the interfacial region, with discontinuous thermodynamical representations at the interfaces. To overcome this difficulty, researchers have developed augmented systems of governing equations to extend the capturing strategy. These extended systems, reviewed here, are termed diffuse-interface models, because they are designed to compute flow variables correctly in numerically diffused zones surrounding interfaces. In particular, they facilitate coupling the dynamics on both sides of the (diffuse) interfaces and tend to the proper pure fluid-governing equations far from the interfaces. This strategy has become efficient for contact interfaces separating fluids that are governed by different equations of state, in the presence or absence of capillary effects, and with phase change. More sophisticated materials than fluids (e.g., elastic-plastic materials) have been considered as well.

  13. Flexural-torsional vibration of simply supported open cross-section steel beams under moving loads

    NASA Astrophysics Data System (ADS)

    Michaltsos, G. T.; Sarantithou, E.; Sophianopoulos, D. S.

    2005-02-01

    The present work deals with the linearized modal analysis of the combined flexural-torsional vibration of simply supported steel beams with open monosymmetric cross-sections, acted upon by a load of constant magnitude traversing the span eccentrically with constant velocity. After thoroughly investigating the free vibrations of the structure, which simulates a commonly used highway bridge, its forced motions under the aforementioned loading type are investigated. Utilizing the capabilities of symbolic computation within modern mathematical software, the effects of the most significant geometrical and cross-sectional beam properties on the free vibration characteristics of the beam are established and presented in tabular and graphical form. Moreover, adopting realistic values for the simplified vehicle model, the effects of eccentricity, load magnitude and corresponding velocity are assessed and interesting conclusions for structural design purposes are drawn. The proposed methodology may serve as a starting point for further in-depth study of the whole scientific subject, in which sophisticated vehicle models, energy dissipation and more complicated bridge models may be used.

  14. On the Spectrum of the Plenoptic Function.

    PubMed

    Gilliam, Christopher; Dragotti, Pier-Luigi; Brookes, Mike

    2014-02-01

    The plenoptic function is a powerful tool to analyze the properties of multi-view image data sets. In particular, the understanding of the spectral properties of the plenoptic function is essential in many computer vision applications, including image-based rendering. In this paper, we derive for the first time an exact closed-form expression of the plenoptic spectrum of a slanted plane with finite width and use this expression as the elementary building block to derive the plenoptic spectrum of more sophisticated scenes. This is achieved by approximating the geometry of the scene with a set of slanted planes and evaluating the closed-form expression for each plane in the set. We then use this closed-form expression to revisit uniform plenoptic sampling. In this context, we derive a new Nyquist rate for the plenoptic sampling of a slanted plane and a new reconstruction filter. Through numerical simulations, on both real and synthetic scenes, we show that the new filter outperforms alternative existing filters.

  15. Thermoelectrokinetic instability in micro/nanoscales

    NASA Astrophysics Data System (ADS)

    Ganchenko, Georgy; Ganchenko, Natalia

    2016-11-01

    A novel, sophisticated type of electro-hydrodynamic instability in an electrolyte solution near ion-selective surfaces in an external electric field is discovered theoretically. The key mechanism of the instability is caused by Joule heating but differs dramatically from the well-known Rayleigh-Bénard convection. The investigation is based on the Nernst-Planck-Poisson-Navier-Stokes system along with the energy equation and the corresponding boundary conditions. The 1D quiescent steady state in microscales can be unstable with respect to either the short-wave Rubinstein-Zaltzman instability or a long-wave thermoelectrokinetic instability. The latter prevails in long microchannels with sufficiently good thermal insulation of the system. In addition to the linear stability analysis, a direct numerical simulation of the full 3D nonlinear system is performed using parallel computing. In the final coherent structures, salt concentration, temperature and electric current are localized in long, narrow fingers normal to the ion-selective surface, while the space charge forms crown-like micro-patterns. The results can be useful for desalination problems.
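
    Schematically, and omitting the particular nondimensionalization and boundary conditions used by the authors, the governing system for a symmetric binary electrolyte couples ion transport, electrostatics, flow and heat:

        \partial_t c^{\pm} + \mathbf{u}\cdot\nabla c^{\pm}
          = \nabla\cdot\!\left( D\,\nabla c^{\pm} \pm \frac{F D}{R T}\, c^{\pm}\,\nabla\Phi \right), \qquad
        -\nabla\cdot\left( \varepsilon\,\nabla\Phi \right) = F\left( c^{+} - c^{-} \right),

        \rho\left( \partial_t \mathbf{u} + \mathbf{u}\cdot\nabla\mathbf{u} \right)
          = -\nabla p + \mu\,\nabla^{2}\mathbf{u} - F\left( c^{+} - c^{-} \right)\nabla\Phi, \qquad
        \nabla\cdot\mathbf{u} = 0,

        \rho c_{p}\left( \partial_t T + \mathbf{u}\cdot\nabla T \right)
          = \nabla\cdot\left( k\,\nabla T \right) + \mathbf{j}\cdot\mathbf{E},

    where the Joule heating term \mathbf{j}\cdot\mathbf{E} in the energy equation supplies the thermal forcing that distinguishes the thermoelectrokinetic mode from purely electrokinetic instabilities.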

  16. Fourier-interpolation superresolution optical fluctuation imaging (fSOFi) (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Enderlein, Joerg; Stein, Simon C.; Huss, Anja; Hähnel, Dirk; Gregor, Ingo

    2016-02-01

    Superresolution Optical Fluctuation Imaging (SOFI) is a superresolution fluorescence microscopy technique which enhances the spatial resolution of an image by evaluating the temporal fluctuations of blinking fluorescent emitters. SOFI is not based on the identification and localization of single molecules, as in the widely used Photoactivation Localization Microscopy (PALM) or Stochastic Optical Reconstruction Microscopy (STORM), but computes a superresolved image via temporal cumulants from a recorded movie. A technical challenge is that, when the SOFI algorithm is applied directly to a movie of raw images, the pixel size of the final SOFI image is the same as that of the original images, which becomes problematic when the final SOFI resolution is much smaller than this value. In the past, sophisticated cross-correlation schemes have been used to tackle this problem. Here, we present an alternative, exact, straightforward, and simple solution using an interpolation scheme based on Fourier transforms. We exemplify the method on simulated and experimental data.
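
    The Fourier-domain interpolation itself amounts to zero-padding the centred spectrum of each frame, which is equivalent to exact sinc interpolation onto a finer pixel grid before the cumulants are computed. A minimal sketch (not the authors' code; the upsampling factor and scaling are chosen for illustration):

        # Upsample a 2D frame by zero-padding its Fourier spectrum (sinc interpolation).
        import numpy as np

        def fourier_upsample(img, factor):
            ny, nx = img.shape
            spectrum = np.fft.fftshift(np.fft.fft2(img))
            big = np.zeros((factor * ny, factor * nx), dtype=complex)
            y0, x0 = (factor * ny - ny) // 2, (factor * nx - nx) // 2
            big[y0:y0 + ny, x0:x0 + nx] = spectrum
            # The factor**2 restores the intensity scale lost in the larger inverse FFT.
            return np.real(np.fft.ifft2(np.fft.ifftshift(big))) * factor**2

        frame = np.random.poisson(5.0, size=(64, 64)).astype(float)  # stand-in raw frame
        fine = fourier_upsample(frame, 4)   # 4x finer grid for subsequent cumulant analysis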

  17. Programming Hierarchical Self-Assembly of Patchy Particles into Colloidal Crystals via Colloidal Molecules.

    PubMed

    Morphew, Daniel; Shaw, James; Avins, Christopher; Chakrabarti, Dwaipayan

    2018-03-27

    Colloidal self-assembly is a promising bottom-up route to a wide variety of three-dimensional structures, from clusters to crystals. Programming hierarchical self-assembly of colloidal building blocks, which can give rise to structures ordered at multiple levels to rival biological complexity, poses a multiscale design problem. Here we explore a generic design principle that exploits a hierarchy of interaction strengths and employ this design principle in computer simulations to demonstrate the hierarchical self-assembly of triblock patchy colloidal particles into two distinct colloidal crystals. We obtain cubic diamond and body-centered cubic crystals via distinct clusters of uniform size and shape, namely, tetrahedra and octahedra, respectively. Such a conceptual design framework has the potential to reliably encode hierarchical self-assembly of colloidal particles into a high level of sophistication. Moreover, the design framework underpins a bottom-up route to cubic diamond colloidal crystals, which have remained elusive despite being much sought after for their attractive photonic applications.

  18. How can surgical training benefit from theories of skilled motor development, musical skill acquisition and performance psychology?

    PubMed

    McCaskie, Andrew W; Kenny, Dianna T; Deshmukh, Sandeep

    2011-05-02

    Trainee surgeons must acquire expert status in the context of reduced hours, reduced operating room time and the need to learn complex skills involving screen-mediated techniques, computers and robotics. Ever more sophisticated surgical simulation strategies have been helpful in providing surgeons with the opportunity to practise, but not all of these strategies are widely available. Similarities in the motor skills required in skilled musical performance and surgery suggest that models of music learning, and particularly skilled motor development, may be applicable in training surgeons. More attention should be paid to factors associated with optimal arousal and optimal performance in surgical training - lessons learned from helping anxious musicians optimise performance and manage anxiety may also be transferable to trainee surgeons. The ways in which the trainee surgeon moves from novice to expert need to be better understood so that this process can be expedited using current knowledge in other disciplines requiring the performance of complex fine motor tasks with high cognitive load under pressure.

  19. The effectiveness of virtual reality distraction for pain reduction: a systematic review.

    PubMed

    Malloy, Kevin M; Milling, Leonard S

    2010-12-01

    Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Ab initio modeling of CW-ESR spectra of the double spin labeled peptide Fmoc-(Aib-Aib-TOAC)2-Aib-OMe in acetonitrile.

    PubMed

    Zerbetto, Mirco; Carlotto, Silvia; Polimeno, Antonino; Corvaja, Carlo; Franco, Lorenzo; Toniolo, Claudio; Formaggio, Fernando; Barone, Vincenzo; Cimino, Paola

    2007-03-15

    In this work we address the interpretation, via an ab initio integrated computational approach, of the CW-ESR spectra of the double spin labeled, 3₁₀-helical peptide Fmoc-(Aib-Aib-TOAC)2-Aib-OMe dissolved in acetonitrile. Our approach is based on the determination of geometric and local magnetic parameters of the heptapeptide by quantum mechanical density functional calculations, taking into account solvent and, when needed, vibrational averaging contributions. The system is then described by a stochastic Liouville equation for the two electron spins interacting with each other and with two ¹⁴N nuclear spins, in the presence of diffusive rotational dynamics. Parametrization of the rotational diffusion tensor is provided by a hydrodynamic model. CW-ESR spectra are simulated with minimal resorting to fitting procedures, proving that the combination of sensitive ESR spectroscopy and sophisticated modeling can be highly helpful in providing 3D structural and dynamic information on molecular systems.

  1. GIS Data Based Automatic High-Fidelity 3D Road Network Modeling

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong

    2011-01-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models have been generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially for those existing in the real world. This paper presents a novel approach that can automatically produce high-fidelity 3D road network models from real 2D road GIS data that mainly contain road centerline information. The proposed method first builds parametric representations of the road centerlines through segmentation and fitting. A basic set of civil engineering rules for road design (e.g., cross slope, superelevation, grade) is then selected in order to generate realistic road surfaces in compliance with these rules. While the proposed method applies to any type of road, this paper mainly addresses the automatic generation of complex traffic interchanges and intersections, which are the most sophisticated elements in road networks.
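
    The centerline-fitting step can be illustrated with a short sketch (not the authors' code; the vertices, smoothing factor and lane half-width below are hypothetical). A parametric smoothing spline is fitted to the GIS polyline and then resampled, and the tangents give the normals along which the road surface is offset; a full pipeline would further apply cross slope, superelevation and grade before meshing:

        # Fit and resample a road centerline from GIS polyline vertices.
        import numpy as np
        from scipy.interpolate import splprep, splev

        x = np.array([0.0, 40.0, 90.0, 150.0, 210.0])   # hypothetical centerline vertices (m)
        y = np.array([0.0, 25.0, 30.0, 10.0, -15.0])
        half_width = 3.5                                 # assumed lane half-width (m)

        tck, _ = splprep([x, y], s=10.0)                 # parametric smoothing spline
        u = np.linspace(0.0, 1.0, 200)
        cx, cy = splev(u, tck)                           # resampled centerline
        dx, dy = splev(u, tck, der=1)                    # tangents along the centerline
        norm = np.hypot(dx, dy)
        nx, ny = -dy / norm, dx / norm                   # unit normals

        # Offset the centerline to obtain the left and right road edges.
        left = np.stack([cx + half_width * nx, cy + half_width * ny], axis=1)
        right = np.stack([cx - half_width * nx, cy - half_width * ny], axis=1)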

  2. Current trends in geomathematics

    USGS Publications Warehouse

    Griffiths, J.C.

    1970-01-01

    Geoscience has extended its role and improved its applications by the development of geophysics since the nineteen-thirties, geochemistry since the nineteen-fifties and now, in the late nineteen-sixties, a new synergism leads to geomathematics; again the greatest pressure for change arises from areas of application of geoscience and, as the problems to which geoscience is applied increase in complexity, the analytical tools become more sophisticated, a development which is accelerated by growth in the use of computers in geological problem-solving. In the next decade the problems with greatest public impact appear to be the ones which will receive greatest emphasis and support. This will require that the geosciences comprehend exceedingly complex probabilistic systems and these, in turn, demand the use of operations research, cybernetics and systems analysis. Such a development may well lead to a change in the paradigms underlying geoscience; they will certainly include more realistic models of "real-world" systems and the tool of simulation with cybernetic models may well become the basis for rejuvenation of experimentation in the geosciences. © 1970.

  3. (3+1)D hydrodynamic simulation of relativistic heavy-ion collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schenke, Bjoern; Jeon, Sangyong; Gale, Charles

    2010-07-15

    We present MUSIC, an implementation of the Kurganov-Tadmor algorithm for relativistic 3+1 dimensional fluid dynamics in heavy-ion collision scenarios. This Riemann-solver-free, second-order, high-resolution scheme is characterized by a very small numerical viscosity and its ability to treat shocks and discontinuities very well. We also incorporate a sophisticated algorithm for the determination of the freeze-out surface using a three-dimensional triangulation of the hypersurface. Implementing a recent lattice-based equation of state, we compute p_T spectra and pseudorapidity distributions for Au+Au collisions at √s = 200 GeV and present results for the anisotropic flow coefficients v_2 and v_4 as a function of both p_T and pseudorapidity η. We were able to determine v_4 with high numerical precision, finding that it does not strongly depend on the choice of initial condition or equation of state.
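
    The flavour of the scheme can be conveyed by a drastically reduced sketch: a Kurganov-Tadmor-type semi-discrete central scheme (MUSCL reconstruction with a minmod limiter plus a local-speed central flux) applied to the 1D inviscid Burgers equation. This is only an illustration of the numerical idea, not the relativistic 3+1D implementation of the paper:

        import numpy as np

        def minmod(a, b):
            return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

        def kt_rhs(u, dx):
            f = lambda v: 0.5 * v * v                        # Burgers flux
            slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
            u_minus = u + 0.5 * slope                        # left state at i+1/2
            u_plus = np.roll(u - 0.5 * slope, -1)            # right state at i+1/2
            a = np.maximum(np.abs(u_minus), np.abs(u_plus))  # local wave speed
            flux = 0.5 * (f(u_minus) + f(u_plus)) - 0.5 * a * (u_plus - u_minus)
            return -(flux - np.roll(flux, 1)) / dx

        n, L = 400, 2.0 * np.pi
        dx = L / n
        u = 1.0 + 0.5 * np.sin(np.arange(n) * dx)            # smooth data steepening into a shock
        t, t_end = 0.0, 1.5
        while t < t_end:
            dt = 0.4 * dx / np.max(np.abs(u))                # CFL condition
            u_star = u + dt * kt_rhs(u, dx)                  # Heun (2nd-order) time stepping
            u = 0.5 * (u + u_star + dt * kt_rhs(u_star, dx))
            t += dt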

  4. Track finding in ATLAS using GPUs

    NASA Astrophysics Data System (ADS)

    Mattmann, J.; Schmitt, C.

    2012-12-01

    The reconstruction and simulation of collision events is a major task in modern HEP experiments, involving several tens of thousands of standard CPUs. Graphics processors (GPUs), on the other hand, have become much more powerful and far outperform standard CPUs in terms of floating-point operations thanks to their massively parallel architecture. Using GPUs could therefore significantly reduce the overall reconstruction time per event or allow the use of more sophisticated algorithms. In this paper, track finding in the ATLAS experiment is used as an example of how GPUs can be exploited in this context: the implementation on the GPU requires a change in the algorithmic flow so that the code can work within the GPU's rather limited environment in terms of memory, cache, and transfer speed to and from the device, and can make use of the massively parallel computation. Both the specific implementation of parts of the ATLAS track reconstruction chain and the performance improvements obtained are discussed.

  5. The Sky's the Limit When Super Students Meet Supercomputers.

    ERIC Educational Resources Information Center

    Trotter, Andrew

    1991-01-01

    In a few select high schools in the U.S., supercomputers are allowing talented students to attempt sophisticated research projects using simultaneous simulations of nature, culture, and technology not achievable by ordinary microcomputers. Schools can get their students online by entering contests and seeking grants and partnerships with…

  6. Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method

    ERIC Educational Resources Information Center

    Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo

    2012-01-01

    This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
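
    In the same spirit, the core idea can be demonstrated with a few lines of code that sum one phasor per allowed path. The sketch below (hypothetical geometry and wavelength, two straight paths only, whereas the full treatment sums over many paths) reproduces the familiar two-slit interference pattern from the squared modulus of the summed amplitudes:

        # Toy "sum over paths": one phasor exp(i*k*L) per path through each slit.
        import numpy as np

        wavelength = 0.5e-6                         # 500 nm light (assumed)
        k = 2.0 * np.pi / wavelength
        slit_sep = 20e-6                            # slit separation (assumed)
        screen_dist = 1.0                           # slits-to-screen distance in metres

        ys = np.linspace(-0.1, 0.1, 1001)           # positions on the screen
        slits = np.array([-0.5, 0.5]) * slit_sep    # the two allowed waypoints

        lengths = np.sqrt(screen_dist**2 + (ys[:, None] - slits[None, :])**2)
        amplitude = np.exp(1j * k * lengths).sum(axis=1)   # add the phasors
        probability = np.abs(amplitude) ** 2               # detection probability pattern
        print("expected fringe spacing:", wavelength * screen_dist / slit_sep, "m")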

  7. An Intelligent Simulator for Telerobotics Training

    ERIC Educational Resources Information Center

    Belghith, K.; Nkambou, R.; Kabanza, F.; Hartman, L.

    2012-01-01

    Roman Tutor is a tutoring system that uses sophisticated domain knowledge to monitor the progress of students and advise them while they are learning how to operate a space telerobotic system. It is intended to help train operators of the Space Station Remote Manipulator System (SSRMS) including astronauts, operators involved in ground-based…

  8. The Surveillance of Teachers and the Simulation of Teaching

    ERIC Educational Resources Information Center

    Page, Damien

    2017-01-01

    Just as surveillance in general has become more sophisticated, penetrative and ubiquitous, so has the surveillance of teachers. Enacted through an assemblage of strategies such as learning walks, parental networks, student voice and management information systems, the surveillance of teachers has proliferated as a means of managing the risks of…

  9. Effects of soil moisture on the diurnal pattern of pesticide emission: Numerical simulation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    Accurate prediction of pesticide volatilization is important for the protection of human and environmental health. Due to the complexity of the volatilization process, sophisticated predictive models are needed, especially for dry soil conditions. A mathematical model was developed to allow simulati...

  10. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  11. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  12. POLICY ISSUES ASSOCIATED WITH USING SIMULATION TO ASSESS ENVIRONMENTAL IMPACTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchitel, Kirsten; Tanana, Heather

    This report examines the relationship between simulation-based science and judicial assessments of simulations or models supporting evaluations of environmental harms or risks, considering both how it exists currently and how it might be shaped in the future. This report considers the legal standards relevant to judicial assessments of simulation-based science and provides examples of the judicial application of those legal standards. Next, this report discusses the factors that inform whether there is a correlation between the sophistication of a challenged simulation and judicial support for that simulation. Finally, this report examines legal analysis of the broader issues that must be addressed for simulation-based science to be better understood and utilized in the context of judicial challenge and evaluation.

  13. The theatre of high-fidelity simulation education.

    PubMed

    Roberts, Debbie; Greene, Leah

    2011-10-01

    High-fidelity simulation is a useful mechanism to aid progression, development and skill acquisition in nurse education. However, nurse lecturers are daunted by sophisticated simulation technology. This paper presents a new method of introducing human patient simulation to students and educators, whilst seeking to demystify the roles, responsibilities and underpinning pedagogy. The analogy of simulation as theatre outlines the concepts of the theatre and stage (simulation laboratory); the play itself (Simulated Clinical Experience, SCE); the actors (nursing students); audience (peer review panel); director (session facilitator); and the production team (technical coordinators). Performing in front of people in a safe environment, repeated practice and taking on a new role teaches students to act, think and be like a nurse. This in turn supports student learning and enhances self confidence. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Studies of Pilot Control During Launching and Reentry of Space Vehicles, Utilizing the Human Centrifuge

    NASA Technical Reports Server (NTRS)

    Clark, Carl C.; Woodling, C. H.

    1959-01-01

    With the ever-increasing complexity of airplanes and the nearness to reality of manned space vehicles, the use of pilot-controlled flight simulators has become imperative. The state of the art in flight simulation has progressed well with the demand. Pilot-controlled flight simulators are finding increasing uses in aeromedical research, airplane and airplane systems design, and preflight training. At present, many flight simulators are in existence with various degrees of sophistication and sundry purposes. These vary from fixed-base simulators, where the pilot applies control inputs according to visual cues presented to him on an instrument display, to moving-base simulators, where various combinations of angular and linear motions are added in an attempt to improve the flight simulation.

  15. A Low Cost Remote Sensing System Using PC and Stereo Equipment

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Flood, Michael A.; Prasad, Narasimha S.; Hodson, Wade D.

    2011-01-01

    A system using a personal computer, speaker, and a microphone is used to detect objects, and make crude measurements using a carrier modulated by a pseudorandom noise (PN) code. This system can be constructed using a personal computer and audio equipment commonly found in the laboratory or at home, or more sophisticated equipment that can be purchased at reasonable cost. We demonstrate its value as an instructional tool for teaching concepts of remote sensing and digital signal processing.
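
    The record above describes ranging with a pseudorandom-noise-modulated audio carrier. As an illustration of the underlying idea (not the authors' code), the sketch below cross-correlates a received signal with the transmitted PN-coded waveform to recover the echo delay; all parameter values are hypothetical and chosen only to suit ordinary PC audio hardware.

      # Minimal sketch: delay estimation by cross-correlating a received signal with a
      # PN-code-modulated carrier (illustrative stand-in for the speaker/microphone setup).
      import numpy as np

      fs = 48_000           # audio sample rate (Hz)
      f_c = 8_000           # carrier frequency (Hz)
      chip_rate = 1_000     # PN chips per second
      n_chips = 127

      rng = np.random.default_rng(0)
      pn = rng.choice([-1.0, 1.0], size=n_chips)        # pseudorandom +/-1 chip sequence
      chips = np.repeat(pn, fs // chip_rate)            # hold each chip for fs/chip_rate samples
      t = np.arange(chips.size) / fs
      tx = chips * np.cos(2 * np.pi * f_c * t)          # BPSK-modulated carrier

      true_delay = 0.012                                # simulated echo delay (s)
      d = int(true_delay * fs)
      rx = np.zeros(tx.size + d)
      rx[d:] += 0.3 * tx                                # attenuated, delayed copy of the transmission
      rx += 0.05 * rng.standard_normal(rx.size)         # additive noise

      corr = np.correlate(rx, tx, mode="full")          # peak of the cross-correlation gives the lag
      lag = int(corr.argmax()) - (tx.size - 1)
      print(f"estimated delay: {lag / fs * 1e3:.2f} ms (true {true_delay * 1e3:.2f} ms)")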

  16. Neuroprosthetic Decoder Training as Imitation Learning.

    PubMed

    Merel, Josh; Carlson, David; Paninski, Liam; Cunningham, John P

    2016-05-01

    Neuroprosthetic brain-computer interfaces function via an algorithm which decodes neural activity of the user into movements of an end effector, such as a cursor or robotic arm. In practice, the decoder is often learned by updating its parameters while the user performs a task. When the user's intention is not directly observable, recent methods have demonstrated value in training the decoder against a surrogate for the user's intended movement. Here we show that training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available. Specifically, we describe how a generic imitation learning meta-algorithm, dataset aggregation (DAgger), can be adapted to train a generic brain-computer interface. By deriving existing learning algorithms for brain-computer interfaces in this framework, we provide a novel analysis of regret (an important metric of learning efficacy) for brain-computer interfaces. This analysis allows us to characterize the space of algorithmic variants and bounds on their regret rates. Existing approaches for decoder learning have been performed in the cursor control setting, but the available design principles for these decoders are such that it has been impossible to scale them to naturalistic settings. Leveraging our findings, we then offer an algorithm that combines imitation learning with optimal control, which should allow for training of arbitrary effectors for which optimal control can generate goal-oriented control. We demonstrate this novel and general BCI algorithm with simulated neuroprosthetic control of a 26 degree-of-freedom model of an arm, a sophisticated and realistic end effector.
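
    The decoder-training loop described above can be pictured with a toy imitation-learning sketch. The code below is a hypothetical simplification, not the authors' algorithm: a linear decoder for a simulated 2-D cursor is refit on an aggregated dataset of visited neural states, each labelled by an oracle (the unit vector toward the target), in the spirit of dataset aggregation.

      # DAgger-style decoder training for a simulated 2-D cursor BCI (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(1)
      n_units, dt, n_iters, n_steps = 30, 0.05, 10, 200
      W_true = rng.standard_normal((n_units, 2))         # unknown neural encoding of intended velocity

      def neural_activity(v_intent):
          return W_true @ v_intent + 0.5 * rng.standard_normal(n_units)

      D_x, D_y = [], []                                  # aggregated dataset (grows every iteration)
      decoder = np.zeros((2, n_units))                   # current decoder, starts uninformative

      for it in range(n_iters):
          pos, target = np.zeros(2), np.array([5.0, 3.0])
          for _ in range(n_steps):
              v_intent = target - pos
              v_intent /= np.linalg.norm(v_intent) + 1e-9   # oracle label: unit vector toward the target
              x = neural_activity(v_intent)
              D_x.append(x)
              D_y.append(v_intent)                          # aggregate states visited under the current decoder
              pos += decoder @ x * dt                       # cursor moves under the *current* decoder
          X, Y = np.asarray(D_x), np.asarray(D_y)
          decoder = np.linalg.lstsq(X, Y, rcond=None)[0].T  # refit decoder on all aggregated data
          err = np.linalg.norm(X @ decoder.T - Y) / np.sqrt(len(Y))

      print(f"final per-step label error: {err:.3f}")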

  17. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
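
    As a toy illustration of the interactive start/pause/roll-back capability described above (a hypothetical sketch, not the system's actual service interface), the following checkpointable stochastic SIR model lets an analyst inspect the state mid-run, roll back, and re-run with an intervention applied.

      # Checkpoint-and-rollback control of a stochastic SIR simulation (illustrative only).
      import numpy as np

      class SIRSimulation:
          """Checkpointable stochastic SIR model; a stand-in for the real individual-based model."""
          def __init__(self, s=9900, i=100, r=0, beta=0.30, gamma=0.10, seed=7):
              self.state = {"day": 0, "S": s, "I": i, "R": r}
              self.beta, self.gamma = beta, gamma
              self.rng = np.random.default_rng(seed)
              self._checkpoints = []

          def checkpoint(self):
              self._checkpoints.append(dict(self.state))

          def rollback(self):
              self.state = self._checkpoints.pop()        # return to the last saved state

          def step(self):
              s, i, r = self.state["S"], self.state["I"], self.state["R"]
              n = s + i + r
              new_inf = self.rng.binomial(s, 1 - np.exp(-self.beta * i / n))
              new_rec = self.rng.binomial(i, self.gamma)
              self.state.update(day=self.state["day"] + 1,
                                S=s - new_inf, I=i + new_inf - new_rec, R=r + new_rec)

      sim = SIRSimulation()
      for _ in range(20):                                  # run for 20 days, then assess the state
          sim.step()
      sim.checkpoint()
      for _ in range(10):
          sim.step()
      print("without intervention:", sim.state)
      sim.rollback()                                       # analyst rolls back and intervenes instead
      sim.beta *= 0.5                                      # e.g. a distancing intervention halves transmission
      for _ in range(10):
          sim.step()
      print("with intervention:   ", sim.state)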

  18. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  19. The Matrix Element Method: Past, Present, and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.

    2013-07-12

    The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods -- already common -- will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.
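
    The "inherent connection to the calculation of differential cross sections" refers to the per-event likelihood that the MEM assigns under each hypothesis. In its commonly quoted textbook form (written here for orientation, not transcribed from the white paper itself), the weight is, in LaTeX notation:

      P(x \mid \alpha) \;=\; \frac{1}{\sigma_{\alpha}} \int \mathrm{d}\Phi(y)\,
          \left|\mathcal{M}_{\alpha}(y)\right|^{2}\, W(x \mid y),

    where x is the observed event, alpha labels the hypothesis, M_alpha is the matrix element, dPhi(y) the parton-level phase-space measure (folded with parton distribution functions), W(x|y) the transfer function modelling detector response, and sigma_alpha the normalising cross section.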

  20. Preoperative simulation for the planning of microsurgical clipping of intracranial aneurysms.

    PubMed

    Marinho, Paulo; Vermandel, Maximilien; Bourgeois, Philippe; Lejeune, Jean-Paul; Mordon, Serge; Thines, Laurent

    2014-12-01

    The safety and success of intracranial aneurysm (IA) surgery could be improved through the dedicated application of simulation covering the procedure from the 3-dimensional (3D) description of the surgical scene to the visual representation of the clip application. We aimed in this study to validate the technical feasibility and clinical relevance of such a protocol. All patients preoperatively underwent 3D magnetic resonance imaging and 3D computed tomography angiography to build 3D reconstructions of the brain, cerebral arteries, and surrounding cranial bone. These 3D models were segmented and merged using Osirix, a DICOM image processing application. This provided the surgical scene that was subsequently imported into Blender, a modeling platform for 3D animation. Digitized clips and appliers could then be manipulated in the virtual operative environment, allowing the visual simulation of clipping. This simulation protocol was assessed in a series of 10 IAs by 2 neurosurgeons. The protocol was feasible in all patients. The visual similarity between the surgical scene and the operative view was excellent in 100% of the cases, and the identification of the vascular structures was accurate in 90% of the cases. The neurosurgeons found the simulation helpful for planning the surgical approach (ie, the bone flap, cisternal opening, and arterial tree exposure) in 100% of the cases. The correct number of final clip(s) needed was predicted from the simulation in 90% of the cases. The preoperatively expected characteristics of the optimal clip(s) (ie, their number, shape, size, and orientation) were validated during surgery in 80% of the cases. This study confirmed that visual simulation of IA clipping based on the processing of high-resolution 3D imaging can be effective. This is a new and important step toward the development of a more sophisticated integrated simulation platform dedicated to cerebrovascular surgery.

  1. Multiscale molecular dynamics simulations of membrane remodeling by Bin/Amphiphysin/Rvs family proteins

    NASA Astrophysics Data System (ADS)

    Chun, Chan; Haohua, Wen; Lanyuan, Lu; Jun, Fan

    2016-01-01

    Membrane curvature is no longer thought of as a passive property of the membrane; rather, it is considered as an active, regulated state that serves various purposes in the cell, such as communication between cells and organelle definition. While transport is usually mediated by tiny membrane bubbles known as vesicles or membrane tubules, such communication requires complex interplay between the lipid bilayers and cytosolic proteins such as members of the Bin/Amphiphysin/Rvs (BAR) superfamily of proteins. With rapid developments in novel experimental techniques, membrane remodeling has become a rapidly emerging new field in recent years. Molecular dynamics (MD) simulations are important tools for obtaining atomistic information regarding the structural and dynamic aspects of biological systems and for understanding the physics-related aspects. The availability of more sophisticated experimental data poses challenges to the theoretical community for developing novel theoretical and computational techniques that can be used to better interpret the experimental results to obtain further functional insights. In this review, we summarize the general mechanisms underlying membrane remodeling controlled or mediated by proteins. While studies combining experiments and molecular dynamics simulations recall existing mechanistic models, concurrently, they extend the role of different BAR domain proteins during membrane remodeling processes. We review these recent findings, focusing on how multiscale molecular dynamics simulations aid in understanding the physical basis of BAR domain proteins, as a representative of membrane-remodeling proteins. Project supported by the National Natural Science Foundation of China (Grant No. 21403182) and the Research Grants Council of Hong Kong, China (Grant No. CityU 21300014).

  2. Use of three-dimensional computer graphic animation to illustrate cleft lip and palate surgery.

    PubMed

    Cutting, C; Oliker, A; Haring, J; Dayan, J; Smith, D

    2002-01-01

    Three-dimensional (3D) computer animation is not commonly used to illustrate surgical techniques. This article describes the surgery-specific processes that were required to produce animations to teach cleft lip and palate surgery. Three-dimensional models were created using CT scans of two Chinese children with unrepaired clefts (one unilateral and one bilateral). We programmed several custom software tools, including an incision tool, a forceps tool, and a fat tool. Three-dimensional animation was found to be particularly useful for illustrating surgical concepts. Positioning the virtual "camera" made it possible to view the anatomy from angles that are impossible to obtain with a real camera. Transparency allows the underlying anatomy to be seen during surgical repair while maintaining a view of the overlaying tissue relationships. Finally, the representation of motion allows modeling of anatomical mechanics that cannot be done with static illustrations. The animations presented in this article can be viewed on-line at http://www.smiletrain.org/programs/virtual_surgery2.htm. Sophisticated surgical procedures are clarified with the use of 3D animation software and customized software tools. The next step in the development of this technology is the creation of interactive simulators that recreate the experience of surgery in a safe, digital environment. Copyright 2003 Wiley-Liss, Inc.

  3. Using multimedia virtual patients to enhance the clinical curriculum for medical students.

    PubMed

    McGee, J B; Neill, J; Goldman, L; Casey, E

    1998-01-01

    Changes in the environment in which clinical medical education takes place in the United States have profoundly affected the quality of the learning experience. A shift to out-patient based care, minimization of hospitalization time, and shrinking clinical revenues have changed the teaching hospital or "classroom" to a degree that we must develop innovative approaches to medical education. One solution is the Virtual Patient Project. Utilizing state-of-the-art computer-based multimedia technology, we are building a library of simulated patient encounters that will serve to fill some of the educational gaps that the current health care system has created. This project is part of a newly formed and unique organization, the Harvard Medical School-Beth Israel Deaconess Mount Auburn Institute for Education and Research (the Institute), which supports in-house educational design, production, and faculty time to create Virtual Patients. These problem-based clinical cases allow the medical student to evaluate a patient at initial presentation, order diagnostic tests, observe the outcome and obtain context-sensitive feedback through a computer program designed at the Institute. Multimedia technology and authoring programs have reached a level of sophistication to allow content experts (the teaching faculty) to design and create the majority of the program themselves and to allow students to adapt the program to their individual learning needs.

  4. A comparison of line enhancement techniques: applications to guide-wire detection and respiratory motion tracking

    NASA Astrophysics Data System (ADS)

    Bismuth, Vincent; Vancamberg, Laurence; Gorges, Sébastien

    2009-02-01

    During interventional radiology procedures, guide-wires are usually inserted into the patient's vascular tree for diagnosis or healing purposes. These procedures are monitored with an X-ray interventional system providing images of the interventional devices navigating through the patient's body. The automatic detection of such tools by image processing means has gained maturity over the past years and enables applications ranging from image enhancement to multimodal image fusion. Sophisticated detection methods are emerging, which rely on a variety of device enhancement techniques. In this article we reviewed and classified these techniques into three families. We chose a state-of-the-art approach in each of them and built a rigorous framework to compare their detection capability and their computational complexity. Through simulations and the intensive use of ROC curves, we demonstrated that the Hessian-based methods are the most robust to strong curvature of the devices and that the family of rotated filters is the most suited for detecting low-CNR and low-curvature devices. The steerable filter approach demonstrated less interesting detection capabilities and appears to be the most expensive one to compute. Finally, we demonstrated the interest of automatic guide-wire detection on a clinical topic: the compensation of respiratory motion in multimodal image fusion.
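
    To make the "Hessian-based" family concrete, the sketch below computes a single-scale, Frangi-style ridge measure from the eigenvalues of the image Hessian. It is an illustrative stand-in assuming numpy/scipy, not the specific method evaluated in the paper, and the parameter values are hypothetical.

      # Hessian-based enhancement of thin curvilinear structures in a 2-D image (single scale).
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def hessian_line_enhance(image, sigma=2.0, beta=0.5, c=15.0):
          # Second derivatives of the Gaussian-smoothed image.
          Ixx = gaussian_filter(image, sigma, order=(0, 2))
          Iyy = gaussian_filter(image, sigma, order=(2, 0))
          Ixy = gaussian_filter(image, sigma, order=(1, 1))
          # Eigenvalues of the 2x2 Hessian at every pixel, ordered so that |l1| <= |l2|.
          tmp = np.sqrt((Ixx - Iyy) ** 2 + 4 * Ixy ** 2)
          l1 = 0.5 * (Ixx + Iyy + tmp)
          l2 = 0.5 * (Ixx + Iyy - tmp)
          swap = np.abs(l1) > np.abs(l2)
          l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
          rb = np.abs(l1) / (np.abs(l2) + 1e-12)          # "blobness": low for line-like structures
          s = np.sqrt(l1 ** 2 + l2 ** 2)                  # second-order structure strength
          v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
          v[l2 < 0] = 0.0                                 # keep dark ridges (guide-wire darker than background);
          return v                                        # flip this sign test for bright structures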

  5. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  6. Testing Collisional Scaling Laws: Comparing with Observables

    NASA Astrophysics Data System (ADS)

    Davis, D. R.; Marzari, F.; Farinella, P.

    1999-09-01

    How large bodies break up in response to energetic collisions is a problem that has attracted considerable attention in recent years. Ever more sophisticated computation methods have also been developed; prominent among these are hydrocode simulations of collisional disruption by Benz and Asphaug (1999, Icarus, in press), Love and Ahrens (1996, LPSC XXVII, 777-778), and Melosh and Ryan (1997, Icarus 129, 562-564). Durda et al. (1998, Icarus 135, 431-440) used the observed asteroid size distribution to infer a scaling algorithm. The present situation is that there are several proposed scaling laws that differ by as much as two orders of magnitude at particular sizes. We have expanded upon the work of Davis et al. (1994, Goutelas Proceedings) and tested the suite of proposed scaling algorithms against observations of the main-belt asteroids. The effects of collisions among the asteroids produce the following observables: (a) the size distribution has been significantly shaped by collisions, (b) collisions have produced about 25 well recognized asteroid families, and (c) the basaltic crust of Vesta has been largely preserved in the face of about 4.5 Byr of impacts. We will present results from a numerical simulation of asteroid collisional evolution over the age of the solar system using proposed scaling laws and a range of hypothetical initial populations.

  7. Room-temperature d0 ferromagnetism in carbon-doped Y2O3 for spintronic applications: A density functional theory study

    NASA Astrophysics Data System (ADS)

    Chakraborty, Brahmananda; Nandi, Prithwish K.; Kawazoe, Yoshiyuki; Ramaniah, Lavanya M.

    2018-05-01

    Through density functional theory simulations with the generalized gradient approximation, confirmed by the more sophisticated hybrid functional, we predict the triggering of d0 ferromagnetism in C-doped Y2O3 at a hole density of 3.36 × 10^21 cm^-3 (one order of magnitude less than the critical hole density of ZnO), with a magnetic moment of 2.0 μB per defect and ferromagnetic coupling large enough to promote room-temperature ferromagnetism. The persistence of ferromagnetism at room temperature is established through computation of the Curie temperature by the mean field approximation and ab initio molecular dynamics simulations. The induced magnetic moment is mainly contributed by the 2p orbital of the impurity C and the 2p orbital of O, and we quantitatively and extensively demonstrate through the analysis of density of states and ferromagnetic coupling that the Stoner criterion is satisfied to activate room-temperature ferromagnetism. The system is stable at room temperature, the defect formation energy of C-doped Y2O3 is feasible, and the ferromagnetism survives for the choice of hybrid exchange functional; we therefore believe that C-doped Y2O3 can be tailored as a room-temperature diluted magnetic semiconductor for spintronic applications.

  8. Status and Plans for the TRANSP Interpretive and Predictive Simulation Code

    NASA Astrophysics Data System (ADS)

    Kaye, Stanley; Andre, Robert; Gorelenkova, Marina; Yuan, Xingqui; Hawryluk, Richard; Jardin, Steven; Poli, Francesca

    2015-11-01

    TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state of the art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free boundary equilibrium solution, while the PT_SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP also incorporates such source models as NUBEAM for neutral beam injection, GENRAY, TORAY, TORBEAM, TORIC and CQL3D for ICRH, LHCD, ECH and HHFW. The implementation of selected components makes efficient use of MPI for speed up of code calculations. TRANSP has a wide international user-base, and it is run on the FusionGrid to allow for timely support and quick turnaround by the PPPL Computational Plasma Physics Group. It is being used as a basis for both analysis and development of control algorithms and discharge operational scenarios, including simulation of ITER plasmas. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Progress on implementing TRANSP as a component in the ITER IMAS will also be described. This research was supported by the U.S. Department of Energy under contracts DE-AC02-09CH11466.

  9. Load Balancing Integrated Least Slack Time-Based Appliance Scheduling for Smart Home Energy Management

    PubMed Central

    Silva, Bhagya Nathali; Khan, Murad; Han, Kijun

    2018-01-01

    The emergence of smart devices and smart appliances has highly favored the realization of the smart home concept. Modern smart home systems handle a wide range of user requirements. Energy management and energy conservation are in the spotlight when deploying sophisticated smart homes. However, the performance of energy management systems is highly influenced by user behaviors and adopted energy management approaches. Appliance scheduling is widely accepted as an effective mechanism to manage domestic energy consumption. Hence, we propose a smart home energy management system that reduces unnecessary energy consumption by integrating an automated switching off system with load balancing and appliance scheduling algorithm. The load balancing scheme acts according to defined constraints such that the cumulative energy consumption of the household is managed below the defined maximum threshold. The scheduling of appliances adheres to the least slack time (LST) algorithm while considering user comfort during scheduling. The performance of the proposed scheme has been evaluated against an existing energy management scheme through computer simulation. The simulation results have revealed a significant improvement gained through the proposed LST-based energy management scheme in terms of cost of energy, along with reduced domestic energy consumption facilitated by an automated switching off mechanism. PMID:29495346
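
    The scheme combines least-slack-time ordering with a cumulative power cap. A minimal sketch of that idea is shown below; the appliance data and threshold are hypothetical, and the code is only an illustration, not the paper's algorithm or parameters.

      # Least-slack-time (LST) appliance scheduling under a household power cap.
      # slack = deadline - current_slot - remaining_run_time; most urgent appliances run first,
      # and others are deferred so the cumulative draw stays below the threshold.
      POWER_CAP_W = 3500

      # name: (power draw in W, run time needed in slots, deadline slot) -- hypothetical values
      appliances = {
          "washing_machine": (2000, 2, 8),
          "dishwasher":      (1800, 2, 10),
          "water_heater":    (1500, 3, 6),
          "ev_charger":      (2200, 4, 12),
      }

      remaining = {name: need for name, (_, need, _) in appliances.items()}
      schedule = {name: [] for name in appliances}

      for slot in range(12):
          pending = [n for n, r in remaining.items() if r > 0]
          pending.sort(key=lambda n: appliances[n][2] - slot - remaining[n])   # least slack first
          load = 0
          for name in pending:
              power = appliances[name][0]
              if load + power <= POWER_CAP_W:            # load-balancing constraint
                  load += power
                  remaining[name] -= 1
                  schedule[name].append(slot)

      for name, slots in schedule.items():
          print(f"{name:16s} runs in slots {slots}")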

  10. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To-date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to, a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as Large Synoptic Survey Telescope. The CANFAR+Skytree system is open for use by any interested member of the astronomical community.
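
    The PDF-generation step described above (perturbing each object's photometry within its errors and re-running the estimator 100 times) can be sketched as follows. The point estimator here is a hypothetical stand-in for the trained machine-learning model, and the magnitudes are invented example values.

      # Per-object photometric-redshift PDF from perturbed photometry (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(0)

      def point_estimator(mags):
          # Placeholder for the trained regressor; any point estimator can be plugged in here.
          return 0.1 * (mags.sum() - 100.0)

      def photoz_pdf(mags, mag_errs, n_instances=100, bins=np.linspace(0.0, 2.0, 41)):
          perturbed = mags + mag_errs * rng.standard_normal((n_instances, mags.size))
          z = np.array([point_estimator(m) for m in perturbed])
          hist, edges = np.histogram(z, bins=bins, density=True)
          return hist, edges

      mags = np.array([24.1, 23.6, 23.2, 22.9, 22.8])     # u, g, r, i, z magnitudes (example values)
      errs = np.array([0.15, 0.08, 0.05, 0.05, 0.06])
      pdf, edges = photoz_pdf(mags, errs)
      print("peak of the PDF near z =", edges[pdf.argmax()])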

  11. The transition prediction toolkit: LST, SIT, PSE, DNS, and LES

    NASA Technical Reports Server (NTRS)

    Zang, Thomas A.; Chang, Chau-Lyan; Ng, Lian L.

    1992-01-01

    The e(sup N) method for predicting transition onset is an amplitude ratio criterion that is on the verge of full maturation for three-dimensional, compressible, real gas flows. Many of the components for a more sophisticated, absolute amplitude criterion are now emerging: receptivity theory, secondary instability theory, parabolized stability equations approaches, direct numerical simulation and large-eddy simulation. This paper will provide a description of each of these new theoretical tools and provide indications of their current status.

  12. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV etc.) have enabled rapid development of previously expensive and sophisticated systems within a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system, with graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish picking remote control car. The combination of the open source hardware and software system created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.

  13. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  14. A Developing Market for Continuing Higher Education: The Reserve Components.

    ERIC Educational Resources Information Center

    Watt, David M.

    Due to increasingly sophisticated military equipment, the Reserve Components of the armed forces need to raise the educational standards for recruits. A number of U.S. educational institutions have responded to their needs for continuing higher education in the areas of job skill enhancement (such as computer operation), regular courses directly…

  15. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    First-in/first-out (FIFO) buffer temporarily stores short surges of data generated by data-acquisition system at excessively high rate and releases data at lower rate suitable for processing by computer. Size and complexity reduced while capacity enhanced by use of newly developed, sophisticated integrated circuits and by "byte-folding" scheme doubling effective depth and data rate.

  16. Let's Dance the "Robot Hokey-Pokey!": Children's Programming Approaches and Achievement throughout Early Cognitive Development

    ERIC Educational Resources Information Center

    Flannery, Louise P.; Bers, Marina Umaschi

    2013-01-01

    Young learners today generate, express, and interact with sophisticated ideas using a range of digital tools to explore interactive stories, animations, computer games, and robotics. In recent years, new developmentally appropriate robotics kits have been entering early childhood classrooms. This paper presents a retrospective analysis of one…

  17. Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.

    2013-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…

  18. Children's Behavior toward and Understanding of Robotic and Living Dogs

    ERIC Educational Resources Information Center

    Melson, Gail F.; Kahn, Peter H., Jr.; Beck, Alan; Friedman, Batya; Roberts, Trace; Garrett, Erik; Gill, Brian T.

    2009-01-01

    This study investigated children's reasoning about and behavioral interactions with a computationally sophisticated robotic dog (Sony's AIBO) compared to a live dog (an Australian Shepherd). Seventy-two children from three age groups (7-9 years, 10-12 years, and 13-15 years) participated in this study. Results showed that more children…

  19. Using Novel Word Context Measures to Predict Human Ratings of Lexical Proficiency

    ERIC Educational Resources Information Center

    Berger, Cynthia M.; Crossley, Scott A.; Kyle, Kristopher

    2017-01-01

    This study introduces a model of lexical proficiency based on novel computational indices related to word context. The indices come from an updated version of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES) and include associative, lexical, and semantic measures of word context. Human ratings of holistic lexical proficiency…

  20. Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.; Kizirian, Tim

    2009-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…
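
    The simultaneous equations behind the reciprocal method form a small linear system, which the articles solve with Excel's matrix functions or Solver; the equivalent calculation is sketched below in Python with hypothetical department costs and usage percentages, purely for illustration.

      # Reciprocal (simultaneous-equations) service-cost allocation as a linear system.
      import numpy as np

      direct = np.array([50_000.0, 30_000.0])   # direct costs of service departments S1, S2
      # use[i, j] = fraction of service department j's output consumed by service department i
      use = np.array([[0.00, 0.20],             # S1 uses 20% of S2
                      [0.10, 0.00]])            # S2 uses 10% of S1
      # Total costs T satisfy T = direct + use @ T, i.e. (I - use) T = direct
      T = np.linalg.solve(np.eye(2) - use, direct)
      print("reciprocal totals:", T.round(2))

      # Allocate the reciprocal totals to production departments with the remaining fractions.
      prod_share = np.array([[0.50, 0.30],      # P1 receives 50% of S1 and 30% of S2
                             [0.40, 0.50]])     # P2 receives 40% of S1 and 50% of S2
      print("allocated to P1, P2:", (prod_share @ T).round(2))   # sums to the total direct cost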

  1. Game Theory of Mind

    PubMed Central

    Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.

    2008-01-01

    This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchal game theory and coevolution. PMID:19112488
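
    The bounded-recursion idea ("I model your model of my model, down to some depth k") can be illustrated with a toy level-k best-response calculation on a stag-hunt payoff matrix. This is an illustrative sketch with invented payoffs, not the paper's generative model, which additionally infers the opponent's sophistication from observed choices.

      # Level-k (bounded recursion) best responses in a symmetric stag-hunt game.
      import numpy as np

      # Row player's payoffs; actions are 0 = stag, 1 = hare (hypothetical values).
      payoff = np.array([[4.0, 0.0],
                         [3.0, 3.0]])

      def level_k_policy(k):
          """Probability distribution over actions for a player of sophistication k."""
          if k == 0:
              return np.array([0.5, 0.5])                 # level-0: uninformative (random) play
          opponent = level_k_policy(k - 1)                # model the opponent as one level shallower
          expected = payoff @ opponent                    # expected payoff of each own action
          best = np.zeros(2)
          best[expected.argmax()] = 1.0                   # deterministic best response
          return best

      for k in range(4):
          print(f"level {k}: P(stag, hare) = {level_k_policy(k)}")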

  2. Phantom-GRAPE: Numerical software library to accelerate collisionless N-body simulation with SIMD instruction set on x86 architecture

    NASA Astrophysics Data System (ADS)

    Tanikawa, Ataru; Yoshikawa, Kohji; Nitadori, Keigo; Okamoto, Takashi

    2013-02-01

    We have developed a numerical software library for collisionless N-body simulations named "Phantom-GRAPE", which highly accelerates force calculations among particles by use of a new SIMD instruction set extension to the x86 architecture, Advanced Vector eXtensions (AVX), an enhanced version of the Streaming SIMD Extensions (SSE). In our library, not only Newton's forces but also central forces with an arbitrary shape f(r), which has a finite cutoff radius rcut (i.e. f(r) = 0 at r > rcut), can be quickly computed. In computing such central forces with an arbitrary force shape f(r), we refer to a pre-calculated look-up table. We also present a new scheme to create the look-up table whose binning is optimal to keep good accuracy in computing forces and whose size is small enough to avoid cache misses. Using an Intel Core i7-2600 processor, we measure the performance of our library for both Newton's forces and the arbitrarily shaped central forces. In the case of Newton's forces, we achieve 2 × 10^9 interactions per second with one processor core (or 75 GFLOPS if we count 38 operations per interaction), which is 20 times higher than the performance of an implementation without any explicit use of SIMD instructions, and 2 times higher than that with the SSE instructions. With four processor cores, we obtain a performance of 8 × 10^9 interactions per second (or 300 GFLOPS). In the case of the arbitrarily shaped central forces, we can calculate 1 × 10^9 and 4 × 10^9 interactions per second with one and four processor cores, respectively. The performance with one processor core is 6 times and 2 times higher than those of the implementations without any use of SIMD instructions and with the SSE instructions, respectively. These performances depend only weakly on the number of particles, irrespective of the force shape. This is in good contrast with the fact that the performance of force calculations accelerated by graphics processing units (GPUs) depends strongly on the number of particles. Such weak dependence of the performance on the number of particles is well suited to collisionless N-body simulations, since these simulations are usually performed with sophisticated N-body solvers such as Tree- and TreePM-methods combined with an individual timestep scheme. We conclude that collisionless N-body simulations accelerated with our library have a significant advantage over those accelerated by GPUs, especially in massively parallel environments.
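
    The look-up-table idea is easy to state in scalar form: tabulate the force kernel on a grid in r^2 (so no square root is needed per pair) and index the table instead of evaluating f(r). The sketch below is plain numpy for illustration; the library itself implements this with AVX intrinsics and a binning scheme tuned for accuracy and cache behaviour, and the kernel g(r) used here is an arbitrary, hypothetical example.

      # Scalar sketch of a look-up table for an arbitrary central force with a finite cutoff.
      # The table stores a kernel g(r) such that the force on particle i from particle j is
      # F = g(r) * (x_j - x_i); equal spacing in r^2 keeps the lookup to a single multiply.
      import numpy as np

      r_cut = 3.0
      n_bins = 1024

      def kernel(r):                                      # hypothetical g(r), zero beyond the cutoff
          return np.where(r < r_cut, np.exp(-r) / (r**2 + 1e-2), 0.0)

      r2_grid = np.linspace(0.0, r_cut**2, n_bins)        # table indexed by r^2
      table = kernel(np.sqrt(r2_grid))

      def pairwise_force(xi, xj):
          dx = xj - xi
          r2 = float(np.dot(dx, dx))
          if r2 >= r_cut**2:
              return np.zeros(3)
          idx = int(r2 / r_cut**2 * (n_bins - 1))         # nearest-bin lookup instead of evaluating g(r)
          return table[idx] * dx

      print(pairwise_force(np.zeros(3), np.array([1.0, 0.5, 0.0])))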

  3. Free energies of binding from large-scale first-principles quantum mechanical calculations: application to ligand hydration energies.

    PubMed

    Fox, Stephen J; Pittock, Chris; Tautermann, Christofer S; Fox, Thomas; Christ, Clara; Malcolm, N O J; Essex, Jonathan W; Skylaris, Chris-Kriton

    2013-08-15

    Schemes of increasing sophistication for obtaining free energies of binding have been developed over the years, where configurational sampling is used to include the all-important entropic contributions to the free energies. However, the quality of the results will also depend on the accuracy with which the intermolecular interactions are computed at each molecular configuration. In this context, the energy change associated with the rearrangement of electrons (electronic polarization and charge transfer) upon binding is a very important effect. Classical molecular mechanics force fields do not take this effect into account explicitly, and polarizable force fields and semiempirical quantum or hybrid quantum-classical (QM/MM) calculations are increasingly employed (at higher computational cost) to compute intermolecular interactions in free-energy schemes. In this work, we investigate the use of large-scale quantum mechanical calculations from first-principles as a way of fully taking into account electronic effects in free-energy calculations. We employ a one-step free-energy perturbation (FEP) scheme from a molecular mechanical (MM) potential to a quantum mechanical (QM) potential as a correction to thermodynamic integration calculations within the MM potential. We use this approach to calculate relative free energies of hydration of small aromatic molecules. Our quantum calculations are performed on multiple configurations from classical molecular dynamics simulations. The quantum energy of each configuration is obtained from density functional theory calculations with a near-complete psinc basis set on over 600 atoms using the ONETEP program.
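
    The one-step MM-to-QM correction mentioned above has the standard Zwanzig (exponential-averaging) form; it is written here for orientation with commonly used symbols rather than transcribed from the paper:

      \Delta A_{\mathrm{MM}\rightarrow\mathrm{QM}}
        \;=\; -\,k_{\mathrm B}T\,
        \ln \Big\langle \exp\!\big[-\big(E_{\mathrm{QM}}(\mathbf{r}) - E_{\mathrm{MM}}(\mathbf{r})\big)/k_{\mathrm B}T\big] \Big\rangle_{\mathrm{MM}},

    where the average runs over configurations r drawn from molecular dynamics on the MM potential. A relative hydration free energy is then obtained by combining the MM thermodynamic-integration result with such corrections evaluated at the end states.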

  4. A Review of the IEE's Involvement in Academic Gaming.

    ERIC Educational Resources Information Center

    Ellington, H. I.; And Others

    In partnership with the Institute of Technology in Aberdeen, the Institution of Electrical Engineers (IEE) has pioneered the development of a range of highly sophisticated simulation games and case studies based on realistic engineering scenarios for use in secondary and higher education and industrial training. The initial involvement of IEE in…

  5. SNOWMIP2: An evaluation of forest snow process simulations

    Treesearch

    Richard Essery; Nick Rutter; John Pomeroy; Robert Baxter; Manfred Stahli; David Gustafsson; Alan Barr; Paul Bartlett; Kelly Elder

    2009-01-01

    Models of terrestrial snow cover, or snow modules within land surface models, are used in many meteorological, hydrological, and ecological applications. Such models were developed first, and have achieved their greatest sophistication, for snow in open areas; however, huge tracts of the Northern Hemisphere both have seasonal snow cover and are forested (Fig. 1)....

  6. Use of artificial landscapes to isolate controls on burn probability

    Treesearch

    Marc-Andre Parisien; Carol Miller; Alan A. Ager; Mark A. Finney

    2010-01-01

    Techniques for modeling burn probability (BP) combine the stochastic components of fire regimes (ignitions and weather) with sophisticated fire growth algorithms to produce high-resolution spatial estimates of the relative likelihood of burning. Despite the numerous investigations of fire patterns from either observed or simulated sources, the specific influence of...

  7. An End-to-End Model of a Hall Thruster

    DTIC Science & Technology

    2000-09-01

    and deposition of sputtered material, simulation of the operation of a Hall Thruster in a vacuum tank and the extension to the near-plume of a... sophisticated Hall thruster transient hybrid PIC model which had been previously used only to describe the internal flow. The first two items have been

  8. Enhancing Student Learning of Enterprise Integration and Business Process Orientation through an ERP Business Simulation Game

    ERIC Educational Resources Information Center

    Seethamraju, Ravi

    2011-01-01

    The sophistication of the integrated world of work and increased recognition of business processes as critical corporate assets require graduates to develop "process orientation" and an "integrated view" of business. Responding to these dynamic changes in business organizations, business schools are also continuing to modify…

  9. Modeling, simulation, and analysis of optical remote sensing systems

    NASA Technical Reports Server (NTRS)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote Sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990's. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms and an estimate made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.
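
    A miniature version of the analytical (parametric) model is sketched below: two informational classes are represented by spectral mean vectors and covariance matrices, additive sensor noise inflates the covariances, and the pairwise Bhattacharyya distance provides a bound on the achievable classification error. The numbers are hypothetical and the code only illustrates the idea, not the authors' model.

      # Parametric class-separability estimate from class statistics (illustrative sketch).
      import numpy as np

      def bhattacharyya(m1, c1, m2, c2):
          c = 0.5 * (c1 + c2)
          dm = m2 - m1
          term1 = 0.125 * dm @ np.linalg.inv(c) @ dm
          term2 = 0.5 * np.log(np.linalg.det(c) / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
          return term1 + term2

      # Two hypothetical informational classes in three spectral bands (means and covariances).
      m_a, c_a = np.array([0.20, 0.35, 0.50]), np.diag([0.002, 0.003, 0.004])
      m_b, c_b = np.array([0.25, 0.30, 0.45]), np.diag([0.003, 0.003, 0.005])

      # Sensor noise adds to every class covariance before classification.
      noise = np.diag([0.001, 0.001, 0.001])
      b = bhattacharyya(m_a, c_a + noise, m_b, c_b + noise)
      error_bound = 0.5 * np.exp(-b)                       # upper bound on the pairwise Bayes error
      print(f"Bhattacharyya distance {b:.3f}, pairwise error bound {error_bound:.3f}")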

  10. Multi-fluid modelling of pulsed discharges for flow control applications

    NASA Astrophysics Data System (ADS)

    Poggie, J.

    2015-02-01

    Experimental evidence suggests that short-pulse dielectric barrier discharge actuators are effective for speeds corresponding to take-off and approach of large aircraft, and thus are a fruitful direction for flow control technology development. Large-eddy simulations have reproduced some of the main fluid dynamic effects. The plasma models used in such simulations are semi-empirical, however, and need to be tuned for each flowfield under consideration. In this paper, the discharge physics is examined in more detail with multi-fluid modelling, comparing a five-moment model (continuity, momentum, and energy equations) to a two-moment model (continuity and energy equations). A steady-state, one-dimensional discharge was considered first, and the five-moment model was found to predict significantly lower ionisation rates and number densities than the two-moment model. A two-dimensional, transient discharge problem with an elliptical cathode was studied next. Relative to the two-moment model, the five-moment model predicted a slower response to the activation of the cathode, and lower electron velocities and temperatures as the simulation approached steady-state. The primary reason for the differences in the predictions of the two models can be attributed to the effects of particle inertia, particularly electron inertia in the cathode layer. The computational cost of the five-moment model is only about twice that of the simpler variant, suggesting that it may be feasible to use the more sophisticated model in practical calculations for flow control actuator design.

  11. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on LCG Monte-Carlo Data Base (MCDB) and software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, the modern Monte-Carlo simulation of physical processes requires expert knowledge in Monte-Carlo generators or significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulate simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available for various physical groups. All the data from MCDB is accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project. Program summaryProgram title: LCG Monte-Carlo Data Base Catalogue identifier: ADZX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 30 129 No. of bytes in distributed program, including test data, etc.: 216 943 Distribution format: tar.gz Programming language: Perl Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb Operating system: Scientific Linux CERN 3/4 RAM: 1 073 741 824 bytes (1 Gb) Classification: 9 External routines:perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod auth external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional) Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC) generators to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed for the investigations either in the SM analyses (as a signal) or in searches for new phenomena in Beyond Standard Model analyses (as a background). If the samples are made available publicly and equipped with corresponding and comprehensive documentation, it can speed up cross checks of the samples themselves and physical models applied. Some event samples require a lot of computing resources for preparation. So, a central storage of the samples prevents possible waste of researcher time and computing resources, which can be used to prepare the same events many times. Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web-server ( http://mcdb.cern.ch). All event samples are kept on types at CERN. Documentation describing the events is the main contents of MCDB. Users can browse the knowledgebase, read and comment articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit own articles. Restrictions: The software is adopted to solve the problems, described in the article and there are no any additional restrictions. Unusual features: The software provides a framework to store and document large files with flexible authentication and authorization system. 
Different external storages with large capacity can be used to keep the files. The WEB Content Management System provides all of the necessary interfaces for the authors of the files, end-users and administrators. Running time: Real time operations. References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.

  12. Astrophysical fluid simulations of thermally ideal gases with non-constant adiabatic index: numerical implementation

    NASA Astrophysics Data System (ADS)

    Vaidya, B.; Mignone, A.; Bodo, G.; Massaglia, S.

    2015-08-01

    Context. An equation of state (EoS) is a relation between thermodynamic state variables and it is essential for closing the set of equations describing a fluid system. Although an ideal EoS with a constant adiabatic index Γ is the preferred choice owing to its simplistic implementation, many astrophysical fluid simulations may benefit from a more sophisticated treatment that can account for diverse chemical processes. Aims: In the present work we first review the basic thermodynamic principles of a gas mixture in terms of its thermal and caloric EoS by including effects like ionization, dissociation, and temperature dependent degrees of freedom such as molecular vibrations and rotations. The formulation is revisited in the context of plasmas that are either in equilibrium conditions (local thermodynamic- or collisional excitation-equilibria) or described by non-equilibrium chemistry coupled to optically thin radiative cooling. We then present a numerical implementation of thermally ideal gases obeying a more general caloric EoS with non-constant adiabatic index in Godunov-type numerical schemes. Methods: We discuss the necessary modifications to the Riemann solver and to the conversion between total energy and pressure (or vice versa) routinely invoked in Godunov-type schemes. We then present two different approaches for computing the EoS. The first employs root-finder methods and it is best suited for EoS in analytical form. The second is based on lookup tables and interpolation and results in a more computationally efficient approach, although care must be taken to ensure thermodynamic consistency. Results: A number of selected benchmarks demonstrate that the employment of a non-ideal EoS can lead to important differences in the solution when the temperature range is 500-10^4 K where dissociation and ionization occur. The implementation of selected EoS introduces additional computational costs although the employment of lookup table methods (when possible) can significantly reduce the overhead by a factor of ~ 3-4.
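
    The root-finder approach for the conversion between total energy and pressure can be illustrated with a toy caloric EoS whose effective degrees of freedom vary with temperature. The sketch assumes numpy/scipy and invented coefficients; it is not the implementation described in the paper.

      # Recovering temperature (and hence pressure) from internal energy for a non-ideal caloric EoS.
      import numpy as np
      from scipy.optimize import brentq

      k_B = 1.380649e-16      # erg/K
      m_H = 1.6735575e-24     # g

      def internal_energy(T, rho):
          """Toy caloric EoS: effective degrees of freedom rise smoothly with temperature,
          mimicking the extra energy sinks of dissociation/ionization (hypothetical form)."""
          dof = 3.0 + 2.0 / (1.0 + np.exp(-(T - 3000.0) / 800.0))
          return 0.5 * dof * rho / m_H * k_B * T

      def temperature_from_energy(e, rho, T_lo=1.0, T_hi=1.0e7):
          # Internal energy is monotonic in T here, so a bracketed root find is safe.
          return brentq(lambda T: internal_energy(T, rho) - e, T_lo, T_hi)

      rho = 1.0e-20                          # g / cm^3
      e = internal_energy(5.0e3, rho)        # pretend this came from the conserved variables
      T = temperature_from_energy(e, rho)
      p = rho / m_H * k_B * T                # thermal EoS: p = n k_B T
      print(f"recovered T = {T:.1f} K, pressure = {p:.3e} erg/cm^3")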

  13. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    NASA Technical Reports Server (NTRS)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  14. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (aleatory in nature due to point (i) above, and epistemic in nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better addressed by probabilistic approaches than by deterministic ones. In practice, probabilistic hazard from PDCs is assessed by coupling deterministic simulators with statistical techniques that can supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling, MC; Polynomial Chaos Quadrature, PCQ; and Bayesian Linear Emulation, BLE) are presented, and their advantages, limitations, and future potential are underlined. The key point in choosing a specific method lies in the balance between its computational cost, the physical reliability of the simulator, and the target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could focus on three main multidisciplinary directions: 1) validate the simulators frequently used (through comparison with PDC deposits and with other simulators), 2) decrease simulator runtimes (whether by improving knowledge of the physical processes or through more efficient programming and parallelization), and 3) improve uncertainty quantification techniques.
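
    The following minimal Python sketch illustrates the general idea of coupling a deterministic simulator with Monte Carlo sampling to obtain exceedance probabilities; the toy energy-cone-style runout relation L = H / (H/L) and the sampled parameter distributions are assumptions chosen only for illustration and do not reproduce the Energy Cone or TITAN2D models cited above.

        import numpy as np

        rng = np.random.default_rng(42)

        def energy_cone_runout(collapse_height_m, hl_ratio):
            """Toy energy-cone-style runout on flat topography: L = H / (H/L)."""
            return collapse_height_m / hl_ratio

        # Aleatory uncertainty: sample eruption-scale parameters from assumed priors.
        n = 100_000
        collapse_height = rng.lognormal(mean=np.log(1500.0), sigma=0.5, size=n)   # m
        hl_ratio        = rng.uniform(0.10, 0.30, size=n)                         # H/L mobility

        runout = energy_cone_runout(collapse_height, hl_ratio)

        # Hazard curve: probability that the flow reaches at least distance d,
        # conditional on a column collapse occurring.
        for d_km in (2.0, 5.0, 10.0, 20.0):
            p = np.mean(runout >= d_km * 1000.0)
            print(f"P(runout >= {d_km:4.1f} km | collapse) = {p:.3f}")

    The balance discussed in the abstract shows up directly here: a cheap simulator tolerates many samples (small Monte Carlo error), whereas an expensive one forces either fewer samples or a surrogate such as an emulator.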

  15. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics, especially interactive motion graphics, in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three-dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film, "Exploring data with Orion I".
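
    A minimal sketch of the underlying idea, real-time rotation of a three-dimensional scatterplot so that depth is perceived through motion, is given below; it uses matplotlib animation on synthetic data and makes no attempt to reproduce the Orion I hardware or software.

        import numpy as np
        import matplotlib.pyplot as plt
        from matplotlib.animation import FuncAnimation

        # Synthetic 3-D data standing in for a multivariate data set.
        rng = np.random.default_rng(0)
        points = rng.standard_normal((500, 3)) @ np.diag([3.0, 1.0, 0.3])

        fig, ax = plt.subplots()
        scatter = ax.scatter([], [], s=5)
        ax.set_xlim(-10, 10); ax.set_ylim(-10, 10); ax.set_aspect("equal")

        def rotate_y(theta):
            """Rotation about the vertical axis; continuous motion supplies the depth cue."""
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

        def update(frame):
            rotated = points @ rotate_y(0.05 * frame).T
            scatter.set_offsets(rotated[:, :2])   # orthographic projection onto the screen plane
            return scatter,

        anim = FuncAnimation(fig, update, frames=200, interval=30, blit=True)
        plt.show()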

  16. Computer Assisted Navigation in Knee Arthroplasty

    PubMed Central

    Bae, Dae Kyung

    2011-01-01

    Computer assisted surgery (CAS) was used to improve the positioning of implants during total knee arthroplasty (TKA). Most studies have reported that computer assisted navigation reduced the outliers of alignment and component malpositioning. However, additional sophisticated studies are necessary to determine if the improvement of alignment will improve long-term clinical results and increase the survival rate of the implant. Knowledge of CAS-TKA technology and understanding the advantages and limitations of navigation are crucial to the successful application of the CAS technique in TKA. In this article, we review the components of navigation, classification of the system, surgical method, potential error, clinical results, advantages, and disadvantages. PMID:22162787

  17. Sydney Observatory and astronomy teaching in the 90s

    NASA Astrophysics Data System (ADS)

    Lomb, N.

    1996-05-01

    Computers and the Internet have created a revolution in the way astronomy can be communicated to the public. At Sydney Observatory we make full use of these recent developments. In our lecture room a variety of sophisticated computer programs can show, with the help of a projection TV system, the appearance and motion of the sky at any place, date or time. The latest HST images obtained from the Internet can be shown, as can images taken through our own Meade 16 inch telescope. This recently installed computer-controlled telescope with its accurate pointing is an ideal instrument for a light-polluted site such as ours.

  18. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.
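
    The abstract does not disclose the technique itself; as one common, hypothetical route from a volumetric medical image to a triangle mesh that downstream CAD/FEM tools could consume, the sketch below runs marching cubes (scikit-image) on a synthetic volume. The iso-surface threshold and the synthetic data are assumptions for illustration only.

        import numpy as np
        from skimage import measure

        # Synthetic "CT volume": a bright sphere in a noisy background,
        # standing in for a segmented anatomical structure.
        z, y, x = np.mgrid[-32:32, -32:32, -32:32]
        volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(float)
        volume += 0.05 * np.random.default_rng(1).standard_normal(volume.shape)

        # Extract an iso-surface at an assumed threshold; verts/faces form a
        # triangle mesh that could be exported (e.g. as STL) toward CAD or FEM.
        verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)

        print(f"{len(verts)} vertices, {len(faces)} triangular faces")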

  19. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
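
    A minimal sketch of a tabular cumulative sum (CUSUM) check of control-serum results is given below; the analyte, target value, reference shift k, and decision interval h are assumed values for illustration and are not taken from the paper.

        import numpy as np

        def cusum(values, target, k, h):
            """Tabular CUSUM: returns upper/lower sums and indices where a limit is exceeded."""
            s_hi = s_lo = 0.0
            hi_path, lo_path, alarms = [], [], []
            for i, x in enumerate(values):
                s_hi = max(0.0, s_hi + (x - target - k))   # accumulates upward drift
                s_lo = min(0.0, s_lo + (x - target + k))   # accumulates downward drift
                hi_path.append(s_hi); lo_path.append(s_lo)
                if s_hi > h or s_lo < -h:
                    alarms.append(i)
            return np.array(hi_path), np.array(lo_path), alarms

        # Simulated daily control-serum glucose results (mmol/L): in control for 20 days,
        # then a +0.3 mmol/L bias appears, which CUSUM should flag sooner than a
        # purely indicative fixed-limit chart would.
        rng = np.random.default_rng(7)
        target, sd = 5.5, 0.15
        results = np.concatenate([
            rng.normal(target, sd, 20),
            rng.normal(target + 0.3, sd, 20),
        ])

        hi, lo, alarms = cusum(results, target, k=0.5 * sd, h=4.0 * sd)
        print("First alarm on day:", alarms[0] + 1 if alarms else "none")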

  20. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 collinear tunable acousto-optic filter, a 61-inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near-IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 arcsec spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data, and final equipment performance limits are given.
