Science.gov

Sample records for advanced computer simulation

  1. Recent advances in computer image generation simulation.

    PubMed

    Geltmacher, H E

    1988-11-01

    An explosion in flight simulator technology over the past 10 years is revolutionizing U.S. Air Force (USAF) operational training. The single, most important development has been in computer image generation. However, other significant advances are being made in simulator handling qualities, real-time computation systems, and electro-optical displays. These developments hold great promise for achieving high fidelity combat mission simulation. This article reviews the progress to date and predicts its impact, along with that of new computer science advances such as very high speed integrated circuits (VHSIC), on future USAF aircrew simulator training. Some exciting possibilities are multiship, full-mission simulators at replacement training units, miniaturized unit level mission rehearsal training simulators, onboard embedded training capability, and national scale simulator networking.

  2. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
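The Swendsen-Wang cluster update generalized in this project can be illustrated for the standard 2D Ising model: bonds between aligned neighboring spins are activated with probability p = 1 - exp(-2*beta*J), the resulting connected clusters are identified, and each cluster is flipped with probability 1/2. The following is a minimal illustrative sketch (not the LANL authors' code), using a union-find structure for cluster identification:

```python
import math
import random

def swendsen_wang_step(spins, beta, J=1.0, rng=random):
    """One Swendsen-Wang update on an L x L Ising lattice (periodic boundaries).

    Bonds between equal neighboring spins are activated with probability
    p = 1 - exp(-2*beta*J); each resulting cluster is flipped with prob 1/2.
    """
    L = len(spins)
    p = 1.0 - math.exp(-2.0 * beta * J)
    parent = list(range(L * L))  # union-find over lattice sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Activate bonds between aligned right/down neighbors
    for x in range(L):
        for y in range(L):
            i = x * L + y
            for dx, dy in ((1, 0), (0, 1)):
                nx, ny = (x + dx) % L, (y + dy) % L
                if spins[x][y] == spins[nx][ny] and rng.random() < p:
                    union(i, nx * L + ny)

    # Flip each cluster independently with probability 1/2
    flip = {}
    for x in range(L):
        for y in range(L):
            r = find(x * L + y)
            if r not in flip:
                flip[r] = rng.random() < 0.5
            if flip[r]:
                spins[x][y] = -spins[x][y]
    return spins
```

Because whole clusters are flipped at once, this update decorrelates configurations near the critical point far faster than single-spin-flip dynamics, which is the property the LANL framework generalized to Potts and other models.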

  3. Advances in Monte Carlo computer simulation

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain the maximum of useful information.
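The Metropolis method referred to here accepts a proposed move with probability min(1, exp(-beta * dE)), where dE is the energy change of the move. A minimal sketch for the 2D Ising model with single-spin-flip proposals (an illustration of the 1953 rule, not code from the talk):

```python
import math
import random

def metropolis_sweep(spins, beta, J=1.0, rng=random):
    """One Metropolis sweep over an L x L Ising lattice (periodic boundaries).

    Each proposed single-spin flip with energy change dE is accepted with
    probability min(1, exp(-beta * dE)), per Metropolis et al. (1953).
    """
    L = len(spins)
    for _ in range(L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest-neighbor spins
        nn = (spins[(x + 1) % L][y] + spins[(x - 1) % L][y]
              + spins[x][(y + 1) % L] + spins[x][(y - 1) % L])
        dE = 2.0 * J * spins[x][y] * nn  # energy change if this spin flips
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[x][y] = -spins[x][y]
    return spins
```

The acceptance rule guarantees detailed balance with respect to the Boltzmann distribution, so long runs of such sweeps sample equilibrium thermodynamic averages.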

  4. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three-dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds-averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three-dimensional viscous flow simulations.

  5. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  6. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper presents advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots, and security applications. The cost of the measurement system is extremely high; therefore, a simulation tool was designed. The simulation gives an opportunity to execute algorithms such as obstacle avoidance[1], SLAM for robot localization[2], detection of vegetation and water obstacles in the surroundings of the robot chassis[3], and LRF measurement in a crowd of people[1]. The Axis Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
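The core geometric operation in an AABB-based LRF simulator is casting a ray (one laser beam) against an axis-aligned bounding box. The standard "slab" test below is a generic illustration of that operation, not the paper's implementation:

```python
def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: nearest hit distance t >= 0 of a ray against an AABB, or None.

    origin, direction, box_min, box_max are 3-tuples. The direction need not
    be normalized; a zero component means the ray is parallel to that slab.
    """
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0.0:
            if o < lo or o > hi:
                return None  # parallel to the slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                return None  # slab intervals do not overlap: miss
    return t_near
```

A simulated scan then sweeps the beam direction over the sensor's angular range and takes the minimum hit distance over all scene boxes; GPU variants (e.g. in CUDA) evaluate many beams in parallel.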

  7. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  8. Modeling emergency department operations using advanced computer simulation systems.

    PubMed

    Saunders, C E; Makens, P K; Leblanc, L J

    1989-02-01

    We developed a computer simulation model of emergency department operations using simulation software. This model uses multiple levels of preemptive patient priority; assigns each patient to an individual nurse and physician; incorporates all standard tests, procedures, and consultations; and allows patient service processes to proceed simultaneously, sequentially, repetitively, or a combination of these. Selected input data, including the number of physicians, nurses, and treatment beds, and the blood test turnaround time, then were varied systematically to determine their simulated effect on patient throughput time, selected queue sizes, and rates of resource utilization. Patient throughput time varied directly with laboratory service times and inversely with the number of physician or nurse servers. Resource utilization rates varied inversely with resource availability, and patient waiting time and patient throughput time varied indirectly with the level of patient acuity. The simulation can be animated on a computer monitor, showing simulated patients, specimens, and staff members moving throughout the ED. Computer simulation is a potentially useful tool that can help predict the results of changes in the ED system without actually altering it and may have implications for planning, optimizing resources, and improving the efficiency and quality of care.
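The qualitative findings above (throughput time falling as servers are added, utilization falling as resources grow) can be reproduced with even a toy discrete-event model of a single service stage. The sketch below is a hypothetical minimal illustration of that style of simulation, not the authors' ED model, which tracked multiple resource types, priorities, and parallel service processes:

```python
import heapq
import random

def simulate_ed(n_servers, arrival_rate, service_rate, n_patients, seed=0):
    """Toy discrete-event model of one ED service stage (e.g. physician exam).

    Patients arrive in a Poisson stream and queue FIFO for n_servers identical
    servers with exponential service times; returns the mean throughput
    (arrival-to-completion) time.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
    free_at = [0.0] * n_servers  # min-heap of times each server becomes free
    heapq.heapify(free_at)
    total = 0.0
    for arr in arrivals:
        start = max(arr, heapq.heappop(free_at))  # wait if all servers busy
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total += finish - arr
    return total / n_patients
```

Running this with a fixed arrival stream and an increasing number of servers shows mean throughput time varying inversely with server count, mirroring the physician/nurse result reported in the abstract.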

  9. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  10. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  11. Software Partitioning Schemes for Advanced Simulation Computer Systems. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    Conducted to design software partitioning techniques for use by the Air Force to partition a large flight simulator program for optimal execution on alternative configurations, this study resulted in a mathematical model which defines characteristics for an optimal partition, and a manually demonstrated partitioning algorithm design which…

  12. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  13. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  14. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  15. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  16. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from

  17. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  18. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  19. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  20. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  21. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  22. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  3. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  4. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  5. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US), the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation imaging. If the 2D source image is bad, no amount of 3D image manipulation in software will produce a quality 3D image. In this exhibition, recent advances in CT imaging techniques and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  6. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system with three representative case studies. In particular, we focus on recent advancements in finite element methodology that has increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  7. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  8. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  9. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    SciTech Connect

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

Developing a theory of low-mass star formation (≈0.1 to 3 M⊙) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic fields and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large-scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smoothed Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large-scale simulations do not advance our understanding of low-mass star formation.

  10. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; such is the role of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies the documentation expected to be included in the "Assessment File".

  11. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot-end and cold-end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed that provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology thus yielded a value of net heat input 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
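
    The 1.7 percent figure can be reproduced directly from the two net heat input values quoted in the abstract; a quick check (variable names are mine):

```python
measured_w = 244.4  # net heat input measured with the validation hardware (W)
computed_w = 240.3  # net heat input from the multidimensional numerical model (W)

# Relative difference of the model prediction with respect to the measurement.
pct_low = (measured_w - computed_w) / measured_w * 100
print(f"model is {pct_low:.1f} percent below the measurement")
# → model is 1.7 percent below the measurement
```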

  12. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1981-01-01

A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the Advanced Continuous Simulation Language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a van der Pol oscillator is presented.
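
    The same kind of analysis can be sketched in modern terms with NumPy: integrate a van der Pol time history, then estimate its PDS via the FFT. This is an illustrative periodogram sketch, not the ACSL-interfaced program itself (function names and scaling are mine):

```python
import numpy as np

def van_der_pol(mu=1.0, dt=0.01, n_steps=40000, x0=1.0, v0=0.0):
    """Integrate the van der Pol oscillator x'' - mu(1 - x^2)x' + x = 0 with RK4."""
    def deriv(s):
        x, v = s
        return np.array([v, mu * (1 - x**2) * v - x])
    s = np.array([x0, v0])
    xs = np.empty(n_steps)
    for i in range(n_steps):
        k1 = deriv(s)
        k2 = deriv(s + 0.5 * dt * k1)
        k3 = deriv(s + 0.5 * dt * k2)
        k4 = deriv(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        xs[i] = s[0]
    return xs

def power_density_spectrum(x, dt):
    """One-sided PDS of a real time history via the FFT (periodogram estimate)."""
    n = len(x)
    X = np.fft.rfft(x - x.mean())          # remove the mean before transforming
    pds = (np.abs(X) ** 2) * dt / n        # power per unit frequency (illustrative scaling)
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, pds

dt = 0.01
x = van_der_pol(dt=dt)
freqs, pds = power_density_spectrum(x[10000:], dt)  # discard the initial transient
f_peak = freqs[np.argmax(pds)]                      # dominant oscillation frequency
```

    For mu = 1 the van der Pol limit cycle has a period near 6.7 s, so the spectral peak lands near 0.15 Hz.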

  13. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3-f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  14. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    SciTech Connect

    Simunovic, S.; Aramayo, G.A.; Zacharia, T.; Toridis, T.G.; Bandak, F.; Ragland, C.L.

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  15. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    SciTech Connect

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  16. Computational Advances in the Arctic Terrestrial Simulator: Modeling Permafrost Degradation in a Warming Arctic

    NASA Astrophysics Data System (ADS)

    Coon, E.; Berndt, M.; Garimella, R.; Moulton, J. D.; Manzini, G.; Painter, S. L.

    2013-12-01

The terrestrial Arctic has been a net sink of carbon for thousands of years, but warming trends suggest this may change. As the terrestrial Arctic warms, degradation of the permafrost results in significant melting of the ice wedges that support low-centered polygonal ground. This leads to subsidence of the topography, inversion of the polygonal ground, and restructuring of drainage networks. The changes in hydrology and vegetation that result from these processes are poorly understood. Predictive simulation of the fate of this carbon is critical for understanding feedback effects between the terrestrial Arctic and climate change. Simulation of this system at fine scales presents many challenges. Flow and energy equations are solved on both the surface and subsurface domains, and deformation of the soil subsurface must couple with both. Additional processes such as snow, evapo-transpiration, and biogeochemistry supplement this THMC model. While globally implicit coupling methods enable conservation of mass and energy on the combined domain, care must be taken to ensure conservation as the soil subsides and the mesh deforms. Uncertainty both in the critical physics of each process model and in the coupling needed to maintain accuracy between processes suggests the need for a versatile many-physics framework. This framework should allow swapping of both processes and constitutive relations, and enable easy numerical experimentation with coupling strategies. Deformation dictates the need for advanced discretizations that maintain accuracy, and for a mesh framework capable of calculating smooth deformation with remapped fields. Latent heat introduces strong nonlinearities, requiring robust solvers and an efficient globalization strategy. Here we discuss advances as implemented in the Arctic Terrestrial Simulator (ATS), a many-physics framework and collection of physics kernels based upon Amanzi. We demonstrate the deformation capability, conserving mass and energy while simulating soil
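
    The "swap processes and constitutive relations" design goal described above can be sketched as a minimal plug-in driver. This is a toy illustration with invented names and placeholder physics, not the actual ATS/Amanzi architecture:

```python
class ProcessKernel:
    """Minimal interface a physics kernel must implement (illustrative only)."""
    def advance(self, state, dt):
        raise NotImplementedError

class SurfaceFlow(ProcessKernel):
    def advance(self, state, dt):
        state["surface_water"] += dt * 0.1  # placeholder physics, not a real model

class SubsurfaceEnergy(ProcessKernel):
    def advance(self, state, dt):
        state["temperature"] += dt * 0.01   # placeholder physics, not a real model

class ManyPhysicsDriver:
    """Sequentially couples whichever kernels are registered; swapping a process
    is just registering a different kernel under the same name."""
    def __init__(self):
        self.kernels = {}
    def register(self, name, kernel):
        self.kernels[name] = kernel
    def step(self, state, dt):
        for kernel in self.kernels.values():
            kernel.advance(state, dt)
        return state

driver = ManyPhysicsDriver()
driver.register("flow", SurfaceFlow())
driver.register("energy", SubsurfaceEnergy())
state = driver.step({"surface_water": 0.0, "temperature": 260.0}, dt=1.0)
```

    A real framework would add implicit coupling, mesh deformation, and conservation checks; the point here is only the registry pattern that lets processes be exchanged without touching the driver.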

  17. Advances in computer simulation of genome evolution: toward more realistic evolutionary genomics analysis by approximate bayesian computation.

    PubMed

    Arenas, Miguel

    2015-04-01

NGS technologies enable fast and inexpensive generation of genomic data. Nevertheless, ancestral genome inference is not straightforward, owing to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events have begun to emerge. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulation, such as approximate Bayesian computation, which may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls of these analytical methods are discussed, and potential applications of such ancestral genomic inferences are pointed out.
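
    The approximate Bayesian computation strategy advocated here can be illustrated with a generic rejection sampler. The "rearrangement" simulator below is a deliberately toy stand-in (all names, rates, and the Bernoulli event model are invented for illustration), not one of the genome-evolution models discussed:

```python
import random

def simulate_rearrangements(rate, genome_length=100, generations=20, rng=None):
    # Toy forward simulator: count rearrangement events as Bernoulli trials
    # per site per generation (an assumption, not a real evolutionary model).
    rng = rng or random.Random()
    events = 0
    for _ in range(generations):
        events += sum(rng.random() < rate for _ in range(genome_length))
    return events

def abc_rejection(observed, prior_draw, simulate, distance, n_sims=2000, eps=3):
    """Basic ABC rejection: keep parameter draws whose simulated summary
    statistic falls within eps of the observed data."""
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted

rng = random.Random(42)
true_rate = 0.002
observed = simulate_rearrangements(true_rate, rng=rng)
posterior = abc_rejection(
    observed,
    prior_draw=lambda: rng.uniform(0.0, 0.01),   # uniform prior on the rate
    simulate=lambda r: simulate_rearrangements(r, rng=rng),
    distance=lambda a, b: abs(a - b),
)
estimate = sum(posterior) / len(posterior)        # posterior mean of the rate
```

    Real applications replace the toy simulator with a model of inversions, translocations, and co-occurring rearrangements, and use richer summary statistics; the rejection logic stays the same.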

  18. Advanced adaptive computational methods for Navier-Stokes simulations in rotorcraft aerodynamics

    NASA Technical Reports Server (NTRS)

    Stowers, S. T.; Bass, J. M.; Oden, J. T.

    1993-01-01

A Phase 2 research and development effort was conducted in the area of transonic, compressible, inviscid flows, with the ultimate goal of numerically modeling the complex flows inherent in advanced helicopter blade designs. The algorithms and methodologies are classified as adaptive methods: they use error estimation techniques to approximate the local numerical error, and automatically refine or unrefine the mesh so as to deliver a given level of accuracy. The result is a scheme that attempts to produce the best possible results with the least number of grid points, degrees of freedom, and operations. Schemes of this type automatically locate and resolve shocks, shear layers, and other flow details to an accuracy level specified by the user of the code. The Phase 1 work involved a feasibility study of h-adaptive methods for steady viscous flows, with emphasis on accurate simulation of vortex initiation, migration, and interaction. The Phase 2 effort focused on extending these algorithms and methodologies to a three-dimensional topology.
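
    The refine/unrefine cycle described (estimate a local error, split only the cells whose estimate exceeds a tolerance) can be illustrated in one dimension. This sketch is mine, under a simple trapezoid-vs-Simpson error indicator, and is not the Phase 2 scheme:

```python
import math

def h_adapt(f, a, b, tol=1e-4, max_cells=4096):
    """Compute a local error indicator per cell (trapezoid vs. Simpson) and
    h-refine only the cells whose indicator exceeds the tolerance."""
    active = [(a, b)]
    accepted = []
    while active and len(active) + len(accepted) < max_cells:
        x0, x1 = active.pop()
        m = 0.5 * (x0 + x1)
        coarse = 0.5 * (x1 - x0) * (f(x0) + f(x1))           # trapezoid rule
        fine = (x1 - x0) / 6.0 * (f(x0) + 4 * f(m) + f(x1))  # Simpson's rule
        if abs(fine - coarse) > tol:
            active += [(x0, m), (m, x1)]                     # refine this cell
        else:
            accepted.append((x0, x1))                        # cell is resolved
    return accepted + active

# A steep shear-layer-like profile: refinement clusters near x = 0.3 while
# the smooth regions keep coarse cells.
cells = h_adapt(lambda x: math.tanh(50.0 * (x - 0.3)), 0.0, 1.0)
widths = sorted(x1 - x0 for x0, x1 in cells)
```

    The resulting mesh is strongly graded: cells near the layer shrink by orders of magnitude while cells far from it stay at their original size, which is precisely the economy of degrees of freedom the abstract describes.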

  19. Interfaces for Advanced Computing.

    ERIC Educational Resources Information Center

    Foley, James D.

    1987-01-01

    Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…

  20. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  1. Advances in Computational Radiation Biophysics for Cancer Therapy: Simulating Nano-Scale Damage by Low-Energy Electrons

    NASA Astrophysics Data System (ADS)

    Kuncic, Zdenka

    2015-10-01

    Computational radiation biophysics is a rapidly growing area that is contributing, alongside new hardware technologies, to ongoing developments in cancer imaging and therapy. Recent advances in theoretical and computational modeling have enabled the simulation of discrete, event-by-event interactions of very low energy (≪ 100 eV) electrons with water in its liquid thermodynamic phase. This represents a significant advance in our ability to investigate the initial stages of radiation induced biological damage at the molecular level. Such studies are important for the development of novel cancer treatment strategies, an example of which is given by microbeam radiation therapy (MRT). Here, new results are shown demonstrating that when excitations and ionizations are resolved down to nano-scales, their distribution extends well outside the primary microbeam path, into regions that are not directly irradiated. This suggests that radiation dose alone is insufficient to fully quantify biological damage. These results also suggest that the radiation cross-fire may be an important clue to understanding the different observed responses of healthy cells and tumor cells to MRT.

  3. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  4. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2011-01-01

Convertor and generator testing is designed to characterize convertor performance in environments intended to simulate launch and space conditions. The value of net heat input must be known in order to calculate convertor efficiency and to validate convertor performance. Specially designed test hardware was used to verify and validate a two-step methodology for the prediction of net heat input, and the lessons learned from these simulations have been applied to previous convertor simulations. As heat is supplied to the convertors, electric power is produced and measured. Net heat input to the convertor is one parameter that contributes to the calculation of efficiency; it is not measured directly. The insulation-loss step determines the current status of the thermal conductivity of the micro-porous insulation by matching the heat source and hot-end temperatures and by matching the temperature difference across the Kaowool insulation.

  5. Exploring Interactive and Dynamic Simulations Using a Computer Algebra System in an Advanced Placement Chemistry Course

    ERIC Educational Resources Information Center

    Matsumoto, Paul S.

    2014-01-01

    The article describes the use of Mathematica, a computer algebra system (CAS), in a high school chemistry course. Mathematica was used to generate a graph, where a slider controls the value of parameter(s) in the equation; thus, students can visualize the effect of the parameter(s) on the behavior of the system. Also, Mathematica can show the…

  6. Recent advances in computational aerodynamics

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh K.; Desse, Jerry E.

    1991-04-01

    The current state of the art in computational aerodynamics is described. Recent advances in the discretization of surface geometry, grid generation, and flow simulation algorithms have led to flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics is emerging as a crucial enabling technology for the development and design of flight vehicles. Examples illustrating the current capability for the prediction of aircraft, launch vehicle and helicopter flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  7. Recent advances in renal hypoxia: insights from bench experiments and computer simulations.

    PubMed

    Layton, Anita T

    2016-07-01

    The availability of oxygen in renal tissue is determined by the complex interactions among a host of processes, including renal blood flow, glomerular filtration, arterial-to-venous oxygen shunting, medullary architecture, Na(+) transport, and oxygen consumption. When this delicate balance is disrupted, the kidney may become susceptible to hypoxic injury. Indeed, renal hypoxia has been implicated as one of the major causes of acute kidney injury and chronic kidney diseases. This review highlights recent advances in our understanding of renal hypoxia; some of these studies were published in response to a recent Call for Papers of this journal: Renal Hypoxia. PMID:27147670

  8. Simultaneous Multiple-Jet Impacts in Concrete-Experiments and Advanced Computational Simulations

    SciTech Connect

    Baum, D.W.; Kuklo, R.M.; Routh, J.W.; Simonson, S.C.

    1999-08-12

    The simultaneous impact of multiple shaped-charge jets on a concrete target has been observed experimentally to lead to the formation of a larger and deeper entrance crater than would be expected from the superposition of the craters of the individual jets. The problem has been modeled with the 3-D simulation code ALE3D, running on massively parallel processors. These calculations indicate that the enlarged damage area is the result of tensile stresses caused by the interactions among the pressure waves simultaneously emanating from the three impact sites. This phenomenon has the potential for enhancing the penetration of a follow-on projectile.

  9. Using an Advance Organizer to Improve Knowledge Application by Medical Students in Computer-Based Clinical Simulations.

    ERIC Educational Resources Information Center

    Krahn, Corrie G.; Blanchaer, Marcel C.

    1986-01-01

    This study investigated the efficacy of using the advance organizer as a device to improve medical students' understanding of a clinical case simulation on the microcomputer and to enhance performance on a posttest. Advance organizers were found to be effective and most consistent with Mayer's assimilation theory. (MBR)

  10. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; and writing state-of-the-art monographs and NASA special publications on timely topics.

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  12. Advances in computational solvation thermodynamics

    NASA Astrophysics Data System (ADS)

    Wyczalkowski, Matthew A.

    The aim of this thesis is to develop improved methods for calculating the free energy, entropy and enthalpy of solvation from molecular simulations. Solvation thermodynamics of model compounds provides quantitative measurements used to analyze the stability of protein conformations in aqueous milieus. Solvation free energies govern the favorability of the solvation process, while entropy and enthalpy decompositions give insight into the molecular mechanisms by which the process occurs. Computationally, a coupling parameter lambda modulates solute-solvent interactions to simulate an insertion process, and multiple lengthy simulations at a fixed lambda value are typically required for free energy calculations to converge; entropy and enthalpy decompositions generally take 10-100 times longer. This thesis presents three advances which accelerate the convergence of such calculations: (1) Development of entropy and enthalpy estimators which combine data from multiple simulations; (2) Optimization of lambda schedules, or the set of parameter values associated with each simulation; (3) Validation of Hamiltonian replica exchange, a technique which swaps lambda values between two otherwise independent simulations. Taken together, these techniques promise to increase the accuracy and precision of free energy, entropy and enthalpy calculations. Improved estimates, in turn, can be used to investigate the validity and limits of existing solvation models and refine force field parameters, with the goal of understanding better the collapse transition and aggregation behavior of polypeptides.
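
    The coupling-parameter approach described above is commonly realized as thermodynamic integration: the solvation free energy is assembled as dF = integral from lambda = 0 to 1 of <dU/dlambda> at fixed lambda, approximated numerically over the discrete lambda schedule. A minimal sketch follows; the lambda values and the per-window <dU/dlambda> averages are toy numbers standing in for real simulation output, not data from the thesis:

```python
def free_energy_ti(lambdas, dudl_means):
    """Trapezoidal-rule thermodynamic integration:
    dF = integral_0^1 <dU/dlambda>_lambda dlambda,
    from per-window simulation averages of dU/dlambda."""
    df = 0.0
    windows = list(zip(lambdas, dudl_means))
    for (l0, g0), (l1, g1) in zip(windows, windows[1:]):
        df += 0.5 * (g0 + g1) * (l1 - l0)   # trapezoid over each interval
    return df

# Hypothetical 5-window lambda schedule and averages (kJ/mol):
lams = [0.0, 0.25, 0.5, 0.75, 1.0]
means = [-10.0, -6.0, -3.0, -1.0, 0.0]
dF = free_energy_ti(lams, means)   # -3.75 kJ/mol for these toy numbers
```

    Optimizing the lambda schedule, one of the advances the thesis pursues, corresponds here to clustering windows where <dU/dlambda> varies steeply so the quadrature error and per-window variance are balanced.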

  13. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  14. Learning through Computer Simulations.

    ERIC Educational Resources Information Center

    Braun, Ludwig

    Prior to the relatively easy access to computers which began in the mid-1960's, simulation was a tool only of researchers. Even now, students are frequently excluded from direct laboratory experiences for many reasons. However, computer simulation can open up these experiences, providing a powerful teaching tool for individuals, for small and…

  15. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    SciTech Connect

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. 
It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems
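
    Questions like the robustness of networks against failures or attacks, raised above, are often probed with simple component-size experiments: delete nodes and measure the largest surviving connected component. A minimal sketch on a hypothetical hub-and-spoke network (the topology and node counts are illustrative only, not from the program plan):

```python
from collections import deque

def largest_component(adj, removed):
    """Size of the largest connected component after deleting `removed` nodes (BFS)."""
    seen = set(removed)
    best = 0
    for start in adj:
        if start in seen:
            continue
        queue = deque([start])
        seen.add(start)
        size = 0
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

# Hub node 0 linked to spokes 1..9: fragile to targeted attack.
adj = {0: list(range(1, 10)), **{i: [0] for i in range(1, 10)}}
intact = largest_component(adj, set())       # 10: fully connected
spoke_fail = largest_component(adj, {3})     # 9: random failure, mild damage
hub_fail = largest_component(adj, {0})       # 1: hub removal fragments the net
```

    The asymmetry between random failure and targeted hub removal is exactly the kind of structural question areas (1) and (2) above would formalize.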

  16. Advances in time-domain electromagnetic simulation capabilities through the use of overset grids and massively parallel computing

    NASA Astrophysics Data System (ADS)

    Blake, Douglas Clifton

    A new methodology is presented for conducting numerical simulations of electromagnetic scattering and wave-propagation phenomena on massively parallel computing platforms. A process is constructed which is rooted in the Finite-Volume Time-Domain (FVTD) technique to create a simulation capability that is both versatile and practical. In terms of versatility, the method is platform independent, is easily modifiable, and is capable of solving a large number of problems with no alterations. In terms of practicality, the method is sophisticated enough to solve problems of engineering significance and is not limited to mere academic exercises. In order to achieve this capability, techniques are integrated from several scientific disciplines including computational fluid dynamics, computational electromagnetics, and parallel computing. The end result is the first FVTD solver capable of utilizing the highly flexible overset-gridding process in a distributed-memory computing environment. In the process of creating this capability, work is accomplished to conduct the first study designed to quantify the effects of domain-decomposition dimensionality on the parallel performance of hyperbolic partial differential equations solvers; to develop a new method of partitioning a computational domain comprised of overset grids; and to provide the first detailed assessment of the applicability of overset grids to the field of computational electromagnetics. Using these new methods and capabilities, results from a large number of wave propagation and scattering simulations are presented. The overset-grid FVTD algorithm is demonstrated to produce results of comparable accuracy to single-grid simulations while simultaneously shortening the grid-generation process and increasing the flexibility and utility of the FVTD technique. 
Furthermore, the new domain-decomposition approaches developed for overset grids are shown to be capable of producing partitions that are better load balanced and
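
    The effect of domain-decomposition dimensionality on parallel performance, which this work quantifies for hyperbolic PDE solvers, already shows up in a back-of-the-envelope communication estimate: a 1-D slab partition exchanges full grid faces, while a 3-D block partition exchanges much smaller ones. A sketch under simplifying assumptions (cubic grid, perfect-cube rank count, interior ranks only; not the dissertation's actual metric):

```python
def slab_surface(n):
    """1-D (slab) decomposition of an n^3 grid: each interior rank
    exchanges two full n*n faces per step."""
    return 2 * n * n

def block_surface(n, p):
    """3-D (block) decomposition over p = q^3 ranks: each interior rank
    exchanges six (n/q)^2 faces per step."""
    q = round(p ** (1.0 / 3.0))
    assert q * q * q == p, "p must be a perfect cube for this sketch"
    face = n // q
    return 6 * face * face

# 512^3 grid on 64 ranks: blocks communicate far less per rank than slabs.
n, p = 512, 64
s1 = slab_surface(n)        # 524288 cells exchanged per rank
s3 = block_surface(n, p)    # 98304 cells exchanged per rank
```

    The crossover favors higher-dimensional decompositions as rank counts grow, which is consistent with the motivation for partitioning overset grids in more than one dimension.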

  17. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  18. Advanced Computing for Science.

    ERIC Educational Resources Information Center

    Hut, Piet; Sussman, Gerald Jay

    1987-01-01

    Discusses some of the contributions that high-speed computing is making to the study of science. Emphasizes the use of computers in exploring complicated systems without the simplification required in traditional methods of observation and experimentation. Provides examples of computer assisted investigations in astronomy and physics. (TW)

  19. Hybrid deterministic and stochastic x-ray transport simulation for transmission computed tomography with advanced detector noise model

    NASA Astrophysics Data System (ADS)

    Popescu, Lucretiu M.

    2016-03-01

    We present a model for simulation of noisy X-ray computed tomography data sets. The model is made of two main components: a photon transport simulation component that generates the noiseless photon field incident on the detector, and a detector response model that takes as input the incident photon field parameters and, given the X-ray source intensity and exposure time, can generate noisy data sets accordingly. The photon transport simulation component combines direct ray-tracing of polychromatic X-rays for calculation of transmitted data with Monte Carlo simulation for calculation of the scattered-photon data. The Monte Carlo scatter simulation is accelerated by implementing particle splitting and importance sampling variance reduction techniques. The detector-incident photon field data are stored as energy expansion coefficients on a refined grid that covers the detector area. From these data the detector response model is able to generate noisy detector data realizations by reconstituting the main parameters that describe each detector element's response in statistical terms, including spatial correlations. The model is able to generate CT data sets very fast, on the fly, corresponding to different radiation doses as well as detector response characteristics, facilitating data management in extensive optimization studies by reducing the computation time and storage space demands.
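
    Particle splitting, one of the variance-reduction techniques cited above, can be illustrated on a toy two-layer attenuation problem: each particle surviving the first layer is split into k copies of weight 1/k before the rarely-reached second layer, which leaves the expected value unchanged while putting more samples into the deep region. This is a minimal sketch, not the paper's transport code; the attenuation values are made up:

```python
import math
import random

def transmission_with_splitting(p1, p2, n, k, rng):
    """Estimate the two-layer transmission probability p1 * p2 by Monte Carlo.
    Particles surviving layer 1 split into k copies of weight 1/k at the
    interface; the estimator stays unbiased while variance in the rarely
    sampled second layer is reduced."""
    total = 0.0
    for _ in range(n):
        if rng.random() < p1:          # history survives layer 1
            for _ in range(k):         # split: k weighted copies enter layer 2
                if rng.random() < p2:
                    total += 1.0 / k
    return total / n

# Hypothetical attenuation exp(-mu*L) per layer, with mu*L = 0.5 and 2.0.
p1, p2 = math.exp(-0.5), math.exp(-2.0)
rng = random.Random(42)
est = transmission_with_splitting(p1, p2, n=20000, k=10, rng=rng)
exact = p1 * p2   # about 0.082
```

    Importance sampling plays a complementary role in the paper's scatter calculation, biasing scatter events toward the detector and correcting with weights in the same spirit.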

  20. Advanced simulation of digital filters

    NASA Astrophysics Data System (ADS)

    Doyle, G. S.

    1980-09-01

    An Advanced Simulation of Digital Filters (ASDF) package has been implemented on the IBM 360/67 computer utilizing Tektronix hardware and software. The program package is appropriate for use by persons beginning their study of digital signal processing or for filter analysis. The ASDF programs provide the user with an interactive method by which filter pole and zero locations can be manipulated. Graphical output on both the Tektronix graphics screen and the Versatec plotter is provided to observe the effects of pole-zero movement.
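
    Pole-zero manipulation of the kind ASDF offered reduces to evaluating the transfer function H(z) = gain * prod(z - z_k) / prod(z - p_k) around the unit circle. A minimal sketch with a hypothetical notch filter follows; the pole/zero placement is illustrative and unrelated to the original Tektronix programs:

```python
import cmath
import math

def freq_response(zeros, poles, gain, omega):
    """Evaluate H(e^{j*omega}) for a digital filter given its zeros and poles:
    H(z) = gain * prod(z - z_k) / prod(z - p_k)."""
    z = cmath.exp(1j * omega)
    num = complex(gain)
    for zk in zeros:
        num *= (z - zk)
    den = complex(1.0)
    for pk in poles:
        den *= (z - pk)
    return num / den

# Notch at omega = pi/2: zeros on the unit circle at +/-j,
# poles just inside at radius 0.9 to keep the notch narrow.
zeros = [1j, -1j]
poles = [0.9j, -0.9j]
h_notch = abs(freq_response(zeros, poles, 1.0, math.pi / 2))  # ~0: tone removed
h_dc = abs(freq_response(zeros, poles, 1.0, 0.0))             # passband near 1
```

    Dragging a zero off the unit circle in a program like ASDF corresponds to h_notch rising from zero, which is the effect the interactive plots would display.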

  1. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in detailed, high-quality aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between

  2. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  3. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  4. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  5. Advanced computer languages

    SciTech Connect

    Bryce, H.

    1984-05-03

    If software is to become an equal partner in the so-called fifth generation of computers, which of course it must, programming languages and the human interface will need to clear some high hurdles. Again, the solutions being sought turn to cerebral emulation: here, the way that human beings understand language. The result would be natural or English-like languages that would allow a person to communicate with a computer much as he or she does with another person. The authors discuss fourth- and fifth-level languages used in pursuing the goal of AI; the higher-level languages aim to be nonprocedural. Applications of LISP and Forth to natural-language interfaces are described, as well as programs such as the Natural Link technology package, written in C.

  6. Advances and trends in computational structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Venneri, S. L.

    1990-01-01

    The major goals of computational structures technology (CST) are outlined, and recent advances in CST are examined. These include computational material modeling, stochastic-based modeling, computational methods for articulated structural dynamics, strategies and numerical algorithms for new computing systems, and multidisciplinary analysis and optimization. The role of CST in the future development of structures technology and the multidisciplinary design of future flight vehicles is addressed, and the future directions of CST research in the prediction of failures of structural components, the solution of large-scale structural problems, and quality assessment and control of numerical simulations are discussed.

  7. Advancing manufacturing through computational chemistry

    SciTech Connect

    Noid, D.W.; Sumpter, B.G.; Tuzun, R.E.

    1995-12-31

    The capabilities of nanotechnology and computational chemistry are reaching a point of convergence. New computer hardware and novel computational methods have created opportunities to test proposed nanometer-scale devices, investigate molecular manufacturing, and model and predict properties of new materials. Experimental methods are also beginning to provide new capabilities that make the possibility of manufacturing various devices with atomic precision tangible. In this paper, we will discuss some of the novel computational methods we have used in molecular dynamics simulations of polymer processes, neural network predictions of new materials, and simulations of proposed nano-bearings and fluid dynamics in nano-sized devices.

  8. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2008-07-01

    The design and performance optimization of particle accelerators is essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC1 Accelerator Science and Technology project, the SciDAC2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  9. Community petascale project for accelerator science and simulation : Advancing computational science for future accelerators and accelerator technologies.

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L. C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  10. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  12. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    SciTech Connect

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  14. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short of providing real-time, predictive information for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  15. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    SciTech Connect

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under arbitrary space/time dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
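The "label every interaction" idea can be illustrated with a toy one-dimensional photon-transport Monte Carlo in Python. This is a hypothetical sketch, not the code described in the abstract; the interaction coefficients and slab thickness are invented for illustration.

```python
import random

MU_PHOTO, MU_COMPTON = 0.3, 0.7   # illustrative interaction coefficients (1/cm)
MU_TOTAL = MU_PHOTO + MU_COMPTON
SLAB = 5.0                        # slab thickness (cm)

def track_photon(rng):
    """Follow one photon through the slab; label each interaction."""
    x, history = 0.0, []
    while True:
        x += rng.expovariate(MU_TOTAL)         # sample a free path length
        if x >= SLAB:
            history.append((SLAB, "crosser"))  # left the slab: a "crosser"
            return history
        if rng.random() < MU_PHOTO / MU_TOTAL:
            history.append((x, "photoelectric"))
            return history                     # absorbed: track ends
        history.append((x, "compton"))         # scattered: keep tracking
        # (toy model: scattering changes neither direction nor energy)

rng = random.Random(42)
tracks = [track_photon(rng) for _ in range(10000)]
crossers = sum(t[-1][1] == "crosser" for t in tracks)
print(f"fraction crossing without absorption: {crossers / len(tracks):.3f}")
```

Because Compton events in this toy model neither absorb the photon nor change its direction, the crossing fraction should approach exp(-MU_PHOTO * SLAB) ≈ 0.22, which makes the sketch easy to sanity-check.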

  16. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

    In a computer simulation study of earthquakes a seismically active strike slip fault is represented by coupled mechanical blocks which are driven by a moving plate and which slide on a friction surface. Elastic forces and time independent friction are used to generate main shock events, while viscoelastic forces and time dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically reoccurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement and between main shock and aftershock displacements.
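The coupled-block picture above can be sketched with a minimal stress-redistribution cellular model in the spirit of Olami-Feder-Christensen slider-block automata. This is not the paper's model; the block count, coupling fraction, and near-homogeneous strength distribution are all illustrative.

```python
import numpy as np

N = 100
rng = np.random.default_rng(1)
threshold = 1.0 + 0.05 * rng.random(N)    # near-homogeneous friction strength
stress = 0.5 * threshold * rng.random(N)  # random sub-critical initial stress
alpha = 0.4                               # stress fraction passed to each neighbour
events = []                               # blocks slipping per loading event

for _ in range(2000):
    stress += (threshold - stress).min()  # slow uniform "tectonic" loading
    size = 0
    while True:
        unstable = np.where(stress >= threshold)[0]
        if unstable.size == 0:
            break
        for i in unstable:
            drop = stress[i]
            stress[i] = 0.0               # block slips, releasing its stress
            if i > 0:
                stress[i - 1] += alpha * drop
            if i < N - 1:
                stress[i + 1] += alpha * drop
            size += 1                     # slips may cascade into neighbours
    events.append(size)

print("largest cascade:", max(events), "blocks; mean:", np.mean(events))
```

Because some stress is lost at every slip (2 * alpha < 1), cascades always terminate; varying the spread of `threshold` mimics the paper's contrast between near-homogeneous and strongly heterogeneous faults.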

  17. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate. The blocks slide on a friction surface. In the first model, elastic forces and time-independent friction were employed to simulate main shock events. The size, length, and time and place of event occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically reoccurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.

  18. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
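The efficiency argument, that one forward sweep plus one backward (adjoint) sweep yields the gradient with respect to every variable, can be seen in a linear toy problem. This is a sketch only, not the optical-tomography code: the explicit heat-equation forward model, grid sizes, and synthetic data are all invented.

```python
import numpy as np

n, steps, dt, D = 50, 100, 0.2, 1.0
# Explicit 1-D heat-equation step matrix A = I + dt*D*L (L: discrete Laplacian)
L = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
A = np.eye(n) + dt * D * L

def forward(u0):
    u = u0.copy()
    for _ in range(steps):
        u = A @ u
    return u

data = forward(np.sin(np.linspace(0.0, np.pi, n)))  # synthetic "measurements"

def objective_and_gradient(u0):
    u = forward(u0)
    r = u - data                   # residual against the data
    J = 0.5 * r @ r                # least-squares objective
    lam = r.copy()                 # adjoint (costate) variable
    for _ in range(steps):         # ONE backward sweep gives the whole gradient
        lam = A.T @ lam
    return J, lam                  # lam == dJ/du0, all n components at once

u0 = np.zeros(n)
J, g = objective_and_gradient(u0)

# Spot-check one component against a finite difference
eps, i = 1e-6, 10
e = np.zeros(n); e[i] = eps
J2, _ = objective_and_gradient(u0 + e)
print("adjoint:", g[i], " finite-diff:", (J2 - J) / eps)
```

The point of the sketch is the cost: the adjoint sweep prices the full n-component gradient at roughly one extra forward solve, whereas finite differences would need n of them.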

  19. Computational Models of Human Performance: Validation of Memory and Procedural Representation in Advanced Air/Ground Simulation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Labacqz, J. Victor (Technical Monitor)

    1997-01-01

    The Man-Machine Interaction Design and Analysis System (MIDAS), developed under a joint U.S. Army and NASA cooperative agreement, is intended to assist designers of complex human/automation systems in successfully incorporating human performance capabilities and limitations into decision and action support systems. MIDAS is a computational representation of multiple human operators, selected perceptual, cognitive, and physical functions of those operators, and the physical/functional representation of the equipment with which they operate. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. We have extended the human performance models to include representation of both human operators and intelligent aiding systems in flight management and air traffic service. The focus of this development is to predict human performance in response to aiding systems developed to identify aircraft conflicts and to assist in the shared authority for resolution. The demands of this application require representation of many intelligent agents sharing world-models, coordinating action/intention, and cooperatively scheduling goals and action in a somewhat unpredictable world of operations. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper, we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication

  20. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit radiation hardened, SEU tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  1. Advanced Wellbore Thermal Simulator

    1992-03-04

    GEOTEMP2, which is based on the earlier GEOTEMP program, is a wellbore thermal simulator designed for geothermal well drilling and production applications. The code treats natural and forced convection and conduction within the wellbore and heat conduction within the surrounding rock matrix. A variety of well operations can be modeled, including injection, production, forward and reverse circulation with gas or liquid, gas or liquid drilling, and two-phase steam injection and production. Well completion with several different casing sizes and cement intervals can be modeled. The code allows variables, such as flow rate, to change with time, enabling a realistic treatment of well operations. Provision is made in the flow equations to allow the flow areas of the tubing to vary with depth in the wellbore. Multiple liquids can exist in GEOTEMP2 simulations. Liquid interfaces are tracked through the tubing and annulus as one liquid displaces another. GEOTEMP2, however, does not attempt to simulate displacement of liquids with a gas or two-phase steam or vice versa. This means that it is not possible to simulate an operation where the type of drilling fluid changes, e.g., mud going to air. GEOTEMP2 was designed primarily for use in predicting the behavior of geothermal wells, but it is flexible enough to handle many typical drilling, production, and injection problems in the oil industry as well. However, GEOTEMP2 does not allow the modeling of gas-filled annuli in production or injection problems. In gas or mist drilling, no radiation losses are included in the energy balance. No attempt is made to model flow in the formation. Average execution time is 50 CP seconds on a CDC CYBER170. This edition of GEOTEMP2 is designated as Version 2.0 by the contributors.

  2. r.avaflow: An advanced open source computational framework for the GIS-based simulation of two-phase mass flows and process chains

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Fischer, Jan-Thomas; Fellin, Wolfgang; Ostermann, Alexander; Pudasaini, Shiva P.

    2015-04-01

    Geophysical mass flows stand for a broad range of processes and process chains such as flows and avalanches of snow, soil, debris or rock, and their interactions with water bodies resulting in flood waves. Despite considerable efforts put into model development, the simulation, and therefore the appropriate prediction, of these types of events remains a major challenge owing to the complex material behaviour, strong phase interactions, process transformations and the complex mountain topography. Sophisticated theories exist, but they have hardly been brought to practice yet. We fill this gap by developing a novel and unified high-resolution computational tool, r.avaflow, representing a comprehensive and advanced open source GIS simulation environment for geophysical mass flows. Based on the latest and most advanced two-phase physical-mathematical models, r.avaflow includes the following features: (i) it is suitable for a broad spectrum of mass flows such as rock, rock-ice and snow avalanches, glacial lake outburst floods, debris and hyperconcentrated flows, and even landslide-induced tsunamis and submarine landslides, as well as process chains involving more than one of these phenomena; (ii) it accounts for the real two-phase nature of many flow types: viscous fluids and solid particles are considered separately with advanced mechanics and strong phase interactions; (iii) it is freely available and adoptable along with the GRASS GIS software. In the future, it will include the intrinsic topographic influences on the flow dynamics and morphology as well as an advanced approach to simulate the entrainment and deposition of solid and fluid material. As input r.avaflow needs information on (a) the mountain topography, (b) the material properties and (c) the spatial distribution of the solid and fluid release masses or one or more hydrographs of fluid and solid material.
We demonstrate the functionalities and performance of r.avaflow by using some generic and real

  3. Perspective: Computer simulations of long time dynamics

    PubMed Central

    Elber, Ron

    2016-01-01

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  4. Advanced numerical methods for the simulation of flows in heterogeneous porous media and their application to parallel computing

    SciTech Connect

    Rame, M.

    1990-01-01

    Flows in highly heterogeneous porous media arise in a variety of processes including enhanced oil recovery, in situ bioremediation of underground contaminants, transport in underground aquifers and transport through biological membranes. The common denominator of these processes is the transport (and possibly reaction) of a multi-component fluid in several phases. A new numerical methodology for the analysis of flows in heterogeneous porous media is presented. Cases of miscible and immiscible displacement are simulated to investigate the influence of the local heterogeneities on the flow paths. This numerical scheme allows for a fine description of the flowing medium and the concentration and saturation distributions thus generated show low numerical dispersion. If the size of the area of interest is a square of a thousand feet per side, geological information on the porous medium can be incorporated to a length scale of about one to two feet. The technique here introduced, Operator Splitting on Multiple Grids, solves the elliptic operators by a higher-order finite-element technique on a coarse grid that proves efficient and accurate in incorporating different scales of heterogeneities. This coarse solution is interpolated to a fine grid by a splines-under-tension technique. The equations for the conservation of species are solved on this fine grid (of approximately half a million cells) by a finite-difference technique yielding numerical dispersions of less than ten feet. Cases presented herein involve a single phase miscible flow, and liquid-phase immiscible displacements. Cases are presented for model distributions of physical properties and for porosity and permeability data taken from a real reservoir. Techniques for the extension of the methods to compressible flow situations and compositional simulations are discussed.
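The coarse-solve-then-interpolate step of the multiple-grids idea can be sketched in one dimension. This is illustrative only: linear interpolation stands in for the splines-under-tension of the paper, and the Poisson model problem, grid sizes, and source term are invented.

```python
import numpy as np

nc, nf = 17, 129                       # coarse and fine grid points

# Solve -u'' = f on [0, 1] with u(0) = u(1) = 0 on the COARSE grid
hc = 1.0 / (nc - 1)
xc = np.linspace(0.0, 1.0, nc)
f = np.sin(np.pi * xc)
A = (2 * np.eye(nc - 2) - np.eye(nc - 2, k=1) - np.eye(nc - 2, k=-1)) / hc**2
u_coarse = np.zeros(nc)
u_coarse[1:-1] = np.linalg.solve(A, f[1:-1])   # direct elliptic solve

# Interpolate the coarse elliptic solution onto the FINE grid, where a
# transport (conservation) equation would then be advanced cheaply
xf = np.linspace(0.0, 1.0, nf)
u_fine = np.interp(xf, xc, u_coarse)

exact = np.sin(np.pi * xf) / np.pi**2          # analytic solution for reference
print("max error on fine grid:", np.abs(u_fine - exact).max())
```

The expensive elliptic operator is inverted only on the small coarse grid; the fine grid sees just an interpolation, which is the cost structure the abstract describes.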

  5. Grid computing and biomolecular simulation.

    PubMed

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
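The Replica Exchange methodology mentioned above can be sketched with parallel tempering on a one-dimensional double-well potential instead of a protein. The potential, temperature ladder, and step sizes below are invented for illustration.

```python
import math, random

def U(x):                      # double well with minima near x = +/- 1
    return (x * x - 1.0) ** 2

temps = [0.1, 0.3, 1.0, 3.0]   # temperature ladder, cold to hot
rng = random.Random(7)
xs = [1.0] * len(temps)        # one walker (replica) per temperature

def metropolis(x, T):
    """One Metropolis trial move at temperature T."""
    y = x + rng.gauss(0.0, 0.5)
    if rng.random() < math.exp(min(0.0, -(U(y) - U(x)) / T)):
        return y
    return x

visits_left = 0                # how often the COLD replica sits in the left well
for sweep in range(20000):
    xs = [metropolis(x, T) for x, T in zip(xs, temps)]
    i = rng.randrange(len(temps) - 1)            # attempt a neighbour swap
    delta = (1 / temps[i] - 1 / temps[i + 1]) * (U(xs[i]) - U(xs[i + 1]))
    if rng.random() < math.exp(min(0.0, delta)):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]      # exchange configurations
    visits_left += xs[0] < 0.0

print("cold-replica fraction in left well:", visits_left / 20000)
```

At T = 0.1 a lone walker would essentially never cross the barrier; the swaps let barrier crossings made by the hot replica propagate down the ladder, which is exactly why the method suits conformational-change problems.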

  6. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer based simulations and avoid costly experiments.

  7. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  8. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  9. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  10. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  11. Enabling computational technologies for subsurface simulations

    SciTech Connect

    Falgout, R D

    1999-02-22

    We collaborated with Environmental Programs to develop and apply advanced computational methodologies for simulating multiphase flow through heterogeneous porous media. The primary focus was on developing a fast accurate advection scheme using a new temporal subcycling technique and on the scalable and efficient solution of the nonlinear Richards' equation used to model two-phase (variably saturated) flow. The resulting algorithms can be orders-of-magnitude faster than existing methods. Our computational technologies were applied to the simulation of subsurface fluid flow and chemical transport in the context of two important applications: water resource management and groundwater remediation.
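The temporal-subcycling idea, several small CFL-limited advection steps inside one large outer step, can be sketched for a 1-D advected pulse. The first-order upwind scheme, grid, and step sizes below are illustrative and are not the paper's algorithm.

```python
import numpy as np

n, v = 200, 1.0
dx = 1.0 / n
dt_outer = 4.0 * dx / v        # the outer step exceeds the advective CFL limit
nsub = 5                       # advection substeps per outer step
dt_sub = dt_outer / nsub       # v * dt_sub / dx = 0.8 <= 1: substeps are stable

x = np.linspace(0.0, 1.0, n, endpoint=False)
c0 = np.exp(-200.0 * (x - 0.3) ** 2)   # initial concentration pulse
c = c0.copy()

def upwind_step(c, dt):
    """One periodic first-order upwind advection step."""
    return c - v * dt / dx * (c - np.roll(c, 1))

for _ in range(40):            # outer loop; in a flow code the expensive
    for _ in range(nsub):      # pressure/saturation solve would sit here
        c = upwind_step(c, dt_sub)

print("mass change after subcycled advection:", abs(c.sum() - c0.sum()))
```

Upwind differencing conserves total mass exactly on a periodic grid and stays positivity-preserving as long as v * dt_sub / dx <= 1; the outer step alone would violate that bound, which is the motivation for subcycling.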

  12. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)
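A point-substitution simulation of this kind can be sketched in a few lines of Python (the program described above was FORTRAN; the toy gene, codon-table fragment, and mutation classification below are illustrative only):

```python
import random

# Fragment of the standard codon table; unknown codons map to "Xaa"
CODON = {"ATG": "Met", "TTT": "Phe", "TTC": "Phe", "TGG": "Trp",
         "TAA": "Stop", "TAG": "Stop", "TGA": "Stop", "TTA": "Leu"}

def translate(dna):
    return [CODON.get(dna[i:i + 3], "Xaa") for i in range(0, len(dna) - 2, 3)]

def point_mutate(dna, rng):
    """Substitute one random base and classify the mutation's effect."""
    i = rng.randrange(len(dna))
    new = rng.choice([b for b in "ACGT" if b != dna[i]])
    mutant = dna[:i] + new + dna[i + 1:]
    before, after = translate(dna), translate(mutant)
    if after == before:
        kind = "silent"                       # same protein sequence
    elif "Stop" in after and "Stop" not in before:
        kind = "nonsense"                     # premature stop codon
    else:
        kind = "missense"                     # changed amino acid
    return mutant, kind

rng = random.Random(3)
gene = "ATGTTTTGGTTC"                         # Met-Phe-Trp-Phe
counts = {}
for _ in range(1000):
    _, kind = point_mutate(gene, rng)
    counts[kind] = counts.get(kind, 0) + 1
print(counts)
```

Running many trials and tallying silent, missense, and nonsense outcomes is the kind of exercise the abstract describes for teaching the structure of the genetic code.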

  13. Composite Erosion by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed by a multi-factor interaction model and by an available multi-scale, multi-physics computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero, while the void volume ratio increases, as the ply degrades. The multi-factor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by the computational composite mechanics show that most composite characterization properties degrade monotonically and approach “zero” as the ply degrades completely.

  14. Advanced Potential Energy Surfaces for Molecular Simulation.

    PubMed

    Albaugh, Alex; Boateng, Henry A; Bradshaw, Richard T; Demerdash, Omar N; Dziedzic, Jacek; Mao, Yuezhi; Margul, Daniel T; Swails, Jason; Zeng, Qiao; Case, David A; Eastman, Peter; Wang, Lee-Ping; Essex, Jonathan W; Head-Gordon, Martin; Pande, Vijay S; Ponder, Jay W; Shao, Yihan; Skylaris, Chris-Kriton; Todorov, Ilian T; Tuckerman, Mark E; Head-Gordon, Teresa

    2016-09-22

    Advanced potential energy surfaces are defined as theoretical models that explicitly include many-body effects that transcend the standard fixed-charge, pairwise-additive paradigm typically used in molecular simulation. However, several factors relating to their software implementation have precluded their widespread use in condensed-phase simulations: the computational cost of the theoretical models, a paucity of approximate models and algorithmic improvements that can ameliorate their cost, underdeveloped interfaces and limited dissemination in computational code bases that are widely used in the computational chemistry community, and software implementations that have not kept pace with modern high-performance computing (HPC) architectures, such as multicore CPUs and modern graphics processing units (GPUs). In this Feature Article we review recent progress made in these areas, including well-defined polarization approximations and new multipole electrostatic formulations, novel methods for solving the mutual polarization equations and increasing the MD time step, combining linear-scaling electronic structure methods with new QM/MM methods that account for mutual polarization between the two regions, and the greatly improved software deployment of these models and methods onto GPU and CPU hardware platforms. We have now approached an era where multipole-based polarizable force fields can be routinely used to obtain computational results comparable to state-of-the-art density functional theory while reaching sampling statistics that are acceptable when compared to that obtained from simpler fixed partial charge force fields.
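The mutual polarization equations mentioned above have the fixed-point form mu = alpha * (E0 + T mu), conventionally solved self-consistently. A schematic scalar SCF iteration is sketched below; real polarizable force fields use full 3x3 dipole interaction tensors, and every number here is invented.

```python
import numpy as np

pos = np.array([0.0, 1.5, 3.2, 4.3])   # collinear polarizable sites (arbitrary units)
alpha, E0 = 0.3, 1.0                   # polarizability and external field

n = len(pos)
T = np.zeros((n, n))                   # schematic scalar dipole coupling ~ 1/r^3
for i in range(n):
    for j in range(n):
        if i != j:
            T[i, j] = 2.0 / abs(pos[i] - pos[j]) ** 3

mu = np.zeros(n)
for it in range(200):                  # fixed-point (SCF) iteration
    mu_new = alpha * (E0 + T @ mu)     # each dipole responds to all others
    converged = np.abs(mu_new - mu).max() < 1e-10
    mu = mu_new
    if converged:
        break

# Reference: solve (I - alpha*T) mu = alpha*E0 directly
direct = np.linalg.solve(np.eye(n) - alpha * T, alpha * E0 * np.ones(n))
print("iterations:", it, " max |SCF - direct|:", np.abs(mu - direct).max())
```

The iteration converges because the coupling here is weak (spectral radius of alpha * T below one); accelerating exactly this kind of solve is one of the algorithmic advances the article reviews.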

  16. Computer Simulation and ESL Reading.

    ERIC Educational Resources Information Center

    Wu, Mary A.

    It is noted that although two approaches to second language instruction--the communicative approach emphasizing genuine language use and computer assisted instruction--have come together in the form of some lower level reading instruction materials for English as a second language (ESL), advanced level ESL reading materials using computer…

  17. Remote access of the ILLIAC 4. [computer flow distribution simulations

    NASA Technical Reports Server (NTRS)

    Stevens, K. G., Jr.

    1975-01-01

    The ILLIAC-4 hardware is described. The Illiac system, the Advanced Research Projects Agency computer network, and IMLAC PDS-1 are included. The space shuttle flow simulation is demonstrated to show the feasibility of using an advanced computer from a remote location.

  18. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  19. A Guide to Computer Simulations of Three Adaptive Instructional Models for the Advanced Instructional System Phases II and III. Final Report.

    ERIC Educational Resources Information Center

    Hansen, Duncan N.; And Others

    Computer simulations of three individualized adaptive instructional models (AIM) were undertaken to determine if these models function as prescribed in Air Force technical training programs. In addition, the project sought to develop a user's guide for effective understanding of adaptive models during field implementation. Successful simulations…

  20. Computer Simulation of Aircraft Aerodynamics

    NASA Technical Reports Server (NTRS)

    Inouye, Mamoru

    1989-01-01

    The role of Ames Research Center in conducting basic aerodynamics research through computer simulations is described. The computer facilities, including supercomputers and peripheral equipment that represent the state of the art, are described. The methodology of computational fluid dynamics is explained briefly. Fundamental studies of turbulence and transition are being pursued to understand these phenomena and to develop models that can be used in the solution of the Reynolds-averaged Navier-Stokes equations. Four applications of computer simulations for aerodynamics problems are described: subsonic flow around a fuselage at high angle of attack, subsonic flow through a turbine stator-rotor stage, transonic flow around a flexible swept wing, and transonic flow around a wing-body configuration that includes an inlet and a tail.

  1. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  2. Taxis through Computer Simulation Programs.

    ERIC Educational Resources Information Center

    Park, David

    1983-01-01

Describes a sequence of five computer programs (listings for Apple II available from author) on tactic responses (oriented movement of a cell, cell group, or whole organism in response to stimuli). The simulation programs are useful in helping students examine mechanisms at work in real organisms. (JN)
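A tactic response of the kind these programs model can be sketched as a biased random walk toward a stimulus; this is a hypothetical reconstruction of the idea, not Park's Apple II listings:

```python
import random

# Minimal chemotaxis-style sketch: an organism on a line steps toward a
# stimulus at x = 0 with probability `bias`, otherwise away from it.
def simulate(steps, bias=0.7, seed=1):
    rng = random.Random(seed)
    x = 50.0                             # start 50 units from the stimulus
    for _ in range(steps):
        toward = -1.0 if x > 0 else 1.0  # unit step toward the stimulus
        x += toward if rng.random() < bias else -toward
    return x

final = simulate(2000)                   # drifts to, then hovers near, x = 0
```

With bias > 0.5 the walk reliably accumulates near the stimulus, the essential mechanism of positive taxis.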

  3. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows user to experiment with different shaped multiple apertures. Graphics output include vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
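The vector-chaining idea the abstract describes adds one unit phasor per slit, with a fixed phase step between adjacent slits. A minimal sketch for an N-slit Fraunhofer pattern (parameters illustrative, not from Dodd's program):

```python
import numpy as np

# Phasor (vector) chaining for N-slit Fraunhofer diffraction: the phase
# difference between adjacent slits is delta = 2*pi*d*sin(theta)/lambda.
def intensity(n_slits, d_over_lambda, theta):
    delta = 2 * np.pi * d_over_lambda * np.sin(theta)
    phasors = np.exp(1j * delta * np.arange(n_slits))  # one per slit
    return abs(phasors.sum()) ** 2

# At theta = 0 all phasors align, giving the principal maximum N**2.
peak = intensity(4, 2.0, 0.0)            # = 16
```

Zeros occur where the chained phasors close into a polygon, i.e. where delta is a multiple of 2*pi/N.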

  4. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  5. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  6. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  7. Computational Challenges in Nuclear Weapons Simulation

    SciTech Connect

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  8. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  9. Advanced algorithm for orbit computation

    NASA Technical Reports Server (NTRS)

Szebehely, V.

    1983-01-01

    Computational and analytical techniques which simplify the solution of complex problems in orbit mechanics, Astrodynamics and Celestial Mechanics were developed. The major tool of the simplification is the substitution of transformations in place of numerical or analytical integrations. In this way the rather complicated equations of orbit mechanics might sometimes be reduced to linear equations representing harmonic oscillators with constant coefficients.
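One classical transformation of this kind is the Levi-Civita regularization, which maps the planar Kepler problem onto a constant-coefficient harmonic oscillator. The sketch below uses that standard textbook device; it is not necessarily the report's specific transformation:

```python
import numpy as np

# Levi-Civita regularization of the planar Kepler problem (mu = 1):
# with z = x + i*y = u**2 and fictitious time ds = dt / r, the equation of
# motion becomes u'' = (E/2) u, a harmonic oscillator with constant
# coefficients (E < 0 for a bound orbit).
E = -0.5                                 # energy of a circular orbit, r = 1
omega = np.sqrt(-E / 2)                  # oscillator frequency, here 0.5
u0, v0 = 1.0 + 0.0j, 0.5j               # u(0), u'(0) for that orbit

s = np.linspace(0.0, 4 * np.pi, 400)    # fictitious time
u = u0 * np.cos(omega * s) + (v0 / omega) * np.sin(omega * s)
z = u ** 2                               # map back to physical coordinates
r = np.abs(z)                            # radius should stay exactly 1
```

The "integration" reduces to evaluating sines and cosines, which is the simplification the abstract describes.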

  10. Software Framework for Advanced Power Plant Simulations

    SciTech Connect

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
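The "trained" reduced-order models described above can be illustrated with a toy surrogate: sample an expensive model at a few operating points, fit a cheap polynomial, then evaluate the fit in place of the full simulation. The CFD stand-in and parameter values here are invented for illustration:

```python
import numpy as np

# Stand-in for a CFD run: efficiency as a function of plant load (%).
def expensive_cfd(load):
    return 0.9 - 0.002 * (load - 80.0) ** 2

# "Train" a reduced-order model (ROM) from a handful of sample runs.
samples = np.array([60.0, 70.0, 80.0, 90.0, 100.0])
results = np.array([expensive_cfd(x) for x in samples])
rom = np.polynomial.Polynomial.fit(samples, results, deg=2)

pred = rom(75.0)     # fast surrogate evaluation inside a flowsheet loop
```

Real APECS ROMs are built from genuine CFD results, but the workflow, sample, fit, then query cheaply, is the same.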

  11. Computer simulation of liquid metals

    NASA Astrophysics Data System (ADS)

    Belashchenko, D. K.

    2013-12-01

    Methods for and the results of the computer simulation of liquid metals are reviewed. Two basic methods, classical molecular dynamics with known interparticle potentials and the ab initio method, are considered. Most attention is given to the simulated results obtained using the embedded atom model (EAM). The thermodynamic, structural, and diffusion properties of liquid metal models under normal and extreme (shock) pressure conditions are considered. Liquid-metal simulated results for the Groups I - IV elements, a number of transition metals, and some binary systems (Fe - C, Fe - S) are examined. Possibilities for the simulation to account for the thermal contribution of delocalized electrons to energy and pressure are considered. Solidification features of supercooled metals are also discussed.
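The embedded atom model mentioned above writes the total energy as an embedding term plus pair terms, E = sum_i F(rho_i) + (1/2) sum_{i != j} phi(r_ij). A toy sketch with made-up functional forms (real EAM potentials are fitted to experimental and ab initio data):

```python
import numpy as np

def pair(r):                 # illustrative repulsive pair term phi(r)
    return np.exp(-2.0 * (r - 1.0))

def density(r):              # illustrative host-density contribution
    return np.exp(-r)

def embed(rho):              # illustrative embedding function F(rho)
    return -np.sqrt(rho)

def eam_energy(positions):
    n = len(positions)
    total = 0.0
    for i in range(n):
        rho = 0.0
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            rho += density(r)         # accumulate electron density at i
            total += 0.5 * pair(r)    # half to avoid double counting
        total += embed(rho)
    return total

pos = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [0.55, 0.9, 0.0]])
energy = eam_energy(pos)
```

The many-body character enters through F(rho): the energy of an atom depends on its full environment, not just on pair distances.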

  12. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  13. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  14. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  15. Computer simulation of martensitic transformations

    SciTech Connect

    Xu, Ping

    1993-11-01

The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and is simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
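The athermal scheme described, transforming at each step the cell that maximizes the free-energy decrease, can be caricatured with a greedy loop over a grid. The energy terms below are invented stand-ins for the paper's linear-elasticity calculation:

```python
import numpy as np

# Greedy athermal transformation on a periodic grid (toy energies): at each
# step transform the untransformed cell with the largest energy decrease,
# and stop when no transformation lowers the free energy.
rng = np.random.default_rng(0)
n = 8
chem = -1.0                                  # chemical driving force / cell
site = rng.uniform(0.0, 2.0, size=(n, n))    # site-dependent elastic penalty
state = np.zeros((n, n), dtype=bool)         # True = transformed

def delta_energy(i, j):
    # Transforming next to transformed neighbours is cheaper, a crude
    # stand-in for elastic accommodation / autocatalytic growth.
    nbrs = sum(state[(i + di) % n, (j + dj) % n]
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    return chem + site[i, j] - 0.3 * nbrs

while True:
    cand = [(delta_energy(i, j), i, j)
            for i in range(n) for j in range(n) if not state[i, j]]
    if not cand:
        break
    dE, i, j = min(cand)                     # most favourable cell
    if dE >= 0:                              # no transformation helps: stop
        break
    state[i, j] = True

fraction = state.mean()                      # transformed volume fraction
```

In the real model the energy update is the expensive part, since every transformed cell changes the elastic field felt by all others.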

  16. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  17. Computer simulation and scientific visualization

    SciTech Connect

    Weber, D.P.; Moszur, F.M.

    1990-01-01

The simulation of processes in engineering and the physical sciences has progressed rapidly over the last several years. With rapid developments in supercomputers, parallel processing, numerical algorithms and software, scientists and engineers are now positioned to quantitatively simulate systems requiring many billions of arithmetic operations. The need to understand and assimilate such massive amounts of data has been a driving force in the development of both hardware and software to create visual representations of the underlying physical systems. In this paper, and the accompanying videotape, the evolution and development of the visualization process in scientific computing will be reviewed. Specific applications and associated imaging hardware and software technology illustrate both the computational needs and the evolving trends. 6 refs.

  18. Biomes computed from simulated climatologies

    NASA Astrophysics Data System (ADS)

    Claussen, Martin; Esch, Monika

    1994-01-01

The biome model of Prentice et al. (1992a) is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut für Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes.
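The biome model's core step, mapping simulated climate variables to a plant formation, can be sketched as a rule-based classifier. The thresholds below are purely illustrative, not those of Prentice et al.:

```python
# Toy biome classifier from annual mean temperature (degC) and annual
# precipitation (mm). Hypothetical thresholds for illustration only; the
# published model uses bioclimatic variables such as growing-degree days.
def classify(temp_c, precip_mm):
    if temp_c < -5:
        return "tundra"
    if temp_c < 3:
        return "taiga"
    if precip_mm < 250:
        return "desert"
    if temp_c > 20 and precip_mm > 1500:
        return "tropical rain forest"
    return "temperate forest/grassland"

biome = classify(-10.0, 300.0)   # cold climate -> "tundra"
```

Applying such a classifier to both observed and simulated climatologies, and comparing the resulting biome maps, is exactly the diagnostic use the abstract describes.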

  19. Biomes computed from simulated climatologies

    SciTech Connect

    Claussen, M.; Esch, M.

    1994-01-01

The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO{sub 2} concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

  20. Computer simulation of nonequilibrium processes

    SciTech Connect

    Wallace, D.C.

    1985-07-01

The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then how these concepts are to be realized in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  1. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and NASA Advanced Supercomputing Division (NAS) a study is conducted to assess the role of supercomputers on computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  2. Computer Simulation for Emergency Incident Management

    SciTech Connect

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  3. America's most computer advanced healthcare facilities.

    PubMed

    1993-02-01

    Healthcare Informatics polled industry experts for nominations for this listing of America's Most Computer-Advanced Healthcare Facilities. Nominations were reviewed for extent of departmental automation, leading-edge applications, advanced point-of-care technologies, and networking communications capabilities. Additional consideration was given to smaller facilities automated beyond "normal expectations." Facility representatives who believe their organizations should be included in our next listing, please contact Healthcare Informatics for a nomination form.

  4. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, least dissipative computation algorithm as well as high quality numerical boundary treatments. This paper focuses on the recent developments of numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external or internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.
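A minimal example of an external boundary treatment of the kind surveyed here: a first-order radiation (Mur-type) condition for the 1-D wave equation. At Courant number 1 it lets a pulse exit the finite computational domain with essentially no reflection; production CAA boundary conditions are far more elaborate:

```python
import numpy as np

# Leapfrog scheme for u_tt = u_xx with a radiation condition at each end.
# At Courant number 1 the interior update reproduces d'Alembert's solution
# exactly, and u_new[0] = u[1], u_new[-1] = u[-2] transport outgoing waves
# through the boundary without reflection.
n = 401
x = np.linspace(0.0, 1.0, n)
u_old = np.exp(-((x - 0.5) / 0.05) ** 2)       # initial displacement pulse
u = u_old.copy()
u[1:-1] = 0.5 * (u_old[:-2] + u_old[2:])        # exact first step (zero velocity)

for _ in range(3 * n):                          # long enough for both pulses to exit
    u_new = np.empty_like(u)
    u_new[1:-1] = u[:-2] + u[2:] - u_old[1:-1]  # leapfrog, Courant number 1
    u_new[0] = u[1]                             # left radiation condition
    u_new[-1] = u[-2]                           # right radiation condition
    u_old, u = u, u_new

residual = np.max(np.abs(u))                    # reflected energy left behind
```

The quality measure for any such boundary treatment is exactly this residual: how much spurious reflection re-enters the computation domain.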

  5. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about 70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  6. A Digital Computer Simulation of Cardiovascular and Renal Physiology.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1979-01-01

    Presents the physiological MACPEE, one of a family of digital computer simulations used in Canada and Great Britain. A general description of the model is provided, along with a sample of computer output format, options for making interventions, advanced capabilities, an evaluation, and technical information for running a MAC model. (MA)

  7. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  8. Preface to advances in numerical simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Parker, Scott E.; Chacon, Luis

    2016-10-01

    This Journal of Computational Physics Special Issue, titled "Advances in Numerical Simulation of Plasmas," presents a snapshot of the international state of the art in the field of computational plasma physics. The articles herein are a subset of the topics presented as invited talks at the 24th International Conference on the Numerical Simulation of Plasmas (ICNSP), August 12-14, 2015 in Golden, Colorado. The choice of papers was highly selective. The ICNSP is held every other year and is the premier scientific meeting in the field of computational plasma physics.

  9. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  10. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, while a discussion of the minimum frequency simulated is provided. The results of spectral and statistical analyses of the SSTT are presented.
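A non-recursive synthesis of the type described can be sketched by shaping complex white noise in the frequency domain with the square root of a von Karman-type spectrum and inverse-transforming. Parameter values below are illustrative, not the SSTT values, and the overall variance normalization is omitted:

```python
import numpy as np

def von_karman_psd(f, sigma=1.0, L=200.0, V=50.0):
    # Longitudinal von Karman-type spectral form: f in Hz, L a turbulence
    # length scale (m), V the mean airspeed (m/s). Illustrative parameters.
    x = 1.339 * L * 2.0 * np.pi * f / V
    return sigma**2 * (2.0 * L / (np.pi * V)) / (1.0 + x**2) ** (5.0 / 6.0)

n, dt = 4096, 0.01
freqs = np.fft.rfftfreq(n, dt)
rng = np.random.default_rng(7)
noise = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
shape = np.sqrt(von_karman_psd(np.maximum(freqs, freqs[1])))  # avoid f = 0
gust = np.fft.irfft(noise * shape, n=n)
# (the normalization constant fixing the gust variance is omitted here)
```

Because the whole series comes from one FFT rather than a recursive filter, the method matches the "non-recursive" generation the abstract describes.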

  11. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  12. Computer simulation of polymer surfaces

    NASA Astrophysics Data System (ADS)

    Jang, Jee Hwan

    One of the main objectives of computer simulation is to isolate the effect of a specific variable in a physical or chemical system of interest whose experimental interpretation is ambiguous. The area of polymer surfaces and interfaces contains such ambiguity, owing to the absence of a major thermodynamic driving force and the difficulty of completely controlling the experimental design. Considering the length and time scales that define phenomena observed in polymeric systems, the appropriate choice among currently available computational chemistry methodologies, developed mostly for small molecules, is very demanding because of the limitation of computational resources. In this study, a computationally efficient Monte Carlo simulation on a high coordination lattice, employing the RIS scheme for short-range interactions and a Lennard-Jones potential for long-range interactions, has been applied to various boundary situations that define the material status and distinguish the properties of the material at an interface or surface from those in the bulk state. The polymer surfaces of interest in this study include a free polymer surface, a surface near an attractive solid substrate, a polymer surface generated by compression between two repulsive hard walls, and a polymer-polymer interface. Particular attention is given to changes in static and dynamic properties at the interfaces, including density profiles, the distribution of specific constituents of a polymer chain at the interfaces, chain orientation, local conformational state, and chain diffusivity. Each property at an interface is greatly affected by the character of the imposed heterogeneity. One common feature is that the chains are confined at an interface along the direction normal to the surface regardless of the detailed nature of the surface. In addition, the effect of a surface gradually diminishes toward the bulk region and each property has its own effective

  13. Computer simulation of microstructural dynamics

    SciTech Connect

    Grest, G.S.; Anderson, M.P.; Srolovitz, D.J.

    1985-01-01

    Since many of the physical properties of materials are determined by their microstructure, it is important to be able to predict and control microstructural development. A number of approaches have been taken to study this problem, but they assume that the grains can be described as spherical or hexagonal and that growth occurs in an average environment. We have developed a new technique to bridge the gap between the atomistic interactions and the macroscopic scale by discretizing the continuum system such that the microstructure retains its topological connectedness, yet is amenable to computer simulations. Using this technique, we have studied grain growth in polycrystalline aggregates. The temporal evolution and grain morphology of our model are in excellent agreement with experimental results for metals and ceramics.
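
    The discretized-microstructure technique described above is a Monte Carlo Potts model of grain growth. A minimal zero-temperature sketch of that idea follows; the lattice size, number of orientations, and acceptance rule are illustrative choices, not the paper's parameters.

```python
import random

def potts_grain_growth(L=32, q=48, steps=20000, seed=1):
    """Monte Carlo Potts-model sketch of grain growth on an L x L lattice.

    Each site holds one of q grain orientations; a trial move reassigns a
    site to a neighbor's orientation and is accepted if it does not raise
    the boundary energy (zero-temperature dynamics)."""
    rng = random.Random(seed)
    spin = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]

    def energy(i, j, s):
        # Boundary energy: one unit per unlike nearest neighbor (periodic).
        return sum(1 for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if spin[(i + di) % L][(j + dj) % L] != s)

    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        di, dj = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        new = spin[(i + di) % L][(j + dj) % L]
        if energy(i, j, new) <= energy(i, j, spin[i][j]):
            spin[i][j] = new
    return spin
```

    Counting the distinct orientations left on the lattice shows the coarsening: unfavorable single-site grains are absorbed by their neighbors, so the number of grains decreases over time.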

  14. Priority Queues for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management. It uses a temporary unsorted list to store all items until one of the items is needed; the list is then sorted, the highest-priority item is removed, and the rest of the list is inserted into the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
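
    The insert/flush cycle the abstract describes can be sketched as follows. This is a hedged illustration of the idea only, using a binary heap in place of the patent's linked-list Qheap structure:

```python
import heapq

class QheapSketch:
    """Sketch of the deferred-sort idea from the abstract: new events
    accumulate in an unsorted holding list, and only when the next event
    is requested is the list sorted and merged into the main priority
    structure (a binary heap stands in for the linked-list Qheap)."""

    def __init__(self):
        self._unsorted = []  # temporary holding list for new events
        self._sorted = []    # main priority structure (heap as stand-in)

    def insert(self, timestamp, event):
        # Insertion is O(1): just append to the unsorted holding list.
        self._unsorted.append((timestamp, event))

    def pop_next(self):
        # Flush the holding list into the main structure on demand,
        # then remove and return the highest-priority (earliest) item.
        if self._unsorted:
            self._unsorted.sort()
            for item in self._unsorted:
                heapq.heappush(self._sorted, item)
            self._unsorted.clear()
        return heapq.heappop(self._sorted)
```

    Deferring the sort pays off in event-driven simulation because many events are typically inserted between successive removals.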

  15. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.

  16. Computer simulations of liquid crystals

    NASA Astrophysics Data System (ADS)

    Smondyrev, Alexander M.

    Liquid crystal physics is an exciting interdisciplinary field of research with important practical applications. The complexity of these systems and the presence of strong translational and orientational fluctuations require a computational approach, especially in studies of nonequilibrium phenomena. In this dissertation we present the results of computer simulation studies of liquid crystals using the molecular dynamics technique. We employed the Gay-Berne phenomenological model of liquid crystals to describe the interaction between the molecules. Both equilibrium and non-equilibrium phenomena were studied. In the first case we studied the flow properties of the liquid crystal system in equilibrium as well as the dynamics of the director. We measured the viscosities of the Gay-Berne model in the nematic and isotropic phases. The temperature dependence of the rotational and shear viscosities, including the nonmonotonic behavior of one shear viscosity, is in good agreement with experimental data. The bulk viscosities are significantly larger than the shear viscosities, again in agreement with experiment. The director motion was found to be ballistic at short times and diffusive at longer times. The second class of problems we focused on is the properties of a system rapidly quenched to very low temperatures from the nematic phase. We find a glass transition to a metastable phase with nematic order and frozen translational and orientational degrees of freedom. For fast quench rates the local structure is nematic-like, while for slower quench rates smectic order is present as well. Finally, we considered a system in the isotropic phase which is then cooled to temperatures below the isotropic-nematic transition temperature. We expect topological defects to play a central role in the subsequent equilibration of the system. To identify and study these defects we require a simulation of a system with several thousand particles. We present the results of large

  17. Computer Simulation of Fracture in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2006-01-01

    Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low- and high-density gel fracture.
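
    For reference, a Morse pair potential of the kind assumed for the secondary-particle interaction has the form V(r) = D*((1 - exp(-a*(r - r0)))**2 - 1). The sketch below uses illustrative parameter values, not the ones fitted from the atomistic bridge simulations:

```python
import math

def morse(r, D=1.0, a=6.0, r0=1.0):
    """Morse pair potential with well depth D, stiffness a, and
    equilibrium separation r0 (all illustrative placeholders).

    A large a makes the potential short-ranged relative to r0, the
    regime the aerogel study requires for secondary particles."""
    return D * ((1.0 - math.exp(-a * (r - r0))) ** 2 - 1.0)
```

    The potential is repulsive inside r0, has its minimum of -D exactly at r0, and decays rapidly to zero beyond the well, which is what makes it suitable for particles that interact only through narrow contact bridges.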

  18. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
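
    The DLCA growth process itself (rigidly diffusing clusters that stick irreversibly on contact) can be sketched in a minimal 2-D toy version; the lattice size, particle count, and dimensionality below are illustrative and do not reproduce the paper's tuning of fractal dimension or coordination.

```python
import random

def dlca(L=24, n_particles=60, steps=2000, seed=4):
    """Toy diffusion-limited cluster aggregation on an L x L periodic
    lattice: clusters random-walk rigidly and merge on contact."""
    rng = random.Random(seed)
    sites = rng.sample([(i, j) for i in range(L) for j in range(L)], n_particles)
    clusters = [[p] for p in sites]  # start as single-particle clusters
    occupied = {p: c for c, cl in enumerate(clusters) for p in cl}

    def neighbors(p):
        i, j = p
        return [((i + 1) % L, j), ((i - 1) % L, j),
                (i, (j + 1) % L), (i, (j - 1) % L)]

    for _ in range(steps):
        c = rng.randrange(len(clusters))
        if not clusters[c]:
            continue  # cluster already absorbed into another
        di, dj = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        moved = [((i + di) % L, (j + dj) % L) for i, j in clusters[c]]
        # Reject the move if it would overlap a different cluster.
        if any(p in occupied and occupied[p] != c for p in moved):
            continue
        for p in clusters[c]:
            del occupied[p]
        clusters[c] = moved
        for p in moved:
            occupied[p] = c
        # Irreversibly merge with any cluster now touching this one.
        touching = {occupied[q] for p in moved for q in neighbors(p)
                    if q in occupied and occupied[q] != c}
        for t in touching:
            for p in clusters[t]:
                occupied[p] = c
            clusters[c].extend(clusters[t])
            clusters[t] = []
    return [cl for cl in clusters if cl]
```

    Varying the sticking rule or particle density in such a model is what changes the fractal dimension and coordination of the resulting aggregates.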

  19. Simulation Of Advanced Train Control Systems

    NASA Astrophysics Data System (ADS)

    Craven, Paul; Oman, Paul

    This paper describes an Advanced Train Control System (ATCS) simulation environment created using the Network Simulator 2 (ns-2) discrete event network simulation system. The ATCS model is verified using ATCS monitoring software, laboratory results and a comparison with a mathematical model of ATCS communications. The simulation results are useful in understanding ATCS communication characteristics and identifying protocol strengths, weaknesses, vulnerabilities and mitigation techniques. By setting up a suite of ns-2 scripts, an engineer can simulate hundreds of possible scenarios in the space of a few seconds to investigate failure modes and consequences.

  20. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degree-of-freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  1. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.

  2. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed that will enable spacecraft to be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  3. An advanced coarse-grained nucleosome core particle model for computer simulations of nucleosome-nucleosome interactions under varying ionic conditions.

    PubMed

    Fan, Yanping; Korolev, Nikolay; Lyubartsev, Alexander P; Nordenskiöld, Lars

    2013-01-01

    In the eukaryotic cell nucleus, DNA exists as chromatin, a compact but dynamic complex with histone proteins. The first level of DNA organization is the linear array of nucleosome core particles (NCPs). The NCP is a well-defined complex of 147 bp of DNA with an octamer of histones. Interactions between NCPs are of paramount importance for higher levels of chromatin compaction. The polyelectrolyte nature of the NCP implies that nucleosome-nucleosome interactions are strongly influenced both by the ionic environment and by the positively charged and highly flexible N-terminal histone tails protruding from the NCP. The large size of the system precludes a modelling analysis of chromatin at an all-atom level and calls for coarse-grained approximations. Here, a model of the NCP is proposed that includes the globular histone core and the flexible histone tails, described by one particle per amino acid and taking into account their net charge. DNA wrapped around the histone core was approximated at the level of two base pairs represented by one bead (bases and sugar) plus four beads of charged phosphate groups. Computer simulations, using a Langevin thermostat, in a dielectric continuum with explicit monovalent (K+), divalent (Mg2+) or trivalent (Co(NH3)6(3+)) cations were performed for systems with one or ten NCPs. Increasing the counterion charge results in a switch from repulsive NCP-NCP interaction in the presence of K+, to partial aggregation with Mg2+, and to strong mutual attraction of all 10 NCPs in the presence of CoHex(3+). The new model reproduced experimental results, and the structure of the NCP-NCP contacts is in agreement with available data. Cation screening, ion-ion correlations and tail bridging contribute to the NCP-NCP attraction, and the new NCP model accounts for these interactions. PMID:23418426
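
    The kind of Langevin-thermostatted dynamics used for such coarse-grained bead models can be illustrated in the overdamped (Brownian dynamics) limit. In this hedged sketch a single bead in a harmonic well stands in for a model bead; all parameter values are illustrative, not the paper's.

```python
import math
import random

def langevin_harmonic(kT=1.0, k=1.0, gamma=1.0, dt=0.01, steps=50000, seed=3):
    """Overdamped Langevin (Brownian dynamics) integrator for one bead
    in a harmonic well: deterministic drift from the force plus a
    random kick whose magnitude is set by the temperature.

    Returns the sampled position variance, which should approach the
    equipartition value kT / k if the thermostat is working."""
    rng = random.Random(seed)
    noise = math.sqrt(2.0 * kT * dt / gamma)
    x, samples = 0.0, []
    for step in range(steps):
        x += (-k * x) * dt / gamma + noise * rng.gauss(0.0, 1.0)
        if step >= steps // 5:  # discard the equilibration transient
            samples.append(x)
    return sum(s * s for s in samples) / len(samples)
```

    Recovering the correct equilibrium fluctuations is the basic check applied to any such thermostat before it is trusted with charged multi-bead systems.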

  5. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model by the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  6. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.
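
    The abstract does not give PSimNet's wire format, but any such simulation-messaging protocol needs message framing so a receiver can split a byte stream back into the messages it contains. A common length-prefix scheme, assumed here purely for illustration, looks like this:

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix a payload with its length as a 4-byte big-endian integer.
    (An illustrative framing scheme, not PSimNet's actual format.)"""
    return struct.pack(">I", len(payload)) + payload

def unframe(stream: bytes):
    """Split a concatenation of framed messages back into payloads."""
    msgs, off = [], 0
    while off < len(stream):
        (length,) = struct.unpack_from(">I", stream, off)
        off += 4
        msgs.append(stream[off:off + length])
        off += length
    return msgs
```

    Framing of this sort is what lets a central host multiplex traffic from many training computers over a single stream connection.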

  7. Computational simulation methods for composite fracture mechanics

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed quantitatively and qualitatively by studying damage initiation at various scales (micro, macro, and global) and its accumulation and growth leading to global failure. In addition, various fracture toughness parameters associated with a typical damage state and its growth must be determined. Computational structural analysis codes were developed to aid the composite design engineer in performing these tasks. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage state. The general-purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.

  8. Monte Carlo Computer Simulation of a Rainbow.

    ERIC Educational Resources Information Center

    Olson, Donald; And Others

    1990-01-01

    Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)
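
    The physics the program exercises can be sketched in a few lines: rays strike a spherical drop at random impact parameters, refract on entry, reflect once internally, and refract on exit, and the resulting scattering angles pile up near the primary rainbow angle of about 42 degrees for water. A minimal Monte Carlo version (in Python rather than the article's BASIC):

```python
import math
import random

def rainbow_angles(n_index=1.333, rays=20000, seed=2):
    """Monte Carlo sketch of the primary rainbow for a water drop.

    Each ray's scattering angle (measured from the antisolar point)
    follows from Snell's law and one internal reflection; the maximum
    of the distribution is the rainbow angle."""
    rng = random.Random(seed)
    angles = []
    for _ in range(rays):
        b = rng.random()                       # impact parameter / drop radius
        ti = math.asin(b)                      # angle of incidence
        tr = math.asin(b / n_index)            # refraction angle (Snell's law)
        deviation = math.pi + 2 * ti - 4 * tr  # total deviation, one reflection
        angles.append(math.degrees(math.pi - deviation))
    return angles
```

    Because the deviation has a minimum as a function of impact parameter, rays bunch up near the rainbow angle, which is why the bow appears bright there.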

  9. Advanced Vadose Zone Simulations Using TOUGH

    SciTech Connect

    Finsterle, S.; Doughty, C.; Kowalsky, M.B.; Moridis, G.J.; Pan, L.; Xu, T.; Zhang, Y.; Pruess, K.

    2007-02-01

    The vadose zone can be characterized as a complex subsurface system in which intricate physical and biogeochemical processes occur in response to a variety of natural forcings and human activities. This makes it difficult to describe, understand, and predict the behavior of this specific subsurface system. The TOUGH nonisothermal multiphase flow simulators are well-suited to perform advanced vadose zone studies. The conceptual models underlying the TOUGH simulators are capable of representing features specific to the vadose zone, and of addressing a variety of coupled phenomena. Moreover, the simulators are integrated into software tools that enable advanced data analysis, optimization, and system-level modeling. We discuss fundamental and computational challenges in simulating vadose zone processes, review recent advances in modeling such systems, and demonstrate some capabilities of the TOUGH suite of codes using illustrative examples.

  10. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  11. Advances in atomic oxygen simulation

    NASA Technical Reports Server (NTRS)

    Froechtenigt, Joseph F.; Bareiss, Lyle E.

    1990-01-01

    Atomic oxygen (AO) present in the atmosphere at orbital altitudes of 200 to 700 km has been shown to degrade various exposed materials on Shuttle flights. The relative velocity of the AO with the spacecraft, together with the AO density, combine to yield an environment consisting of a 5 eV beam energy with a flux of 10(exp 14) to 10(exp 15) oxygen atoms/sq cm/s. An AO ion beam apparatus that produces flux levels and energy similar to that encountered by spacecraft in low Earth orbit (LEO) has been in existence since 1987. Test data was obtained from the interaction of the AO ion beam with materials used in space applications (carbon, silver, kapton) and with several special coatings of interest deposited on various surfaces. The ultimate design goal of the AO beam simulation device is to produce neutral AO at sufficient flux levels to replicate on-orbit conditions. A newly acquired mass spectrometer with energy discrimination has allowed 5 eV neutral oxygen atoms to be separated and detected from the background of thermal oxygen atoms of approx 0.2 eV. Neutralization of the AO ion beam at 5 eV was shown at the Martin Marietta AO facility.

  12. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  13. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  14. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  15. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  16. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  17. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  18. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  19. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  20. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  1. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  2. Computer Simulation in Chemical Kinetics

    ERIC Educational Resources Information Center

    Anderson, Jay Martin

    1976-01-01

    Discusses the use of the System Dynamics technique in simulating a chemical reaction for kinetic analysis. Also discusses the use of simulation modelling in biology, ecology, and the social sciences, where experimentation may be impractical or impossible. (MLH)
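
    System Dynamics represents a reaction as stocks and flows integrated step by step. A minimal stock-and-flow sketch of a first-order reaction A -> B (rate constant and step size are illustrative) is:

```python
def simulate_first_order(k=0.5, a0=1.0, dt=0.01, t_end=10.0):
    """Stock-and-flow integration of A -> B: the stock A is drained by
    the flow k*A*dt each step and B gains the same amount, the simple
    Euler updating such System Dynamics simulations typically use.

    Returns a list of (time, A, B) samples."""
    a, b, t = a0, 0.0, 0.0
    history = [(t, a, b)]
    while t < t_end - 1e-12:
        flow = k * a * dt      # amount converted this step
        a -= flow
        b += flow
        t += dt
        history.append((t, a, b))
    return history
```

    For small dt the numerical trajectory tracks the analytic decay a0*exp(-k*t), and mass is conserved exactly because every unit leaving A enters B.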

  3. Advances in computed tomography imaging technology.

    PubMed

    Ginat, Daniel Thomas; Gupta, Rajiv

    2014-07-11

    Computed tomography (CT) is an essential tool in diagnostic imaging for evaluating many clinical conditions. In recent years, there have been several notable advances in CT technology that already have had or are expected to have a significant clinical impact, including extreme multidetector CT, iterative reconstruction algorithms, dual-energy CT, cone-beam CT, portable CT, and phase-contrast CT. These techniques and their clinical applications are reviewed and illustrated in this article. In addition, emerging technologies that address deficiencies in these modalities are discussed.

  4. Hybrid and Electric Advanced Vehicle Systems Simulation

    NASA Technical Reports Server (NTRS)

    Beach, R. F.; Hammond, R. A.; Mcgehee, R. K.

    1985-01-01

    Predefined components connected to represent wide variety of propulsion systems. Hybrid and Electric Advanced Vehicle System (HEAVY) computer program is flexible tool for evaluating performance and cost of electric and hybrid vehicle propulsion systems. Allows designer to quickly, conveniently, and economically predict performance of proposed drive train.

  5. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    1986-01-01

    Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…

  6. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  7. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  8. Computer Simulation in Undergraduate Instruction: A Symposium.

    ERIC Educational Resources Information Center

    Street, Warren R.; And Others

    These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

  9. Computer simulation of inhibitor application -- A review

    SciTech Connect

    Banerjee, G.; Vasanth, K.L.

    1997-12-01

    The rapid development of powerful software as well as hardware in computer technology has changed the traditional approach to all areas of science and technology. In the field of corrosion inhibitors, computers are used to model, simulate, analyze and monitor inhibitor applications in both laboratory and industrial environments. This paper will present an up-to-date critical review of such simulation studies.

  10. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2016-07-12

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  11. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  12. Computer simulation of nonequilibrium processes

    SciTech Connect

    Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.

    1987-01-01

    Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nosé's reversible atomistic mechanics makes it possible to simulate such nonequilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural, inevitable consequence of time-reversible, Lyapunov-unstable microscopic equations of motion.
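    The reversible-thermostat idea can be illustrated with a minimal sketch: a Nosé-Hoover oscillator (a standard textbook form of Nosé's mechanics, not the authors' exact system). Its deterministic equations retrace themselves when the velocity and friction variable are reversed, which is the time-reversibility the abstract refers to. All parameter values here are illustrative.

    ```python
    def nose_hoover_step(x, v, zeta, dt, temp=1.0, q=1.0):
        """One RK4 step of the Nose-Hoover thermostatted oscillator.

        The friction variable zeta pumps energy in or out so the long-time
        velocity statistics match temperature `temp`; the equations are
        deterministic and time-reversible under (v, zeta) -> (-v, -zeta).
        """
        def deriv(s):
            x_, v_, z_ = s
            return (v_, -x_ - z_ * v_, (v_ * v_ - temp) / q)
        s = (x, v, zeta)
        k1 = deriv(s)
        k2 = deriv(tuple(s[i] + 0.5 * dt * k1[i] for i in range(3)))
        k3 = deriv(tuple(s[i] + 0.5 * dt * k2[i] for i in range(3)))
        k4 = deriv(tuple(s[i] + dt * k3[i] for i in range(3)))
        return tuple(s[i] + dt / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                     for i in range(3))

    # Run forward, reverse (v, zeta), run again: the trajectory retraces
    # itself back to the initial state, illustrating reversible motion.
    state = (1.0, 0.0, 0.0)
    for _ in range(1000):
        state = nose_hoover_step(*state, dt=0.001)
    state = (state[0], -state[1], -state[2])
    for _ in range(1000):
        state = nose_hoover_step(*state, dt=0.001)
    ```

    After the round trip the state returns (to within integration error) to the starting point, even though a long forward run would look thoroughly irreversible at the macroscopic level.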

  13. CASL: The Consortium for Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Kothe, Douglas B.

    2010-11-01

    Like the fusion community, the nuclear engineering community is embarking on a new computational effort to create integrated, multiphysics simulations. The Consortium for Advanced Simulation of Light Water Reactors (CASL), one of three newly funded DOE Energy Innovation Hubs, brings together an exceptionally capable team that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated the Virtual Reactor (VR), will: 1) Enable the use of leadership-class computing for engineering design and analysis to improve reactor capabilities, 2) Promote an enhanced scientific basis and understanding by replacing empirically based design and analysis tools with predictive capabilities, 3) Develop a highly integrated multiphysics environment for engineering analysis through increased-fidelity methods, and 4) Incorporate uncertainty quantification (UQ) as a basis for developing priorities and supporting application of the VR tools for predictive simulation. In this presentation, we present the plans for CASL and comment on the similarities and differences with the proposed Fusion Simulation Project (FSP).

  14. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems nondestructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating the numerous large files, from programs coded in FORTRAN, that are required for the real-time processing. Four programming languages are involved: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 is also performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  15. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia, and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
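    The approach described, superimposing the Langevin equation of motion on a precomputed velocity field, can be sketched as follows. The analytic uniform flow, the single circular fiber, and all parameter values are illustrative stand-ins for the commercial Navier-Stokes solution and fiber arrays used in the paper.

    ```python
    import math
    import random

    def langevin_capture(x0, y0, flow, fibers, dt=1e-3, steps=5000,
                         gamma=50.0, kT_m=0.01, rng=None):
        """Track one particle through a flow field past circular fibers.

        flow(x, y) -> (ux, uy): the precomputed air velocity field.
        fibers: list of (cx, cy, radius) circles.
        The drag term relaxes the particle toward the local flow velocity
        (inertia), the random kick models Brownian motion, and capture
        occurs on interception with a fiber surface, so all three
        mechanisms live in one equation of motion.
        Returns True if the particle is captured within `steps` steps.
        """
        rng = rng or random.Random(0)
        x, y, vx, vy = x0, y0, 0.0, 0.0
        sigma = math.sqrt(2.0 * gamma * kT_m * dt)  # Brownian kick size
        for _ in range(steps):
            ux, uy = flow(x, y)
            vx += -gamma * (vx - ux) * dt + sigma * rng.gauss(0.0, 1.0)
            vy += -gamma * (vy - uy) * dt + sigma * rng.gauss(0.0, 1.0)
            x += vx * dt
            y += vy * dt
            for cx, cy, r in fibers:
                if math.hypot(x - cx, y - cy) <= r:
                    return True
        return False

    # Uniform flow toward a single fiber directly downstream: drag carries
    # the particle onto the fiber, so it should be captured.
    captured = langevin_capture(0.0, 0.0, lambda x, y: (1.0, 0.0),
                                fibers=[(1.0, 0.0, 0.2)])
    ```

    A capture-efficiency estimate would repeat this over many seed particles spread across the inlet and count the captured fraction.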

  16. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-01-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia, and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements.

  17. Dynamic Simulations of Advanced Fuel Cycles

    SciTech Connect

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2011-03-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the U.S. Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  18. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures. For stringer-stiffened panels, main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  19. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents a trial evaluation, carried out from 2009 to 2011, of a visual computer simulator developed to serve simultaneously as an instructional facility and a learning tool. It also illustrates an example of computer architecture education for university students and the use of an e-learning tool for assembly programming in order to…

  20. Hybrid and electric advanced vehicle systems (heavy) simulation

    NASA Technical Reports Server (NTRS)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three models is discussed as an example.

  1. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  2. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security, and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus then shifts to the most challenging combinations in the computational spectrum, namely large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state of the art in these challenging cases are outlined. A case study of large-scale agent simulation is provided, simulating millions of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to the speed, scale, and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  3. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is fundamentally improved by a large number of smart meters and sensors that produce orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies for developing the algorithms and tools needed to handle the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.

  4. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.
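    Level (4), treating the honeycomb as an equivalent homogeneous medium, can be illustrated with a standard relative-density scaling for hexagonal cells (a Gibson-Ashby-style approximation assumed here for illustration, not the simplified equations derived in the paper):

    ```python
    import math

    def equivalent_honeycomb(E_wall, rho_wall, t, cell_size):
        """Approximate out-of-plane stiffness and density of a hexagonal
        honeycomb treated as a homogeneous medium.

        Uses the standard thin-wall relative density rho*/rho_s ~
        (2/sqrt(3)) * (t/l) for regular hexagonal cells, with both the
        effective density and the out-of-plane modulus scaling linearly
        in the relative density.

        t: wall thickness, cell_size: cell edge length (same units).
        """
        rel_density = (2.0 / math.sqrt(3.0)) * (t / cell_size)
        return {"E_normal": E_wall * rel_density,
                "rho": rho_wall * rel_density}

    # Aluminum-like wall properties, illustrative geometry (t = 0.1 mm
    # walls, 10 mm cells).
    props = equivalent_honeycomb(E_wall=70e9, rho_wall=2700.0,
                                 t=0.1, cell_size=10.0)
    ```

    Such equivalent properties can then be fed into laminate theory as if the core were one more ply, which is exactly the simplification the fourth computational level exploits.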

  5. METC Gasifier Advanced Simulation (MGAS) model

    SciTech Connect

    Syamlal, M.; Bissett, L.A.

    1992-01-01

    Morgantown Energy Technology Center is developing an advanced moving-bed gasifier, which is the centerpiece of the Integrated Gasifier Combined-Cycle (IGCC) system, with the features of good efficiency, low cost, and minimal environmental impact. A mathematical model of the gasifier, the METC-Gasifier Advanced Simulation (MGAS) model, has been developed for the analysis and design of advanced gasifiers and other moving-bed gasifiers. This report contains the technical and the user manuals of the MGAS model. The MGAS model can describe the transient operation of coflow, counterflow, or fixed-bed gasifiers. It is a one-dimensional model and can simulate the addition and withdrawal of gas and solids at multiple locations in the bed, a feature essential for simulating beds with recycle. The model describes the reactor in terms of a gas phase and a solids (coal or char) phase. These phases may exist at different temperatures. The model considers several combustion, gasification, and initial stage reactions. The model consists of a set of mass balances for 14 gas species and three coal (pseudo-) species and energy balances for the gas and the solids phases. The resulting partial differential equations are solved using a finite difference technique.
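    A much reduced sketch of the finite-difference treatment, a single gas-species balance with one first-order reaction instead of MGAS's coupled 14-species system, illustrates the idea with an explicit upwind scheme (one common variant, not necessarily the scheme MGAS uses; all names and values are hypothetical):

    ```python
    def gas_species_balance(c0, inlet, velocity, rate_k, dx, dt, steps):
        """Explicit upwind finite-difference integration of one 1-D
        gas-species mass balance, dc/dt = -v dc/dx - k c: advection
        through the bed plus first-order consumption by reaction.
        Stable for velocity * dt / dx <= 1 (CFL condition).
        """
        c = list(c0)
        for _ in range(steps):
            new = c[:]
            for i in range(1, len(c)):
                adv = -velocity * (c[i] - c[i - 1]) / dx
                new[i] = c[i] + dt * (adv - rate_k * c[i])
            new[0] = inlet  # feed boundary condition
            c = new
        return c

    # March to steady state: the concentration decays along the bed as
    # the species is consumed.
    profile = gas_species_balance([0.0] * 50, inlet=1.0, velocity=1.0,
                                  rate_k=0.5, dx=0.1, dt=0.05, steps=400)
    ```

    MGAS couples 14 such gas balances to three coal pseudo-species and two energy balances, but each balance has this same advection-plus-source structure.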

  6. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  7. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  8. 78 FR 64931 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Computing Advisory Committee (ASCAC). This meeting replaces the cancelled ASCAC meeting that was to be held... Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of Energy;...

  9. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  10. 78 FR 50404 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ] ACTION... Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  11. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  12. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady-state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions, and cost. The availability throughout government and industry of analytical methods for calculating these quantities is pointed out.

  13. Astronomy Simulation with Computer Graphics.

    ERIC Educational Resources Information Center

    Thomas, William E.

    1982-01-01

    "Planetary Motion Simulations" is a system of programs designed for students to observe motions of a superior planet (one whose orbit lies outside the orbit of the earth). Programs run on the Apple II microcomputer and employ high-resolution graphics to present the motions of Saturn. (Author/JN)
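    The superior-planet motion such a program displays can be reproduced in a few lines: two circular, coplanar orbits give the planet's apparent geocentric longitude, whose occasional decrease is the retrograde loop. The orbital radii and periods below are round textbook values for Saturn; the Apple II program's internals are not described in the record, so this is an independent sketch of the same phenomenon.

    ```python
    import math

    def geocentric_longitude(t, r_planet=9.54, period_planet=29.46,
                             r_earth=1.0, period_earth=1.0):
        """Apparent ecliptic longitude (radians) of a superior planet as
        seen from Earth, with both bodies on circular coplanar orbits.
        t is in years; radii are in AU, periods in years."""
        te = 2.0 * math.pi * t / period_earth
        tp = 2.0 * math.pi * t / period_planet
        ex, ey = r_earth * math.cos(te), r_earth * math.sin(te)
        px, py = r_planet * math.cos(tp), r_planet * math.sin(tp)
        return math.atan2(py - ey, px - ex)

    # Sample one Earth year starting at opposition: around opposition the
    # longitude decreases for a stretch of days -- retrograde motion.
    lons = [geocentric_longitude(day / 365.0) for day in range(366)]
    diffs = [(b - a + math.pi) % (2.0 * math.pi) - math.pi
             for a, b in zip(lons, lons[1:])]
    retrograde_days = sum(1 for d in diffs if d < 0.0)
    ```

    Plotting `lons` against time reproduces the looping track the original high-resolution graphics presented.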

  14. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  15. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  16. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  17. Computer-based first-principles kinetic Monte Carlo simulation of polyethylene glycol degradation in aqueous phase UV/H2O2 advanced oxidation process.

    PubMed

    Guo, Xin; Minakata, Daisuke; Crittenden, John

    2014-09-16

    We have developed a computer-based first-principles kinetic Monte Carlo (CF-KMC) model to predict degradation mechanisms and the fates of intermediates and byproducts produced from the degradation of polyethylene glycol (PEG) in the ultraviolet/hydrogen peroxide (UV/H2O2) advanced oxidation process. The CF-KMC model is composed of a reaction pathway generator, a reaction rate constant estimator, and a KMC solver. The KMC solver is able to solve the predicted pathways successfully without solving ordinary differential equations. The predicted time-dependent profiles of averaged molecular weight and polydispersity index (i.e., the ratio of the weight-averaged molecular weight to the number-averaged molecular weight) for PEG degradation were validated against, and are consistent with, experimental observations. The model provided detailed and quantitative insights into the time evolution of the molecular weight distribution and the concentration profiles of low-molecular-weight products and functional groups. Our approach may be useful for predicting the fates of degradation products for a wide range of complicated organic contaminants.
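    The KMC solver's core idea, advancing the system one stochastic reaction event at a time with exponentially distributed waiting times instead of integrating ordinary differential equations, can be sketched with a Gillespie-style step. The chain-scission reaction and all rate values below are hypothetical, not taken from the paper.

    ```python
    import math
    import random

    def kmc_step(populations, reactions, rng):
        """One Gillespie-style kinetic Monte Carlo step.

        populations: dict species -> integer count.
        reactions: list of (rate_constant, reactants, products) with unit
        stoichiometry; a reaction's propensity is its rate constant times
        the product of its reactant counts.
        Mutates populations and returns the elapsed time increment, or
        None if no reaction can fire.
        """
        propensities = []
        for k, reactants, _products in reactions:
            a = k
            for s in reactants:
                a *= populations[s]
            propensities.append(a)
        total = sum(propensities)
        if total == 0.0:
            return None
        # Exponentially distributed waiting time until the next event.
        dt = -math.log(1.0 - rng.random()) / total
        # Pick which reaction fires, proportional to its propensity.
        r = rng.random() * total
        acc = 0.0
        for (k, reactants, products), a in zip(reactions, propensities):
            acc += a
            if r <= acc:
                for s in reactants:
                    populations[s] -= 1
                for s in products:
                    populations[s] += 1
                break
        return dt

    # Hypothetical first-order scission: long chains P fragment into F.
    rng = random.Random(42)
    pops = {"P": 1000, "F": 0}
    reactions = [(0.1, ["P"], ["F"])]
    t = 0.0
    while pops["P"] > 500:
        t += kmc_step(pops, reactions, rng)
    ```

    Tracking the population of each chain length over such runs is what yields the time-dependent molecular weight distribution without ever writing down an ODE system.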

  18. Computer Simulation Of A Small Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1991-01-01

    Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

  19. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  20. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  1. Reservoir Thermal Recovery Simulation on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Li, Baoyan; Ma, Yuanle

    The rapid development of parallel computers has provided a hardware background for massive, refined reservoir simulation. However, the lack of parallel reservoir simulation software has blocked the application of parallel computers to reservoir simulation. Although a variety of parallel methods have been studied and applied to black oil, compositional, and chemical model numerical simulations, there has been limited parallel software available for reservoir simulation. In particular, the parallelization of reservoir thermal recovery simulation has not been fully carried out, because of the complexity of its models and algorithms. The authors make use of the message passing interface (MPI) standard communication library, the domain decomposition method, the block Jacobi iteration algorithm, and the dynamic memory allocation technique to parallelize their serial thermal recovery simulation software NUMSIP, which is being used in the petroleum industry in China. The parallel software PNUMSIP was tested on both IBM SP2 and Dawn 1000A distributed-memory parallel computers. The experimental results show that the parallelization of I/O has great effects on the efficiency of the parallel software PNUMSIP; the data communication bandwidth is also an important factor influencing software efficiency. Keywords: domain decomposition method, block Jacobi iteration algorithm, reservoir thermal recovery simulation, distributed-memory parallel computer
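    The block Jacobi iteration with domain decomposition can be sketched on a toy 1-D problem. This serial illustration freezes neighbor-block values for the duration of each sweep, exactly where the parallel version would exchange halo data over MPI; NUMSIP/PNUMSIP's actual thermal-recovery discretization is of course far more involved, so everything here is a hypothetical stand-in.

    ```python
    def block_jacobi_1d(n_cells, n_blocks, sweeps, left=1.0, right=0.0):
        """Solve u'' = 0 on a 1-D grid by block Jacobi iteration.

        The grid is split into contiguous blocks (the domain
        decomposition); each sweep, every block relaxes its interior
        using neighbor-block values frozen from the previous sweep.
        On a distributed-memory machine, each block would live on its
        own processor and exchange only the halo values per sweep.
        """
        u = [0.0] * n_cells
        u[0], u[-1] = left, right          # fixed boundary values
        size = n_cells // n_blocks
        for _ in range(sweeps):
            u_old = u[:]                   # values frozen for this sweep
            for b in range(n_blocks):
                lo = b * size
                hi = n_cells if b == n_blocks - 1 else lo + size
                # Relax inside the block; across block edges use u_old
                # (Jacobi between blocks).
                for i in range(max(lo, 1), min(hi, n_cells - 1)):
                    lnb = u[i - 1] if i - 1 >= lo else u_old[i - 1]
                    rnb = u[i + 1] if i + 1 < hi else u_old[i + 1]
                    u[i] = 0.5 * (lnb + rnb)
        return u

    u = block_jacobi_1d(n_cells=9, n_blocks=3, sweeps=2000)
    # The converged solution of u'' = 0 is linear between the boundaries.
    ```

    The trade-off the abstract measures experimentally is visible even here: more blocks mean more parallelism but also more frozen interface values per sweep, hence more communication and slower convergence.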

  2. Computer simulation of gear tooth manufacturing processes

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  3. Advanced studies on Simulation Methodologies for very Complicated Fracture Phenomena

    NASA Astrophysics Data System (ADS)

    Nishioka, Toshihisa

    2010-06-01

    Although nowadays, computational techniques are well developed, for Extremely Complicated Fracture Phenomena, they are still very difficult to simulate, for general engineers, researchers. To overcome many difficulties in those simulations, we have developed not only Simulation Methodologies but also theoretical basis and concepts. We sometimes observe extremely complicated fracture patterns, especially in dynamic fracture phenomena such as dynamic crack branching, kinking, curving, etc. For examples, although the humankind, from primitive men to modern scientists such as Albert Einstein had watched the post-mortem patterns of dynamic crack branching, the governing condition for the onset of the phenomena had been unsolved until our experimental study. From in these studies, we found the governing condition of dynamic crack bifurcation, as follows. When the total energy flux per unit time into a propagating crack tip reaches the material crack resistance, the crack braches into two cracks [total energy flux criterion]. The crack branches many times whenever the criterion is satisfied. Furthermore, the complexities also arise due to their time-dependence and/or their-deformation dependence. In order to make it possible to simulate such extremely complicated fracture phenomena, we developed many original advanced computational methods and technologies. These are (i)moving finite element method based on Delaunay automatic triangulation (MFEMBOAT), path independent,(ii) equivalent domain integral expression of the dynamic J integral associated with a continuous auxiliary function,(iii) Mixed phase path-prediction mode simulation, (iv) implicit path prediction criterion. In this paper, these advanced computational methods are thoroughly explained together with successful comparison with the experimental results. 
Since multiple dynamic crack branching may be the most complicated fracture phenomenon, owing to its complicated fracture paths and its time dependence (transient behavior), this

  4. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

Geometry and tracking (GEANT4) is a Monte Carlo package designed for high-energy physics experiments. It is used as the base layer for Monte Carlo simulations of nuclear medicine acquisition systems in the GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments, using accurate physics models and time synchronization for detector movement, through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform-independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros, accompanied by an on-the-fly generated, cluster-specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities: positron emission tomography (PET) and single photon emission computed tomography (SPECT). Owing to their higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Together, these two developments yield higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
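The job-splitting idea behind such cluster runs can be sketched as follows: partition the total event count across jobs, give each its own seed, and emit one fully resolved macro per job. This is a minimal illustration; the helper names (`split_events`, `make_macro`) and the exact macro lines are assumptions, not the actual GATE tooling.

```python
# Sketch of cluster job splitting for a Monte Carlo simulation: divide the
# total number of events over N jobs and generate one per-job macro each.
# Helper names and macro syntax are illustrative, not the real GATE API.

def split_events(total_events: int, n_jobs: int) -> list:
    """Divide total_events as evenly as possible across n_jobs."""
    base, extra = divmod(total_events, n_jobs)
    return [base + (1 if i < extra else 0) for i in range(n_jobs)]

def make_macro(job_id: int, n_events: int, seed: int) -> str:
    """Emit a minimal, fully resolved per-job macro (illustrative commands)."""
    return ("/random/setSeed {}\n".format(seed)
            + "/output/filename job_{:03d}\n".format(job_id)
            + "/run/beamOn {}\n".format(n_events))

counts = split_events(10_000_000, 64)
macros = [make_macro(i, n, seed=1000 + i) for i, n in enumerate(counts)]
assert sum(counts) == 10_000_000          # no events lost in the split
```

Each macro would then be referenced by a generated, cluster-specific submit file, one job per macro.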

  5. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It can also be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  6. Computer simulation of bubble formation.

    SciTech Connect

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

    2007-01-01

Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area of the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation and stability and the dynamics of cavity evolution in bulk liquid metals were studied. Radial densities, pressures, surface tensions, and work functions of nanoscale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal; the work functions for cavity formation in liquid Li were calculated and compared with the available experimental data. The cavitation rate can further be obtained by using classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. The MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid-phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of the MD calculations were compared with estimates based on classical nucleation theory.
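The classical nucleation theory estimate mentioned in the abstract takes the standard textbook form (a generic sketch, not the authors' specific parameterization): the cavitation rate per unit volume is

```latex
J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
\qquad
\Delta G^{*} = \frac{16\pi\,\sigma^{3}}{3\,(\Delta P)^{2}},
```

where $\sigma$ is the liquid surface tension, $\Delta P$ the magnitude of the applied negative pressure (tension), and $J_0$ a kinetic prefactor; MD-derived cavitation frequencies are typically compared against this expression.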

  7. Polymer Composites Corrosive Degradation: A Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

A computational simulation of polymer composite corrosive durability is presented. The corrosive environment is assumed to govern the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined-stress failure, and laminate theories. The simulation thus proceeds from constitutive material properties up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
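The assumed through-thickness environment profiles (parabolic for voids, linear for temperature and moisture) can be written out directly. The endpoint values below are hypothetical, and the parabola is assumed to peak at the exposed face; the paper does not specify either.

```python
# Illustrative ply-by-ply environment profiles: void content varying
# parabolically and temperature/moisture varying linearly through the
# laminate thickness. All numeric values are hypothetical.

def ply_environment(z, n_plies, void_max=0.05, t_surf=60.0, t_back=20.0,
                    m_surf=0.02, m_back=0.0):
    """Return (void, temperature, moisture) for ply index z (0 = exposed face)."""
    s = z / (n_plies - 1)                      # normalized thickness coordinate
    void = void_max * (1.0 - s) ** 2           # parabolic, assumed max at face
    temp = t_surf + (t_back - t_surf) * s      # linear through thickness
    moist = m_surf + (m_back - m_surf) * s     # linear through thickness
    return void, temp, moist

profiles = [ply_environment(z, 8) for z in range(8)]
assert profiles[0][1] == 60.0 and profiles[-1][1] == 20.0
```

Each ply's (void, temperature, moisture) triple would then feed the micro- and macro-mechanics theories on a ply-by-ply basis.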

  8. New Developments in the Simulation of Advanced Accelerator Concepts

    SciTech Connect

    Bruhwiler, David L.; Cary, John R.; Cowan, Benjamin M.; Paul, Kevin; Mullowney, Paul J.; Messmer, Peter; Geddes, Cameron G. R.; Esarey, Eric; Cormier-Michel, Estelle; Leemans, Wim; Vay, Jean-Luc

    2009-01-22

Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of approximately 2,000 as compared to standard particle-in-cell.
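The quoted speedup is consistent with the standard boosted-frame scaling argument (a generic estimate, not this paper's derivation): simulating in a frame moving with relativistic factor $\gamma$ toward the plasma contracts the disparity between the laser wavelength and the acceleration length, reducing the computational cost by roughly

```latex
S \sim (1+\beta)^{2}\gamma^{2} \approx 4\gamma^{2} \qquad (\beta \to 1),
```

so a modest boost of a few tens in $\gamma$ already accounts for speedups of order $10^{3}$.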

  9. New Developments in the Simulation of Advanced Accelerator Concepts

    SciTech Connect

    Paul, K.; Cary, J.R.; Cowan, B.; Bruhwiler, D.L.; Geddes, C.G.R.; Mullowney, P.J.; Messmer, P.; Esarey, E.; Cormier-Michel, E.; Leemans, W.P.; Vay, J.-L.

    2008-09-10

Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of ~2,000 as compared to standard particle-in-cell.

  10. Recent advances of strong-strong beam-beam simulation

    SciTech Connect

Qiang, Ji; Furman, Miguel A.; Ryne, Robert D.; Fischer, Wolfram; Ohmi, Kazuhito

    2004-09-15

In this paper, we report on recent advances in strong-strong beam-beam simulation. Numerical methods used in the calculation of the beam-beam forces are reviewed. A new computational method to solve the Poisson equation on a nonuniform grid is presented. This method reduces the computational cost by half compared with the standard FFT-based method on a uniform grid. It is also more accurate than the standard method for colliding beams with a low transverse aspect ratio. As applications, we present a study of coherent modes with multi-bunch, multi-collision beam-beam interactions at RHIC, and a strong-strong simulation of the luminosity evolution at KEKB with and without a finite crossing angle.
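For contrast with the nonuniform-grid solver, the standard uniform-grid FFT approach that the abstract takes as its baseline can be sketched as follows; this is a generic textbook implementation on a periodic grid, not the authors' code, and the grid size and source term are illustrative.

```python
# Baseline FFT Poisson solve on a uniform periodic grid:
# solve  -laplacian(phi) = rho  by dividing by k^2 in Fourier space.
import numpy as np

def poisson_fft_2d(rho, L=2 * np.pi):
    """Spectral solve of -laplacian(phi) = rho on a periodic L x L grid."""
    n = rho.shape[0]
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi       # angular wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                                   # avoid divide-by-zero
    phi_hat = np.fft.fft2(rho) / k2
    phi_hat[0, 0] = 0.0                              # fix the mean to zero
    return np.real(np.fft.ifft2(phi_hat))

# Check against an analytic pair: rho = 2 sin x sin y  ->  phi = sin x sin y
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = poisson_fft_2d(2 * np.sin(X) * np.sin(Y))
assert np.max(np.abs(phi - np.sin(X) * np.sin(Y))) < 1e-10
```

A nonuniform-grid solver as described in the abstract replaces this single global transform with a grid adapted to the beam's aspect ratio, which is where the factor-of-two cost saving comes from.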

  11. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; Knight, Chris

    1999-01-01

Knowledge management, in general, tries to organize important know-how and make it available whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge of groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates into larger profits. Collaborative computing therefore provides a common communication space, improves the sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  12. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first-order differential equations, and monitoring up to 48 variables on printed output and on plots. The simulated system consists of a high gain antenna, an antenna gimbal control system, an onboard computer, and the environment in which all are to operate.

  13. The Consortium for Advanced Simulation of Light Water Reactors

    SciTech Connect

Szilard, Ronaldo; Zhang, Hongbin; Kothe, Doug; Turinsky, Paul

    2011-10-01

The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a DOE Energy Innovation Hub for modeling and simulation of nuclear reactors. It brings together an exceptionally capable team from national labs, industry, and academia that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated the Virtual Environment for Reactor Applications (VERA), will incorporate science-based models, state-of-the-art numerical methods, modern computational science and engineering practices, and uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs). It will couple state-of-the-art fuel performance, neutronics, thermal-hydraulics (T-H), and structural models with existing tools for systems and safety analysis and will be designed for implementation on both today's leadership-class computers and the advanced-architecture platforms now under development by the DOE. CASL focuses on a set of challenge problems, such as CRUD-induced power shift and localized corrosion, grid-to-rod fretting fuel failures, pellet-clad interaction, and fuel assembly distortion, that encompass the key phenomena limiting the performance of PWRs. It is expected that much of the capability developed will be applicable to other types of reactors. CASL's mission is to develop and apply modeling and simulation capabilities to address three critical areas of performance for nuclear power plants: (1) reduce capital and operating costs per unit energy by enabling power uprates and plant lifetime extension, (2) reduce the nuclear waste volume generated by enabling higher fuel burnup, and (3) enhance nuclear safety by enabling high-fidelity predictive capability for component performance.

  14. Computer simulations of isolated conductors in electrostatic equilibrium.

    PubMed

    Chang, Herng-Hua

    2008-11-01

A computer simulation model is introduced to study the characteristics of isolated conductors in electrostatic equilibrium. Drawing an analogy with how electrons move to the surface of isolated conductors, we randomly initialize a large number of particles inside a small region at the center of simulated conductors and advance them according to their forces of repulsion. Using optimized numerical techniques of the finite-size particle method associated with Poisson's equation, the particles are quickly advanced using a fast Fourier transform and their charge is efficiently shared using the clouds-in-cells method. The particle populations in the simulations range from 50×10^3 to 1×10^6, moving in computation domains of 128×128, 256×256, and 512×512 grids. When the particles come to electrostatic equilibrium, they lie on the boundaries of the simulated conductors, from which the equilibrium properties are obtained. Consistent with the theory of electrostatics and charged conductors, we found that the particles move in response to the conductor geometry in such a way that the electrostatic energy is minimized. Good approximation results for the equilibrium properties were obtained using the proposed computer simulation model.
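The clouds-in-cells charge sharing mentioned above can be illustrated in one dimension: each particle's charge is split linearly between its two nearest grid points (the 2D case used in the paper weights the four surrounding nodes). Grid size and particle positions below are illustrative.

```python
# Clouds-in-cells (CIC) charge deposition on a periodic 1D grid:
# each particle's charge is shared linearly between the two nearest nodes.

def deposit_cic_1d(positions, grid_n, dx=1.0, q=1.0):
    """Linear-weighting charge deposition onto a periodic 1D grid."""
    rho = [0.0] * grid_n
    for x in positions:
        cell = int(x // dx) % grid_n        # left-hand grid point
        frac = (x / dx) - int(x // dx)      # fractional offset within the cell
        rho[cell] += q * (1.0 - frac)       # nearer node receives more charge
        rho[(cell + 1) % grid_n] += q * frac
    return rho

rho = deposit_cic_1d([2.25, 5.0], grid_n=8)
assert abs(sum(rho) - 2.0) < 1e-12          # total charge is conserved
```

The resulting grid charge density is what the FFT-based Poisson solve then turns into forces on the particles.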

  15. Modeling and Computer Simulation: Molecular Dynamics and Kinetic Monte Carlo

    SciTech Connect

    Wirth, B.D.; Caturla, M.J.; Diaz de la Rubia, T.

    2000-10-10

Recent years have witnessed tremendous advances in the realistic multiscale simulation of complex physical phenomena, such as irradiation and aging effects of materials, made possible by the enormous progress achieved in computational physics for calculating reliable, yet tractable interatomic potentials and the vast improvements in computational power and parallel computing. As a result, computational materials science is emerging as an important complement to theory and experiment to provide fundamental materials science insight. This article describes the atomistic modeling techniques of molecular dynamics (MD) and kinetic Monte Carlo (KMC), and an example of their application to radiation damage production and accumulation in metals. It is important to note at the outset that the primary objective of atomistic computer simulation should be obtaining physical insight into atomic-level processes. Classical molecular dynamics is a powerful method for obtaining insight about the dynamics of physical processes that occur on relatively short time scales. Current computational capability allows treatment of atomic systems containing as many as 10^9 atoms for times on the order of 100 ns (10^-7 s). The main limitation of classical MD simulation is the relatively short times accessible. Kinetic Monte Carlo provides the ability to reach macroscopic times by modeling diffusional processes and time-scales rather than individual atomic vibrations. Coupling MD and KMC has developed into a powerful, multiscale tool for the simulation of radiation damage in metals.
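The KMC stepping just described, event selection weighted by rate, followed by a residence-time clock advance, can be sketched as follows. The rates are illustrative placeholders, not a materials model.

```python
# One residence-time (BKL) kinetic Monte Carlo step: choose an event with
# probability proportional to its rate, then advance time by an
# exponentially distributed waiting time with mean 1/(total rate).
import math, random

def kmc_step(rates, rng=random):
    """Return (chosen event index, time increment) for one KMC step."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    dt = -math.log(1.0 - rng.random()) / total   # exponential waiting time
    return i, dt

random.seed(0)
t, counts = 0.0, [0, 0, 0]
for _ in range(10_000):
    i, dt = kmc_step([1.0, 2.0, 7.0])    # e.g. three defect-hop rates
    counts[i] += 1
    t += dt
assert counts[2] > counts[1] > counts[0]  # fastest event dominates
```

Because time advances by whole waiting times between events rather than femtosecond vibrations, this loop reaches macroscopic times that MD cannot.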

  16. Advancing Material Models for Automotive Forming Simulations

    NASA Astrophysics Data System (ADS)

    Vegter, H.; An, Y.; ten Horn, C. H. L. J.; Atzema, E. H.; Roelofsen, M. E.

    2005-08-01

Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation codes cannot describe plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain-hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high-strength steels in the automotive industry requires a description of the plastic behaviour of multiphase steels. The simplest variant is dual-phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Existing methods for predicting the plastic behaviour of single-phase materials use a simple dislocation interaction model based only on the cell structures that are formed. At Corus, a new method has been proposed in which predictions of the plastic behaviour of multiphase materials take into account the hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional microstructural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive strain-hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations through large variations in blank size and processing conditions. The tests are a powerful tool for optimising forming simulations.

  17. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

The Virtual National Airspace Simulation (VNAS) will improve the safety of air transportation. In 2001, using simulation and information management software running over a distributed network of supercomputers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, which will support the development of strategies for improving aviation safety and identifying precursors to component failure.

  18. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  19. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between commercial ground computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  20. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Cullen, John F., Jr.

    1989-01-01

    Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)

  1. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  2. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
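The multigrid-preconditioned iterative methods described above follow the usual preconditioned conjugate-gradient template. The sketch below substitutes a trivial Jacobi (diagonal) preconditioner for the multigrid V-cycle and a small 1D Laplacian for the real discretized PDE; it is a structural illustration, not the laboratory's solver.

```python
# Preconditioned conjugate gradients on a matrix-free 1D Laplacian.
# The Jacobi preconditioner (M = diag(A) = 2I) stands in for the
# multigrid preconditioner used in practice.

def laplacian_apply(v):
    """(Av)_i = 2 v_i - v_{i-1} - v_{i+1} with Dirichlet boundaries."""
    n = len(v)
    return [2 * v[i] - (v[i - 1] if i > 0 else 0.0)
                     - (v[i + 1] if i < n - 1 else 0.0) for i in range(n)]

def pcg(b, tol=1e-10, max_iter=500):
    n = len(b)
    x = [0.0] * n
    r = b[:]                               # residual for x0 = 0
    z = [ri / 2.0 for ri in r]             # apply preconditioner M^{-1}
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = laplacian_apply(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if max(abs(ri) for ri in r) < tol:
            break
        z = [ri / 2.0 for ri in r]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

x = pcg([1.0] * 32)
```

Scalability in the abstract's sense means the iteration count stays nearly flat as the grid grows, which is exactly what replacing the Jacobi step with a multigrid cycle buys.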

  3. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  4. Automatic temperature computation for realistic IR simulation

    NASA Astrophysics Data System (ADS)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal code that accurately takes into account the thermal attributes of the materials in a three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, the incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes are associated with several layers; conductivity, absorption, spectral emissivity, density, specific heat, thickness, and convection coefficients are taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation (woods). This model of thermal attributes yields very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The main originality concerns the way the heating fluxes are computed: using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. This library also supplies other thermal modules, such as a thermal shadow computation tool.
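The one-dimensional, layered thermal model described above can be sketched with an explicit finite-difference scheme. All material values below are illustrative, and the real code's wavelength-by-wavelength fluxes and 24-hour shadow history are omitted.

```python
# Explicit (FTCS) 1D heat conduction through a layered slab, driven by an
# absorbed flux at the exposed face, with an insulated back face.
# Material values are illustrative, not from MURET.

def advance(T, dt, dx, k, rho_cp, q_abs):
    """Advance layer temperatures by one explicit time step."""
    n = len(T)
    a = k * dt / (rho_cp * dx * dx)        # diffusion number (must be <= 0.5)
    Tn = T[:]
    for i in range(1, n - 1):
        Tn[i] = T[i] + a * (T[i + 1] - 2 * T[i] + T[i - 1])
    Tn[0] = T[0] + a * (T[1] - T[0]) + q_abs * dt / (rho_cp * dx)
    Tn[-1] = T[-1] + a * (T[-2] - T[-1])   # insulated back boundary
    return Tn

T = [20.0] * 10                            # 10 layers, initially 20 degC
for _ in range(1000):                      # 10,000 s of absorbed flux
    T = advance(T, dt=10.0, dx=0.01, k=1.0, rho_cp=2.0e6, q_abs=500.0)
assert T[0] > T[-1]                        # exposed face is warmest
```

In the full code, `q_abs` would itself be the ray-traced, shadow-aware sum of direct and diffuse fluxes for each polygon at each half-hour.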

  5. Structural Composites Corrosive Management by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

A simulation of corrosive management of polymer composite durability is presented. The corrosive environment is assumed to govern the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined-stress failure, and laminate theories. The simulation thus proceeds from constitutive material properties up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  6. Task simulation in computer-based training

    SciTech Connect

    Gardner, P.R.

    1988-02-01

Westinghouse Hanford Company (WHC) makes extensive use of job-task simulations in company-developed computer-based training (CBT) courseware. This courseware differs from most others because it does not simulate process control machinery or other computer programs; instead, the WHC Exercises model day-to-day tasks such as physical work preparations, work progress, and incident handling. These Exercises provide a higher level of motivation and enable the testing of more complex patterns of behavior than those typically measured by multiple-choice and short-answer questions. Examples from the WHC Radiation Safety and Crane Safety courses are used as illustrations. 3 refs.

  7. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an iconic-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  8. Computer simulation in sport and industry.

    PubMed

    Hubbard, M

    1993-01-01

The last several decades have brought decreases in the specific cost of computer memory and increases in processor throughput. As a result simulation has become correspondingly more important as a component of industrial design and as a method for the study of general biomechanics and sports techniques. This paper illustrates, by way of examples, several of the more important aspects of the application of computer simulation to dynamic problems. Topics include (1) the ideas of suitable model complexity and its tradeoff with interpretability; (2) the sequential and iterative nature of model building and the importance of experimental data in the modelling and validation process; (3) the essential role of user-friendly software and graphical interfaces in the interchange of information between simulation programs and the users; and (4) the role of computer simulation in learning feedback loops, both in the field and in the computer laboratory. Most industrial use of simulation is in the design process. A similar approach is equally valid in biomechanics and sport applications through the incorporation of design variables, which may be easily changed in the model experiment.

  9. Computer simulation of the threshold sensitivity determinations

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1974-01-01

    A computer simulation study was carried out to evaluate various methods for determining threshold stimulus levels for impact sensitivity tests. In addition, the influence of a number of variables (initial stimulus level, particular stimulus response curve, and increment size) on the apparent threshold values and on the corresponding population response levels was determined. Finally, a critical review of previous assumptions regarding the stimulus response curve for impact testing is presented in the light of the simulation results.
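The staircase ("up-and-down") procedure commonly used for determining threshold stimulus levels can itself be simulated in a few lines. The logistic stimulus-response curve, starting level, and increment below are illustrative assumptions, not the configurations studied in the paper:

```python
import math
import random

def response_prob(level, mu=50.0, sigma=5.0):
    # Hypothetical logistic stimulus-response curve standing in for the
    # unknown population response curve of the specimens under test.
    return 1.0 / (1.0 + math.exp(-(level - mu) / sigma))

def up_down_test(start, increment, trials, rng):
    # Up-and-down procedure: step the stimulus down after a "go"
    # (response) and up after a "no-go".
    level, levels = start, []
    for _ in range(trials):
        levels.append(level)
        go = rng.random() < response_prob(level)
        level += -increment if go else increment
    # Crude threshold estimate: mean of the tested stimulus levels.
    return sum(levels) / len(levels)

threshold = up_down_test(start=40.0, increment=2.0, trials=400,
                         rng=random.Random(1))
```

Repeating such runs while varying the initial level, the response curve, and the increment size is exactly the kind of sensitivity study the abstract describes.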

  10. Computation Simulation Of Autonomous Vehicle Navigation

    NASA Astrophysics Data System (ADS)

    Meystel, A.; Koch, E.

    1984-06-01

A concept of navigation based upon heuristic search is simulated. A mobile robot with a vision system navigates with an unknown or an unclear map. The range of vision is limited, forcing repeated judgments that compare alternative directions of motion. The frequency of the decision-making procedure is bounded by the finite time available for computation. The system is simulated with a number of maps and the results of navigation are compared.
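A minimal sketch of this style of navigation, assuming a grid world, Chebyshev-range sensing, and a greedy Manhattan-distance heuristic (all simplifications invented here, not the paper's system):

```python
def navigate(grid, start, goal, vision=2, max_steps=200):
    # Greedy navigation with limited sensing: obstacles (grid cells
    # equal to 1) are only discovered once they come within `vision`
    # cells (Chebyshev distance) of the robot.
    known_blocked = set()
    pos, path = start, [start]
    for _ in range(max_steps):
        if pos == goal:
            return path
        x, y = pos
        # Sense: reveal obstacles within the vision range.
        for i in range(len(grid)):
            for j in range(len(grid[0])):
                if grid[i][j] and max(abs(i - x), abs(j - y)) <= vision:
                    known_blocked.add((i, j))
        # Compare alternatives of motion: pick the admissible
        # neighbour closest (Manhattan) to the goal. Forbidding
        # revisits is a crude loop-avoidance simplification.
        moves = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        moves = [m for m in moves
                 if 0 <= m[0] < len(grid) and 0 <= m[1] < len(grid[0])
                 and m not in known_blocked and m not in path]
        if not moves:
            return path  # dead end under current knowledge
        pos = min(moves, key=lambda m: abs(m[0] - goal[0]) + abs(m[1] - goal[1]))
        path.append(pos)
    return path

grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1                      # one obstacle, unknown at start
path = navigate(grid, start=(0, 0), goal=(4, 4))
```

Running the same navigator over a set of maps and comparing path lengths mirrors the comparison described in the abstract.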

  11. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  12. Advances in NLTE Modeling for Integrated Simulations

    SciTech Connect

    Scott, H A; Hansen, S B

    2009-07-08

The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.
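The screened-hydrogenic idea can be illustrated with the textbook hydrogenic level formula; the Slater-like screening constant below is a crude placeholder, not the fitted constants a production NLTE model would use:

```python
RYDBERG_EV = 13.606

def screened_level(z, n, inner_electrons, screening=0.85):
    # Screened-hydrogenic level energy in eV: each inner electron
    # reduces the effective nuclear charge seen by the bound electron.
    # The screening constant is an illustrative Slater-like value.
    z_eff = z - screening * inner_electrons
    return -RYDBERG_EV * z_eff ** 2 / n ** 2

# Hydrogen ground state (no screening) recovers the Rydberg energy.
e_h = screened_level(z=1, n=1, inner_electrons=0)
```

The appeal noted in the abstract is that such closed-form level energies are cheap enough to evaluate inline in a radiation-hydrodynamics timestep.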

  13. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  14. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.
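Of the phases listed, the numerical-solution phase contributes discretization error, which can itself be estimated numerically. A small illustration (not from the paper) using Richardson extrapolation on the composite trapezoid rule:

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoid rule on n subintervals.
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

# Compare two mesh resolutions to estimate the discretization error.
coarse = trapezoid(math.sin, 0.0, math.pi, 16)
fine = trapezoid(math.sin, 0.0, math.pi, 32)
# Richardson estimate for a second-order method: the remaining error
# of the fine result is approximately (fine - coarse) / (2**2 - 1).
err_estimate = (fine - coarse) / (2 ** 2 - 1)
```

The same halve-the-mesh comparison applies whether the discretization is finite elements, finite volumes, or finite differences, which is the sense in which the paper's numerical error sources are method-independent.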

  15. Advanced Techniques for Simulating the Behavior of Sand

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2009-12-01

Since the goal of this research is to simulate the look and behavior of sand, this work will go beyond simple particle collision. In particular, we can continue to use our parallel algorithms not only on single particles but on particle “clumps” that consist of multiple combined particles. Since sand is typically not spherical in nature, these particle “clumps” help to simulate the coarse nature of sand. In a simulation environment, multiple combined particles could be used to simulate the polygonal and granular nature of sand grains. Thus, a diversity of sand particles can be generated. The interaction between these particles can then be parallelized using GPU hardware. As such, this research will investigate different graphics and physics techniques and determine the tradeoffs in performance and visual quality for sand simulation. An enhanced sand model through the use of high performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques that can ultimately be used for their vehicle simulator. As an added benefit of this work, advancements in simulating sand can also benefit scientists here on earth, especially in regard to understanding landslides and debris flows.

  16. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  17. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors (see figure).

  18. A School Finance Computer Simulation Model

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    1974-01-01

    Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)

  19. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  20. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  1. Decision Making in Computer-Simulated Experiments.

    ERIC Educational Resources Information Center

    Suits, J. P.; Lagowski, J. J.

    A set of interactive, computer-simulated experiments was designed to respond to the large range of individual differences in aptitude and reasoning ability generally exhibited by students enrolled in first-semester general chemistry. These experiments give students direct experience in the type of decision making needed in an experimental setting.…

  2. GENMAP--A Microbial Genetics Computer Simulation.

    ERIC Educational Resources Information Center

    Day, M. J.; And Others

    1985-01-01

    An interactive computer program in microbial genetics is described. The simulation allows students to work at their own pace and develop understanding of microbial techniques as they choose donor bacterial strains, specify selective media, and interact with demonstration experiments. Sample questions and outputs are included. (DH)

  3. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high…

  4. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
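The storage scheme described, in which a synapse consumes memory only on the compute node harboring its target neuron, can be sketched as follows. The round-robin neuron placement and single synapse type are illustrative assumptions, and the real code uses compact metaprogrammed containers rather than Python dictionaries:

```python
from collections import defaultdict

class ComputeNode:
    """Toy model of one rank: it stores only those synapses whose
    target neuron lives on this rank, grouped by source neuron and
    synapse type (the two dimensions along which the target lists
    collapse from the viewpoint of an individual node)."""

    def __init__(self, rank, n_nodes):
        self.rank, self.n_nodes = rank, n_nodes
        # source neuron -> synapse type -> list of (target, weight)
        self.synapses = defaultdict(lambda: defaultdict(list))

    def owns(self, neuron_id):
        # Illustrative round-robin placement of neurons onto nodes.
        return neuron_id % self.n_nodes == self.rank

    def maybe_store(self, source, target, syn_type, weight):
        if self.owns(target):
            self.synapses[source][syn_type].append((target, weight))

nodes = [ComputeNode(r, 4) for r in range(4)]
connections = [(0, t, "static", 0.1) for t in range(1, 9)]
for node in nodes:
    for conn in connections:
        node.maybe_store(*conn)

# Each of the 8 synapses consumes memory on exactly one node.
stored = sum(len(lst) for node in nodes
             for by_type in node.synapses.values()
             for lst in by_type.values())
```

At brain scale, any given source neuron typically contributes at most one entry per node, which is why flattening these nested maps pays off in memory.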

  5. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  6. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.

  7. Two Computer Simulations for Astronomy Education

    NASA Astrophysics Data System (ADS)

    Stoner, Ronald

    1997-05-01

Two-dimensional media, such as transparencies and textbook illustrations, are often inadequate for representing three-dimensional phenomena. Computer simulation using animation and interactive graphics can solve the pedagogic problem of allowing students to visualize inherently 3-D phenomena in physics and astronomy. This paper demonstrates two such computer simulations intended for use in introductory astronomy courses. The first permits visualization of astronomical structures on several different size scales by converting catalogs of astronomical objects at known distances (stars, star clusters, galaxies, etc.) to 3-D arrays of color-coded points that can be rotated in simulation to reveal 3-D structure. The second simulates the apparent motion of the sun in the sky of an arbitrary planet, simultaneously with the combined rotational and orbital motion of the planet that is responsible for it. These simulations were written in Borland Pascal for MS-DOS computers using the utilities package distributed with CUPS software (Educational software packages produced by the Consortium on Upper-level Physics Software (CUPS) are available from John Wiley & Sons, Inc.).

  8. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
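The general spectral approach to generating 2D random fields, of which the paper's modified spectral-phase method is a refinement, can be sketched in pure Python. The power-law spectrum and the naive O(n^4) transform below are illustrative simplifications; a real code would use FFTs (e.g., via MKL or CUDA, as in the paper):

```python
import cmath
import math
import random

def random_field_2d(n, exponent=-11.0 / 6.0, seed=0):
    # Weight white complex Gaussian noise by the square root of a
    # power-law spectrum, then inverse-transform to real space. The
    # Kolmogorov-like exponent is a placeholder assumption.
    rng = random.Random(seed)
    spec = [[0.0] * n for _ in range(n)]
    for p in range(n):
        for q in range(n):
            kp, kq = min(p, n - p), min(q, n - q)  # signed frequencies
            k2 = kp * kp + kq * kq
            if k2 > 0:                 # drop the zero-frequency term
                spec[p][q] = k2 ** (exponent / 2.0)
    noise = [[complex(rng.gauss(0, 1), rng.gauss(0, 1))
              for _ in range(n)] for _ in range(n)]
    field = [[0.0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            acc = 0j
            for p in range(n):
                for q in range(n):
                    phase = 2j * math.pi * (p * x + q * y) / n
                    acc += spec[p][q] * noise[p][q] * cmath.exp(phase)
            field[x][y] = acc.real / (n * n)
    return field

field = random_field_2d(8)
```

Time-variant screens follow by evolving the spectral phases between frames, which is where the "modified spectral-phase" idea of the paper comes in.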

  9. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors. PMID:27140113

  10. Computer simulations of WIGWAM underwater experiment

    SciTech Connect

    Kamegai, Minao; White, J.W.

    1993-11-01

    We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

  11. Cosmological Simulations on a Grid of Computers

    NASA Astrophysics Data System (ADS)

    Depardon, Benjamin; Caron, Eddy; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène

    2010-06-01

The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to determine which parameters most influence the results, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is empirical: we run many simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution time and output file sizes, and run the simulations using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.
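The empirical approach, running the model over a grid of candidate parameter values and keeping those whose output is acceptable, can be sketched with a toy stand-in for a GALICS run; the model function and acceptance range below are invented for illustration:

```python
from itertools import product

def toy_model(star_formation_eff, feedback_eff):
    # Hypothetical stand-in for a GALICS-like run: returns a single
    # summary statistic instead of a full galaxy catalogue.
    return star_formation_eff / (1.0 + feedback_eff)

# Sweep a grid of candidate parameter values; in the paper each such
# evaluation is a full simulation dispatched to the grid by DIET.
grid = product([0.01, 0.05, 0.1, 0.2], [0.5, 1.0, 2.0])
acceptable = [(sfe, fbe) for sfe, fbe in grid
              if 0.02 <= toy_model(sfe, fbe) <= 0.08]
```

Since each grid point is independent, the sweep is embarrassingly parallel, which is what makes a grid middleware a natural fit.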

  12. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  13. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  14. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  15. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  16. High-performance computing MRI simulations.

    PubMed

    Stöcker, Tony; Vahedipour, Kaveh; Pflugfelder, Daniel; Shah, N Jon

    2010-07-01

    A new open-source software project is presented, JEMRIS, the Jülich Extensible MRI Simulator, which provides an MRI sequence development and simulation environment for the MRI community. The development was driven by the desire to achieve generality of simulated three-dimensional MRI experiments reflecting modern MRI systems hardware. The accompanying computational burden is overcome by means of parallel computing. Many aspects are covered that have not hitherto been simultaneously investigated in general MRI simulations such as parallel transmit and receive, important off-resonance effects, nonlinear gradients, and arbitrary spatiotemporal parameter variations at different levels. The latter can be used to simulate various types of motion, for instance. The JEMRIS user interface is very simple to use, but nevertheless it presents few limitations. MRI sequences with arbitrary waveforms and complex interdependent modules are modeled in a graphical user interface-based environment requiring no further programming. This manuscript describes the concepts, methods, and performance of the software. Examples of novel simulation results in active fields of MRI research are given.

  17. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... airplane simulators. The requirements in this appendix are in addition to the simulator approval requirements in § 121.407. Each simulator used under this appendix must be approved as a Level B, C, or D simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or...

  18. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... airplane simulators. The requirements in this appendix are in addition to the simulator approval requirements in § 121.407. Each simulator used under this appendix must be approved as a Level B, C, or D simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or...

  19. Advances in free-energy-based simulations of protein folding and ligand binding.

    PubMed

    Perez, Alberto; Morrone, Joseph A; Simmerling, Carlos; Dill, Ken A

    2016-02-01

    Free-energy-based simulations are increasingly providing the narratives about the structures, dynamics and biological mechanisms that constitute the fabric of protein science. Here, we review two recent successes. It is becoming practical: first, to fold small proteins with free-energy methods without knowing substructures and second, to compute ligand-protein binding affinities, not just their binding poses. Over the past 40 years, the timescales that can be simulated by atomistic MD are doubling every 1.3 years--which is faster than Moore's law. Thus, these advances are not simply due to the availability of faster computers. Force fields, solvation models and simulation methodology have kept pace with computing advancements, and are now quite good. At the tip of the spear recently are GPU-based computing, improved fast-solvation methods, continued advances in force fields, and conformational sampling methods that harness external information. PMID:26773233
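    The growth-rate comparison in the abstract is simple arithmetic to verify; the snippet below just evaluates the two exponentials (the 1.3-year figure is from the abstract, while the ~2-year Moore's-law doubling period is an assumption):

```python
# Growth over 40 years for two doubling periods (arithmetic check only).
years = 40
md_factor = 2 ** (years / 1.3)      # simulated MD timescale growth
moore_factor = 2 ** (years / 2.0)   # transistor-count growth, ~2-year doubling
print(f"MD: ~{md_factor:.1e}x, Moore: ~{moore_factor:.1e}x")
```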

  1. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.
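    For context, the well index mentioned in item (2) is conventionally computed, for a vertical well centered in a square isotropic grid block, with Peaceman's formula; a sketch with made-up block properties (the project itself targets more advanced treatments for non-conventional wells):

```python
import math

def peaceman_well_index(k, h, dx, rw, skin=0.0):
    """Peaceman well index for a vertical well centered in a square, isotropic
    grid block: WI = 2*pi*k*h / (ln(r0/rw) + s), with equivalent radius
    r0 ~= 0.198*dx. Consistent units are assumed throughout."""
    r0 = 0.198 * dx
    return 2 * math.pi * k * h / (math.log(r0 / rw) + skin)

# hypothetical block: permeability 100, thickness 10, dx 100, wellbore radius 0.1
print(peaceman_well_index(k=100, h=10, dx=100, rw=0.1))
```

    Positive skin lowers the well index, i.e. a damaged near-well region reduces productivity for the same pressure drawdown.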

  2. Computer Simulations Improve University Instructional Laboratories1

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  3. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  4. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPU's and the interface between the CPU's and RAM. Design tradeoffs are presented in the following areas: Bus widths, CPU microprogram read only memory cycle time, multiple instruction fetch, and instruction mix.

  5. Introduction to computational oral absorption simulation.

    PubMed

    Sugano, Kiyohiko

    2009-03-01

    Computational oral absorption simulation (COAS) is anticipated to be a powerful tool in improving the productivity of drug discovery and development. This article reviews the theories of pharmaceutical sciences that consist of COAS. Although most of these theories are classical, they are revisited from the context of modern drug discovery and development. The theories of solubility, diffusion, dissolution, precipitation, intestinal membrane permeation and gastrointestinal transit are comprehensively described. Prediction strategy is then discussed based on the biopharmaceutical classification system. In the final part, good simulation practice is proposed and many frequently asked questions answered.
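    Of the classical theories listed, dissolution is the most compact to illustrate; a sketch of the Noyes-Whitney first-order dissolution model with hypothetical parameter values (not taken from the article):

```python
import numpy as np

def dissolve(C0, Cs, k, t):
    """Noyes-Whitney first-order dissolution, dC/dt = k*(Cs - C), solved in
    closed form. Cs: solubility; k lumps D*A/(V*h); C0: initial concentration."""
    return Cs - (Cs - C0) * np.exp(-k * t)

t = np.linspace(0, 120, 5)                   # minutes
C = dissolve(C0=0.0, Cs=0.1, k=0.05, t=t)    # hypothetical drug, Cs = 0.1 mg/mL
print(C)                                     # rises toward the solubility limit
```

    A COAS tool couples many such kernels (dissolution, precipitation, permeation, transit) into one compartmental system of ODEs.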

  6. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open-source framework. The development of the code will be done in stages, starting with a basic fluid dynamics simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers to further test and improve the VASIMR engine.

  7. Fiber Composite Sandwich Thermostructural Behavior - Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Aiello, Robert A.; Murthy, Pappu L. N.

    1988-01-01

    Four computational simulation methods with different levels of sophistication were used to simulate thermal behavior and structural changes of composite sandwich panels with a honeycomb core subjected to a variety of environmental effects. The models on which these methods are based include three-dimensional finite-element modeling, three-dimensional finite-element modeling assuming a homogeneous core, laminate theory, and simple equations for predicting the equivalent properties of the honeycomb core. A procedure was developed and embedded in a composite mechanics computer code, which made it possible to conduct parametric studies to determine 'optimum' composite sandwich configurations for specific applications. The procedure was applied for the evaluation of composite sandwich behavior at the global, local, laminate, ply, and micromechanics levels when the composite sandwich is subjected to hygral, thermal, and mechanical loading environments.

  8. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  9. Metal matrix composites microfracture: Computational simulation

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Caruso, John J.; Chamis, Christos C.

    1990-01-01

    Fiber/matrix fracture and fiber-matrix interface debonding in a metal matrix composite (MMC) are computationally simulated. These simulations are part of a research activity to develop computational methods for microfracture, microfracture propagation and fracture toughness of metal matrix composites. The three-dimensional finite element model used in the simulation consists of a group of nine unidirectional fibers in a three-by-three unit cell array of SiC/Ti15 metal matrix composite with a fiber volume ratio of 0.35. This computational procedure is used to predict the fracture process and establish the hierarchy of fracture modes based on strain energy release rate. It is also used to predict stress redistribution to the surrounding matrix and fibers due to initial and progressive fracture of fiber/matrix and due to debonding of the fiber-matrix interface. Microfracture results for various loading cases such as longitudinal, transverse, shear and bending are presented and discussed. Step-by-step procedures are outlined to evaluate composite microfracture for a given composite system.

  10. Computer Simulations of Voltage-Gated Cation Channels

    PubMed Central

    Treptow, Werner; Klein, Michael L.

    2012-01-01

    The relentless growth in computational power has seen increasing applications of molecular dynamics (MD) simulation to the study of membrane proteins in realistic membrane environments, which include explicit membrane lipids, water and ions. The concomitant increasing availability of membrane protein structures for ion channels, and transporters -- to name just two examples -- has stimulated many of these MD studies. In the case of voltage-gated cation channels (VGCCs) recent computational works have focused on ion-conduction and gating mechanisms, along with their regulation by agonist/antagonist ligands. The information garnered from these computational studies is largely inaccessible to experiment and is crucial for understanding the interplay between the structure and function as well as providing new directions for experiments. This article highlights recent advances in probing the structure and function of potassium channels and offers a perspective on the challenges likely to arise in making analogous progress in characterizing sodium channels. PMID:22523619

  11. Real-time simulation of an automotive gas turbine using the hybrid computer

    NASA Technical Reports Server (NTRS)

    Costakis, W.; Merrill, W. C.

    1984-01-01

    A hybrid computer simulation of an Advanced Automotive Gas Turbine Powertrain System is reported. The system consists of a gas turbine engine, an automotive drivetrain with a four-speed automatic transmission, and a control system. Generally, dynamic performance is simulated on the analog portion of the hybrid computer, while most of the steady-state performance characteristics are calculated on the digital portion. The simulation runs faster than real time, which makes it a useful tool for a variety of analytical studies.

  12. A Generic Scheduling Simulator for High Performance Parallel Computers

    SciTech Connect

    Yoo, B S; Choi, G S; Jette, M A

    2001-08-01

    It is well known that efficient job scheduling plays a crucial role in achieving high system utilization in large-scale high performance computing environments. A good scheduling algorithm should schedule jobs to achieve high system utilization while satisfying various user demands in an equitable fashion. Designing such a scheduling algorithm is a non-trivial task even in a static environment. In practice, the computing environment and workload are constantly changing. There are several reasons for this. First, the computing platforms constantly evolve as the technology advances. For example, the availability of relatively powerful commodity off-the-shelf (COTS) components at steadily diminishing prices have made it feasible to construct ever larger massively parallel computers in recent years [1, 4]. Second, the workload imposed on the system also changes constantly. The rapidly increasing compute resources have provided many applications developers with the opportunity to radically alter program characteristics and take advantage of these additional resources. New developments in software technology may also trigger changes in user applications. Finally, political climate change may alter user priorities or the mission of the organization. System designers in such dynamic environments must be able to accurately forecast the effect of changes in the hardware, software, and/or policies under consideration. If the environmental changes are significant, one must also reassess scheduling algorithms. Simulation has frequently been relied upon for this analysis, because other methods such as analytical modeling or actual measurements are usually too difficult or costly. A drawback of the simulation approach, however, is that developing a simulator is a time-consuming process. Furthermore, an existing simulator cannot be easily adapted to a new environment. In this research, we attempt to develop a generic job-scheduling simulator, which facilitates the evaluation of
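    A minimal example of the kind of job-scheduling simulation discussed, a first-come-first-served scheduler on an N-node machine (a toy model, not the authors' generic simulator), might look like:

```python
import heapq

def simulate_fcfs(jobs, total_nodes):
    """First-come-first-served scheduler on a machine with total_nodes nodes.
    jobs: iterable of (arrival, nodes, runtime). Returns (makespan, utilization)."""
    free = total_nodes
    running = []                 # min-heap of (finish_time, nodes_held)
    busy_node_seconds = 0.0
    now = 0.0
    for arrival, nodes, runtime in sorted(jobs):
        now = max(now, arrival)
        while free < nodes:      # FCFS: the job at the head of the queue blocks
            finish, held = heapq.heappop(running)
            now = max(now, finish)
            free += held
        heapq.heappush(running, (now + runtime, nodes))
        free -= nodes
        busy_node_seconds += nodes * runtime
    makespan = max(f for f, _ in running)
    return makespan, busy_node_seconds / (makespan * total_nodes)

jobs = [(0, 4, 10), (0, 4, 10), (5, 8, 5)]   # hypothetical workload, 8 nodes
print(simulate_fcfs(jobs, 8))                # -> (15.0, 1.0)
```

    Swapping the dispatch rule (e.g. for backfilling or priority queues) while keeping the event loop is exactly the kind of reconfiguration a generic scheduling simulator aims to make cheap.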

  13. Integrated computer simulation on FIR FEL dynamics

    SciTech Connect

    Furukawa, H.; Kuruma, S.; Imasaki, K.

    1995-12-31

    An integrated computer simulation code has been developed to analyze RF-Linac FEL dynamics. First, a simulation code for the electron beam acceleration and transport processes in the RF-Linac (LUNA) has been developed to analyze the characteristics of the electron beam in the RF-Linac and to optimize its parameters. Second, a space-time dependent 3D FEL simulation code (Shipout) has been developed. RF-Linac FEL total simulations have been performed by using the electron beam data from LUNA in Shipout. The number of particles used in an RF-Linac FEL total simulation is approximately 1000. The CPU time for the simulation of 1 round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF-Linac with a photo-cathode RF-gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at a wavelength of approximately 46 μm is investigated. By simulations using LUNA with the parameters of an ILT/ILE experiment, the pulse shape and energy spectra of the electron beam at the end of the linac are estimated. The pulse shape of the electron beam at the end of the linac has a sharp rise-up and slowly decays as a function of time. By RF-Linac FEL total simulations with the parameters of an ILT/ILE experiment, the dependencies of the start-up of the FEL oscillations on the pulse shape of the electron beam at the end of the linac are estimated. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in the RF-Linac FEL total simulations.

  14. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamless offloading of compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  15. Advanced modeling and simulation to design and manufacture high performance and reliable advanced microelectronics and microsystems.

    SciTech Connect

    Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr.; Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)

    2007-07-01

    An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high performance and reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques to determine materials properties and constitutive behavior required for modeling; (2) developing new and improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify advances necessary to facilitate the practical application of Sandia's predictive modeling technology.

  16. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  17. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  18. Space data systems: Advanced flight computers

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: technology challenges; state-of-the-art assessment; program description; relationship to external programs; and cooperation and coordination effort.

  19. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  20. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and a means for achieving flightcrew training in advanced airplane simulators. The requirements in this appendix are in addition to the simulator approval requirements in § 121.407. Each simulator used under this appendix must be approved as a Level B, C, or D simulator, as appropriate....

  1. Communication: Spin-free quantum computational simulations and symmetry adapted states.

    PubMed

    Whitfield, James Daniel

    2013-07-14

    The ideas of digital simulation of quantum systems using a quantum computer parallel the original ideas of numerical simulation using a classical computer. In order for quantum computational simulations to advance to a competitive point, many techniques from classical simulations must be imported into the quantum domain. In this article, we consider the applications of symmetry in the context of quantum simulation. Building upon well established machinery, we propose a form of first quantized simulation that only requires the spatial part of the wave function, thereby allowing spin-free quantum computational simulations. We go further and discuss the preparation of N-body states with specified symmetries based on projection techniques. We consider two simple examples, molecular hydrogen and cyclopropenyl cation, to illustrate the ideas. The methods here are the first to explicitly deal with preparing N-body symmetry-adapted states and open the door for future investigations into group theory, chemistry, and quantum simulation. PMID:23862919
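    The projection technique referred to can be illustrated in miniature with the two-element exchange group {I, SWAP} acting on two qubits: P± = (I ± SWAP)/2 projects any state onto the symmetric or antisymmetric subspace (a toy classical calculation, not the paper's N-body construction):

```python
import numpy as np

# Two-qubit exchange symmetry: the group {I, SWAP} has projectors
# P_sym = (I + SWAP)/2 and P_anti = (I - SWAP)/2.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)
I4 = np.eye(4)
P_sym = (I4 + SWAP) / 2
P_anti = (I4 - SWAP) / 2

psi = np.array([0.0, 1.0, 0.0, 0.0])    # |01>: no definite exchange symmetry
sym = P_sym @ psi                        # (|01> + |10>)/2, unnormalized
anti = P_anti @ psi                      # (|01> - |10>)/2, unnormalized
print(sym, anti)
```

    On a quantum computer the analogous projection is implemented non-deterministically, e.g. by measuring an ancilla after a controlled-SWAP, but the linear algebra above is the underlying idea.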

  3. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  4. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
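    One listed ingredient, space-filling-curve ordering, is easy to sketch with Morton (Z-order) keys, which map 3D particle coordinates to a 1D order so that contiguous chunks form compact processor domains (a generic illustration; the authors use a wavelet-based adaptive curvilinear-coordinate scheme):

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of three non-negative integer coordinates
    into a single Morton (Z-order) key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

# Sorting particles by Morton key keeps spatially nearby particles nearby
# in the 1D order, so contiguous chunks make reasonable processor domains.
particles = [(5, 1, 0), (4, 0, 0), (0, 7, 7), (1, 6, 7)]
ordered = sorted(particles, key=lambda p: morton3d(*p))
print(ordered)   # the two near-origin points sort next to each other
```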

  5. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, ADME and Toxicity (ADME-Tox) in silico modeling is still indispensable in drug discovery, as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our deep insight on how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307

  6. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols.
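    The method named in the abstract, a feed-forward network trained by back-propagation with weight decay, reduces to a few lines of NumPy; the network size, training data, and hyperparameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-8-1 feed-forward net trained by full-batch backprop with L2 weight
# decay; the regression target (product of the two inputs) is arbitrary.
X = rng.uniform(-1, 1, (64, 2))
y = (X[:, :1] * X[:, 1:]).copy()

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr, decay = 0.1, 1e-4

def loss():
    h = np.tanh(X @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

before = loss()
for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    g_out = 2 * (out - y) / len(X)        # dL/dout for mean-squared error
    g_W2 = h.T @ g_out + decay * W2       # weight decay adds lambda * W
    g_b2 = g_out.sum(0)
    g_h = g_out @ W2.T * (1 - h ** 2)     # backprop through tanh
    g_W1 = X.T @ g_h + decay * W1
    g_b1 = g_h.sum(0)
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2
after = loss()
print(before, "->", after)                # training loss decreases
```

    The abstract's cautions about training-data information content apply directly: a net like this can only interpolate flow regimes represented in its training set.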

  7. Utility of computer simulations in landscape genetics.

    PubMed

    Epperson, Bryan K; McRae, Brad H; Scribner, Kim; Cushman, Samuel A; Rosenberg, Michael S; Fortin, Marie-Josée; James, Patrick M A; Murphy, Melanie; Manel, Stéphanie; Legendre, Pierre; Dale, Mark R T

    2010-09-01

    Population genetics theory is primarily based on mathematical models in which spatial complexity and temporal variability are largely ignored. In contrast, the field of landscape genetics expressly focuses on how population genetic processes are affected by complex spatial and temporal environmental heterogeneity. It is spatially explicit and relates patterns to processes by combining complex and realistic life histories, behaviours, landscape features and genetic data. Central to landscape genetics is the connection of spatial patterns of genetic variation to the usually highly stochastic space-time processes that create them over both historical and contemporary time periods. The field should benefit from a shift to computer simulation approaches, which enable incorporation of demographic and environmental stochasticity. A key role of simulations is to show how demographic processes such as dispersal or reproduction interact with landscape features to affect probability of site occupancy, population size, and gene flow, which in turn determine spatial genetic structure. Simulations could also be used to compare various statistical methods and determine which have correct type I error or the highest statistical power to correctly identify spatio-temporal and environmental effects. Simulations may also help in evaluating how specific spatial metrics may be used to project future genetic trends. This article summarizes some of the fundamental aspects of spatial-temporal population genetic processes. It discusses the potential use of simulations to determine how various spatial metrics can be rigorously employed to identify features of interest, including contrasting locus-specific spatial patterns due to micro-scale environmental selection.
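    The stochastic kernel underlying such simulations is often a Wright-Fisher-type model with migration; a two-deme toy version is sketched below (illustrative parameters; real landscape-genetics simulators make migration depend on landscape features):

```python
import random

random.seed(42)

def wright_fisher_two_demes(p1, p2, N, m, generations):
    """Allele-frequency drift in two demes of size N with migration rate m.
    A toy stochastic kernel of the kind landscape-genetics simulators build on;
    real simulators tie m to landscape resistance between sites."""
    for _ in range(generations):
        # migration mixes allele frequencies before reproduction
        q1 = (1 - m) * p1 + m * p2
        q2 = (1 - m) * p2 + m * p1
        # binomial sampling of 2N gametes = genetic drift
        p1 = sum(random.random() < q1 for _ in range(2 * N)) / (2 * N)
        p2 = sum(random.random() < q2 for _ in range(2 * N)) / (2 * N)
    return p1, p2

print(wright_fisher_two_demes(0.1, 0.9, N=100, m=0.05, generations=50))
```

    Running many replicates of such a kernel over a resistance surface is what lets simulation studies compare statistical methods and calibrate type I error, as the abstract proposes.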

  8. A computer simulation of chromosomal instability

    NASA Astrophysics Data System (ADS)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These outputs should prove useful for investigating how such radiobiological phenomena as slow growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
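The clonal-expansion model described above can be sketched as a small Monte Carlo program. This is an illustrative reconstruction, not the authors' code; the lethality probability for new aberrations is an assumed parameter:

```python
import random

def simulate_colony(rearrangement_rate, divisions, lethal_fraction=0.5):
    """Monte Carlo sketch of clonal expansion with chromosomal instability.

    Subclones are tracked as karyotype_id -> live cell count. At each cell
    division, each daughter acquires a new aberration with probability
    `rearrangement_rate`; an aberrant daughter founds a unique subclone
    and dies with probability `lethal_fraction` (an assumed value)."""
    subclones = {0: 1}  # founding karyotype, one cell
    next_id = 1
    for _ in range(divisions):
        new_counts = {}
        for kid, n in subclones.items():
            surviving = 0
            for _ in range(n):          # every cell divides
                for _daughter in range(2):
                    if random.random() < rearrangement_rate:
                        # new aberration: unique subclone, possibly lethal
                        if random.random() >= lethal_fraction:
                            new_counts[next_id] = new_counts.get(next_id, 0) + 1
                        next_id += 1
                    else:
                        surviving += 1
            if surviving:
                new_counts[kid] = new_counts.get(kid, 0) + surviving
        subclones = new_counts
    return subclones

random.seed(1)
colony = simulate_colony(rearrangement_rate=0.05, divisions=8)
# stable cells (rate 0) yield exactly 2**8 = 256 cells in one subclone;
# unstable colonies are smaller, as the abstract describes
```

Averaging `sum(colony.values())` over many runs reproduces the abstract's key output: mean colony size as a function of the instability parameter.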

  9. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  10. Precision Casting via Advanced Simulation and Manufacturing

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A two-year program was conducted to develop and commercially implement selected casting manufacturing technologies to enable significant reductions in the costs of castings, increase the complexity and dimensional accuracy of castings, and reduce the development times for delivery of high quality castings. The industry-led R&D project was cost shared with NASA's Aerospace Industry Technology Program (AITP). The Rocketdyne Division of Boeing North American, Inc. served as the team lead with participation from Lockheed Martin, Ford Motor Company, Howmet Corporation, PCC Airfoils, General Electric, UES, Inc., University of Alabama, Auburn University, Robinson, Inc., Aracor, and NASA-LeRC. The technical effort was organized into four distinct tasks. The accomplishments are reported herein. Task 1.0 developed advanced simulation technology for core molding. Ford headed up this task. On this program, a specialized core machine was designed and built. Task 2.0 focused on intelligent process control for precision core molding. Howmet led this effort. The primary focus of these experimental efforts was to characterize the process parameters that have a strong impact on dimensional control issues of injection molded cores during their fabrication. Task 3.0 developed and applied rapid prototyping to produce near net shape castings. Rocketdyne was responsible for this task. CAD files were generated using reverse engineering, rapid prototype patterns were fabricated using SLS and SLA, and castings produced and evaluated. Task 4.0 was aimed at developing technology transfer. Rocketdyne coordinated this task. Casting related technology, explored and evaluated in the first three tasks of this program, was implemented into manufacturing processes.

  11. Computer simulations of the Ni2MnGa alloys

    NASA Astrophysics Data System (ADS)

    Breczko, Teodor M.; Nelayev, Vladislav; Dovzhik, Krishna; Najbuk, Miroslaw

    2008-07-01

    This article reports computer simulations of the physical properties of the Heusler Ni2MnGa alloy. The simulations are devoted to the austenite phase. The chemical composition of the researched specimens causes generation of both martensite and austenite phases.

  12. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    , immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  13. Investigation of Carbohydrate Recognition via Computer Simulation

    SciTech Connect

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  14. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
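The grid check-in/check-out mechanism described above can be sketched as grid-bucket hashing with a fuzzy margin. This is a simplified, hypothetical illustration of the idea, not the patented implementation; all names and parameters are invented:

```python
from collections import defaultdict

def build_grid(positions, cell, fuzz=0.0):
    """Hash each mover into grid cells. The fuzzy margin `fuzz` lets a
    mover register in every cell its expanded footprint touches, so
    exact grid-crossing times never need to be computed."""
    grid = defaultdict(set)
    for mover_id, (x, y) in positions.items():
        x0, x1 = int((x - fuzz) // cell), int((x + fuzz) // cell)
        y0, y1 = int((y - fuzz) // cell), int((y + fuzz) // cell)
        for cx in range(x0, x1 + 1):
            for cy in range(y0, y1 + 1):
                grid[(cx, cy)].add(mover_id)
    return grid

def movers_in_coverage(grid, sensor_pos, sensor_range, cell):
    """Collect candidate movers whose grid cells overlap the sensor's
    coverage; only these candidates need an exact proximity test."""
    sx, sy = sensor_pos
    hits = set()
    x0, x1 = int((sx - sensor_range) // cell), int((sx + sensor_range) // cell)
    y0, y1 = int((sy - sensor_range) // cell), int((sy + sensor_range) // cell)
    for cx in range(x0, x1 + 1):
        for cy in range(y0, y1 + 1):
            hits |= grid.get((cx, cy), set())
    return hits

grid = build_grid({'a': (1.0, 1.0), 'b': (9.0, 9.0)}, cell=5.0)
print(movers_in_coverage(grid, (0.0, 0.0), 2.0, 5.0))  # only 'a' is nearby
```

The payoff on parallel architectures is that each grid cell is an independent unit of work, and the fuzzy margin avoids the fine-grained synchronization that exact crossing events would require.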

  15. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  16. Investigation of Carbohydrate Recognition via Computer Simulation.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Petridis, Loukas; Shen, Tongye

    2015-01-01

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Recently, interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. We focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years. PMID:25927900

  17. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGES

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  18. Investigation of Carbohydrate Recognition via Computer Simulation.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Recently, interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. We focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  19. Computer simulation of spacecraft/environment interaction.

    PubMed

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impact and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with WWW interface, including spacecraft model description and results representation based on the virtual reality markup language. PMID:11542669

  1. Multidimensional computer simulation of Stirling cycle engines

    NASA Technical Reports Server (NTRS)

    Hall, C. A.; Porsching, T. A.; Medley, J.; Tew, R. C.

    1990-01-01

    The computer code ALGAE (algorithms for the gas equations) treats incompressible, thermally expandable, or locally compressible flows in complicated two-dimensional flow regions. The solution method, finite differencing schemes, and basic modeling of the field equations in ALGAE are applicable to engineering design settings of the type found in Stirling cycle engines. The use of ALGAE to model multiple components of the space power research engine (SPRE) is reported. Videotape computer simulations of the transient behavior of the working gas (helium) in the heater-regenerator-cooler complex of the SPRE demonstrate the usefulness of such a program in providing information on thermal and hydraulic phenomena in multiple component sections of the SPRE.

  2. Computer Simulation Studies of Gramicidin Channel

    NASA Astrophysics Data System (ADS)

    Song, Hyundeok; Beck, Thomas

    2009-04-01

    Ion channels are large membrane proteins, and their function is to facilitate the passage of ions across biological membranes. Recently, Dr. John Cuppoletti's group at UC showed that the gramicidin channel could function at high temperatures (360 -- 390K) with significant currents. This finding may have large implications for fuel cell technology. In this project, we will examine the experimental system by computer simulation. We will investigate how the temperature affects the current and differences in magnitude of the currents between two forms of Gramicidin, A and D. This research will help to elucidate the underlying molecular mechanism in this promising new technology.

  3. Computer simulation in the daily practice of orthognathic surgery.

    PubMed

    Schendel, S A

    2015-12-01

    The availability of computers and advances in imaging, especially over the last 10 years, have allowed the adoption of three-dimensional (3D) imaging in the office setting. The affordability and ease of use of this modality has led to its widespread implementation in diagnosis and treatment planning, teaching, and follow-up care. 3D imaging is particularly useful when the deformities are complex and involve both function and aesthetics, such as those in the dentofacial area, and for orthognathic surgery. Computer imaging involves combining images obtained from different modalities to create a virtual record of an individual. In this article, the system is described and its use in the office demonstrated. Computer imaging with simulation, and more specifically patient-specific anatomic records (PSAR), permit a more accurate analysis of the deformity as an aid to diagnosis and treatment planning. 3D imaging and computer simulation can be used effectively for the planning of office-based procedures. The technique can be used to perform virtual surgery and establish a definitive and objective treatment plan for correction of the facial deformity. In addition, patient education and follow-up can be facilitated. The end result is improved patient care and decreased expense.

  4. Advances in computational studies of energy materials.

    PubMed

    Catlow, C R A; Guo, Z X; Miskufova, M; Shevlin, S A; Smith, A G H; Sokol, A A; Walsh, A; Wilson, D J; Woodley, S M

    2010-07-28

    We review recent developments and applications of computational modelling techniques in the field of materials for energy technologies including hydrogen production and storage, energy storage and conversion, and light absorption and emission. In addition, we present new work on an Sn2TiO4 photocatalyst containing an Sn(II) lone pair, new interatomic potential models for SrTiO3 and GaN, an exploration of defects in the kesterite/stannite-structured solar cell absorber Cu2ZnSnS4, and report details of the incorporation of hydrogen into Ag2O and Cu2O. Special attention is paid to the modelling of nanostructured systems, including ceria (CeO2, mixed Ce(x)O(y) and Ce2O3) and group 13 sesquioxides. We consider applications based on both interatomic potential and electronic structure methodologies; and we illustrate the increasingly quantitative and predictive nature of modelling in this field. PMID:20566517

  5. Computer simulation of solder joint failure

    SciTech Connect

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  6. Mapping lava flow hazards using computer simulation

    NASA Astrophysics Data System (ADS)

    Wadge, G.; Young, P. A. V.; McKendrick, I. J.

    1994-01-01

    Computer simulations of the paths of flowing lava are achieved using a program, FLOWFRONT, that describes the behavior of flow and digital models of the terrain. Two methods of application of simulations of the hazards posed by lava flows are described. The first, deterministic, method requires that program parameters such as vent position, minimum flow thickness, and thickness/slope relationship be based on the ambient eruptive conditions so that the future course of a specific lava flow can be simulated. This is illustrated using retrospective modeling of the first 21 days of the eruption of an andesitic lava flow at Lonquimay volcano, Chile, in 1988-1989. The usefulness of this method for real-time predictive modeling is likely to be limited by the lack of accurate field data on flow characteristics, the simple nature of the model, and the sensitivity to parameter choice of the final planimetric form of the model flow. The second application is probabilistic in nature and creates a map of the likelihood of inundation by lava flows that is useful for long-term land use planning. This method uses the historical record of past eruptions to constrain a series of Monte Carlo simulations and is illustrated using data from Etna volcano in Sicily. A multivariate statistical analysis of nine parameters for the 1763-1989 eruption catalog using simulated annealing permitted a classification of Etna's flank eruptions into two types: A and B. Type A eruptions are short-lived and produce linear lava flows; type B eruptions are long-lived, and produce lava flows that are much broader in shape, and their vents are restricted to the eastern flank of the volcano. The simulation method consists of creating a probability surface of the location of future eruption vents and segmenting the region according to the most likely historical eruption on which to base the simulation. Analysis of the autocorrelation of the historical eruptions shows that type A eruptions are strongly

  7. An Advanced Leakage Scheme for Neutrino Treatment in Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Perego, A.; Cabezón, R. M.; Käppeli, R.

    2016-04-01

    We present an Advanced Spectral Leakage (ASL) scheme to model neutrinos in the context of core-collapse supernovae (CCSNe) and compact binary mergers. Based on previous gray leakage schemes, the ASL scheme computes the neutrino cooling rates by interpolating local production and diffusion rates (relevant in optically thin and thick regimes, respectively) separately for discretized values of the neutrino energy. Neutrino trapped components are also modeled, based on equilibrium and timescale arguments. The better accuracy achieved by the spectral treatment allows a more reliable computation of neutrino heating rates in optically thin conditions. The scheme has been calibrated and tested against Boltzmann transport in the context of Newtonian spherically symmetric models of CCSNe. ASL shows a very good qualitative and a partial quantitative agreement for key quantities from collapse to a few hundreds of milliseconds after core bounce. We have proved the adaptability and flexibility of our ASL scheme, coupling it to an axisymmetric Eulerian and to a three-dimensional smoothed particle hydrodynamics code to simulate core collapse. Therefore, the neutrino treatment presented here is ideal for large parameter-space explorations, parametric studies, high-resolution tests, code developments, and long-term modeling of asymmetric configurations, where more detailed neutrino treatments are not available or are currently computationally too expensive.
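The core idea of interpolating local production and diffusion rates per energy bin can be sketched in a few lines. The harmonic combination below (the slower process limits the effective rate) is the standard leakage-scheme form, but the numbers and function names are illustrative assumptions, not the calibrated ASL scheme:

```python
def effective_rate(production, diffusion):
    """Interpolate between the production rate (the limit in optically
    thin regions) and the diffusion rate (the limit in optically thick
    regions); the smaller rate dominates, as in gray leakage schemes,
    but here it is applied bin by bin in neutrino energy."""
    return production * diffusion / (production + diffusion)

# discretized neutrino energy bins (MeV) -- purely illustrative values
energies = [3.0, 10.0, 30.0]
prod = [1.0e3, 5.0e3, 2.0e4]   # local production rates per bin
diff = [5.0e4, 8.0e3, 1.0e2]   # diffusion rates per bin (fall steeply with energy)

cooling = sum(effective_rate(p, d) for p, d in zip(prod, diff))
```

The spectral discretization is what distinguishes this from a gray scheme: high-energy bins are diffusion-limited while low-energy bins are production-limited, which the abstract credits for the improved accuracy of the heating rates.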

  8. Computer Simulations in Science Education: Implications for Distance Education

    ERIC Educational Resources Information Center

    Sahin, Sami

    2006-01-01

    This paper is a review of literature about the use of computer simulations in science education. This review examines types and examples of computer simulations. The literature review indicated that although computer simulations cannot replace science classroom and laboratory activities completely, they offer various advantages both for classroom…

  9. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  10. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States whose primary objective is to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  11. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    SciTech Connect

    C. FOSTER; ET AL

    2001-01-01

    most glove box operations and demonstrates the ability and advantages of advanced computer-based modeling. The three-dimensional model also enables better comprehension of problems by non-technical staff. There are many barriers to the seamless integration between the initial design specifications and a computer simulation. Problems include the lack of a standard model and inexact manufacturing of components used in the glove box. The benefits and drawbacks are discussed; however, the results are useful.

  12. Advancements in Afterbody Radiative Heating Simulations for Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Panesi, Marco; Brandis, Aaron M.

    2016-01-01

    Four advancements to the simulation of backshell radiative heating for Earth entry are presented. The first of these is the development of a flow field model that treats electronic levels of the dominant backshell radiator, N, as individual species. This is shown to allow improvements in the modeling of electron-ion recombination and two-temperature modeling, which are shown to increase backshell radiative heating by 10 to 40%. By computing the electronic state populations of N within the flow field solver, instead of through the quasi-steady state approximation in the radiation code, the coupling of radiative transition rates to the species continuity equations for the levels of N, including the impact of non-local absorption, becomes feasible. Implementation of this additional level of coupling between the flow field and radiation codes represents the second advancement presented in this work, which is shown to increase the backshell radiation by another 10 to 50%. The impact of radiative transition rates due to non-local absorption indicates the importance of accurate radiation transport in the relatively complex flow geometry of the backshell. This motivates the third advancement, which is the development of a ray-tracing radiation transport approach to compute the radiative transition rates and divergence of the radiative flux at every point for coupling to the flow field, therefore allowing the accuracy of the commonly applied tangent-slab approximation to be assessed for radiative source terms. For the sphere considered at lunar-return conditions, the tangent-slab approximation is shown to provide a sufficient level of accuracy for the radiative source terms, even for backshell cases. This is in contrast to the agreement between the two approaches for computing the radiative flux to the surface, which differ by up to 40%. The final advancement presented is the development of a nonequilibrium model for NO radiation, which provides significant backshell

  13. Computational simulation of liquid fuel rocket injectors

    NASA Technical Reports Server (NTRS)

    Landrum, D. Brian

    1994-01-01

    A major component of any liquid propellant rocket is the propellant injection system. Issues of interest include the degree of liquid vaporization and its impact on the combustion process, the pressure and temperature fields in the combustion chamber, and the cooling of the injector face and chamber walls. The Finite Difference Navier-Stokes (FDNS) code is a primary computational tool used in the MSFC Computational Fluid Dynamics Branch. The branch has dedicated a significant amount of resources to development of this code for prediction of both liquid and solid fuel rocket performance. The FDNS code is currently being upgraded to include the capability to model liquid/gas multi-phase flows for fuel injection simulation. An important aspect of this effort is benchmarking the code capabilities to predict existing experimental injection data. The objective of this MSFC/ASEE Summer Faculty Fellowship term was to evaluate the capabilities of the modified FDNS code to predict flow fields with liquid injection. Comparisons were made between code predictions and existing experimental data. A significant portion of the effort included a search for appropriate validation data. Also, code simulation deficiencies were identified.

  14. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have begun to allow interfacing to reactor models that utilize the CAPE-Open software interface protocol.

  15. Mapping lava flow hazards using computer simulation

    SciTech Connect

    Wadge, G.; Young, P.A.V.; Mckendrick, I.J.

    1994-01-01

    Computer simulations of the paths of flowing lava are achieved using a program, FLOWFRONT, that combines a description of flow behavior with digital models of the terrain. Two methods of application of simulations of the hazards posed by lava flows are described. The first, deterministic, method requires that program parameters such as vent position, minimum flow thickness, and thickness/slope relationship be based on the ambient eruptive conditions so that the future course of a specific lava flow can be simulated. This is illustrated using retrospective modeling of the first 21 days of the eruption of an andesitic lava flow at Lonquimay volcano, Chile, in 1988-1989. The usefulness of this method for real-time predictive modeling is likely to be limited by the lack of accurate field data on flow characteristics, the simple nature of the model, and the sensitivity to parameter choice of the final planimetric form of the model flow. The second application is probabilistic in nature and creates a map of the likelihood of inundation by lava flows that is useful for long-term land use planning. This method uses the historical record of past eruptions to constrain a series of Monte Carlo simulations and is illustrated using data from Etna volcano in Sicily. A multivariate statistical analysis of nine parameters for the 1763-1989 eruption catalog using simulated annealing permitted a classification of Etna's flank eruptions into two types: A and B. Type A eruptions are short-lived and produce linear lava flows; type B eruptions are long-lived and produce lava flows that are much broader in shape, and their vents are restricted to the eastern flank of the volcano.
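The probabilistic method described above amounts to a Monte Carlo loop over sampled vent positions on a digital elevation model, accumulating a per-cell inundation count. The sketch below is a hypothetical stand-in, not FLOWFRONT: the cone-shaped terrain, the vent-sampling range, and the steepest-descent flow rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 50x50 digital elevation model: a simple cone-shaped
# volcano with its summit at the centre (not real Etna terrain).
n = 50
y, x = np.mgrid[0:n, 0:n]
elev = -np.hypot(x - n / 2, y - n / 2)

def flow_path(dem, start, steps=200):
    """Follow steepest descent from a vent cell; return visited cells."""
    i, j = start
    path = [(i, j)]
    for _ in range(steps):
        window = dem[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        di, dj = np.unravel_index(np.argmin(window), window.shape)
        ni, nj = max(i - 1, 0) + di, max(j - 1, 0) + dj
        if dem[ni, nj] >= dem[i, j]:  # no strictly lower neighbour: stop
            break
        i, j = ni, nj
        path.append((i, j))
    return path

# Monte Carlo loop: sample vent positions, accumulate inundation counts.
trials = 500
hits = np.zeros((n, n))
for _ in range(trials):
    vent = tuple(rng.integers(5, n - 5, size=2))
    for cell in flow_path(elev, vent):
        hits[cell] += 1

probability = hits / trials  # likelihood-of-inundation map per cell
```

In a real application the vent sampling and flow parameters would be constrained by the eruption catalog, as the abstract describes.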

  16. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  17. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  18. Infrared Flight Simulation Using Computer Generated Imagery

    NASA Astrophysics Data System (ADS)

    Weathersby, Marshall R.; Finlay, W. Mark

    1985-01-01

    A sophisticated deterministic interactive software model for computer generation of three-dimensionally projected infrared scenes has been developed. Scenes can be produced using either a self-emission or near infrared reflectance model. The software allows for generation of flight paths through a data base consisting of both feature and topography and near real-time display of stored precomputed images. The emphasis in the model development has been in computer generation of infrared scenes which accurately reproduce the characteristics of real-world imagery. The software combines computer graphics and infrared physics to produce synthetic scenes with the statistical properties of real scenes. Options exist for generation of images in near-infrared, 3-5 or 8-12 micron spectral bands including atmospheric attenuation effects. The three-dimensional projection algorithms allow for viewing of the scenes from any geometry and include concave and convex surfaces as well as hidden objects. Features exist for insertion of additional objects into the three-dimensional scenes. Thus targets, buildings, and other natural or man-made objects can be inserted with any orientation anywhere in the scenes. This allows full simulation of varying depression angles, range closure, and fly-over. The three-dimensional infrared background clutter model is an evaluation tool capable of both assessing system performance in clutter and increasing our understanding of clutter itself. The model in its current form represents a powerful tool for the fundamental understanding of infrared clutter. Possible applications include, but are most certainly not limited to, sensor operator training in the area of target discrimination with dynamic imagery, evaluation of automatic target recognizer (ATR) algorithms, and simulations allowing pilots to pre-fly missions.

  19. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All the investigations which were performed employed in one way or another a computer simulation technique based on atomistic level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of Markov chain ensemble averaging technique to model equilibrium properties of a system); and molecular statics (provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic cluster were investigated. The multilayer relaxation phenomena for low index planes of an fcc crystal was analyzed also as a function of the three-body interactions. Various surface properties for Si and SiC system were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
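Of the three simulation methods listed, the Monte Carlo approach (Markov chain ensemble averaging) can be sketched in a few lines. The example below is a generic Metropolis chain for a single particle in a harmonic well with kT = k = 1; the potential and parameters are purely illustrative, not the Si/SiC systems of the study.

```python
import numpy as np

# Metropolis Monte Carlo for one particle in a harmonic well
# U(x) = x^2 / 2, with kT = 1 (all values purely illustrative).
rng = np.random.default_rng(0)
x = 0.0
samples = []
for _ in range(200_000):
    trial = x + rng.uniform(-0.5, 0.5)
    dU = 0.5 * trial**2 - 0.5 * x**2
    # Metropolis acceptance: always downhill, uphill with Boltzmann weight.
    if dU <= 0 or rng.random() < np.exp(-dU):
        x = trial
    samples.append(x)

# Equipartition check: <x^2> should approach kT/k = 1 after burn-in.
var = float(np.mean(np.asarray(samples[10_000:]) ** 2))
```

The same accept/reject skeleton carries over to atomistic systems once U(x) is replaced by an interatomic potential and the move set perturbs atomic coordinates.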

  20. Computer simulation of fatigue under diametrical compression

    SciTech Connect

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-04-15

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue and we simulate the diametric compression of a disc shape specimen under a constant external force. The model allows us to follow the development of the fracture process on the macrolevel and microlevel varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen presents a power law behavior. Under the effect of healing, more prominent for small loads compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in a good qualitative agreement with the experimental findings.
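The reported power-law dependence of lifetime on applied load can be illustrated with a Basquin-type fit in log-log space. The constants below are invented for illustration, not the paper's asphalt data.

```python
import numpy as np

# Synthetic lifetimes obeying a Basquin-type power law N = C * sigma^(-b);
# C and b are illustrative constants, not fitted asphalt measurements.
C, b = 1.0e8, 4.0
sigma = np.array([10.0, 15.0, 20.0, 30.0, 40.0])  # load amplitudes
lifetime = C * sigma ** -b

# Recover the exponent by linear regression in log-log space.
slope, _ = np.polyfit(np.log(sigma), np.log(lifetime), 1)
b_fit = -slope
```

A fatigue limit of the kind the simulations reveal would show up as a departure from this straight line at small loads, where healing dominates.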

  1. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

    2004-04-28

    This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

  2. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-04-25

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane-based water-gas shift reactor; and reactors to oxidize fuel-cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

  3. Simulating Subsurface Reactive Flows on Ultrascale Computers with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hammond, G. E.; Lichtner, P. C.; Lu, C.; Smith, B. F.; Philip, B.

    2009-12-01

    To provide true predictive utility, subsurface simulations often must accurately resolve--in three dimensions--complicated, multi-phase flow fields in highly heterogeneous geology with numerous chemical species and complex chemistry. This task is especially daunting because of the wide range of spatial scales involved--from the pore scale to the field scale--ranging over six orders of magnitude, and the wide range of time scales ranging from seconds or less to millions of years. This represents a true "Grand Challenge" computational problem, requiring not only the largest-scale ("ultrascale") supercomputers, but accompanying advances in algorithms for the efficient numerical solution of systems of PDEs using these machines, and in mathematical modeling techniques that can adequately capture the truly multi-scale nature of these problems. We describe some of the specific challenges involved and present the software and algorithmic approaches that are being used in the computer code PFLOTRAN to provide scalable performance for such simulations on tens of thousands of processors. We focus particularly on scalable techniques for solving the large (up to billions of total degrees of freedom), sparse algebraic systems that arise. We also describe ongoing work to address disparate time and spatial scales by both the development of adaptive mesh refinement methods and the use of multiple continuum formulations. Finally, we present some examples from recent simulations conducted on Jaguar, the 150,152-processor-core Cray XT5 system at Oak Ridge National Laboratory that is currently one of the most powerful supercomputers in the world.
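The computational kernel at the heart of such codes, solving a very large sparse algebraic system with a Krylov method, can be sketched at toy scale. The 2D Laplacian below is a stand-in for the actual subsurface operators, and SciPy's conjugate gradient stands in for the parallel solver stack a code like PFLOTRAN uses.

```python
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import cg

# Assemble a 2D Laplacian (5-point stencil) as a small stand-in for the
# sparse systems a subsurface flow code must solve at every time step.
n = 64
T = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = (kron(identity(n), T) + kron(T, identity(n))).tocsr()  # n^2 x n^2

# Krylov solve: only matrix-vector products are needed, which is what
# lets these methods scale to billions of unknowns in parallel.
b = np.ones(n * n)
x, info = cg(A, b)
assert info == 0  # 0 signals convergence
```

At production scale the same structure holds, but the matrix is distributed across tens of thousands of processors and preconditioning becomes essential.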

  4. Advanced Simulation Capability for Environmental Management (ASCEM): Early Site Demonstration

    SciTech Connect

    Meza, Juan; Hubbard, Susan; Freshley, Mark D.; Gorton, Ian; Moulton, David; Denham, Miles E.

    2011-03-07

    The U.S. Department of Energy Office of Environmental Management, Technology Innovation and Development (EM-32), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high performance computing tool will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. As part of the initial development process, a series of demonstrations were defined to test ASCEM components and provide feedback to developers, engage end users in applications, and lead to an outcome that would benefit the sites. The demonstration was implemented for a sub-region of the Savannah River Site General Separations Area that includes the F-Area Seepage Basins. The physical domain included the unsaturated and saturated zones in the vicinity of the seepage basins and Fourmile Branch, using an unstructured mesh fit to the hydrostratigraphy and topography of the site. The calculations modeled variably saturated flow and the resulting flow field was used in simulations of the advection of non-reactive species and the reactive-transport of uranium. As part of the demonstrations, a new set of data management, visualization, and uncertainty quantification tools were developed to analyze simulation results and existing site data. These new tools can be used to provide summary statistics, including information on which simulation parameters were most important in the prediction of uncertainty and to visualize the relationships between model input and output.

  5. A Mass Spectrometer Simulator in Your Computer

    NASA Astrophysics Data System (ADS)

    Gagnon, Michel

    2012-12-01

    Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.
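The physics such a simulator exercises reduces to two textbook relations: the accelerating voltage sets the ion speed (qV = mv^2/2), and the magnetic field bends the beam onto a circle of radius r = mv/(qB). A minimal sketch follows; the field, voltage, and radius values are illustrative, not taken from the software described.

```python
E_CHARGE = 1.602176634e-19  # elementary charge, C
AMU = 1.66053906660e-27     # atomic mass unit, kg

def mass_from_radius(r, B, V, q=E_CHARGE):
    """Invert r = m*v / (q*B) with v set by q*V = m*v**2 / 2.

    Eliminating v gives m = q * B**2 * r**2 / (2 * V).
    """
    return q * B**2 * r**2 / (2.0 * V)

# Illustrative settings: singly charged ion, 0.5 T field, 2 kV
# accelerating voltage, beam detected at a 10 cm radius.
m = mass_from_radius(r=0.10, B=0.5, V=2000.0)
mass_in_amu = m / AMU  # roughly 60 u for these numbers
```

Students identifying a gas component work this inversion in reverse: read the detected radius and solve for m/q.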

  6. Miller experiments in atomistic computer simulations

    PubMed Central

    Saitta, Antonino Marco; Saija, Franz

    2014-01-01

    The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

  7. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  8. Advanced Placement Computer Science with Pascal. Volume 2. Experimental Edition.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents 100 lessons for an advanced placement course on programming in Pascal. Some of the topics covered include arrays, sorting, strings, sets, records, computers in society, files, stacks, queues, linked lists, binary trees, searching, hashing, and chaining. Performance objectives, vocabulary, motivation, aim,…

  9. Reference manual for the Langley Research Center flight simulation computing system

    NASA Technical Reports Server (NTRS)

    Cleveland, J. I., II; Crawford, D. J.; Rowell, L. F.

    1978-01-01

    The Langley Research Center Flight Simulation Computing System provides researchers with an advanced real-time digital simulation capability. This capability is controlled at the user interface level by the Real Time Simulation Supervisor. The Supervisor is a group of subprograms loaded with a simulation application program. The Supervisor provides the interface between the application program and the operating system, and coordinates input and output to and from the simulation hardware. The Supervisor also performs various utility functions as required by a simulation application program.

  10. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovation techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  11. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... ensure that all instructors and check airmen used in appendix H training and checking are...

  12. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... ensure that all instructors and check airmen used in appendix H training and checking are...

  13. Engineering Fracking Fluids with Computer Simulation

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the "drilling mud" is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the "cuttings" can remain suspended. In a related application known as hydraulic fracturing, suspensions of solids called "proppant" are used to prop open the fracture by pumping them into the well. It is well-known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the "fluid-particle split" at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e. at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown both the shear thinning and the elasticity of the suspending polymeric solutions significantly affects the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.
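The Newtonian baseline that the viscoelastic coupling modifies is the Stokes settling velocity. A short sketch with illustrative proppant-like numbers (not values from the work described):

```python
# Stokes terminal velocity U = 2 * (rho_p - rho_f) * g * a**2 / (9 * mu)
# for a sphere of radius a: the Newtonian baseline that an orthogonal
# shear flow leaves unchanged, but a viscoelastic fluid does not.
def stokes_velocity(radius, rho_p, rho_f, mu, g=9.81):
    return 2.0 * (rho_p - rho_f) * g * radius**2 / (9.0 * mu)

# Illustrative proppant-like numbers: a 0.3 mm sand grain (2650 kg/m^3)
# settling in a thick aqueous gel (1000 kg/m^3, 0.5 Pa s).
U = stokes_velocity(radius=3e-4, rho_p=2650.0, rho_f=1000.0, mu=0.5)
```

The simulations described above quantify how far shear thinning and elasticity push the actual settling rate away from this baseline.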

  14. Simulator design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerald R.

    1992-01-01

    This simulation design task completion report documents the simulation techniques associated with the network models of both the Interim Service ISDN (integrated services digital network) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures. The ISIS network model design represents satellite systems like the Advanced Communication Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) program, moves all control and switching functions on-board the next generation ISDN communication satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete events simulation experiments will be performed with these models using various traffic scenarios, design parameters and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  15. Duality quantum computer and the efficient quantum simulations

    NASA Astrophysics Data System (ADS)

    Wei, Shi-Jie; Long, Gui-Lu

    2016-03-01

    Duality quantum computing is a new mode of a quantum computer to simulate a moving quantum computer passing through a multi-slit. It exploits the particle wave duality property for computing. A quantum computer with n qubits and a qudit simulates a moving quantum computer with n qubits passing through a d-slit. Duality quantum computing can realize an arbitrary sum of unitaries and therefore a general quantum operator, which is called a generalized quantum gate. All linear bounded operators can be realized by the generalized quantum gates, and unitary operators are just the extreme points of the set of generalized quantum gates. Duality quantum computing provides flexibility and a clear physical picture in designing quantum algorithms, and serves as a powerful bridge between quantum and classical algorithms. In this paper, after a brief review of the theory of duality quantum computing, we will concentrate on the applications of duality quantum computing in simulations of Hamiltonian systems. We will show that duality quantum computing can efficiently simulate quantum systems by providing descriptions of the recent efficient quantum simulation algorithm of Childs and Wiebe (Quantum Inf Comput 12(11-12):901-924, 2012) for the fast simulation of quantum systems with a sparse Hamiltonian, and the quantum simulation algorithm by Berry et al. (Phys Rev Lett 114:090502, 2015), which provides exponential improvement in precision for simulating systems with a sparse Hamiltonian.
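The notion of a generalized quantum gate, a linear combination of unitaries that is generally non-unitary, can be illustrated with plain matrices. The equal-weight combination of Pauli X and Z below is a toy example of my own, not a circuit from the paper; in a duality circuit such a map is realized probabilistically.

```python
import numpy as np

# Pauli matrices serve as the component unitaries.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A "generalized quantum gate": an equal-weight sum of unitaries.
G = 0.5 * X + 0.5 * Z

# G is linear and bounded but no longer unitary.
is_unitary = np.allclose(G.conj().T @ G, np.eye(2))

# Applied to |0> it yields (|0> + |1>)/2; the duality circuit realizes
# this non-unitary map probabilistically, succeeding with probability
# equal to the squared norm of the (unnormalized) output state.
psi = np.array([1.0, 0.0], dtype=complex)
out = G @ psi
success_prob = float(np.linalg.norm(out) ** 2)
```

Here the success probability is 1/2, reflecting the amplitude "lost" to the failure branches of the multi-slit interference.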

  16. Problems in Conducting Research on Computer-Based Simulation.

    ERIC Educational Resources Information Center

    Crawford, Alice M.

    Computer-based simulation (CBS) represents a unique utilization of computers for instruction that combines some of the best features of the technologies of simulation and computer assisted instruction (CAI). CBS grew out of an interest in testing the application of CAI to procedural and perceptual motor skills. With the sophisticated graphics…

  17. Computer-aided simulation study of photomultiplier tubes

    NASA Technical Reports Server (NTRS)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

  18. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  19. Molecular dynamics simulations: advances and applications

    PubMed Central

    Hospital, Adam; Goñi, Josep Ramon; Orozco, Modesto; Gelpí, Josep L

    2015-01-01

    Molecular dynamics simulations have evolved into a mature technique that can be used effectively to understand macromolecular structure-to-function relationships. Present simulation times are close to biologically relevant ones. Information gathered about the dynamic properties of macromolecules is rich enough to shift the usual paradigm of structural bioinformatics from studying single structures to analyze conformational ensembles. Here, we describe the foundations of molecular dynamics and the improvements made in the direction of getting such ensemble. Specific application of the technique to three main issues (allosteric regulation, docking, and structure refinement) is discussed. PMID:26604800
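The foundations the review describes rest on numerically integrating Newton's equations of motion. A minimal velocity-Verlet loop for a single harmonic degree of freedom (parameters are illustrative) shows the scheme's hallmark: bounded energy error with no long-term drift.

```python
import numpy as np

# Velocity Verlet for a single harmonic degree of freedom (k = m = 1),
# the workhorse integrator of molecular dynamics codes.
k, m, dt = 1.0, 1.0, 0.01
x, v = 1.0, 0.0  # start displaced, at rest: total energy = 0.5

def force(x):
    return -k * x

f = force(x)
for _ in range(10_000):
    v += 0.5 * dt * f / m  # half kick
    x += dt * v            # drift
    f = force(x)
    v += 0.5 * dt * f / m  # half kick

# The symplectic scheme keeps the energy bounded near 0.5 (no drift).
energy = 0.5 * m * v**2 + 0.5 * k * x**2
```

Production MD engines use exactly this update, with the force evaluated from an interatomic force field over millions of atoms.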

  1. Recent Advances in Binary Black Hole Merger Simulations

    NASA Technical Reports Server (NTRS)

    Barker, John

    2006-01-01

    Recent advances in numerical simulation techniques have led to dramatic progress in understanding binary black hole merger radiation. I present recent results from simulations performed at Goddard, focusing on the gravitational radiation waveforms, and the application of these results to gravitational wave observations.

  2. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  3. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date, CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multi-physics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M
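
    The iterative acceleration methods mentioned for multi-physics integration can be illustrated with a minimal sketch of Aitken dynamic relaxation applied to a Picard (fixed-point) coupling iteration. The stand-in operator below is purely illustrative and implies nothing about CASL's actual physics or code.

```python
import math

# Hedged sketch: Aitken dynamic relaxation for a coupled fixed-point
# iteration x = G(x). G here is a toy stand-in, not a reactor model.

def aitken_picard(G, x, omega=0.5, tol=1e-12, max_iter=200):
    """Relaxed Picard iteration x <- x + omega * (G(x) - x), where the
    relaxation factor omega is updated from successive residuals."""
    r_prev = None
    for _ in range(max_iter):
        r = G(x) - x                      # fixed-point residual
        if abs(r) < tol:
            break
        if r_prev is not None:
            # Aitken update of the relaxation factor
            omega = -omega * r_prev / (r - r_prev)
        x = x + omega * r
        r_prev = r
    return x

# Stand-in "coupled" problem: x = cos(x), fixed point ~0.739085
x_star = aitken_picard(math.cos, 1.0)
```

    On this toy problem the relaxed iteration typically reaches the fixed point in far fewer sweeps than plain, unrelaxed Picard iteration.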

  4. Computer simulation of industrial hazards1

    PubMed Central

    Knox, E. G.

    1973-01-01

    Knox, E. G. (1973). Brit. J. industr. Med., 30, 54-63. Computer simulation of industrial hazards. A computer simulation system for a range of industrial hazards provided for model experiments which manipulated (a) the sub-structure of an exposed population in terms of age-distributions and levels of exposure, (b) the nature of the dose/response relationship, (c) the latent interval and its variability, (d) normal life-table expectations, and (e) employment turnover rates. The development of the system led to clarification of terms and concepts with ambiguous current usages, notably in relation to latency. Distinction is made between the notions of `biological' and `observable' latent intervals. Hypothetical exercises with the model tested its technical validity and at the same time demonstrated in quantitative terms the relationships between `biological' and `observable' latent intervals, employment turnover rates, total mortalities, and the distribution of illnesses and death between those currently employed in the exposing industry, those employed elsewhere, and those retired. Prospects of success for personnel engineering techniques, which manipulate age-distributions of exposed work people in relation to diseases with long latent intervals, were examined. Published asbestos cancer data were used as a basis for specific model fitting and resulted in a numerical formulation of the exposure/response relationships. Severe exposure results in an increment of risk of death of about 0·02 unit per person per annum for those exposed for around six years, but with higher rates for shorter exposures and lower rates for longer ones. The mean biological latent interval was about 25 years with a coefficient of variation of about 25%. These suppositions explained a range of published data comprehensively and at the same time predicted that (a) persons exposed at severe levels for a working lifetime of 50 years have a 40% risk of dying from asbestos cancer, and (b) industrial

  5. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  6. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  7. Real-time hybrid computer simulation of a small turboshaft engine and control system

    NASA Technical Reports Server (NTRS)

    Hart, C. E.; Wenzel, L. M.

    1984-01-01

    The development of an analytical model of a small turboshaft engine designed for helicopter propulsion systems is described. The model equations were implemented on a hybrid computer system to provide a real-time nonlinear simulation of the engine performance over a wide operating range. The real-time hybrid simulation of the engine was used to evaluate a microprocessor-based digital control module. This digital control module was developed as part of an advanced rotorcraft control program. After tests with the hybrid engine simulation, the digital control module was used to control a real engine in an experimental program. A hybrid simulation of the engine's electrical hydromechanical control system was also developed. This made it possible to vary the fuel flow and torque load inputs to the hybrid engine simulation for simulating transient operation. Steady-state data from the simulation are compared with results from the experimental tests. Analytical model equations, analog computer diagrams, and a digital computer flow chart are included.

  8. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
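
    The inexact Newton idea mentioned above can be sketched in a few lines: each Newton step solves the linearized system only approximately, to a loose "forcing" tolerance, trading inner-solve cost for outer iterations. The toy two-equation system and the Jacobi inner solver below are illustrative; the paper's simulator applies algebraic multigrid to far larger systems.

```python
# Hedged sketch of an inexact Newton iteration; solver, tolerances, and
# test problem are invented for illustration, not taken from the paper.

def jacobi_solve(A, b, tol, max_iter=200):
    """Approximately solve A x = b with Jacobi sweeps, stopping early
    once the relative residual drops below `tol` (the 'inexact' part)."""
    n = len(b)
    x = [0.0] * n
    b_norm = max(sum(abs(v) for v in b), 1e-30)
    for _ in range(max_iter):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        if sum(abs(v) for v in r) / b_norm < tol:
            break
    return x

def inexact_newton(F, J, x, eta=1e-2, tol=1e-10, max_iter=50):
    """Newton's method where each linear system J dx = -F is solved only
    to a loose forcing tolerance `eta`."""
    for _ in range(max_iter):
        f = F(x)
        if max(abs(v) for v in f) < tol:
            break
        dx = jacobi_solve(J(x), [-v for v in f], eta)
        x = [xi + di for xi, di in zip(x, dx)]
    return x

# Toy nonlinear system: x0^2 + x1 = 3, x0 + x1^2 = 5 (root at (1, 2))
F = lambda x: [x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0]
J = lambda x: [[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]]
root = inexact_newton(F, J, [1.0, 1.0])
```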

  9. A computer simulation study of racemic mixtures

    NASA Astrophysics Data System (ADS)

    Largo, J.; Vega, C.; MacDowell, L. G.; Solana, J. R.

    A simple model for a chiral molecule is proposed. The model consists of a central atom bonded to four different atoms in tetrahedral coordination. Two different potentials were used to describe the pair potentials between atoms: the hard sphere potential and the Lennard-Jones potential. For both the hard sphere and the Lennard-Jones chiral models, computer simulations have been performed for the pure enantiomers and also for the racemic mixture. The racemic mixture consisted of an equimolar mixture of the two optically active enantiomers. It is found that the equations of state are the same, within statistical uncertainty, for the pure enantiomer fluid and for the racemic mixture. Only at high pressures does the racemic mixture seem to have a higher density, for a given pressure, than the pure enantiomer. Concerning the structure, no difference is found in the site-site correlation functions between like and unlike molecules in the racemic mixture either at low or at high densities. However, small differences are found for the site-site correlations of the pure enantiomer and those of the racemic mixtures. In the Lennard-Jones model, similar conclusions are drawn. The extension of Wertheim's first-order perturbation theory, denoted bonded hard sphere theory (ARCHER, A. L., and JACKSON, G., 1991, Molec. Phys., 73, 881; AMOS, M. D., and JACKSON, G., 1992, J. chem. Phys., 96, 4604), successfully reproduces the simulation results for the hard chiral model. Virial coefficients of the hard chiral model up to the fourth have also been evaluated. Again, no differences are found between virial coefficients of the pure fluid and of the racemic mixture. All the results of this work illustrate the quasi-ideal behaviour of racemic mixtures in the fluid phase.
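
    As a concrete illustration of the site-site interactions underlying such models, here is a minimal sketch of a Lennard-Jones site-site energy between two rigid molecules. The site parameters and the Lorentz-Berthelot combining rules are generic assumptions, not taken from the paper.

```python
import math

# Hedged sketch: 12-6 Lennard-Jones site-site energy. Each molecule is a
# list of (position, sigma, epsilon) tuples; values here are illustrative.

def lj(r, sigma, epsilon):
    """12-6 Lennard-Jones pair potential at separation r."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def site_site_energy(mol_a, mol_b):
    """Total interaction energy: sum of LJ terms over all site pairs."""
    total = 0.0
    for (pa, sa, ea) in mol_a:
        for (pb, sb, eb) in mol_b:
            r = math.dist(pa, pb)
            # Lorentz-Berthelot combining rules for unlike sites
            total += lj(r, 0.5 * (sa + sb), math.sqrt(ea * eb))
    return total
```

    As a sanity check, two single-site molecules separated by the LJ minimum distance 2^(1/6)·sigma interact with energy exactly -epsilon.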

  10. Simulation Toolkit for Renewable Energy Advanced Materials Modeling

    SciTech Connect

    Sides, Scott; Kemper, Travis; Larsen, Ross; Graf, Peter

    2013-11-13

    STREAMM is a collection of Python classes and scripts that enables and eases the setup of input files and configuration files for simulations of advanced energy materials. The core STREAMM Python classes provide a general framework for storing, manipulating and analyzing atomic/molecular coordinates to be used in quantum chemistry and classical molecular dynamics simulations of soft materials systems. The design focuses on enabling the interoperability of materials simulation codes such as GROMACS, LAMMPS and Gaussian.
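
    To make the idea of such a framework concrete, here is a minimal, hypothetical sketch of a coordinate container of the kind described. The class and method names are invented for illustration and are not the STREAMM API.

```python
# Hedged sketch of a coordinate container; NOT the STREAMM API.

class Structure:
    """Minimal store for atomic symbols and Cartesian coordinates."""

    def __init__(self):
        self.symbols = []
        self.coords = []   # list of (x, y, z) tuples in Angstroms

    def add_atom(self, symbol, xyz):
        self.symbols.append(symbol)
        self.coords.append(tuple(xyz))

    def shift(self, dx, dy, dz):
        """Rigidly translate all atoms (e.g., to place a molecule in a box)."""
        self.coords = [(x + dx, y + dy, z + dz) for (x, y, z) in self.coords]

    def to_xyz(self):
        """Serialize to the plain .xyz format accepted by many MD/QM tools."""
        lines = [str(len(self.symbols)), "generated structure"]
        for s, (x, y, z) in zip(self.symbols, self.coords):
            lines.append(f"{s} {x:.6f} {y:.6f} {z:.6f}")
        return "\n".join(lines)

water = Structure()
water.add_atom("O", (0.000, 0.000, 0.000))
water.add_atom("H", (0.757, 0.586, 0.000))
water.add_atom("H", (-0.757, 0.586, 0.000))
```

    A container like this is the natural handoff point between codes: the same coordinates can be serialized to an .xyz file for a quantum chemistry run or used to build an MD topology.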

  11. New scene projector developments at the AMRDEC's advanced simulation center

    NASA Astrophysics Data System (ADS)

    Saylor, Daniel A.; Bowden, Mark; Buford, James

    2006-05-01

    The Aviation and Missile Research, Engineering, and Development Center's (AMRDEC) System Simulation and Development Directorate (SS&DD) has an extensive history of applying all types of modeling and simulation (M&S) to weapon system development and has been a particularly strong advocate of hardware-in-the-loop (HWIL) simulation and test for many years. Key to the successful application of HWIL testing at AMRDEC has been the use of state-of-the-art Scene Projector technologies. This paper describes recent advancements over the past year within the AMRDEC Advanced Simulation Center (ASC) HWIL facilities with a specific emphasis on the state of the various IRSP technologies employed. Areas discussed include application of FMS-compatible IR projectors, advancements in hybrid and multi-spectral projectors, and characterization of existing and emerging technologies.

  12. Algorithmic implementations of domain decomposition methods for the diffraction simulation of advanced photomasks

    NASA Astrophysics Data System (ADS)

    Adam, Konstantinos; Neureuther, Andrew R.

    2002-07-01

    The domain decomposition method developed in [1] is examined in more detail. This method enables rapid computer simulation of the scattering and transmission properties of advanced photomasks (alt. PSM, masks with OPC). Compared to 3D computer simulation, speed-up factors of approximately 400, and up to approximately 200,000 when using the look-up table approach, are possible. Combined with the spatial frequency properties of projection printing systems, it facilitates accurate computer simulation of the projected image (the normalized mean square error of a typical image is only a fraction of 1%). Some subtle accuracy issues of the method are addressed, and the way to handle arbitrary, Manhattan-type mask layouts is presented. The method is shown to be valid for off-axis incidence. The cross-talk model developed in [1] is used in 3D mask simulations (2D layouts).

  13. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW under their respective research programs in a collaborative fashion. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  14. Computer simulation of nanocube self-assemblies

    NASA Astrophysics Data System (ADS)

    Zhang, Xi

    Self-assembly of nanoscale building blocks and molecules into ordered nanostructures is a promising avenue for bottom-up materials design. A wide variety of nanoparticles with unique shapes and uniform sizes have been successfully synthesized. However, organizing these nanoparticles into desired, predefined nanostructures is a formidable challenge now facing the materials community. For example, simple 2-D arrays and 3-D superlattices are the prevalent structures from most nanocube self-assemblies. Two practical strategies to impart anisotropy onto nanocubes, namely, attaching polymer tethers to nanoparticle surfaces and introducing directional dipolar interactions, can be applied to achieve more complex assembled structures. In this dissertation, we conduct computer simulations on nanocube self-assemblies induced by polymer tethers and directional dipole interactions, to examine the various parameters involved in such complicated self-assembly processes, including temperature, concentration, solvent condition, cube size, tether length, tether topology, tether placement, tether number, dipole direction, dipole strength and polydispersity, in order to understand how the packing geometry and interactions between nanocubes can be manipulated to confer precise control over the assembled structures and the phase behavior. First, we simulate monotethered nanocubes and find that the nanocubes favor face-to-face packing in poor solvents, stabilizing the lamellae phases. Next, we simulate different architectures of tethered nanocubes and demonstrate that the steric influence of tether beads can be manipulated to interfere with the face-to-face packing of nanocubes and alter the phase behaviors. We also study the self-assembly of nanocubes with dipoles. We find that the head-to-tail alignment of dipoles, coupled with the face-to-face close packing of nanocubes, dictates the assembled structures. The face-face attraction between nanocubes can also be utilized to control the

  15. Computational simulation of liquid rocket injector anomalies

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid-fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body-fitted coordinate system along with a conservative control volume formulation is employed. The physical models built into the code include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  16. Computer simulation of FCC riser reactors.

    SciTech Connect

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.
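
    The numerical stiffness problem that motivates such special treatment of chemical reactions can be seen in a minimal sketch: for a fast reaction, an explicit update diverges at step sizes where an implicit update remains stable. The rate constant and step size below are illustrative, not from ICRKFLO.

```python
# Hedged sketch: stiffness in reaction kinetics, dc/dt = -k*c.

def forward_euler(k, c0, dt, steps):
    """Explicit update; unstable when k*dt > 2."""
    c = c0
    for _ in range(steps):
        c = c + dt * (-k * c)
    return c

def backward_euler(k, c0, dt, steps):
    """Implicit update c_next = c - dt*k*c_next; stable for any dt > 0."""
    c = c0
    for _ in range(steps):
        c = c / (1.0 + k * dt)
    return c

k, dt = 1000.0, 0.01          # stiff case: k*dt = 10
explicit = forward_euler(k, 1.0, dt, 100)   # oscillates and diverges
implicit = backward_euler(k, 1.0, dt, 100)  # decays toward zero, as it should
```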

  17. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, by performing computations with Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; the computer should therefore handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  18. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  19. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  20. Interoperable Technologies for Advanced Petascale Simulations (ITAPS)

    SciTech Connect

    Shephard, Mark S

    2010-02-05

    Efforts during the past year have contributed to the continued development of the ITAPS interfaces and services as well as specific efforts to support ITAPS applications. The ITAPS interface efforts have two components. The first is working with the ITAPS team on improving the ITAPS software infrastructure and level of compliance of our implementations of ITAPS interfaces (iMesh, iMeshP, iRel and iGeom). The second is being involved with the discussions on the design of the iField fields interface. Efforts to move the ITAPS technologies to petascale computers have identified a number of key technical developments that are required to effectively execute the ITAPS interfaces and services. Research to address these parallel method developments has been a major emphasis of the RPI team's efforts over the past year. The development of parallel unstructured mesh methods has considered the need to scale unstructured mesh solves to massively parallel computers. These efforts, summarized in section 2.1, show that with the addition of the ITAPS procedures described in sections 2.2 and 2.3 we are able to obtain excellent strong scaling with our unstructured mesh CFD code on up to 294,912 cores of IBM Blue Gene/P, which is the highest core count machine available. The ITAPS developments that have contributed to the scaling and performance of PHASTA include an iterative migration algorithm to improve the combined region and vertex balance of the mesh partition, which increases scalability, and mesh data reordering, which improves computational performance. The other developments are associated with the further development of the ITAPS parallel unstructured mesh

  1. Process simulation for advanced composites production

    SciTech Connect

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  2. Computer Simulations of Quantum Theory of Hydrogen Atom for Natural Science Education Students in a Virtual Lab

    ERIC Educational Resources Information Center

    Singh, Gurmukh

    2012-01-01

    The present article is primarily targeted at advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. A recent software system, MS Visual Studio .NET version 2010, is employed to perform computer simulations for modeling Bohr's quantum theory of…

  3. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  4. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
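
    The fuzzy-set approach described above can be made concrete with a minimal sketch: triangular membership functions and a two-rule inference that maps a process temperature to a control adjustment. The variable ranges and rule values are invented for illustration, not drawn from the NASA Lewis data.

```python
# Hedged sketch of fuzzy inference for process control; all numbers are
# illustrative placeholders.

def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def firing_temperature_adjustment(temp):
    """Two rules: if temp is 'low' -> raise power; if 'high' -> lower power.
    Defuzzify by the weighted average of the rule outputs."""
    mu_low = tri(temp, 500.0, 700.0, 900.0)
    mu_high = tri(temp, 800.0, 1000.0, 1200.0)
    if mu_low + mu_high == 0.0:
        return 0.0
    # rule consequents: +50 power units for 'low', -50 for 'high'
    return (mu_low * 50.0 + mu_high * (-50.0)) / (mu_low + mu_high)
```

    Between the two sets the output blends smoothly: at 850 degrees the 'low' and 'high' rules fire equally and the adjustment cancels to zero, which is the kind of graded response that makes fuzzy control attractive for manufacturing variables.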

  5. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-04-30

    This is the sixth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of our IGCC workbench. Preliminary CFD simulations for single stage and two stage ''generic'' gasifiers using firing conditions based on the Vision 21 reference configuration have been performed. Work is continuing on implementing an advanced slagging model into the CFD based gasifier model. An investigation into published gasification kinetics has highlighted a wide variance in predicted performance due to the choice of kinetic parameters. A plan has been outlined for developing the reactor models required to simulate the heat transfer and gas clean up equipment downstream of the gasifier. Three models that utilize the CCA software protocol have been integrated into a version of the IGCC workbench. Tests of a CCA implementation of our CFD code into the workbench demonstrated that the CCA CFD module can execute on a geographically remote PC (linked via the Internet) in a manner that is transparent to the user. Software tools to create ''walk-through'' visualizations of the flow field within a gasifier have been demonstrated.

  6. Interoperable Technologies for Advanced Petascale Simulations

    SciTech Connect

    Li, Xiaolin

    2013-01-14

    Our final report on the accomplishments of ITAPS at Stony Brook during the period covered by the research award includes component service, interface service and applications. On the component service, we have designed and implemented a robust functionality for the Lagrangian tracking of dynamic interfaces. We have migrated the hyperbolic, parabolic and elliptic solvers from stage-wise second order toward global second order schemes. We have implemented high order coupling between interface propagation and interior PDE solvers. On the interface service, we have constructed the FronTier application programmer's interface (API) and its manual page using doxygen. We installed the FronTier functional interface to conform with the ITAPS specifications, especially the iMesh and iMeshP interfaces. On applications, we have implemented deposition and dissolution models with flow and implemented the two-reactant model for a more realistic precipitation at the pore level and its coupling with the Darcy level model. We have continued our support for the study of the fluid mixing problem for problems in inertial confinement fusion. We have continued our support for the MHD model and its application to plasma liner implosion in fusion confinement. We have simulated a step in the reprocessing and separation of spent fuels from nuclear power plant fuel rods. We have implemented fluid-structure interaction for 3D windmill and parachute simulations. We have continued our collaboration with PNNL, BNL, LANL, ORNL, and other SciDAC institutions.

  7. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    NASA Technical Reports Server (NTRS)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.

  8. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  9. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    This paper is motivated by the reliability performance of multiserver computer networks. A probability limit theorem on the extreme queue length in open multiserver queueing networks in heavy traffic is derived and applied to a reliability model for multiserver computer networks, in which the time to failure of a multiserver computer network is related to the system parameters.
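The heavy-traffic experiment behind such results can be caricatured with a short simulation. In the sketch below, all rates, the server count, and the horizon are invented for illustration, and the maximum number in system merely stands in for the extreme queue length of the theorem; the simulation exploits the memorylessness of exponential arrivals and services:

```python
import random

def mmc_max_queue(lam, mu, servers, horizon, seed=1):
    """Simulate an M/M/c queue via competing exponential clocks and
    return the maximum number in system, a stand-in for the extreme
    queue length appearing in the heavy-traffic limit theorem."""
    rng = random.Random(seed)
    t, n, n_max = 0.0, 0, 0
    while t < horizon:
        busy = min(n, servers)
        t_arrive = rng.expovariate(lam)
        t_depart = rng.expovariate(busy * mu) if busy else float("inf")
        if t_arrive < t_depart:          # next event is an arrival
            t += t_arrive
            n += 1
            n_max = max(n_max, n)
        else:                            # next event is a departure
            t += t_depart
            n -= 1
    return n_max

light = mmc_max_queue(lam=0.5, mu=1.0, servers=2, horizon=500.0)
heavy = mmc_max_queue(lam=1.9, mu=1.0, servers=2, horizon=500.0)
```

As the traffic intensity approaches one, the recorded extreme grows sharply, which is the regime the limit theorem describes.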

  10. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.
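The study's approach, running one fixed benchmark on each machine and comparing execution speed and numerical accuracy, can be sketched as follows; the workload here is a hypothetical stand-in (a truncated series for e), not the Langley flight-simulation code:

```python
import math
import time

def benchmark(workload, reference, reps=5):
    """Run `workload` several times; report the best wall-clock time and
    the relative error of its result against a trusted reference value,
    mirroring the speed-plus-accuracy comparison in the study."""
    best = float("inf")
    for _ in range(reps):
        start = time.perf_counter()
        result = workload()
        best = min(best, time.perf_counter() - start)
    rel_err = abs(result - reference) / abs(reference)
    return best, rel_err

# Hypothetical stand-in kernel: a truncated Taylor series for e.
elapsed, err = benchmark(lambda: sum(1.0 / math.factorial(k) for k in range(18)),
                         math.e)
```

Running the same harness on different machines yields directly comparable (time, error) pairs.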

  11. Computational simulations of vorticity enhanced diffusion

    NASA Astrophysics Data System (ADS)

    Vold, Erik L.

    1999-11-01

    Computer simulations are used to investigate a phenomenon of vorticity enhanced diffusion (VED), a net transport and mixing of a passive scalar across a prescribed vortex flow field driven by a background gradient in the scalar quantity. The central issue under study here is the increase in scalar flux down the gradient and across the vortex field. The numerical scheme uses cylindrical coordinates centered with the vortex flow which allows an exact advective solution and 1D or 2D diffusion using simple numerical methods. In the results, the ratio of transport across a localized vortex region in the presence of the vortex flow over that expected for diffusion alone is evaluated as a measure of VED. This ratio is seen to increase dramatically while the absolute flux across the vortex decreases slowly as the diffusion coefficient is decreased. Similar results are found and compared for varying diffusion coefficient, D, or vortex rotation time, τv, for a constant background gradient in the transported scalar vs an interface in the transported quantity, and for vortex flow fields constant in time vs flow which evolves in time from an initial state and with a Schmidt number of order unity. A simple analysis shows that for a small diffusion coefficient, the flux ratio measure of VED scales as the vortex radius over the thickness for mass diffusion in a viscous shear layer within the vortex characterized by (Dτv)^(1/2). The phenomenon is linear as investigated here and suggests that a significant enhancement of mixing in fluids may be a relatively simple linear process. Discussion touches on how this vorticity enhanced diffusion may be related to mixing in nonlinear turbulent flows.
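The quoted small-D scaling can be illustrated numerically. In this sketch the prefactor of 1 and all parameter values are arbitrary assumptions; only the dependence on the vortex radius and on (Dτv)^(1/2) comes from the abstract:

```python
import math

def ved_flux_ratio(radius, D, tau_v):
    """Small-D scaling quoted in the abstract: flux across the vortex with
    rotation, relative to diffusion alone, scales as the vortex radius over
    the diffusive shear-layer thickness (D*tau_v)**0.5. The prefactor of 1
    is an arbitrary assumption for illustration."""
    return radius / math.sqrt(D * tau_v)

# Halving the diffusion coefficient raises the predicted enhancement by sqrt(2):
r1 = ved_flux_ratio(radius=1.0, D=1e-3, tau_v=10.0)
r2 = ved_flux_ratio(radius=1.0, D=5e-4, tau_v=10.0)
```

This reproduces the paper's qualitative finding that the flux ratio grows as D shrinks even though the absolute flux falls.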

  12. Alignment and Initial Operation of an Advanced Solar Simulator

    NASA Technical Reports Server (NTRS)

    Jaworske, Donald A.; Jefferies, Kent S.; Mason, Lee S.

    1996-01-01

    A solar simulator utilizing nine 30-kW xenon arc lamps was built to provide radiant power for testing a solar dynamic space power system in a thermal vacuum environment. The advanced solar simulator achieved the following values specific to the solar dynamic system: (1) a subtense angle of 1 deg; (2) the ability to vary solar simulator intensity up to 1.7 kW/sq m; (3) a beam diameter of 4.8 m; and (4) uniformity of illumination on the order of +/-10%. The flexibility of the solar simulator design allows for other potential uses of the facility.

  13. Brush seal numerical simulation: Concepts and advances

    NASA Technical Reports Server (NTRS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-01-01

    The brush seal is considered the most promising of the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, limiting the 'unwanted' leakage flows between stages or various engine cavities. This sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and brush compliance with the rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, contributing to higher engine efficiency. Use of a brush seal instead of a labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. Increasing brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and most likely can be solved only by means of a numerically distributed model.

  14. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada Earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases: Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut-and-Paste code up to a distance of 5° for an average local crustal model, which produced a normal mechanism (strike=35°, dip=41°, rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well, both in timing (VPn=7.9 km/s) and in waveform, out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest, apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic provinces to the south (Colorado Plateau, etc.), with velocities compatible with those reported by Song et al. (1996). Five-second Rayleigh waves (Airy phase) can be observed throughout the whole array and show a great deal of variation (up to 30 s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequencies to the source description by including source complexity. Preliminary inversions suggest rupture to the northeast with a shallow asperity. We are also inverting the aftershocks to extend the frequencies to 2 Hz and beyond, following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitudes larger than 3.5. Thus, we will address the energy decay with distance as a function of frequency band for the various source types.

  15. Brush seal numerical simulation: Concepts and advances

    NASA Astrophysics Data System (ADS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-07-01

    The brush seal is considered the most promising of the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, limiting the 'unwanted' leakage flows between stages or various engine cavities. This sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and brush compliance with the rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, contributing to higher engine efficiency. Use of a brush seal instead of a labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. Increasing brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and most likely can be solved only by means of a numerically distributed model.

  16. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Ziegler, C.

    1983-01-01

    A software simulator to help NASA in the design of the LMSS was developed. The simulator will be used to study the characteristics of implementation requirements of the LMSS's configuration with specifications as outlined by NASA.

  17. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-06-17

    Earthquake source parameters underpin several aspects of nuclear explosion monitoring: calibration of moment magnitudes (including coda magnitudes) and magnitude and distance amplitude corrections (MDAC); source depths; discrimination by isotropic moment tensor components; and waveform modeling for structure (including waveform tomography). This project seeks to improve methods for, and broaden the applicability of, estimating source parameters from broadband waveforms using the Cut-and-Paste (CAP) methodology. The CAP method uses a library of Green’s functions for a one-dimensional (1D, depth-varying) seismic velocity model. The method separates the main arrivals of the regional waveform into five windows: Pnl (vertical and radial components), Rayleigh (vertical and radial components), and Love (transverse component). Source parameters are estimated by a grid search over strike, dip, rake, and depth, and the seismic moment (or, equivalently, the moment magnitude, Mw) is adjusted to fit the amplitudes. Key to the CAP method is allowing the synthetic seismograms to shift in time relative to the data in order to account for path-propagation errors (delays) in the 1D seismic velocity model used to compute the Green’s functions. The CAP method has been shown to improve estimates of source parameters, especially when delay and amplitude biases are calibrated using high signal-to-noise data from moderate earthquakes (CAP+).
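The time shifts central to the CAP method amount to choosing, for each waveform window, the lag that best aligns synthetic with data. A minimal pure-Python sketch of that alignment step (toy waveforms, not the actual CAP implementation) is:

```python
def best_shift(data, synth, max_lag):
    """Return the integer lag (in samples) maximizing the cross-correlation
    of the data against a time-shifted synthetic: the alignment step that
    lets CAP absorb path-propagation delays of the 1D velocity model."""
    def xcorr(lag):
        total = 0.0
        for i, d in enumerate(data):
            j = i - lag
            if 0 <= j < len(synth):
                total += d * synth[j]
        return total
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# Toy pulse: the synthetic arrives 3 samples earlier than the data.
synth = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
data  = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]
```

In CAP this search is done independently for each of the five windows, so each segment can take its own delay.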

  18. Computational cost of full QCD simulations experienced by CP-PACS and JLQCD Collaborations

    NASA Astrophysics Data System (ADS)

    Ukawa, A.

    We summarize the experience of the CP-PACS and JLQCD Collaborations on the computational cost of two-flavor full QCD simulations with improved gauge and Wilson-type quark actions. Based on this experience, estimates are made of the Tflops·years necessary for advancing full QCD studies.

  19. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  20. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  1. Computer-Based Simulation Models for Community College Business Students.

    ERIC Educational Resources Information Center

    Kahl, James

    Instructors at Lower Columbia College in Longview, Washington use computer-based simulation models in lower level business administration courses. Prior to use, teachers must select and obtain a simulation, discuss it with campus computer personnel, set an operations schedule, obtain the necessary supplementary material, and test run the program.…

  2. Explore Effective Use of Computer Simulations for Physics Education

    ERIC Educational Resources Information Center

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  3. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  4. Cognitive Effects from Process Learning with Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Breuer, Klaus; Kummer, Ruediger

    1990-01-01

    Discusses content learning versus process learning, describes process learning with computer-based simulations, and highlights an empirical study on the effects of process learning with problem-oriented, computer-managed simulations in technical vocational education classes in West Germany. Process learning within a model of the cognitive system…

  5. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  6. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
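The core of such a discrete event simulation can be sketched in a few lines. The model below (request list, fixed server pool, and mean-latency metric are all hypothetical simplifications, not the authors' framework) serves requests in arrival order:

```python
import heapq

def simulate(requests, capacity):
    """Minimal discrete-event core: `requests` is a list of
    (arrival_time, service_duration) pairs served by a pool of `capacity`
    identical servers; returns the mean completion latency."""
    free_at = [0.0] * capacity           # min-heap: when each server frees up
    heapq.heapify(free_at)
    total_latency = 0.0
    for arrival, duration in sorted(requests):
        start = max(arrival, heapq.heappop(free_at))  # wait for a free server
        finish = start + duration
        heapq.heappush(free_at, finish)
        total_latency += finish - arrival
    return total_latency / len(requests)

# With 2 servers the third request queues; with 3 it starts immediately.
reqs = [(0.0, 2.0), (0.0, 2.0), (0.1, 2.0)]
```

Sweeping `capacity` against a demand distribution is the kind of provisioning question the paper's model answers quantitatively.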

  7. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5–5.5. Below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method’s flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region is the Middle East, where strong variations in sedimentary basins, crustal thickness, and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust of the Iranian Plateau. (2) Testing Methods on a Lifetime Regional Data Set. In particular, the recent 2/21/08 Nevada event and aftershock sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D

  8. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  9. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  10. GPU-accelerated micromagnetic simulations using cloud computing

    NASA Astrophysics Data System (ADS)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.
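The cost trade-off that motivates cloud GPU access reduces to simple arithmetic: cost per run is wall-clock time times the instance rate. A sketch with invented prices and runtimes (not figures from the paper):

```python
def cost_per_run(wall_hours, hourly_rate):
    """Cloud cost of one simulation: wall-clock hours times the instance
    rate. All prices and runtimes below are invented for illustration."""
    return wall_hours * hourly_rate

# A GPU instance that is 10x faster can be cheaper per run even at a
# 4x higher hourly price:
cpu_cost = cost_per_run(wall_hours=10.0, hourly_rate=0.5)   # CPU node
gpu_cost = cost_per_run(wall_hours=1.0, hourly_rate=2.0)    # GPU node
```

The break-even point comes where the GPU speedup equals the ratio of hourly rates.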

  11. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and "smart" wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five-year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore)

  12. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  13. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  14. Coarse-grained computer simulation of dynamics in thylakoid membranes: methods and opportunities

    PubMed Central

    Schneider, Anna R.; Geissler, Phillip L.

    2013-01-01

    Coarse-grained simulation is a powerful and well-established suite of computational methods for studying structure and dynamics in nanoscale biophysical systems. As our understanding of the plant photosynthetic apparatus has become increasingly nuanced, opportunities have arisen for coarse-grained simulation to complement experiment by testing hypotheses and making predictions. Here, we give an overview of best practices in coarse-grained simulation, with a focus on techniques and results that are applicable to the plant thylakoid membrane–protein system. We also discuss current research topics for which coarse-grained simulation has the potential to play a key role in advancing the field. PMID:24478781

  15. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  16. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  17. Advanced simulation and analysis of a geopotential research mission

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.

    1988-01-01

    Computer simulations have been performed for an orbital gradiometer mission to assist in the study of high degree and order gravity field recovery. The simulations were conducted for a satellite in near-circular, frozen orbit at a 160-km altitude using a gravitational field complete to degree and order 360. The mission duration is taken to be 32 days. The simulation provides a set of measurements to assist in the evaluation of techniques developed for the determination of the gravity field. Also, the simulation provides an ephemeris to study available tracking systems to satisfy the orbit determination requirements of the mission.

  18. Computer simulation of the scaled power bipolar SHF transistor structures

    NASA Astrophysics Data System (ADS)

    Nelayev, V. V.; Efremov, V. A.; Snitovsky, Yu. P.

    2007-04-01

    A new advanced technology for creating the npn power silicon bipolar SHF transistor structure is proposed. Advantages of the advanced technology over the standard technology are demonstrated. Simulation of both technology flows was performed, with emphasis on scaling of the discussed device structure.

  19. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519
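Read-depth analysis, one of the approaches such CNV tools implement, can be caricatured as flagging coverage windows that are outliers against the genome-wide mean. A toy sketch (window depths, threshold, and labels all invented for illustration):

```python
from statistics import mean, stdev

def call_cnv_windows(depths, z_thresh=2.0):
    """Caricature of read-depth CNV calling: flag fixed-size windows whose
    coverage is an outlier against the genome-wide mean. z above +thresh
    suggests a duplication, below -thresh a deletion. The threshold is
    arbitrary; real callers also correct for GC content and mappability."""
    mu, sigma = mean(depths), stdev(depths)
    calls = []
    for i, depth in enumerate(depths):
        z = (depth - mu) / sigma
        if z > z_thresh:
            calls.append((i, "duplication"))
        elif z < -z_thresh:
            calls.append((i, "deletion"))
    return calls

# Diploid baseline ~30x coverage, one doubled window, one dropout window:
depths = [30, 31, 29, 30, 32, 30, 29, 31, 30, 60, 30, 31, 0, 30]
```

Production tools reviewed in such surveys layer segmentation, normalization, and paired-end or split-read evidence on top of this basic signal.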

  20. Some recent advances in computational aerodynamics for helicopter applications

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.; Baeder, J. D.

    1985-01-01

    The growing application of computational aerodynamics to nonlinear helicopter problems is outlined, with particular emphasis on several recent quasi-two-dimensional examples that used the thin-layer Navier-Stokes equations and an eddy-viscosity model to approximate turbulence. Rotor blade section characteristics can now be calculated accurately over a wide range of transonic flow conditions. However, a finite-difference simulation of the complete flow field about a helicopter in forward flight is not currently feasible, despite the impressive progress that is being made in both two and three dimensions. The principal limitations are today's computer speeds and memories, algorithm and solution methods, grid generation, vortex modeling, structural and aerodynamic coupling, and a shortage of engineers who are skilled in both computational fluid dynamics and helicopter aerodynamics and dynamics.

  1. Genetic Crossing vs Cloning by Computer Simulation

    NASA Astrophysics Data System (ADS)

    Dasgupta, Subinay

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.
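A minimal sketch of Penna's bit-string aging model with asexual (cloning) reproduction can convey the mechanics the abstract refers to. The parameters below (mutation threshold T, reproduction age R, carrying capacity NMAX) and the one-new-mutation-per-birth rule are illustrative assumptions, not the paper's settings.

```python
import random

# Sketch of Penna's bit-string aging model, cloning variant. Bit i of a
# genome set to 1 means a deleterious mutation becomes active at age i.
# Parameter values are illustrative assumptions.

GENOME_BITS = 32
T = 3        # die once T deleterious mutations are active
R = 8        # minimum reproduction age
NMAX = 1000  # Verhulst carrying capacity
random.seed(1)

def step(population):
    """One time step: Verhulst death, aging, mutation death, cloning."""
    verhulst = len(population) / NMAX
    survivors = []
    for genome, age in population:
        if random.random() < verhulst:
            continue  # density-dependent (Verhulst) death
        age += 1
        if age >= GENOME_BITS:
            continue  # maximum age reached
        # mutations at bit positions 0..age-1 are active at this age
        if bin(genome & ((1 << age) - 1)).count("1") >= T:
            continue  # death by accumulated mutations
        survivors.append((genome, age))
        if age >= R:  # clone, adding one random new mutation
            child = genome | (1 << random.randrange(GENOME_BITS))
            survivors.append((child, 0))
    return survivors

pop = [(0, 0)] * 50  # 50 mutation-free newborns
for _ in range(100):
    pop = step(pop)
print(len(pop))  # surviving population after 100 steps
```

The genetic-crossover variant compared in the paper would instead build each offspring genome from two parents' bit strings; this sketch covers only the cloning branch of the comparison.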

  2. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  3. Genetic crossing vs cloning by computer simulation

    SciTech Connect

    Dasgupta, S.

    1997-06-01

We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  4. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  5. Software for the ACP (Advanced Computer Program) multiprocessor system

    SciTech Connect

    Biel, J.; Areti, H.; Atac, R.; Cook, A.; Fischler, M.; Gaines, I.; Kaliher, C.; Hance, R.; Husby, D.; Nash, T.

    1987-02-02

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system.

  6. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448
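The equivalent mass idea can be sketched numerically: infrastructure costs (pressurized volume, power, cooling) are converted into launch-mass equivalents so that competing life support designs can be compared with a single number. The equivalency factors below are placeholders for illustration, not the Volosin values the article cites.

```python
# Sketch of an equivalent-mass comparison for life support subsystems.
# All equivalency factors and subsystem numbers are assumed values.

EQUIV = {
    "volume": 66.7,   # kg per m^3 of pressurized volume (assumed)
    "power": 237.0,   # kg per kW of electrical power (assumed)
    "cooling": 60.0,  # kg per kW of heat rejection (assumed)
}

def equivalent_mass(mass_kg, volume_m3, power_kw, cooling_kw):
    """Total launch-mass equivalent of a subsystem, in kg."""
    return (mass_kg
            + volume_m3 * EQUIV["volume"]
            + power_kw * EQUIV["power"]
            + cooling_kw * EQUIV["cooling"])

# Hypothetical subsystem: 500 kg hardware, 2 m^3, 1.5 kW power/cooling
print(equivalent_mass(500, 2.0, 1.5, 1.5))  # -> 1078.9
```

A plant-growth chamber with high hardware mass but low resupply needs can then be traded quantitatively against a lighter physico-chemical unit.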

  7. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.

  8. Use of computer simulations for the early introduction of nuclear engineering concepts

    SciTech Connect

    Ougouag, A.M.; Zerguini, T.H.

    1985-11-01

    A sophomore level nuclear engineering (NE) course is being introduced at the University of Illinois. Via computer simulations, this course presents materials covering the most important aspects of the field. It is noted that computer simulations in nuclear engineering are cheaper and safer than experiments yet they provide an effective teaching tool for the early introduction of advanced concepts. The new course material can be used as a tutorial and for remedial learning. The use of computer simulation motivates learning since students associate computer activities with games. Such a course can help in the dissemination of the proper information to students from different fields, including liberal arts, and eventually increase undergraduate student enrollment in nuclear engineering.

  9. Integration of adaptive process control with computational simulation for spin-forming

    SciTech Connect

    Raboin, P. J., LLNL

    1998-03-10

Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) numerically link the part models to the spin-forming process and to the computational simulations.

  10. Parallel Computing Environments and Methods for Power Distribution System Simulation

    SciTech Connect

    Lu, Ning; Taylor, Zachary T.; Chassin, David P.; Guttromson, Ross T.; Studham, Scott S.

    2005-11-10

The development of cost-effective high-performance parallel computing on multi-processor supercomputers makes it attractive to port excessively time-consuming simulation software from personal computers (PCs) to supercomputers. The power distribution system simulator (PDSS) takes a bottom-up approach and simulates load at the appliance level, where detailed thermal models for appliances are used. This approach works well for a small power distribution system consisting of a few thousand appliances. When the number of appliances increases, the simulation uses up the PC memory and its run time increases to a point where the approach is no longer feasible for modeling a practical large power distribution system. This paper presents an effort to port the PC-based power distribution system simulator (PDSS) to a 128-processor shared-memory supercomputer. The paper offers an overview of the parallel computing environment and a description of the modifications made to the PDSS model. The performance of the PDSS running on a standalone PC and on the supercomputer is compared. Future research directions for utilizing parallel computing in power distribution system simulation are also addressed.
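The bottom-up approach described above can be sketched in miniature: every appliance carries its own small thermal state, and system load is the sum over all appliances, which is why memory and run time grow with the appliance count. The first-order water-heater model and all parameters here are illustrative assumptions, not the PDSS appliance models.

```python
# Sketch of appliance-level (bottom-up) load simulation: one thermal
# state per appliance, aggregated system energy. All parameters are
# illustrative assumptions.

def simulate_heaters(n, steps, dt=60.0):
    """Thermostat-controlled heaters; returns total energy drawn (kWh)."""
    temp = [50.0] * n      # deg C, one state variable per appliance
    on = [False] * n
    energy = 0.0
    for _ in range(steps):
        for i in range(n):
            if temp[i] < 48.0:       # thermostat deadband: 48-52 deg C
                on[i] = True
            elif temp[i] > 52.0:
                on[i] = False
            power = 4.0 if on[i] else 0.0            # kW heating element
            loss = 0.02 * (temp[i] - 20.0)           # standing loss, kW
            temp[i] += (power - loss) * dt / 200.0   # thermal-mass proxy
            energy += power * dt / 3600.0            # kWh drawn
    return energy

print(simulate_heaters(n=100, steps=60))  # one hour at 1-min resolution
```

Scaling the inner loop from thousands to millions of such states is exactly the per-appliance cost that motivates the move to a shared-memory parallel machine.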

  11. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
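The cost behaviour the abstract describes follows directly from per-machine billing in whole hours: n machines each run ceil(T/n) billable hours, so completion time falls as 1/n while billed cost is minimised exactly when n divides the total simulation time T. A small sketch (the 12-hour job is an assumed example):

```python
import math

# Sketch of cloud billing for an embarrassingly parallel simulation:
# completion time scales as 1/n, billed cost as n * ceil(T/n).

def completion_hours(total_hours, n):
    """Wall-clock time with the work split evenly over n machines."""
    return total_hours / n

def billed_machine_hours(total_hours, n):
    """Cost proxy: each machine is billed in whole hours."""
    return n * math.ceil(total_hours / n)

T = 12  # total single-machine simulation time in hours (illustrative)
for n in (3, 4, 5, 6):
    print(n, completion_hours(T, n), billed_machine_hours(T, n))
# n = 5 does not divide 12, so it bills 15 machine-hours instead of 12
```

Divisors of T (3, 4, 6) all bill the same 12 machine-hours while finishing faster; a non-divisor pays for idle fractions of the final hour.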

  12. Gasification CFD Modeling for Advanced Power Plant Simulations

    SciTech Connect

    Zitney, S.E.; Guenther, C.P.

    2005-09-01

    In this paper we have described recent progress on developing CFD models for two commercial-scale gasifiers, including a two-stage, coal slurry-fed, oxygen-blown, pressurized, entrained-flow gasifier and a scaled-up design of the PSDF transport gasifier. Also highlighted was NETL’s Advanced Process Engineering Co-Simulator for coupling high-fidelity equipment models with process simulation for the design, analysis, and optimization of advanced power plants. Using APECS, we have coupled the entrained-flow gasifier CFD model into a coal-fired, gasification-based FutureGen power and hydrogen production plant. The results for the FutureGen co-simulation illustrate how the APECS technology can help engineers better understand and optimize gasifier fluid dynamics and related phenomena that impact overall power plant performance.

  13. Some theoretical issues on computer simulations

    SciTech Connect

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and one representing the equivalence classes of systems obtained under all possible updates.
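A minimal sequentially updated cellular automaton over a graph can be sketched directly: each vertex applies a local rule to its closed neighbourhood, and vertices are updated one at a time in a given order, so the update order is part of the system's definition. The parity rule and the 4-cycle graph below are illustrative choices, not the paper's examples.

```python
# Sketch of a sequentially updated cellular automaton (sCA) over a
# graph. The local rule (parity of the closed neighbourhood) and the
# example graph are illustrative assumptions.

def sca_step(graph, state, order):
    """Update vertices one at a time, in the given order."""
    s = dict(state)  # copy so the input state is left untouched
    for v in order:
        neighborhood = [s[v]] + [s[u] for u in graph[v]]
        s[v] = sum(neighborhood) % 2  # local rule: parity of closed nbhd
    return s

# A 4-cycle: 0-1-2-3-0, with a single vertex initially set to 1
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
state = {0: 1, 1: 0, 2: 0, 3: 0}
print(sca_step(graph, state, order=[0, 1, 2, 3]))
print(sca_step(graph, state, order=[3, 2, 1, 0]))  # order can matter
```

The two update orders yield different global maps here; classifying which orders yield equivalent systems is exactly the kind of question the dependency-graph machinery addresses.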

  14. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    PubMed

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
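The master/worker pattern the abstract describes can be sketched with Python's multiprocessing standing in for MPI: the master splits the requested histories into independent chunks, workers run them with independent random streams, and the master aggregates the tallies. The toy "simulation" here estimates pi by rejection sampling rather than transporting particles; it is an analogy, not the EGS5 workflow.

```python
import random
from multiprocessing import Pool

# Sketch of master/worker Monte Carlo distribution. multiprocessing is
# an assumed stand-in for the MPI layer in the described system.

def run_chunk(args):
    """Worker: run one chunk of histories with its own random stream."""
    seed, n = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)

def monte_carlo_pi(total, workers):
    """Master: distribute chunks, then aggregate worker tallies."""
    chunk = total // workers
    jobs = [(seed, chunk) for seed in range(workers)]
    with Pool(workers) as pool:
        hits = sum(pool.map(run_chunk, jobs))
    return 4.0 * hits / (chunk * workers)

if __name__ == "__main__":
    print(monte_carlo_pi(total=100_000, workers=4))  # an estimate of pi
```

Because the chunks are independent, run time scales inversely with worker count, the same 1/n behaviour the authors report for their cloud cluster.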

  15. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    NASA Astrophysics Data System (ADS)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.

  16. Advances in parallel computer technology for desktop atmospheric dispersion models

    SciTech Connect

    Bian, X.; Ionescu-Niscov, S.; Fast, J.D.; Allwine, K.J.

    1996-12-31

    Desktop models are those models used by analysts with varied backgrounds, for performing, for example, air quality assessment and emergency response activities. These models must be robust, well documented, have minimal and well controlled user inputs, and have clear outputs. Existing coarse-grained parallel computers can provide significant increases in computation speed in desktop atmospheric dispersion modeling without considerable increases in hardware cost. This increased speed will allow for significant improvements to be made in the scientific foundations of these applied models, in the form of more advanced diffusion schemes and better representation of the wind and turbulence fields. This is especially attractive for emergency response applications where speed and accuracy are of utmost importance. This paper describes one particular application of coarse-grained parallel computer technology to a desktop complex terrain atmospheric dispersion modeling system. By comparing performance characteristics of the coarse-grained parallel version of the model with the single-processor version, we will demonstrate that applying coarse-grained parallel computer technology to desktop atmospheric dispersion modeling systems will allow us to address critical issues facing future requirements of this class of dispersion models.

  17. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim

    2004-01-28

    This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.

  18. Computational Fluid Dynamic simulations of pipe elbow flow.

    SciTech Connect

    Homicz, Gregory Francis

    2004-08-01

One problem facing today's nuclear power industry is flow-accelerated corrosion and erosion in pipe elbows. The Korean Atomic Energy Research Institute (KAERI) is performing experiments in their Flow-Accelerated Corrosion (FAC) test loop to better characterize these phenomena, and develop advanced sensor technologies for the condition monitoring of critical elbows on a continuous basis. In parallel with these experiments, Sandia National Laboratories is performing Computational Fluid Dynamic (CFD) simulations of the flow in one elbow of the FAC test loop. The simulations are being performed using the FLUENT commercial software developed and marketed by Fluent, Inc. The model geometry and mesh were created using the GAMBIT software, also from Fluent, Inc. This report documents the results of the simulations that have been made to date; baseline results employing the RNG k-ε turbulence model are presented. The predicted value for the diametrical pressure coefficient is in reasonably good agreement with published correlations. Plots of the velocities, pressure field, wall shear stress, and turbulent kinetic energy adjacent to the wall are shown within the elbow section. Somewhat to our surprise, these indicate that the maximum values of both wall shear stress and turbulent kinetic energy occur near the elbow entrance, on the inner radius of the bend. Additional simulations were performed for the same conditions, but with the RNG k-ε model replaced by either the standard k-ε or the realizable k-ε turbulence model. The predictions using the standard k-ε model are quite similar to those obtained in the baseline simulation. However, with the realizable k-ε model, more significant differences are evident. The maximums in both wall shear stress and turbulent kinetic energy now appear on the outer radius, near the elbow exit, and are approximately 11% and 14% greater, respectively, than those predicted in the baseline calculation.

  19. Computer simulation of water reclamation processors

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

    1991-01-01

    The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting the process behavior. Particular attention is given to methodology which is used to complete this work, and the insights which are gained by this type of model development.

  20. Computer simulation of a few common process control systems

    SciTech Connect

    Muncy, M.P.

    1986-06-01

    This paper shows how to simulate five common process control systems on an IBM PC with a commercially available software package named TUTSIM. All steps involved in producing and checking each simulation are described as clearly as possible. Complete computer listings and output line plots are included to fully document each simulation. Sufficient information is provided so that readers of this paper can duplicate each simulation if they desire to do so. 10 refs., 13 figs., 11 tbls.
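The kind of block-diagram process simulation TUTSIM performed can be sketched in a few lines of modern code: a first-order process G(s) = K/(τs + 1) under PI control, integrated with Euler steps. The gains and process constants below are illustrative assumptions, not values from the paper's five example systems.

```python
# Sketch of a simulated process control loop: PI controller driving a
# first-order lag, Euler-integrated. All constants are illustrative.

def simulate_pi_control(setpoint=1.0, kp=2.0, ki=1.0, k=1.0, tau=5.0,
                        dt=0.01, t_end=30.0):
    """Return the process output after t_end seconds of closed loop."""
    y, integral, t = 0.0, 0.0, 0.0
    while t < t_end:
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        y += (k * u - y) / tau * dt      # first-order process dynamics
        t += dt
    return y

print(simulate_pi_control())  # settles near the setpoint of 1.0
```

The integral term removes the steady-state offset a pure proportional controller would leave, which is the standard motivation for PI control of a first-order lag.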

  1. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses of any modern computer room.

  2. Using Process/CFD Co-Simulation for the Design and Analysis of Advanced Energy Systems

    SciTech Connect

    Zitney, S.E.

    2007-04-01

    In this presentation we describe the major features and capabilities of NETL’s Advanced Process Engineering Co-Simulator (APECS) and highlight its application to advanced energy systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based electricity and hydrogen plant in the DOE’s $1 billion, 10-year FutureGen demonstration project. APECS is an integrated software suite which allows the process and energy industries to optimize overall plant performance with respect to complex thermal and fluid flow phenomena by combining process simulation (e.g., Aspen Plus®) with high-fidelity equipment simulations based on computational fluid dynamics (CFD) models (e.g., FLUENT®).

  3. How enzymes work: analysis by modern rate theory and computer simulations.

    PubMed

    Garcia-Viloca, Mireia; Gao, Jiali; Karplus, Martin; Truhlar, Donald G

    2004-01-01

    Advances in transition state theory and computer simulations are providing new insights into the sources of enzyme catalysis. Both lowering of the activation free energy and changes in the generalized transmission coefficient (recrossing of the transition state, tunneling, and nonequilibrium contributions) can play a role. A framework for understanding these effects is presented, and the contributions of the different factors, as illustrated by specific enzymes, are identified and quantified by computer simulations. The resulting understanding of enzyme catalysis is used to comment on alternative proposals of how enzymes work.
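The framework the abstract summarises is usually written as k = γ·(kB·T/h)·exp(-ΔG‡/RT), where γ is the generalized transmission coefficient (recrossing, tunneling, nonequilibrium effects) and ΔG‡ the activation free energy. A small sketch with illustrative numbers shows how the two factors separate:

```python
import math

# Sketch of the transition-state-theory rate expression. The barrier
# heights used in the example are illustrative, not from any enzyme.

KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def tst_rate(dg_kj_mol, temperature=298.15, gamma=1.0):
    """Rate constant (1/s): gamma * (kB*T/h) * exp(-dG/(R*T))."""
    return gamma * (KB * temperature / H) * math.exp(
        -dg_kj_mol * 1e3 / (R * temperature))

# Lowering the activation free energy by 10 kJ/mol at 298 K speeds the
# reaction ~56-fold, independent of the transmission coefficient.
print(tst_rate(60.0) / tst_rate(70.0))
```

Barrier lowering enters exponentially while γ enters only linearly, which is why the simulations discussed in the paper treat the two contributions separately.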

  4. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer-code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  5. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

  6. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
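The discrete agent-based approach Biocellion parallelizes can be conveyed with a deliberately tiny sketch: each cell is an agent on a lattice that may divide into an empty neighbouring site. The rules and parameters below are illustrative assumptions and bear no relation to Biocellion's actual model-routine API.

```python
import random

# Highly simplified sketch of discrete agent-based growth on a grid:
# agents (cells) divide into empty neighbouring lattice sites. All
# rules and parameters are illustrative assumptions.

random.seed(0)
SIZE = 21  # lattice is SIZE x SIZE

def grow(occupied, steps, p_divide=0.3):
    """Advance the colony: each cell may divide into an empty neighbour."""
    for _ in range(steps):
        for (x, y) in list(occupied):  # snapshot: no same-step chains
            if random.random() > p_divide:
                continue
            empty = [(x + dx, y + dy)
                     for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE
                     and (x + dx, y + dy) not in occupied]
            if empty:
                occupied.add(random.choice(empty))
    return occupied

colony = grow({(SIZE // 2, SIZE // 2)}, steps=20)
print(len(colony))  # colony size after 20 steps
```

The computational burden the abstract mentions comes from scaling loops like this to millions of agents plus a high-resolution extracellular grid, which is the workload Biocellion distributes across parallel machines.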

  7. In-silico simulations of advanced drug delivery systems: what will the future offer?

    PubMed

    Siepmann, Juergen

    2013-09-15

    This commentary enlarges on some of the topics addressed in the Position Paper "Towards more effective advanced drug delivery systems" by Crommelin and Florence (2013). Inter alia, the role of mathematical modeling and computer-assisted device design is briefly addressed in the Position Paper. This emerging and particularly promising field is considered in more depth in this commentary. In fact, in-silico simulations have become of fundamental importance in numerous scientific and related domains, allowing for a better understanding of various phenomena and for facilitated device design. The development of novel prototypes of space shuttles, nuclear power plants and automobiles are just a few examples. In-silico simulations are nowadays also well established in the field of pharmacokinetics/pharmacodynamics (PK/PD) and have become an integral part of the discovery and development process of novel drug products. Since Takeru Higuchi published his seminal equation in 1961 the use of mathematical models for the analysis and optimization of drug delivery systems in vitro has also become more and more popular. However, applying in-silico simulations for facilitated optimization of advanced drug delivery systems is not yet common practice. One of the reasons is the gap between in vitro and in vivo (PK/PD) simulations. In the future it can be expected that this gap will be closed and that computer assisted device design will play a central role in the research on, and development of advanced drug delivery systems.

  8. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean Micron; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard Micron; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  9. Lessons Learned From Dynamic Simulations of Advanced Fuel Cycles

    SciTech Connect

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2009-04-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, to assess options against those objectives and goals, or to recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  10. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used by developing the algorithm in an ALGOL-like pseudo-language and then encoding the algorithm into FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  11. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  12. Numerical simulation of supersonic wake flow with parallel computers

    SciTech Connect

    Wong, C.C.; Soetrisno, M.

    1995-07-01

    Simulating a supersonic wake flow field behind a conical body is a computing-intensive task. It requires a large number of computational cells to capture the dominant flow physics and a robust numerical algorithm to obtain a reliable solution. High performance parallel computers with unique distributed processing and data storage capability can meet this need. They have larger computational memory and faster computing time than conventional vector computers. We apply the PINCA Navier-Stokes code to simulate a wind-tunnel supersonic wake experiment on Intel Gamma, Intel Paragon, and IBM SP2 parallel computers. These simulations are performed to study the mean flow in the near wake region of a sharp, 7-degree half-angle, adiabatic cone at Mach number 4.3 and freestream Reynolds number of 40,600. Overall, the numerical solutions capture the general features of the supersonic laminar wake flow and compare favorably with the wind tunnel data. With a refined and clustering grid distribution in the recirculation zone, the calculated location of the rear stagnation point is consistent with the 2D axisymmetric and 3D experiments. In this study, we also demonstrate the importance of having a large local memory capacity within a computer node and the effective utilization of the number of computer nodes to achieve good parallel performance when simulating a complex, large-scale wake flow problem.

  13. Computer Simulation of Electric Field Lines.

    ERIC Educational Resources Information Center

    Kirkup, L.

    1985-01-01

    Describes a computer program which plots electric field lines. Includes program listing, sample diagrams produced on a BBC model B microcomputer (which could be produced on other microcomputers by modifying the program), and a discussion of the properties of field lines. (JN)
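
    Field-line plotting of the kind described reduces to repeatedly stepping a short, fixed distance along the local field direction. A minimal Python sketch (not the article's BBC BASIC listing; the dipole geometry and starting point are illustrative) for tracing one line through a two-point-charge field:

```python
import math

def efield(charges, x, y):
    """E at (x, y) from point charges [(q, x0, y0), ...], with k = 1."""
    ex = ey = 0.0
    for q, x0, y0 in charges:
        dx, dy = x - x0, y - y0
        r3 = (dx * dx + dy * dy) ** 1.5
        ex += q * dx / r3
        ey += q * dy / r3
    return ex, ey

def trace_field_line(charges, x, y, step=0.01, n_steps=1000):
    """Follow E with fixed-length Euler steps until a charge is reached."""
    points = [(x, y)]
    for _ in range(n_steps):
        ex, ey = efield(charges, x, y)
        norm = math.hypot(ex, ey)
        if norm == 0.0:
            break
        x += step * ex / norm
        y += step * ey / norm
        points.append((x, y))
        if any(math.hypot(x - x0, y - y0) < step for _, x0, y0 in charges):
            break  # the line terminates on (or next to) a charge
    return points

# A line leaving the positive charge of a dipole ends on the negative one.
dipole = [(+1.0, -1.0, 0.0), (-1.0, +1.0, 0.0)]
line = trace_field_line(dipole, -0.95, 0.01)
```

    Plotting many such polylines, each started at a different angle around a charge, reproduces the familiar field-line diagrams.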

  14. How Real Is a Computer Simulation?

    ERIC Educational Resources Information Center

    Higgins, John J.

    Two keywords, "input" and "get," in the BASIC programming language provide a metaphor of the processes of response and intervention in a dialogue situation. Computer teaching activities can be programmed using one or both of these commands. There are at least five main types: the quiz or overt teaching program, the text processing program, the…

  15. An Exercise in Biometrical Genetics Based on a Computer Simulation.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    1983-01-01

    Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hybridization program. Advantages of using the material in this way are also discussed. (Author/JN)

  16. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  17. Parallel solvers for reservoir simulation on MIMD computers

    SciTech Connect

    Piault, E.; Willien, F.; Roux, F.X.

    1995-12-01

    We have investigated parallel solvers for reservoir simulation. We compare different solvers and preconditioners using T3D and SP1 parallel computers. We use a block-diagonal domain decomposition preconditioner with non-overlapping sub-domains.
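
    As a concrete illustration of the idea (a sketch, not the authors' T3D/SP1 code), a block-diagonal preconditioner with non-overlapping sub-domains solves each diagonal block of the matrix independently inside a preconditioned conjugate-gradient loop; on a parallel machine each sub-domain solve would run on its own processor:

```python
def solve_block(A, r, blk):
    """Solve the sub-system A[blk][:, blk] z = r[blk] by Gaussian elimination."""
    m = len(blk)
    M = [[A[blk[i]][blk[j]] for j in range(m)] + [r[blk[i]]] for i in range(m)]
    for c in range(m):                        # forward elimination with pivoting
        p = max(range(c, m), key=lambda k: abs(M[k][c]))
        M[c], M[p] = M[p], M[c]
        for k in range(c + 1, m):
            f = M[k][c] / M[c][c]
            for j in range(c, m + 1):
                M[k][j] -= f * M[c][j]
    z = [0.0] * m
    for i in reversed(range(m)):              # back substitution
        z[i] = (M[i][m] - sum(M[i][j] * z[j] for j in range(i + 1, m))) / M[i][i]
    return z

def precondition(A, r, blocks):
    """Block-diagonal (non-overlapping sub-domain) preconditioner: M^-1 r."""
    z = [0.0] * len(r)
    for blk in blocks:
        for idx, zi in zip(blk, solve_block(A, r, blk)):
            z[idx] = zi
    return z

def pcg(A, b, blocks, tol=1e-10, max_iter=100):
    """Conjugate gradients preconditioned by the block-diagonal solve."""
    n = len(b)
    x, r = [0.0] * n, b[:]
    z = precondition(A, r, blocks)
    p, rz = z[:], sum(a * c for a, c in zip(r, z))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(a * c for a, c in zip(p, Ap))
        x = [a + alpha * c for a, c in zip(x, p)]
        r = [a - alpha * c for a, c in zip(r, Ap)]
        if max(map(abs, r)) < tol:
            break
        z = precondition(A, r, blocks)
        rz, rz_old = sum(a * c for a, c in zip(r, z)), rz
        p = [a + (rz / rz_old) * c for a, c in zip(z, p)]
    return x

# 1D Laplacian on 8 unknowns, split into two non-overlapping 4-point sub-domains
n = 8
A = [[2.0 if i == j else -1.0 if abs(i - j) == 1 else 0.0 for j in range(n)]
     for i in range(n)]
b = [1.0] + [0.0] * (n - 2) + [1.0]           # exact solution: all ones
x = pcg(A, b, [list(range(0, 4)), list(range(4, 8))])
```

    Because the blocks share no unknowns, the preconditioner costs one independent local solve per sub-domain and needs no inter-processor communication, which is what makes it attractive on distributed-memory machines.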

  18. Computer simulation of chemical reactions in porous materials

    NASA Astrophysics Data System (ADS)

    Turner, Christoffer Heath

    Understanding reactions in nanoporous materials from a purely experimental perspective is a difficult task. Measuring the chemical composition of a reacting system within a catalytic material is usually only accomplished through indirect methods, and it is usually impossible to distinguish between true chemical equilibrium and metastable states. In addition, measuring molecular orientation or distribution profiles within porous systems is not easily accomplished. However, molecular simulation techniques are well-suited to these challenges. With appropriate simulation techniques and realistic molecular models, it is possible to validate the dominant physical and chemical forces controlling nanoscale reactivity. Novel nanostructured catalysts and supports can be designed, optimized, and tested using high-performance computing and advanced modeling techniques in order to guide the search for next-generation catalysts---setting new targets for the materials synthesis community. We have simulated the conversion of several different equilibrium-limited reactions within microporous carbons and we find that the pore size, pore geometry, and surface chemistry are important factors for determining the reaction yield. The equilibrium-limited reactions that we have modeled include nitric oxide dimerization, ammonia synthesis, and the esterification of acetic acid, all of which show yield enhancements within microporous carbons. In conjunction with a yield enhancement of the esterification reaction, selective adsorption of ethyl acetate within carbon micropores demonstrates an efficient method for product recovery. Additionally, a new method has been developed for simulating reaction kinetics within porous materials and other heterogeneous environments. The validity of this technique is first demonstrated by reproducing the kinetics of hydrogen iodide decomposition in the gas phase, and then predictions are made within slit-shaped carbon pores and carbon nanotubes. The rate

  19. Bodies Falling with Air Resistance: Computer Simulation.

    ERIC Educational Resources Information Center

    Vest, Floyd

    1982-01-01

    Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
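
    The abstract's second model (drag proportional to the square of velocity) was given in BASIC; an equivalent Python sketch using simple Euler steps (all parameter values illustrative) shows the velocity approaching the terminal value sqrt(mg/k):

```python
def fall_quadratic_drag(mass=1.0, k=0.1, g=9.81, dt=0.01, t_end=30.0):
    """Euler integration of m dv/dt = m g - k v**2 (downward positive)."""
    v, history = 0.0, [0.0]
    for _ in range(int(t_end / dt)):
        v += dt * (g - (k / mass) * v * v)
        history.append(v)
    return history

vs = fall_quadratic_drag()
v_terminal = (1.0 * 9.81 / 0.1) ** 0.5    # sqrt(m g / k), about 9.90 m/s
```

    The first model (drag proportional to v) is the same loop with the `v * v` term replaced by `v`, and has terminal velocity mg/k instead.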

  20. Student Ecosystems Problem Solving Using Computer Simulation.

    ERIC Educational Resources Information Center

    Howse, Melissa A.

    The purpose of this study was to determine the procedural knowledge brought to, and created within, a pond ecology simulation by students. Environmental Decision Making (EDM) is an ecosystems modeling tool that allows users to pose their own problems and seek satisfying solutions. Of specific interest was the performance of biology majors who had…

  1. The DOE Accelerated Strategic Computing Initiative: Challenges and opportunities for predictive materials simulation capabilities

    NASA Astrophysics Data System (ADS)

    Mailhiot, Christian

    1998-05-01

    In response to the unprecedented national security challenges emerging from the end of nuclear testing, the Defense Programs of the Department of Energy has developed a long-term strategic plan based on a vigorous Science-Based Stockpile Stewardship (SBSS) program. The main objective of the SBSS program is to ensure confidence in the performance, safety, and reliability of the stockpile on the basis of a fundamental science-based approach. A central element of this approach is the development of predictive, ‘full-physics’, full-scale computer simulation tools. As a critical component of the SBSS program, the Accelerated Strategic Computing Initiative (ASCI) was established to provide the required advances in computer platforms and to enable predictive, physics-based simulation capabilities. In order to achieve the ASCI goals, fundamental problems in the fields of computer and physical sciences of great significance to the entire scientific community must be successfully solved. Foremost among the key elements needed to develop predictive simulation capabilities, the development of improved physics-based materials models is a cornerstone. We indicate some of the materials theory, modeling, and simulation challenges and illustrate how the ASCI program will enable both the hardware and the software tools necessary to advance the state-of-the-art in the field of computational condensed matter and materials physics.

  2. Integration of Advanced Simulation and Visualization for Manufacturing Process Optimization

    NASA Astrophysics Data System (ADS)

    Zhou, Chenn; Wang, Jichao; Tang, Guangwu; Moreland, John; Fu, Dong; Wu, Bin

    2016-05-01

    The integration of simulation and visualization can provide a cost-effective tool for process optimization, design, scale-up and troubleshooting. The Center for Innovation through Visualization and Simulation (CIVS) at Purdue University Northwest has developed methodologies for such integration with applications in various manufacturing processes. The methodologies have proven to be useful for virtual design and virtual training to provide solutions addressing issues on energy, environment, productivity, safety, and quality in steel and other industries. In collaboration with its industrial partnerships, CIVS has provided solutions to companies, saving over US38 million. CIVS is currently working with the steel industry to establish an industry-led Steel Manufacturing Simulation and Visualization Consortium through the support of National Institute of Standards and Technology AMTech Planning Grant. The consortium focuses on supporting development and implementation of simulation and visualization technologies to advance steel manufacturing across the value chain.

  3. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  4. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  5. Understanding Islamist political violence through computational social simulation

    SciTech Connect

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  6. Simulation Neurotechnologies for Advancing Brain Research: Parallelizing Large Networks in NEURON.

    PubMed

    Lytton, William W; Seidenstein, Alexandra H; Dura-Bernal, Salvador; McDougal, Robert A; Schürmann, Felix; Hines, Michael L

    2016-10-01

    Large multiscale neuronal network simulations are of increasing value as more big data are gathered about brain wiring and organization under the auspices of a current major research initiative, such as Brain Research through Advancing Innovative Neurotechnologies. The development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with message passing interface (MPI) for simulation in the domain of moderately large networks on commonly available high-performance computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike-passing paradigm, and postsimulation data storage and data management approaches. Using the Neuroscience Gateway, a portal for computational neuroscience that provides access to large HPCs, we benchmark simulations of neuronal networks of different sizes (500-100,000 cells), and using different numbers of nodes (1-256). We compare three types of networks, composed of either Izhikevich integrate-and-fire neurons (I&F), single-compartment Hodgkin-Huxley (HH) cells, or a hybrid network with half of each. Results show simulation run time increased approximately linearly with network size and decreased almost linearly with the number of nodes. Networks with I&F neurons were faster than HH networks, although differences were small since all tested cells were point neurons with a single compartment. PMID:27557104

  8. Simulations for Complex Fluid Flow Problems from Berkeley Lab's Center for Computational Sciences and Engineering (CCSE)

    DOE Data Explorer

    The Center for Computational Sciences and Engineering (CCSE) develops and applies advanced computational methodologies to solve large-scale scientific and engineering problems arising in the Department of Energy (DOE) mission areas involving energy, environmental, and industrial technology. The primary focus is in the application of structured-grid finite difference methods on adaptive grid hierarchies for compressible, incompressible, and low Mach number flows. The diverse range of scientific applications that drive the research typically involve a large range of spatial and temporal scales (e.g. turbulent reacting flows) and require the use of extremely large computing hardware, such as the 153,000-core computer, Hopper, at NERSC. The CCSE approach to these problems centers on the development and application of advanced algorithms that exploit known separations in scale; for many of the application areas this results in algorithms that are several orders of magnitude more efficient than traditional simulation approaches.

  9. Advanced Simulation Capability for Environmental Management (ASCEM) Phase II Demonstration

    SciTech Connect

    Freshley, M.; Hubbard, S.; Flach, G.; Freedman, V.; Agarwal, D.; Andre, B.; Bott, Y.; Chen, X.; Davis, J.; Faybishenko, B.; Gorton, I.; Murray, C.; Moulton, D.; Meyer, J.; Rockhold, M.; Shoshani, A.; Steefel, C.; Wainwright, H.; Waichler, S.

    2012-09-28

    In 2009, the National Academies of Science (NAS) reviewed and validated the U.S. Department of Energy Office of Environmental Management (EM) Technology Program in its publication, Advice on the Department of Energy’s Cleanup Technology Roadmap: Gaps and Bridges. The NAS report outlined prioritization needs for the Groundwater and Soil Remediation Roadmap, concluded that contaminant behavior in the subsurface is poorly understood, and recommended further research in this area as a high priority. To address this NAS concern, the EM Office of Site Restoration began supporting the development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific approach that uses an integration of toolsets for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM modeling toolset is modular and open source. It is divided into three thrust areas: Multi-Process High Performance Computing (HPC), Platform and Integrated Toolsets, and Site Applications. The ASCEM toolsets will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. During fiscal year 2012, the ASCEM project continued to make significant progress in capabilities development. Capability development occurred in both the Platform and Integrated Toolsets and Multi-Process HPC Simulator areas. The new Platform and Integrated Toolsets capabilities provide the user an interface and the tools necessary for end-to-end model development that includes conceptual model definition, data management for model input, model calibration and uncertainty analysis, and model output processing including visualization. The new HPC Simulator capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and model confidence testing and verification for

  10. Monte Carlo simulations on SIMD computer architectures

    SciTech Connect

    Burmester, C.P.; Gronsky, R.; Wille, L.T.

    1992-03-01

    Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
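
    The lattice partitioning the abstract describes is typically a checkerboard decomposition: sites of one color share no nearest neighbors, so a SIMD array can update all of them in lockstep without conflicting reads. A serial Python sketch of that update pattern for the nearest-neighbor Ising model (lattice size, temperatures, and sweep counts are illustrative):

```python
import math
import random

def checkerboard_sweep(spins, L, beta, parity, rng):
    """Metropolis-update every site of one checkerboard sublattice.

    Same-parity sites share no nearest neighbors, so a SIMD machine could
    update them all simultaneously; here they are simply visited serially.
    """
    for i in range(L):
        for j in range(L):
            if (i + j) % 2 != parity:
                continue
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2.0 * spins[i][j] * nn          # energy cost of flipping (i, j)
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] = -spins[i][j]

def magnetization(spins):
    return abs(sum(map(sum, spins))) / (len(spins) ** 2)

L, rng = 16, random.Random(12345)
hot = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]   # random start
cold = [[1] * L for _ in range(L)]                                  # ordered start
for _ in range(200):
    for parity in (0, 1):
        checkerboard_sweep(hot, L, 0.1, parity, rng)   # T far above Tc
        checkerboard_sweep(cold, L, 1.0, parity, rng)  # T far below Tc
```

    The hot lattice stays disordered (small |m|) while the ordered low-temperature lattice keeps its magnetization, the expected bracketing behavior on either side of the critical temperature.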

  11. Computer Simulation of the Beating Human Heart

    NASA Astrophysics Data System (ADS)

    Peskin, Charles S.; McQueen, David M.

    2001-06-01

    The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.

  12. Requirements for advanced simulation of nuclear reactor and chemical separation plants.

    SciTech Connect

    Palmiotti, G.; Cahalan, J.; Pfeiffer, P.; Sofu, T.; Taiwo, T.; Wei, T.; Yacout, A.; Yang, W.; Siegel, A.; Insepov, Z.; Anitescu, M.; Hovland, P.; Pereira, C.; Regalbuto, M.; Copple, J.; Willamson, M.

    2006-12-11

    This report presents requirements for advanced simulation of nuclear reactor and chemical processing plants that are of interest to the Global Nuclear Energy Partnership (GNEP) initiative. Justification for advanced simulation and some examples of grand challenges that will benefit from it are provided. An integrated software tool whose main components are, whenever possible, based on first principles is proposed as a possible future approach for dealing with the complex problems linked to the simulation of nuclear reactor and chemical processing plants. The main benefits that are associated with a better integrated simulation have been identified as: a reduction of design margins, a decrease of the number of experiments in support of the design process, a shortening of the developmental design cycle, and a better understanding of the physical phenomena and the related underlying fundamental processes. For each component of the proposed integrated software tool, background information, functional requirements, current tools and approach, and proposed future approaches have been provided. Whenever possible, current uncertainties have been quoted and existing limitations have been presented. Desired target accuracies with associated benefits to the different aspects of the nuclear reactor and chemical processing plants were also given. In many cases the possible gains associated with a better simulation have been identified, quantified, and translated into economic benefits.

  13. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
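
    The kind of throughput question the STS Processing Model answers (how launch rate responds to adding or deleting facilities) can be illustrated with a few lines of discrete-event logic; the orbiter counts, bay counts, and flow durations below are made up for illustration, not Lockheed's figures:

```python
import heapq

def launches(n_orbiters=4, n_bays=2, flow_days=45.0, horizon_days=365.0):
    """Greedy discrete-event sketch of a launch-processing flow.

    Each launch ties up one orbiter and one processing bay for flow_days;
    the function counts launches completed within horizon_days.
    """
    orbiters = [0.0] * n_orbiters       # time each orbiter is next available
    bays = [0.0] * n_bays               # time each bay is next available
    heapq.heapify(orbiters)
    heapq.heapify(bays)
    count = 0
    while True:
        t0 = max(orbiters[0], bays[0])  # earliest feasible start time
        if t0 + flow_days > horizon_days:
            return count
        heapq.heapreplace(orbiters, t0 + flow_days)
        heapq.heapreplace(bays, t0 + flow_days)
        count += 1
```

    Varying one resource at a time (e.g. `launches(n_bays=1)` versus the default) immediately shows which resource is the bottleneck, which is the basic "what if" use of such process simulators.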

  14. Conceptually enhanced simulations: A computer tool for science teaching

    NASA Astrophysics Data System (ADS)

    Snir, Joseph; Smith, Carol; Grosslight, Lorraine

    1993-06-01

    In this paper, we consider a way computer simulations can be used to address the problem of teaching for conceptual change and understanding. After identifying three levels of understanding of a natural phenomenon (concrete, conceptual, and metaconceptual) that need to be addressed in school science, and classifying computer model systems and simulations more generally in terms of the design choices facing the programmer, we argue that there are ways to design computer simulations that can make them more powerful than laboratory models. In particular, computer simulations that provide an explicit representation for a set of interrelated concepts allow students to perceive what cannot be directly observed in laboratory experiments: representations for the concepts and ideas used for interpreting the experiment. Further, by embedding the relevant physical laws directly into the program code, these simulations allow for genuine discoveries. We describe how we applied these ideas in developing a computer simulation for a particular set of purposes: to help students grasp the distinction between mass and density and to understand the phenomenon of flotation in terms of these concepts. Finally, we reflect on the kinds of activities such conceptually enhanced simulations allow that may be important in bringing about the desired conceptual change.
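
    The mass-versus-density distinction at the heart of the flotation simulation is easy to state computationally; a tiny Python sketch (values illustrative, water density taken as 1 g/cm³):

```python
def floats(mass_g, volume_cm3, fluid_density_g_cm3=1.0):
    """Archimedes: an object floats exactly when its density is below the fluid's."""
    return mass_g / volume_cm3 < fluid_density_g_cm3

def submerged_fraction(mass_g, volume_cm3, fluid_density_g_cm3=1.0):
    """For a floating object, the fraction of its volume below the surface."""
    return min(1.0, (mass_g / volume_cm3) / fluid_density_g_cm3)

# Equal masses, different volumes: density, not mass, decides flotation.
print(floats(50.0, 100.0))   # → True  (0.5 g/cm^3 floats in water)
print(floats(50.0, 25.0))    # → False (2.0 g/cm^3 sinks)
```

    Making the ratio explicit, rather than leaving it implicit in observed behavior, is precisely the kind of conceptual representation the paper argues simulations can surface.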

  15. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; White, Jeffery A.

    2004-01-01

    The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally symmetric, total variation diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid), Shuttle Orbiter (viscous, chemical nonequilibrium) and comparisons to the structured grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate) are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options; in general, high-aspect-ratio tetrahedral elements complicate the simulation of high-Reynolds-number viscous flow as compared to locally structured meshes aligned with the flow.

  16. Computer simulations of particle-surface dynamics

    SciTech Connect

    Karo, A.M.; Hiskes, J.R.; DeBoni, T.M.

    1986-10-01

    Our simulations of particle-surface dynamics use the molecular dynamics codes that we have developed over the past several years. The initial state of a molecule and the parameters defining the incoming trajectory can be specifically described or randomly selected. Statistical analyses of the states of the particles and their trajectories following wall collisions are carried out by the code. We have carried out calculations at high center-of-mass energies and low incidence angles and have examined the survival fraction of molecules and its dependence upon the incoming trajectory. We also report on preliminary efforts to simulate sputtering and recombinant desorption processes, since the recombinant desorption of hydrogen from typical wall materials may be an important source of vibrationally excited hydrogen in volume sources; for surface sources, the presence of occluded hydrogen may affect the concentration of atomic species.

  17. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.

    1981-01-01

    A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.
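
    The record names the two ingredients, a pair potential energy function and a Monte Carlo procedure, without detail. A minimal sketch using a Lennard-Jones pair PEF and a Metropolis walk over particle positions (a toy stand-in for the more involved Si-O potential the study actually used):

```python
import numpy as np

rng = np.random.default_rng(0)

def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def total_energy(pos):
    """Sum of pair energies (open boundaries, no cutoff)."""
    e = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            e += lj(np.linalg.norm(pos[i] - pos[j]))
    return e

def metropolis(pos, beta=2.0, step=0.1, sweeps=20):
    """Metropolis Monte Carlo: trial single-particle moves accepted with
    probability min(1, exp(-beta * dE)); returns positions and final energy."""
    e = total_energy(pos)
    for _ in range(sweeps * len(pos)):
        i = rng.integers(len(pos))
        old = pos[i].copy()
        pos[i] = pos[i] + rng.uniform(-step, step, 3)
        e_trial = total_energy(pos)
        if rng.random() < np.exp(min(0.0, -beta * (e_trial - e))):
            e = e_trial          # accept the move
        else:
            pos[i] = old         # reject: restore the old position
    return pos, e
```

    Equilibrium averages (energy, pair correlations at surfaces and in the bulk) are then accumulated over the accepted configurations.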

  18. Computer simulation of a geomagnetic substorm

    NASA Technical Reports Server (NTRS)

    Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

    1981-01-01

    A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

  19. The Use of Computer Simulations in High School Curricula.

    ERIC Educational Resources Information Center

    Visich, Marian, Jr.; Braun, Ludwig

    The Huntington Computer Project has developed 17 simulation games which can be used for instructional purposes in high schools. These games were designed to run on digital computers and to deal with material from either biology, physics, or social studies. Distribution was achieved through the Digital Equipment Corporation, which disseminated…

  20. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

    ERIC Educational Resources Information Center

    Kinkade, Raenel E.; And Others

    1995-01-01

    A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case presentation on…

  1. Application Of Computer Simulation To The Entertainment Industry

    NASA Astrophysics Data System (ADS)

    Mittelman, Phillip S.

    1983-10-01

    Images generated by computer have started to appear in feature films (TRON, Star Trek II), in television commercials and in animated films. Of particular interest is the use of computer generated imagery which simulates the images which a real camera might have made if the imaged objects had been real.

  2. Frontiers in the Teaching of Physiology. Computer Literacy and Simulation.

    ERIC Educational Resources Information Center

    Tidball, Charles S., Ed.; Shelesnyak, M. C., Ed.

    Provided is a collection of papers on computer literacy and simulation originally published in The Physiology Teacher, supplemented by additional papers and a glossary of terms relevant to the field. The 12 papers are presented in five sections. An affirmation of conventional physiology laboratory exercises, coping with computer terminology, and…

  3. COFLO: A Computer Aid for Teaching Ecological Simulation.

    ERIC Educational Resources Information Center

    Levow, Roy B.

    A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

  4. Cardiovascular Physiology Teaching: Computer Simulations vs. Animal Demonstrations.

    ERIC Educational Resources Information Center

    Samsel, Richard W.; And Others

    1994-01-01

    At the introductory level, the computer provides an effective alternative to using animals for laboratory teaching. Computer software can simulate the operation of multiple organ systems. Advantages of software include alteration of variables that are not easily changed in vivo, repeated interventions, and cost-effective hands-on student access.…

  5. Use of Computer Simulations in Microbial and Molecular Genetics.

    ERIC Educational Resources Information Center

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  6. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  7. Advanced 3D Photocathode Modeling and Simulations Final Report

    SciTech Connect

    Dimitre A Dimitrov; David L Bruhwiler

    2005-06-06

    High brightness electron beams required by the proposed Next Linear Collider demand strong advances in photocathode electron gun performance. Significant improvement in the production of such beams with rf photocathode electron guns is hampered by the lack of high-fidelity simulations. The critical missing piece in existing gun codes is a physics-based, detailed treatment of the very complex and highly nonlinear photoemission process.

  8. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer-based evaluation simulation developed in cardiology at the University of Iowa College of Medicine to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  9. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  10. Students from Many Countries Use Computers to Simulate International Negotiations.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1991-01-01

    College students around the world confer by computer in the International Communications and Negotiation Simulation. The simulation is offered by schools within the curriculum or as an extracurricular activity, with faculty as coordinators. Student teams are given scenarios and country assignments, prepare a position paper, and participate in the…

  11. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  12. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables as…

  13. Computer Simulation as an Aid to Managers of Training.

    ERIC Educational Resources Information Center

    Wagner, Harold; Butler, Patrick J.

    Research investigated computer simulations of a hypothetical self-paced training program to determine the utility of this technique as a planning aid for Army training program managers. The General Purpose Simulation System (GPSS) was selected as the programing language and the study was divided into three stages. In Stage I, the daily number of…

  14. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    ERIC Educational Resources Information Center

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  15. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  16. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  17. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The reaction kinetics ablation program (REKAP), developed to simulate ablation of various materials, provides mathematical formulations for computer programs that can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.
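
    REKAP itself is not reproduced here. As a generic sketch of the parabolic heat-conduction problems such ablation programs solve (a standard explicit FTCS scheme with invented parameters, not REKAP's nonsymmetrical formulation):

```python
import numpy as np

def heat_explicit(u, alpha, dx, dt, steps):
    """Explicit (FTCS) finite-difference solver for u_t = alpha * u_xx
    with fixed boundary values; stable only for alpha*dt/dx**2 <= 1/2."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "time step violates the explicit stability limit"
    for _ in range(steps):
        u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u
```

    Because r <= 1/2, each new value is a convex combination of old values, so the scheme obeys a discrete maximum principle (no overshoot), which is why the stability limit matters in practice.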

  18. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
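
    The key idea, replacing each subsystem analysis with a cheap analytic function while preserving the data couplings, can be caricatured in a few lines. Both subsystem functions below are invented for illustration; a real simulator would choose them to mimic the couplings of the actual engineering system:

```python
def subsystem_a(y):
    """Stand-in for an expensive analysis (e.g. aerodynamics); depends on b's output."""
    return 0.5 * y + 1.0

def subsystem_b(x):
    """Stand-in for a second analysis (e.g. structures); depends on a's output."""
    return 0.25 * x + 0.5

def converge_coupling(tol=1e-10, max_iter=100):
    """Fixed-point iteration over the subsystem couplings, the kind of coupled
    solve a candidate multilevel optimization strategy must perform repeatedly."""
    x = y = 0.0
    for _ in range(max_iter):
        x_new = subsystem_a(y)
        y_new = subsystem_b(x_new)
        if abs(x_new - x) < tol and abs(y_new - y) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    raise RuntimeError("coupling iteration did not converge")
```

    Because the functions cost essentially nothing to evaluate, candidate optimization algorithms can be compared over thousands of such coupled solves.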

  19. SciDAC Advances and Applications in Computational Beam Dynamics

    SciTech Connect

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-06-26

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications.

  20. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  1. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  2. Computer Simulations of Coupled Piano Strings

    NASA Astrophysics Data System (ADS)

    Albert, Destiny L.

    1997-03-01

    The behavior of coupled piano strings is studied using a finite difference scheme. The coupling of the strings produces motion in two transverse directions, parallel and perpendicular to the soundboard. The induced sound shows two decay rates, a rapid decay followed by a slow one. These effects are in agreement with experimental results (Weinreich, Gabriel. "The Coupled Motion of Piano Strings." Scientific American, January 1979). Our simulations suggest that the motion of the end supports contributes to the elliptical motion of the strings. Furthermore, multiple strings contribute to the quality of the sound produced by a piano.
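
    The two decay rates arise because the shared, lossy termination damps the in-phase string motion much more strongly than the out-of-phase motion. A toy two-oscillator version (not the authors' finite-difference string model; unit masses and invented constants) reproduces the effect:

```python
import numpy as np

def simulate(x0, v0, k=1.0, gamma=0.05, dt=0.01, steps=2000):
    """Two identical 'strings' as unit-mass oscillators coupled through a lossy
    bridge: the damping force is proportional to the SUM of the velocities, so
    the symmetric mode decays quickly while the antisymmetric mode barely decays."""
    x = np.array(x0, float)
    v = np.array(v0, float)
    amps = []
    for _ in range(steps):
        a = -k * x - gamma * (v[0] + v[1])  # spring force + shared bridge drag
        v = v + a * dt                      # semi-implicit (symplectic) Euler
        x = x + v * dt
        amps.append(np.abs(x).max())
    return np.array(amps)

sym = simulate([1.0, 1.0], [0.0, 0.0])    # strings in phase: fast decay
anti = simulate([1.0, -1.0], [0.0, 0.0])  # strings out of phase: slow decay
```

    A struck note starts mostly in the strongly damped mode and is left, after the initial drop, with the slowly decaying remainder, which is the two-stage decay the simulations report.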

  3. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  4. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  5. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95 and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
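
    The reliability indices quoted here are standard and easy to compute. A short sketch of both (toy data, not the study's responses):

```python
import numpy as np

def kr20(items):
    """Kuder-Richardson formula 20 for dichotomous (0/1) items.
    `items` is a 2-D array: rows = respondents, columns = items."""
    items = np.asarray(items, float)
    k = items.shape[1]
    p = items.mean(axis=0)                     # proportion endorsing each item
    var_total = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / var_total)

def pearson_r(x, y):
    """Pearson product-moment correlation (used here for test-retest)."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return (x * y).sum() / np.sqrt((x ** 2).sum() * (y ** 2).sum())
```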

  6. Reliability of an Interactive Computer Program for Advance Care Planning

    PubMed Central

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83–0.95 and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  7. Optical design and characterization of an advanced computational imaging system

    NASA Astrophysics Data System (ADS)

    Shepard, R. Hamilton; Fernandez-Cull, Christy; Raskar, Ramesh; Shi, Boxin; Barsi, Christopher; Zhao, Hang

    2014-09-01

    We describe an advanced computational imaging system with an optical architecture that enables simultaneous and dynamic pupil-plane and image-plane coding accommodating several task-specific applications. We assess the optical requirement trades associated with custom and commercial-off-the-shelf (COTS) optics and converge on the development of two low-cost and robust COTS testbeds. The first is a coded-aperture programmable pixel imager employing a digital micromirror device (DMD) for image plane per-pixel oversampling and spatial super-resolution experiments. The second is a simultaneous pupil-encoded and time-encoded imager employing a DMD for pupil apodization or a deformable mirror for wavefront coding experiments. These two testbeds are built to leverage two MIT Lincoln Laboratory focal plane arrays - an orthogonal transfer CCD with non-uniform pixel sampling and on-chip dithering and a digital readout integrated circuit (DROIC) with advanced on-chip per-pixel processing capabilities. This paper discusses the derivation of optical component requirements, optical design metrics, and performance analyses for the two testbeds built.

  8. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  9. Computer simulation of vasectomy for wolf control

    USGS Publications Warehouse

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
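
    The authors' demographic rules are not given in the abstract. A deliberately crude annual projection (every rate below is invented for illustration, not taken from the study) reproduces the qualitative pattern reported: sterilization suppresses growth when immigration is low, while immigration blunts its effect:

```python
def project(n0, years, litter=4.0, mortality=0.35, sterile_frac=0.0, immigration=0.0):
    """Toy annual wolf projection: births scale with the non-sterilized
    fraction, a flat mortality rate applies, then immigrants arrive."""
    n = float(n0)
    per_capita_births = litter / 5.0  # assumed: one breeding pair per ~5 wolves
    for _ in range(years):
        births = n * per_capita_births * (1.0 - sterile_frac)
        n = (n + births) * (1.0 - mortality) + immigration
    return n

untreated = project(50, 10)
sterilized = project(50, 10, sterile_frac=0.5)
with_immigration = project(50, 10, sterile_frac=0.5, immigration=20)
```

    With these invented rates, sterilizing half the breeders flips the annual growth factor below one, but a steady stream of immigrants pushes the treated population back up, mirroring the immigration dependence the simulations found.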

  10. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to

  11. Integrating advanced materials simulation techniques into an automated data analysis workflow at the Spallation Neutron Source

    SciTech Connect

    Borreguero Calvo, Jose M; Campbell, Stuart I; Delaire, Olivier A; Doucet, Mathieu; Goswami, Monojoy; Hagen, Mark E; Lynch, Vickie E; Proffen, Thomas E; Ren, Shelly; Savici, Andrei T; Sumpter, Bobby G

    2014-01-01

    This presentation will review developments on the integration of advanced modeling and simulation techniques into the analysis step of experimental data obtained at the Spallation Neutron Source. A workflow framework for the purpose of refining molecular mechanics force-fields against quasi-elastic neutron scattering data is presented. The workflow combines software components to submit model simulations to remote high performance computers, a message broker interface for communications between the optimizer engine and the simulation production step, and tools to convolve the simulated data with the experimental resolution. A test application shows the correction to a popular fixed-charge water model in order to account for polarization effects due to the presence of solvated ions. Future enhancements to the refinement workflow are discussed. This work is funded through the DOE Center for Accelerating Materials Modeling.

  12. A heterogeneous computing environment for simulating astrophysical fluid flows

    NASA Technical Reports Server (NTRS)

    Cazes, J.

    1994-01-01

    In the Concurrent Computing Laboratory in the Department of Physics and Astronomy at Louisiana State University we have constructed a heterogeneous computing environment that permits us to routinely simulate complicated three-dimensional fluid flows and to readily visualize the results of each simulation via three-dimensional animation sequences. An 8192-node MasPar MP-1 computer with 0.5 GBytes of RAM provides 250 MFlops of execution speed for our fluid flow simulations. Utilizing the parallel virtual machine (PVM) language, at periodic intervals data is automatically transferred from the MP-1 to a cluster of workstations where individual three-dimensional images are rendered for inclusion in a single animation sequence. Work is underway to replace executions on the MP-1 with simulations performed on the 512-node CM-5 at NCSA and to simultaneously gain access to more potent volume rendering workstations.

  13. Positive Wigner Functions Render Classical Simulation of Quantum Computation Efficient

    NASA Astrophysics Data System (ADS)

    Mari, A.; Eisert, J.

    2012-12-01

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.
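
    The heart of the classical algorithm is that a positive Wigner function can be treated as an ordinary probability distribution: sample phase-space points from it and average the outcome values. A toy version (a made-up positive distribution over a 3x3 discrete phase space, not the paper's full discrete-Wigner construction):

```python
import numpy as np

rng = np.random.default_rng(1)

# A manifestly positive quasi-probability distribution over the 9 points of a
# 3x3 discrete phase space, and an outcome value attached to each point.
W = np.full(9, 1.0 / 9.0)
outcome = np.arange(9, dtype=float)

exact = (W * outcome).sum()  # expectation value predicted by the distribution

# Classical simulation step: sample phase-space points from W, average outcomes.
samples = rng.choice(9, size=200_000, p=W)
estimate = outcome[samples].mean()
```

    If any quasi-probability were negative there would be nothing sensible to sample, which is exactly where this classical algorithm, and its Gottesman-Knill-style efficiency, breaks down.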

  14. Computational methods for coupling microstructural and micromechanical materials response simulations

    SciTech Connect

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay among phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.
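
    The iterative-linking strategy can be sketched in generic form (a hypothetical toy, not the project's codes): two solvers alternately update a shared state, exchanging fields until neither changes between passes.

```python
# Toy iterative coupling of a "microstructure" solver and a "mechanics"
# solver. Both update rules below are invented stand-ins, not the
# physics of the alumina or CuSi systems studied in the project.

def microstructure_step(grain_size, stress):
    # Relax grain size toward a stress-dependent equilibrium (toy rule).
    return grain_size + 0.2 * (2.0 / (1.0 + stress) - grain_size)

def mechanics_step(grain_size):
    # Hall-Petch-like response: stress falls as grains coarsen (toy rule).
    return 1.0 / grain_size**0.5

def couple(grain_size, tol=1e-8, max_iters=1000):
    """Alternate the two solvers until the exchanged fields converge."""
    stress = mechanics_step(grain_size)
    for _ in range(max_iters):
        new_size = microstructure_step(grain_size, stress)
        new_stress = mechanics_step(new_size)
        if abs(new_size - grain_size) < tol and abs(new_stress - stress) < tol:
            break
        grain_size, stress = new_size, new_stress
    return new_size, new_stress

final_size, final_stress = couple(grain_size=4.0)
```

    Dynamic coupling, by contrast, would exchange fields at every time step of both solvers rather than iterating each pass to convergence.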

  15. Computer Simulations of Supercooled Liquids and Glasses

    NASA Astrophysics Data System (ADS)

    Kob, Walter

    Glasses are materials that are ubiquitous in our daily life. We find them in such diverse items as window panes, optical fibers, computer chips, and ceramics, all of which are oxide glasses, as well as in food, foams, polymers, and gels, which are mainly of organic nature. Roughly speaking, glasses are solid materials that have no translational or orientational order on scales beyond O(10) diameters of the constituent particles (atoms, colloids, …) [1]. Note that these materials are not necessarily homogeneous since, e.g., alkali glasses such as Na2O-SiO2 show (disordered!) structural features on the length scale of 6-10 Å (compared to the interatomic distance of 1-2 Å) and gels can have structural inhomogeneities that extend up to macroscopic length scales.

  16. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods.

    PubMed

    Lee, Anthony; Yau, Christopher; Giles, Michael B; Doucet, Arnaud; Holmes, Christopher C

    2010-12-01

    We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups of 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
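
    The paper's GPU kernels are not reproduced here, but the data-parallel structure they exploit can be sketched with vectorised array code (a CPU stand-in for GPU threads, with invented parameters): thousands of Markov chains advance independently in lock-step.

```python
import numpy as np

# Vectorized random-walk Metropolis on a standard normal target.
# Each of the 4096 chains is updated simultaneously by array
# operations -- the same lock-step parallelism a GPU kernel applies
# across threads.
rng = np.random.default_rng(1)
n_chains, n_steps = 4096, 2000

x = rng.normal(size=n_chains)  # one state per chain
for _ in range(n_steps):
    prop = x + 0.8 * rng.normal(size=n_chains)  # proposals, all chains
    log_alpha = 0.5 * (x**2 - prop**2)          # log acceptance ratio
    accept = np.log(rng.uniform(size=n_chains)) < log_alpha
    x = np.where(accept, prop, x)               # accept or keep, per chain

est_mean, est_var = x.mean(), x.var()  # pooled estimates across chains
```

    On a GPU each per-chain update would map to one thread; the reported speedups come from running tens of thousands of such threads concurrently.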

  17. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, a bot army can be extremely large, comprising tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
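
    The compartment structure of an MSEIR-style spread model can be sketched as a small ODE system (all rates invented for illustration, not calibrated to real botnets): hosts move from patched (M) to susceptible (S), to dormant-compromised (E), to active bot (I), to cleaned (R).

```python
# Toy MSEIR model of botnet propagation, integrated with forward Euler.
# Compartments: M patched, S susceptible, E compromised-but-dormant,
# I active bots, R cleaned hosts. All rate constants are hypothetical.
def mseir_step(state, dt, delta=0.01, beta=0.5, sigma=0.2, gamma=0.05):
    M, S, E, I, R = state
    N = M + S + E + I + R
    contact = beta * S * I / N        # new compromises per unit time
    dM = -delta * M                   # patch protection wears off
    dS = delta * M - contact
    dE = contact - sigma * E          # dormant bots await activation
    dI = sigma * E - gamma * I        # activation vs. cleanup
    dR = gamma * I
    return tuple(v + dt * d for v, d in zip(state, (dM, dS, dE, dI, dR)))

state = (1000.0, 8990.0, 0.0, 10.0, 0.0)  # 10 seed bots among 10,000 hosts
for _ in range(5000):
    state = mseir_step(state, dt=0.1)
```

    The jump-diffusion component the authors propose would replace the deterministic contact term with a stochastic process; the skeleton above shows only the compartment bookkeeping.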

  18. Advancement of DOE's EnergyPlus Building Energy Simulation Program

    SciTech Connect

    Gu, Lixing; Shirey, Don; Raustad, Richard; Nigusse, Bereket; Sharma, Chandan; Lawrie, Linda; Strand, Rick; Pedersen, Curt; Fisher, Dan; Lee, Edwin; Witte, Mike; Glazer, Jason; Barnaby, Chip

    2011-09-30

    EnergyPlus™ is a new-generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making regarding zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. The 5-year project was managed by the National Energy Technology Laboratory and was divided into five budget periods between 2006 and 2011. During the project period, 11 versions of EnergyPlus were released. This report summarizes work performed by an EnergyPlus development team led by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC). The team members consisted of DHL Consulting, C. O. Pedersen Associates, University of Illinois at Urbana-Champaign, Oklahoma State University, GARD Analytics, Inc., and WrightSoft Corporation. The project tasks involved new feature development, testing and validation, user support and training, and general EnergyPlus support. The team developed 146 new features during the 5-year period to advance EnergyPlus's capabilities: 7 in budget period 1, 19 in period 2, 36 in period 3, 41 in period 4, and 43 in period 5. The testing and validation task focused on running the test suite and publishing reports, developing new IEA test suite cases, testing and validating new source code, addressing change requests, and creating and testing installation packages. The user support and training task provided support for users and interface developers and organized and taught workshops. The general support task involved upgrading StarTeam (team sharing) software and updating existing utility software. The project met the DOE objectives and completed all tasks successfully.
Although the EnergyPlus software was enhanced significantly

  19. Computing abstraction hierarchies by numerical simulation

    SciTech Connect

    Bundy, A.; Giunchiglia, F.; Sebastiani, R.; Walsh, T.

    1996-12-31

    We present a novel method for building ABSTRIPS-style abstraction hierarchies in planning. The aim of this method is to minimize the amount of backtracking between abstraction levels. Previous approaches have determined the criticality of operator preconditions by reasoning about plans directly. Here, we adopt a simpler and faster approach where we use numerical simulation of the planning process. We demonstrate the theoretical advantages of our approach by identifying some simple properties lacking in previous approaches but possessed by our method. We demonstrate the empirical advantages of our approach by a set of four benchmark experiments using the ABTWEAK system. We compare the quality of the abstraction hierarchies generated with those built by the ALPINE and HIGHPOINT algorithms.

  20. Computer simulations of adsorbed liquid crystal films

    NASA Astrophysics Data System (ADS)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.