A geostationary Earth orbit satellite model using Easy Java Simulation
NASA Astrophysics Data System (ADS)
Wee, Loo Kang; Hwee Goh, Giam
2013-01-01
We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic 3D view and associated learning in the real world; (2) comparative visualization of permanent geostationary satellites; (3) examples of non-geostationary orbits of different rotation senses, periods and planes; and (4) an incorrect physics model for conceptual discourse. General feedback from the students has been relatively positive, and we hope teachers will find the computer model useful in their own classes.
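For readers who want to experiment outside EJS, the simplified physics reduces to a single constant-angular-velocity relation. Below is a minimal Python sketch of that model (our own illustration, not the authors' Java/EJS code); the rotation rate and geostationary radius are standard values.

```python
import numpy as np

# Constant-angular-velocity model of a geostationary satellite: a minimal
# sketch of the simplified physics, not the authors' EJS implementation.
OMEGA_EARTH = 7.2921159e-5          # Earth's sidereal rotation rate, rad/s
R_GEO = 42_164_000.0                # geostationary orbital radius, m

def position(t, phase=0.0):
    """Equatorial-plane position at time t for theta(t) = phase + omega*t."""
    theta = phase + OMEGA_EARTH * t
    return np.array([R_GEO * np.cos(theta), R_GEO * np.sin(theta), 0.0])

# One sidereal day (86164.1 s) returns the satellite to its starting point,
# so it hovers over a fixed longitude.
print(position(0.0), position(86164.1))
```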
ERIC Educational Resources Information Center
Roach, Linda E., Ed.
This document contains the following papers on science from the SITE (Society for Information Technology & Teacher Education) 2001 conference: (1) "Using a Computer Simulation before Dissection To Help Students Learn Anatomy" (Joseph Paul Akpan and Thomas Andre); (2) "EARTH2CLASS: A Unique Workshop/On-Line/Distance-Learning…
Manual for a workstation-based generic flight simulation program (LaRCsim), version 1.4
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce
1995-01-01
LaRCsim is a set of ANSI C routines that implement a full set of equations of motion for a rigid-body aircraft in atmospheric and low-earth orbital flight, suitable for pilot-in-the-loop simulations on a workstation-class computer. All six rigid-body degrees of freedom are modeled. The modules provided include calculations of the typical aircraft rigid-body simulation variables, earth geodesy, gravity and atmospheric models, and support for several data recording options. Features and limitations of the current version include English units of measure; a 1962 atmosphere model in cubic spline lookup form, ranging from sea level to 75,000 feet; and a rotating oblate spheroidal earth model, with aircraft C.G. coordinates in both geocentric and geodetic axes. Angular integrations are done using quaternion state variables. Vehicle X-Z symmetry is assumed.
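The quaternion-based angular integration mentioned above can be illustrated with a short sketch. This is a generic first-order quaternion kinematics step with renormalization, a common pattern in rigid-body simulators; the abstract does not detail LaRCsim's actual integrator, so treat this as an assumption about the general technique.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_attitude(q, omega_body, dt):
    """One Euler step of q_dot = 0.5 * q (x) [0, omega]; renormalize to
    suppress numerical drift, as quaternion-based simulations typically do."""
    q_dot = 0.5 * quat_mult(q, np.r_[0.0, omega_body])
    q = q + q_dot * dt
    return q / np.linalg.norm(q)

q = np.array([1.0, 0.0, 0.0, 0.0])                          # identity attitude
q = integrate_attitude(q, np.array([0.0, 0.0, 0.1]), 0.01)  # 0.1 rad/s yaw rate
```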
Earth System Science Education in a General Education Context: Two Case Studies
NASA Astrophysics Data System (ADS)
Herring, J. A.
2004-12-01
The teaching of Earth System Science (ESS) to non-science majors is examined in a large lecture format class at a state university and in small classes with a significant research component at a liberal arts college. Quantitative and qualitative evaluations of both approaches reveal some of the challenges educators face as they work to advance students' integrated understanding of the Earth system. Student learning on selected concepts in the large lecture format class was poorly or negatively correlated with the amount of class time spent on the topic, even when the time was spent in teacher-student dialogue or in cooperative learning activities. The small class format emphasized student participation in research, which was found to be particularly effective when the class operated as a three-week intensive block, and student use of computer models to simulate the dynamics of complex systems, which was found to be more effective when the class was held in a ten-week quarter. This study provides some clarification as to the utility of specific pedagogical frameworks (such as constructivism and experiential education) in the teaching of ESS to a general education audience and emphasizes the importance of carefully defining educational goals (both cognitive and affective) as a part of the curriculum design.
Toward an in-situ analytics and diagnostics framework for earth system models
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Wolf, Matthew; Rasch, Philip; Klasky, Scott; Williams, Dean; Jacob, Rob; Ma, Po-Lun; Kuo, Kwo-Sen
2017-04-01
The development roadmaps for many earth system models (ESM) aim for globally cloud-resolving models targeting the pre-exascale and exascale systems of the future. The ESMs will also incorporate more complex physics, chemistry and biology, thereby vastly increasing the fidelity of the information content simulated by the model. We will then be faced with an unprecedented volume of simulation output that would need to be processed and analyzed concurrently in order to derive the valuable scientific results. We are already at this threshold with higher-resolution simulations of the current generation of ESMs. Currently, the nominal I/O throughput in the Community Earth System Model (CESM) via the Parallel IO (PIO) library is around 100 MB/s. If we look at the high-frequency I/O requirements, it would require an additional 1 GB / simulated hour, translating to roughly 4 mins wallclock / simulated day => 24.33 wallclock hours / simulated model year => 1,752,000 core-hours of charge per simulated model year on the Titan supercomputer at the Oak Ridge Leadership Computing Facility. There is also a pending need for a 3X increase in the volume of simulation output. Meanwhile, many ESMs use instrument simulators to run forward models to compare model simulations against satellite and ground-based instruments, such as radars and radiometers. The CFMIP Observation Simulator Package (COSP) is used in CESM as well as in the Accelerated Climate Model for Energy (ACME), one of the ESMs specifically targeting current and emerging leadership-class computing platforms. These simulators can be computationally expensive, accounting for as much as 30% of the computational cost. Hence the data are often written to output files that are then used for offline calculations. Again, the I/O bottleneck becomes a limitation. Detection and attribution studies also use large volumes of data for pattern recognition and feature extraction to analyze weather and climate phenomena such as tropical cyclones, atmospheric rivers, blizzards, etc. It is evident that ESMs need an in-situ framework to decouple the diagnostics and analytics from the prognostics and physics computations of the models so that the diagnostic computations can be performed concurrently without limiting model throughput. We are designing a science-driven online analytics framework for earth system models. Our approach is to adopt several data workflow technologies, such as the Adaptable IO System (ADIOS), being developed under the U.S. Exascale Computing Project (ECP), and integrate these to allow for extreme-performance IO, in situ workflow integration, and science-driven analytics and visualization, all in an easy-to-use computational framework. This will allow science teams to write data 100-1000 times faster and seamlessly move from post-processing the output for validation and verification purposes to performing these calculations in situ. We can readily envision a near-term future where earth system models like ACME and CESM will have to address not only the challenges of the volume of data but also the velocity of the data. Earth system models of the future in the exascale era, as they incorporate more complex physics at higher resolutions, will be able to analyze more simulation content without having to compromise targeted model throughput.
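The decoupling idea is easy to caricature in a few lines: the prognostic loop hands each output slab to a concurrent consumer instead of blocking on I/O and analysis. The sketch below is purely illustrative and merely stands in for a staging framework such as ADIOS; the field shape and diagnostic are made up.

```python
import queue
import threading

import numpy as np

# Minimal sketch of decoupling diagnostics from the prognostic loop: the
# model thread hands each output slab to a bounded queue and keeps
# time-stepping while a worker computes diagnostics concurrently.
slabs = queue.Queue(maxsize=4)

def diagnostics_worker():
    while True:
        step, field = slabs.get()
        if field is None:                 # sentinel: model run finished
            return
        print(f"step {step}: global mean = {field.mean():.4f}")

worker = threading.Thread(target=diagnostics_worker)
worker.start()

field = np.zeros((180, 360))              # a 1-degree global field, for example
for step in range(10):                    # the "prognostic" loop
    field += 0.01 * np.random.randn(*field.shape)   # stand-in for model physics
    slabs.put((step, field.copy()))       # blocks only if the queue is full
slabs.put((None, None))
worker.join()
```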
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of universities and research institutions has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete element/lattice solid models, particle-in-cell large deformation finite-element methods, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building and interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives, including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; and new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Automatic Computer Mapping of Terrain
NASA Technical Reports Server (NTRS)
Smedes, H. W.
1971-01-01
Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted; their extent was greatly reduced by using preprocessing techniques.
Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics
ERIC Educational Resources Information Center
Rutten, Nico; van der Veen, Jan T.; van Joolingen, Wouter R.
2015-01-01
In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed…
Modeling, simulation, and analysis of optical remote sensing systems
NASA Technical Reports Server (NTRS)
Kerekes, John Paul; Landgrebe, David A.
1989-01-01
Remote sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process, while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990s. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler, method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms, and an estimate is made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated, with several interesting results.
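The abstract does not give the exact accuracy estimator, but a standard way to turn class mean vectors and covariance matrices into a classification-accuracy estimate is through pairwise Gaussian separability measures such as the Bhattacharyya distance, sketched below; the two-band class statistics are illustrative values only.

```python
import numpy as np

def bhattacharyya(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two Gaussian class models."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu2 - mu1
    term_mean = 0.125 * diff @ np.linalg.solve(cov, diff)
    term_cov = 0.5 * np.log(np.linalg.det(cov) /
                            np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term_mean + term_cov

# Atmosphere/sensor/processing models would modify mu and cov before this step.
mu_a, cov_a = np.array([0.2, 0.5]), np.diag([0.01, 0.02])   # class A statistics
mu_b, cov_b = np.array([0.3, 0.4]), np.diag([0.02, 0.01])   # class B statistics
db = bhattacharyya(mu_a, cov_a, mu_b, cov_b)
# Bhattacharyya bound on pairwise error for equal priors: P_e <= 0.5 * exp(-D_B)
print("pairwise error bound:", 0.5 * np.exp(-db))
```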
Computer simulation results of attitude estimation of earth orbiting satellites
NASA Technical Reports Server (NTRS)
Kou, S. R.
1976-01-01
Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., smaller integration and roundoff errors) over a Kalman filter.
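As a point of reference for the filters compared above, a single predict/update cycle of a linear Kalman filter looks like the following. This is a textbook sketch, not the report's BASIC code; the one-axis attitude state and noise levels are assumptions for illustration.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict the state and covariance forward one step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measurement z.
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy 1-axis attitude state [angle, rate] with a noisy angle measurement.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-6 * np.eye(2)                         # process noise (disturbances)
R = np.array([[1e-3]])                       # sensor noise
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, np.array([0.05]), F, Q, H, R)
```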
Knowledge Discovery from Climate Data using Graph-Based Methods
NASA Astrophysics Data System (ADS)
Steinhaeuser, K.
2012-04-01
Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three paradigms being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
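A representative graph-based method from this literature builds a network whose nodes are grid locations and whose edges link strongly correlated climate time series. The sketch below shows only that generic construction step; the threshold and the random data are placeholders, and real studies vary in similarity measure and significance testing.

```python
import numpy as np

def correlation_graph(series, threshold=0.8):
    """series: (n_locations, n_timesteps) array -> list of edges (i, j)
    linking locations whose time series correlate above the threshold."""
    corr = np.corrcoef(series)
    n = corr.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(corr[i, j]) >= threshold]

data = np.random.randn(50, 240)        # 50 grid cells, 20 years of monthly data
edges = correlation_graph(data, 0.5)   # illustrative threshold
print(len(edges), "edges")
```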
Applying Parallel Adaptive Methods with GeoFEST/PYRAMID to Simulate Earth Surface Crustal Dynamics
NASA Technical Reports Server (NTRS)
Norton, Charles D.; Lyzenga, Greg; Parker, Jay; Glasscoe, Margaret; Donnellan, Andrea; Li, Peggy
2006-01-01
This viewgraph presentation reviews the use of Adaptive Mesh Refinement (AMR) in simulating the crustal dynamics of the Earth's surface. AMR simultaneously improves solution quality, time to solution, and computer memory requirements when compared to generating and running on a globally fine mesh. The use of AMR in simulating the dynamics of the Earth's surface is spurred by proposed future NASA missions, such as InSAR, for Earth surface deformation and other measurements. These missions will require support for large-scale adaptive numerical methods using AMR to model observations. AMR was chosen because it has been successful in computational fluid dynamics for predictive simulation of complex flows around complex structures.
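The core AMR mechanism, refining only where a local error indicator is large, can be shown in a one-dimensional toy. This is our own sketch of the general idea, unrelated to the GeoFEST/PYRAMID code itself; the error values are made up.

```python
import numpy as np

def refine(cells, error, tol):
    """cells: list of (left, right) intervals; split cells whose local
    error indicator exceeds tol, leaving the rest coarse."""
    out = []
    for (a, b), e in zip(cells, error):
        if e > tol:
            mid = 0.5 * (a + b)
            out += [(a, mid), (mid, b)]   # refined where the error is large
        else:
            out.append((a, b))
    return out

cells = [(i / 4, (i + 1) / 4) for i in range(4)]
err = np.array([0.01, 0.50, 0.02, 0.03])  # stand-in error estimate per cell
print(refine(cells, err, 0.1))            # only the second cell is split
```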
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahowald, Natalie
Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxides, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite being more potent greenhouse gases than carbon dioxide, complicate empirical studies to provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model's Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process-level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. In this proposal we will develop and apply surrogate algorithms that a) were especially developed for computationally expensive simulations like the CLM-ME/N models; b) were (in the earlier surrogate optimization Stochastic RBF) demonstrated to perform very well on computationally expensive complex partial differential equations in earth science with limited numbers of simulations; and c) will be (as part of the proposed research) significantly improved by adding asynchronous parallelism and early truncation of unsuccessful simulations, and by improving both serial and parallel performance through the use of derivative and sensitivity information from global and local surrogate approximations S(x). The algorithm development and testing will be focused on the CLM-ME/N model application, but the methods are general and are expected to also perform well on optimization for parameter estimation of other climate models and other classes of continuous multimodal optimization problems arising from complex simulation models. In addition, this proposal will compile available datasets of emissions of methane, nitrous oxides and reactive nitrogen species and develop protocols for site-level comparisons with the CLM-ME/N. Once the model parameters are optimized against site-level data, the model will be simulated at the global level and compared to atmospheric concentration measurements for the current climate, and future emissions will be estimated using climate change as simulated by the CESM. This proposal combines experts in earth system modeling, optimization, computer science, and process-level understanding of soil gas emissions in an interdisciplinary team in order to improve the modeling of methane and nitrogen gas emissions.
This proposal thus meets the requirements of the SciDAC RFP by integrating state-of-the-art computer science and earth system science to build an improved earth system model.
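A minimal sketch of surrogate-assisted parameter estimation, loosely in the spirit of the RBF methods described, is shown below: expensive model runs are replaced by cheap evaluations of a radial-basis-function fit. The objective function, bounds, and candidate-pool strategy are all made-up stand-ins, not the project's algorithm.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_model(x):
    """Stand-in for a costly CLM-ME/N run returning a misfit to site data."""
    return float(np.sum((x - 0.3) ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 2))                 # initial design points
y = np.array([expensive_model(x) for x in X])

for _ in range(20):                                # surrogate-guided iterations
    surrogate = RBFInterpolator(X, y)              # cheap RBF fit S(x)
    cand = rng.uniform(0, 1, size=(500, 2))        # cheap candidate pool
    best = cand[np.argmin(surrogate(cand))]        # minimize the surrogate, not the model
    X = np.vstack([X, best])
    y = np.append(y, expensive_model(best))        # one expensive run per iteration

print("best parameters:", X[np.argmin(y)], "misfit:", y.min())
```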
ERIC Educational Resources Information Center
Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.
2012-01-01
Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…
Theoretical and computational foundations of management class simulation
Denie Gerold
1978-01-01
Investigations of complicated, complex, and poorly ordered systems are possible only with the aid of mathematical methods and electronic data processing. Simulation, as a method of operations research, is particularly suitable for this purpose. Theoretical and computational foundations of management class simulation must be integrated into the planning systems of...
Gaming via Computer Simulation Techniques for Junior College Economics Education. Final Report.
ERIC Educational Resources Information Center
Thompson, Fred A.
A study designed to answer the need for more attractive and effective economics education involved the teaching of one junior college economics class by the conventional (lecture) method and an experimental class by computer simulation techniques. Econometric models approximating the "real world" were computer programed to enable the experimental…
Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes
ERIC Educational Resources Information Center
West, Jan; Veenstra, Anneke
2012-01-01
Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…
Spacecraft orbit/earth scan derivations, associated APL program, and application to IMP-6
NASA Technical Reports Server (NTRS)
Smith, G. A.
1971-01-01
The derivation of a time-shared, remote-site, demand-processed computer program is discussed. The computer program analyzes the effects of selected orbit, attitude, and spacecraft parameters on earth sensor detections of the earth. For prelaunch analysis, the program may be used to simulate the effects of nominal parameters, which are used in preparing attitude data processing programs. After launch, comparison of results from a simulation and from satellite data will produce deviations helpful in isolating problems.
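The geometric heart of any earth-scan simulation is how large the Earth appears from orbit: from altitude h the Earth subtends a half-angle rho = asin(Re / (Re + h)) about nadir. The sketch below shows only that standard relation, not the APL program itself.

```python
import math

RE = 6378.14  # equatorial Earth radius, km

def earth_half_angle(altitude_km):
    """Half-angle (deg) of the Earth disk seen from the given altitude; a
    body-fixed scanner detects Earth when its line of sight is within this
    angle of nadir."""
    return math.degrees(math.asin(RE / (RE + altitude_km)))

for h in (500.0, 35786.0):   # low orbit vs geostationary altitude (illustrative)
    print(f"h = {h:8.0f} km -> Earth half-angle = {earth_half_angle(h):5.1f} deg")
```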
ERIC Educational Resources Information Center
Kangassalo, Marjatta
Using a pictorial computer simulation of a natural phenomenon, children's exploration processes and their construction of conceptual models were examined. The selected natural phenomenon was the variations of sunlight and heat of the sun experienced on the earth in relation to the positions of the earth and sun in space, and the subjects were…
NASA Astrophysics Data System (ADS)
Kaplinger, Brian Douglas
For the past few decades, both the scientific community and the general public have been becoming more aware that the Earth lives in a shooting gallery of small objects. We classify all of these asteroids and comets, known or unknown, that cross Earth's orbit as near-Earth objects (NEOs). A look at our geologic history tells us that NEOs have collided with Earth in the past, and we expect that they will continue to do so. With thousands of known NEOs crossing the orbit of Earth, there has been significant scientific interest in developing the capability to deflect an NEO from an impacting trajectory. This thesis applies the ideas of Smoothed Particle Hydrodynamics (SPH) theory to the NEO disruption problem. A simulation package was designed that allows efficacy simulation to be integrated into the mission planning and design process. This is done by applying ideas in high-performance computing (HPC) on the computer graphics processing unit (GPU). Rather than prove a concept through large standalone simulations on a supercomputer, a highly parallel structure allows for flexible, target dependent questions to be resolved. Built around nonclassified data and analysis, this computer package will allow academic institutions to better tackle the issue of NEO mitigation effectiveness.
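The basic SPH summation at the center of such simulations estimates density at each particle from its neighbors through a smoothing kernel. A serial NumPy sketch of that core step follows; GPU implementations parallelize the per-particle summation, and this generic illustration is not the thesis code.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic-spline SPH kernel W(r, h) with support radius 2h."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)                      # 3D normalization
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(pos, mass, h):
    """rho_i = sum_j m_j W(|r_i - r_j|, h), the core SPH summation
    (O(N^2) here; production codes use neighbor lists or GPU kernels)."""
    diff = pos[:, None, :] - pos[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (mass[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

pos = np.random.rand(100, 3)            # particle positions in a unit cube
mass = np.full(100, 1.0 / 100)          # equal-mass particles
print(sph_density(pos, mass, h=0.2)[:5])
```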
Accelerating the design of solar thermal fuel materials through high throughput simulations.
Liu, Yun; Grossman, Jeffrey C
2014-12-10
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
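Schematically, the screening loop ranks candidate molecules by their isomerization enthalpy, the energy stored between the ground and metastable structures, and keeps those above a storage target. The sketch below substitutes a made-up energy table for the ab initio calculations; molecule names and values are illustrative only.

```python
# Toy energy table standing in for ab initio results (all values made up).
ENERGIES_EV = {
    ("azobenzene-like", "ground"): 0.0, ("azobenzene-like", "metastable"): 0.6,
    ("candidate-X", "ground"): 0.0, ("candidate-X", "metastable"): 1.4,
}

def energy(mol, isomer):
    return ENERGIES_EV[(mol, isomer)]

def screen(candidates, energy, min_enthalpy_ev=1.0):
    """Keep molecules whose metastable-to-ground isomerization enthalpy
    (per molecule, in eV) exceeds a storage-density target."""
    hits = []
    for mol in candidates:
        delta_h = energy(mol, "metastable") - energy(mol, "ground")
        if delta_h >= min_enthalpy_ev:        # stored energy per molecule
            hits.append((mol, delta_h))
    return hits

print(screen(["azobenzene-like", "candidate-X"], energy, min_enthalpy_ev=1.0))
```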
A hybrid method for the computation of quasi-3D seismograms.
NASA Astrophysics Data System (ADS)
Masson, Yder; Romanowicz, Barbara
2013-04-01
The development of powerful computer clusters and efficient numerical computation methods, such as the Spectral Element Method (SEM), made possible the computation of seismic wave propagation in a heterogeneous 3D earth. However, the cost of these computations is still problematic for global scale tomography that requires hundreds of such simulations. Part of the ongoing research effort is dedicated to the development of faster modeling methods based on the spectral element method. Capdeville et al. (2002) proposed to couple SEM simulations with normal mode calculations (C-SEM). Nissen-Meyer et al. (2007) used 2D SEM simulations to compute 3D seismograms in a 1D earth model. Thanks to these developments, and for the first time, Lekic et al. (2011) developed a 3D global model of the upper mantle using SEM simulations. At the local and continental scale, adjoint tomography, which uses many SEM simulations, can be implemented on current computers (Tape, Liu et al. 2009). Due to their smaller size, these models offer higher resolution. They provide us with images of the crust and the upper part of the mantle. In an attempt to teleport such local adjoint tomographic inversions into the deep earth, we are developing a hybrid method where SEM computations are limited to a region of interest within the earth. That region can have an arbitrary shape and size. Outside this region, the seismic wavefield is extrapolated to obtain synthetic data at the Earth's surface. A key feature of the method is the use of a time reversal mirror to inject the wavefield induced by a distant seismic source into the region of interest (Robertsson and Chapman 2000). We compute synthetic seismograms as follows: inside the region of interest, we use the regional spectral element software RegSEM to compute wave propagation in 3D; outside this region, the wavefield is extrapolated to the surface by convolution with the Green's functions from the mirror to the seismic stations. For now, these Green's functions are computed using 2D SEM simulations in a 1D Earth model. Such seismograms account for the 3D structure inside the region of interest in a quasi-exact manner. Later we plan to extrapolate the misfit function computed from such seismograms at the stations back into the SEM region in order to compute local adjoint kernels. This opens a new path toward regional adjoint tomography into the deep Earth. References: Capdeville, Y., et al. (2002). "Coupling the spectral element method with a modal solution for elastic wave propagation in global Earth models." Geophysical Journal International 152(1): 34-67. Lekic, V. and B. Romanowicz (2011). "Inferring upper-mantle structure by full waveform tomography with the spectral element method." Geophysical Journal International 185(2): 799-831. Nissen-Meyer, T., et al. (2007). "A two-dimensional spectral-element method for computing spherical-earth seismograms - I. Moment-tensor source." Geophysical Journal International 168(3): 1067-1092. Robertsson, J. O. A. and C. H. Chapman (2000). "An efficient method for calculating finite-difference seismograms after model alterations." Geophysics 65(3): 907-918. Tape, C., et al. (2009). "Adjoint tomography of the southern California crust." Science 325(5943): 988-992.
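The extrapolation step can be caricatured as one convolution per mirror point followed by a sum over the mirror. The sketch below is a scalar simplification of the actual elastic representation theorem (which involves tractions and velocities on the mirror surface), with random arrays standing in for recorded traces and Green's functions.

```python
import numpy as np

def extrapolate(mirror_traces, greens):
    """mirror_traces, greens: (n_points, n_t) arrays -> station seismogram.
    Sum over mirror points of (trace * Green's function) convolutions."""
    n_t = mirror_traces.shape[1]
    seis = np.zeros(2 * n_t - 1)
    for trace, g in zip(mirror_traces, greens):
        seis += np.convolve(trace, g)       # one convolution per mirror point
    return seis[:n_t]                        # keep the causal window

n_pts, n_t = 64, 1000
traces = 0.01 * np.random.randn(n_pts, n_t)   # stand-in mirror recordings
greens = 0.01 * np.random.randn(n_pts, n_t)   # stand-in Green's functions
print(extrapolate(traces, greens).shape)
```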
NASA Astrophysics Data System (ADS)
Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2015-04-01
The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows for earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.
Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources
NASA Technical Reports Server (NTRS)
Smith, Elizabeth A.; Alfultis, Michael
1998-01-01
Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.
The use of computer simulations in whole-class versus small-group settings
NASA Astrophysics Data System (ADS)
Smetana, Lara Kathleen
This study explored the use of computer simulations in a whole-class as compared to small-group setting. Specific consideration was given to the nature and impact of classroom conversations and interactions when computer simulations were incorporated into a high school chemistry course. This investigation fills a need for qualitative research that focuses on the social dimensions of actual classrooms. Participants included a novice chemistry teacher experienced in the use of educational technologies and two honors chemistry classes. The study was conducted in a rural school in the south-Atlantic United States at the end of the fall 2007 semester. The study took place during one instructional unit on atomic structure. Data collection allowed for triangulation of evidence from a variety of sources: approximately 24 hours of video- and audio-taped classroom observations, supplemented with the researcher's field notes and analytic journal; miscellaneous classroom artifacts such as class notes, worksheets, and assignments; open-ended pre- and post-assessments; student exit interviews; and teacher entrance, exit and informal interviews. Four web-based simulations were used, three of which were from the ExploreLearning collection. Assessments were analyzed using descriptive statistics, and classroom observations, artifacts and interviews were analyzed using Erickson's (1986) guidelines for analytic induction. Conversational analysis was guided by methods outlined by Erickson (1982). Findings indicated that (a) the teacher effectively incorporated simulations in both settings, (b) students in both groups significantly improved their understanding of the chemistry concepts, (c) there was no statistically significant difference between the groups' achievement, (d) there was more frequent exploratory talk in the whole-class group, (e) there were more frequent and meaningful teacher-student interactions in the whole-class group, (f) additional learning experiences not measured on the assessment resulted from conversations and interactions in the whole-class setting, and (g) the potential benefits of exploratory talk in the whole-class setting were not fully realized. These findings suggest that both whole-class and small-group settings are appropriate for using computer simulations in science. The effective incorporation of simulations into whole-class instruction may provide a solution to the dilemma of technology penetration versus integration in today's classrooms.
Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea
ERIC Educational Resources Information Center
Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling
2006-01-01
Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…
A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation
ERIC Educational Resources Information Center
Wee, Loo Kang; Goh, Giam Hwee
2013-01-01
We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
Micromagnetics of rare-earth efficient permanent magnets
NASA Astrophysics Data System (ADS)
Fischbacher, Johann; Kovacs, Alexander; Gusenbauer, Markus; Oezelt, Harald; Exl, Lukas; Bance, Simon; Schrefl, Thomas
2018-05-01
The development of permanent magnets containing less or no rare-earth elements is linked to profound knowledge of the coercivity mechanism. Prerequisites for a promising permanent magnet material are a high spontaneous magnetization and a sufficiently high magnetic anisotropy. In addition to the intrinsic magnetic properties, the microstructure of the magnet plays a significant role in establishing coercivity. The influence of the microstructure on coercivity, remanence, and energy density product can be understood by using micromagnetic simulations. With advances in computer hardware and numerical methods, hysteresis curves of magnets can be computed quickly, so that the simulations can readily provide guidance for the development of permanent magnets. The potential of rare-earth reduced and rare-earth free permanent magnets is investigated using micromagnetic simulations. The results show that excellent hard magnetic properties can be achieved in grain-boundary engineered NdFeB, rare-earth magnets with a ThMn12 structure, Co-based nano-wires, and L10-FeNi, provided that the magnet's microstructure is optimized.
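The simplest model that exhibits the coercivity physics discussed here is the single-domain Stoner-Wohlfarth model: following the local energy minimum through a field sweep produces an open hysteresis loop and a switching field. The sketch below is a deliberately reduced stand-in for full micromagnetic simulation, with the reduced energy e(theta) = 0.5*sin^2(theta) - h*cos(theta - psi).

```python
import numpy as np

# theta: magnetization angle from the easy axis; psi: applied-field angle;
# h: field in units of the anisotropy field H_K = 2K/(mu0*Ms).
def relax(theta, h, psi, steps=2000, lr=0.01):
    """Gradient descent on e(theta); stays in the current energy basin,
    which is what produces metastability and hence coercivity."""
    for _ in range(steps):
        theta -= lr * (0.5 * np.sin(2.0 * theta) + h * np.sin(theta - psi))
    return theta

psi = np.radians(45.0)
theta = 0.0                                       # start along +easy axis
for h in np.linspace(1.5, -1.5, 121):             # descending field branch
    theta = relax(theta, h, psi)
    m = np.cos(theta - psi)                       # magnetization along the field
    if m < 0.0:                                   # sign change marks switching
        print(f"switching near h = {h:.2f}; theory gives |h| = 0.5 at 45 deg")
        break
```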
NASA Astrophysics Data System (ADS)
Karimabadi, Homa
2012-03-01
Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) an overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep; (b) a presentation of a new approach to data analysis that we refer to as Physics Mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets; and (c) a presentation of several recent discoveries in studies of space plasmas, including the role of vortex formation and resulting turbulence in magnetized plasmas.
NASA Astrophysics Data System (ADS)
Ryu, Dongok; Kim, Sug-Whan; Kim, Dae Wook; Lee, Jae-Min; Lee, Hanshin; Park, Won Hyun; Seong, Sehyun; Ham, Sun-Jeong
2010-09-01
Understanding the Earth's spectral bio-signatures provides an important reference datum for accurate de-convolution of collapsed spectral signals from potential earth-like planets of other star systems. This study presents a new ray tracing computation method including an improved 3D optical earth model constructed with coastal line and vegetation distribution data from the Global Ecological Zone (GEZ) map. Using non-Lambertian bidirectional scattering distribution function (BSDF) models, the input earth surface model is characterized with three different scattering properties and their annual variations, depending on monthly changes in vegetation distribution, sea ice coverage and illumination angle. The input atmosphere model consists of one layer with a Rayleigh scattering model from sea level to 100 km in altitude, and its radiative transfer characteristics are computed for four seasons using the SMART codes. The ocean scattering model is a combination of sun-glint scattering and Lambertian scattering models. The land surface scattering is defined with the semi-empirical parametric kernel method used for the MODIS and POLDER missions. These three component models were integrated into the final Earth model, which was then incorporated into the in-house built integrated ray tracing (IRT) model capable of computing both the spectral imaging and radiative transfer performance of a hypothetical space instrument as it observes the Earth from its designated orbit. The IRT model simulation inputs include variation in earth orientation, illuminated phases, and seasonal sea ice and vegetation distribution. The trial simulation runs result in annual variations in phase-dependent disk-averaged spectra (DAS) and associated bio-signatures such as the NDVI. The full computational details are presented together with the resulting annual variation in DAS and its associated bio-signatures.
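Of the bio-signatures named above, the NDVI is the simplest to state: NDVI = (NIR - Red) / (NIR + Red), computed from near-infrared and red reflectances. A tiny sketch with illustrative values (not the paper's data):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from band reflectances."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

print(ndvi(0.45, 0.08))   # dense vegetation: strong red edge, NDVI ~ 0.7
print(ndvi(0.12, 0.10))   # bare surface: NDVI near 0 (values illustrative)
```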
Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Y; Grossman, JC
2014-12-01
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
The Strata-1 experiment on small body regolith segregation
NASA Astrophysics Data System (ADS)
Fries, Marc; Abell, Paul; Brisset, Julie; Britt, Daniel; Colwell, Joshua; Dove, Adrienne; Durda, Dan; Graham, Lee; Hartzell, Christine; Hrovat, Kenneth; John, Kristen; Karrer, Dakotah; Leonard, Matthew; Love, Stanley; Morgan, Joseph; Poppin, Jayme; Rodriguez, Vincent; Sánchez-Lana, Paul; Scheeres, Dan; Whizin, Akbar
2018-01-01
The Strata-1 experiment studies the mixing and segregation dynamics of regolith on small bodies by exposing a suite of regolith simulants to the microgravity environment aboard the International Space Station (ISS) for one year. This will improve our understanding of regolith dynamics and properties on small asteroids, and may assist in interpreting analyses of samples from missions to small bodies such as OSIRIS-REx, Hayabusa-1 and -2, and future missions to small bodies. The Strata-1 experiment consists of four evacuated tubes partially filled with regolith simulants. The simulants are chosen to represent models of regolith covering a range of complexity and tailored to inform and improve computational studies. The four tubes are regularly imaged by dedicated cameras as the simulants move in response to the ambient vibrational environment, and the imagery is downlinked to the Strata-1 science team about every two months. Analyses performed on the imagery include evaluating the extent of segregation of the Strata-1 samples and comparing the observations to computational models. After Strata-1's return to Earth, x-ray tomography and optical microscopy will be used to study the post-flight simulant distribution. Strata-1 is also a pathfinder for the new "1E" ISS payload class, which is intended to simplify and accelerate emplacement of experiments on board the ISS.
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
ERIC Educational Resources Information Center
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
Simulation of interference between Earth stations and Earth-orbiting satellites
NASA Technical Reports Server (NTRS)
Bishop, D. F.
1994-01-01
It is often desirable to determine the potential for radio frequency interference between earth stations and orbiting spacecraft. This information can be used to select frequencies for radio systems to avoid interference, or to determine whether coordination between radio systems is necessary. A model is developed that will determine the statistics of interference between earth stations and spacecraft in elliptical orbits. The model uses orbital dynamics, detailed antenna patterns, and spectral characteristics to obtain accurate levels of interference at the victim receiver. The model is programmed into a computer simulation to obtain long-term statistics of interference. Two specific examples are shown to demonstrate the model. The first example is a simulation of interference from a fixed-satellite earth station to an orbiting scatterometer receiver. The second example is a simulation of interference from earth-exploration satellites to a deep-space earth station.
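The power calculation at the victim receiver in such a model reduces to a link budget: transmit EIRP, receive gain toward the interferer, and free-space path loss along the instantaneous geometry. A generic sketch with illustrative numbers follows; it omits the detailed antenna patterns and spectral factors the paper models.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def interference_dbw(eirp_dbw, rx_gain_dbi, distance_m, freq_hz):
    """Received interference power = EIRP + receive gain - path loss."""
    return eirp_dbw + rx_gain_dbi - fspl_db(distance_m, freq_hz)

# Example: 50 dBW earth-station EIRP toward a spacecraft at 1000 km range,
# 0 dBi receive sidelobe gain, 2.2 GHz (all numbers illustrative).
print(interference_dbw(50.0, 0.0, 1.0e6, 2.2e9))   # about -109 dBW
```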
Computer Simulations: An Integrating Tool.
ERIC Educational Resources Information Center
Bilan, Bohdan J.
This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…
Multi-objective optimization of GENIE Earth system models.
Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J
2009-07-13
The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
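The multi-objective step ultimately keeps the non-dominated (Pareto-optimal) parameter sets among the evaluated simulations. A minimal dominance filter illustrates that idea; this is a generic sketch, not the GENIE tooling, and the objective values are random placeholders.

```python
import numpy as np

def pareto_front(objectives):
    """objectives: (n_points, n_objectives), all minimized -> boolean mask
    of non-dominated points. O(n^2) brute force for clarity."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # j dominates i if j is no worse in all objectives and better in one.
        dominated = (np.all(objectives <= objectives[i], axis=1) &
                     np.any(objectives < objectives[i], axis=1))
        keep[i] = not dominated.any()
    return keep

costs = np.random.rand(200, 2)      # e.g., (RMS temperature error, drift)
front = costs[pareto_front(costs)]
print(len(front), "non-dominated parameter sets")
```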
Choice: 36 band feature selection software with applications to multispectral pattern recognition
NASA Technical Reports Server (NTRS)
Jones, W. C.
1973-01-01
Feature selection software was developed at the Earth Resources Laboratory that is capable of inputting up to 36 channels and selecting channel subsets according to several criteria based on divergence. One of the criteria used is compatible with the table look-up classifier requirements. The software indicates which channel subset best separates (based on average divergence) each class from all other classes. The software employs an exhaustive search technique, and computer time is not prohibitive. A typical task to select the best 4 of 22 channels for 12 classes takes 9 minutes on a Univac 1108 computer.
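An exhaustive divergence-based subset search of this kind is straightforward to sketch. The divergence below is the symmetric Kullback-Leibler form for Gaussian classes, a standard choice; the Laboratory's exact divergence variant is not specified in the abstract, and the class statistics here are random placeholders.

```python
from itertools import combinations

import numpy as np

def divergence(m1, c1, m2, c2):
    """Symmetric KL divergence between two Gaussian class models."""
    d = m1 - m2
    ic1, ic2 = np.linalg.inv(c1), np.linalg.inv(c2)
    return 0.5 * np.trace((c1 - c2) @ (ic2 - ic1)) + 0.5 * d @ (ic1 + ic2) @ d

def best_subset(means, covs, k):
    """Exhaustively score every k-channel subset by average pairwise
    divergence between classes, restricted to those channels."""
    n_ch = means.shape[1]
    def score(ch):
        idx = np.ix_(ch, ch)
        return np.mean([divergence(means[i][list(ch)], covs[i][idx],
                                   means[j][list(ch)], covs[j][idx])
                        for i, j in combinations(range(len(means)), 2)])
    return max(combinations(range(n_ch), k), key=score)

means = np.random.rand(4, 6)                       # 4 classes, 6 channels (toy)
covs = np.array([np.eye(6) * (0.05 + 0.01 * i) for i in range(4)])
print(best_subset(means, covs, k=3))
```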
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows for earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental states.
Microsoft C#.NET program and electromagnetic depth sounding for large loop source
NASA Astrophysics Data System (ADS)
Prabhakar Rao, K.; Ashok Babu, G.
2009-07-01
A program, in the C# (C Sharp) language with the Microsoft .NET Framework, is developed to compute the normalized vertical magnetic field of a horizontal rectangular loop source placed on the surface of an n-layered earth. The field can be calculated either inside or outside the loop. Five C# classes, with member functions in each class, are designed to compute the kernel, the Hankel transform integral, the coefficients for cubic spline interpolation between computed values, and the normalized vertical magnetic field. The program computes the vertical magnetic field in the frequency domain using integral expressions evaluated by a combination of straightforward numerical integration and the digital filter technique. The code utilizes different object-oriented programming (OOP) features. It finally computes the amplitude and phase of the normalized vertical magnetic field. Computed results are presented for geometric and parametric soundings. The code was developed in Microsoft Visual Studio .NET 2003 and uses various system class libraries.
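The computational core of such programs is a Hankel transform integral, schematically H(r) = integral from 0 to infinity of K(lambda) * J0(lambda*r) * lambda d(lambda). The sketch below evaluates one by straightforward quadrature against a placeholder kernel with a known closed form; production codes, as the abstract notes, use digital filters instead, and the layered-earth kernel itself is not reproduced here.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0

def kernel(lam):
    """Placeholder kernel; a real code uses the n-layer recursion K(lambda)."""
    return np.exp(-lam)

def hankel_j0(r, upper=50.0):
    """Quadrature for integral_0^upper K(lam) * J0(lam*r) * lam d(lam);
    the exponential kernel decays fast enough to truncate safely."""
    value, _ = quad(lambda lam: kernel(lam) * j0(lam * r) * lam, 0.0, upper)
    return value

# For K(lam) = exp(-lam) the closed form is 1 / (1 + r^2)^(3/2), which
# gives a quick sanity check on the quadrature.
r = 0.5
print(hankel_j0(r), (1.0 + r * r) ** -1.5)
```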
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
Parallel computing in enterprise modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.
2008-08-01
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
NASA Astrophysics Data System (ADS)
Joiner, D. A.; Stevenson, D. E.; Panoff, R. M.
2000-12-01
The Computational Science Reference Desk is an online tool designed to provide educators in math, physics, astronomy, biology, chemistry, and engineering with information on how to use computational science to enhance inquiry based learning in the undergraduate and pre college classroom. The Reference Desk features a showcase of original content exploration activities, including lesson plans and background materials; a catalog of websites which contain models, lesson plans, software, and instructional resources; and a forum to allow educators to communicate their ideas. Many of the recent advances in astronomy rely on the use of computer simulation, and tools are being developed by CSERD to allow students to experiment with some of the models that have guided scientific discovery. One of these models allows students to study how scientists use spectral information to determine the makeup of the interstellar medium by modeling the interstellar extinction curve using spherical grains of silicate, amorphous carbon, or graphite. Students can directly compare their model to the average interstellar extinction curve, and experiment with how small changes in their model alter the shape of the interstellar extinction curve. A simpler model allows students to visualize spatial relationships between the Earth, Moon, and Sun to understand the cause of the phases of the moon. A report on the usefulness of these models in two classes, the Computational Astrophysics workshop at The Shodor Education Foundation and the Conceptual Astronomy class at the University of North Carolina at Greensboro, will be presented.
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Peter J.; Jewell, Wayne F.; Coppenbarger, Richard
1990-01-01
Developing a single-pilot all-weather NOE capability requires fully automatic NOE navigation and flight control. Innovative guidance and control concepts are being investigated to (1) organize the onboard computer-based storage and real-time updating of NOE terrain profiles and obstacles; (2) define a class of automatic anticipative pursuit guidance algorithms to follow the vertical, lateral, and longitudinal guidance commands; (3) automate a decision-making process for unexpected obstacle avoidance; and (4) provide several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the recorded environment, which is then used to determine an appropriate evasive maneuver if a nonconformity is observed. This research effort has been evaluated in both fixed-base and moving-base real-time piloted simulations, thereby evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and reengagement of the automatic system.
Design of object-oriented distributed simulation classes
NASA Technical Reports Server (NTRS)
Schoeffler, James D. (Principal Investigator)
1995-01-01
Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for 'Numerical Propulsion Simulation System'. NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is a need for communication among the parallel executing processors, which in turn implies a need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT 'Actor' model of a concurrent object and uses 'connectors' to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out, with the result that the communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented. Its application to realistic configurations has not been carried out.
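The connector idea can be sketched minimally, assuming a Python rendering rather than the NPSS implementation: components exchange data only through connector objects, so the wiring between simulation modules can be established at execution time without programming changes.

```python
# Illustrative sketch (not the NPSS code) of the 'connector' pattern.
class Connector:
    """Holds the most recent value passed between two components."""
    def __init__(self):
        self._value = None
    def send(self, value):
        self._value = value
    def receive(self):
        return self._value

class Component:
    def __init__(self, name, gain):
        self.name, self.gain = name, gain
        self.inlet, self.outlet = None, None   # connectors bound at run time
    def step(self):
        x = self.inlet.receive() if self.inlet else 1.0
        self.outlet.send(self.gain * x)

# Dynamic wiring: compressor feeds burner through a connector created here.
compressor, burner = Component("compressor", 2.0), Component("burner", 3.0)
link, out = Connector(), Connector()
compressor.outlet, burner.inlet, burner.outlet = link, link, out
for c in (compressor, burner):
    c.step()
print(out.receive())   # 6.0
```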
NASA Technical Reports Server (NTRS)
Chen, Chien-C.; Hui, Elliot; Okamoto, Garret
1992-01-01
Spatial acquisition using the sun-lit Earth as a beacon source provides several advantages over active beacon-based systems for deep-space optical communication systems. However, since the angular extent of the Earth image is large compared to the laser beam divergence, the acquisition subsystem must be capable of resolving the image to derive the proper pointing orientation. The algorithms used must be capable of deducing the receiver location given the blurring introduced by the imaging optics and the large Earth albedo fluctuation. Furthermore, because of the complexity of modelling the Earth and the tracking algorithms, an accurate estimate of the algorithm accuracy can only be made via simulation using realistic Earth images. An image simulator was constructed for this purpose, and the results of the simulation runs are reported.
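One simple algorithm of the kind alluded to above is an intensity-weighted centroid, which estimates the pointing offset from a blurred, albedo-modulated Earth image. The sketch below is illustrative only and is not the tracking algorithm evaluated in the paper.

```python
import numpy as np

def pointing_offset(image):
    """Intensity-weighted centroid of a (noisy, extended) Earth image,
    returned as pixel offsets from the detector center."""
    rows, cols = np.indices(image.shape)
    total = image.sum()
    r_c = (rows * image).sum() / total
    c_c = (cols * image).sum() / total
    center = (np.array(image.shape) - 1) / 2.0
    return r_c - center[0], c_c - center[1]

# Toy frame: a bright off-center disk with random albedo fluctuation.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
rr, cc = np.indices(img.shape)
img[(rr - 40) ** 2 + (cc - 25) ** 2 < 15 ** 2] = 1.0
img *= rng.uniform(0.5, 1.0, img.shape)     # albedo variation
print(pointing_offset(img))                 # roughly (+8.5, -6.5)
```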
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
GEANT4 distributed computing for compact clusters
NASA Astrophysics Data System (ADS)
Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.
2014-11-01
A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and well tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply speeding the throughput of a single model.
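The work-ticket flow can be sketched with a process pool standing in for the cluster nodes; this Python analogue is only meant to convey the client-server pattern, not the g4DistributedRunManager C++ interface.

```python
# Hedged sketch of a 'work ticket' distribution: a server hands each
# worker the parameters of one run; workers report results back.
from multiprocessing import Pool

def run_ticket(ticket):
    """Stand-in for one GEANT4 run configured by a work ticket."""
    angle = ticket["angle_deg"]
    return angle, f"simulated projection at {angle} deg"

if __name__ == "__main__":
    # e.g. one ticket per tomography angle, distributed over four workers
    tickets = [{"angle_deg": a} for a in range(0, 180, 45)]
    with Pool(4) as pool:
        for angle, result in pool.map(run_ticket, tickets):
            print(angle, result)
```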
A Hybrid Cloud Computing Service for Earth Sciences
NASA Astrophysics Data System (ADS)
Yang, C. P.
2016-12-01
Cloud computing is becoming a norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF Spatiotemporal Innovation Center to meet these demands. This paper will report on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to the XSEDE Jetstream and Caltech experimental cloud computing environments for sharing resources; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities for conducting geoscience research. A brief introduction to using the cloud service will be included.
Compilation of Abstracts for SC12 Conference Proceedings
NASA Technical Reports Server (NTRS)
Morello, Gina Francine (Compiler)
2012-01-01
1 A Breakthrough in Rotorcraft Prediction Accuracy Using Detached Eddy Simulation; 2 Adjoint-Based Design for Complex Aerospace Configurations; 3 Simulating Hypersonic Turbulent Combustion for Future Aircraft; 4 From a Roar to a Whisper: Making Modern Aircraft Quieter; 5 Modeling of Extended Formation Flight on High-Performance Computers; 6 Supersonic Retropropulsion for Mars Entry; 7 Validating Water Spray Simulation Models for the SLS Launch Environment; 8 Simulating Moving Valves for Space Launch System Liquid Engines; 9 Innovative Simulations for Modeling the SLS Solid Rocket Booster Ignition; 10 Solid Rocket Booster Ignition Overpressure Simulations for the Space Launch System; 11 CFD Simulations to Support the Next Generation of Launch Pads; 12 Modeling and Simulation Support for NASA's Next-Generation Space Launch System; 13 Simulating Planetary Entry Environments for Space Exploration Vehicles; 14 NASA Center for Climate Simulation Highlights; 15 Ultrascale Climate Data Visualization and Analysis; 16 NASA Climate Simulations and Observations for the IPCC and Beyond; 17 Next-Generation Climate Data Services: MERRA Analytics; 18 Recent Advances in High-Resolution Global Atmospheric Modeling; 19 Causes and Consequences of Turbulence in the Earth's Protective Shield; 20 NASA Earth Exchange (NEX): A Collaborative Supercomputing Platform; 21 Powering Deep Space Missions: Thermoelectric Properties of Complex Materials; 22 Meeting NASA's High-End Computing Goals Through Innovation; 23 Continuous Enhancements to the Pleiades Supercomputer for Maximum Uptime; 24 Live Demonstrations of 100-Gbps File Transfers Across LANs and WANs; 25 Untangling the Computing Landscape for Climate Simulations; 26 Simulating Galaxies and the Universe; 27 The Mysterious Origin of Stellar Masses; 28 Hot-Plasma Geysers on the Sun; 29 Turbulent Life of Kepler Stars; 30 Modeling Weather on the Sun; 31 Weather on Mars: The Meteorology of Gale Crater; 32 Enhancing Performance of NASA's High-End Computing Applications; 33 Designing Curiosity's Perfect Landing on Mars; 34 The Search Continues: Kepler's Quest for Habitable Earth-Sized Planets.
Computer Simulations Improve University Instructional Laboratories
2004-01-01
Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599
The computational challenges of Earth-system science.
O'Neill, Alan; Steenman-Clark, Lois
2002-06-15
The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.
BASIC Simulation Programs; Volumes I and II. Biology, Earth Science, Chemistry.
ERIC Educational Resources Information Center
Digital Equipment Corp., Maynard, MA.
Computer programs which teach concepts and processes related to biology, earth science, and chemistry are presented. The seven biology problems deal with aspects of genetics, evolution and natural selection, gametogenesis, enzymes, photosynthesis, and the transport of material across a membrane. Four earth science problems concern climates, the…
Imaging Near-Earth Electron Densities Using Thomson Scattering
2009-01-15
geocentric solar magnetospheric (GSM) coordinates. TECs were initially computed from a viewing location at the Sun-Earth L1 Lagrange point for both... We further find that an elliptical Earth orbit (apogee ~30 RE) is a suitable lower-cost option for a demonstration mission.
In-class Simulations of the Iterated Prisoner's Dilemma Game.
ERIC Educational Resources Information Center
Bodo, Peter
2002-01-01
Developed a simple computer program for the in-class simulation of the repeated prisoner's dilemma game with student-designed strategies. Describes the basic features of the software. Presents two examples using the program to teach the problems of cooperation among profit-maximizing agents. (JEH)
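A minimal version of such an in-class simulation is easy to reproduce. The sketch below is hypothetical (not Bodo's program) and pits a tit-for-tat strategy against unconditional defection under the standard payoff matrix; student-designed strategies would simply be additional functions with the same signature.

```python
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_hist, opp_hist):
    return opp_hist[-1] if opp_hist else "C"

def always_defect(my_hist, opp_hist):
    return "D"

def play(strat_a, strat_b, rounds=100):
    """Repeated game: each strategy sees both histories before moving."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_a, hist_b), strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

print(play(tit_for_tat, always_defect))   # (99, 104) over 100 rounds
```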
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This is the student guide in a set of five computer-oriented environmental/energy education units. Contents of this guide are: (1) Introduction to the unit; (2) The "EARTH" program; (3) Exercises; and (4) Sources of information on the energy crisis. This guide supplements a simulation which allows students to analyze different aspects of…
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
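The scaling claim can be checked with simple arithmetic. The wall-clock times below are hypothetical placeholders, chosen only to match the qualitative behavior reported (more than a 50% reduction from 16 to 64 cores, nearly linearly).

```python
def speedup(t_base, t_parallel):
    return t_base / t_parallel

def efficiency(t_base, n_base, t_parallel, n_parallel):
    """Parallel efficiency relative to the baseline core count."""
    return speedup(t_base, t_parallel) * n_base / n_parallel

# Hypothetical hours per simulated year on 16 vs 64 cores (illustrative).
t16, t64 = 10.0, 3.0
print(speedup(t16, t64))               # ~3.3x faster on 4x the cores
print(efficiency(t16, 16, t64, 64))    # ~0.83 parallel efficiency
```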
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
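The run-time observation capability can be illustrated with an observer hook. The VOS itself relies on compiler-based instrumentation of the compiled model, so the Python sketch below conveys only the idea of watching state while the model steps, not the implementation.

```python
# Minimal, hypothetical analogue of a run-time observation point.
class LandModel:
    def __init__(self):
        self.soil_carbon = 100.0
        self.observers = []
    def step(self, year):
        self.soil_carbon *= 0.99           # toy decay process
        for obs in self.observers:
            obs(year, self.soil_carbon)    # in-situ observation hook

model = LandModel()
model.observers.append(lambda yr, c: print(f"year {yr}: soil C = {c:.2f}"))
for year in range(3):
    model.step(year)
```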
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
Solving search problems by strongly simulating quantum circuits
Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.
2013-01-01
Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
NASA Technical Reports Server (NTRS)
Spangelo, Sara; Dalle, Derek; Longmier, Benjamin
2015-01-01
This paper investigates the feasibility of Earth-transfer and interplanetary mission architectures for miniaturized spacecraft using emerging small solar electric propulsion technologies. Emerging small SEP thrusters offer significant advantages relative to existing technologies and will enable U-class systems to perform trajectory maneuvers with significant Delta V requirements. The approach in this paper is unique because it integrates trajectory design with vehicle sizing and accounts for the system and operational constraints of small U-class missions. The modeling framework includes integrated propulsion, orbit, energy, and external environment dynamics and systems-level power, energy, mass, and volume constraints. The trajectory simulation environment models orbit boosts in Earth orbit and flyby and capture trajectories to interplanetary destinations. A family of small spacecraft mission architectures are studied, including altitude and inclination transfers in Earth orbit and trajectories that escape Earth orbit and travel to interplanetary destinations such as Mercury, Venus, and Mars. Results are presented visually to show the trade-offs between competing performance objectives such as maximizing available mass and volume for payloads and minimizing transfer time. The results demonstrate the feasibility of using small spacecraft to perform significant Earth and interplanetary orbit transfers in less than one year with reasonable U-class mass, power, volume, and mission durations.
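The propellant arithmetic behind such missions follows the ideal rocket equation. The numbers below are hypothetical, but they show why high-specific-impulse SEP thrusters make km/s-class transfers plausible at U-class masses.

```python
import math

def delta_v(isp_s, m_wet_kg, m_dry_kg, g0=9.80665):
    """Ideal rocket equation: achievable delta-V for a given propellant load."""
    return isp_s * g0 * math.log(m_wet_kg / m_dry_kg)

# Hypothetical small SEP-equipped spacecraft: 2 kg of propellant on a
# 12 kg wet mass buys roughly 2.1 km/s at Isp = 1200 s.
print(delta_v(isp_s=1200.0, m_wet_kg=12.0, m_dry_kg=10.0))  # ~2146 m/s
```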
HydroViz: A web-based hydrologic observatory for enhancing hydrology and earth-science education
NASA Astrophysics Data System (ADS)
Habib, E. H.; Ma, Y.; Williams, D.
2010-12-01
The main goal of this study is to develop a virtual hydrologic observatory (HydroViz) that integrates hydrologic field observations with numerical simulations by taking advantage of advances in hydrologic field & remote sensing data, computer modeling, scientific visualization, and web resources and internet accessibility. The HydroViz system is a web-based teaching tool that can run on any web browser. It leverages the strength of Google Earth to provide authentic and hands-on activities to improve learning. Evaluation of HydroViz was performed in three engineering courses (a senior-level course and two introductory courses at two different universities). Evaluation results indicate that HydroViz provides an improvement over the existing engineering hydrology curriculum. HydroViz was effective in facilitating students' learning and understanding of hydrologic concepts and increasing related skills. HydroViz was much more effective for students in engineering hydrology classes than in the freshman introduction to civil engineering class. We found that HydroViz has great potential for a freshman audience. Even though HydroViz was challenging to some freshmen, most of them still learned the key concepts, and the tool increased the enthusiasm of half of the freshmen. The evaluation provided suggestions to create a simplified version of HydroViz for freshman-level courses. It identified concepts and tasks that might be too challenging or irrelevant to the freshmen, and areas where we could provide more guidance in the tool. After the first round of evaluation, the development team made significant improvements to HydroViz, which should further improve its effectiveness for the next round of class applications, planned for the Fall of 2010 in 5 classes at 4 different institutions.
Specification of the Surface Charging Environment with SHIELDS
NASA Astrophysics Data System (ADS)
Jordanova, V.; Delzanno, G. L.; Henderson, M. G.; Godinez, H. C.; Jeffery, C. A.; Lawrence, E. C.; Meierbachtol, C.; Moulton, J. D.; Vernon, L.; Woodroffe, J. R.; Brito, T.; Toth, G.; Welling, D. T.; Yu, Y.; Albert, J.; Birn, J.; Borovsky, J.; Denton, M.; Horne, R. B.; Lemon, C.; Markidis, S.; Thomsen, M. F.; Young, S. L.
2016-12-01
Predicting variations in the near-Earth space environment that can lead to spacecraft damage and failure, i.e. "space weather", remains a big space physics challenge. A recently funded project through the Los Alamos National Laboratory (LANL) Directed Research and Development (LDRD) program aims at developing a new capability to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms, the SHIELDS framework. The project goals are to understand the dynamics of the surface charging environment (SCE), the hot (keV) electrons representing the source and seed populations for the radiation belts, on both macro- and microscale. Important physics questions related to rapid particle injection and acceleration associated with magnetospheric storms and substorms as well as plasma waves are investigated. These challenging problems are addressed using a team of world-class experts in the fields of space science and computational plasma physics, and state-of-the-art models and computational facilities. In addition to physics-based models (like RAM-SCB, BATS-R-US, and iPIC3D), new data assimilation techniques employing data from LANL instruments on the Van Allen Probes and geosynchronous satellites are developed. Simulations with the SHIELDS framework of the near-Earth space environment where operational satellites reside are presented. Further model development and the organization of a "Spacecraft Charging Environment Challenge" by the SHIELDS project at LANL in collaboration with the NSF Geospace Environment Modeling (GEM) Workshop and the multi-agency Community Coordinated Modeling Center (CCMC) to assess the accuracy of SCE predictions are discussed.
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
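Fully implicit time-stepping, as PFLOTRAN employs via PETSc, can be illustrated in one dimension: each step solves a system (linear here, nonlinear in general) rather than advancing explicitly, which allows large stable time steps. The sketch below is a scalar diffusion analogue, not PFLOTRAN code.

```python
import numpy as np

def backward_euler_diffusion(u, dt, dx, kappa):
    """One fully implicit (backward Euler) step of 1-D diffusion:
    solve (I - dt*kappa*L) u_new = u, with L the standard Laplacian stencil."""
    n = u.size
    r = kappa * dt / dx**2
    A = np.eye(n) * (1 + 2 * r)
    A += np.diag([-r] * (n - 1), 1) + np.diag([-r] * (n - 1), -1)
    A[0, :], A[-1, :] = 0, 0
    A[0, 0] = A[-1, -1] = 1          # fixed (Dirichlet) boundaries
    return np.linalg.solve(A, u)

u = np.zeros(11); u[5] = 1.0         # initial concentration spike
print(backward_euler_diffusion(u, dt=1.0, dx=1.0, kappa=1.0).round(3))
```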
Particle simulation of plasmas and stellar systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tajima, T.; Clark, A.; Craddock, G.G.
1985-04-01
A computational technique is introduced which allows the student and researcher an opportunity to observe the physical behavior of a class of many-body systems. A series of examples is offered which illustrates the diversity of problems that may be studied using particle simulation. These simulations were in fact assigned as homework in a course on computational physics.
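A representative homework-scale particle simulation is a direct-summation N-body integrator with a softened force and leapfrog time-stepping, sketched below (illustrative, in G = 1 units, and not from the course materials).

```python
import numpy as np

def leapfrog(pos, vel, masses, dt, steps, soft=0.05):
    """Direct-summation N-body integration with a softened inverse-square
    force, a classic starting point for stellar-system exercises."""
    def accel(p):
        a = np.zeros_like(p)
        for i in range(len(p)):
            d = p - p[i]
            r2 = (d ** 2).sum(axis=1) + soft ** 2
            r2[i] = np.inf                      # no self-force
            a[i] = (masses[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return a
    vel = vel + 0.5 * dt * accel(pos)           # half-kick staggers velocity
    for _ in range(steps):
        pos = pos + dt * vel                    # drift
        vel = vel + dt * accel(pos)             # kick
    return pos, vel

pos = np.array([[0.0, 0.0], [1.0, 0.0]])
vel = np.array([[0.0, 0.0], [0.0, 1.0]])
print(leapfrog(pos, vel, masses=np.ones(2), dt=0.01, steps=100)[0])
```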
ERIC Educational Resources Information Center
Bender, David A.
1986-01-01
Describes how a computer simulation is used with a laboratory experiment on the synthesis of urea in isolated hepatocytes. The simulation calculates the amount of urea formed and the amount of ammonium remaining as the concentrations of ornithine, citrulline, argininosuccinate, arginine, and aspartate are altered. (JN)
A Comparison of Techniques for Scheduling Fleets of Earth-Observing Satellites
NASA Technical Reports Server (NTRS)
Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna
2003-01-01
Earth observing satellite (EOS) scheduling is a complex real-world domain representative of a broad class of over-subscription scheduling problems. Over-subscription problems are those where requests for a facility exceed its capacity. These problems arise in a wide variety of NASA and terrestrial domains and are an important class of scheduling problems because such facilities often represent large capital investments. We have run experiments comparing multiple variants of the genetic algorithm, hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on two variants of a realistically-sized model of the EOS scheduling problem. These are implemented as permutation-based methods; methods that search in the space of priority orderings of observation requests and evaluate each permutation by using it to drive a greedy scheduler. Simulated annealing performs best, and random mutation operators outperform our squeaky wheel (more intelligent) operator. Furthermore, taking smaller steps towards the end of the search improves performance.
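The permutation-based approach described above is straightforward to sketch: a priority ordering drives a greedy scheduler, and simulated annealing searches over orderings by random swaps. The model below compresses over-subscription to a single capacity budget and is purely illustrative, not the experimental code.

```python
import math, random

def greedy_schedule(order, requests, capacity):
    """Admit requests greedily in priority order until the (single,
    simplified) resource budget is exhausted; return total value."""
    used = value = 0
    for i in order:
        cost, val = requests[i]
        if used + cost <= capacity:
            used, value = used + cost, value + val
    return value

def anneal(requests, capacity, iters=5000, t0=5.0):
    order = list(range(len(requests)))
    best = cur = greedy_schedule(order, requests, capacity)
    for k in range(iters):
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]          # random mutation
        new = greedy_schedule(order, requests, capacity)
        t = t0 * (1 - k / iters) + 1e-9                  # cooling schedule
        if new >= cur or random.random() < math.exp((new - cur) / t):
            cur, best = new, max(best, new)
        else:
            order[i], order[j] = order[j], order[i]      # undo the swap
    return best

random.seed(1)
reqs = [(random.randint(1, 10), random.randint(1, 20)) for _ in range(50)]
print(anneal(reqs, capacity=60))
```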
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
Octree-based Global Earthquake Simulations
NASA Astrophysics Data System (ADS)
Ramirez-Guzman, L.; Juarez, A.; Bielak, J.; Salazar Monroy, E. F.
2017-12-01
Seismological research has motivated recent efforts to construct more accurate three-dimensional (3D) velocity models of the Earth, perform global simulations of wave propagation to validate models, and also to study the interaction of seismic fields with 3D structures. However, traditional methods for seismogram computation at global scales are limited by computational resources, relying primarily on traditional methods such as normal mode summation or two-dimensional numerical methods. We present an octree-based mesh finite element implementation to perform global earthquake simulations with 3D models using topography and bathymetry with a staircase approximation, as modeled by the Carnegie Mellon Finite Element Toolchain Hercules (Tu et al., 2006). To verify the implementation, we compared the synthetic seismograms computed in a spherical earth against waveforms calculated using normal mode summation for the Preliminary Earth Model (PREM) for a point source representation of the 2014 Mw 7.3 Papanoa, Mexico earthquake. We considered a 3 km-thick ocean layer for stations with predominantly oceanic paths. Eigenfrequencies and eigenfunctions were computed for toroidal, radial, and spheroidal oscillations in the first 20 branches. Simulations are valid at frequencies up to 0.05 Hz. Matching among the waveforms computed by both approaches, especially for long period surface waves, is excellent. Additionally, we modeled the Mw 9.0 Tohoku-Oki earthquake using the USGS finite fault inversion. Topography and bathymetry from ETOPO1 are included in a mesh with more than 3 billion elements, constrained by the computational resources available. We compared estimated velocity and GPS synthetics against observations at regional and teleseismic stations of the Global Seismological Network and discuss the differences among observations and synthetics, revealing that heterogeneity, particularly in the crust, needs to be considered.
Games and Simulations for Climate, Weather and Earth Science Education
NASA Astrophysics Data System (ADS)
Russell, R. M.
2014-12-01
We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory. More info available at: scied.ucar.edu/events/agu-2014-games-simulations-sessions
An earth imaging camera simulation using wide-scale construction of reflectance surfaces
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk
2013-10-01
Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
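The draping step amounts to attaching each orthorectified pixel to an elevation interpolated from the DEM. A minimal bilinear lookup is shown below; it is illustrative only and far simpler than PaySim's actual modeling chain.

```python
import numpy as np

def drape(dem, row, col):
    """Bilinear elevation lookup so an ortho-image pixel at fractional DEM
    grid coordinates (row, col) can be placed on the 3-D surface it images.
    Assumes (row, col) lies strictly inside the DEM grid."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    fr, fc = row - r0, col - c0
    return (dem[r0, c0] * (1 - fr) * (1 - fc)
            + dem[r0 + 1, c0] * fr * (1 - fc)
            + dem[r0, c0 + 1] * (1 - fr) * fc
            + dem[r0 + 1, c0 + 1] * fr * fc)

dem = np.array([[0.0, 1.0], [2.0, 3.0]])
print(drape(dem, 0.5, 0.5))   # 1.5: elevation at the center of the 2x2 cell
```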
NASA Technical Reports Server (NTRS)
Cohen, Jarrett
1999-01-01
Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems still is a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems after the 11th century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center. A laboratory for the Earth and space sciences, computing managers there threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the University Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.
NASA Astrophysics Data System (ADS)
Golubovic, Leonardo; Knudsen, Steven
2017-01-01
We consider general problem of modeling the dynamics of objects sliding on moving strings. We introduce a powerful computational algorithm that can be used to investigate the dynamics of objects sliding along non-relativistic strings. We use the algorithm to numerically explore fundamental physics of sliding climbers on a unique class of dynamical systems, Rotating Space Elevators (RSE). Objects sliding along RSE strings do not require internal engines or propulsion to be transported from the Earth's surface into outer space. By extensive numerical simulations, we find that sliding climbers may display interesting non-linear dynamics exhibiting both quasi-periodic and chaotic states of motion. While our main interest in this study is in the climber dynamics on RSEs, our results for the dynamics of sliding object are of more general interest. In particular, we designed tools capable of dealing with strongly nonlinear phenomena involving moving strings of any kind, such as the chaotic dynamics of sliding climbers observed in our simulations.
NASA Astrophysics Data System (ADS)
Adhikari, S.; Ivins, E. R.; Larour, E. Y.
2015-12-01
Perturbations in gravitational and rotational potentials caused by climate driven mass redistribution on the earth's surface, such as ice sheet melting and terrestrial water storage, affect the spatiotemporal variability in global and regional sea level. Here we present a numerically accurate, computationally efficient, high-resolution model for sea level. Unlike contemporary models that are based on spherical-harmonic formulation, the model can operate efficiently in a flexible embedded finite-element mesh system, thus capturing the physics operating at km-scale yet capable of simulating geophysical quantities that are inherently of global scale with minimal computational cost. One obvious application is to compute evolution of sea level fingerprints and associated geodetic and astronomical observables (e.g., geoid height, gravity anomaly, solid-earth deformation, polar motion, and geocentric motion) as a companion to a numerical 3-D thermo-mechanical ice sheet simulation, thus capturing global signatures of climate driven mass redistribution. We evaluate some important time-varying signatures of GRACE inferred ice sheet mass balance and continental hydrological budget; for example, we identify dominant sources of ongoing sea-level change at the selected tide gauge stations, and explain the relative contribution of different sources to the observed polar drift. We also report our progress on ice-sheet/solid-earth/sea-level model coupling efforts toward realistic simulation of Pine Island Glacier over the past several hundred years.
2018-01-01
Understanding Earth surface responses in terms of sediment dynamics to climatic variability and tectonics forcing is hindered by limited ability of current models to simulate long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source python-based framework which computes over geological time (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability makes it a great tool for teaching purposes. PMID:29649301
NASA Technical Reports Server (NTRS)
Bennett, Jerome (Technical Monitor)
2002-01-01
The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.
High-Fidelity Dynamic Modeling of Spacecraft in the Continuum--Rarefied Transition Regime
NASA Astrophysics Data System (ADS)
Turansky, Craig P.
The state of the art of spacecraft rarefied aerodynamics seldom accounts for detailed rigid-body dynamics. In part because of computational constraints, simpler models based upon the ballistic and drag coefficients are employed. Of particular interest is the continuum-rarefied transition regime of Earth's thermosphere where gas dynamic simulation is difficult yet wherein many spacecraft operate. The feasibility of increasing the fidelity of modeling spacecraft dynamics is explored by coupling rarefied aerodynamics with rigid-body dynamics modeling similar to that traditionally used for aircraft in atmospheric flight. Presented is a framework of analysis and guiding principles which capitalize on the availability of increasing computational methods and resources. Aerodynamic force inputs for modeling spacecraft in two dimensions in a rarefied flow are provided by analytical equations in the free-molecular regime, and the direct simulation Monte Carlo method in the transition regime. The application of the direct simulation Monte Carlo method to this class of problems is examined in detail with a new code specifically designed for engineering-level rarefied aerodynamic analysis. Time-accurate simulations of two distinct geometries in low thermospheric flight and atmospheric entry are performed, demonstrating non-linear dynamics that cannot be predicted using simpler approaches. The results of this straightforward approach to the aero-orbital coupled-field problem highlight the possibilities for future improvements in drag prediction, control system design, and atmospheric science. Furthermore, a number of challenges for future work are identified in the hope of stimulating the development of a new subfield of spacecraft dynamics.
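The contrast with ballistic-coefficient models can be seen in the basic drag relation: the simpler approach freezes Cd*A/m, whereas attitude-coupled modeling lets the projected area and drag coefficient vary with orientation. The values below are illustrative thermospheric numbers, not results from the thesis.

```python
def drag_acceleration(rho, v, cd, area, mass):
    """Drag acceleration a = 0.5 * rho * v^2 * Cd * A / m. A ballistic-
    coefficient model treats Cd*A/m as constant; higher-fidelity models
    let Cd and the projected area A vary with attitude."""
    return 0.5 * rho * v**2 * cd * area / mass

# Illustrative values near 400 km altitude (assumed, not from the thesis):
rho = 3e-12        # kg/m^3, thermospheric density
v = 7660.0         # m/s, roughly circular orbital speed
print(drag_acceleration(rho, v, cd=2.2, area=0.03, mass=4.0))  # ~1.5e-6 m/s^2
```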
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Peter J.; Jewell, Wayne F.
1991-01-01
Developing a single-pilot, all-weather nap-of-the-earth (NOE) capability requires fully automatic NOE (ANOE) navigation and flight control. Innovative guidance and control concepts are investigated in a four-fold research effort that: (1) organizes the on-board computer-based storage and real-time updating of NOE terrain profiles and obstacles in course-oriented coordinates indexed to the mission flight plan; (2) defines a class of automatic anticipative pursuit guidance algorithms and necessary data preview requirements to follow the vertical, lateral, and longitudinal guidance commands dictated by the updated flight profiles; (3) automates a decision-making process for unexpected obstacle avoidance; and (4) provides several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the prior knowledge of the recorded environment (terrain, cultural features, threats, and targets), which is then used to determine an appropriate evasive maneuver if a nonconformity of the sensed and recorded environments is observed. This four-fold research effort was evaluated in both fixed-base and moving-base real-time piloted simulations, thereby providing a practical demonstration for evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and re-engagement of the automatic system. Volume one describes the major components of the guidance and control laws as well as the results of the piloted simulations. Volume two describes the complete mathematical model of the fully automatic guidance system for rotorcraft NOE flight following planned flight profiles.
Computer-Simulated Psychotherapy as an Aid in Teaching Clinical Psychology.
ERIC Educational Resources Information Center
Suler, John R.
1987-01-01
Describes how Eliza, a widely known computer program which simulates the responses of a psychotherapist, can be used as a teaching aid in undergraduate clinical psychology classes. Provides information on conducting the exercise, integrating it into the course syllabus, and evaluating its impact on students. (JDH)
Macromod: Computer Simulation For Introductory Economics
ERIC Educational Resources Information Center
Ross, Thomas
1977-01-01
The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)
ERIC Educational Resources Information Center
Huang, Ching-Hsu
2014-01-01
The class quasi-experiment was conducted to determine whether using computer simulation teaching strategy enhanced student understanding of statistics concepts for students enrolled in an introductory course. One hundred and ninety-three sophomores in hospitality management department were invited as participants in this two-year longitudinal…
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
The LOVES computer code developed to investigate the concept of space servicing operational satellites as an alternative to replacing expendable satellites or returning satellites to earth for ground refurbishment is presented. In addition to having the capability to simulate the expendable satellite operation and the ground refurbished satellite operation, the program is designed to simulate the logistics of space servicing satellites using an upper stage vehicle and/or the earth to orbit shuttle. The program not only provides for the initial deployment of the satellite but also simulates the random failure and subsequent replacement of various equipment modules comprising the satellite. The program has been used primarily to conduct trade studies and/or parametric studies of various space program operational philosophies.
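The random failure and replacement logic lends itself to a Monte Carlo sketch. The toy model below uses hypothetical parameters and is not the LOVES model; it simply counts the servicing visits implied by exponentially distributed module failures.

```python
import random

def servicing_cost(n_modules, failure_rate, years, visit_cost, seed=0):
    """Toy Monte Carlo of a LOVES-style trade: draw random module failures
    over the mission and tally servicing visits (one per failure here)."""
    rng = random.Random(seed)
    visits = 0
    for _ in range(n_modules):
        t = rng.expovariate(failure_rate)       # years to first failure
        while t < years:
            visits += 1
            t += rng.expovariate(failure_rate)  # replacement can fail too
    return visits * visit_cost

# Hypothetical inputs: 8 modules, 0.2 failures/module/year, 10-year mission.
print(servicing_cost(n_modules=8, failure_rate=0.2, years=10, visit_cost=5.0))
```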
Games and Simulations for Climate, Weather and Earth Science Education
NASA Astrophysics Data System (ADS)
Russell, R. M.
2013-12-01
We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by education groups at NCAR/UCAR in Boulder, primarily Spark and the COMET Program. These materials have been disseminated via Spark's web site (spark.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility. Spark has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.
Games and Simulations for Climate, Weather and Earth Science Education
NASA Astrophysics Data System (ADS)
Russell, R. M.; Clark, S.
2015-12-01
We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.
NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report
NASA Technical Reports Server (NTRS)
Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ
2013-01-01
The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.
2012 Community Earth System Model (CESM) Tutorial - Proposal to DOE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Marika; Bailey, David A
2013-03-18
The Community Earth System Model (CESM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. This document provides the agenda and list of participants for the conference. Web materials for all lectures and practical sessions available from: http://www.cesm.ucar.edu/events/tutorials/073012/ .
Computer simulation of a geomagnetic substorm
NASA Technical Reports Server (NTRS)
Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.
1981-01-01
A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.
Integrating interactive computational modeling in biology curricula.
Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A
2015-03-01
While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
NASA Astrophysics Data System (ADS)
Wu, Yuanqiao; Verseghy, Diana L.; Melton, Joe R.
2016-08-01
Peatlands, which contain large carbon stocks that must be accounted for in the global carbon budget, are poorly represented in many earth system models. We integrated peatlands into the coupled Canadian Land Surface Scheme (CLASS) and the Canadian Terrestrial Ecosystem Model (CTEM), which together simulate the fluxes of water, energy, and CO2 at the land surface-atmosphere boundary in the family of Canadian Earth system models (CanESMs). New components and algorithms were added to represent the unique features of peatlands, such as their characteristic ground floor vegetation (mosses), the slow decomposition of carbon in the water-logged soils and the interaction between the water, energy, and carbon cycles. This paper presents the modifications introduced into the CLASS-CTEM modelling framework together with site-level evaluations of the model performance for simulated water, energy and carbon fluxes at eight different peatland sites. The simulated daily gross primary production (GPP) and ecosystem respiration are well correlated with observations, with values of the Pearson correlation coefficient higher than 0.8 and 0.75 respectively. The simulated mean annual net ecosystem production at the eight test sites is 87 g C m-2 yr-1, which is 22 g C m-2 yr-1 higher than the observed annual mean. The general peatland model compares well with other site-level and regional-level models for peatlands, and is able to represent bogs and fens under a range of climatic and geographical conditions.
A study of workstation computational performance for real-time flight simulation
NASA Technical Reports Server (NTRS)
Maddalon, Jeffrey M.; Cleveland, Jeff I., II
1995-01-01
With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on machines from several companies, including CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.
Interactive visualization of Earth and Space Science computations
NASA Technical Reports Server (NTRS)
Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise
1994-01-01
Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.
Specification of the near-Earth space environment with SHIELDS
Jordanova, Vania Koleva; Delzanno, Gian Luca; Henderson, Michael Gerard; ...
2017-11-26
Here, predicting variations in the near-Earth space environment that can lead to spacecraft damage and failure is one example of “space weather” and a big space physics challenge. A project recently funded through the Los Alamos National Laboratory (LANL) Directed Research and Development (LDRD) program aims at developing a new capability to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms, the SHIELDS framework. The project goals are to understand the dynamics of the surface charging environment (SCE), the hot (keV) electrons representing the source and seed populations for the radiation belts, on both macro- and micro-scales. Important physics questions related to particle injection and acceleration associated with magnetospheric storms and substorms, as well as plasma waves, are investigated. These challenging problems are addressed using a team of world-class experts in the fields of space science and computational plasma physics, and state-of-the-art models and computational facilities. A full two-way coupling of physics-based models across multiple scales, including a global MHD code (BATS-R-US) embedding a particle-in-cell code (iPIC3D) and an inner magnetosphere code (RAM-SCB), is achieved. New data assimilation techniques employing in situ satellite data are developed; these provide an order-of-magnitude improvement in the accuracy of the simulation of the SCE. SHIELDS also includes a post-processing tool designed to calculate the surface charging for specific spacecraft geometry using the Curvilinear Particle-In-Cell (CPIC) code, which can be used for reanalysis of satellite failures or for satellite design.
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern and common techniques for achieving high-performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code proved to be almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, with an efficiency of 76.5% relative to the catalog peak performance of the VPP5000/56, in vector and parallel computation. We have concluded that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
ERIC Educational Resources Information Center
What Works Clearinghouse, 2014
2014-01-01
The 2014 study, "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions," examined the effects of using the true-to-scale (TTS) display mode versus the orrery display mode in the iPad's Solar Walk software application on students' knowledge of the Earth's place in the solar system. The…
Effects of Psychology Courseware Use on Computer Anxiety in Students.
ERIC Educational Resources Information Center
Lambert, Matthew E.; Lenthall, Gerard
1989-01-01
Describes study that examined the relationship between computer anxiety and the use of psychology courseware in an undergraduate abnormal psychology class using four computerized case simulations. Comparisons of pretest and posttest computer anxiety measures are described, and the relationship between computer anxiety/attitudes and computer use is…
Assessing the Effectiveness of a Computer Simulation for Teaching Ecological Experimental Design
ERIC Educational Resources Information Center
Stafford, Richard; Goodenough, Anne E.; Davies, Mark S.
2010-01-01
Designing manipulative ecological experiments is a complex and time-consuming process that is problematic to teach in traditional undergraduate classes. This study investigates the effectiveness of using a computer simulation--the Virtual Rocky Shore (VRS)--to facilitate rapid, student-centred learning of experimental design. We gave a series of…
Fractal Simulations of African Design in Pre-College Computing Education
ERIC Educational Resources Information Center
Eglash, Ron; Krishnamoorthy, Mukkai; Sanchez, Jason; Woodbridge, Andrew
2011-01-01
This article describes the use of fractal simulations of African design in a high school computing class. Fractal patterns--repetitions of shape at multiple scales--are a common feature in many aspects of African design. In African architecture we often see circular houses grouped in circular complexes, or rectangular houses in rectangular…
Virtual Instrument Simulator for CERES
NASA Technical Reports Server (NTRS)
Chapman, John J.
1997-01-01
A benchtop virtual instrument simulator for CERES (Clouds and the Earth's Radiant Energy System) has been built at NASA Langley Research Center in Hampton, VA. The CERES instruments will fly on several Earth-orbiting platforms, notably NASDA's Tropical Rainfall Measuring Mission (TRMM) and NASA's Earth Observing System (EOS) satellites. CERES measures top-of-the-atmosphere radiative fluxes using microprocessor-controlled scanning radiometers. The CERES Virtual Instrument Simulator consists of electronic circuitry identical to the flight unit's twin microprocessors and telemetry interface to the supporting spacecraft electronics, plus two personal computers (PCs) connected to the I/O ports that control the azimuth and elevation gimbals. Software consists of the unmodified TRW-developed Flight Code, the Ground Support Software which serves as the instrument monitor, and NASA/TRW-developed engineering models of the scanners. The CERES Instrument Simulator will serve as a testbed for testing custom instrument commands intended to solve in-flight anomalies of the instruments which could arise during the CERES mission. One of the supporting computers drives the telemetry display which monitors the simulator microprocessors during the development and testing of custom instrument commands. The CERES engineering development software models have been modified to provide a virtual instrument running on a second supporting computer linked in real time to the instrument flight microprocessor control ports. The CERES Instrument Simulator will be used to verify memory uploads by the CERES Flight Operations Team at NASA. Plots of the virtual scanner models match the actual instrument scan plots. A high-speed logic analyzer has been used to track the performance of the flight microprocessor. The concept of using an identical but non-flight-qualified microprocessor and electronics ensemble linked to a virtual instrument with identical system software affords a relatively inexpensive simulation system capable of high fidelity.
NASA Astrophysics Data System (ADS)
Tucker, Laura Jane
Under the harsh conditions of limited nutrients and a hard growth surface, colonies of Paenibacillus dendritiformis in agar plates form two classes of patterns (morphotypes). The first class, called the dendritic morphotype, has radially directed branches. The second class, called the chiral morphotype, exhibits uniform handedness. The dendritic morphotype has been modeled successfully using a continuum model on a regular lattice; however, a suitable computational approach was not known for solving a continuum chiral model. This work details a new computational approach to solving the chiral continuum model of pattern formation in P. dendritiformis. The approach utilizes a random computational lattice and new methods for calculating certain derivative terms found in the model.
Simulating Snow in Canadian Boreal Environments with CLASS for ESM-SnowMIP
NASA Astrophysics Data System (ADS)
Wang, L.; Bartlett, P. A.; Derksen, C.; Ireson, A. M.; Essery, R.
2017-12-01
The ability of land surface schemes to provide realistic simulations of snow cover is necessary for accurate representation of energy and water balances in climate models. Historically, this has been particularly challenging in boreal forests, where poor treatment of both snow masking by forests and vegetation-snow interaction has resulted in biases in simulated albedo and snowpack properties, with subsequent effects on both regional temperatures and the snow albedo feedback in coupled simulations. The SnowMIP (Snow Model Intercomparison Project) series of experiments or 'MIPs' was initiated to assess the performance of various snow and land surface models at selected locations and to understand the primary factors affecting model performance. Here we present preliminary results of simulations conducted for the third such MIP, ESM-SnowMIP (Earth System Model - Snow Model Intercomparison Project), using the Canadian Land Surface Scheme (CLASS) at boreal forest sites in central Saskatchewan. We assess the ability of our latest model version (CLASS 3.6.2) to simulate observed snowpack properties (snow water equivalent, density, and depth) and above-canopy albedo over 13 winters. We also examine the sensitivity of these simulations to climate forcing at local and regional scales.
Investigation of Effective Material Properties of Stony Meteorites
NASA Technical Reports Server (NTRS)
Agrawal, Parul; Carlozzi, Alex; Bryson, Kathryn
2016-01-01
To assess the threat posed by an asteroid entering Earth's atmosphere, one must predict if, when, and how it fragments during entry. A comprehensive understanding of asteroid material properties is needed to achieve this objective. At present, meteorites found on Earth are the only objects from entering asteroids that can be used as representative material and tested in a laboratory setting. Therefore, unit cell models are developed to determine the effective material properties of stony meteorites and in turn deduce the properties of asteroids. The unit cell is a representative volume that accounts for the diverse minerals, porosity, and matrix composition inside a meteorite. The classes under investigation include H-class, L-class, and LL-class chondrites. The effective mechanical properties, such as Young's modulus and Poisson's ratio, of the unit cell are calculated by performing several hundred Monte Carlo simulations. Terrestrial analogs such as basalt and gabbro are being used to validate the unit cell methodology.
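As a rough illustration of the Monte Carlo idea described above, the sketch below samples mineral fractions and porosity for a unit cell and averages a simple Voigt/Reuss bound estimate of the effective Young's modulus. The mineral moduli, mixing rule, and porosity treatment are illustrative assumptions, not the study's actual unit cell mechanics.

```python
# Sketch of the Monte Carlo idea in the abstract: sample mineral fractions
# and porosity for a unit cell, estimate an effective Young's modulus with
# a crude Voigt/Reuss bound average. Mineral moduli are illustrative
# placeholders, not values from the study.
import random

MINERAL_E = {"olivine": 195.0, "pyroxene": 175.0, "feldspar": 75.0}  # GPa

def effective_modulus(fractions, porosity):
    solid = 1.0 - porosity            # crude porosity correction
    voigt = solid * sum(f * MINERAL_E[m] for m, f in fractions.items())
    reuss = solid / sum(f / MINERAL_E[m] for m, f in fractions.items())
    return 0.5 * (voigt + reuss)      # Voigt-Reuss-Hill style average

samples = []
for _ in range(1000):                 # "several hundred" Monte Carlo runs
    x = random.random()
    frac = {"olivine": 0.5 * x, "pyroxene": 0.5 * (1 - x), "feldspar": 0.5}
    samples.append(effective_modulus(frac, porosity=random.uniform(0.0, 0.15)))

print(f"mean effective E = {sum(samples) / len(samples):.1f} GPa")
```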
NASA Technical Reports Server (NTRS)
Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing
2016-01-01
Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracing into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.
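The "box method" addresses a standard MCRT bottleneck: drawing a scattering direction from a phase function whose cumulative distribution cannot be inverted in closed form. The baseline alternative is rejection sampling, sketched below with a Henyey-Greenstein phase function; this shows the problem being solved, not DART's actual (faster, table-based) implementation.

```python
# Generic rejection sampling of a scattering cosine from a phase function
# with no closed-form inverse CDF -- the problem DART's "box method"
# accelerates. Henyey-Greenstein is used only as a convenient example.
import random

def hg_phase(cos_theta, g=0.6):
    """Henyey-Greenstein phase function (unnormalized)."""
    return (1 - g * g) / (1 + g * g - 2 * g * cos_theta) ** 1.5

def sample_cos_theta(g=0.6):
    pmax = hg_phase(1.0, g)            # maximum of the phase function (g > 0)
    while True:
        mu = random.uniform(-1.0, 1.0) # candidate scattering cosine
        if random.uniform(0.0, pmax) < hg_phase(mu, g):
            return mu                  # accepted in proportion to the phase fn

print(sample_cos_theta())
```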
Digital Simulation in Education.
ERIC Educational Resources Information Center
Braun, Ludwig
Simulation as a mode of computer use in instruction has been neglected by educators. This paper briefly explores the circumstances in which simulations are useful and presents several examples of simulation programs currently being used in high-school biology, chemistry, physics, and social studies classes. One program, STERIL, which simulates…
Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldhaber, Steve; Holland, Marika
The major goal of this project was to contribute improvements to the infrastructure of an Earth System Model in order to support research in the Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System project. In support of this, the NCAR team accomplished two main tasks: improving input/output performance of the model and improving atmospheric model simulation quality. Improvement of the performance and scalability of data input and diagnostic output within the model required a new infrastructure which can efficiently handle the unstructured grids common in multiscale simulations. This allows for a more computationally efficient model, enabling more years of Earth System simulation. The quality of the model simulations was improved by reducing grid-point noise in the spectral element version of the Community Atmosphere Model (CAM-SE). This was achieved by running the physics of the model using grid-cell data on a finite-volume grid.
Numerical Simulations of Buoyancy Effects in low Density Gas Jets
NASA Technical Reports Server (NTRS)
Satti, R. P.; Pasumarthi, K. S.; Agrawal, A. K.
2004-01-01
This paper deals with the computational analysis of buoyancy effects in the near field of an isothermal helium jet injected into a quiescent ambient air environment. The transport equation for helium mass fraction, coupled with the conservation equations of mixture mass and momentum, was solved using a staggered-grid finite volume method. Laminar, axisymmetric, unsteady flow conditions were considered for the analysis. An orthogonal system with non-uniform grids was used to capture the instability phenomena. Computations were performed for Earth gravity and during the transition from Earth gravity to different gravitational levels. The flow physics was described by simultaneous visualizations of velocity and concentration fields at Earth and microgravity conditions. Computed results were validated by comparison with experimental data, substantiating that the buoyancy-induced global flow oscillations present in Earth gravity are absent in microgravity. The dependence of oscillation frequency and amplitude on gravitational forcing was presented to further quantify the buoyancy effects.
Computer Generated View of Earth as seen from the Asteroid Toutatis
1996-11-27
This computer-generated image depicts a view of Earth as seen from the surface of the asteroid Toutatis on Nov. 29th, 1996. A 2.5-degree field-of-view synthetic computer camera was used for this simulation. Toutatis is visible on this date as a twelfth-magnitude object in the night sky in the constellation of Virgo and could be viewed with a medium-sized telescope. Toutatis currently approaches Earth once every four years and, on Nov. 29th, 1996, will be 5.2 million kilometers away (approx. 3.3 million miles). In approximately 8 years, on Sept. 29th, 2004, it will be less than 1.6 million kilometers from Earth. This is only 4 times the distance to the Moon, and is the closest approach predicted for any known asteroid or comet during the next 60 years. http://photojournal.jpl.nasa.gov/catalog/PIA00515
Middle-School Understanding of the Greenhouse Effect using a NetLogo Computer Model
NASA Astrophysics Data System (ADS)
Schultz, L.; Koons, P. O.; Schauffler, M.
2009-12-01
We investigated the effectiveness of a freely available agent-based modeling program as a learning tool for seventh- and eighth-grade students to explore the greenhouse effect without added curriculum. The investigation was conducted at two Maine middle schools with 136 seventh-grade students and 11 eighth-grade students in eight classes. Students were given a pre-test that consisted of a concept map, a free-response question, and multiple-choice questions about how the greenhouse effect influences the Earth's temperature. The computer model simulates the greenhouse effect and allows students to manipulate atmospheric and surface conditions to observe the effects on the Earth's temperature. Students explored the Greenhouse Effect model for approximately twenty minutes with only two focus questions for guidance. After the exploration period, students were given a post-test that was identical to the pre-test. Parametric post-test analysis of the assessments indicated middle-school students gained in their understanding of how the greenhouse effect influences the Earth's temperature after exploring the computer model for approximately twenty minutes. The magnitude of the changes in pre- and post-test concept map and free-response scores was small (average free-response post-test score of 7.0 compared to an expert's score of 48), indicating that students understood only a few of the system relationships. While students gained in their understanding of the greenhouse effect, there was evidence that students held onto their misconceptions that (1) carbon dioxide in the atmosphere deteriorates the ozone layer, (2) the greenhouse effect is a result of humans burning fossil fuels, and (3) infrared and visible light behave similarly with greenhouse gases. We recommend using the Greenhouse Effect computer model with guided inquiry to focus students' investigations on the system relationships in the model.
Using SPEEDES to simulate the blue gene interconnect network
NASA Technical Reports Server (NTRS)
Springer, P.; Upchurch, E.
2003-01-01
JPL and the Center for Advanced Computer Architecture (CACR) are conducting application and simulation analyses of BG/L in order to establish a range of effectiveness for the Blue Gene/L MPP architecture in performing important classes of computations and to determine the design sensitivity of the global interconnect network in support of real-world ASCI application execution.
Long-Term, Non-Computer, Communication Simulations as Course Integration Activities
ERIC Educational Resources Information Center
Hamilton, James P.
2008-01-01
This article offers a few guidelines for constructing effective simulations. It presents a sample class activity called simulated public hearing which aims to integrate the various elements of a public speaking course into a more comprehensive whole. Properly designed, simulated hearings have elements of persuasive, informative, and impromptu…
Synthetic Seismograms of Explosive Sources Calculated by the Earth Simulator
NASA Astrophysics Data System (ADS)
Tsuboi, S.; Matsumoto, H.; Rozhkov, M.; Stachnik, J.
2017-12-01
We calculate broadband synthetic seismograms using the spectral-element method (Komatitsch & Tromp, 2001) for recent explosive events in the northern Korean peninsula. The computations are performed on the Earth Simulator supercomputer system at JAMSTEC, on 8,100 processors, which require 2,025 nodes of the Earth Simulator. We use one chunk with an angular distance of 40 degrees to compute synthetic seismograms. On this number of nodes, a simulation of 5 minutes of wave propagation accurate at periods of 1.5 seconds and longer requires about 10 hours of CPU time. We use the CMT solution of Rozhkov et al. (2016) as a source model for this event. One example of a CMT solution for this source model has a 28% double-couple component and a 51% isotropic component. The hypocenter depth of this solution is 1.4 km. Comparisons of the synthetic waveforms with the observations show that the arrival times of the Pn and Pg waves match well. The comparison also shows that the agreement in amplitude for other phases is not necessarily good, which demonstrates that the crustal structure included in the simulation should be improved. The observed surface waves are also modeled well in the synthetics, which shows that the CMT solution we used for this computation correctly captures the source characteristics of this event. Because the hypocenter locations of artificial explosive sources are already known, we may evaluate the crustal structure along the propagation path from waveform modeling of these sources. We may discuss the limitations of a one-dimensional crustal structure model by comparing synthetic waveforms for a 3D crustal structure with the observed seismograms.
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear engineering review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I, II, and III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; Doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; and fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate-level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures and hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.
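The closing exercise mentioned above, a simplified particle transport simulation, can be as small as the sketch below: a monoenergetic 1D slab problem that samples exponential path lengths and absorb-or-scatter outcomes. The cross sections and slab thickness are arbitrary illustrative values, not from the lecture notes.

```python
# Minimal 1D slab transport of the kind students are asked to write:
# sample exponential path lengths, decide absorb vs. scatter, tally
# transmission. Cross sections and thickness are illustrative only.
import math
import random

SIGMA_T, SIGMA_A, THICKNESS = 1.0, 0.3, 2.0  # total/absorption (1/cm), cm

def transmitted(n=100_000):
    hits = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                      # start at left face, forward
        while True:
            x += mu * (-math.log(random.random()) / SIGMA_T)
            if x > THICKNESS:                 # escaped through right face
                hits += 1
                break
            if x < 0.0:                       # leaked back out the left face
                break
            if random.random() < SIGMA_A / SIGMA_T:
                break                         # absorbed
            mu = random.uniform(-1.0, 1.0)    # isotropic scatter
    return hits / n

print(f"transmission ~ {transmitted():.3f}")
```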
Free-electron laser simulations on the MPP
NASA Technical Reports Server (NTRS)
Vonlaven, Scott A.; Liebrock, Lorie M.
1987-01-01
Free-electron lasers (FELs) are of interest because they provide high power, high efficiency, and broad tunability. FEL simulations can make efficient use of computers of the Massively Parallel Processor (MPP) class because most of the processing consists of applying a simple equation to a set of identical particles. A test version of the KMS Fusion FEL simulation, which resides mainly in the MPP's host computer and only partially in the MPP, has run successfully.
Earth and Space Science Ph.D. Class of 2003 Report released
NASA Astrophysics Data System (ADS)
Keelor, Brad
AGU and the American Geological Institute (AGI) released on 26 July an employment study of 180 Earth and space science Ph.D. recipients who received degrees from U.S. universities in 2003. The AGU/AGI survey asked graduates about their education and employment, efforts to find their first job after graduation, and experiences in graduate school. Key results from the study include: The vast majority (87%) of 2003 graduates found work in the Earth and space sciences, earning salaries commensurate with or slightly higher than 2001 and 2002 salary averages. Most (64%) graduates were employed within academia (including postdoctoral appointments), with the remainder in government (19%), industry (10%), and other (7%) sectors. Most graduates were positive about their employment situation and found that their work was challenging, relevant, and appropriate for someone with a Ph.D. The percentage of Ph.D. recipients accepting postdoctoral positions (58%) increased slightly from 2002. In contrast, the fields of physics and chemistry showed significant increases in postdoctoral appointments for Ph.D.s during the same time period. As in previous years, recipients of Ph.D.s in the Earth, atmospheric, and ocean sciences (median age of 32.7 years) are slightly older than Ph.D. recipients in most other natural sciences (except computer sciences), which is attributed to time taken off between undergraduate and graduate studies. Women in the Earth, atmospheric, and ocean sciences earned 33% of Ph.D.s in the class of 2003, surpassing the percentage of Ph.D.s earned by women in chemistry (32%) and well ahead of the percentage in computer sciences (20%), physics (19%), and engineering (17%). Participation of other underrepresented groups in the Earth, atmospheric, and ocean sciences remained extremely low.
NASA Technical Reports Server (NTRS)
Smith, David A.; Hojnicki, Jeffrey S.; Sjauw, Waldy K.
2014-01-01
Recent NASA interest in utilizing solar electric propulsion (SEP) technology to transfer payloads, e.g. from low-Earth orbit (LEO) to higher-energy geostationary-Earth orbit (GEO) or to Earth escape, has necessitated the development of high-fidelity SEP vehicle models and simulations. These models and simulations need to be capable of capturing the vehicle dynamics and sub-system interactions experienced during the transfer trajectories, which are typically accomplished with continuous-burn (potentially interrupted by solar eclipse), long-duration "spiral out" maneuvers taking several months or more to complete. This paper presents details of an integrated simulation approach achieved by combining a high-fidelity vehicle simulation code with a detailed solar array model. The combined simulation tool gives researchers the functionality to study the integrated effects of various vehicle sub-systems (e.g. vehicle guidance, navigation and control (GN&C), electric propulsion system (EP)) with time-varying power production. Results from a simulation model of a vehicle with a 50 kW class SEP system using the integrated tool are presented and compared to the results from another simulation model employing a 50 kW end-of-life (EOL) fixed power level assumption. These models simulate a vehicle under three-degree-of-freedom dynamics (i.e. translational dynamics only) and include the effects of a targeting guidance algorithm (providing a "near optimal" transfer) during a LEO to near-Earth-escape (C3 = -2.0 km^2/sec^2) spiral trajectory. The presented results include the impact of the fully integrated, time-varying solar array model (e.g. cumulative array degradation from traversing the Van Allen belts, the impact of solar eclipses on the vehicle and the related temperature responses in the solar arrays due to operating in the Earth's thermal environment, a high-fidelity array power module, etc.); these are used to assess the impact on vehicle performance (i.e. propellant consumption) and transit times.
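For reference, the characteristic energy quoted above is twice the specific orbital energy, C3 = v^2 - 2*mu/r, so a slightly negative value like -2.0 km^2/sec^2 describes a barely bound orbit just short of escape. A quick sketch (the state vector below is made up for illustration):

```python
# Characteristic energy C3 = v^2 - 2*mu/r (twice the specific orbital
# energy). C3 < 0 is bound, C3 = 0 is parabolic escape. The sample state
# is an illustration, not a vector from the paper.
MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def c3(r_km, v_kms):
    return v_kms**2 - 2.0 * MU_EARTH / r_km

# e.g. a spacecraft at 200,000 km moving at 1.41 km/s:
print(f"C3 = {c3(200_000.0, 1.41):.2f} km^2/s^2")   # ~ -2.0
```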
Astronomy Simulation with Computer Graphics.
ERIC Educational Resources Information Center
Thomas, William E.
1982-01-01
"Planetary Motion Simulations" is a system of programs designed for students to observe motions of a superior planet (one whose orbit lies outside the orbit of the earth). Programs run on the Apple II microcomputer and employ high-resolution graphics to present the motions of Saturn. (Author/JN)
Interactive Heat Transfer Simulations for Everyone
ERIC Educational Resources Information Center
Xie, Charles
2012-01-01
Heat transfer is widely taught in secondary Earth science and physics. Researchers have identified many misconceptions related to heat and temperature. These misconceptions primarily stem from hunches developed in everyday life (though the confusions in terminology often worsen them). Interactive computer simulations that visualize thermal energy,…
A standard library for modeling satellite orbits on a microcomputer
NASA Astrophysics Data System (ADS)
Beutel, Kenneth L.
1988-03-01
Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling Earth-orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. The equations for orbital elements, coordinate systems, and analytic formulas are surveyed and made into a standard method for modeling Earth-orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
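The thesis library is written in C, but the core conversion such a library standardizes, classical orbital elements to an inertial position via Kepler's equation, is compact enough to sketch here in Python (two-body only, angles in radians; the sample elements are arbitrary):

```python
# Classical orbital elements -> inertial position: Newton iteration on
# Kepler's equation, then a perifocal-to-inertial rotation. Two-body only
# (no oblateness); the sample elements below are arbitrary.
import math

def elements_to_position(a, e, i, raan, argp, M):
    E = M
    for _ in range(20):                        # solve M = E - e sin E
        E -= (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
    nu = 2 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                        math.sqrt(1 - e) * math.cos(E / 2))
    r = a * (1 - e * math.cos(E))
    xp, yp = r * math.cos(nu), r * math.sin(nu)   # perifocal coordinates
    cO, sO = math.cos(raan), math.sin(raan)
    co, so = math.cos(argp), math.sin(argp)
    ci, si = math.cos(i), math.sin(i)
    x = (cO * co - sO * so * ci) * xp + (-cO * so - sO * co * ci) * yp
    y = (sO * co + cO * so * ci) * xp + (-sO * so + cO * co * ci) * yp
    z = (so * si) * xp + (co * si) * yp
    return x, y, z                             # same units as a (e.g. km)

print(elements_to_position(7000.0, 0.01, 0.9, 0.5, 1.0, 2.0))
```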
Novel systems and methods for quantum communication, quantum computation, and quantum simulation
NASA Astrophysics Data System (ADS)
Gorshkov, Alexey Vyacheslavovich
Precise control over quantum systems can enable the realization of fascinating applications such as powerful computers, secure communication devices, and simulators that can elucidate the physics of complex condensed matter systems. However, the fragility of quantum effects makes it very difficult to harness the power of quantum mechanics. In this thesis, we present novel systems and tools for gaining fundamental insights into the complex quantum world and for bringing practical applications of quantum mechanics closer to reality. We first optimize and show equivalence between a wide range of techniques for storage of photons in atomic ensembles. We describe experiments demonstrating the potential of our optimization algorithms for quantum communication and computation applications. Next, we combine the technique of photon storage with strong atom-atom interactions to propose a robust protocol for implementing the two-qubit photonic phase gate, which is an important ingredient in many quantum computation and communication tasks. In contrast to photon storage, many quantum computation and simulation applications require individual addressing of closely-spaced atoms, ions, quantum dots, or solid state defects. To meet this requirement, we propose a method for coherent optical far-field manipulation of quantum systems with a resolution that is not limited by the wavelength of radiation. While alkali atoms are currently the system of choice for photon storage and many other applications, we develop new methods for quantum information processing and quantum simulation with ultracold alkaline-earth atoms in optical lattices. We show how multiple qubits can be encoded in individual alkaline-earth atoms and harnessed for quantum computing and precision measurements applications. We also demonstrate that alkaline-earth atoms can be used to simulate highly symmetric systems exhibiting spin-orbital interactions and capable of providing valuable insights into strongly correlated physics of transition metal oxides, heavy fermion materials, and spin liquid phases. While ultracold atoms typically exhibit only short-range interactions, numerous exotic phenomena and practical applications require long-range interactions, which can be achieved with ultracold polar molecules. We demonstrate the possibility to engineer a repulsive interaction between polar molecules, which allows for the suppression of inelastic collisions, efficient evaporative cooling, and the creation of novel phases of polar molecules.
Computer-simulated laboratory explorations for middle school life, earth, and physical Science
NASA Astrophysics Data System (ADS)
von Blum, Ruth
1992-06-01
Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9, developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their: (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a "live" laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, as well as presenting preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.
Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.
ERIC Educational Resources Information Center
Scheeline, Alexander; Mork, Brian J.
1988-01-01
Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)
Development of a Cloud Resolving Model for Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.
2017-12-01
A cloud resolving climate model is needed to reduce major systematic errors in climate simulations that stem from structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model to run on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access, and loop refactoring to a higher abstraction level. We will present early performance results and lessons learned, as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME and explore its full potential to scientifically and computationally advance climate simulation and prediction.
Influence of speckle image reconstruction on photometric precision for large solar telescopes
NASA Astrophysics Data System (ADS)
Peck, C. L.; Wöger, F.; Marino, J.
2017-11-01
Context. High-resolution observations from large solar telescopes require adaptive optics (AO) systems to overcome image degradation caused by Earth's turbulent atmosphere. AO corrections are, however, only partial. Achieving near-diffraction limited resolution over a large field of view typically requires post-facto image reconstruction techniques to reconstruct the source image. Aims: This study aims to examine the expected photometric precision of amplitude reconstructed solar images calibrated using models for the on-axis speckle transfer functions and input parameters derived from AO control data. We perform a sensitivity analysis of the photometric precision under variations in the model input parameters for high-resolution solar images consistent with four-meter class solar telescopes. Methods: Using simulations of both atmospheric turbulence and partial compensation by an AO system, we computed the speckle transfer function under variations in the input parameters. We then convolved high-resolution numerical simulations of the solar photosphere with the simulated atmospheric transfer function, and subsequently deconvolved them with the model speckle transfer function to obtain a reconstructed image. To compute the resulting photometric precision, we compared the intensity of the original image with the reconstructed image. Results: The analysis demonstrates that high photometric precision can be obtained for speckle amplitude reconstruction using speckle transfer function models combined with AO-derived input parameters. Additionally, it shows that the reconstruction is most sensitive to the input parameter that characterizes the atmospheric distortion, and sub-2% photometric precision is readily obtained when it is well estimated.
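The amplitude-reconstruction step the paper analyzes amounts to dividing the observed image's Fourier amplitudes by a model speckle transfer function, which is why precision hinges on how well the STF's input parameters are estimated. A toy sketch of that deconvolution with a regularized (Wiener-like) inverse filter, using synthetic arrays rather than a physically derived STF:

```python
# Sketch of the reconstruction step: divide observed Fourier amplitudes by
# a model speckle transfer function (STF), with a small regularizer to keep
# noise from blowing up where the STF is weak. Arrays here are synthetic;
# a real pipeline derives the STF from seeing/AO parameters as in the paper.
import numpy as np

def reconstruct(observed, stf, eps=1e-3):
    obs_ft = np.fft.fft2(observed)
    inv = np.conj(stf) / (np.abs(stf)**2 + eps)   # Wiener-like inverse filter
    return np.real(np.fft.ifft2(obs_ft * inv))

rng = np.random.default_rng(2)
img = rng.random((64, 64))                        # stand-in "true" image
stf = np.clip(rng.random((64, 64)), 0.2, 1.0)     # stand-in transfer function
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * stf))
print(np.abs(reconstruct(blurred, stf) - img).max())  # small residual
```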
Neoproterozoic 'snowball Earth' simulations with a coupled climate/ice-sheet model.
Hyde, W T; Crowley, T J; Baum, S K; Peltier, W R
2000-05-25
Ice sheets may have reached the Equator in the late Proterozoic era (600-800 Myr ago), according to geological and palaeomagnetic studies, possibly resulting in a 'snowball Earth'. But this period was a critical time in the evolution of multicellular animals, posing the question of how early life survived under such environmental stress. Here we present computer simulations of this unusual climate stage with a coupled climate/ice-sheet model. To simulate a snowball Earth, we use only a reduction in the solar constant compared to present-day conditions and we keep atmospheric CO2 concentrations near present levels. We find rapid transitions into and out of full glaciation that are consistent with the geological evidence. When we combine these results with a general circulation model, some of the simulations result in an equatorial belt of open water that may have provided a refugium for multicellular animals.
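The rapid transitions into and out of full glaciation arise from the ice-albedo feedback. A zero-dimensional energy-balance toy (far simpler than the coupled climate/ice-sheet model, with made-up albedo and linearized-OLR constants) shows how reducing the solar constant flips the equilibrium from a warm state to a frozen one:

```python
# Zero-dimensional energy-balance sketch of the ice-albedo feedback behind
# "snowball" transitions; a toy, not the coupled climate/ice-sheet model.
# Albedo switches with temperature; outgoing longwave is linearized A + B*T.
def equilibrium_T(solar_frac, T0=15.0):
    S0, A, B = 1361.0, 203.3, 2.09              # W/m^2; A, B: linearized OLR
    T = T0
    for _ in range(2000):                       # relax toward radiative balance
        albedo = 0.62 if T < -10.0 else 0.30    # crude ice-albedo switch
        absorbed = solar_frac * S0 / 4.0 * (1.0 - albedo)
        T += 0.01 * (absorbed - (A + B * T))
    return T

for f in (1.00, 0.90, 0.75):                    # present-day vs. reduced sun
    print(f"S/S0 = {f:.2f}: T = {equilibrium_T(f):6.1f} C")
# At f = 0.75 the warm branch loses stability and the state falls into a
# deeply frozen "snowball"; starting cold, it would stay frozen even at f = 1.
```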
ERIC Educational Resources Information Center
Shifflet, Mark; Brown, Jane
2006-01-01
The purpose of this study was to investigate how exposure to classroom instruction affected the use of a computer simulation that was designed to provide students an opportunity to apply material presented in class. The study involved an analysis of a computer-based crisis communication case study designed for a college-level public relations…
Exact and efficient simulation of concordant computation
NASA Astrophysics Data System (ADS)
Cable, Hugo; Browne, Daniel E.
2015-11-01
Concordant computation is a circuit-based model of quantum computation for mixed states, that assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admit efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation which is crucial for this model. We include detailed analysis of the arithmetic complexity for solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.
A Queue Simulation Tool for a High Performance Scientific Computing Center
NASA Technical Reports Server (NTRS)
Spear, Carrie; McGalliard, James
2007-01-01
The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high-performance, highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long-running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures, and resource allocation policies.
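In the same spirit, though far simpler than the NCCS tool, a discrete event simulation of a batch queue needs only an event heap and a free-CPU counter. The sketch below runs FIFO jobs with toy arrival, size, and runtime distributions; none of these reflect the actual NCCS workload or policies.

```python
# Toy discrete-event simulation of a FIFO batch queue with a fixed CPU
# pool: jobs arrive, wait for enough free CPUs, run, release. All
# distributions are illustrative, not the NCCS workload.
import heapq
import random

def simulate(total_cpus=512, n_jobs=500):
    random.seed(0)
    t, jobs = 0.0, []
    for _ in range(n_jobs):                    # (arrival, cpus, runtime), minutes
        t += random.expovariate(1 / 30.0)
        jobs.append((t, random.choice([16, 64, 256]), random.expovariate(1 / 120.0)))

    clock, free, releases, waits = 0.0, total_cpus, [], []
    for arrival, cpus, runtime in jobs:        # FIFO dispatch order
        clock = max(clock, arrival)
        while releases and releases[0][0] <= clock:
            _, c = heapq.heappop(releases)     # reclaim already-finished jobs
            free += c
        while free < cpus:                     # wait for future completions
            finish, c = heapq.heappop(releases)
            clock = max(clock, finish)
            free += c
        waits.append(clock - arrival)
        free -= cpus
        heapq.heappush(releases, (clock + runtime, cpus))
    return sum(waits) / len(waits)

print(f"mean queue wait ~ {simulate():.1f} minutes")
```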
Proton Upset Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as low Earth orbit, lunar orbit, and the like) from proton bombardment based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
Computer Access and Flowcharting as Variables in Learning Computer Programming.
ERIC Educational Resources Information Center
Ross, Steven M.; McCormick, Deborah
Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…
Energy Exascale Earth System Model (E3SM) Project Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, D.
The E3SM project will assert and maintain an international scientific leadership position in the development of Earth system and climate models at the leading edge of scientific knowledge and computational capabilities. With its collaborators, it will demonstrate its leadership by using these models to achieve the goal of designing, executing, and analyzing climate and Earth system simulations that address the most critical scientific questions for the nation and DOE.
NASA Technical Reports Server (NTRS)
Moser, D. E.; Cooke, W. J.
2004-01-01
The cometary meteoroid ejection models of Jones (1996) and Crifo (1997) were used to simulate ejection from comet 55P/Tempel-Tuttle during the last 12 revolutions, and from the 1862, 1737, and 1610 apparitions of 109P/Swift-Tuttle. Using cometary ephemerides generated by the JPL HORIZONS Solar System Data and Ephemeris Computation Service, ejection was simulated in 1-hour time steps while the comet was within 2.5 AU of the Sun. Ejection occurring at the hour of perihelion passage was also simulated. An RK4 variable-step integrator was then used to integrate meteoroid position and velocity forward in time, accounting for the effects of radiation pressure, Poynting-Robertson drag, and the gravitational forces of the planets, which were computed using JPL's DE406 planetary ephemerides. An impact parameter is computed for each particle approaching the Earth, and the results are compared to observations of the 1998-2002 Leonid showers and the 1993-1994 Perseids. A prediction for Earth's encounter with the Perseid stream in 2004 is also presented.
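A single meteoroid propagation step of the kind described can be sketched with an RK4 integrator in which solar gravity is scaled by (1 - beta) to fold in radiation pressure. Planetary perturbations and Poynting-Robertson drag, which the study includes, are omitted here for brevity, and beta is a toy value:

```python
# RK4 propagation sketch: two-body solar gravity scaled by (1 - beta) to
# account for radiation pressure. Planetary perturbations and P-R drag from
# the study are omitted; beta and the initial state are illustrative.
import numpy as np

MU_SUN = 1.32712440018e11   # km^3/s^2
BETA = 0.01                 # radiation pressure / gravity ratio (toy value)

def accel(r):
    return -MU_SUN * (1.0 - BETA) * r / np.linalg.norm(r)**3

def rk4_step(r, v, dt):
    k1v, k1r = accel(r), v
    k2v, k2r = accel(r + 0.5 * dt * k1r), v + 0.5 * dt * k1v
    k3v, k3r = accel(r + 0.5 * dt * k2r), v + 0.5 * dt * k2v
    k4v, k4r = accel(r + dt * k3r), v + dt * k3v
    return (r + dt / 6.0 * (k1r + 2 * k2r + 2 * k3r + k4r),
            v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v))

r = np.array([1.496e8, 0.0, 0.0])   # 1 AU, km
v = np.array([0.0, 29.78, 0.0])     # km/s, roughly circular
for _ in range(3600):               # 1-hour steps, as in the abstract
    r, v = rk4_step(r, v, 3600.0)
print(np.linalg.norm(r) / 1.496e8, "AU after 150 days")
```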
Simulations in a Science and Society Course.
ERIC Educational Resources Information Center
Maier, Mark H.; Venanzi, Thomas
1984-01-01
Provides a course outline which includes simulation exercises designed as in-class activities related to science and society interactions. Simulations focus on the IQ debate, sociobiology, nuclear weapons and nuclear strategy, nuclear power and radiation, the computer explosion, and cosmology. Indicates that learning improves when students take active…
NASA Astrophysics Data System (ADS)
Palit, Sourav; Chakrabarti, Sandip Kumar; Pal, Sujay; Das, Bakul; Ray, Suman
2016-07-01
The Very Low Frequency (VLF) signal at any location on Earth's surface depends strongly on the interference of various waveguide modes. The modulation effects on the VLF signal due to any terrestrial or extra-terrestrial event vary widely from one propagation path to another, depending on the interference patterns along these paths. The task of predicting or reproducing the modulation of signal amplitude or phase between any two transmitting and receiving stations is challenging. In this work we present results of modeling the VLF signal amplitudes from five different transmitters as observed at a single receiving station in India during a C9.3 class solar flare. In this model we simulate the ionization rates at lower ionospheric heights from the actual flare spectra with the GEANT4 Monte Carlo simulation code and find the equilibrium ion densities with a D-region ion-chemistry model. We find the signal amplitude variation along different propagation paths with the LWPC code. Such efforts are essential for an appropriate understanding of VLF propagation in the Earth-ionosphere waveguide and to achieve the desired accuracy while using Earth's ionosphere as an efficient detector of such extra-terrestrial ionization events.
Zonal methods for the parallel execution of range-limited N-body simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowers, Kevin J.; Dror, Ron O.; Shaw, David E.
2007-01-20
Particle simulations in fields ranging from biochemistry to astrophysics require the evaluation of interactions between all pairs of particles separated by less than some fixed interaction radius. The applicability of such simulations is often limited by the time required for calculation, but the use of massive parallelism to accelerate these computations is typically limited by inter-processor communication requirements. Recently, Snir [M. Snir, A note on N-body computations with cutoffs, Theor. Comput. Syst. 37 (2004) 295-318] and Shaw [D.E. Shaw, A fast, scalable method for the parallel evaluation of distance-limited pairwise particle interactions, J. Comput. Chem. 26 (2005) 1318-1328] independently introduced two distinct methods that offer asymptotic reductions in the amount of data transferred between processors. In the present paper, we show that these schemes represent special cases of a more general class of methods, and introduce several new algorithms in this class that offer practical advantages over all previously described methods for a wide range of problem parameters.
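The serial core of the range-limited problem is usually a cell list: bin particles into cells at least as large as the cutoff and test only neighboring cells. The parallel methods in the paper decide which of those cells each processor must import; the sketch below shows only the serial cutoff structure (no periodic boundaries, toy data):

```python
# Serial cell-list sketch of the range-limited pair problem that zonal
# methods parallelize: bin particles into cells no smaller than the cutoff,
# then test only pairs from neighboring cells. No periodic boundaries.
import itertools
import random
from collections import defaultdict

def pairs_within(points, box, rc):
    n = max(1, int(box / rc))                  # cells per side, size >= rc
    cells = defaultdict(list)
    for idx, (x, y, z) in enumerate(points):
        cells[(int(x / box * n), int(y / box * n), int(z / box * n))].append(idx)
    pairs = []
    for (cx, cy, cz), members in cells.items():
        for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3):
            for j in cells.get((cx + dx, cy + dy, cz + dz), ()):
                for i in members:              # i < j counts each pair once
                    if i < j and sum((a - b) ** 2 for a, b in
                                     zip(points[i], points[j])) < rc * rc:
                        pairs.append((i, j))
    return pairs

random.seed(0)
pts = [tuple(random.uniform(0.0, 10.0) for _ in range(3)) for _ in range(500)]
print(len(pairs_within(pts, box=10.0, rc=1.0)), "pairs within the cutoff")
```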
NASA Technical Reports Server (NTRS)
Rowell, L. F.; Powell, R. W.; Stone, H. W., Jr.
1980-01-01
A nonlinear, six-degree-of-freedom digital computer simulation was developed for a vehicle which has constant mass properties and whose attitude is controlled by both aerodynamic surfaces and reaction control system thrusters. A rotating, oblate Earth model was used to describe the gravitational forces which affect long-duration Earth entry trajectories. The program is executed in a non-real-time mode or connected to a simulation cockpit to conduct piloted and autopilot studies. The program includes the guidance and control software used by the space shuttle orbiter for its descent from approximately 121.9 km to touchdown on the runway.
NASA Technical Reports Server (NTRS)
Mitchell, Paul H.
1991-01-01
F77NNS (FORTRAN 77 Neural Network Simulator) is a computer program that simulates the popular back-error-propagation neural network. It is designed to take advantage of vectorization on computers having this capability, but it can also be used on any computer equipped with an ANSI-77 FORTRAN compiler. Problems involving pattern matching or mathematical modeling of systems fit the class of problems F77NNS is designed to solve. The program has a restart capability, so a neural network can be solved in stages suited to the user's resources and desires. It enables the user to customize the patterns of connections between layers of the network. The size of the neural network to which F77NNS can be applied is limited only by the amount of random-access memory available to the user.
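As a rough sketch of the back-error-propagation algorithm that F77NNS implements, the following minimal network learns the XOR pattern-matching task. It is written in Python rather than FORTRAN 77, and the layer sizes, learning rate, and iteration count are arbitrary illustrative choices, not values from the program.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic pattern-matching problem of the kind F77NNS targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input  -> hidden
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error toward the input layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())   # should approach [0, 1, 1, 0]
```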
Biological Simulations in Distance Learning. CAL Research Group Technical Report No. 12.
ERIC Educational Resources Information Center
Murphy, P. J.
When two biological simulations on evolution and genetics (one originally developed for a conventional university undergraduate course) were introduced into Open University distance education classes, the difficulties encountered required a reappraisal of the concept of using computer simulation for distance learning and decisions on which…
New NASA 3D Animation Shows Seven Days of Simulated Earth Weather
2014-08-11
This visualization shows early test renderings of a global computational model of Earth's atmosphere based on data from NASA's Goddard Earth Observing System Model, Version 5 (GEOS-5). This particular run, called Nature Run 2, was performed on a supercomputer, spanned 2 years of simulation time at 30-minute intervals, and produced petabytes of output. The visualization spans a little more than 7 days of simulation time, which is 354 time steps. The time period was chosen because a simulated category-4 typhoon developed off the coast of China. The 7-day period is repeated several times during the course of the visualization. Credit: NASA's Scientific Visualization Studio. Read more or download here: svs.gsfc.nasa.gov/goto?4180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack
2002-11-15
20th Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 20th edition of the TOP500 list of the world's fastest supercomputers was released today (November 15, 2002). The Earth Simulator supercomputer, installed earlier this year at the Earth Simulator Center in Yokohama, Japan, retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (trillions of calculations per second). The No. 2 and No. 3 positions are held by two new, identical ASCI Q systems at Los Alamos National Laboratory (7.73 Tflop/s each). These systems were built by Hewlett-Packard and are based on the AlphaServer SC computer system.
TOP500 Supercomputers for November 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack
2003-11-16
22nd Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 22nd edition of the TOP500 list of the world's fastest supercomputers was released today (November 16, 2003). The Earth Simulator supercomputer retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s ("teraflops" or trillions of calculations per second). It was built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan.
Giant Impacts on Earth-Like Worlds
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-05-01
Earth has experienced a large number of impacts, from the cratering events that may have caused mass extinctions to the enormous impact believed to have formed the Moon. A new study examines whether our planet's impact history is typical for Earth-like worlds.

N-Body Challenges

The final stages of terrestrial planet formation are thought to be dominated by giant impacts of bodies in the protoplanetary disk. During this stage, protoplanets smash into one another and accrete, greatly influencing the growth, composition, and habitability of the final planets. [Figure: timeline placing the authors' simulations in the context of the history of our solar system; Quintana et al. 2016]

There are two major challenges when simulating this N-body planet formation. The first is fragmentation: since computational time scales as N^2, simulating lots of bodies that split into many more bodies is very computationally intensive. For this reason, fragmentation is usually ignored; simulations instead assume perfect accretion during collisions. [Figure: total number of bodies remaining within the authors' simulations over time, with fragmentation included (grey) and ignored (red); both result in the same final number of bodies, but the runs that include fragmentation take more time to reach that number; Quintana et al. 2016] The second challenge is that many-body systems are chaotic, which means it's necessary to do a large number of simulations to make statistical statements about outcomes.

Adding Fragmentation

A team of scientists led by Elisa Quintana (NASA NPP Senior Fellow at the Ames Research Center) has recently pushed at these challenges by modeling inner-planet formation using a code that does include fragmentation. The team ran 140 simulations with and 140 without the effects of fragmentation, using similar initial conditions, to understand how including fragmentation affects the outcome. Quintana and collaborators then used the fragmentation-inclusive simulations to examine the collisional histories of the Earth-like planets that form. Their goal is to understand whether our solar system's formation and evolution is typical or unique.

How Common Are Giant Impacts?

The authors find that including fragmentation does not affect the final number of planets formed in the simulation (an average of 3-4 in each system, consistent with our solar system's terrestrial planet count). But when fragmentation is included, fewer collisions end in merger, which results in typical accretion timescales roughly doubling. So the effects of fragmentation influence the collisional history of the system and the length of time needed for the final system to form. [Figure: histogram of the total number of giant impacts received by the 164 Earth-like worlds produced in the authors' fragmentation-inclusive simulations; Quintana et al. 2016]

Examining the 164 Earth analogs produced in the fragmentation-inclusive simulations, Quintana and collaborators find that impacts large enough to completely strip a planet's atmosphere are rare; fewer than 1% of the Earth-like worlds experienced one. But giant impacts able to strip ~50% of an Earth analog's atmosphere (roughly the energy of the giant impact thought to have formed our Moon) are more common. Almost all of the authors' Earth analogs experienced at least one giant impact of this size in the 2-Gyr simulation, and the average Earth-like world experienced ~3 such impacts. These results suggest that our planet's impact history, with the Moon-forming impact likely being the last giant impact Earth experienced, is fairly typical for Earth-like worlds. The outcomes also indicate that smaller impacts that are still potentially life-threatening are much more common than bulk atmospheric removal. Higher-resolution simulations could be used to examine such smaller impacts.

Citation: Elisa V. Quintana et al. 2016, ApJ 821, 126. doi:10.3847/0004-637X/821/2/126
The Shuttle Mission Simulator computer generated imagery
NASA Technical Reports Server (NTRS)
Henderson, T. H.
1984-01-01
Equipment available in the primary training facility for Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degree-of-freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983; the Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and to night launch and landing simulation.
Effects of Relativity Lead to"Warp Speed" Computations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, J.-L.
A scientist at Lawrence Berkeley National Laboratory has discovered that a previously unnoticed consequence of Einstein's special theory of relativity can lead to speedup of computer calculations by orders of magnitude when applied to the computer modeling of a certain class of physical systems. This new finding offers the possibility of tackling some problems in a much shorter time and with far more precision than was possible before, as well as studying some configurations in every detail for the first time. The basis of Einstein's theory is the principle of relativity, which states that the laws of physics are the same for all observers, whether the 'observer' is a turtle 'racing' with a rabbit or a beam of particles moving at near light speed. From the invariance of the laws of physics, one may be tempted to infer that the complexity of a system is independent of the motion of the observer, and consequently that a computer simulation will require the same number of mathematical operations, independently of the reference frame that is used for the calculation. Length contraction and time dilation are well-known consequences of the special theory of relativity which lead to very counterintuitive effects. An alien observing human activity through a telescope in a spaceship traveling in the vicinity of the earth near the speed of light would see everything flattened in the direction of propagation of its spaceship (for him, the earth would have the shape of a pancake), while all motions on earth would appear extremely slow, slowed almost to a standstill. Conversely, a space scientist observing the alien through a telescope based on earth would see a flattened alien, slowed almost to a standstill, in a flattened spaceship. Meanwhile, an astronaut sitting in a spaceship moving at some velocity intermediate between those of the alien spaceship and the earth might see both the alien spaceship and the earth flattened in the same proportion, with the motion unfolding in each of them at the same speed. Let us now assume that each protagonist (the alien, the space scientist and the astronaut) is to run a computer simulation describing the motion of all of them in a single calculation. In order to model a physical system on a computer, scientists often divide space and time into small chunks. Since the computer must calculate some things for each chunk, having a large system containing numerous small chunks translates into long calculations requiring many computational steps on supercomputers. Let us assume that each protagonist of our intergalactic story uses the space and time slicing as described and chooses to perform the calculation in its own frame of reference. For the alien and the space scientist, the slicing of space and time results in an exceedingly large number of chunks, due to the wide disparity of spatial and time scales needed to describe both their own environment and motion and the other, extremely flattened environment and slowed motion. Since the disparity of scales is reduced for the astronaut, who is traveling at an intermediate velocity, the number of computer operations needed to complete the calculation in his frame of reference will be significantly lower, possibly by many orders of magnitude.
Analogously, the new discovery at Lawrence Berkeley National Laboratory shows that there exists a frame of reference minimizing the number of computational operations needed for studying beams of particles or light (lasers) interacting at, or near, light speed with other particles or with surrounding structures. Speedups ranging from ten to a million times or more are predicted for the modeling of beams interacting with electron clouds, such as those in the upcoming Large Hadron Collider 'atom smasher' accelerator at CERN (Switzerland), and in free electron lasers and tabletop laser wakefield accelerators. The discovery has surprised many physicists and was received initially with much skepticism; it sounded too much like a 'free lunch'. Yet the demonstration of a speedup of a stunning one thousand times in a test simulation of a particle beam interacting with a background of electrons has proven that the effect is real and can be applied successfully, at least to some problems. Work is being actively pursued at Berkeley Lab and elsewhere to validate the feasibility of the method for a wider range of applications, as well as to apply the already successful method to more problems, where it might help in gaining a better understanding of some processes and eventually lead to new findings.
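The scale-disparity argument can be quantified with the Lorentz factor: lengths contract by γ and internal motions slow by γ, so the spread of scales a simulation must resolve depends on the frame. A minimal numerical illustration (the velocity is an arbitrary example, not a figure from the work described):

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for speed beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta**2)

beta = 0.999999                  # hypothetical beam speed, v/c
g = lorentz_gamma(beta)
print(f"gamma = {g:.0f}")        # ~707
# A 1 m structure seen from the other frame is contracted to 1/gamma:
print(f"contracted length: {1.0 / g * 1e3:.2f} mm")
# and a 1 ns internal period is dilated to gamma nanoseconds:
print(f"dilated period:    {g:.0f} ns")
```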
NASA's Information Power Grid: Large Scale Distributed Computing and Data Management
NASA Technical Reports Server (NTRS)
Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)
2001-01-01
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require the aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.
Computational Analysis of Gravitational Effects in Low-Density Gas Jets
NASA Technical Reports Server (NTRS)
Satti, Rajani P.; Agrawal, Ajay K.
2004-01-01
This study deals with the computational analysis of buoyancy-induced instability in the near field of an isothermal helium jet injected into a quiescent ambient air environment. Laminar, axisymmetric, unsteady flow conditions were considered for the analysis. The transport equation for helium mass fraction, coupled with the conservation equations of mixture mass and momentum, was solved using a staggered-grid finite volume method. Jet Richardson numbers of 1.5 and 0.018 were considered to encompass both the buoyant and inertial jet flow regimes. Buoyancy effects were isolated by initiating computations in Earth gravity and subsequently reducing gravity to simulate microgravity conditions. Computed results concur with experimental observations that the periodic flow oscillations observed in Earth gravity subside in microgravity.
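For orientation on the governing parameter: one common convention for a buoyant jet is Ri = g (ρ_ambient - ρ_jet) d / (ρ_jet U²), so Ri ≈ 1.5 corresponds to a slow, buoyancy-dominated jet and Ri ≈ 0.018 to a fast, momentum-dominated one. The sketch below backs out jet velocities for the paper's two Richardson numbers under assumed values; the nozzle diameter and densities are illustrative, not from the study.

```python
g = 9.81          # m s^-2
rho_air = 1.20    # kg m^-3, ambient air (assumed)
rho_he = 0.166    # kg m^-3, helium (assumed)
d = 0.010         # m, hypothetical nozzle diameter

def jet_richardson(u):
    """Ri = g * (rho_ambient - rho_jet) * d / (rho_jet * u**2)."""
    return g * (rho_air - rho_he) * d / (rho_he * u**2)

for ri in (1.5, 0.018):
    u = (g * (rho_air - rho_he) * d / (rho_he * ri)) ** 0.5
    print(f"Ri = {ri:>6}: u = {u:.2f} m/s (check: {jet_richardson(u):.3f})")
```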
The Politics of City Planning Simulations.
ERIC Educational Resources Information Center
Kolson, Kenneth
This research paper presents an analysis of the computer simulation, SimCity, used for an urban city planning class. The data were gathered by actual use of the simulation and an electronic mail network was employed to secure impressions from users of the simulation. SimCity (developed by Maxis) provides the player with rules of human factors,…
Simulation-based performance analysis of EC-Earth 3.2.0 using Dimemas
NASA Astrophysics Data System (ADS)
Yepes Arbós, Xavier; César Acosta Cobos, Mario; Serradell Maronda, Kim; Sanchez Lorente, Alicia; Doblas Reyes, Francisco Javier
2017-04-01
Earth System Models (ESMs) are complex applications executed in supercomputing facilities due to their high demand for computing resources. However, not all of these models use resources efficiently, and their energy efficiency can be well below an acceptable minimum. One example is EC-Earth, a global coupled climate model which integrates different component models to simulate the Earth system. The two main components used in this analysis are IFS as the atmospheric model and NEMO as the ocean model, coupled via the OASIS3-MCT coupler. Preliminary results showed that EC-Earth does not have good computational performance. For example, the scalability of the model using the T255L91 grid with 512 MPI processes for IFS and the ORCA1L75 grid with 128 MPI processes for NEMO achieves a speedup of 40.3, meaning that 81.2% of the resources are wasted. Therefore, a performance analysis is necessary to find the bottlenecks of the model and thus determine the most appropriate optimization techniques. Traces of the model collected with profiling tools such as Extrae, Paraver and Dimemas allow us to simulate the model behaviour on a configurable parallel platform and extrapolate the impact of hardware changes on the performance of EC-Earth. In this document we propose a state-of-the-art procedure that makes it possible to evaluate the different characteristics of climate models in a very efficient way. Accordingly, the performance of EC-Earth has been examined in three scenarios: assuming an ideal machine, testing model sensitivity, and identifying the model that limits performance due to coupling. By simulating these scenarios, we found that each model has different characteristics. With the ideal machine, we have seen that there are several sources of inefficiency: about 20.59% of the execution time is communication, and there are workload imbalances produced by data dependences both between IFS and NEMO and within each model. In addition, in the model sensitivity simulations, we have characterized the types of messages and detected data dependencies. In IFS, we have observed that latency affects the coupling between models due to a large number of small communications, whereas bandwidth affects another region of the code with a few big messages. In NEMO, results show that the simulated latencies and bandwidths only slightly affect its execution time; however, its data dependencies are solved inefficiently and it has workload imbalances. The last simulation, performed to detect the slowest model due to coupling, revealed that IFS is slower than NEMO. Moreover, there is not enough bandwidth to transfer all the data in IFS, whereas in NEMO there is almost no contention. This study is useful for improving the computational efficiency of the model, adapting it to support ultra-high-resolution (UHR) experiments and future exascale supercomputers, and helping code developers design new, more machine-independent algorithms.
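The resource-waste bookkeeping used above is simple to reproduce: parallel efficiency is the measured speedup divided by the resources used, and the wasted fraction is its complement. The snippet below applies the formula to the quoted speedup of 40.3; note that the raw 640-process count does not reproduce the abstract's 81.2% figure, which evidently rests on a different baseline, so the numbers serve only to illustrate the formula.

```python
def parallel_stats(speedup, nprocs):
    """Parallel efficiency and wasted fraction for a measured speedup."""
    eff = speedup / nprocs
    return eff, 1.0 - eff

# 512 IFS + 128 NEMO MPI processes, speedup 40.3 (from the abstract):
eff, wasted = parallel_stats(40.3, 512 + 128)
print(f"efficiency = {eff:.1%}, wasted = {wasted:.1%}")
```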
Greenhouse Effect in the Classroom: A Project- and Laboratory-Based Curriculum.
ERIC Educational Resources Information Center
Lueddecke, Susann B.; Pinter, Nicholas; McManus, Scott A.
2001-01-01
Tests a multifaceted curriculum for use in introductory earth science classes from the secondary school to the introductory undergraduate level. Simulates the greenhouse effect with two fish tanks, heat lamps, and thermometers. Uses a hands-on science approach to develop a deeper understanding of the climate system among students. (Contains 28…
Simulation research: A vital step for human missions to Mars
NASA Astrophysics Data System (ADS)
Perino, Maria Antonietta; Apel, Uwe; Bichi, Alessandro
The complex nature of the challenge as humans embark on exploration missions beyond Earth orbit will require that, in the early stages, simulation facilities be established at least on Earth. Suitable facilities in Low Earth Orbit and on the Moon's surface would provide complementary information of critical importance for the overall design of a human mission to Mars. A full range of simulation campaigns is required, in fact, to reach a better understanding of the complexities involved in exploration missions that will bring humans back to the Moon and then outward to Mars. The corresponding simulation means may range from small-scale environmental simulation chambers and/or computer models that will aid in the development of new materials, to full-scale mock-ups of spacecraft and planetary habitats and/or orbiting infrastructures. This paper describes how a suitable simulation campaign will contribute to the definition of the required countermeasures with respect to the expected duration of the flight. This will allow countermeasure payload and astronaut time to be traded against effort in the technological development of propulsion systems.
NASA Astrophysics Data System (ADS)
Jöckel, Patrick; Tost, Holger; Pozzer, Andrea; Kunze, Markus; Kirner, Oliver; Brenninkmeijer, Carl A. M.; Brinkop, Sabine; Cai, Duy S.; Dyroff, Christoph; Eckstein, Johannes; Frank, Franziska; Garny, Hella; Gottschaldt, Klaus-Dirk; Graf, Phoebe; Grewe, Volker; Kerkweg, Astrid; Kern, Bastian; Matthes, Sigrun; Mertens, Mariano; Meul, Stefanie; Neumaier, Marco; Nützel, Matthias; Oberländer-Hayn, Sophie; Ruhnke, Roland; Runde, Theresa; Sander, Rolf; Scharffe, Dieter; Zahn, Andreas
2016-03-01
Three types of reference simulations, as recommended by the Chemistry-Climate Model Initiative (CCMI), have been performed with version 2.51 of the European Centre for Medium-Range Weather Forecasts - Hamburg (ECHAM)/Modular Earth Submodel System (MESSy) Atmospheric Chemistry (EMAC) model: hindcast simulations (1950-2011), hindcast simulations with specified dynamics (1979-2013), i.e. nudged towards ERA-Interim reanalysis data, and combined hindcast and projection simulations (1950-2100). The manuscript summarizes the updates of the model system and details the different model set-ups used, including the on-line calculated diagnostics. Simulations have been performed with two different nudging set-ups, with and without interactive tropospheric aerosol, and with and without a coupled ocean model. Two different vertical resolutions have been applied. The on-line calculated sources and sinks of reactive species are quantified and a first evaluation of the simulation results from a global perspective is provided as a quality check of the data. The focus is on the intercomparison of the different model set-ups. The simulation data will become publicly available via CCMI and the Climate and Environmental Retrieval and Archive (CERA) database of the German Climate Computing Centre (DKRZ). This manuscript is intended to serve as an extensive reference for further analyses of the Earth System Chemistry integrated Modelling (ESCiMo) simulations.
Cryosphere Science Outreach using the NASA/JPL Virtual Earth System Laboratory
NASA Astrophysics Data System (ADS)
Larour, E. Y.; Cheng, D. L. C.; Quinn, J.; Halkides, D. J.; Perez, G. L.
2016-12-01
Understanding the role of cryosphere science within the larger context of sea level rise is both a technical and an educational challenge that needs to be addressed if the public at large is to truly understand the implications and consequences of climate change. Within this context, we propose a new approach in which scientific tools are used directly inside a mobile/website platform geared towards education and outreach. Here, we apply this approach by using the Ice Sheet System Model (ISSM), a state-of-the-art cryosphere model developed at NASA, integrated within a Virtual Earth System Laboratory, with the goal of bringing cryosphere science to K-12 and college-level students. The approach mixes laboratory experiments, interactive classes/lessons on a website, and a simplified interface to a full-fledged instance of ISSM to validate the classes/lessons. This novel approach leverages new insights from the outreach/education community and the interest of new generations in web-based technologies and simulation tools, all of it delivered in a seamlessly integrated web platform, relying on a state-of-the-art climate model and live simulations.
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Peter J.; Jewell, Wayne F.
1991-01-01
Developing a single-pilot, all-weather nap-of-the-earth (NOE) capability requires fully automatic NOE (ANOE) navigation and flight control. Innovative guidance and control concepts are investigated in a four-fold research effort that: (1) organizes the on-board computer-based storage and real-time updating of NOE terrain profiles and obstacles in course-oriented coordinates indexed to the mission flight plan; (2) defines a class of automatic anticipative pursuit guidance algorithms and the necessary data preview requirements to follow the vertical, lateral, and longitudinal guidance commands dictated by the updated flight profiles; (3) automates a decision-making process for unexpected obstacle avoidance; and (4) provides several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with prior knowledge of the recorded environment (terrain, cultural features, threats, and targets), which is then used to determine an appropriate evasive maneuver if a nonconformity of the sensed and recorded environments is observed. This four-fold research effort was evaluated in both fixed-base and moving-base real-time piloted simulations, thereby providing a practical demonstration for evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and re-engagement of the automatic system. Volume one describes the major components of the guidance and control laws as well as the results of the piloted simulations. Volume two describes the complete mathematical model of the fully automatic guidance system for rotorcraft NOE flight following planned flight profiles.
Single-photon technique for the detection of periodic extraterrestrial laser pulses.
Leeb, W R; Poppe, A; Hammel, E; Alves, J; Brunner, M; Meingast, S
2013-06-01
To draw humankind's attention to its existence, an extraterrestrial civilization could well direct periodic laser pulses toward Earth. We developed a technique capable of detecting a quasi-periodic light signal with an average of less than one photon per pulse within a measurement time of a few tens of milliseconds, in the presence of the radiation emitted by an exoplanet's host star. Each of the electronic events produced by one or more single-photon avalanche detectors is tagged with precise time-of-arrival information and stored. From this we compute a histogram displaying the frequency of event-time differences in classes with bin widths on the order of a nanosecond. The existence of periodic laser pulses manifests itself in histogram peaks regularly spaced at multiples of the (a priori unknown) pulse repetition period. With laser sources simulating both the pulse source and the background radiation, we tested a detection system in the laboratory at a wavelength of 850 nm. We present histograms obtained from various recorded data sequences, with the number of photons per pulse, the background photons per pulse period, and the recording time as the main parameters. We then simulated a periodic signal hypothetically generated on a planet orbiting a G2V-type star (distance to Earth 500 light-years) and show that the technique is capable of detecting the signal even if the received pulses carry as little as one photon on average on top of the star's background light.
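The core of the method is easy to prototype: tag every detection with an arrival time, histogram the pairwise time differences with nanosecond bins, and look for equally spaced peaks. The sketch below, with invented rates and period, shows the peaks emerging even when background detections outnumber signal photons.

```python
import numpy as np

rng = np.random.default_rng(1)

T = 100e-9                 # assumed pulse period (hypothetical)
span = 2e-3                # 2 ms record

# Signal: on average 0.5 detected photons per pulse, on the grid k*T.
n_pulses = int(span / T)
sig = np.arange(n_pulses) * T
sig = sig[rng.random(n_pulses) < 0.5]

# Background: uniformly distributed stellar photons, 2 per pulse period.
bg = rng.uniform(0.0, span, size=int(2 * span / T))
events = np.sort(np.concatenate([sig, bg]))

# Histogram event-time differences up to 10 periods, with 1 ns bins.
max_lag = 10 * T
diffs = []
for i, t in enumerate(events):
    j = np.searchsorted(events, t + max_lag, side="right")
    diffs.append(events[i + 1:j] - t)
diffs = np.concatenate(diffs)

hist, edges = np.histogram(diffs, bins=1000, range=(0.0, max_lag))
top = np.sort(np.argsort(hist)[-5:])
print(edges[top] / T)      # strongest bins sit near integer multiples of T
```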
From petascale to exascale, the future of simulated climate data (Invited)
NASA Astrophysics Data System (ADS)
Lawrence, B.; Juckes, M. N.
2013-12-01
Coleridge ought to have said: data, data, everywhere, and all the data centres groan; data, data everywhere, nor any I should clone. Except, of course, he didn't say it, and we do clone data! While we have been dealing with terabytes of simulated data, downloading ("cloning") and analysing it has been a plausible way forward. In doing so, we have set up systems that support four broad classes of activities: personal and institutional data analysis, federated data systems, and data portals. We use metadata to manage the migration of data between these (and their communities), and we have built software systems to do so. However, our metadata and software solutions are fragile, often based on soft money and loose governance arrangements. We often download data with minimal provenance, and many of us often download the same data. In the not-too-distant future we can imagine exabytes of data being produced, and all these problems will get worse. Arguably we have no plausible methods of effectively exploiting such data, particularly if the analysis requires intercomparison. Yet of course, we know full well that intercomparison is at the heart of climate science. In this talk, we review the current status of simulation data management, with special emphasis on accessibility and usability. We talk about file formats, bundles of files (real and virtual), and simulation metadata. We introduce the InfraStructure for the European Network for Earth Simulation (IS-ENES) and its relationship with the Earth System Grid Federation (ESGF), as well as JASMIN, the UK Joint Analysis System. There will be a small digression on parallel data analysis, both local and distributed. We then progress to the near-term problems (and solutions) for climate data before scoping out the problems of the future, both for data handling and for the models that produce the data. The way we think about data, computing, models, even ensemble design, may need to change.
Advanced Techniques for Simulating the Behavior of Sand
NASA Astrophysics Data System (ADS)
Clothier, M.; Bailey, M.
2009-12-01
Computer graphics and visualization techniques continue to provide untapped research opportunities, particularly when working with earth science disciplines. Through collaboration with the Oregon Space Grant and IGERT Ecosystem Informatics programs, we are developing new techniques for simulating sand. In addition, through collaboration with the Oregon Space Grant, we have been communicating with the Jet Propulsion Laboratory (JPL) to exchange ideas and gain feedback on our work. More specifically, JPL's DARTS Laboratory specializes in planetary vehicle simulation, such as the Mars rovers. This simulation utilizes a virtual "sand box" to test how planetary rovers respond to different terrains while traversing them. Unfortunately, the simulation is unable to fully mimic the harsh, sandy environments found on Mars. Ideally, these simulations should allow a rover to interact with the sand beneath it, particularly for different sand granularities and densities. There may be situations where a rover becomes stuck in sand due to a lack of friction between the sand and its wheels; in fact, in May 2009, the Spirit rover became stuck in the Martian sand, which has provided additional motivation for this research. In developing a new sand simulation model, high-performance computing plays a very important role. More specifically, graphics processing units (GPUs) are useful due to their ability to run general-purpose algorithms and to perform massively parallel computations. In prior research, simulating vast quantities of sand in real time has been difficult due to the computational complexity of many colliding particles. With GPUs, however, particle collisions can be processed in parallel, allowing for a dramatic performance increase. In addition, spatial partitioning provides a further speed boost, as it limits the number of particle collision calculations. However, since the goal of this research is to simulate the look and behavior of sand, the work goes beyond simple particle collision: our parallel algorithms apply not only to single particles but also to particle "clumps" that consist of multiple combined particles. Since sand grains are typically not spherical, these clumps help to simulate the coarse, polygonal, granular nature of sand, and a diversity of sand particles can be generated. The interaction between these particles can then be parallelized using GPU hardware. This research will investigate different graphics and physics techniques and determine the trade-offs in performance and visual quality for sand simulation. An enhanced sand model built on high-performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques, which can ultimately be used in their vehicle simulator. As an added benefit, advancements in simulating sand can also help scientists here on earth, especially in understanding landslides and debris flows.
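The spatial-partitioning idea referenced above can be sketched with a uniform grid: bin the grains into cells the size of the interaction radius, then test each grain only against the neighbouring cells. On a GPU each particle or cell would map to a thread; this serial Python version, with invented particle data, shows only the data structure.

```python
import numpy as np
from collections import defaultdict
from itertools import product

def neighbor_pairs(pos, radius):
    """All pairs closer than `radius`, found via a uniform grid of cells.

    Each particle is tested only against the 27 surrounding cells
    instead of all N particles, which is what makes large collision
    problems tractable (and parallelizable per cell on a GPU)."""
    cells = defaultdict(list)
    for i, p in enumerate(pos):
        cells[tuple((p // radius).astype(int))].append(i)

    pairs = []
    for cell, members in cells.items():
        for offset in product((-1, 0, 1), repeat=3):
            nbr = tuple(c + o for c, o in zip(cell, offset))
            for i in members:
                for j in cells.get(nbr, ()):
                    if i < j and np.linalg.norm(pos[i] - pos[j]) < radius:
                        pairs.append((i, j))
    return pairs

rng = np.random.default_rng(2)
grains = rng.uniform(0, 1, size=(2000, 3))   # hypothetical grain centres
print(len(neighbor_pairs(grains, 0.02)))     # candidate collisions
```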
Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.
2014-12-01
The Oak Ridge Leadership Computing Facility (OLCF), established at Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of the world's supercomputers, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude-8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation petascale-capable simulation codes towards the performance levels required for running on future exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating the logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
STS-133 crew members Mike Barratt and Nicole Stott in cupola
2010-06-08
JSC2010-E-090701 (8 June 2010) --- Several computer monitors are featured in this image photographed during an STS-133 exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
Simulations of horizontal roll vortex development above lines of extreme surface heating
W.E. Heilman; J.D. Fast
1992-01-01
A two-dimensional, nonhydrostatic, coupled, earth/atmospheric model has been used to simulate mean and turbulent atmospheric characteristics near lines of extreme surface heating. Prognostic equations are used to solve for the horizontal and vertical wind components, potential temperature, and turbulent kinetic energy (TKE). The model computes nonhydrostatic pressure...
ERIC Educational Resources Information Center
Shlechter, Theodore M.; And Others
1992-01-01
Examines the effectiveness of SIMNET (Simulation Networking), a virtual reality training simulation system, combined with a program of role-playing activities for helping Army classes to master the conditional knowledge needed for successful field performance. The value of active forms of learning for promoting higher order cognitive thinking is…
Ideas in Practice (3): A Simulated Laboratory Experience in Digital Design.
ERIC Educational Resources Information Center
Cleaver, Thomas G.
1988-01-01
Gives an example of the use of a simplified logic simulator in a logic design course. Discusses some problems in logic design classes, commercially available software, and software problems. Describes computer-aided engineering (CAE) software. Lists 14 experiments in the simulated laboratory and presents students' evaluation of the course. (YP)
Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón
2017-12-01
Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on the ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about the molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields, and this increase agrees with the computer simulations. The AM1 semi-empirical simulations show a change in the catalytic behavior and dipole moment values in the interaction of α- and γ-glycine with the static magnetic field. The simulated crystal lattice energy in α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α- and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on the ancient Earth by driving the oligomerization of glycine in prebiotic reactions.
NASA Astrophysics Data System (ADS)
MOHAMMED, M. A. SI; BOUSSADIA, H.; BELLAR, A.; ADNANE, A.
2017-01-01
This paper presents a brief synthesis and a performance analysis of different attitude filtering algorithms (attitude determination algorithms, attitude estimation algorithms, and nonlinear observers) applied to a Low Earth Orbit satellite, in terms of accuracy, convergence time, memory footprint, and computation time. The latter is calculated in two ways: using a personal computer, and using the On-Board Computer 750 (OBC 750) that is used in many SSTL Earth observation missions. This comparative study could serve as a design aid when choosing among attitude determination, attitude estimation, and attitude observer algorithms. The simulation results clearly indicate that the nonlinear observer is the most logical choice.
NASA Technical Reports Server (NTRS)
Karoly, Kis; Taylor, Patrick T.; Geza, Wittmann
2014-01-01
We computed magnetic field gradients at satellite altitude over Europe, with emphasis on the Kursk Magnetic Anomaly (KMA). They were calculated using the CHAMP satellite total magnetic anomalies. Our computations were done to determine how the magnetic anomaly data from the new ESA Swarm satellites could be utilized to determine the structure of the magnetization of the Earth's crust, especially in the region of the KMA; the ten years of CHAMP data could be used to simulate the Swarm data. An initial east magnetic anomaly gradient map of Europe was computed, and subsequently the north, east and vertical magnetic gradients for the KMA region were calculated. The vertical gradient of the KMA was determined using Hilbert transforms. Inversion of the total KMA was performed using Simplex and Simulated Annealing algorithms. Our resulting inversion depth model is a horizontal quadrangle with upper 300-329 km and lower 331-339 km boundaries.
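Simulated annealing, one of the two inversion algorithms named above, is compact enough to sketch generically: propose random perturbations of the model parameters, always accept improvements, and accept degradations with a Boltzmann probability that shrinks as the temperature cools. The toy objective below merely stands in for a magnetic-data misfit; it is not the authors' inversion code.

```python
import math, random

random.seed(0)

def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995, n_iter=5000):
    """Generic simulated-annealing minimizer."""
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_iter):
        cand = [xi + random.gauss(0.0, step) for xi in x]
        fc = cost(cand)
        # Always accept downhill moves; accept uphill ones with
        # probability exp(-(fc - fx) / t).
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# Toy misfit over two hypothetical parameters (e.g. upper and lower
# boundary depths, km); the true minimum is placed at (310, 335).
cost = lambda p: (p[0] - 310.0)**2 + (p[1] - 335.0)**2
print(simulated_annealing(cost, [250.0, 400.0], step=2.0))
```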
NASA Astrophysics Data System (ADS)
Haverd, V.; Smith, B.; Nieradzik, L. P.; Briggs, P. R.
2014-02-01
Poorly constrained rates of biomass turnover are a key limitation of Earth system models (ESM). In light of this, we recently proposed a new approach encoded in a model called Populations-Order-Physiology (POP), for the simulation of woody ecosystem stand dynamics, demography and disturbance-mediated heterogeneity. POP is suitable for continental to global applications and designed for coupling to the terrestrial ecosystem component of any ESM. POP bridges the gap between first generation Dynamic Vegetation Models (DVMs) with simple large-area parameterisations of woody biomass (typically used in current ESMs) and complex second generation DVMs, that explicitly simulate demographic processes and landscape heterogeneity of forests. The key simplification in the POP approach, compared with second-generation DVMs, is to compute physiological processes such as assimilation at grid-scale (with CABLE or a similar land surface model), but to partition the grid-scale biomass increment among age classes defined at sub grid-scale, each subject to its own dynamics. POP was successfully demonstrated along a savanna transect in northern Australia, replicating the effects of strong rainfall and fire disturbance gradients on observed stand productivity and structure. Here, we extend the application of POP to a range of forest types around the globe, employing paired observations of stem biomass and density from forest inventory data to calibrate model parameters governing stand demography and biomass evolution. The calibrated POP model is then coupled to the CABLE land surface model and the combined model (CABLE-POP) is evaluated against leaf-stem allometry observations from forest stands ranging in age from 3 to 200 yr. Results indicate that simulated biomass pools conform well with observed allometry. We conclude that POP represents a preferable alternative to large-area parameterisations of woody biomass turnover, typically used in current ESMs.
NASA Astrophysics Data System (ADS)
Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.
2017-05-01
The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In this work, the geometric accuracy of the two most widely used nearly-global free DSMs (SRTM and ASTER) has been evaluated over the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino Alto-Adige, Northern Italy), exploiting the capabilities of this platform. These are large areas characterized by different terrain morphologies, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through the computation of standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE for performing DSM assessment on a global scale.
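NMAD, quoted above alongside the standard deviation, is the normalized median absolute deviation, NMAD = 1.4826 · median(|Δh − median(Δh)|), a robust accuracy measure that is far less sensitive to blunders than the standard deviation. A minimal sketch with invented elevation differences:

```python
import numpy as np

def nmad(dh):
    """NMAD = 1.4826 * median(|dh - median(dh)|)."""
    dh = np.asarray(dh, dtype=float)
    return 1.4826 * np.median(np.abs(dh - np.median(dh)))

# Hypothetical DSM-minus-reference differences in metres, one outlier:
dh = np.array([-2.1, 0.4, 1.3, -0.8, 2.2, 35.0, -1.7, 0.9])
print(f"std  = {dh.std(ddof=1):.1f} m")   # inflated by the 35 m blunder
print(f"NMAD = {nmad(dh):.1f} m")         # robust estimate, ~2 m
```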
NASA Technical Reports Server (NTRS)
Weaver, W. L.; Green, R. N.
1980-01-01
A study was performed on the use of geometric shape factors to estimate earth-emitted flux densities from radiation measurements with wide field-of-view flat-plate radiometers on satellites. Sets of simulated irradiance measurements were computed for unrestricted and restricted field-of-view detectors. In these simulations, the earth radiation field was modeled using data from Nimbus 2 and 3. Geometric shape factors were derived and applied to these data to estimate flux densities on global and zonal scales. For measurements at a satellite altitude of 600 km, estimates of zonal flux density were in error by 1.0 to 1.2%, and global flux density errors were less than 0.2%. Estimates with unrestricted field-of-view detectors were about the same for Lambertian and non-Lambertian radiation models, but were affected by satellite altitude. The opposite was found for the restricted field-of-view detectors.
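In the simplest configuration, a small nadir-facing flat plate above a spherical, uniformly emitting Earth, the geometric shape factor is F = (R/(R+h))², and the emitted flux density is estimated by dividing the measured irradiance by F. The sketch below uses that textbook form with the study's 600 km altitude; the irradiance value is hypothetical, and the actual study used more elaborate shape factors and radiation models.

```python
R_EARTH = 6371.0   # km
ALT = 600.0        # km, satellite altitude from the study

def shape_factor(h, r=R_EARTH):
    """Shape factor of a nadir-facing flat plate over a uniform sphere."""
    return (r / (r + h))**2

irradiance = 200.0   # W m^-2 at the detector (hypothetical reading)
F = shape_factor(ALT)
print(f"F = {F:.3f}")                                  # ~0.835
print(f"estimated flux density = {irradiance / F:.1f} W m^-2")
```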
Particle-In-Cell simulations of high pressure plasmas using graphics processing units
NASA Astrophysics Data System (ADS)
Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter
2009-10-01
Particle-In-Cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas; plasmas at very low gas pressures, in particular, are studied using PIC methods. The inherent drawback of these methods is that they are very time-consuming, because certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high-pressure plasmas, due to the very high collision rates. Such simulations take a very long time to run on standard computers and require the help of computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high-pressure plasmas using the benefits of GPU programming.
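For orientation, the per-step kernels that dominate a PIC code, and that map naturally onto GPU threads, are charge deposition, the field solve, and the field gather. A minimal 1D electrostatic version with linear (cloud-in-cell) weighting is sketched below; it is a generic textbook fragment, not the authors' GPU implementation, and collisions (the expensive part at high pressure) are omitted.

```python
import numpy as np

ng, L = 64, 1.0
dx = L / ng
rng = np.random.default_rng(4)
x = rng.uniform(0, L, 10000)      # particle positions
q = -1.0 / x.size                 # equal macro-particle charges

# Deposit: split each charge between its two nearest grid points (CIC).
idx = (x / dx).astype(int)
frac = x / dx - idx
rho = np.zeros(ng)
np.add.at(rho, idx % ng, q * (1 - frac) / dx)
np.add.at(rho, (idx + 1) % ng, q * frac / dx)

# Periodic Poisson solve via FFT: phi_k = rho_k / k^2 (eps0 = 1); the
# zeroed k = 0 mode implies a uniform neutralizing background.
k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
rho_k = np.fft.fft(rho)
phi_k = np.zeros_like(rho_k)
phi_k[1:] = rho_k[1:] / k[1:]**2
E = -np.real(np.fft.ifft(1j * k * phi_k))   # E = -dphi/dx

# Gather: interpolate E back to the particles with the same weights.
E_p = E[idx % ng] * (1 - frac) + E[(idx + 1) % ng] * frac
print(E_p[:5])
```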
Computational techniques for flows with finite-rate condensation
NASA Technical Reports Server (NTRS)
Candler, Graham V.
1993-01-01
A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.
A central compact object in Kes 79: the hypercritical regime and neutrino expectation
NASA Astrophysics Data System (ADS)
Bernal, C. G.; Fraija, N.
2016-11-01
We present magnetohydrodynamical simulations of strong accretion on to magnetized proto-neutron stars for the Kesteven 79 (Kes 79) scenario. The supernova remnant Kes 79, observed with the Chandra ACIS-I instrument for approximately 8.3 h, is located in the constellation Aquila at a distance of 7.1 kpc in the galactic plane. It is a galactic and very young object, with an estimated age of 6 kyr. The Chandra image has revealed, for the first time, a point-like source at the centre of the remnant. The Kes 79 compact remnant belongs to a special class of objects, the so-called central compact objects (CCOs), which exhibit no evidence for a surrounding pulsar wind nebula. In this work, we show that the submergence of the magnetic field during the hypercritical phase can explain such behaviour for Kes 79 and other CCOs. The simulations of this regime were carried out with the adaptive-mesh-refinement code FLASH in two spatial dimensions, including radiative loss by neutrinos and an adequate equation of state for the regime. From the simulations, we estimate that the number of thermal neutrinos expected in the Hyper-Kamiokande experiment is 733 ± 364. In addition, we compute the flavour ratio on Earth for a progenitor model.
Discovery and dynamical characterization of the Amor-class asteroid 2012 XH16
NASA Astrophysics Data System (ADS)
Wlodarczyk, I.; Cernis, K.; Boyle, R. P.; Laugalys, V.
2014-03-01
The near-Earth asteroid belt is continuously replenished with material originally moving in Amor-class orbits. Here, the orbit of the dynamically interesting Amor-class asteroid 2012 XH16 is analysed. This asteroid was discovered with the Vatican Advanced Technology Telescope (VATT) at the Mt Graham International Observatory as part of an ongoing asteroid survey focused on astrometry and photometry. The orbit of the asteroid was computed using 66 observations (57 obtained with VATT and 9 from the Lunar and Planetary Laboratory-Spacewatch II project) to give a = 1.63 au, e = 0.36, i = 3.76°. The absolute magnitude of the asteroid is 22.3, which translates into a diameter in the range 104-231 m, assuming the average albedos of S-type and C-type asteroids, respectively. We have used the current orbit to study the future dynamical evolution of the asteroid under the perturbations of the planets and the Moon, relativistic effects, and the Yarkovsky force. Asteroid 2012 XH16 is locked close to the strong 1:2 mean motion resonance with the Earth. The object shows stable evolution and could survive in near-resonance for a relatively long period of time despite experiencing frequent close encounters with Mars. Moreover, the results of our computations show that asteroid 2012 XH16 can survive in the Amor region for at most about 200-400 Myr. The evolution is highly chaotic, with a characteristic Lyapunov time of 245 yr. Jupiter is the main perturber, but the effects of Saturn, Mars and the Earth-Moon system are also important. In particular, secular resonances with Saturn are significant.
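The quoted size range follows from the standard relation D [km] = 1329/√p_V · 10^(−H/5). With H = 22.3 and representative albedos of about 0.20 (S-type) and 0.04 (C-type) (assumed here; the abstract does not state the exact values used), the formula reproduces roughly the quoted 104-231 m range:

```python
import math

def diameter_km(H, albedo):
    """Standard asteroid size relation: D = 1329 / sqrt(p_V) * 10**(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10.0**(-H / 5.0)

H = 22.3
for label, p in (("S-type", 0.20), ("C-type", 0.04)):
    print(f"{label} (p_V = {p}): D = {diameter_km(H, p) * 1000:.0f} m")
# -> about 103 m and 230 m, matching the 104-231 m range quoted above
```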
Aerobraking strategies for the sample of comet coma earth return mission
NASA Technical Reports Server (NTRS)
Abe, Takashi; Kawaguchi, Jun'ichiro; Uesugi, Kuninori; Yen, Chen-Wan L.
1990-01-01
The results of a study to validate the applicability of the aerobraking concept to the SOCCER (sample of comet coma earth return) mission, using a six-DOF computer simulation of the aerobraking process, are presented. The SOCCER spacecraft and the aerobraking scenario and power supply problem are briefly described. Results are presented for the spin effect, the payload exposure problem, and the sun angle effect.
One of My Favorite Assignments: Automated Teller Machine Simulation.
ERIC Educational Resources Information Center
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
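A minimal sketch of the kind of program the assignment calls for, a menu loop over an account balance with simple validation, might look like this (the opening balance and menu are invented for illustration):

```python
def atm():
    balance = 100.0   # hypothetical opening balance
    while True:
        choice = input("(b)alance, (d)eposit, (w)ithdraw, (q)uit: ").strip()
        if choice == "b":
            print(f"balance: {balance:.2f}")
        elif choice == "d":
            balance += float(input("amount: "))
        elif choice == "w":
            amount = float(input("amount: "))
            if amount > balance:
                print("insufficient funds")
            else:
                balance -= amount
        elif choice == "q":
            break

if __name__ == "__main__":
    atm()
```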
Mazilu, I; Mazilu, D A; Melkerson, R E; Hall-Mejia, E; Beck, G J; Nshimyumukiza, S; da Fonseca, Carlos M
2016-03-01
We present exact and approximate results for a class of cooperative sequential adsorption models using matrix theory, mean-field theory, and computer simulations. We validate our models with two customized experiments using ionically self-assembled nanoparticles on glass slides. We also address the limitations of our models and their range of applicability. The exact results obtained using matrix theory can be applied to a variety of two-state systems with cooperative effects.
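A cooperative sequential adsorption rule is straightforward to simulate: sites fill one at a time, with a sticking probability that grows with the number of occupied neighbours. The 1D toy below illustrates the class of models; the rates are invented and do not correspond to the paper's matrix-theory systems or the nanoparticle experiments.

```python
import random

random.seed(3)

def csa_coverage(n_sites, base=0.1, boost=0.3, attempts=20000):
    """Toy 1D cooperative sequential adsorption on a ring: the sticking
    probability of an empty site rises with each occupied neighbour."""
    lattice = [0] * n_sites
    for _ in range(attempts):
        i = random.randrange(n_sites)
        if lattice[i]:
            continue
        nbrs = lattice[(i - 1) % n_sites] + lattice[(i + 1) % n_sites]
        if random.random() < min(1.0, base + boost * nbrs):
            lattice[i] = 1
    return sum(lattice) / n_sites

print(f"coverage = {csa_coverage(1000):.2f}")
```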
Effect of computer game playing on baseline laparoscopic simulator skills.
Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd
2013-08-01
Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Their performance on the simulator was then analyzed for association with their computer game experience. The setting was a local high school in Norway; forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.
NASA Astrophysics Data System (ADS)
Nakagawa, T.; Tajika, E.; Kadoya, S.
2017-12-01
In discussions over the last few decades of the impact of the evolution and dynamics of the Earth's deep interior on surface climate change (see review by Ehlmann et al., 2016), mantle volatile (particularly carbon) degassing at the mid-oceanic ridges seems to play a key role in understanding the evolutionary climate track of Earth-like planets (e.g. Kadoya and Tajika, 2015). However, since mantle degassing occurs not only at the mid-oceanic ridges but also in the wedge mantle (island-arc volcanism) and at hotspots, to incorporate a more accurate estimate of the mantle degassing flux into the climate evolution framework, we developed a coupled model of surface climate and deep-Earth evolution in numerical mantle convection simulations, combining a more accurate deep water and carbon cycle (e.g. Nakagawa and Spiegelman, 2017) with an energy-balance theory of climate change. Modeling results suggest that the evolution of the planetary climate computed from the developed model is basically consistent with the evolutionary climate track of a simplified mantle degassing model (Kadoya and Tajika, 2015), but the timing of global (snowball) glaciation is strongly dependent on the mantle degassing rate associated with the activity of surface plate motions. With this implication, surface plate motion driven by deep mantle dynamics would play an important role in the planetary habitability of the Earth and Earth-like planets over geologic timescales.
Relativistic interpretation of Newtonian simulations for cosmic structure formation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fidler, Christian; Tram, Thomas; Crittenden, Robert
2016-09-01
The standard numerical tools for studying non-linear collapse of matter are Newtonian N-body simulations. Previous work has shown that these simulations are in accordance with General Relativity (GR) up to first order in perturbation theory, provided that the effects from radiation can be neglected. In this paper we show that the present day matter density receives more than 1% corrections from radiation on large scales if Newtonian simulations are initialised before z = 50. We provide a relativistic framework in which unmodified Newtonian simulations are compatible with linear GR even in the presence of radiation. Our idea is to use GR perturbation theory to keep track of the evolution of relativistic species and the relativistic space-time consistent with the Newtonian trajectories computed in N-body simulations. If metric potentials are sufficiently small, they can be computed using a first-order Einstein–Boltzmann code such as CLASS. We make this idea rigorous by defining a class of GR gauges, the Newtonian motion gauges, which are defined such that matter particles follow Newtonian trajectories. We construct a simple example of a relativistic space-time within which unmodified Newtonian simulations can be interpreted.
NASA Technical Reports Server (NTRS)
1974-01-01
Observations and research progress of the Smithsonian Astrophysical Observatory are reported. Satellite tracking networks (ground stations) are discussed and the equipment (Baker-Nunn cameras) used to observe the satellites is described. The improvement of the accuracy of a laser ranging system at the ground stations is discussed, as are research efforts in satellite geodesy (tides, gravity anomalies, plate tectonics). The use of data processing for geophysical data is examined, and a data base for the Earth and Ocean Physics Applications Program is proposed. Analytical models of the earth's motion (computerized simulation) are described, and the computation (numerical integration and algorithms) of satellite orbits affected by the earth's albedo, using computer techniques, is also considered. Research efforts in the study of the atmosphere are examined (the effect of drag on satellite motion), and models of the atmosphere based on satellite data are described.
NASA Technical Reports Server (NTRS)
Clarke, R.; Lintereur, L.; Bahm, C.
2016-01-01
A desire for more complete documentation of the National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC), Edwards, California legacy code used in the core simulation has led to this effort to fully document the oblate Earth six-degree-of-freedom equations of motion and integration algorithm. The authors of this report have taken much of the earlier work of the simulation engineering group and used it as a jumping-off point for this report. The largest addition this report makes is that each element of the equations of motion is traced back to first principles and at no point is the reader forced to take an equation on faith alone. There are no discoveries of previously unknown principles contained in this report; this report is a collection and presentation of textbook principles. The value of this report is that those textbook principles are herein documented in standard nomenclature that matches the form of the computer code DERIVC. Previous handwritten notes form much of the backbone of this work; however, in almost every area, derivations are explicitly shown to assure the reader that the equations which make up the oblate Earth version of the computer routine, DERIVC, are correct.
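Since the report traces each element of the equations of motion to first principles, a minimal illustration of one such ingredient may help: the quaternion attitude kinematics used in six-degree-of-freedom simulations of this kind. The Python sketch below is a generic textbook implementation, not the DERIVC routine itself; the state convention q = [w, x, y, z] and body rates [p, q, r] are our assumptions.

    import numpy as np

    def quat_derivative(q, omega_body):
        # Quaternion kinematics q_dot = 0.5 * Omega(omega) * q, with
        # q = [w, x, y, z] and body-axis angular rates [p, q, r].
        p, qr, r = omega_body
        omega_mat = np.array([[0.0, -p, -qr, -r],
                              [p, 0.0, r, -qr],
                              [qr, -r, 0.0, p],
                              [r, qr, -p, 0.0]])
        return 0.5 * omega_mat @ q

    def integrate_attitude(q, omega_body, dt):
        # Classical fourth-order Runge-Kutta step, followed by a
        # renormalization that keeps the quaternion on the unit sphere.
        k1 = quat_derivative(q, omega_body)
        k2 = quat_derivative(q + 0.5 * dt * k1, omega_body)
        k3 = quat_derivative(q + 0.5 * dt * k2, omega_body)
        k4 = quat_derivative(q + dt * k3, omega_body)
        q_new = q + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        return q_new / np.linalg.norm(q_new)

    q = np.array([1.0, 0.0, 0.0, 0.0])            # identity attitude
    print(integrate_attitude(q, [0.1, 0.0, 0.0], dt=0.01))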
NASA Astrophysics Data System (ADS)
Chuvashov, I. N.
2010-12-01
The features of high-precision numerical simulation of Earth satellite motion using parallel computing are discussed, using as an example the software complex "Numerical model of the motion of satellite systems" implemented on the "Skiff Cyberia" cluster. It is shown that the use of a 128-bit word length makes it possible to account for weak perturbations from high-order harmonics in the expansion of the geopotential, as well as the strain-induced geopotential harmonics arising from the combination of tidal perturbations associated with the influence of the Moon and Sun on the solid Earth and its oceans.
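For readers unfamiliar with extended-precision arithmetic of the kind described, the mpmath Python library offers a simple way to experiment. This sketch, with 34 significant digits roughly corresponding to a 128-bit (quadruple-precision) word, is our own illustration and not part of the software complex discussed above.

    from mpmath import mp, mpf

    mp.dps = 34                      # ~ quad-precision (128-bit) word length
    GM = mpf("398600.4418")          # Earth's GM [km^3 s^-2]

    def accel(r):
        # Point-mass term of the geopotential; in practice the extra
        # digits matter for the weak high-order harmonic and tidal terms.
        d = mp.sqrt(r[0]**2 + r[1]**2 + r[2]**2)
        return [-GM * ri / d**3 for ri in r]

    print(accel([mpf(7000), mpf(0), mpf(0)]))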
Iwasaki, T; Sato, H; Suga, H; Takemoto, Y; Inada, E; Saitoh, I; Kakuno, K; Kanomi, R; Yamasaki, Y
2017-05-01
To examine the influence of negative pressure of the pharyngeal airway on mandibular retraction during inspiration in children with nasal obstruction using the computational fluid dynamics (CFD) method. Sixty-two children were divided into Classes I, II (mandibular retrusion) and III (mandibular protrusion) malocclusion groups. Cone-beam computed tomography data were used to reconstruct three-dimensional shapes of the nasal and pharyngeal airways. Airflow pressure was simulated using CFD to calculate nasal resistance and pharyngeal airway pressure during inspiration and expiration. Nasal resistance of the Class II group was significantly higher than that of the other two groups, and oropharyngeal airway inspiration pressure in the Class II (-247.64 Pa) group was larger than that in the Class I (-43.51 Pa) and Class III (-31.81 Pa) groups (P<.001). The oropharyngeal airway inspiration-expiration pressure difference in the Class II (-27.38 Pa) group was larger than that in the Class I (-5.17 Pa) and Class III (0.68 Pa) groups (P=.006). Large negative inspiratory pharyngeal airway pressure due to nasal obstruction in children with Class II malocclusion may be related to their retrognathia. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Simulating electron energy loss spectroscopy with the MNPBEM toolbox
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich
2014-03-01
Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (García de Abajo, 2010). We introduce two classes, eelsret and eelsstat, that allow, in combination with our recently developed MNPBEM toolbox, for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence.
Catalogue identifier: AEKJ_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 38886
No. of bytes in distributed program, including test data, etc.: 1222650
Distribution format: tar.gz
Programming language: Matlab 7.11.0 (R2010b)
Computer: Any which supports Matlab 7.11.0 (R2010b)
Operating system: Any which supports Matlab 7.11.0 (R2010b)
RAM: ≥1 GB
Classification: 18
Catalogue identifier of previous version: AEKJ_v1_0
Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370
External routines: MESH2D, available at www.mathworks.com
Does the new version supersede the previous version?: Yes
Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles.
Solution method: Boundary element method using electromagnetic potentials.
Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies.
Summary of revisions: New classes "eelsstat" and "eelsret" for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected.
Running time: Depending on surface discretization, between seconds and hours.
NASA Astrophysics Data System (ADS)
Schuh, Terance; Li, Yutong; Elghossain, Geena; Wiita, Paul J.
2018-06-01
We have computed a suite of simulations of propagating three-dimensional relativistic jets, covering substantial ranges of initial jet Lorentz factors and ratios of jet density to external medium density. These allow us to categorize the respective AGN into Fanaroff-Riley class I (jet-dominated) and FR class II (lobe-dominated) based upon the stability and morphology of the simulations. We used the Athena code to produce a substantial collection of large 3D variations of jets, many of which propagate stably and quickly for over 100 jet radii, but others of which eventually go unstable and fill slowly advancing lobes. Most of these simulations have jet-to-ambient-medium densities between 0.005 and 0.5 and velocities between 0.90c and 0.995c. Comparing the times when some jets go unstable against these initial parameters allows us to find a threshold where radio-loud AGNs transition from class II to class I. With these high-resolution, fully 3D relativistic simulations we can represent the jets more accurately and thus improve upon and refine earlier results that were based on 2D simulations.
Conservative parallel simulation of priority class queueing networks
NASA Technical Reports Server (NTRS)
Nicol, David
1992-01-01
A conservative synchronization protocol is described for the parallel simulation of queueing networks having C job priority classes, where a job's class is fixed. This problem has long vexed designers of conservative synchronization protocols because of its seemingly poor ability to compute lookahead: the time of the next departure. This is because a job in service having low priority can be preempted at any time by an arrival having higher priority and an arbitrarily small service time. The solution is to skew the event generation activity so that the events for higher priority jobs are generated farther ahead in simulated time than lower priority jobs. Thus, when a lower priority job enters service for the first time, all the higher priority jobs that may preempt it are already known and the job's departure time can be exactly predicted. Finally, the protocol was analyzed and it was demonstrated that good performance can be expected on the simulation of large queueing networks.
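The skewing idea can be made concrete with a toy sketch (ours, with invented names and numbers, not the protocol's actual implementation): event generation for each priority class runs ahead of simulated time by a class-dependent horizon, so a job's departure can be committed only when every class that might preempt it has already been generated past the job's service start.

    def skewed_horizons(num_classes, base, skew):
        # Class 0 is highest priority and is generated farthest ahead.
        return {c: base + (num_classes - 1 - c) * skew
                for c in range(num_classes)}

    def departure_predictable(job_class, service_start, horizons):
        # Commit the departure only if all higher-priority arrivals that
        # could preempt this job are already known at its start time.
        return all(horizons[c] >= service_start for c in range(job_class))

    h = skewed_horizons(num_classes=3, base=10.0, skew=5.0)
    print(h)                                   # {0: 20.0, 1: 15.0, 2: 10.0}
    print(departure_predictable(2, 12.0, h))   # True: classes 0, 1 known past t=12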
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Hodges, Dewey H.
1990-01-01
A regular perturbation analysis is presented. Closed-loop simulations were performed with a first order correction including all of the atmospheric terms. In addition, a method was developed for independently checking the accuracy of the analysis and the rather extensive programming required to implement the complete first order correction with all of the aerodynamic effects included. This amounted to developing an equivalent Hamiltonian computed from the first order analysis. A second order correction was also completed for the neglected spherical Earth and back-pressure effects. Finally, an analysis was begun on a method for dealing with control inequality constraints. The results on including higher order corrections do show some improvement for this application; however, it is not known at this stage if significant improvement will result when the aerodynamic forces are included. The weak formulation for solving optimal problems was extended in order to account for state inequality constraints. The formulation was tested on three example problems and numerical results were compared to the exact solutions. Development of a general purpose computational environment for the solution of a large class of optimal control problems is under way. An example, along with the necessary input and the output, is given.
Towards a standardized method to assess straylight in earth observing optical instruments
NASA Astrophysics Data System (ADS)
Caron, J.; Taccola, M.; Bézy, J.-L.
2017-09-01
Straylight is a spurious effect that can seriously degrade the radiometric accuracy achieved by Earth observing optical instruments, as a result of the high contrast in the observed Earth radiance scenes and spectra. It is considered critical for several ESA missions such as Sentinel-5, FLEX and potential successors to CarbonSat. Although it is traditionally evaluated by Monte-Carlo simulations performed with commercial software packages (e.g. ASAP, Zemax, LightTools), semi-analytical approximate methods [1,2] have drawn some interest in recent years due to their faster computing time and the greater insight they provide into straylight mechanisms. They cannot replace numerical simulations, but may be more advantageous in contexts where many iterations are needed, for instance during the early phases of an instrument design.
TOP500 Supercomputers for June 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack
2003-06-23
21st Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; and BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.
Computation and Validation of the Dynamic Response Index (DRI)
2013-08-06
[Briefing-slide fragments; recoverable details: the DRI computation tool is built on the matplotlib plotting library, is executed from the command line with several optional arguments, and runs on Windows, Linux, UNIX, and Mac OS X; response-versus-time plots use a triangular pulse input with given time duration and peak acceleration; the EARTH code section concerns error assessment; an ARC-provided electrothermal battery model example compares test and simulation data for terminal voltage (EARTH input parameters). Approved for public release.]
Numerical Study of Solar Storms from the Sun to Earth
NASA Astrophysics Data System (ADS)
Feng, Xueshang; Jiang, Chaowei; Zhou, Yufen
2017-04-01
As solar storms sweep past the Earth, adverse changes occur in the geospace environment. How humans can mitigate and avoid the destructive damage caused by solar storms has become an important frontier issue of the high-tech era. Understanding the dynamic processes of solar storm propagation through interplanetary space is of scientific significance, and conducting physics-based numerical studies of the three-dimensional behavior of solar storms in interplanetary space, with the aid of powerful computing capacity, has practical value for predicting the arrival times, intensities, and probable geoeffectiveness of solar storms at the Earth. So far, numerical studies based on magnetohydrodynamics (MHD) have gone through the transition from initial qualitative principle researches to systematic quantitative studies of concrete events and numerical predictions. The numerical modeling community has a common goal to develop an end-to-end physics-based modeling system for forecasting the Sun-Earth relationship. The transition of these models to operational use depends on the availability of computational resources at reasonable cost, and the models' prediction capabilities may be improved by incorporating observational findings and constraints into the physics-based models, combining observations, empirical models and MHD simulations in organic ways. In this talk, we focus on our recent progress in using solar observations to produce realistic magnetic configurations of CMEs as they leave the Sun and in coupling data-driven simulations of CMEs to heliospheric simulations that then propagate the CME configuration to 1 AU, and we outline the important numerical issues and their possible solutions in numerical space weather modeling from the Sun to Earth for future research.
Performance issues for domain-oriented time-driven distributed simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1987-01-01
It has long been recognized that simulations form an interesting and important class of computations that may benefit from distributed or parallel processing. Since the point of parallel processing is improved performance, the recent proliferation of multiprocessors requires that we consider the performance issues that naturally arise when attempting to implement a distributed simulation. Three such issues are: (1) the problem of mapping the simulation onto the architecture, (2) the possibilities for performing redundant computation in order to reduce communication, and (3) the avoidance of deadlock due to distributed contention for message-buffer space. These issues are discussed in the context of a battlefield simulation implemented on a medium-scale multiprocessor message-passing architecture.
Algorithms for radiative transfer simulations for aerosol retrieval
NASA Astrophysics Data System (ADS)
Mukai, Sonoyo; Sano, Itaru; Nakata, Makiko
2012-11-01
Aerosol retrieval from satellite data, i.e. aerosol remote sensing, divides into three parts: satellite data analysis, aerosol modeling, and calculation of multiple light scattering in the atmosphere model, known as radiative transfer simulation. The aerosol model is compiled from more than ten years of accumulated measurements provided by the worldwide aerosol monitoring network (AERONET). The radiative transfer simulations take into account Rayleigh scattering by molecules, Mie scattering by aerosols in the atmosphere, and reflection by the Earth's surface. Aerosol properties are thus estimated by comparing satellite measurements with the numerical values of radiation simulations in the Earth-atmosphere-surface model. Precise simulation of multiple light-scattering processes is necessary, but it needs a long computational time, especially in an optically thick atmosphere model. Efficient algorithms for radiative transfer problems are therefore indispensable for retrieving aerosols from space.
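Two standard ingredients of such simulations can be sketched compactly: the Rayleigh phase function for molecular scattering and the Henyey-Greenstein phase function, often used as a convenient stand-in for aerosol (Mie) scattering. This Python sketch is illustrative only; the asymmetry parameter value is an assumed example, not one used by the authors.

    import numpy as np

    def rayleigh_phase(cos_theta):
        # Rayleigh phase function, normalized over solid angle.
        return 3.0 / (16.0 * np.pi) * (1.0 + cos_theta**2)

    def henyey_greenstein(cos_theta, g=0.7):
        # Henyey-Greenstein phase function with asymmetry parameter g.
        return (1.0 - g**2) / (4.0 * np.pi *
                               (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)

    mu = np.cos(np.radians(30.0))
    print(rayleigh_phase(mu), henyey_greenstein(mu))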
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
Education Calls for a New Philosophy.
ERIC Educational Resources Information Center
Scheidlinger, Zygmunt
1999-01-01
Highlights changes brought on by computers and technological advancement and notes that only those with a vision of the future can direct and participate in the evolution of education. Suggests that virtual reality, simulation, animation and other computer-based features will render traditional class learning futile and that computerized education…
Multi-scale simulations of space problems with iPIC3D
NASA Astrophysics Data System (ADS)
Lapenta, Giovanni; Bettarini, Lapo; Markidis, Stefano
The implicit Particle-in-Cell method for the computer simulation of space plasma, and its implementation in a three-dimensional parallel code, called iPIC3D, are presented. The implicit integration in time of the Vlasov-Maxwell system removes the numerical stability constraints and enables kinetic plasma simulations at magnetohydrodynamics scales. Simulations of magnetic reconnection in plasma are presented to show the effectiveness of the algorithm. In particular we will show a number of simulations done for large scale 3D systems using the physical mass ratio for Hydrogen. Most notably, one simulation treats kinetically a box of tens of Earth radii in each direction and was conducted using about 16000 processors of the Pleiades NASA computer. The work is conducted in collaboration with the MMS-IDS theory team from the University of Colorado (M. Goldman, D. Newman and L. Andersson). Reference: Stefano Markidis, Giovanni Lapenta, Rizwan-uddin, "Multi-scale simulations of plasma with iPIC3D", Mathematics and Computers in Simulation, available online 17 October 2009, http://dx.doi.org/10.1016/j.matcom.2009.08.038
Come In Spaceship Earth. Kids as Crew Members. Peace Works Series.
ERIC Educational Resources Information Center
Schmidt, Fran; Friedman, Alice
This program, for grades 4 through 12, introduces students to the concepts that result in cooperative work for the survival and improvement of the quality of life of the human family. In addition to the teacher's guide presented here, the program comes with a music video recorded in seven languages, reproducible pages, a class simulation game, and…
A real-time digital computer program for the simulation of automatic spacecraft reentries
NASA Technical Reports Server (NTRS)
Kaylor, J. T.; Powell, L. F.; Powell, R. W.
1977-01-01
The automatic reentry flight dynamics simulator, a nonlinear, six-degree-of-freedom simulation, digital computer program, has been developed. The program includes a rotating, oblate earth model for accurate navigation calculations and contains adjustable gains on the aerodynamic stability and control parameters. This program uses a real-time simulation system and is designed to examine entries of vehicles which have constant mass properties whose attitudes are controlled by both aerodynamic surfaces and reaction control thrusters, and which have automatic guidance and control systems. The program has been used to study the space shuttle orbiter entry. This report includes descriptions of the equations of motion used, the control and guidance schemes that were implemented, the program flow and operation, and the hardware involved.
NASA Astrophysics Data System (ADS)
Laurie, J.; Bouchet, F.
2012-04-01
Many turbulent flows undergo sporadic random transitions after long periods of apparent statistical stationarity. For instance, paths of the Kuroshio [1], the Earth's magnetic field reversal, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], and 3D flows [6] show this kind of behavior. The understanding of this phenomenon is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is however a key issue for many geophysical problems. A straightforward study of these transitions, through a direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved for genuine turbulent flows, and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulent models with stochastic forces. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates, and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the type of stochastic forces that allows this diffusion, the transition between attractors is not a rare event, and it is then very difficult to characterize the flow as bistable. For another class of stochastic forces, however, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed. We discuss how these results are probably connected to the long-debated existence of multi-stability in the atmosphere and oceans.
Virtual reality simulation in neurosurgery: technologies and evolution.
Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H
2013-01-01
Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.
Inductive System Health Monitoring
NASA Technical Reports Server (NTRS)
Iverson, David L.
2004-01-01
The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real time monitoring. IMS uses nominal data sets, collected either directly from the system or from simulations, to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. IMS is able to monitor the system by comparing real time operational data with these classes. We present a description of the learning and monitoring method used by IMS and summarize some recent IMS results.
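A minimal sketch of this style of monitoring may be useful; it is our own illustration rather than the IMS implementation (IMS uses its own clustering and class-extraction algorithms). Here, archived nominal sensor vectors are clustered, and a live sample is flagged when its distance to the nearest nominal class exceeds an empirical threshold.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    nominal = rng.normal(0.0, 1.0, size=(5000, 4))   # archived nominal data
    model = KMeans(n_clusters=20, n_init=10, random_state=0).fit(nominal)

    def anomaly_score(x):
        # Distance from sample x to the closest nominal cluster centre.
        d = np.linalg.norm(model.cluster_centers_ - x, axis=1)
        return d.min()

    # Empirical cutoff: 99th percentile of scores on held-out nominal data.
    threshold = np.quantile([anomaly_score(v) for v in nominal[:500]], 0.99)
    print(anomaly_score(np.full(4, 6.0)) > threshold)  # clear outlier -> True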
Dynamical Correlation In Some Liquid Alkaline Earth Metals Near Melting
NASA Astrophysics Data System (ADS)
Thakore, B. Y.; Suthar, P. H.; Khambholja, S. G.; Gajjar, P. N.; Jani, A. R.
2010-12-01
A study of the dynamical variables velocity autocorrelation function (VACF) and power spectrum of liquid alkaline earth metals (Ca, Sr, and Ba) is presented, based on the static harmonic well approximation. The effective interatomic potential for the liquid metals is computed using our well recognized model potential with the exchange correlation functions due to Hartree, Taylor, Ichimaru and Utsumi, Farid et al. and Sarkar et al. It is observed that the VACF computed using Sarkar et al. gives good agreement with available molecular dynamics (MD) simulation results [Phys. Rev. B 62, 14818 (2000)]. The shoulder of the power spectrum depends upon the type of local field correlation function used.
NASA Astrophysics Data System (ADS)
Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.
2016-07-01
This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.
NASA Astrophysics Data System (ADS)
Badawy, B.; Fletcher, C. G.
2017-12-01
The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
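The emulate-then-rank workflow described above can be sketched with scikit-learn. Everything in this Python sketch (the toy input-output relationship, sample counts, hyperparameters) is an assumption for illustration, not the CLASS experiment itself.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(400, 6))      # 400 sampled parameter sets
    y = X[:, 0] ** 2 + 0.5 * X[:, 3] + 0.05 * rng.normal(size=400)  # toy output

    emulator = SVR(kernel="rbf", C=10.0).fit(X, y)  # cheap surrogate model
    X_large = rng.uniform(0.0, 1.0, size=(100000, 6))
    y_emulated = emulator.predict(X_large)          # emulate many more cases

    # Rank parameter influence via random-forest permutation importance.
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    imp = permutation_importance(rf, X, y, n_repeats=10, random_state=0)
    print(np.argsort(imp.importances_mean)[::-1])   # most influential first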
Progress in Computational Simulation of Earthquakes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert
2006-01-01
GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. The Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations that once used a few tens of thousands of stress and displacement finite elements on workstations can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.
seismo-live: Training in Computational Seismology using Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Igel, H.; Krischer, L.; van Driel, M.; Tape, C.
2016-12-01
Practical training in computational methodologies is still underrepresented in Earth science curricula, despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, with the attendant danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) built from carefully chosen elementary ingredients of simulation technology (e.g., finite-differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume method, and the discontinuous Galerkin method. The platform already includes general Python training, an introduction to the ObsPy library for seismology, and seismic data processing and noise analysis. Submission of Jupyter notebooks on general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
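In the spirit of those training notebooks, a complete 1-D acoustic finite-difference solver fits in a few lines. The Python sketch below is our own illustration (grid, time step, and source parameters are arbitrary choices), not a notebook taken from the platform.

    import numpy as np

    nx, nt = 500, 1000
    dx, dt, c = 1.0, 0.5e-3, 1000.0   # grid spacing [m], time step [s], velocity [m/s]
    cfl = c * dt / dx                 # stability requires cfl <= 1
    assert cfl <= 1.0

    p = np.zeros(nx); p_old = np.zeros(nx)
    src = 250                         # source grid index

    for it in range(nt):
        # Second-order centred differences in space and time.
        lap = np.zeros(nx)
        lap[1:-1] = p[2:] - 2.0 * p[1:-1] + p[:-2]
        p_new = 2.0 * p - p_old + cfl**2 * lap
        # Gaussian source time function injected at one grid point.
        p_new[src] += dt**2 * np.exp(-((it * dt - 0.05) / 0.01) ** 2)
        p_old, p = p, p_new

    print(np.abs(p).max())            # amplitude of the propagated wavefield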
NASA Technical Reports Server (NTRS)
Halyo, Nesim; Choi, Sang H.; Chrisman, Dan A., Jr.; Samms, Richard W.
1987-01-01
Dynamic models and computer simulations were developed for the radiometric sensors utilized in the Earth Radiation Budget Experiment (ERBE). The models were developed to understand performance, improve measurement accuracy by updating model parameters and provide the constants needed for the count conversion algorithms. Model simulations were compared with the sensor's actual responses demonstrated in the ground and inflight calibrations. The models consider thermal and radiative exchange effects, surface specularity, spectral dependence of a filter, radiative interactions among an enclosure's nodes, partial specular and diffuse enclosure surface characteristics and steady-state and transient sensor responses. Relatively few sensor nodes were chosen for the models since there is an accuracy tradeoff between increasing the number of nodes and approximating parameters such as the sensor's size, material properties, geometry, and enclosure surface characteristics. Given that the temperature gradients within a node and between nodes are small enough, approximating with only a few nodes does not jeopardize the accuracy required to perform the parameter estimates and error analyses.
Quantum simulation from the bottom up: the case of rebits
NASA Astrophysics Data System (ADS)
Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.
2018-05-01
Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of ℝ-unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n + 1 qubits in total). These operators can be represented as the sum of a linear and an antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the ℝ-Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, non-linear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
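The flavour of the rebit encoding can be seen in a small numerical sketch (ours, a simplification of the general idea rather than the authors' construction): store a complex state's real and imaginary parts using one extra qubit, so that complex conjugation, which is antilinear on the original state, becomes an ordinary real-linear operation (Z on the ancilla) on the enlarged state.

    import numpy as np

    def encode(psi):
        # n-qubit complex amplitudes -> (n+1)-qubit real amplitudes.
        return np.concatenate([psi.real, psi.imag])

    def decode(r):
        half = r.size // 2
        return r[:half] + 1j * r[half:]

    def conjugate_via_ancilla(r):
        # Complex conjugation of the encoded state is Z on the ancilla:
        # it simply flips the sign of the imaginary-part block.
        out = r.copy()
        out[r.size // 2:] *= -1.0
        return out

    psi = np.array([1.0 + 2.0j, 3.0 - 1.0j]) / np.sqrt(15.0)
    assert np.allclose(decode(conjugate_via_ancilla(encode(psi))), psi.conj())
    print("antilinear map realized as a real-linear one")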
Augmenting Sand Simulation Environments through Subdivision and Particle Refinement
NASA Astrophysics Data System (ADS)
Clothier, M.; Bailey, M.
2012-12-01
Recent advances in computer graphics and parallel processing hardware have provided disciplines with new methods to evaluate and visualize data. These advances have proven useful for earth and planetary scientists, as many researchers use this hardware to process large amounts of data for analysis. As such, this has provided opportunities for collaboration between computer graphics and the earth sciences. Through collaboration with the Oregon Space Grant and IGERT Ecosystem Informatics programs, we are investigating techniques for simulating the behavior of sand. We are also collaborating with the Jet Propulsion Laboratory's (JPL) DARTS Lab to exchange ideas and gain feedback on our research. The DARTS Lab specializes in simulation of planetary vehicles, such as the Mars rovers; their simulations utilize a virtual "sand box" to test how a planetary vehicle responds to different environments. Our research builds upon this idea to create a sand simulation framework so that planetary environments, such as the harsh, sandy regions on Mars, are more fully realized. More specifically, we are focusing our research on the interaction between a planetary vehicle, such as a rover, and the sand beneath it, providing further insight into its performance. This can be a computationally complex problem, especially when representing the enormous quantities of sand particles interacting with each other. However, through the use of high-performance computing, we have developed a technique to subdivide areas of actively participating sand regions across a large landscape. Similar to a Level of Detail (LOD) technique, we only subdivide regions of a landscape where sand particles are actively interacting with another object. While the sand is within this subdivision window and moves closer to the surface of the interacting object, the sand region subdivides into smaller regions until individual sand particles are left at the surface. As an example, consider a planetary rover interacting with our sand simulation environment: sand that is actively interacting with a rover wheel is represented as individual particles, whereas sand further beneath the surface is represented by larger regions of sand. This technique allows many particles to be represented without prohibitive computational cost. In developing this method, we have further generalized these subdivision regions into any volumetric area suitable for use in the simulation. This is a further improvement of our method, as it allows for more compact subdivision regions and helps fine-tune the simulation so that more emphasis can be placed on regions of actively participating sand. We believe that, through this generalization, our research can provide other opportunities within the earth and planetary sciences. Through collaboration with our academic colleagues, we continue to refine our technique and look for other opportunities to utilize our research.
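A two-dimensional sketch of the subdivision idea may help (our illustration with hypothetical sizes; the described system is three-dimensional and built for high-performance hardware): any region overlapping the subdivision window around an interacting object is refined recursively until individual-particle scale is reached, while everything else stays coarse.

    def subdivide(cell, window, min_size):
        x, y, size = cell
        cx, cy, r = window                          # window centre and radius
        # Distance from the window centre to the nearest point of the cell.
        dx = max(abs(cx - (x + size / 2.0)) - size / 2.0, 0.0)
        dy = max(abs(cy - (y + size / 2.0)) - size / 2.0, 0.0)
        if dx * dx + dy * dy > r * r or size <= min_size:
            return [cell]                           # coarse region or single grain
        half = size / 2.0
        children = [(x, y, half), (x + half, y, half),
                    (x, y + half, half), (x + half, y + half, half)]
        return [leaf for c in children
                for leaf in subdivide(c, window, min_size)]

    leaves = subdivide((0.0, 0.0, 64.0), window=(10.0, 10.0, 4.0), min_size=1.0)
    print(len(leaves), min(s for _, _, s in leaves))  # finest cells near window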
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
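The comparison machinery itself is simple. The Python sketch below (ours, with fabricated stand-in trajectories rather than actual check-case data) over-plots two tools' altitude histories and difference-plots them to expose any divergence between implementations.

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0.0, 60.0, 601)
    alt_tool_a = 1000.0 + 50.0 * np.sin(0.1 * t)   # stand-in output, tool A
    alt_tool_b = alt_tool_a + 0.01 * t             # stand-in output, tool B

    fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
    ax1.plot(t, alt_tool_a, label="tool A")        # over-plot both solutions
    ax1.plot(t, alt_tool_b, "--", label="tool B")
    ax1.set_ylabel("altitude [ft]"); ax1.legend()
    ax2.plot(t, alt_tool_b - alt_tool_a)           # difference plot
    ax2.set_ylabel("B - A [ft]"); ax2.set_xlabel("time [s]")
    plt.savefig("checkcase_diff.png")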
NASA Astrophysics Data System (ADS)
Bunge, Hans-Peter
2002-08-01
Earth's mantle overturns itself about once every 200 million years (Myr). Prima facie evidence for this overturn is the motion of tectonic plates at the surface of the Earth driving the geologic activity of our planet. Supporting evidence also comes from seismic tomograms of the Earth's interior that reveal the convective currents in remarkable clarity. Much has been learned about the physics of solid state mantle convection over the past two decades, aided primarily by sophisticated computer simulations. Such simulations are reaching the threshold of fully resolving the convective system globally. In this talk we will review recent progress in mantle dynamics studies. We will then turn our attention to the fundamental question of whether it is possible to explicitly reconstruct mantle flow back in time. This is a classic problem of history matching, amenable to control theory and data assimilation. The technical advances that make such an approach feasible are dramatically increasing compute resources, represented for example by Beowulf clusters, and new observational initiatives, represented for example by the US-Array effort, which should lead to an order-of-magnitude improvement in our ability to resolve Earth structure seismically below North America. In fact, new observational constraints on deep Earth structure illustrate the growing importance of improving our data assimilation skills in deep Earth models. We will explore data assimilation through high resolution global adjoint models of mantle circulation and conclude that it is feasible to reconstruct mantle flow back in time for at least the past 100 Myr.
NASA Astrophysics Data System (ADS)
Adhikari, Surendra; Ivins, Erik R.; Larour, Eric
2016-03-01
A classical Green's function approach for computing gravitationally consistent sea-level variations associated with mass redistribution on the earth's surface, as employed in contemporary sea-level models, naturally suits spectral methods for numerical evaluation. The capability of these methods to resolve high wave number features such as small glaciers is limited by the need for large numbers of pixels and high-degree (associated Legendre) series truncation. Incorporating a spectral model into (components of) earth system models that generally operate on a mesh system also requires repetitive forward and inverse transforms. In order to overcome these limitations, we present a method that functions efficiently on an unstructured mesh, thus capturing the physics operating at kilometer scale yet capable of simulating geophysical observables that are inherently of global scale with minimal computational cost. The goal of the current version of this model is to provide high-resolution solid-earth, gravitational, sea-level and rotational responses for earth system models operating in the domain of the earth's outer fluid envelope on timescales less than about 1 century, when viscous effects can largely be ignored over most of the globe. The model has numerous important geophysical applications. For example, we present time-varying computations of global geodetic and sea-level signatures associated with recent ice-sheet changes that are derived from space gravimetry observations. We also demonstrate the capability of our model to simultaneously resolve kilometer-scale sources of the earth's time-varying surface mass transport, derived from high-resolution modeling of polar ice sheets, and predict the corresponding local and global geodetic signatures.
Computer simulation of on-orbit manned maneuvering unit operations
NASA Technical Reports Server (NTRS)
Stuart, G. M.; Garcia, K. D.
1986-01-01
Simulation of spacecraft on-orbit operations is discussed in reference to the Martin Marietta Space Operations Simulation (SOS) laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion, and since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of Earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at the SOS laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.
A Structured-Inquiry Approach to Teaching Neurophysiology Using Computer Simulation
Crisp, Kevin M.
2012-01-01
Computer simulation is a valuable tool for teaching the fundamentals of neurophysiology in undergraduate laboratories where time and equipment limitations restrict the amount of course content that can be delivered through hands-on interaction. However, students often find such exercises to be tedious and unstimulating. In an effort to engage students in the use of computational modeling while developing a deeper understanding of neurophysiology, an attempt was made to use an educational neurosimulation environment as the basis for a novel, inquiry-based research project. During the semester, students in the class wrote a research proposal, used the Neurodynamix II simulator to generate a large data set, analyzed their modeling results statistically, and presented their findings at the Midbrains Neuroscience Consortium undergraduate poster session. Learning was assessed in the form of a series of short term papers and two 10-min in-class writing responses to the open-ended question, “How do ion channels influence neuronal firing?”, which they completed on weeks 6 and 15 of the semester. Students’ answers to this question showed a deeper understanding of neuronal excitability after the project; their term papers revealed evidence of critical thinking about computational modeling and neuronal excitability. Suggestions for the adaptation of this structured-inquiry approach into shorter term lab experiences are discussed. PMID:23494064
AxiSEM3D: broadband seismic wavefields in 3-D aspherical Earth models
NASA Astrophysics Data System (ADS)
Leng, K.; Nissen-Meyer, T.; Zad, K. H.; van Driel, M.; Al-Attar, D.
2017-12-01
Seismology is the primary tool for data-informed inference of Earth structure and dynamics. Simulating seismic wave propagation at a global scale is fundamental to seismology, but remains one of the most challenging problems in scientific computing, because of both the multiscale nature of Earth's interior and the observable frequency band of seismic data. We present a novel numerical method to simulate global seismic wave propagation in realistic 3-D Earth models. Our method, named AxiSEM3D, is a hybrid of the spectral element method and the pseudospectral method. It reduces the azimuthal dimension of wavefields by means of a global Fourier series parameterization, in which the number of terms can be locally adapted to the inherent azimuthal smoothness of the wavefields. AxiSEM3D allows not only for material heterogeneities, such as velocity, density, anisotropy and attenuation, but also for finite undulations on radial discontinuities, both solid-solid and solid-fluid, and thereby a variety of aspherical Earth features such as ellipticity, topography, variable crustal thickness, and core-mantle boundary topography. Such interface undulations are equivalently interpreted as material perturbations of the contiguous media, based on the "particle relabelling transformation". Efficiency comparisons show that AxiSEM3D can be 1 to 3 orders of magnitude faster than conventional 3-D methods, with the speedup increasing with simulation frequency and decreasing with model complexity; for all realistic structures the speedup remains at least one order of magnitude. The observable frequency range of global seismic data (up to 1 Hz) has been covered for wavefield modelling in a 3-D Earth model with reasonable computing resources. We show an application of surface wave modelling within a state-of-the-art global crustal model (Crust1.0), with the synthetics compared to real data. The high-performance C++ code is released at github.com/AxiSEM3D/AxiSEM3D.
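The dimensional reduction at the heart of the method can be demonstrated in miniature (our Python sketch, not the AxiSEM3D code): a field that is smooth in azimuth is represented exactly by a handful of Fourier coefficients, so the azimuthal dimension effectively collapses.

    import numpy as np

    nphi, nkeep = 64, 8                     # azimuthal samples, kept terms
    phi = np.linspace(0.0, 2.0 * np.pi, nphi, endpoint=False)
    u = 1.0 + 0.5 * np.cos(phi) + 0.1 * np.sin(3.0 * phi)  # smooth in phi

    c = np.fft.rfft(u)                      # full spectrum (33 coefficients)
    c[nkeep:] = 0.0                         # truncate: keep low-order terms only
    u_rec = np.fft.irfft(c, n=nphi)         # reconstruct from the truncation
    print(np.abs(u - u_rec).max())          # ~1e-16: exact for smooth fields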
Risser, Dennis W.
2008-01-01
This report presents the results of a study by the U.S. Geological Survey, in cooperation with the Pennsylvania Geological Survey, to illustrate a water-budget method for mapping the spatial distribution of ground-water recharge for a 76-square-mile part of the Jordan Creek watershed, northwest of Allentown, in Lehigh County, Pennsylvania. Recharge was estimated by using the Hydrological Evaluation of Landfill Performance (HELP) water-budget model for 577 landscape units in Jordan Creek watershed, delineated on the basis of their soils, land use/land cover, and mean annual precipitation during 1951-2000. The water-budget model routes precipitation falling on each landscape unit to components of evapotranspiration, surface runoff, storage, and vertical percolation (recharge) for a five-layer soil column on a daily basis. The spatial distribution of mean annual recharge during 1951-2000 for each landscape unit was mapped by the use of a geographic information system. Recharge simulated by the water-budget model in Jordan Creek watershed during 1951-2000 averaged 12.3 inches per year and ranged by landscape unit from 0.11 to 17.05 inches per year. Mean annual recharge during 1951-2000 simulated by the water-budget model was most sensitive to changes to input values for precipitation and runoff-curve number. Mean annual recharge values for the crop, forest, pasture, and low-density urban land-use/land-cover classes were similar (11.2 to 12.2 inches per year) but were substantially less for high-density urban (6.8 inches per year), herbaceous wetlands (2.5 inches per year), and forested wetlands (1.3 inches per year). Recharge rates simulated for the crop, forest, pasture, and low-density urban land-cover classes were similar because those land-use/land-cover classes are represented in the model with parameter values that either did not significantly affect simulated recharge or tended to have offsetting effects on recharge. For example, for landscapes with forest land cover, values of runoff-curve number assigned to the model were smaller than for other land-use/land-cover classes (causing more recharge and less runoff), but the maximum depth of evapotranspiration was larger than for other land-use/ land-cover classes because of deeper root penetration in forests (causing more evapotranspiration and less recharge). The smaller simulated recharge for high-density urban and wetland land-use/land-cover classes was caused by the large values of runoff-curve number (greater than 90) assigned to those classes. The large runoff-curve number, however, certainly is not realistic for all wetlands; some wetlands act as areas of ground-water discharge and some as areas of recharge. Simulated mean annual recharge computed by the water-budget model for the 53-square-mile part of the watershed upstream from the streamflow-gaging station near Schnecksville was compared to estimates of recharge and base flow determined by analysis of streamflow records from 1967 to 2000. The mean annual recharge of 12.4 inches per year simulated by the water-budget method for 1967-2000 was less than estimates of mean annual recharge of 19.3 inches per year computed from the RORA computer program and base flow computed by the PART computer program (15.1 inches per year). In theory, the water-budget method provides a practical tool for estimating differences in recharge at local scales of interest, and the watershed- average recharge rate of 12.4 inches per year computed by the method is reasonable. 
However, the mean annual surface runoff of 4.5 inches per year simulated by the model is unrealistically small. The sum of surface runoff and recharge simulated by the water-budget model (16.9 inches per year) is 7 inches per year less than the streamflow measured at the gaging station near Schnecksville (23.9 inches per year) during 1967-2000, indicating that evapotranspiration is overestimated by the water-budget model by that amount.
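The daily routing idea behind this kind of water-budget method can be sketched in a few lines. Below is a minimal, single-layer illustration, not the five-layer HELP model itself: runoff is taken with the SCS curve-number method (the "runoff-curve number" the report refers to), evapotranspiration is drawn from soil storage, and any water above a fixed storage capacity percolates downward as recharge. All parameter values are hypothetical.

```python
import numpy as np

def scs_runoff(p_in, cn):
    """Daily surface runoff (inches) from precipitation via the SCS curve-number method."""
    s = 1000.0 / cn - 10.0          # potential maximum retention (inches)
    ia = 0.2 * s                    # initial abstraction
    return np.where(p_in > ia, (p_in - ia) ** 2 / (p_in + 0.8 * s), 0.0)

def daily_water_budget(precip, pet, cn=75.0, capacity=6.0):
    """Toy single-layer bucket model: route daily precipitation to runoff,
    evapotranspiration, soil storage, and percolation (recharge)."""
    storage, runoff, et, recharge = 0.0, [], [], []
    for p, e in zip(precip, pet):
        q = float(scs_runoff(p, cn))           # runoff leaves first
        storage += p - q
        a = min(e, storage)                    # ET limited by available water
        storage -= a
        perc = max(storage - capacity, 0.0)    # excess percolates downward
        storage -= perc
        runoff.append(q); et.append(a); recharge.append(perc)
    return np.array(runoff), np.array(et), np.array(recharge)

rng = np.random.default_rng(0)
precip = rng.exponential(0.12, 365)            # synthetic daily precipitation (inches)
pet = np.full(365, 0.08)                       # constant potential ET (inches/day)
q, a, r = daily_water_budget(precip, pet)
print(f"annual: runoff={q.sum():.1f} in, ET={a.sum():.1f} in, recharge={r.sum():.1f} in")
```

Even this toy version makes the report's sensitivity result plausible: raising the curve number shifts water from percolation to runoff directly, while most other parameters act only indirectly through storage.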
Particle-in-cell simulations of Earth-like magnetosphere during a magnetic field reversal
NASA Astrophysics Data System (ADS)
Barbosa, M. V. G.; Alves, M. V.; Vieira, L. E. A.; Schmitz, R. G.
2017-12-01
The geologic record shows that hundreds of pole reversals have occurred throughout Earth's history. The mean interval between pole reversals is roughly 200 to 300 thousand years, and the last reversal occurred around 780 thousand years ago. A pole reversal is a slow process during which the strength of the magnetic field decreases and the field becomes more complex, with more than two poles appearing for some time, before the field strength increases again with reversed polarity. Along the way, the magnetic field configuration changes, leaving an Earth-like planet vulnerable to the harmful effects of the Sun. Understanding what happens to the magnetosphere during these pole reversals is an open topic of investigation, and only recently have PIC codes been used to model magnetospheres. Here we use the particle code iPIC3D [Markidis et al., Mathematics and Computers in Simulation, 2010] to simulate an Earth-like magnetosphere at three different times along the pole-reversal process. The code was modified so that the Earth-like magnetic field is generated using an expansion in spherical harmonics with the Gauss coefficients given by an MHD simulation of the Earth's core [Glatzmaier et al., Nature, 1995; 1999; private communication to L.E.A.V.]. The simulations show the qualitative behavior of the magnetosphere, such as the current structures. Only the planetary magnetic field was changed between runs; the solar wind is the same for all runs. Preliminary results show the formation of the Chapman-Ferraro current at the front of the magnetosphere in all cases. In the run for the middle of the reversal process, with its low-intensity magnetic field and asymmetrical configuration, the current structure changes and the presence of multiple poles can be observed. In all simulations, a structure similar to the radiation belts was found. Simulations of more severe solar wind conditions are necessary to determine the real impact of the reversal on the magnetosphere.
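For reference, the Gauss-coefficient representation mentioned above reduces, at lowest order, to an axial dipole. A minimal sketch of evaluating that leading term follows; the g10 value is an IGRF-like present-day coefficient assumed here for illustration, whereas the study used coefficients from the Glatzmaier geodynamo simulation.

```python
import numpy as np

def dipole_field(r, theta, g10=-29404.8, a=6371.2):
    """Leading (n=1, m=0) term of the spherical-harmonic expansion of a
    planetary magnetic field: B_r and B_theta in nT at radius r (km) and
    colatitude theta (rad). g10 is the axial-dipole Gauss coefficient (nT)."""
    f = (a / r) ** 3
    br = 2.0 * g10 * f * np.cos(theta)
    bt = g10 * f * np.sin(theta)
    return br, bt

# Field magnitude at ~10 Earth radii above the equator, roughly the
# sub-solar magnetopause distance for the present-day field:
br, bt = dipole_field(10 * 6371.2, np.pi / 2)
print(f"|B| at 10 Re, equator: {np.hypot(br, bt):.1f} nT")
```

A reversal scenario would shrink g10 and add higher-order (multipolar) terms; in this representation that is just more coefficients, which is why the spherical-harmonic form is convenient for driving the PIC boundary condition.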
GES DISC Data Recipes in Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.
2017-12-01
The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs), which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project supports CATEES and has two primary goals: first, to transition data recipes created by the Goddard Earth Sciences Data and Information Services Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks; and second, to acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprising Jupyter Notebooks, Python library dependencies, and command-line tools, and configure it as an easy-to-deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
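A data recipe of the kind described typically walks through opening a granule, subsetting a variable, and plotting it. Here is a minimal, notebook-style sketch using the netCDF4 and matplotlib packages; the file name and variable name are hypothetical stand-ins, not an actual GES DISC product.

```python
# Minimal "data recipe" cell: open a (hypothetical) netCDF granule,
# extract one variable, and plot its time mean on a lat/lon grid.
import netCDF4 as nc
import numpy as np
import matplotlib.pyplot as plt

path = "example_granule.nc4"            # hypothetical local file name
with nc.Dataset(path) as ds:
    lat = ds.variables["lat"][:]
    lon = ds.variables["lon"][:]
    t2m = ds.variables["T2M"][:]        # hypothetical variable: 2-m air temperature

plt.pcolormesh(lon, lat, np.mean(t2m, axis=0), shading="auto")
plt.colorbar(label="mean T2M (K)")
plt.xlabel("longitude"); plt.ylabel("latitude")
plt.show()
```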
Portable radiography: a reality and necessity for ISS and explorer-class missions.
Lerner, David J; Parmet, Allen J
2015-02-01
On ISS missions and explorer-class missions, unexpected medical and surgical emergencies could be disastrous, and lack of the ability to rapidly assess and make critical decisions affects mission capability. Current imaging modalities on the ISS consist only of ultrasound, and there are many acute conditions that ultrasound alone cannot diagnose. Portable X-ray imaging (radiography) technology has advanced to the point where a briefcase-sized system is small enough, cheap enough, and accurate enough to produce diagnostic-quality images and send them wirelessly to the onboard computer and to Earth for interpretation. Although further research is warranted, portable radiography is an important addition for the ISS and future explorer-class missions while maintaining a very small footprint.
Computers with Wings: Flight Simulation and Personalized Landscapes
ERIC Educational Resources Information Center
Oss, Stefano
2005-01-01
We propose, as a special way to explore the physics of flying objects, to use a flight simulator with a personalized scenery to reproduce the territory where students live. This approach increases the participation and attention of students to physics classes but also creates several opportunities for addressing side activities and arguments of…
Key algorithms used in GR02: A computer simulation model for predicting tree and stand growth
Garrett A. Hughes; Paul E. Sendak
1985-01-01
GR02 is an individual tree, distance-independent simulation model for predicting tree and stand growth over time. It performs five major functions during each run: (1) updates diameter at breast height, (2) updates total height, (3) estimates mortality, (4) determines regeneration, and (5) updates crown class.
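The five functions suggest the overall shape of the simulation loop. The toy sketch below shows one growth cycle in the style of an individual-tree, distance-independent model; all update rules and coefficients are hypothetical, not those of GR02.

```python
import random

class Tree:
    def __init__(self, dbh, height, crown_class):
        self.dbh = dbh                  # diameter at breast height (inches)
        self.height = height            # total height (feet)
        self.crown_class = crown_class  # e.g., "dominant" or "intermediate"

def grow_one_period(stand):
    """One cycle of a toy individual-tree, distance-independent model.
    All rules and numbers are invented for illustration."""
    survivors = []
    for t in stand:
        t.dbh += 0.15 if t.crown_class == "dominant" else 0.07   # (1) update dbh
        t.height += 1.2                                          # (2) update height
        if random.random() > 0.02:                               # (3) estimate mortality
            survivors.append(t)
    if random.random() < 0.3:                                    # (4) determine regeneration
        survivors.append(Tree(0.5, 4.0, "intermediate"))
    for t in survivors:                                          # (5) update crown class
        t.crown_class = "dominant" if t.height > 40 else "intermediate"
    return survivors

stand = [Tree(6.0, 45.0, "dominant"), Tree(3.0, 25.0, "intermediate")]
for _ in range(10):                      # ten growth periods
    stand = grow_one_period(stand)
print(len(stand), "trees;", round(sum(t.dbh for t in stand) / len(stand), 2), "in mean dbh")
```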
Benefits of computer screen-based simulation in learning cardiac arrest procedures.
Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc
2010-07-01
What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.
Simulation studies of wide and medium field of view earth radiation data analysis
NASA Technical Reports Server (NTRS)
Green, R. N.
1978-01-01
A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.
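Recovering flux at resolution elements smaller than the sensor footprint is, at its core, a linear inverse problem: each wide field of view measurement is a weighted average of the zonal fluxes. The sketch below illustrates that generic least-squares idea; the weighting matrix, flux profile, and noise level are invented, and this is not the report's actual estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n_zones, n_meas = 9, 40

# Hypothetical sensitivity matrix: each WFOV measurement is a smooth
# weighted average over several adjacent zones (rows normalized to sum to 1).
A = np.zeros((n_meas, n_zones))
for i in range(n_meas):
    c = rng.uniform(0, n_zones - 1)                      # footprint center
    w = np.exp(-0.5 * ((np.arange(n_zones) - c) / 1.5) ** 2)
    A[i] = w / w.sum()

f_true = 240 + 60 * np.cos(np.linspace(0, np.pi, n_zones))  # zonal fluxes (W/m^2)
m = A @ f_true + rng.normal(0, 1.0, n_meas)                 # noisy measurements

f_hat, *_ = np.linalg.lstsq(A, m, rcond=None)               # zonal flux estimates
print("residual errors (W/m^2):", np.round(f_hat - f_true, 2))
```

Summing the estimates area-weighted over zones gives the global integral, and differencing low- and high-latitude zones gives an equator-to-pole gradient, the two derived quantities the abstract mentions.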
Architectural Improvements and New Processing Tools for the Open XAL Online Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M
The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.
Analytical investigation of the dynamics of tethered constellations in Earth orbit, phase 2
NASA Technical Reports Server (NTRS)
Lorenzini, E.
1985-01-01
This Quarterly Report deals with the deployment maneuver of a single-axis, vertical constellation with three masses. A new, easy-to-handle computer code that simulates the two-dimensional dynamics of the constellation has been implemented. This computer code is used for designing control laws for the deployment maneuver that minimize the acceleration level of the low-g platform during the maneuver.
Parallel Simulation of Three-Dimensional Free Surface Fluid Flow Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
BAER,THOMAS A.; SACKINGER,PHILIP A.; SUBIA,SAMUEL R.
1999-10-14
Simulation of viscous three-dimensional fluid flow typically involves a large number of unknowns. When free surfaces are included, the number of unknowns increases dramatically. Consequently, this class of problem is an obvious application of parallel high performance computing. We describe parallel computation of viscous, incompressible, free surface, Newtonian fluid flow problems that include dynamic contact lines. The Galerkin finite element method was used to discretize the fully-coupled governing conservation equations, and a "pseudo-solid" mesh mapping approach was used to determine the shape of the free surface. In this approach, the finite element mesh is allowed to deform to satisfy quasi-static solid mechanics equations subject to geometric or kinematic constraints on the boundaries. As a result, nodal displacements must be included in the set of unknowns. Other issues discussed are the proper constraints appearing along the dynamic contact line in three dimensions. Issues affecting efficient parallel simulations include problem decomposition to equally distribute computational work across an SPMD computer and determination of robust, scalable preconditioners for the distributed matrix systems that must be solved. Solution continuation strategies important for serial simulations have an enhanced relevance in a parallel computing environment due to the difficulty of solving large-scale systems. Parallel computations will be demonstrated on an example taken from the coating flow industry: flow in the vicinity of a slot coater edge. This is a three-dimensional free surface problem possessing a contact line that advances at the web speed in one region but transitions to static behavior in another region. As such, a significant fraction of the computational time is devoted to processing boundary data. Discussion focuses on parallel speedups for fixed problem size, a class of problems of immediate practical importance.
Modeling of convection phenomena in Bridgman-Stockbarger crystal growth
NASA Technical Reports Server (NTRS)
Carlson, F. M.; Eraslan, A. H.; Sheu, J. Z.
1985-01-01
Thermal convection phenomena in a vertically oriented Bridgman-Stockbarger apparatus were modeled by computer simulations for different gravity conditions, ranging from earth conditions to extremely low gravity, approximating space conditions. The modeling results were obtained by the application of a state-of-the-art, transient, multi-dimensional, completely densimetrically coupled, discrete-element computational model which was specifically developed for the simulation of flow, temperature, and species concentration conditions in two-phase (solid-liquid) systems. The computational model was applied to the simulation of the flow and the thermal conditions associated with the convection phenomena in a modified Germanium-Silicon charge enclosed in a stationary fused-silica ampoule. The results clearly indicated that the gravitational field strength influences the characteristics of the coherent vortical flow patterns, interface shape and position, maximum melt velocity, and interfacial normal temperature gradient.
Computer simulations of electromagnetic cool ion beam instabilities. [in near earth space
NASA Technical Reports Server (NTRS)
Gary, S. P.; Madland, C. D.; Schriver, D.; Winske, D.
1986-01-01
Electromagnetic ion beam instabilities driven by cool ion beams propagating parallel or antiparallel to a uniform magnetic field are studied using computer simulations. The elements of linear theory applicable to electromagnetic ion beam instabilities are described, along with the simulations, which are derived from a one-dimensional hybrid computer code. The quasi-linear regime of the right-hand resonant ion beam instability and the gyrophase bunching of the nonlinear regime of the right-hand resonant and nonresonant instabilities are examined. It is found that in the quasi-linear regime the instability saturation is due to a reduction in the beam-core relative drift speed and an increase in the perpendicular-to-parallel beam temperature ratio; in the nonlinear regime the instabilities saturate when half the initial beam drift kinetic energy density has been converted to fluctuating magnetic field energy density.
NASA Astrophysics Data System (ADS)
Limkumnerd, Surachate
2014-03-01
Interest in thin-film fabrication for industrial applications has driven both theoretical and computational aspects of modeling its growth. One of the earliest attempts toward understanding the morphological structure of a film's surface is through a class of solid-on-solid limited-mobility growth models such as the Family, Wolf-Villain, or Das Sarma-Tamborenea models, which have produced fascinating surface roughening behaviors. These models, however, restrict the motion of an incident atom to be within the neighborhood of its landing site, which renders them ill-suited for simulating long-distance surface diffusion such as that observed in thin-film growth using a molecular-beam epitaxy technique. Naive extension of these models by repeatedly applying the local diffusion rules for each hop to simulate a large diffusion length can be computationally very costly when certain statistical aspects are demanded. We present a graph-theoretic approach to simulating a long-range diffusion-attachment growth model. Using the Markovian assumption and given a local diffusion bias, we derive the transition probabilities for a random walker to traverse from one lattice site to the others after a large, possibly infinite, number of steps. Only computation with linear-time complexity is required for the surface morphology calculation without other probabilistic measures. The formalism is applied, as illustrations, to simulate surface growth on a two-dimensional flat substrate and around a screw dislocation under the modified Wolf-Villain diffusion rule. A rectangular spiral ridge is observed in the latter case with a smooth front feature similar to that obtained from simulations using the well-known multiple registration technique. An algorithm for computing the inverse of a class of substochastic matrices is derived as a corollary.
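The key step described above, transition probabilities after a large, possibly infinite, number of hops, is the standard absorbing-Markov-chain computation: with Q the transient-to-transient block of the transition matrix (a substochastic matrix) and R the transient-to-absorbing block, the fundamental matrix N = (I - Q)^(-1) gives expected visit counts, and B = NR gives the attachment probabilities. A minimal sketch on a biased 1-D lattice follows; the geometry and bias value are illustrative only, not the paper's growth model.

```python
import numpy as np

# 1-D lattice: 5 interior (transient) sites; the walker hops right with
# probability p_right and is absorbed (attaches) at either end.
n, p_right = 5, 0.6
Q = np.zeros((n, n))                      # transient -> transient transitions
R = np.zeros((n, 2))                      # transient -> absorbing transitions
for i in range(n):
    if i > 0: Q[i, i - 1] = 1 - p_right
    else:     R[i, 0] = 1 - p_right       # absorbed at the left end
    if i < n - 1: Q[i, i + 1] = p_right
    else:         R[i, 1] = p_right       # absorbed at the right end

N = np.linalg.inv(np.eye(n) - Q)          # fundamental matrix: expected visit counts
B = N @ R                                 # absorption probabilities after any number of steps
print(np.round(B, 3))                     # row i: P(attach left / right | start at site i)
```

This is the sense in which the cost is independent of the number of hops: one linear solve replaces an arbitrarily long simulated walk.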
The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment
NASA Astrophysics Data System (ADS)
Xiuyan, Guo
Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in nursing education addresses problems of the traditional lecture class. This article reports teaching experience with web-based computer-assisted instruction, based upon a two-year study of computer-assisted instruction courseware use within the course Health Assessment. The courseware can develop the teaching structure, simulate clinical situations, create teaching situations, and facilitate student study.
Long period nodal motion of sun synchronous orbits
NASA Technical Reports Server (NTRS)
Duck, K. I.
1975-01-01
An approximate model is formulated for assessing the perturbations that significantly affect the long-term nodal motion of sun-synchronous orbits. Computer simulations with several independent computer programs consider zonal and tesseral gravitational harmonics, third-body gravitational disturbances induced by the Sun and the Moon, and atmospheric drag. A pendulum model consisting of even zonal harmonics through order 4 and solar gravity dominated the nodal motion approximation. This pendulum motion results from solar gravity inducing an inclination oscillation which couples into the nodal precession induced by the Earth's oblateness. The pendulum model correlated well with simulations and observed flight data.
NASA Astrophysics Data System (ADS)
Schneider, Tapio; Lan, Shiwei; Stuart, Andrew; Teixeira, João.
2017-12-01
Climate projections continue to be marred by large uncertainties, which originate in processes that need to be parameterized, such as clouds, convection, and ecosystems. But rapid progress is now within reach. New computational tools and methods from data assimilation and machine learning make it possible to integrate global observations and local high-resolution simulations in an Earth system model (ESM) that systematically learns from both and quantifies uncertainties. Here we propose a blueprint for such an ESM. We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection, through matching low-order statistics between ESMs, observations, and high-resolution simulations. We illustrate learning algorithms for ESMs with a simple dynamical system that shares characteristics of the climate system; and we discuss the opportunities the proposed framework presents and the challenges that remain to realize it.
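The idea of learning parameters by matching low-order statistics can be illustrated on a system far simpler than an ESM. In the toy sketch below, one uncertain damping parameter of an Ornstein-Uhlenbeck process stands in for a parameterization constant, and it is calibrated so that the simulated variance matches an "observed" variance; all numbers are invented and this is not the authors' framework.

```python
import numpy as np

def simulate(theta, n=50_000, dt=0.01, sigma=0.5, seed=0):
    """Ornstein-Uhlenbeck process dx = -theta*x dt + sigma dW: a stand-in
    for a model with one uncertain parameter. Stationary variance is
    sigma^2 / (2*theta), so the statistic identifies theta."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i-1] - theta * x[i-1] * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

obs_var = 0.0625                          # "observed" low-order statistic
thetas = np.linspace(0.5, 4.0, 36)        # candidate parameter values
mismatch = [(np.var(simulate(th)) - obs_var) ** 2 for th in thetas]
theta_hat = thetas[int(np.argmin(mismatch))]
print(f"calibrated theta = {theta_hat:.2f} "
      f"(value implied analytically: {0.25 / (2 * obs_var):.2f})")
```

The proposed framework replaces the brute-force grid search with data-assimilation and machine-learning machinery and matches statistics against both observations and targeted high-resolution simulations, but the objective, minimizing a mismatch in low-order statistics, is the same in spirit.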
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
Parabolic flights as Earth analogue for surface processes on Mars
NASA Astrophysics Data System (ADS)
Kuhn, Nikolaus J.
2017-04-01
The interpretation of landforms and environmental archives on Mars with regards to habitability and preservation of traces of life requires a quantitative understanding of the processes that shaped them. Commonly, qualitative similarities in sedimentary rocks between Earth and Mars are used as an analogue to reconstruct the environments in which they formed on Mars. However, flow hydraulics and sedimentation differ between Earth and Mars, requiring a recalibration of models describing runoff, erosion, transport and deposition. Simulation of these processes on Earth is limited because gravity cannot be changed and the trade-off between adjusting e.g. fluid or particle density generates other mismatches, such as fluid viscosity. Computational Fluid Dynamics offer an alternative, but would also require a certain degree of calibration or testing. Parabolic flights offer a possibility to amend the shortcomings of these approaches. Parabolas with reduced gravity last up to 30 seconds, which allows the simulation of sedimentation processes and the measurement of flow hydraulics. This study summarizes the experience gathered during four campaigns of parabolic flights, aimed at identifying potential and limitations of their use as an Earth analogue for surface processes on Mars.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that 1) explicitly model the three-dimensional geometry of pore spaces and 2) conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), lattice Boltzmann method (LBM), and smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect on solute transport of conceptualizing complex three-dimensional pore geometries in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).
NASA Astrophysics Data System (ADS)
Bhatia, Pramod; Singh, Ravinder
2017-06-01
Diffusion flames are the most common type of flame encountered in daily life, as in candle and match-stick flames, and they are also the most used flames in practical combustion systems such as industrial burners (coal-, gas-, or oil-fired), diesel engines, gas turbines, and solid-fuel rockets. In the present study, steady-state global-chemistry calculations for 24 different flames were performed using an axisymmetric computational fluid dynamics code (UNICORN). The computations involved simulations of inverse and normal diffusion flames of propane under earth-gravity and microgravity conditions with varying oxidizer compositions (21, 30, 50, and 100% O2, by mole, in N2). Two cases were compared with experimental results to validate the computational model. These flames were stabilized on a 5.5 mm diameter burner with a 10 mm burner length. The effects of oxygen enrichment and variation in gravity (earth gravity and microgravity) on the shape and size of the diffusion flames, the flame temperature, and the flame velocity were studied from the computational results. Oxygen enrichment resulted in a significant increase in flame temperature for both types of diffusion flames. Oxygen enrichment and gravity variation also have a significant effect on the configuration of normal diffusion flames in comparison with inverse diffusion flames. Microgravity normal diffusion flames are spherical in shape and much wider than earth-gravity normal diffusion flames. In inverse diffusion flames, microgravity flames were wider than earth-gravity flames but not spherical in shape.
Theoretical Studies of the Kinetics of First-Order Phase Transitions.
NASA Astrophysics Data System (ADS)
Zheng, Qiang
This thesis involves theoretical studies of the kinetics of ordering in three classes of systems. The first class involves problems of phase separation in which the order parameter is conserved, as occurs in the binary alloy Al-Zn. A theory is developed for the late stages of phase separation in the droplet regime for two-dimensional systems, namely, Ostwald ripening in two dimensions. The theory considers droplet correlations, which were neglected before, by a proper treatment of the screening effect of the correlations. This correlation effect is found not to alter the scaling features of phase separation, but it significantly changes the shape of the droplet-size distribution function. Further experiments and computer simulations are needed before this long-standing subject may be closed. A second class of problem involves a study of the finite-size effects on domain growth described by the Allen-Cahn dynamics. Based on a theoretical approach of Ohta, Jasnow, and Kawasaki, the explicit scaling functions for the scattering intensity for hypercubes and films are obtained. These results are for cases in which the order parameter is not conserved, such as an order-disorder transition in alloys. These studies will be relevant to the experimental and computer simulation research projects currently being carried out in the United States and Europe. The last class of problems involves ordering in strongly correlated systems, namely, the growth of Breath Figures. A special feature of this class of problems is the coalescence effect. A theoretical model is proposed which handles the two growth mechanisms, individual droplet growth and coalescence, simultaneously. Under certain approximations, the droplet-size distribution function is obtained analytically and is in qualitative agreement with computer simulations. Our model also suggests that there may be an interesting relationship between the growth of Breath Figures and a geometric structure (ultrametricity) of general complex systems.
Robotic space simulation integration of vision algorithms into an orbital operations simulation
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.
1987-01-01
In order to successfully plan and analyze future space activities, computer-based simulations of activities in low earth orbit will be required to model and integrate vision and robotic operations with vehicle dynamics and proximity operations procedures. The orbital operations simulation (OOS) is configured and enhanced as a testbed for robotic space operations. Vision integration algorithms are being developed in three areas: preprocessing, recognition, and attitude/attitude rates. The vision program (Rice University) was modified for use in the OOS. Systems integration testing is now in progress.
Additional Developments in Atmosphere Revitalization Modeling and Simulation
NASA Technical Reports Server (NTRS)
Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.
2013-01-01
NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of Earth's gravity will place unprecedented demands on launch systems, which must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond Earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness, both to minimize replacement parts and to ensure crew safety when a quick return to Earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project.
Coercivity of domain wall motion in thin films of amorphous rare earth-transition metal alloys
NASA Technical Reports Server (NTRS)
Mansuripur, M.; Giles, R. C.; Patterson, G.
1991-01-01
Computer simulations of a two dimensional lattice of magnetic dipoles are performed on the Connection Machine. The lattice is a discrete model for thin films of amorphous rare-earth transition metal alloys, which have application as the storage media in erasable optical data storage systems. In these simulations, the dipoles follow the dynamic Landau-Lifshitz-Gilbert equation under the influence of an effective field arising from local anisotropy, near-neighbor exchange, classical dipole-dipole interactions, and an externally applied field. Various sources of coercivity, such as defects and/or inhomogeneities in the lattice, are introduced and the subsequent motion of domain walls in response to external fields is investigated.
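As a point of reference for the dynamics mentioned above, here is a minimal integration of the Landau-Lifshitz-Gilbert equation for a single normalized moment in an applied field. This is a toy sketch in dimensionless units, not the Connection Machine lattice model with its anisotropy, exchange, and dipole-dipole terms.

```python
import numpy as np

def llg_step(m, h, dt, alpha=0.1, gamma=1.0):
    """One explicit Euler step of the Landau-Lifshitz-Gilbert equation in
    normalized form, followed by renormalization to keep |m| = 1."""
    prec = -gamma * np.cross(m, h)                       # precession about the field
    damp = -gamma * alpha * np.cross(m, np.cross(m, h))  # damping toward the field
    m = m + dt * (prec + damp)
    return m / np.linalg.norm(m)

m = np.array([1.0, 0.0, 0.01]); m /= np.linalg.norm(m)   # moment starts nearly in-plane
h = np.array([0.0, 0.0, 1.0])                            # effective field along +z
for _ in range(5000):
    m = llg_step(m, h, dt=0.01)
print(np.round(m, 4))                                    # spirals in and relaxes toward +z
```

In the lattice simulations, h would be the sum of anisotropy, exchange, demagnetizing, and applied-field contributions evaluated per dipole, and coercivity emerges from how defects pin the resulting wall motion.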
ERIC Educational Resources Information Center
Geigel, Joan; And Others
A self-paced program designed to integrate the use of computers and physics courseware into the regular classroom environment is offered for physics high school teachers in this module on projectile and circular motion. A diversity of instructional strategies including lectures, demonstrations, videotapes, computer simulations, laboratories, and…
Best bang for your buck: GPU nodes for GROMACS biomolecular simulations.
Kutzner, Carsten; Páll, Szilárd; Fechner, Martin; Esztermann, Ansgar; de Groot, Bert L; Grubmüller, Helmut
2015-10-05
The molecular dynamics simulation package GROMACS runs efficiently on a wide variety of hardware from commodity workstations to high performance computing clusters. Hardware features are well-exploited with a combination of single instruction multiple data, multithreading, and message passing interface (MPI)-based single program multiple data/multiple program multiple data parallelism while graphics processing units (GPUs) can be used as accelerators to compute interactions off-loaded from the CPU. Here, we evaluate which hardware produces trajectories with GROMACS 4.6 or 5.0 in the most economical way. We have assembled and benchmarked compute nodes with various CPU/GPU combinations to identify optimal compositions in terms of raw trajectory production rate, performance-to-price ratio, energy efficiency, and several other criteria. Although hardware prices are naturally subject to trends and fluctuations, general tendencies are clearly visible. Adding any type of GPU significantly boosts a node's simulation performance. For inexpensive consumer-class GPUs this improvement equally reflects in the performance-to-price ratio. Although memory issues in consumer-class GPUs could pass unnoticed as these cards do not support error checking and correction memory, unreliable GPUs can be sorted out with memory checking tools. Apart from the obvious determinants for cost-efficiency like hardware expenses and raw performance, the energy consumption of a node is a major cost factor. Over the typical hardware lifetime until replacement of a few years, the costs for electrical power and cooling can become larger than the costs of the hardware itself. Taking that into account, nodes with a well-balanced ratio of CPU and consumer-class GPU resources produce the maximum amount of GROMACS trajectory over their lifetime.
NASA Technical Reports Server (NTRS)
Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.
1992-01-01
Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.
NASA Astrophysics Data System (ADS)
Ben Slimen, F.; Haouari, M.; Ben Ouada, H.; Guichaoua, D.; Raso, P.; Bidault, X.; Turlier, J.; Gaumer, N.; Chaussedent, S.
2017-02-01
Silicophosphate glasses (SiO2-P2O5) doped with Eu3+ ions were synthesized by the sol-gel process. Optical properties of these glasses were investigated by means of emission spectra and lifetime measurements. The Fluorescence Line Narrowing (FLN) technique was also used to explore the local structure around the Eu3+ ions in this host and to understand the role of phosphate as a codopant. As is the case for aluminum, the ability of phosphate to prevent rare-earth clustering was investigated, and the role of this codopant in modifying the local order around the rare-earth ion was evidenced. The analysis of the FLN spectra and lifetime measurements is consistent with this interpretation. Molecular Dynamics simulations were performed to evaluate and confirm these structural features. Two classes of europium sites were distinguished, in agreement with the experimental characterization.
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.
2013-11-01
Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate changes. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) computational cost is prohibitively expensive to simulate biogeochemical processes in land models due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module, Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
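The generic-algorithm-plus-reaction-database idea can be sketched compactly: if each process is stored as data (a stoichiometry vector plus a rate law), the ODE right-hand side can be assembled automatically, and adding a new process means adding a database entry rather than rewriting equations. The following is a minimal illustration with invented carbon pools and rate constants, not the actual NGBGC reaction database.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Pools and a tiny, hypothetical reaction "database": each entry carries a
# stoichiometry map and a rate law. The right-hand side is assembled
# generically from the database.
pools = ["litter_C", "soil_C", "CO2"]
reactions = [
    {"stoich": {"litter_C": -1, "soil_C": +0.6, "CO2": +0.4},   # decomposition
     "rate": lambda y: 0.02 * y[0]},
    {"stoich": {"soil_C": -1, "CO2": +1},                       # soil respiration
     "rate": lambda y: 0.005 * y[1]},
]
idx = {p: i for i, p in enumerate(pools)}

def rhs(t, y):
    """Generic assembly: sum stoichiometry-weighted rates over all reactions."""
    dy = np.zeros(len(pools))
    for rxn in reactions:
        r = rxn["rate"](y)
        for pool, coeff in rxn["stoich"].items():
            dy[idx[pool]] += coeff * r
    return dy

sol = solve_ivp(rhs, (0, 365), y0=[100.0, 500.0, 0.0], rtol=1e-8)
print(np.round(sol.y[:, -1], 2))   # pool sizes after one simulated year
```

Adding a phosphorus cycle in this style means appending phosphorus pools and reaction rows to the database, which is the extensibility argument the paper makes.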
Challenges in the development of very high resolution Earth System Models for climate science
NASA Astrophysics Data System (ADS)
Rasch, Philip J.; Xie, Shaocheng; Ma, Po-Lun; Lin, Wuyin; Wan, Hui; Qian, Yun
2017-04-01
The authors represent the 20+ members of the ACME atmosphere development team. The US Department of Energy (DOE) has, like many other organizations around the world, identified the need for an Earth System Model capable of rapid completion of decade- to century-length simulations at very high (vertical and horizontal) resolution with good climate fidelity. Two years ago DOE initiated a multi-institution effort called ACME (Accelerated Climate Modeling for Energy) to meet this extraordinary challenge, targeting a model eventually capable of running at 10-25 km horizontal and 20-400 m vertical resolution through the troposphere on exascale computational platforms at speeds sufficient to complete 5+ simulated years per day. I will outline the challenges our team has encountered in the development of the atmosphere component of this model, and the strategies we have been using for tuning and debugging a model that we can barely afford to run on today's computational platforms. These strategies include: 1) evaluation at lower resolutions; 2) ensembles of short simulations to explore parameter space and perform rough tuning and evaluation; 3) use of regionally refined versions of the model for probing high-resolution model behavior at less expense; 4) use of "auto-tuning" methodologies for model tuning; and 5) brute-force long climate simulations.
Research in Computational Astrobiology
NASA Technical Reports Server (NTRS)
Chaban, Galina; Jaffe, Richard; Liang, Shoudan; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.
2002-01-01
We present results from several projects in the new field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. We have developed a procedure for calculating long-range effects in molecular dynamics using a plane wave expansion of the electrostatic potential. This method is expected to be highly efficient for simulating biological systems on massively parallel supercomputers. We have performed a genomics analysis of a family of actin-binding proteins. We have also performed quantum mechanical calculations on carbon nanotubes and nucleic acids; these simulations will allow us to investigate possible sources of organic material on the early Earth. Finally, we have developed a model of protobiological chemistry using neural networks.
Design for progressive fracture in composite shell structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Murthy, Pappu L. N.
1992-01-01
The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.
NASA Technical Reports Server (NTRS)
Nosenchuck, D. M.; Littman, M. G.
1986-01-01
The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.
Computing the total atmospheric refraction for real-time optical imaging sensor simulation
NASA Astrophysics Data System (ADS)
Olson, Richard F.
2015-05-01
Fast and accurate computation of light path deviation due to atmospheric refraction is an important requirement for real-time simulation of optical imaging sensor systems. A large body of existing literature covers various methods for application of Snell's Law to the light path ray tracing problem. This paper provides a discussion of the adaptation to real-time simulation of atmospheric refraction ray tracing techniques used in mid-1980s LOWTRAN releases. The refraction ray trace algorithm published in a LOWTRAN-6 technical report by Kneizys et al. has been coded in MATLAB for development, and in C for simulation use. To this published algorithm we have added tuning parameters for variable path segment lengths, and extensions for Earth-grazing and exoatmospheric "near Earth" ray paths. Model atmosphere properties used to exercise the refraction algorithm were obtained from tables published in another LOWTRAN-6 related report. The LOWTRAN-6 based refraction model is applicable to atmospheric propagation at wavelengths in the IR and visible bands of the electromagnetic spectrum. It has been used during the past two years by engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) in support of several advanced imaging sensor simulations. Recently, a faster (but sufficiently accurate) method using Gauss-Chebyshev quadrature integration for evaluating the refraction integral was adopted.
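For readers unfamiliar with the quadrature rule named in the closing sentence: Gauss-Chebyshev quadrature evaluates integrals with weight 1/sqrt(1 - x^2) using cosine-spaced nodes and the single weight pi/n, which makes it cheap to apply at run time. A minimal sketch of the generic rule follows (a toy integrand, not the AMRDEC refraction integral).

```python
import numpy as np

def gauss_chebyshev(f, n):
    """n-point Gauss-Chebyshev rule for integrals of the form
    int_{-1}^{1} f(x) / sqrt(1 - x^2) dx  ~  (pi/n) * sum_k f(x_k),
    with nodes x_k = cos((2k - 1) * pi / (2n))."""
    k = np.arange(1, n + 1)
    x = np.cos((2 * k - 1) * np.pi / (2 * n))
    return np.pi / n * np.sum(f(x))

# Check: int_{-1}^{1} x^2 / sqrt(1 - x^2) dx = pi/2; the rule is exact for
# polynomial f up to degree 2n - 1, so n = 8 reproduces it to machine precision.
print(gauss_chebyshev(lambda x: x ** 2, 8), np.pi / 2)
```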
Analysis of a Precambrian resonance-stabilized day length
NASA Astrophysics Data System (ADS)
Bartlett, Benjamin C.; Stevenson, David J.
2016-06-01
During the Precambrian era, Earth's decelerating rotation would have passed through a 21 h rotation period that would have been resonant with the semidiurnal atmospheric thermal tide. Near this point, the atmospheric torque would have been maximized, being comparable in magnitude but opposite in direction to the lunar torque, halting Earth's rotational deceleration and maintaining a constant day length, as detailed by Zahnle and Walker (1987). We develop a computational model to determine the necessary conditions for formation and breakage of this resonant effect. Our simulations show the resonance to be resilient to atmospheric thermal noise but suggest that a sudden atmospheric temperature increase, like the deglaciation period following a possible "snowball Earth" near the end of the Precambrian, would break this resonance; the Marinoan and Sturtian glaciations seem the most likely candidates for this event. Our model provides a simulated day length over time that resembles existing paleorotational data, though further data are needed to verify this hypothesis.
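The torque balance at the heart of the hypothesis can be caricatured in a few lines: a constant lunar term lengthens the day while an atmospheric thermal-tide term, peaked near the 21 h resonant period, opposes it, so the day length stalls just short of resonance. All values below are arbitrary and this is only a cartoon, not the Zahnle-Walker or Bartlett-Stevenson model.

```python
def dP_dt(P, t_res=21.0, lunar=1.0, amp=1.4, width=0.3):
    """Toy rate of change of day length P (hours): a constant lunar term
    lengthening the day, opposed by an atmospheric thermal-tide torque with
    a Lorentzian peak at the resonant period t_res. Units are arbitrary."""
    tide = amp / (1.0 + ((P - t_res) / width) ** 2)
    return lunar - tide

P, dt = 18.0, 0.01
for _ in range(8000):          # forward-Euler integration of the toy balance
    P += dP_dt(P) * dt

# Day length grows, then stalls just below 21 h where the two torques cancel.
print(f"final day length: {P:.2f} h")
```

Breaking the resonance in this picture corresponds to shrinking or shifting the tidal peak (e.g., by a sudden temperature change) until the lunar term again dominates everywhere.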
A parallel computational model for GATE simulations.
Rannou, F R; Vega-Acevedo, N; El Bitar, Z
2013-12-01
GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires centralized coincidence processing and incurs large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing while maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced.
NASA Astrophysics Data System (ADS)
Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri
2007-10-01
Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows the development of a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed around multiple independent computer clusters in six countries, an approach that has potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence is intensified monotonically through increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high intensity fires with a somewhat limited extent, it is possible for the wet season to be significantly enhanced.
NASA Astrophysics Data System (ADS)
Jylhä, Juha; Marjanen, Kalle; Rantala, Mikko; Metsäpuro, Petri; Visa, Ari
2006-09-01
Surveillance camera automation and camera network development are growing areas of interest. This paper proposes a competent approach to enhancing camera surveillance with Geographic Information Systems (GIS) when the camera is located at heights of 10-1000 m. A digital elevation model (DEM), a terrain class model, and a flight obstacle register comprise the exploited auxiliary information. The approach takes into account the spherical shape of the Earth and realistic terrain slopes. Accordingly, also considering forests, it determines visible and shadow regions. The efficiency arises from reduced dimensionality in the visibility computation. Image processing is aided by predicting certain features of the visible terrain in advance. The features include distance from the camera and the terrain or object class, such as coniferous forest, field, urban site, lake, or mast. The performance of the approach is studied by comparing a photograph of Finnish forested landscape with the prediction. The predicted background fits well, and the potential of knowledge-aiding for various purposes becomes apparent.
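The core geometric test in such a system is line-of-sight along a DEM profile with the Earth-curvature drop d^2/(2R) subtracted from target elevations. A simplified sketch follows (straight rays, no atmospheric refraction or forest canopy; the profile values are invented).

```python
import numpy as np

R_EARTH = 6_371_000.0  # mean Earth radius (m)

def visible(profile, spacing, cam_height):
    """Mark which samples of a terrain elevation profile (m) are visible from
    a camera above the first sample, accounting for the Earth-curvature drop
    d^2 / (2R). Simplified: straight rays, no refraction, no canopy."""
    d = np.arange(len(profile)) * spacing
    h = profile - d ** 2 / (2 * R_EARTH)   # effective heights after curvature drop
    h0 = profile[0] + cam_height           # camera elevation
    vis = np.zeros(len(profile), dtype=bool)
    max_slope = -np.inf
    for i in range(1, len(profile)):
        slope = (h[i] - h0) / d[i]
        if slope >= max_slope:             # not hidden behind any closer terrain
            vis[i] = True
            max_slope = slope
    return vis

profile = np.array([120.0, 118, 125, 140, 130, 128, 150, 135, 133, 160])
print(visible(profile, spacing=2_000.0, cam_height=100.0))
```

Sweeping such profiles radially from the camera yields the visible/shadow map; the dimensionality reduction the paper mentions comes from working on 1-D profiles rather than testing every DEM cell pair.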
The LPO Iron Pattern beneath the Earth's Inner Core Boundary
NASA Astrophysics Data System (ADS)
Mattesini, Maurizio; Belonoshko, Anatoly; Tkalčić, Hrvoje
2017-04-01
An Earth's inner core surface pattern for the iron Lattice Preferred Orientation (LPO) has been addressed for various iron crystal polymorphs. The geographical distribution of the amount of crystal alienation was achieved by bridging high-quality inner core probing seismic data [PKP(bc-df)] together with ab initio computed elastic constants. We show that the proposed topographic crystal alignment may be used as a boundary condition for dynamo simulations, providing an additional way to discriminate in between different and, often controversial, geodynamical scenarios.
Analysis of the grounding system for a mobile communication site placed on HV power line mast
NASA Astrophysics Data System (ADS)
Bîrsan, I.; Munteanu, C.; Horgoș, M.; Ilut, T.
2016-08-01
This paper analyzes the potential distribution on the soil surface and the potential variation along the main directions within a mobile communication site. We study the earthing (grounding) system of a mobile communications site whose operator antennas are placed on a High Voltage Power Line Mast (LEA 110 kV). Direct measurements were made, and 3D software was used to analyze the results and to simulate possible solutions.
2017-12-08
Two rows of the “Discover” supercomputer at the NASA Center for Climate Simulation (NCCS) contain more than 4,000 computer processors. Discover has a total of nearly 15,000 processors. Credit: NASA/Pat Izzo. To learn more about NCCS, go to: www.nasa.gov/topics/earth/features/climate-sim-center.html. NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
2017-12-08
This close-up view highlights one row—approximately 2,000 computer processors—of the “Discover” supercomputer at the NASA Center for Climate Simulation (NCCS). Discover has a total of nearly 15,000 processors. Credit: NASA/Pat Izzo. To learn more about NCCS, go to: www.nasa.gov/topics/earth/features/climate-sim-center.html. NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
NASA Astrophysics Data System (ADS)
Fisher, J. A.; Brewer, C.; O'Brien, G.
2017-12-01
Computing and programming are rapidly becoming necessary skills for earth and environmental scientists. Scientists in both academia and industry must be able to manipulate increasingly large datasets, create plots and 3-D visualisations of observations, and interpret outputs from complex numerical models, among other tasks. However, these skills are rarely taught as a compulsory part of undergraduate earth science curricula. In 2016, the School of Earth & Environmental Sciences at the University of Wollongong began a pilot program to integrate introductory programming and modelling skills into the required first-year core curriculum for all undergraduates majoring in earth and environmental science fields. Using Python, a popular teaching language also widely used by professionals, a set of guided exercises was developed. These exercises use interactive Jupyter Notebooks to introduce students to programming fundamentals and simple modelling problems relevant to the earth system, such as carbon cycling and population growth. The exercises are paired with peer-review activities to expose students to the multitude of "correct" ways to solve computing problems. In the last weeks of the semester, students work in groups to creatively adapt their new-found skills to selected problems in earth system science. In this presentation, I will report on outcomes from delivering the new curriculum to the first two cohorts of 120-150 students, including details of the implementation and the impacts on both student aptitude and attitudes towards computing. While the first cohort clearly developed competency, survey results suggested a drop in student confidence over the course of the semester. To address this confidence gap for the second cohort, the in-class activities are now being supplemented with low-stakes open-book review quizzes that provide further practice with no time pressure. Research into the effectiveness of these review quizzes is ongoing, and preliminary findings will be discussed, along with lessons learned in the process and plans for the future.
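An example of the kind of guided exercise described, illustrative only and not the actual course material, is an Euler-method integration of logistic population growth in a notebook cell:

```python
# Notebook-style exercise: Euler integration of logistic population growth,
# dN/dt = r * N * (1 - N/K). All parameter values are illustrative.
import numpy as np
import matplotlib.pyplot as plt

r, K = 0.5, 1000.0          # growth rate (1/yr) and carrying capacity
dt, years = 0.1, 30
t = np.arange(0, years, dt)
N = np.empty_like(t)
N[0] = 10.0                 # initial population

for i in range(1, len(t)):
    N[i] = N[i-1] + r * N[i-1] * (1 - N[i-1] / K) * dt

plt.plot(t, N)
plt.xlabel("time (years)"); plt.ylabel("population")
plt.title("Logistic growth (Euler method)")
plt.show()
```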
Some theoretical issues on computer simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, C.L.; Reidys, C.M.
1998-02-01
The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained under all possible update schedules.
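A toy illustration of the paradigm, assuming nothing about the authors' formalism beyond what the abstract states: a sequentially updated cellular automaton over a graph, in which the final state depends on the order in which the local mappings are applied (networkx and the NOR rule are our choices for brevity):

```python
import networkx as nx  # assumption: any graph representation would do

def sca_step(graph, state, order):
    """Apply a local rule sequentially in the given vertex order.
    The local rule here is NOR of a vertex and its neighbours."""
    s = dict(state)
    for v in order:
        s[v] = int(not (s[v] or any(s[u] for u in graph[v])))
    return s

g = nx.path_graph(4)                   # vertices 0-1-2-3
s0 = {v: 0 for v in g}
print(sca_step(g, s0, [0, 1, 2, 3]))   # one update schedule
print(sca_step(g, s0, [3, 2, 1, 0]))   # another; the result differs
```

Running both schedules from the all-zero state yields different configurations, which is exactly the update-order dependence that motivates studying equivalence classes of schedules.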
Simulating Sand Behavior through Terrain Subdivision and Particle Refinement
NASA Astrophysics Data System (ADS)
Clothier, M.
2013-12-01
Advances in computer graphics, GPUs, and parallel processing hardware have provided researchers with new methods to visualize scientific data. In fact, these advances have spurred new research opportunities between computer graphics and other disciplines, such as Earth sciences. Through collaboration, Earth and planetary scientists have benefited by using these advances in hardware technology to process large amounts of data for visualization and analysis. At Oregon State University, we are collaborating with the Oregon Space Grant and IGERT Ecosystem Informatics programs to investigate techniques for simulating the behavior of sand. In addition, we have been collaborating with the Jet Propulsion Laboratory's DARTS Lab to exchange ideas on our research. The DARTS Lab specializes in the simulation of planetary vehicles, such as the Mars rovers. One aspect of their work is testing these vehicles in a virtual "sand box" to assess their performance in different environments. Our research builds upon this idea to create a sand simulation framework that allows for more complex and diverse environments. As a basis for our framework, we have focused on planetary environments, such as the harsh, sandy regions on Mars. To evaluate our framework, we have used simulated planetary vehicles, such as a rover, to gain insight into the performance and interaction between the surface sand and the vehicle. Unfortunately, simulating the vast number of individual sand particles and their interactions with each other has historically been a computationally complex problem. Through the use of high-performance computing, however, we have developed a technique to subdivide physically active terrain regions across a large landscape. To achieve this, we only subdivide terrain regions where sand particles are actively interacting with another object or force, such as a rover wheel. This is similar to a Level of Detail (LOD) technique, except that the density of subdivision is determined by proximity to the object or force interacting with the sand. For example, as a rover wheel moves forward and approaches a particular sand region, that region continues to subdivide until individual sand particles are represented. Conversely, if the rover wheel moves away, previously subdivided sand regions recombine. Thus, individual sand particles are available when an interacting force is present but stored away when one is not, allowing many particles to be represented without the full computational cost of tracking every particle at all times. We have also generalized these subdivision regions in our sand framework into any volumetric area suitable for use in the simulation. This allows for more compact subdivision regions and has fine-tuned our framework so that more emphasis can be placed on regions of actively participating sand. We feel that this increases the framework's usefulness across scientific applications and can provide for other research opportunities within the earth and planetary sciences. Through continued collaboration with our academic partners, we continue to build upon our sand simulation framework and look for other opportunities to utilize this research.
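A minimal sketch of the proximity-driven subdivision idea described above (not the authors' code; the recursion threshold and geometry are illustrative):

```python
# A square terrain region splits into quarters while an interacting object
# is close relative to the region's size, so individual "particles" only
# exist near the contact; distant regions stay coarse (and would recombine
# by discarding their children when the object moves away).
def subdivide(cx, cy, size, obj_x, obj_y, min_size):
    """Return the leaf regions produced around an interacting object."""
    dist = ((cx - obj_x)**2 + (cy - obj_y)**2) ** 0.5
    if size <= min_size or dist > 2.0 * size:    # far away: keep coarse
        return [(cx, cy, size)]
    half = size / 2.0
    leaves = []
    for dx in (-half / 2, half / 2):
        for dy in (-half / 2, half / 2):
            leaves += subdivide(cx + dx, cy + dy, half, obj_x, obj_y, min_size)
    return leaves

# Regions near the "wheel" at (0, 0) refine to min_size; the rest stay coarse.
print(len(subdivide(0.0, 0.0, 64.0, 0.0, 0.0, 1.0)))
```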
NASA Astrophysics Data System (ADS)
Leng, K.; Nissen-Meyer, T.; van Driel, M.; Al-Attar, D.
2016-12-01
We present a new, computationally efficient numerical method to simulate global seismic wave propagation in realistic 3-D Earth models with laterally heterogeneous media and finite boundary perturbations. Our method is a hybrid of pseudo-spectral and spectral element methods (SEM). We characterize the azimuthal dependence of 3-D wavefields in terms of Fourier series, such that the 3-D equations of motion reduce to an algebraic system of coupled 2-D meridional equations, which can be solved by a 2-D spectral element method (based on www.axisem.info). The computational efficiency of our method stems from the lateral smoothness of global Earth models (with respect to wavelength) as well as the axial singularity of seismic point sources, which jointly confine the Fourier modes of wavefields to a few lower orders. All boundary perturbations that violate geometric spherical symmetry, including Earth's ellipticity, topography and bathymetry, and undulations of internal discontinuities such as the Moho and CMB, are treated uniformly by means of a Particle Relabeling Transformation. The MPI-based high-performance C++ code AxiSEM3D is now available for forward simulations on 3-D Earth models with a fluid outer core, ellipticity, and both mantle and crustal structures. We show novel benchmarks of global wave solutions in 3-D mantle structures between our method and an independent, fully discretized 3-D SEM, with remarkable agreement. Performance comparisons are carried out on three state-of-the-art tomography models, with seismic period going down to 5 s. Our method runs up to two orders of magnitude faster than the 3-D SEM for such settings, and this computational advantage scales favourably with seismic frequency. By examining wavefields passing through hypothetical Gaussian plumes of varying sharpness, we identify in model-wavelength space the limits where our method may lose its advantage.
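Schematically, the reduction at the heart of the method can be written as below, where the meridional coordinates (s, z) and the truncation order M are our notation, not necessarily the paper's:

```latex
% Azimuthal Fourier expansion of the wavefield: each order m satisfies a
% 2-D equation in the meridional (s, z) plane, solved there by the SEM.
\mathbf{u}(s, z, \phi, t) \;=\; \sum_{m=-M}^{M} \mathbf{u}_m(s, z, t)\, e^{\mathrm{i} m \phi}
```

Lateral smoothness of the model and the axial point source keep the truncation order M small, which is where the reported speed-up over fully discretized 3-D SEM originates.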
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bastian, Mark; Trigueros, Jose V.
Phoenix is a Java Virtual Machine (JVM) based library for performing mathematical and astrodynamics calculations. It consists of two primary sub-modules, phoenix-math and phoenix-astrodynamics. The mathematics package has a variety of mathematical classes for performing 3D transformations, geometric reasoning, and numerical analysis. The astrodynamics package has various classes and methods for computing locations, attitudes, accesses, and other values useful for general satellite modeling and simulation. Methods for computing celestial locations, such as the location of the Sun and Moon, are also included. Phoenix is meant to be used as a library within the context of a larger application. For example, it could be used for a web service, a desktop client, or to compute simple values in a scripting environment.
Simulating the Dynamics of Earth's Core: Using NCCS Supercomputers Speeds Calculations
NASA Technical Reports Server (NTRS)
2002-01-01
If one wanted to study Earth's core directly, one would have to drill through about 1,800 miles of solid rock to reach the liquid core, keeping the tunnel from collapsing under pressures of more than 1 million atmospheres, and then sink an instrument package to the bottom that could operate at 8,000 F with 10,000 tons of force crushing every square inch of its surface. Even then, several of these tunnels would probably be needed to obtain enough data. Faced with difficult or impossible tasks such as these, scientists use other available sources of information - such as seismology, mineralogy, geomagnetism, geodesy, and, above all, physical principles - to derive a model of the core and study it by running computer simulations. One NASA researcher is doing just that on NCCS computers. Physicist and applied mathematician Weijia Kuang, of the Space Geodesy Branch, and his collaborators at Goddard have what he calls the "second-ever" working, usable, self-consistent, fully dynamic, three-dimensional geodynamic model (see "The Geodynamic Theory"). Kuang runs his model simulations on the supercomputers at the NCCS. He and Jeremy Bloxham, of Harvard University, developed the original version, written in Fortran 77, in 1996.
NASA Astrophysics Data System (ADS)
Haverd, V.; Smith, B.; Nieradzik, L. P.; Briggs, P. R.
2014-08-01
Poorly constrained rates of biomass turnover are a key limitation of Earth system models (ESMs). In light of this, we recently proposed a new approach, encoded in a model called Populations-Order-Physiology (POP), for the simulation of woody ecosystem stand dynamics, demography and disturbance-mediated heterogeneity. POP is suitable for continental to global applications and designed for coupling to the terrestrial ecosystem component of any ESM. POP bridges the gap between first-generation dynamic vegetation models (DVMs) with simple large-area parameterisations of woody biomass (typically used in current ESMs) and complex second-generation DVMs that explicitly simulate demographic processes and landscape heterogeneity of forests. The key simplification in the POP approach, compared with second-generation DVMs, is to compute physiological processes such as assimilation at grid-scale (with CABLE (Community Atmosphere Biosphere Land Exchange) or a similar land surface model), but to partition the grid-scale biomass increment among age classes defined at sub-grid-scale, each subject to its own dynamics. POP was successfully demonstrated along a savanna transect in northern Australia, replicating the effects of strong rainfall and fire disturbance gradients on observed stand productivity and structure. Here, we extend the application of POP to wide-ranging temperate and boreal forests, employing paired observations of stem biomass and density from forest inventory data to calibrate model parameters governing stand demography and biomass evolution. The calibrated POP model is then coupled to the CABLE land surface model, and the combined model (CABLE-POP) is evaluated against leaf-stem allometry observations from forest stands ranging in age from 3 to 200 years. Results indicate that simulated biomass pools conform well with observed allometry. We conclude that POP represents an ecologically plausible and efficient alternative to the large-area parameterisations of woody biomass turnover typically used in current ESMs.
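A toy sketch of the key simplification, under the assumption (ours, for illustration only) that the grid-scale increment is shared among cohorts in proportion to their biomass; POP's actual allocation and demographic rules are more elaborate:

```python
# Physiology (e.g. CABLE) gives one grid-scale biomass increment, which is
# then shared among sub-grid age classes, each carrying its own dynamics.
def partition_increment(cohorts, grid_increment):
    """cohorts: list of dicts with 'biomass' and 'density' (stems/ha).
    Share the grid-scale increment in proportion to cohort biomass."""
    total = sum(c["biomass"] for c in cohorts)
    for c in cohorts:
        c["biomass"] += grid_increment * c["biomass"] / total

cohorts = [{"age": 5,  "biomass": 1.0, "density": 2000},
           {"age": 50, "biomass": 8.0, "density": 400}]
partition_increment(cohorts, grid_increment=0.9)  # e.g. kgC m-2 yr-1
print(cohorts)
```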
Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy
Ackermann, Marko; van den Bogert, Antonie J.
2012-01-01
The investigation of gait strategies in low gravity environments has gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity, a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized, allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background for the design of experimental investigations of gait under simulated low gravity. PMID:22365845
Real-time global MHD simulation of the solar wind interaction with the earth's magnetosphere
NASA Astrophysics Data System (ADS)
Shimazu, H.; Tanaka, T.; Fujita, S.; Nakamura, M.; Obara, T.
We have developed a real-time global MHD simulation of the solar wind interaction with the earth's magnetosphere. By adopting the real-time solar wind parameters, including the IMF, observed routinely by the ACE spacecraft, responses of the magnetosphere are calculated with the MHD code. We adopted modified spherical coordinates; the mesh point numbers for this simulation are 56, 58, and 40 for the r, theta, and phi directions, respectively. The simulation is carried out routinely on the supercomputer system NEC SX-6 at the National Institute of Information and Communications Technology, Japan. The visualized images of the magnetic field lines around the earth, the pressure distribution on the meridian plane, and the conductivity of the polar ionosphere can be referred to on the Web site (http://www.nict.go.jp/dk/c232/realtime). The results show that various magnetospheric activities are almost reproduced qualitatively. They also give us information on how geomagnetic disturbances develop in the magnetosphere in relation with the ionosphere. From the viewpoint of space weather, the real-time simulation helps us to understand the whole image of the current condition of the magnetosphere. To evaluate the simulation results, we compare the AE index derived from the simulation and observations. In the case of isolated substorms, the indices agreed well in both timing and intensities. In other cases, the simulation can predict general activities, although the exact timing of the onset of substorms and intensities did not always agree. By analyzing…
NASA Astrophysics Data System (ADS)
Featherstone, N. A.; Aurnou, J. M.; Yadav, R. K.; Heimpel, M. H.; Soderlund, K. M.; Matsui, H.; Stanley, S.; Brown, B. P.; Glatzmaier, G.; Olson, P.; Buffett, B. A.; Hwang, L.; Kellogg, L. H.
2017-12-01
In the past three years, CIG's Dynamo Working Group has successfully ported the Rayleigh code to the Argonne Leadership Computing Facility's Mira BG/Q device. In this poster, we present some of our first results, showing simulations of (1) convection in the solar convection zone; (2) dynamo action in Earth's core; and (3) convection in the jovian deep atmosphere. These simulations have made efficient use of 131 thousand, 131 thousand, and 232 thousand cores, respectively, on Mira. In addition to our novel results, the joys and logistical challenges of carrying out such large runs will also be discussed.
Ground Contact Modeling for the Morpheus Test Vehicle Simulation
NASA Technical Reports Server (NTRS)
Cordova, Luis
2014-01-01
The Morpheus vertical test vehicle is an autonomous robotic lander being developed at Johnson Space Center (JSC) to test hazard detection technology. Because the initial ground contact simulation model was not very realistic, it was decided to improve the model without making it too computationally expensive. The first development cycle added capability to define vehicle attachment points (AP) and to keep track of their states in the lander reference frame (LFRAME). These states are used with a spring damper model to compute an AP contact force. The lateral force is then overwritten, if necessary, by the Coulomb static or kinetic friction force. The second development cycle added capability to use the PolySurface class as the contact surface. The class can load CAD data in STL (Stereo Lithography) format, and use the data to compute line of sight (LOS) intercepts. A polygon frame (PFRAME) is computed from the facet intercept normal and used to convert the AP state to PFRAME. Three flat plane tests validate the transitions from kinetic to static, static to kinetic, and vertical impact. The hazardous terrain test will be used to test for visual reasonableness. The improved model is numerically inexpensive, robust, and produces results that are reasonable.
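A minimal sketch of the contact model as described (a penalty spring-damper normal force with a Coulomb static/kinetic lateral override); the function name, velocity threshold, and all constants are illustrative, not taken from the Morpheus code:

```python
import numpy as np

def contact_force(penetration, pen_rate, v_lat, f_lat_raw, k, c, mu_s, mu_k):
    """Spring-damper normal force; lateral force capped by Coulomb friction."""
    f_n = max(0.0, k * penetration + c * pen_rate)    # normal (penalty) force
    speed = np.linalg.norm(v_lat)
    if speed < 1e-6:                                  # static regime
        limit = mu_s * f_n
        mag = np.linalg.norm(f_lat_raw)
        f_lat = f_lat_raw if mag <= limit else f_lat_raw * (limit / mag)
    else:                                             # kinetic regime
        f_lat = -mu_k * f_n * v_lat / speed           # opposes sliding
    return f_n, f_lat

fn, fl = contact_force(0.01, -0.1, np.array([0.2, 0.0]),
                       np.array([50.0, 0.0]), k=1e5, c=1e3, mu_s=0.8, mu_k=0.6)
```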
ERIC Educational Resources Information Center
Mumba, Frackson; Zhu, Mengxia
2013-01-01
This paper presents a Simulation-based interactive Virtual ClassRoom web system (SVCR: www.vclasie.com) powered by state-of-the-art cloud computing technology from Google. SVCR integrates popular free open-source math, science and engineering simulations and provides functions such as secure user access control and management of courses,…
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Hsu, Ying-Shao; Wu, Hsin-Kai
2016-01-01
We investigated the impact of an augmented reality (AR) versus interactive simulation (IS) activity incorporated in a computer learning environment to facilitate students' learning of a socio-scientific issue (SSI) on nuclear power plants and radiation pollution. We employed a quasi-experimental research design. Two classes (a total of 45…
NASA Technical Reports Server (NTRS)
Phillips, D. T.; Manseur, B.; Foster, J. W.
1982-01-01
Alternate definitions of system failure lead to complex analysis problems for which analytic solutions are available only in simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of such problems, in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
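A hedged sketch of a GRASP-style Monte Carlo evaluation, assuming exponential failure and repair laws for illustration; the predicate defining system failure is supplied by the caller, reflecting the "alternate definitions of system failure" the abstract mentions:

```python
import random

def simulate(mtbf, mttr, horizon, system_failed, runs=10_000):
    """Estimate P(system fails before `horizon`).
    mtbf/mttr: per-component mean time between failures / to repair.
    system_failed: predicate on the set of currently-down components."""
    failures = 0
    for _ in range(runs):
        t, down = 0.0, set()
        # next event time per component: a failure if up, a repair if down
        nxt = {i: random.expovariate(1.0 / mtbf[i]) for i in range(len(mtbf))}
        while t < horizon:
            i = min(nxt, key=nxt.get)             # earliest pending event
            t = nxt[i]
            if t >= horizon:
                break
            if i in down:                         # repair completes
                down.remove(i)
                nxt[i] = t + random.expovariate(1.0 / mtbf[i])
            else:                                 # component fails
                down.add(i)
                nxt[i] = t + random.expovariate(1.0 / mttr[i])
            if system_failed(down):
                failures += 1
                break
    return failures / runs

# e.g. a 2-of-3 system fails when any two components are down at once
p = simulate([1000.0] * 3, [10.0] * 3, horizon=5000.0,
             system_failed=lambda down: len(down) >= 2)
```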
A parallel implementation of an off-lattice individual-based model of multicellular populations
NASA Astrophysics Data System (ADS)
Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe
2015-07-01
As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
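A minimal sketch of the strategy described, assuming mpi4py and a 1-D strip decomposition with periodic neighbours for brevity; the paper's algorithm handles general domains and dynamic load balancing:

```python
import numpy as np
from mpi4py import MPI  # assumption: mpi4py as the MPI binding

# Split the domain into one strip per process; cells within an interaction
# cutoff of a strip edge are exchanged as "ghosts" so that forces across
# process boundaries are still seen.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

cutoff = 1.0                                   # cell interaction radius
x_min, x_max = 0.0, 100.0
width = (x_max - x_min) / size
lo = x_min + rank * width                      # this process's strip
hi = lo + width

rng = np.random.default_rng(rank)
cells = lo + width * rng.random(1000)          # cell positions owned here

left, right = (rank - 1) % size, (rank + 1) % size
ghosts_from_right = comm.sendrecv(cells[cells < lo + cutoff],
                                  dest=left, source=right)
ghosts_from_left = comm.sendrecv(cells[cells >= hi - cutoff],
                                 dest=right, source=left)
# force calculations on `cells` may now also consult the ghost copies
```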
NASA Technical Reports Server (NTRS)
Kumar, Uttam; Nemani, Ramakrishna R.; Ganguly, Sangram; Kalia, Subodh; Michaelis, Andrew
2017-01-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of substrate (S), vegetation (V) and dark objects (D) classes. Because of the sheer scale of the data and compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into four classes, namely forest, farmland, water and urban areas (with nighttime lights data from NPP-VIIRS, the National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite) over California, USA using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91 percent was achieved, a 6 percent improvement in unmixing-based classification relative to per-pixel-based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis, and societal and policy-relevant applications needed at the watershed scale.
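A sketch of fully constrained least squares unmixing for a single pixel, using the common sum-to-one augmentation trick with non-negative least squares; the synthetic endmembers and the weight delta are illustrative, not the WELD processing chain:

```python
import numpy as np
from scipy.optimize import nnls

def fcls(endmembers, pixel, delta=1e3):
    """Fully constrained least squares: abundances are non-negative and
    sum to one. The sum-to-one constraint is appended as a heavily
    weighted extra equation and the system is solved with NNLS.
    endmembers: (bands, classes) matrix; pixel: (bands,) spectrum."""
    A = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
    b = np.concatenate([pixel, [delta]])
    abundances, _ = nnls(A, b)
    return abundances

# Three synthetic endmembers (S, V, D) in six bands, mixed 50/30/20:
E = np.random.default_rng(0).random((6, 3))
x_true = np.array([0.5, 0.3, 0.2])
print(fcls(E, E @ x_true))   # recovers approximately [0.5, 0.3, 0.2]
```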
NASA Astrophysics Data System (ADS)
Midekisa, A.; Bennet, A.; Gething, P. W.; Holl, F.; Andrade-Pacheco, R.; Savory, D. J.; Hugh, S. J.
2016-12-01
Spatially detailed and temporally dynamic land use land cover data are necessary to monitor the state of the land surface for various applications. Yet, such data at a continental to global scale are lacking. Here, we developed high resolution (30 meter) annual land use land cover layers for continental Africa using Google Earth Engine. To capture ground truth training data, high resolution satellite imagery was visually inspected and used to identify 7,212 sample Landsat pixels that were comprised entirely of one of seven land use land cover classes (water, man-made impervious surface, high biomass, low biomass, rock, sand and bare soil). For model validation purposes, 80% of points from each class were used as training data, with 20% withheld as a validation dataset. Cloud-free Landsat 7 annual composites for 2000 to 2015 were generated, and spectral bands from the Landsat images were then extracted for each of the training and validation sample points. In addition to the Landsat spectral bands, spectral indices such as the normalized difference vegetation index (NDVI) and the normalized difference water index (NDWI) were used as covariates in the model. Additionally, calibrated nighttime lights imagery from the National Oceanic and Atmospheric Administration (NOAA) was included as a covariate. A decision tree classification algorithm was applied to predict the 7 land cover classes for the period 2000 to 2015 using the training dataset. Using the validation dataset, classification accuracy, including omission and commission errors, was computed for each land cover class. Model results showed that the overall accuracy of classification was high (88%). This high resolution land cover product developed for continental Africa will be available for public use and can potentially enhance the ability to monitor and study the state of the Earth's surface.
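A sketch of the classification step under stated assumptions: synthetic data, sklearn's decision tree standing in for the classifier, and one common NDWI definition; band indices follow Landsat 7 conventions (B3 = red, B4 = NIR, B5 = SWIR1):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each sample pixel carries six reflectance bands plus derived indices,
# and a decision tree predicts one of the seven land cover classes.
rng = np.random.default_rng(1)
bands = rng.random((7212, 6))                # synthetic reflectances
red, nir, swir = bands[:, 2], bands[:, 3], bands[:, 4]
ndvi = (nir - red) / (nir + red)             # vegetation index
ndwi = (nir - swir) / (nir + swir)           # water/moisture index
X = np.column_stack([bands, ndvi, ndwi])
y = rng.integers(0, 7, size=7212)            # 7 land cover class labels

train = rng.random(7212) < 0.8               # 80/20 train/validation split
clf = DecisionTreeClassifier().fit(X[train], y[train])
accuracy = clf.score(X[~train], y[~train])   # overall validation accuracy
```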
Satellite Direct Readout: Opportunities for Science Education
1994-02-01
…responsible for acid rain problems in that country. Along with our Earth science classes, which gave us information about the water cycle and weather… …rain damage. We also believe that transboundary pollution (between the United… …shows the water cycle (with sources of humidity and precipitation… …about the water cycle and weather fronts. We also collected data on the location of… …computer images we collected a series of weather maps from the…
NASA Technical Reports Server (NTRS)
Hueschen, Richard M.
2011-01-01
A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
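Two of the model elements described, a first-order engine lag and a rate- and position-limited actuator, might be sketched as follows (the constants are illustrative, not the simulation's values):

```python
def engine_step(thrust, thrust_cmd, tau, dt):
    """First-order response: d(thrust)/dt = (command - thrust) / tau.
    The time constant tau would vary with flight condition."""
    return thrust + dt * (thrust_cmd - thrust) / tau

def actuator_step(pos, pos_cmd, rate_limit, pos_min, pos_max, dt):
    """Surface actuator with a rate limit and fixed position limits."""
    rate = max(-rate_limit, min(rate_limit, (pos_cmd - pos) / dt))
    return max(pos_min, min(pos_max, pos + rate * dt))

thrust, elevator = 0.0, 0.0
for _ in range(100):                           # 2 s at 50 Hz
    thrust = engine_step(thrust, 40000.0, tau=1.5, dt=0.02)
    elevator = actuator_step(elevator, -10.0, 40.0, -25.0, 15.0, dt=0.02)
```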
Cyberdyn supercomputer - a tool for imaging geodynamic processes
NASA Astrophysics Data System (ADS)
Pomeran, Mihai; Manea, Vlad; Besutiu, Lucian; Zlagnean, Luminita
2014-05-01
More and more physical processes that develop within the deep interior of our planet, yet have significant impact on the Earth's shape and structure, are becoming subject to numerical modelling using high performance computing facilities. Worldwide, an increasing number of research centers now make use of such powerful and fast computers to simulate complex phenomena involving fluid dynamics and to gain deeper insight into intricate problems of Earth's evolution. With the CYBERDYN cybernetic infrastructure (CCI), the Solid Earth Dynamics Department in the Institute of Geodynamics of the Romanian Academy boldly steps into the 21st century by entering the research area of computational geodynamics. The project that made this advancement possible was jointly supported by the EU and the Romanian Government through the Structural and Cohesion Funds. It lasted about three years, ending in October 2013. CCI is basically a modern high performance Beowulf-type supercomputer (HPCC), combined with a high performance visualization cluster (HPVC) and a GeoWall. The infrastructure is structured around 1344 cores and 3 TB of RAM. The high speed interconnect is provided by a QLogic InfiniBand switch, able to transfer up to 40 Gbps. The CCI storage component is a 40 TB Panasas NAS. The operating system is Linux (CentOS). For control and maintenance, the Bright Cluster Manager package is used. The SGE job scheduler manages the job queues. CCI has been designed for a theoretical peak performance of up to 11.2 TFlops. Speed tests showed that a high resolution numerical model (256 × 256 × 128 FEM elements) could be resolved at a mean computational speed of one time step per 30 seconds, employing only a fraction (20%) of the computing power. After passing the mandatory tests, the CCI has been involved in numerical modelling of various scenarios related to the East Carpathians' tectonic and geodynamic evolution, including the Neogene magmatic activity and the intriguing intermediate-depth seismicity within the so-called Vrancea zone. The CFD code used for numerical modelling is CitcomS, a widely employed open source package specifically developed for the earth sciences. Several preliminary 3D geodynamic models simulating an assumed subduction or the effect of a mantle plume will be presented and discussed.
NASA Astrophysics Data System (ADS)
Kettle, L. M.; Mora, P.; Weatherley, D.; Gross, L.; Xing, H.
2006-12-01
Simulations using the Finite Element method are widely used in many engineering applications and for the solution of partial differential equations (PDEs). Computational models based on the solution of PDEs play a key role in earth systems simulations. We present numerical modelling of crustal fault systems where the dynamic elastic wave equation is solved using the Finite Element method. This is achieved using a high level computational modelling language, escript, available as open source software from ACcESS (Australian Computational Earth Systems Simulator), the University of Queensland. Escript is an advanced geophysical simulation software package developed at ACcESS which includes parallel equation solvers, data visualisation and data analysis software. The escript library was implemented to develop a flexible Finite Element model which reliably simulates the mechanism of faulting and the physics of earthquakes. Both 2D and 3D elastodynamic models are being developed to study the dynamics of crustal fault systems. Our final goal is to build a flexible model which can be applied to any fault system with user-defined geometry and input parameters. To study the physics of earthquake processes, two different time scales must be modelled: firstly, the quasi-static loading phase which gradually increases stress in the system (~100 years), and secondly, the dynamic rupture process which rapidly redistributes stress in the system (~100 seconds). We will discuss the solution of the time-dependent elastic wave equation for an arbitrary fault system using escript. This involves prescribing the correct initial stress distribution in the system to simulate the quasi-static loading of faults to failure; determining a suitable frictional constitutive law which accurately reproduces the dynamics of the stick/slip instability at the faults; and using a robust time integration scheme. These dynamic models generate data and information that can be used for earthquake forecasting.
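As a generic illustration of the dynamic phase (not escript's API), explicit time integration of the 1-D elastic wave equation rho*u_tt = (E*u_x)_x can be sketched as:

```python
import numpy as np

nx, L = 400, 1000.0
dx = L / nx
rho, E = 2700.0, 3e10                  # rock-like density and stiffness
c = np.sqrt(E / rho)                   # elastic wave speed
dt = 0.9 * dx / c                      # CFL-limited explicit time step

x = np.arange(nx) * dx
u = np.exp(-((x - L / 2) / 20.0)**2)   # initial displacement pulse
u_prev = u.copy()                      # start at rest

for _ in range(500):                   # central differences in space and time
    u_xx = np.zeros(nx)
    u_xx[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u_prev, u = u, 2 * u - u_prev + dt**2 * (E / rho) * u_xx
```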
Bioastrophysical aspects of low energy ion irradiation of frozen anthracene containing water.
Tuleta, M; Gabła, L; Madej, J
2001-08-13
The origin of life on Earth remains a fascinating mystery in spite of the many theories on the subject. It seems, however, that simple prebiotic molecules could have played an essential role in the formation of more complex organisms. In our experiment, we synthesized a class of these molecules (quinones) by bombarding frozen, anthracene-containing water with low-energy hydrogen ions. This experiment roughly simulated the astrophysical conditions found in the solar system. Thus, we can hypothesize that prebiotic molecules could be created by the interaction of the solar wind with interplanetary dust grains. The delivery of these molecules to the early Earth may have contributed to the generation of life on our planet.
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
Geospace simulations using modern accelerator processor technology
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Raeder, J.; Larson, D. J.
2009-12-01
OpenGGCM (Open Geospace General Circulation Model) is a well-established numerical code simulating the Earth's space environment. The most computing-intensive part is the MHD (magnetohydrodynamics) solver that models the plasma surrounding Earth and its interaction with Earth's magnetic field and the solar wind flowing in from the sun. Like other global magnetosphere codes, OpenGGCM's realism is currently limited by computational constraints on grid resolution. OpenGGCM has been ported to make use of the added computational power of modern accelerator-based processor architectures, in particular the Cell processor. The Cell architecture is a novel inhomogeneous multicore architecture capable of achieving up to 230 GFlops on a single chip. The University of New Hampshire recently acquired a PowerXCell 8i based computing cluster, and here we report initial performance results for OpenGGCM. Realizing the high theoretical performance of the Cell processor is a programming challenge, though. We implemented the MHD solver using a multi-level parallelization approach: on the coarsest level, the problem is distributed to processors based upon the usual domain decomposition approach. Then, on each processor, the problem is divided into 3D columns, each of which is handled by the memory-limited SPEs (synergistic processing elements) slice by slice. Finally, SIMD instructions are used to fully exploit the SIMD FPUs in each SPE. Memory management must be handled explicitly by the code, using DMA to move data from main memory to the per-SPE local store and vice versa. We use a modern technique, automatic code generation, which shields the application programmer from having to deal with all of the implementation details just described, keeping the code much more easily maintainable. Our preliminary results indicate excellent performance: a speed-up of a factor of 30 compared to the unoptimized version.
Concept development of automatic guidance for rotorcraft obstacle avoidance
NASA Technical Reports Server (NTRS)
Cheng, Victor H. L.
1990-01-01
The automatic guidance of rotorcraft for obstacle avoidance in nap-of-the-earth flight is studied. A hierarchical breakdown of the guidance components is used to identify the functional requirements. These requirements and anticipated sensor capabilities lead to a preliminary guidance concept, which has been evaluated via computer simulations.
Equations of State: Gateway to Planetary Origin and Evolution (Invited)
NASA Astrophysics Data System (ADS)
Melosh, J.
2013-12-01
Research over the past decades has shown that collisions between solid bodies govern many crucial phases of planetary origin and evolution. The accretion of the terrestrial planets was punctuated by planetary-scale impacts that generated deep magma oceans, ejected primary atmospheres and probably created the moons of Earth and Pluto. Several extrasolar planetary systems are filled with silicate vapor and condensed 'tektites', probably attesting to recent giant collisions. Even now, long after the solar system settled down from its violent birth, a large asteroid impact wiped out the dinosaurs, while other impacts may have played a role in the origin of life on Earth and perhaps Mars, while maintaining a steady exchange of small meteorites between the terrestrial planets and our moon. Most of these events are beyond the scale at which experiments are possible, so that our main research tool is computer simulation, constrained by the laws of physics and the behavior of materials during high-speed impact. Typical solar system impact velocities range from a few km/s in the outer solar system to 10s of km/s in the inner system. Extrasolar planetary systems expand that range to 100s of km/sec typical of the tightly clustered planetary systems now observed. Although computer codes themselves are currently reaching a high degree of sophistication, we still rely on experimental studies to determine the Equations of State (EoS) of materials critical for the correct simulation of impact processes. The recent expansion of the range of pressures available for study, from a few 100 GPa accessible with light gas guns up to a few TPa from current high energy accelerators now opens experimental access to the full velocity range of interest in our solar system. The results are a surprise: several groups in both the USA and Japan have found that silicates and even iron melt and vaporize much more easily in an impact than previously anticipated. The importance of these findings is illustrated by the impact origin of our Moon. Computer simulations that do not take account of the liquid/vapor phase change are unable to retain any material in orbit around the Earth after a planetary impact. A purely gaseous disk around the Earth is wracked by gravitational instabilities and soon collapses back onto the Earth. Only if the silicate EoS also includes a liquid phase can a disk remain stable long enough to condense into a moon. The implications of this new-found ease of vaporization have yet to be fully explored, but it seems clear that current ideas must undergo extensive revision. More melt and vapor production in impacts implies much larger volume changes of the impacted materials and hence more energetic post-impact expansion. EoSs are thus of vital importance to our understanding of the evolution of planetary systems. Computer simulations can (and must!) substitute for experiments for many aspects of large planetary collisions, but so far experiments are leading theory in accurate determination of equations of state. Yet, the fidelity of the computer simulations to Nature can be only as good as the accuracy of the inputs, making further experimental study of EoS a central task in the exploration and elucidation of our solar system and of planetary systems in general.
OASIS - ORBIT ANALYSIS AND SIMULATION SOFTWARE
NASA Technical Reports Server (NTRS)
Wu, S. C.
1994-01-01
The Orbit Analysis and Simulation Software, OASIS, is a software system developed for covariance and simulation analyses of problems involving earth satellites, especially the Global Positioning System (GPS). It provides a flexible, versatile and efficient accuracy analysis tool for earth satellite navigation and GPS-based geodetic studies. To make future modifications and enhancements easy, the system is modular, with five major modules: PATH/VARY, REGRES, PMOD, FILTER/SMOOTHER, and OUTPUT PROCESSOR. PATH/VARY generates satellite trajectories. Among the factors taken into consideration are: 1) the gravitational effects of the planets, moon and sun; 2) space vehicle orientation and shapes; 3) solar pressure; 4) solar radiation reflected from the surface of the earth; 5) atmospheric drag; and 6) space vehicle gas leaks. The REGRES module reads the user's input, then determines if a measurement should be made based on geometry and time. PMOD modifies a previously generated REGRES file to facilitate various analysis needs. FILTER/SMOOTHER is especially suited to a multi-satellite precise orbit determination and geodetic-type problems. It can be used for any situation where parameters are simultaneously estimated from measurements and a priori information. Examples of nonspacecraft areas of potential application might be Very Long Baseline Interferometry (VLBI) geodesy and radio source catalogue studies. OUTPUT PROCESSOR translates covariance analysis results generated by FILTER/SMOOTHER into user-desired easy-to-read quantities, performs mapping of orbit covariances and simulated solutions, transforms results into different coordinate systems, and computes post-fit residuals. The OASIS program was developed in 1986. It is designed to be implemented on a DEC VAX 11/780 computer using VAX VMS 3.7 or higher. It can also be implemented on a Micro VAX II provided sufficient disk space is available.
An analysis of the low-earth-orbit communications environment
NASA Astrophysics Data System (ADS)
Diersing, Robert Joseph
Advances in microprocessor technology and availability of launch opportunities have caused interest in low-earth-orbit satellite-based communications systems to increase dramatically during the past several years. In this research the capabilities of two low-cost, store-and-forward LEO communications satellites operating in the public domain are examined--PACSAT-1 (operated by the Radio Amateur Satellite Corporation) and UoSAT-3 (operated by the University of Surrey, England, Electrical Engineering Department). The file broadcasting and file transfer facilities are examined in detail and a simulation model of the downlink traffic pattern is developed. The simulator will aid the assessment of changes in design and implementation for other systems. The development of the downlink traffic simulator is based on three major parts. First is a characterization of the low-earth-orbit operating environment along with preliminary measurements of the PACSAT-1 and UoSAT-3 systems including: satellite visibility constraints on communications, monitoring equipment configuration, link margin computations, determination of block and bit error rates, and establishing typical data capture rates for ground stations using computer-pointed directional antennas and fixed omni-directional antennas. Second, arrival rates for successful and unsuccessful file server connections are established along with transaction service times. Downlink traffic has been further characterized by measuring: frame and byte counts for all data-link layer traffic; 30-second interval average response time for all traffic and for file server traffic only; file server response time on a per-connection basis; and retry rates for information and supervisory frames. Finally, the model is verified by comparison with measurements of actual traffic not previously used in the model building process. The simulator is then used to predict operation of the PACSAT-1 satellite with modifications to the original design.
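A toy sketch of the kind of downlink traffic model described: Poisson connection attempts during a visibility pass, a single file-server channel, and exponential service times (all rates are illustrative, not the measured PACSAT-1/UoSAT-3 values):

```python
import random

def simulate_pass(pass_seconds=600, arrival_rate=0.02, mean_service=45.0):
    """Count connections served during one visibility pass; attempts that
    arrive while the server is busy are lost (no queueing)."""
    t, served, busy_until = 0.0, 0, 0.0
    while True:
        t += random.expovariate(arrival_rate)        # next connection attempt
        if t > pass_seconds:
            return served
        if t >= busy_until:                          # server free: accept
            busy_until = t + random.expovariate(1.0 / mean_service)
            served += 1

passes = [simulate_pass() for _ in range(1000)]
print(sum(passes) / len(passes), "successful connections per pass (mean)")
```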
Additional Developments in Atmosphere Revitalization Modeling and Simulation
NASA Technical Reports Server (NTRS)
Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos
2013-01-01
NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.
NASA Technical Reports Server (NTRS)
Luther, M. R.
1981-01-01
The Earth Radiation Budget Experiment (ERBE) is to fly on NASA's Earth Radiation Budget Satellite (ERBS) and on NOAA F and NOAA G. Large spatial scale earth energy budget data will be derived primarily from measurements made by the ERBE nonscanning instrument (ERBE-NS). A description is given of a mathematical model capable of simulating the radiometric response of any of the ERBE-NS earth viewing channels. The model uses a Monte Carlo method to accurately account for directional distributions of emission and reflection from optical surfaces which are neither strictly diffuse nor strictly specular. The model computes radiation exchange factors among optical system components, and determines the distribution in the optical system of energy from an outside source. Attention is also given to an approach for implementing the model and results obtained from the implementation.
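The Monte Carlo idea can be sketched for the simplest exchange-factor case, diffuse emission between two parallel strips in 2-D; the real model adds mixed specular/diffuse reflection, multiple bounces, and the actual channel geometry:

```python
import numpy as np

# Emit many rays from one surface with a cosine-weighted (diffuse)
# direction distribution and count the fraction intercepted by another
# surface; the estimate can be checked against the analytic 2-D view factor.
rng = np.random.default_rng(0)

def exchange_factor(h=1.0, n=200_000):
    hits = 0
    for _ in range(n):
        x0 = rng.uniform(-0.5, 0.5)               # emission point on strip 1
        theta = np.arcsin(rng.uniform(-1, 1))     # cosine-weighted angle (2-D)
        x1 = x0 + h * np.tan(theta)               # crossing point on plane 2
        hits += abs(x1) <= 0.5                    # did it hit strip 2?
    return hits / n

print(exchange_factor())
```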
NASA Astrophysics Data System (ADS)
Rybus, Tomasz; Seweryn, Karol
2016-03-01
All devices designed to be used in space must be thoroughly tested in relevant conditions. For several classes of devices, reduced gravity conditions are the key factor. In early stages of development, and later for financial reasons, the tests need to be done on Earth. However, in Earth conditions it is impossible to obtain a different gravity field independent of all linear and rotational spatial coordinates. Therefore, various test-bed systems are used, with their design driven by the device's specific needs. One such test-bed is the planar air-bearing microgravity simulator. In this approach, the tested objects (e.g., manipulators intended for on-orbit operations or vehicles simulating satellites in a close formation flight) are mounted on planar air-bearings that allow almost frictionless motion on a flat surface, thus simulating microgravity conditions in two dimensions. In this paper we present a comprehensive review of research activities related to planar air-bearing microgravity simulators, demonstrating achievements of the most active research groups and describing the newest trends and ideas, such as tests of landing gears for low-g bodies. Major design parameters of air-bearing test-beds are also reviewed, and a list of notable existing test-beds is presented.
UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies
NASA Astrophysics Data System (ADS)
Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.
2007-12-01
Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource. Thus the system could benefit from a thorough retooling of our current data processing procedures and infrastructure. Emerging technologies, like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA-Echo, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, where web-services are increasingly designed to serve computer-to-computer communications without human interaction and complex analyses can be carried out over distributed computer resources interconnected via cyber infrastructure. The UNH Earth System Data Collaborative is designed to utilize the aforementioned emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web-services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services. While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend considerable amounts of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as parallelization of model codes. Our modeling infrastructure is designed to take care of the bulk of the common tasks found in complex earth system models, like I/O handling, computational domain and time management, and parallel execution of the modeling tasks. The modeling infrastructure allows scientists to focus on the numerical implementation of the physical processes on single computational objects (typically grid cells) while the framework takes care of the preprocessing of input data, establishing the data exchange between computational objects, and executing the science code. In our presentation, we will discuss the key concepts of our modeling infrastructure. We will demonstrate integration of our modeling framework with data services offered by the UNH Earth System Data Collaborative via web interfaces. We will lay out the road map to turn our prototype modeling environment into a truly community framework for a wide range of earth system scientists and environmental managers.
NASA Astrophysics Data System (ADS)
Fukazawa, K.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.
2016-12-01
Planetary magnetospheres are very large, while phenomena within them occur on meso- and micro-scales. These scales range from tens of planetary radii down to kilometers. To understand dynamics in these multi-scale systems, numerical simulations have been performed on supercomputer systems. We have long studied the magnetospheres of Earth, Jupiter and Saturn using 3-dimensional magnetohydrodynamic (MHD) simulations; however, we have not captured phenomena near the limits of the MHD approximation. In particular, we have not studied meso-scale phenomena that can be addressed by using MHD. Recently we performed an MHD simulation of Earth's magnetosphere using the K computer, the first 10-PFlops supercomputer, and obtained multi-scale flow vorticity for both northward and southward IMF. Furthermore, we have access to supercomputer systems with Xeon, SPARC64, and vector-type CPUs and can compare simulation results across the different systems. Finally, we have compared the results of our parameter survey of the magnetosphere with observations from the HISAKI spacecraft. We have encountered a number of difficulties in effectively using the latest supercomputer systems. First, the size of the simulation output has increased greatly: a simulation group now produces over 1 PB of output, and storage and analysis of this much data is difficult. The traditional way to analyze simulation results is to move them to the investigator's home computer, which takes over three months on an end-to-end 10 Gbps network; in practice, problems at some nodes, such as firewalls, can increase the transfer time to over one year. Another issue is post-processing: it is hard to handle even a few TB of simulation output given the memory limitations of a post-processing computer. To overcome these issues, we have developed and introduced parallel network storage, a highly efficient network protocol, and CUI-based visualization tools. In this study, we will show the latest simulation results obtained using the petascale supercomputer and discuss problems arising from the use of these supercomputer systems.
Real-Time Climate Simulations in the Interactive 3D Game Universe Sandbox ²
NASA Astrophysics Data System (ADS)
Goldenson, N. L.
2014-12-01
Exploration in an open-ended computer game is an engaging way to explore climate and climate change. Everyone can explore physical models with real-time visualization in the educational simulator Universe Sandbox ² (universesandbox.com/2), which includes basic climate simulations on planets. I have implemented a time-dependent, one-dimensional meridional heat transport energy balance model to run and be adjustable in real time in the midst of a larger simulated system. Universe Sandbox ² is based on the original game - at its core a gravity simulator - with other new physically-based content for stellar evolution, and handling collisions between bodies. Existing users are mostly science enthusiasts in informal settings. We believe that this is the first climate simulation to be implemented in a professionally developed computer game with modern 3D graphical output in real time. The type of simple climate model we've adopted helps us depict the seasonal cycle and the more drastic changes that come from changing the orbit or other external forcings. Users can alter the climate as the simulation is running by altering the star(s) in the simulation, dragging to change orbits and obliquity, adjusting the climate simulation parameters directly or changing other properties like CO2 concentration that affect the model parameters in representative ways. Ongoing visuals of the expansion and contraction of sea ice and snow-cover respond to the temperature calculations, and make it accessible to explore a variety of scenarios and intuitive to understand the output. Variables like temperature can also be graphed in real time. We balance computational constraints with the ability to capture the physical phenomena we wish to visualize, giving everyone access to a simple open-ended meridional energy balance climate simulation to explore and experiment with. The software lends itself to labs at a variety of levels about climate concepts including seasons, the Greenhouse effect, reservoirs and flows, albedo feedback, Snowball Earth, climate sensitivity, and model experiment design. Climate calculations are extended to Mars with some modifications to the Earth climate component, and could be used in lessons about the Mars atmosphere, and exploring scenarios of Mars climate history.
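A minimal one-dimensional energy balance model of the type described, with textbook coefficients (not those used in Universe Sandbox ²) and a crude ice-albedo feedback:

```python
import numpy as np

n = 90
lat = np.deg2rad(np.linspace(-89, 89, n))
x = np.sin(lat)                                    # meridional coordinate
S = 1361.0 / 4 * (1 - 0.482 * (3 * x**2 - 1) / 2)  # annual-mean insolation
A, B = 210.0, 2.0       # linearized OLR = A + B*T (W m-2, T in deg C)
D = 0.6                 # meridional diffusivity (W m-2 K-1)
C = 4e7                 # column heat capacity (J m-2 K-1)
T = 15.0 - 30.0 * x**2  # initial temperature guess

dx, dt = x[1] - x[0], 10800.0                      # 3-hour step (stable here)
for _ in range(30 * 365 * 8):                      # integrate ~30 years
    albedo = np.where(T < -10.0, 0.62, 0.30)       # crude ice-albedo feedback
    # diffusion in x = sin(lat): d/dx[(1 - x^2) dT/dx]
    flux = (1 - ((x[:-1] + x[1:]) / 2)**2) * np.diff(T) / dx
    transport = D * np.diff(np.concatenate([[0.0], flux, [0.0]])) / dx
    T += dt / C * (S * (1 - albedo) - (A + B * T) + transport)

print("global mean T:", np.average(T, weights=np.cos(lat)), "deg C")
```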
Global MHD simulation of magnetosphere using HPF
NASA Astrophysics Data System (ADS)
Ogino, T.
We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code proved almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops with an efficiency of 76.5%, using 56 PEs of the Fujitsu VPP5000/56 in vector and parallel computation, which permitted comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and that a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
Human Exploration of Earth's Neighborhood and Mars
NASA Technical Reports Server (NTRS)
Condon, Gerald
2003-01-01
The presentation examines Mars landing scenarios; Earth-to-Moon transfers, comparing direct transfers with transfers via the libration points; lunar transfer/orbit diagrams; a comparison of opposition-class and conjunction-class missions; and artificial gravity for human exploration missions. Slides related to Mars landing scenarios include: mission scenario; direct entry landing locations; 2005 opportunity - Type 1; Earth-Mars superior conjunction; lander latitude accessibility; low thrust - Earth return phase; SEP Earth return sequence; missions - 200, 2007, 2009; and mission map. Slides related to Earth-to-Moon transfers (direct vs. via libration points L1 and L2) include: libration point missions; expeditionary vs. evolutionary; Earth-Moon L1 as a gateway for lunar surface operations; and lunar mission libration point vs. lunar orbit rendezvous (LOR). Slides related to lunar transfer/orbit diagrams include: trans-lunar trajectory from ISS parking orbit; trans-Earth trajectories; parking orbit considerations; and landing latitude restrictions. Slides related to the comparison of opposition-class (short-stay) and conjunction-class (long-stay) missions for human exploration of Mars include: Mars mission planning; Earth-Mars orbital characteristics; delta-V variations; and Mars mission duration comparison. Slides related to artificial gravity for human exploration missions include: current configuration; NEP thruster location trades; minor axis rotation; and example load paths.
HydroViz: design and evaluation of a Web-based tool for improving hydrology education
NASA Astrophysics Data System (ADS)
Habib, E.; Ma, Y.; Williams, D.; Sharif, H. O.; Hossain, F.
2012-10-01
HydroViz is a Web-based, student-centered, educational tool designed to support active learning in the field of Engineering Hydrology. The design of HydroViz is guided by a learning model that is based on learning with data and simulations, using real-world natural hydrologic systems to convey theoretical concepts, and using Web-based technologies for dissemination of the hydrologic education developments. This model, while being used in a hydrologic education context, can be adapted in other engineering educational settings. HydroViz leverages the free Google Earth resources to enable presentation of geospatial data layers and embed them in web pages that have the same look and feel of Google Earth. These design features significantly facilitate the dissemination and adoption of HydroViz by any interested educational institution regardless of its access to data or computer models. To facilitate classroom usage, HydroViz is populated with a set of course modules that can be used incrementally within different stages of an engineering hydrology curriculum. A pilot evaluation study was conducted to determine the effectiveness of the HydroViz tool in delivering its educational content, to examine the buy-in of the program by faculty and students, and to identify specific project components that need to be further pursued and improved. A total of 182 students from seven freshman- and senior-level undergraduate classes in three universities participated in the study. HydroViz was effective in facilitating students' learning and understanding of hydrologic concepts and increasing related skills. Students had positive perceptions of various features of HydroViz and believed that HydroViz fits well in the curriculum. In general, HydroViz tended to be more effective with students in senior-level classes than with students in freshman classes. Lessons learned from this pilot study provide guidance for future adaptation and expansion studies to scale up the application and utility of HydroViz and other similar systems in various hydrology and water-resource engineering curriculum settings. The paper presents a set of design principles that contribute to the development of other active hydrology educational systems.
Good coupling for the multiscale patch scheme on systems with microscale heterogeneity
NASA Astrophysics Data System (ADS)
Bunder, J. E.; Roberts, A. J.; Kevrekidis, I. G.
2017-05-01
Computational simulation of microscale detailed systems is frequently only feasible over spatial domains much smaller than the macroscale of interest. The 'equation-free' methodology couples many small patches of microscale computations across space to empower efficient computational simulation over macroscale domains of interest. Motivated by molecular or agent simulations, we analyse the performance of various coupling schemes for patches when the microscale is inherently 'rough'. As a canonical problem in this universality class, we systematically analyse the case of heterogeneous diffusion on a lattice. Computer algebra explores how the dynamics of coupled patches predict the large scale emergent macroscale dynamics of the computational scheme. We determine good design for the coupling of patches by comparing the macroscale predictions from patch dynamics with the emergent macroscale on the entire domain, thus minimising the computational error of the multiscale modelling. The minimal error on the macroscale is obtained when the coupling utilises averaging regions which are between a third and a half of the patch. Moreover, when the symmetry of the inter-patch coupling matches that of the underlying microscale structure, patch dynamics predicts the desired macroscale dynamics to any specified order of error. The results confirm that the patch scheme is useful for macroscale computational simulation of a range of systems with microscale heterogeneity.
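The core idea, small simulated patches coupled across unsimulated gaps by interpolating patch-core averages out to the patch edges, can be sketched for 1D heterogeneous diffusion. This is an illustrative low-order version under assumed parameters; the paper analyses higher-order inter-patch coupling and the optimal width of the averaging regions.

```python
import numpy as np

# Equation-free patch scheme for 1D heterogeneous diffusion (a sketch).
L, N, n = 2 * np.pi, 8, 11          # domain, number of patches, micro points
h = L / N                           # macroscale patch spacing
delta = h / 10                      # patch half-width
X = (np.arange(N) + 0.5) * h        # patch centres
dx = 2 * delta / (n - 1)            # microscale grid spacing
x = X[:, None] + np.linspace(-delta, delta, n)[None, :]
Kh = 1 + 0.9 * np.sin(40 * (x[:, :-1] + dx / 2))  # rough microscale diffusivity

u = np.sin(x)                       # initial field sampled on the patches
dt = 5e-5                           # explicit micro step, within stability
for _ in range(20000):
    ubar = u[:, n // 2 - 1:n // 2 + 2].mean(axis=1)   # patch-core averages
    grad = (np.roll(ubar, -1) - np.roll(ubar, 1)) / (2 * h)
    u[:, 0] = ubar - delta * grad   # coupling: neighbours' core averages
    u[:, -1] = ubar + delta * grad  # set each patch's edge values
    u[:, 1:-1] += dt / dx**2 * (Kh[:, 1:] * (u[:, 2:] - u[:, 1:-1])
                                - Kh[:, :-1] * (u[:, 1:-1] - u[:, :-2]))
print("macroscale amplitude after diffusion:", np.abs(ubar).max())
```

The decay of the patch-core averages tracks the emergent macroscale diffusion even though only a tenth of the domain is ever simulated.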
NASA Technical Reports Server (NTRS)
Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon
2014-01-01
This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground-based, commercial off-the-shelf lasers. Past research has shown that a few ground-based systems consisting of 10-kilowatt-class lasers directed by 1.5-meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that regularly updates the LightForce engagement strategy, as it would be updated during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system. Results indicate that, utilizing a network of four LightForce stations with 20-kilowatt lasers, 85% of all conjunctions with a probability of collision Pc > 10^-6 can be mitigated.
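The screening step, propagating catalogued objects and flagging close approaches, can be sketched with the public sgp4 Python package. This is a hedged illustration of that one step only, with TLE strings supplied by the caller; the LightForce pipeline additionally performs least-squares TLE fitting, collision probability estimation, and engagement simulation.

```python
import numpy as np
from sgp4.api import Satrec, jday   # pip install sgp4

def min_separation(tle_a, tle_b, year, month, day, hours=24.0, step_s=10.0):
    """Propagate two TLEs and return their minimum separation in km.

    tle_a and tle_b are (line1, line2) string pairs. A conjunction-
    screening sketch, not the LightForce code.
    """
    sat_a = Satrec.twoline2rv(*tle_a)
    sat_b = Satrec.twoline2rv(*tle_b)
    jd0, fr0 = jday(year, month, day, 0, 0, 0.0)
    best = np.inf
    for k in range(int(hours * 3600 / step_s)):
        fr = fr0 + k * step_s / 86400.0
        ea, ra, _ = sat_a.sgp4(jd0, fr)   # error code, position (km), velocity
        eb, rb, _ = sat_b.sgp4(jd0, fr)
        if ea == 0 and eb == 0:           # 0 means propagation succeeded
            best = min(best, np.linalg.norm(np.subtract(ra, rb)))
    return best
```

In a full screening run, this pairwise check is exactly the part that parallelizes across hundreds of processors, since every object pair is independent.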
Multigrid preconditioned conjugate-gradient method for large-scale wave-front reconstruction.
Gilles, Luc; Vogel, Curtis R; Ellerbroek, Brent L
2002-09-01
We introduce a multigrid preconditioned conjugate-gradient (MGCG) iterative scheme for computing open-loop wave-front reconstructors for extreme adaptive optics systems. We present numerical simulations for a 17-m class telescope with n = 48756 sensor measurement grid points within the aperture, which indicate that our MGCG method has a rapid convergence rate for a wide range of subaperture average slope measurement signal-to-noise ratios. The total computational cost is of order n log n. Hence our scheme provides for fast wave-front simulation and control in large-scale adaptive optics systems.
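A preconditioned conjugate-gradient loop of this general shape is easy to reproduce with SciPy. In the sketch below, a 2D Poisson matrix stands in for the reconstruction normal equations and a simple Jacobi (diagonal) preconditioner stands in for the paper's multigrid cycle, so this illustrates the MGCG structure under stated substitutions, not the authors' reconstructor.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

# Model problem: 2D Poisson matrix of size m*m (stand-in for the
# wavefront-reconstruction normal equations).
m = 128
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(m, m))
A = (sp.kron(sp.identity(m), T) + sp.kron(T, sp.identity(m))).tocsr()
b = np.random.default_rng(0).standard_normal(m * m)

# Jacobi preconditioner (a real MGCG method applies a multigrid cycle here).
dinv = 1.0 / A.diagonal()
M = LinearOperator(A.shape, matvec=lambda r: dinv * r)

x, info = cg(A, b, M=M, maxiter=1000)
print("converged:", info == 0,
      "relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

Replacing the Jacobi sweep with a multigrid V-cycle is what buys the rapid, noise-robust convergence and the O(n log n) total cost reported in the abstract.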
Efficient Parallel Algorithm For Direct Numerical Simulation of Turbulent Flows
NASA Technical Reports Server (NTRS)
Moitra, Stuti; Gatski, Thomas B.
1997-01-01
A distributed algorithm for a high-order-accurate finite-difference approach to the direct numerical simulation (DNS) of transition and turbulence in compressible flows is described. This work has two major objectives. The first objective is to demonstrate that parallel and distributed-memory machines can be successfully and efficiently used to solve computationally intensive and input/output intensive algorithms of the DNS class. The second objective is to show that the computational complexity involved in solving the tridiagonal systems inherent in the DNS algorithm can be reduced by algorithm innovations that obviate the need to use a parallelized tridiagonal solver.
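The tridiagonal systems mentioned are classically solved with the Thomas algorithm, whose forward-elimination and back-substitution sweeps are inherently sequential, which is precisely why a DNS code must restructure the computation rather than parallelize this solver directly. A generic sketch (not the LaRC code):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(n).

    a: sub-diagonal (length n-1), b: diagonal (n), c: super-diagonal (n-1),
    d: right-hand side (n). Each sweep depends on the previous entry, so
    the recurrence cannot be parallelized as written.
    """
    n = len(b)
    cp, dp = np.empty(n - 1), np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against numpy's dense solver.
rng = np.random.default_rng(0)
n = 6
a, c = rng.random(n - 1), rng.random(n - 1)
b = 2 + rng.random(n)                     # diagonally dominant
d = rng.random(n)
A = np.diag(b) + np.diag(a, -1) + np.diag(c, 1)
assert np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d))
```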
NASA Astrophysics Data System (ADS)
Pazmino, John
2007-02-01
Many concepts of chaotic action in astrodynamics can be appreciated through simulations with home computers and software. Many astrodynamical cases are illustrated. Although chaos theory is now applied to spaceflight trajectories, this presentation employs only inert bodies with no onboard impulse, e.g., from rockets or outgassing. Other nongravitational effects are also ignored, such as atmospheric drag, solar pressure, and radiation. The ability to simulate gravity behavior, even if not completely rigorously, on small mass-market computers allows a fuller understanding of the new approach to astrodynamics by home astronomers, scientists outside orbital mechanics, and students in middle and high school. The simulations can also help a lay audience visualize gravity behavior during press conferences, briefings, and public lectures. No review, evaluation, or critique of the programs shown in this presentation is intended. The results from these simulations are not valid for - and must not be used for - making Earth-collision predictions.
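A concrete example of the kind of home-computer experiment described is the planar circular restricted three-body problem in the rotating frame, where two trajectories started a billionth of a unit apart visibly diverge. The initial state and step size below are arbitrary choices for illustration.

```python
import numpy as np

# Planar circular restricted three-body problem, Earth-Moon system,
# rotating frame with nondimensional units. mu = Moon/(Earth+Moon).
mu = 0.01215

def deriv(s):
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)          # distance to Earth
    r2 = np.hypot(x - 1 + mu, y)      # distance to Moon
    ax = x + 2 * vy - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = y - 2 * vx - (1 - mu) * y / r1**3 - mu * y / r2**3
    return np.array([vx, vy, ax, ay])

def rk4(s, dt, steps):
    for _ in range(steps):            # classic 4th-order Runge-Kutta
        k1 = deriv(s)
        k2 = deriv(s + 0.5 * dt * k1)
        k3 = deriv(s + 0.5 * dt * k2)
        k4 = deriv(s + dt * k3)
        s = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

# Two nearly identical initial states: sensitive dependence is the
# hallmark of chaos the presentation demonstrates.
a = rk4(np.array([0.5, 0.0, 0.0, 0.85]), 1e-3, 20000)
b = rk4(np.array([0.5, 0.0, 0.0, 0.85 + 1e-9]), 1e-3, 20000)
print("final position separation:", np.linalg.norm(a[:2] - b[:2]))
```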
NASA Astrophysics Data System (ADS)
Law, Rachel M.; Ziehn, Tilo; Matear, Richard J.; Lenton, Andrew; Chamberlain, Matthew A.; Stevens, Lauren E.; Wang, Ying-Ping; Srbinovsky, Jhan; Bi, Daohua; Yan, Hailin; Vohralik, Peter F.
2017-07-01
Earth system models (ESMs) that incorporate carbon-climate feedbacks represent the present state of the art in climate modelling. Here, we describe the Australian Community Climate and Earth System Simulator (ACCESS)-ESM1, which comprises atmosphere (UM7.3), land (CABLE), ocean (MOM4p1), and sea-ice (CICE4.1) components with OASIS-MCT coupling, to which ocean and land carbon modules have been added. The land carbon model (as part of CABLE) can optionally include both nitrogen and phosphorous limitation on the land carbon uptake. The ocean carbon model (WOMBAT, added to MOM) simulates the evolution of phosphate, oxygen, dissolved inorganic carbon, alkalinity and iron with one class of phytoplankton and zooplankton. We perform multi-centennial pre-industrial simulations with a fixed atmospheric CO2 concentration and different land carbon model configurations (prescribed or prognostic leaf area index). We evaluate the equilibration of the carbon cycle and present the spatial and temporal variability in key carbon exchanges. Simulating leaf area index results in a slight warming of the atmosphere relative to the prescribed leaf area index case. Seasonal and interannual variations in land carbon exchange are sensitive to whether leaf area index is simulated, with interannual variations driven by variability in precipitation and temperature. We find that the response of the ocean carbon cycle shows reasonable agreement with observations. While our model overestimates surface phosphate values, the global primary productivity agrees well with observations. Our analysis highlights some deficiencies inherent in the carbon models and where the carbon simulation is negatively impacted by known biases in the underlying physical model and consequent limits on the applicability of this model version. We conclude the study with a brief discussion of key developments required to further improve the realism of our model simulation.
Climate simulations and services on HPC, Cloud and Grid infrastructures
NASA Astrophysics Data System (ADS)
Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio
2017-04-01
Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially for the climate community. These paradigms are modifying the way climate applications are executed. By using these technologies, the number, variety and complexity of experiments and resources are increasing substantially. But although computational capacity is increasing, the traditional applications and tools used by the community are not sufficient to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To solve those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework to manage a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the 7FP of the European Commission (grant agreement no. 312979); the European Regional Development Fund - ERDF; and the Programa de Personal Investigador en Formación Predoctoral from Universidad de Cantabria and Government of Cantabria.
Students' learning of clinical sonography: use of computer-assisted instruction and practical class.
Wood, A K; Dadd, M J; Lublin, J R
1996-08-01
The application of information technology to teaching radiology will profoundly change the way learning is mediated to students. In this project, the integration of veterinary medical students' knowledge of sonography was promoted by a computer-assisted instruction program and a subsequent practical class. The computer-assisted instruction program emphasized the physical principles of clinical sonography and contained simulations and user-active experiments. In the practical class, the students used an actual sonographic machine for the first time and made images of a tissue-equivalent phantom. Students' responses to questionnaires were analyzed. On completing the overall project, 96% of the students said that they now understood sonographic concepts very or reasonably well, and 98% had become very or moderately interested in clinical sonography. The teaching and learning initiatives enhanced an integrated approach to learning, stimulated student interest and curiosity, improved understanding of sonographic principles, and contributed to an increased confidence and skill in using sonographic equipment.
NASA Center for Climate Simulation (NCCS) Presentation
NASA Technical Reports Server (NTRS)
Webster, William P.
2012-01-01
The NASA Center for Climate Simulation (NCCS) offers integrated supercomputing, visualization, and data interaction technologies to enhance NASA's weather and climate prediction capabilities. It serves hundreds of users at NASA Goddard Space Flight Center, as well as other NASA centers, laboratories, and universities across the US. Over the past year, NCCS has continued expanding its data-centric computing environment to meet the increasingly data-intensive challenges of climate science. We doubled our Discover supercomputer's peak performance to more than 800 teraflops by adding 7,680 Intel Xeon Sandy Bridge processor-cores and most recently 240 Intel Xeon Phi Many Integrated Core (MIC) co-processors. A supercomputing-class analysis system named Dali gives users rapid access to their data on Discover and high-performance software including the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT), with interfaces from user desktops and a 17- by 6-foot visualization wall. NCCS also is exploring highly efficient climate data services and management with a new MapReduce/Hadoop cluster while augmenting its data distribution to the science community. Using NCCS resources, NASA completed its modeling contributions to the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report this summer as part of the ongoing Coupled Model Intercomparison Project Phase 5 (CMIP5). Ensembles of simulations run on Discover reached back to the year 1000 to test model accuracy and projected climate change through the year 2300 based on four different scenarios of greenhouse gases, aerosols, and land use. The data resulting from several thousand IPCC/CMIP5 simulations, as well as a variety of other simulation, reanalysis, and observation datasets, are available to scientists and decision makers through an enhanced NCCS Earth System Grid Federation Gateway. Worldwide downloads have totaled over 110 terabytes of data.
Toward regional-scale adjoint tomography in the deep earth
NASA Astrophysics Data System (ADS)
Masson, Y.; Romanowicz, B. A.
2013-12-01
Thanks to the development of efficient numerical computation methods, such as the Spectral Element Method (SEM), and to the increasing power of computer clusters, it is now possible to obtain regional-scale images of the Earth's interior using adjoint tomography (e.g. Tape, C., et al., 2009). To date, these tomographic models are limited to the upper layers of the Earth, i.e., they provide us with high-resolution images of the crust and the upper part of the mantle. Given the gigantic amount of calculation it represents, obtaining similar models at the global scale (i.e. images of the entire Earth) seems out of reach at the moment. Furthermore, it is likely that the first generation of such global adjoint tomographic models will have a resolution significantly lower than that of current regional models. In order to image regions of interest in the deep Earth, such as plumes, slabs or large low shear velocity provinces (LLSVPs), while keeping the computation tractable, we are developing new tools that will allow us to perform regional-scale adjoint tomography at arbitrary depths. In a recent study (Masson et al., 2013), we showed that a numerical equivalent of the time-reversal mirrors used in experimental acoustics makes it possible to confine the wave propagation computations (i.e. using SEM simulations) inside the region to be imaged. With this ability to limit wave propagation modeling to a region of interest, obtaining the adjoint sensitivity kernels needed for tomographic imaging is only two steps away. First, the local wavefield modeling needs to be coupled with field-extrapolation techniques in order to obtain synthetic seismograms at the surface of the Earth. These seismograms account for the 3D structure inside the region of interest in a quasi-exact manner. We will present preliminary results where the field extrapolation is performed using Green's functions computed in a 1D Earth model with the Direct Solution Method (DSM). Once synthetic seismograms can be obtained, it is possible to evaluate the misfit between observed and computed seismograms. The second step is then to extrapolate the misfit function back into the SEM region in order to compute local adjoint sensitivity kernels. When available, these kernels will allow us to perform regional-scale adjoint tomography at arbitrary locations inside the Earth. Masson Y., Cupillard P., Capdeville Y., & Romanowicz B., 2013. On the numerical implementation of time-reversal mirrors for tomographic imaging, Journal of Geophysical Research (under review). Tape, C., et al. (2009). "Adjoint tomography of the southern California crust." Science 325(5943): 988-992.
Adaptive hybrid simulations for multiscale stochastic reaction networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hepp, Benjamin; Gupta, Ankit; Khammash, Mustafa
2015-01-21
The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods include hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of a SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.
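For context, the exact discrete method that such hybrid schemes accelerate is Gillespie's SSA, which simulates every single reaction event. A generic sketch on a birth-death network (not the authors' implementation) shows why it becomes expensive when propensities are large:

```python
import numpy as np

rng = np.random.default_rng(1)

def ssa(x, rates, stoich, propensity, t_end):
    """Gillespie's Stochastic Simulation Algorithm: exact but event-by-event,
    so cost grows with the total propensity."""
    t, path = 0.0, [(0.0, x.copy())]
    while t < t_end:
        a = propensity(x, rates)
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1 / a0)          # waiting time to next reaction
        j = rng.choice(len(a), p=a / a0)      # which reaction fires
        x = x + stoich[j]
        path.append((t, x.copy()))
    return path

# Birth-death example: 0 -> X at rate k1; X -> 0 at rate k2 * x.
stoich = np.array([[1], [-1]])
prop = lambda x, k: np.array([k[0], k[1] * x[0]])
trace = ssa(np.array([0]), (10.0, 0.5), stoich, prop, t_end=50.0)
print("events simulated:", len(trace), "final copy number:", trace[-1][1][0])
```

A hybrid method of the kind described would move this fast birth-death pair into a continuous approximation while reserving event-by-event simulation for rare reactions.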
Gottschlich, Carsten; Schuhmacher, Dominic
2014-01-01
Finding solutions to the classical transportation problem is of great importance, since this optimization problem arises in many engineering and computer science applications. Especially the Earth Mover's Distance is used in a plethora of applications ranging from content-based image retrieval, shape matching, fingerprint recognition, object tracking and phishing web page detection to computing color differences in linguistics and biology. Our starting point is the well-known revised simplex algorithm, which iteratively improves a feasible solution to optimality. The Shortlist Method that we propose substantially reduces the number of candidates inspected for improving the solution, while at the same time balancing the number of pivots required. Tests on simulated benchmarks demonstrate a considerable reduction in computation time for the new method as compared to the usual revised simplex algorithm implemented with state-of-the-art initialization and pivot strategies. As a consequence, the Shortlist Method facilitates the computation of large scale transportation problems in viable time. In addition we describe a novel method for finding an initial feasible solution which we coin Modified Russell's Method.
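To make the underlying problem concrete, a small transportation instance can be solved as a plain linear program with SciPy; the Earth Mover's Distance is then the optimal cost divided by the total mass moved. This generic LP formulation is for orientation only and does not reproduce the Shortlist Method's revised-simplex machinery.

```python
import numpy as np
from scipy.optimize import linprog

# A small transportation problem: ship supply to demand at minimum cost.
supply = np.array([20.0, 30.0, 25.0])       # source masses
demand = np.array([10.0, 25.0, 40.0])       # sink masses (totals match)
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 8.0],
                 [7.0, 5.0, 2.0]])           # unit transport costs

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                           # each supply is fully shipped
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                           # each demand is fully met
    row = np.zeros(m * n); row[j::n] = 1
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq)
print("optimal cost:", res.fun)
print("EMD:", res.fun / supply.sum())        # normalise by total mass
```

The Shortlist Method's contribution is to make this same optimization tractable at the scales where the constraint matrix above would have millions of columns.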
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Continuous-Variable Instantaneous Quantum Computing is Hard to Sample.
Douce, T; Markham, D; Kashefi, E; Diamanti, E; Coudreau, T; Milman, P; van Loock, P; Ferrini, G
2017-02-17
Instantaneous quantum computing is a subuniversal quantum complexity class, whose circuits have proven to be hard to simulate classically in the discrete-variable realm. We extend this proof to the continuous-variable (CV) domain by using squeezed states and homodyne detection, and by exploring the properties of postselected circuits. In order to treat postselection in CVs, we consider finitely resolved homodyne detectors, corresponding to a realistic scheme based on discrete probability distributions of the measurement outcomes. The unavoidable errors stemming from the use of finitely squeezed states are suppressed through a qubit-into-oscillator Gottesman-Kitaev-Preskill encoding of quantum information, which was previously shown to enable fault-tolerant CV quantum computation. Finally, we show that, in order to render postselected computational classes in CVs meaningful, a logarithmic scaling of the squeezing parameter with the circuit size is necessary, translating into a polynomial scaling of the input energy.
Exploiting Quantum Resonance to Solve Combinatorial Problems
NASA Technical Reports Server (NTRS)
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
NASA Astrophysics Data System (ADS)
Becker, T. W.
2011-12-01
I present results from ongoing, NSF-CAREER funded educational and research efforts that center around making numerical tools in seismology and geodynamics more accessible to a broader audience. The goal is not only to train students in quantitative, interdisciplinary research, but also to make methods more easily accessible to practitioners across disciplines. I describe the two main efforts that were funded, the Solid Earth Research and Teaching Environment (SEATREE, geosys.usc.edu/projects/seatree/) and a new Numerical Methods class. SEATREE is a modular and user-friendly software framework to facilitate using solid Earth research tools in the undergraduate and graduate classroom and for interdisciplinary, scientific collaboration. We use only open-source software, and most programming is done in the Python computer language. We strive to make use of modern software design and development concepts while remaining compatible with traditional scientific coding and existing, legacy software. Our goal is to provide a fully contained, yet transparent package that lets users operate in an easy, graphically supported "black box" mode, while also allowing them to look under the hood, for example to conduct numerous forward models to explore parameter space. SEATREE currently has several implemented modules, including global mantle flow, 2D phase velocity tomography, and 2D mantle convection, and was used at the University of Southern California, Los Angeles, and at a 2010 CIDER summer school tutorial. SEATREE was developed in collaboration with engineering and computer science undergraduate students, some of whom have gone on to work in Earth Science projects. In the long run, we envision SEATREE contributing to new ways of sharing scientific research and making (numerical) experiments truly reproducible again. The other project is a set of lecture notes and Matlab exercises on Numerical Methods in solid Earth, focusing on finite difference and element methods. The class has been taught several times at USC to a broad audience of Earth science students with very diverse levels of exposure to math and physics. It is our goal to bring everyone up to speed and empower students, and we have seen structural geology students with very little exposure to math go on to construct their own numerical models of pTt-paths in a core-complex setting. This exemplifies the goal of teaching students both to be able to put together simple numerical models from scratch and, perhaps more importantly, to truly understand the basic concepts, capabilities, and pitfalls of the more powerful community codes that are being increasingly used. SEATREE and the Numerical Methods class material are freely available at geodynamics.usc.edu/~becker.
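The flavor of the starter exercises in such a Numerical Methods class is captured by an explicit finite-difference solution of the 1D heat equation, with the stability limit on the time step made explicit. The course materials are in Matlab; Python is used here for consistency with SEATREE, and all numbers are generic choices rather than the course's own.

```python
import numpy as np

# 1D diffusion (heat equation) with an explicit finite-difference scheme.
nx, L, kappa = 101, 1.0, 1e-4       # grid points, domain length, diffusivity
dx = L / (nx - 1)
dt = 0.4 * dx**2 / kappa            # explicit stability requires dt <= 0.5*dx^2/kappa

T = np.zeros(nx)
T[nx // 2] = 1.0 / dx               # approximate delta-function heat pulse
for _ in range(5000):
    # Second-order central difference in space, forward Euler in time.
    T[1:-1] += kappa * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
print("peak temperature after diffusion:", T.max())
```

Exceeding the stability limit in dt is the classic pitfall this kind of exercise is designed to expose before students move on to implicit and finite element methods.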
Can we use Earth Observations to improve monthly water level forecasts?
NASA Astrophysics Data System (ADS)
Slater, L. J.; Villarini, G.
2017-12-01
Dynamical-statistical hydrologic forecasting approaches offer several strengths compared with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and `learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g. multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi-Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
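A toy version of the dynamical-statistical idea can be sketched with scikit-learn's quantile gradient boosting: GCM-forecast precipitation and a land cover fraction predict next month's water level with an uncertainty interval. All data below are synthetic placeholders for the NMME, NRFA, and ESA CCI inputs named above, and the predictor set is deliberately simplified.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic training data standing in for real forecast/observation archives.
rng = np.random.default_rng(0)
n = 400
precip_fc = rng.gamma(2.0, 50.0, n)          # GCM-forecast precipitation, mm
urban_frac = rng.uniform(0.0, 0.4, n)        # land cover predictor
level = 0.01 * precip_fc + 2.0 * urban_frac + rng.normal(0, 0.3, n)
X = np.column_stack([precip_fc, urban_frac])

# One quantile model per probability level gives a probabilistic forecast.
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, level)
          for q in (0.1, 0.5, 0.9)}
x_new = [[120.0, 0.25]]                      # next month's predictors
lo, med, hi = (models[q].predict(x_new)[0] for q in (0.1, 0.5, 0.9))
print(f"forecast level: {med:.2f} (80% interval {lo:.2f} to {hi:.2f})")
```

Comparing skill with and without the land cover column is the statistical analogue of the evaluation the abstract describes.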
Validation of double Langmuir probe in-orbit performance onboard a nano-satellite
NASA Astrophysics Data System (ADS)
Tejumola, Taiwo Raphael; Zarate Segura, Guillermo Wenceslao; Kim, Sangkyun; Khan, Arifur; Cho, Mengu
2018-03-01
Many plasma measurement systems have been proposed and used onboard different satellites to characterize space plasma. Most of these systems employ the Langmuir probe technique, using either single or double probes. The recent growth of lean satellites has made them attractive platforms for space science missions using Langmuir probes because of their simplicity and convenience. However, single Langmuir probes are not appropriate for lean satellites because of their limited conducting area, which leads to spacecraft charging and drift of the instrument's electrical ground during measurement. The double Langmuir probe technique can overcome this limitation, as a measurement reference relative to the spacecraft is not required. A double Langmuir probe measurement system was designed and developed at Kyushu Institute of Technology for the HORYU-IV satellite, a 10 kg, 30 cm cubic-class lean satellite launched into low Earth orbit on 17 February 2016. This paper presents the on-orbit performance and validation of the double Langmuir probe measurement using actual on-orbit data and computer simulations.
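For a symmetric double probe, the measured current follows I(V) ≈ I_sat tanh(eV / 2kTe), so the electron temperature can be recovered by fitting a voltage sweep. The sketch below fits synthetic data standing in for HORYU-IV telemetry; the instrument's actual calibration, geometry, and sheath corrections are not modelled.

```python
import numpy as np
from scipy.optimize import curve_fit

e, k = 1.602e-19, 1.381e-23   # elementary charge (C), Boltzmann constant (J/K)

def double_probe(V, I_sat, Te):
    # Symmetric double Langmuir probe characteristic.
    return I_sat * np.tanh(e * V / (2 * k * Te))

# Synthetic sweep: 2 uA saturation current, Te = 1500 K, plus noise.
rng = np.random.default_rng(2)
V = np.linspace(-5, 5, 61)                        # sweep voltage, V
I_meas = double_probe(V, 2e-6, 1500.0) + rng.normal(0, 5e-8, V.size)

(I_sat, Te), _ = curve_fit(double_probe, V, I_meas, p0=(1e-6, 1000.0))
print(f"ion saturation current = {I_sat:.2e} A, Te = {Te:.0f} K")
```

Because the fit only needs the probe-to-probe differential voltage, no reference to the (floating, possibly drifting) spacecraft ground is required, which is the advantage the abstract highlights.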
Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.
2011-01-01
We present a case study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
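The data-parallel pattern that makes such algorithms GPU-friendly, with every chain or particle advancing in lockstep, can be previewed on a CPU with NumPy vectorization. A sketch (not the authors' CUDA code) running many independent Metropolis chains at once:

```python
import numpy as np

# Many Metropolis chains advanced simultaneously: the same array
# operations map naturally onto one GPU thread per chain.
rng = np.random.default_rng(3)
n_chains, n_steps = 10_000, 2_000
log_target = lambda x: -0.5 * x**2        # standard normal target density

x = rng.normal(size=n_chains)             # one state per chain
for _ in range(n_steps):
    prop = x + 0.8 * rng.normal(size=n_chains)        # random-walk proposals
    accept = (np.log(rng.uniform(size=n_chains))
              < log_target(prop) - log_target(x))     # Metropolis test
    x = np.where(accept, prop, x)          # all chains update in lockstep
print("sample mean/variance:", x.mean(), x.var())      # expect ~0 and ~1
```

Population-based samplers add interaction steps between chains, but the bulk of the arithmetic remains exactly this embarrassingly parallel form.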
Earth Science Data Education through Cooking Up Recipes
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Maskey, M.; Smith, T.; Conover, H.
2016-12-01
One of the major challenges in Earth science research and applications is understanding and applying the proper methods, tools, and software for using scientific data. These techniques are often difficult and time consuming to identify, requiring novice users to conduct extensive research, take classes, and reach out for assistance, thus hindering scientific discovery and real-world applications. To address these challenges, the Global Hydrology Resource Center (GHRC) DAAC has developed a series of data recipes that novice users such as students, decision makers, and general Earth scientists can leverage to learn how to use Earth science datasets. Once the data recipe content had been finalized, GHRC computer and Earth scientists collaborated with a web and graphic designer to ensure the content is both attractively presented to data users and clearly communicated, to promote the education and use of Earth science data. The completed data recipes include, but are not limited to, tutorials, iPython Notebooks, resources, and tools necessary for addressing key difficulties in data use across a broad user base. These recipes not only enable non-traditional users to learn how to use data, but also curate and communicate common methods and approaches that may be difficult and time consuming for these users to identify.
Improvement of the Earth's gravity field from terrestrial and satellite data
NASA Technical Reports Server (NTRS)
1987-01-01
The terrestrial gravity data base was updated. Studies related to the Geopotential Research Mission (GRM) have primarily considered the local recovery of gravity anomalies on the surface of the Earth based on satellite-to-satellite tracking or gradiometer data. A simulation study was used to estimate the accuracy of 1-degree mean anomalies which could be recovered from the GRM data. Numerous procedures were developed with the intent of performing computations at the laser stations in the SL6 system to improve geoid undulation calculations.
2017-12-08
The heart of the NASA Center for Climate Simulation (NCCS) is the “Discover” supercomputer. In 2009, NCCS added more than 8,000 computer processors to Discover, for a total of nearly 15,000 processors. Credit: NASA/Pat Izzo. To learn more about NCCS, go to: www.nasa.gov/topics/earth/features/climate-sim-center.html. NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2006-01-01
High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62-teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.
Numerical simulation of the geodynamo reaches Earth's core dynamical regime
NASA Astrophysics Data System (ADS)
Aubert, J.; Gastine, T.; Fournier, A.
2016-12-01
Numerical simulations of the geodynamo have been successful at reproducing a number of static (field morphology) and kinematic (secular variation patterns, core surface flows and westward drift) features of Earth's magnetic field, making them a tool of choice for the analysis and retrieval of geophysical information on Earth's core. However, classical numerical models have been run in a parameter regime far from that of the real system, prompting the question of whether we do get "the right answers for the wrong reasons", i.e. whether the agreement between models and nature simply occurs by chance and without physical relevance in the dynamics. In this presentation, we show that classical models succeed in describing the geodynamo because their large-scale spatial structure is essentially invariant as one progresses along a well-chosen path in parameter space to Earth's core conditions. This path is constrained by the need to enforce the relevant force balance (MAC or Magneto-Archimedes-Coriolis) and preserve the ratio of the convective overturn and magnetic diffusion times. Numerical simulations performed along this path are shown to be spatially invariant at scales larger than that where the magnetic energy is ohmically dissipated. This property enables the definition of large-eddy simulations that show good agreement with direct numerical simulations in the range where both are feasible, and that can be computed at unprecedented values of the control parameters, such as an Ekman number E=10-8. Combining direct and large-eddy simulations, large-scale invariance is observed over half the logarithmic distance in parameter space between classical models and Earth. The conditions reached at this mid-point of the path are furthermore shown to be representative of the rapidly-rotating, asymptotic dynamical regime in which Earth's core resides, with a MAC force balance undisturbed by viscosity or inertia, the enforcement of a Taylor state and strong-field dynamo action. We conclude that numerical modelling has advanced to a stage where it is possible to use models correctly representing the statics, kinematics and now the dynamics of the geodynamo. This opens the way to a better analysis of the geomagnetic field in the time and space domains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Penny, Matthew T., E-mail: penny@astronomy.ohio-state.edu
2014-08-01
Extensive simulations of planetary microlensing are necessary both before and after a survey is conducted: before to design and optimize the survey and after to understand its detection efficiency. The major bottleneck in such computations is the computation of light curves. However, for low-mass planets, most of these computations are wasteful, as most light curves do not contain detectable planetary signatures. In this paper, I develop a parameterization of the binary microlens that is conducive to avoiding light curve computations. I empirically find analytic expressions describing the limits of the parameter space that contain the vast majority of low-mass planet detections. Through a large-scale simulation, I measure the (in)completeness of the parameterization and the speed-up it is possible to achieve. For Earth-mass planets in a wide range of orbits, it is possible to speed up simulations by a factor of ∼30-125 (depending on the survey's annual duty-cycle) at the cost of missing ∼1% of detections (which is actually a smaller loss than for the arbitrary parameter limits typically applied in microlensing simulations). The benefits of the parameterization probably outweigh the costs for planets below 100 M⊕. For planets at the sensitivity limit of AFTA-WFIRST, simulation speed-ups of a factor of ∼1000 or more are possible.
Massively parallel simulator of optical coherence tomography of inhomogeneous turbid media.
Malektaji, Siavash; Lima, Ivan T; Escobar I, Mauricio R; Sherif, Sherif S
2017-10-01
An accurate and practical simulator for Optical Coherence Tomography (OCT) could be an important tool to study the underlying physical phenomena in OCT such as multiple light scattering. Recently, many researchers have investigated simulation of OCT of turbid media, e.g., tissue, using Monte Carlo methods. The main drawback of these earlier simulators is the long computational time required to produce accurate results. We developed a massively parallel simulator of OCT of inhomogeneous turbid media that obtains both Class I diffusive reflectivity, due to ballistic and quasi-ballistic scattered photons, and Class II diffusive reflectivity due to multiply scattered photons. This Monte Carlo-based simulator is implemented on graphic processing units (GPUs), using the Compute Unified Device Architecture (CUDA) platform and programming model, to exploit the parallel nature of propagation of photons in tissue. It models an arbitrary shaped sample medium as a tetrahedron-based mesh and uses an advanced importance sampling scheme. This new simulator speeds up simulations of OCT of inhomogeneous turbid media by about two orders of magnitude. To demonstrate this result, we have compared the computation times of our new parallel simulator and its serial counterpart using two samples of inhomogeneous turbid media. We have shown that our parallel implementation reduced simulation time of OCT of the first sample medium from 407 min to 92 min by using a single GPU card, to 12 min by using 8 GPU cards and to 7 min by using 16 GPU cards. For the second sample medium, the OCT simulation time was reduced from 209 h to 35.6 h by using a single GPU card, and to 4.65 h by using 8 GPU cards, and to only 2 h by using 16 GPU cards. Therefore our new parallel simulator is considerably more practical to use than its central processing unit (CPU)-based counterpart. Our new parallel OCT simulator could be a practical tool to study the different physical phenomena underlying OCT, or to design OCT systems with improved performance.
Computer simulation of a space SAR using a range-sequential processor for soil moisture mapping
NASA Technical Reports Server (NTRS)
Fujita, M.; Ulaby, F. (Principal Investigator)
1982-01-01
The ability of a spaceborne synthetic aperture radar (SAR) to detect soil moisture was evaluated by means of a computer simulation technique. The computer simulation package includes coherent processing of the SAR data using a range-sequential processor, which can be set up through hardware implementations, thereby reducing the amount of telemetry involved. With such a processing approach, it is possible to monitor the Earth's surface on a continuous basis, since data storage requirements can be easily met through the use of currently available technology. The development of the simulation package is described, followed by an examination of the application of the technique to actual environments. The results indicate that in estimating soil moisture content with a four-look processor, the difference between the assumed and estimated values of soil moisture is within ±20% of field capacity for 62% of the pixels for agricultural terrain and for 53% of the pixels for hilly terrain. The estimation accuracy for soil moisture may be improved by reducing the effect of fading through non-coherent averaging.
Ionizing Radiation Environments and Exposure Risks
NASA Astrophysics Data System (ADS)
Kim, M. H. Y.
2015-12-01
Space radiation environments for historically large solar particle events (SPE) and galactic cosmic rays (GCR) are simulated to characterize exposures to radiosensitive organs for missions to low-Earth orbit (LEO), the Moon, near-Earth asteroids, and Mars. Primary and secondary particles for SPE and GCR are transported through the respective atmospheres of Earth or Mars, the space vehicle, and the astronaut's body tissues using NASA's HZETRN/QMSFRG computer code. Space radiation protection methods, which are derived largely from ground-based methods recommended by the National Council on Radiation Protection and Measurements (NCRP) or the International Commission on Radiological Protection (ICRP), are built on the principles of risk justification, limitation, and ALARA (as low as reasonably achievable). However, because of the large uncertainties in high charge and energy (HZE) particle radiobiology and the small population of space crews, NASA has developed distinct methods to implement a space radiation protection program. For fatal cancer risks, which have been considered the dominant risk for GCR, the NASA Space Cancer Risk (NSCR) model has been developed from recommendations by the NCRP and has undergone external review by the National Research Council (NRC), the NCRP, and peer-reviewed publications. The NSCR model uses GCR environmental models, particle transport codes describing the modification of GCR by atomic and nuclear interactions in atmospheric shielding coupled with spacecraft and tissue shielding, and NASA-defined quality factors for solid cancer and leukemia risk estimates for HZE particles. By implementing the NSCR model, exposure risks under various heliospheric conditions are assessed for the radiation environments of the various mission classes, to understand architectures and strategies of human exploration missions and ultimately to contribute to optimizing the radiation safety and well-being of crewmembers participating in long-term space missions.
NASA Technical Reports Server (NTRS)
Pogorzelski, R. J.; Beckon, R. J.
1997-01-01
The virtual spacecraft concept is embodied in a set of subsystems, either in the form of hardware or computational models, which together represent all, or a portion of, a spacecraft. For example, the telecommunications transponder may be a hardware prototype while the propulsion system may exist only as a simulation. As the various subsystems are realized in hardware, the spacecraft becomes progressively less virtual. This concept is enabled by JPL's Mission System Testbed, which is a set of networked workstations running a message-passing operating system called "TRAMEL", which stands for Task Remote Asynchronous Message Exchange Layer. Each simulation on the workstations, which may in fact be hardware controlled by the workstation, "publishes" its operating parameters on TRAMEL, and other simulations requiring those parameters as input may "subscribe" to them. In this manner, the whole simulation operates as a single virtual system. This paper describes a simulation designed to evaluate a communications link between the Earth and the Mars Pathfinder Lander module as it descends under a parachute through the Martian atmosphere toward the planet's surface. This link includes a transmitter and a low-gain antenna on the spacecraft and a receiving antenna and receiver on the Earth, as well as a simulation of the dynamics of the spacecraft. The transmitter, the ground station antenna, the receiver and the dynamics are all simulated computationally, while the spacecraft antenna is implemented in hardware on a very simple spacecraft mockup. The dynamics simulation is a record of one member of the ensemble of outputs of a Monte Carlo simulation of the descent. Additionally, the antenna/spacecraft mock-up system was simulated using APATCH, a shooting and bouncing ray code developed by Demaco, Inc. The antenna simulation, the antenna hardware, and the link simulation are all physically located in different facilities at JPL, separated by several hundred meters, and are linked via the local area network (LAN).
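The publish/subscribe coupling described can be illustrated with a minimal in-process message bus. TRAMEL's actual API is not documented here, so all names below are hypothetical; the sketch only shows the pattern by which one simulation's published parameters become another's inputs.

```python
from collections import defaultdict

class Bus:
    """Tiny publish/subscribe bus (hypothetical stand-in for TRAMEL)."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        # A simulation registers interest in a published parameter.
        self._subs[topic].append(callback)

    def publish(self, topic, value):
        # A simulation publishes a parameter; all subscribers receive it.
        for cb in self._subs[topic]:
            cb(value)

bus = Bus()
# The link simulation consumes the attitude that the dynamics simulation
# publishes, mirroring how subsystems exchange parameters over TRAMEL.
bus.subscribe("lander.attitude",
              lambda q: print("link model received attitude quaternion", q))
bus.publish("lander.attitude", (0.71, 0.0, 0.0, 0.71))
```

The appeal of the pattern is exactly what the abstract describes: a hardware-in-the-loop subsystem and a pure software model are indistinguishable to their subscribers.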
Analytics of crystal growth in space
NASA Technical Reports Server (NTRS)
Wilcox, W. R.; Chang, C. E.; Shlichta, P. J.; Chen, P. S.; Kim, C. K.
1974-01-01
Two crystal growth processes considered for Spacelab experiments were studied to anticipate and understand phenomena not ordinarily encountered on Earth. Computer calculations were performed on transport processes in floating-zone melting and on growth of a crystal from solution in a spacecraft environment. Experiments intended to simulate solution growth at micro accelerations were performed.
Users guide to ACORn: a comprehensive Ozark regeneration simulator.
Daniel C. Dey; Michael Ter-Mikaelian; Paul S. Johnson; Stephen R. Shifley
1996-01-01
Describes how to use the ACORn computer program for predicting number of trees per acre and stocking percent by species and diameter classes 21 years after complete overstory removal of oak stands in the Ozark Highlands of Missouri and adjacent States.
NASA Astrophysics Data System (ADS)
Huang, Xiaomeng; Tang, Qiang; Tseng, Yuheng; Hu, Yong; Baker, Allison H.; Bryan, Frank O.; Dennis, John; Fu, Haohuan; Yang, Guangwen
2016-11-01
In the Community Earth System Model (CESM), the ocean model is computationally expensive for high-resolution grids and is often the least scalable component for high-resolution production experiments. The major bottleneck is that the barotropic solver scales poorly at high core counts. We design a new barotropic solver to accelerate the high-resolution ocean simulation. The novel solver adopts a Chebyshev-type iterative method to reduce the global communication cost in conjunction with an effective block preconditioner to further reduce the iterations. The algorithm and its computational complexity are theoretically analyzed and compared with other existing methods. We confirm the significant reduction of the global communication time with a competitive convergence rate using a series of idealized tests. Numerical experiments using the CESM 0.1° global ocean model show that the proposed approach results in a factor of 1.7 speed-up over the original method with no loss of accuracy, achieving 10.5 simulated years per wall-clock day on 16 875 cores.
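The communication advantage of a Chebyshev-type method is that, unlike conjugate gradients, its recurrence needs no inner products and hence no global reductions. A minimal sketch of the classical textbook Chebyshev iteration follows (not the CESM implementation; the spectral bounds are assumed known, e.g. from the preconditioned operator):

```python
# Classical Chebyshev iteration for SPD systems. It avoids the inner products
# (global reductions) that limit CG scalability at high core counts; the
# price is needing bounds [lmin, lmax] on the operator's spectrum.
import numpy as np

def chebyshev_solve(A, b, lmin, lmax, iters=60):
    theta = 0.5 * (lmax + lmin)        # center of the spectrum
    delta = 0.5 * (lmax - lmin)        # half-width of the spectrum
    sigma = theta / delta
    rho = 1.0 / sigma
    x = np.zeros_like(b)
    r = b - A @ x
    d = r / theta
    for _ in range(iters):
        x = x + d
        r = r - A @ d                  # only a mat-vec: nearest-neighbor halo exchange
        rho_next = 1.0 / (2.0 * sigma - rho)
        d = rho_next * rho * d + (2.0 * rho_next / delta) * r
        rho = rho_next
    return x

# Small SPD test problem with spectrum inside [1, 5].
n = 100
A = 3.0 * np.eye(n) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
b = np.ones(n)
x = chebyshev_solve(A, b, lmin=1.0, lmax=5.0)
print(np.linalg.norm(b - A @ x))       # small residual
```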
Emerging aerospace technologies
NASA Technical Reports Server (NTRS)
Ballhaus, W. F., Jr.; Milov, L. A.
1985-01-01
The United States Government has a long history of promoting the advancement of technology to strengthen the economy and national defense. An example is NASA, which was formed in 1958 to establish and maintain U.S. space technology leadership. This leadership has resulted in technological benefits to many fields and the establishment of new commercial industries, such as satellite communications. Currently, NASA's leading technology development at Ames Research Center includes the Tilt Rotor XV-15, which provides the versatility of a helicopter with the speed of a turboprop aircraft; the Numerical Aerodynamic Simulator, which is pushing the state of the art in advanced computational mathematics and computer simulation; and the Advanced Automation and Robotics programs, which will improve all areas of space development as well as life on Earth. Private industry is involved in maintaining technological leadership through NASA's Commercial Use of Space Program, which provides for synergistic relationships among government, industry, and academia. The plan for a space station by 1992 has framed much of NASA's future goals and has provided new areas of opportunity for both domestic space technology and leadership improvement of life on Earth.
Evaluation of Ten Methods for Initializing a Land Surface Model
NASA Technical Reports Server (NTRS)
Rodell, M.; Houser, P. R.; Berg, A. A.; Famiglietti, J. S.
2005-01-01
Land surface models (LSMs) are computer programs, similar to weather and climate prediction models, which simulate the stocks and fluxes of water (including soil moisture, snow, evaporation, and runoff) and energy (including the temperature of and sensible heat released from the soil) after they arrive on the land surface as precipitation and sunlight. It is not currently possible to measure all of the variables of interest everywhere on Earth with sufficient accuracy and space-time resolution. Hence LSMs have been developed to integrate the available observations with our understanding of the physical processes involved, using powerful computers, in order to map these stocks and fluxes as they change in time. The maps are used to improve weather forecasts, support water resources and agricultural applications, and study the Earth's water cycle and climate variability. NASA's Global Land Data Assimilation System (GLDAS) project facilitates testing of several different LSMs with a variety of input datasets (e.g., precipitation, plant type).
NASA Technical Reports Server (NTRS)
Tezduyar, Tayfun E.
1998-01-01
This is a final report as far as our work at University of Minnesota is concerned. The report describes our research progress and accomplishments in development of high performance computing methods and tools for 3D finite element computation of aerodynamic characteristics and fluid-structure interactions (FSI) arising in airdrop systems, namely ram-air parachutes and round parachutes. This class of simulations involves complex geometries, flexible structural components, deforming fluid domains, and unsteady flow patterns. The key components of our simulation toolkit are a stabilized finite element flow solver, a nonlinear structural dynamics solver, an automatic mesh moving scheme, and an interface between the fluid and structural solvers; all of these have been developed within a parallel message-passing paradigm.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion, while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the Community Land Model (CLM), part of the open-source Community Earth System Model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it is transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
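The refactoring pattern the abstract describes — a base process-model type from which new models extend without touching the driver — is sketched below in Python for brevity (PFLOTRAN itself uses Fortran 2003 extendible derived types; the class and method names here are illustrative assumptions):

```python
# Sketch of an extensible process-model hierarchy (Python analog of Fortran
# extendible derived types; names are illustrative, not PFLOTRAN's).
from abc import ABC, abstractmethod

class ProcessModel(ABC):
    """Base interface every process model must implement."""
    @abstractmethod
    def setup(self, grid): ...
    @abstractmethod
    def step(self, t, dt): ...

class RichardsFlow(ProcessModel):
    def setup(self, grid): self.grid = grid
    def step(self, t, dt): pass   # variably saturated flow solve would go here

class ReactiveTransport(ProcessModel):
    def setup(self, grid): self.grid = grid
    def step(self, t, dt): pass   # species advection/reaction would go here

def run(models, grid, t_end, dt):
    # The driver is written against the base interface only, so adding a
    # new process model requires no change here.
    for m in models:
        m.setup(grid)
    t = 0.0
    while t < t_end:
        for m in models:
            m.step(t, dt)
        t += dt

run([RichardsFlow(), ReactiveTransport()], grid=None, t_end=1.0, dt=0.1)
```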
Real time simulation of computer-assisted sequencing of terminal area operations
NASA Technical Reports Server (NTRS)
Dear, R. G.
1981-01-01
A simulation was developed to investigate the utilization of computer-assisted decision making for the task of sequencing and scheduling aircraft in a high density terminal area. The simulation incorporates a decision methodology termed Constrained Position Shifting. This methodology accounts for aircraft velocity profiles, routes, and weight classes in dynamically sequencing and scheduling arriving aircraft. A sample demonstration of Constrained Position Shifting is presented in which six aircraft types (including both light and heavy aircraft) are sequenced to land at Denver's Stapleton International Airport. A graphical display is utilized, and Constrained Position Shifting with a maximum shift of four positions (rearward or forward) is compared to first come, first served with respect to arrival at the runway. The implementation of computer-assisted sequencing and scheduling methodologies is investigated. A time-based control concept will be required, and design considerations for such a system are discussed.
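The core of Constrained Position Shifting is a search over landing orders in which no aircraft moves more than a fixed number of positions from its first-come, first-served slot. A brute-force toy version follows; the separation times are made-up placeholders, not the velocity profiles and weight-class data of the actual simulation:

```python
# Toy Constrained Position Shifting (CPS): reorder an FCFS arrival stream so
# no aircraft shifts more than max_shift positions, minimizing the landing
# time of the last aircraft. Separation seconds are illustrative only.
from itertools import permutations

SEP = {("heavy", "heavy"): 90, ("heavy", "light"): 180,
       ("light", "heavy"): 60, ("light", "light"): 90}

def schedule_makespan(seq):
    # time of last landing, given required leader/follower separations
    return sum(SEP[(lead, follow)] for lead, follow in zip(seq, seq[1:]))

def cps(fcfs, max_shift=4):
    best, best_t = list(fcfs), schedule_makespan(fcfs)
    for perm in permutations(range(len(fcfs))):
        # keep only orderings where aircraft i moved at most max_shift slots
        if all(abs(new - old) <= max_shift for new, old in enumerate(perm)):
            seq = [fcfs[i] for i in perm]
            t = schedule_makespan(seq)
            if t < best_t:
                best, best_t = seq, t
    return best, best_t

fcfs = ["light", "heavy", "light", "heavy", "light", "heavy"]
print(cps(fcfs, max_shift=4))   # grouping like classes shortens the schedule
```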
NASA Astrophysics Data System (ADS)
Vieira, V. M. N. C. S.; Sahlée, E.; Jurus, P.; Clementi, E.; Pettersson, H.; Mateus, M.
2015-09-01
Earth-System and regional models, forecasting climate change and its impacts, simulate atmosphere-ocean gas exchanges using classical yet overly simple generalizations that rely on wind speed as the sole mediator while neglecting factors such as sea-surface agitation, atmospheric stability, current drag with the bottom, rain and surfactants. These have been shown to be fundamental for accurate estimates, particularly in the coastal ocean, where a significant part of the atmosphere-ocean greenhouse gas exchange occurs. We include several of these factors in a customizable algorithm proposed as the basis for novel couplers of the atmospheric and oceanographic model components. We tested performance with measured and simulated data from the European coastal ocean, and found that our algorithm forecasts greenhouse gas exchanges largely different from those forecast by the generalization currently in use. Our algorithm allows vectorized calculation and parallel processing, improving computational speed roughly 12× on a single CPU core, an essential feature for Earth-System model applications.
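The contrast between a wind-only generalization and an extended one vectorizes naturally over a model grid. A minimal NumPy sketch: the quadratic wind form is the classical Wanninkhof (1992) parameterization, while the extra multiplicative factors stand in for the additional mediators the abstract lists and are illustrative assumptions, not the authors' published coefficients.

```python
# Vectorized air-sea gas transfer velocity sketch (coefficients illustrative).
import numpy as np

def k660_wind_only(u10):
    # classical quadratic wind-speed-only generalization (cm/h)
    return 0.31 * u10**2

def k660_extended(u10, wave_factor, stability_factor):
    # hypothetical extension: extra mediators enter as multiplicative factors
    return 0.31 * u10**2 * wave_factor * stability_factor

# NumPy broadcasting evaluates the whole model grid at once, which is the
# kind of vectorization behind the reported single-core speed-up.
u10  = np.random.uniform(2.0, 15.0, size=(256, 256))   # 10 m wind speed, m/s
wave = np.random.uniform(0.8, 1.4, size=(256, 256))    # sea-surface agitation
stab = np.random.uniform(0.9, 1.1, size=(256, 256))    # atmospheric stability
print(k660_wind_only(u10).mean(), k660_extended(u10, wave, stab).mean())
```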
Grid Computing for Earth Science
NASA Astrophysics Data System (ADS)
Renard, Philippe; Badoux, Vincent; Petitdidier, Monique; Cossu, Roberto
2009-04-01
The fundamental challenges facing humankind at the beginning of the 21st century require an effective response to the massive changes that are putting increasing pressure on the environment and society. The worldwide Earth science community, with its mosaic of disciplines and players (academia, industry, national surveys, international organizations, and so forth), provides a scientific basis for addressing issues such as the development of new energy resources; a secure water supply; safe storage of nuclear waste; the analysis, modeling, and mitigation of climate changes; and the assessment of natural and industrial risks. In addition, the Earth science community provides short- and medium-term prediction of weather and natural hazards in real time, and model simulations of a host of phenomena relating to the Earth and its space environment. These capabilities require that the Earth science community utilize, both in real and remote time, massive amounts of data, which are usually distributed among many different organizations and data centers.
User data dissemination concepts for earth resources
NASA Technical Reports Server (NTRS)
Davies, R.; Scott, M.; Mitchell, C.; Torbett, A.
1976-01-01
Domestic data dissemination networks for earth-resources data in the 1985-1995 time frame were evaluated. The following topics were addressed: (1) earth-resources data sources and expected data volumes, (2) future user demand in terms of data volume and timeliness, (3) space-to-space and earth point-to-point transmission link requirements and implementation, (4) preprocessing requirements and implementation, (5) network costs, and (6) technological development to support this implementation. This study was parametric in that the data input (supply) was varied by a factor of about fifteen while the user request (demand) was varied by a factor of about nineteen. Correspondingly, the time from observation to delivery to the user was varied. This parametric evaluation was performed by a computer simulation that was based on network alternatives and resulted in preliminary transmission and preprocessing requirements. The earth-resource data sources considered were: shuttle sorties, synchronous satellites (e.g., SEOS), aircraft, and satellites in polar orbits.
Design of a nickel-hydrogen battery simulator for the NASA EOS testbed
NASA Technical Reports Server (NTRS)
Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.
1992-01-01
The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
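The software component reduces to a short loop: integrate ampere-hours, then program the power stage to the voltage the empirical model prescribes. A minimal sketch, with placeholder table values rather than the EOS Ni-H2 data:

```python
# Battery simulator software loop sketch: coulomb counting plus an empirical
# voltage table (table values are placeholders, not the NASA EOS model).
import numpy as np

AH_CAPACITY = 50.0                                      # amp-hours
soc_points = np.array([0.0, 0.25, 0.5, 0.75, 1.0])      # state of charge
v_points   = np.array([1.15, 1.22, 1.26, 1.30, 1.40])   # volts per cell

def update(ah, current, dt_hours):
    """Integrate amp-hours (positive current = charge), then look up the
    cell voltage the hardware should be programmed to."""
    ah = min(max(ah + current * dt_hours, 0.0), AH_CAPACITY)
    v_cell = np.interp(ah / AH_CAPACITY, soc_points, v_points)
    return ah, v_cell

ah = 25.0
ah, v = update(ah, current=-10.0, dt_hours=0.1)   # 10 A discharge for 6 min
print(ah, v)
```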
NASA High Performance Computing and Communications program
NASA Technical Reports Server (NTRS)
Holcomb, Lee; Smith, Paul; Hunter, Paul
1994-01-01
The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.
Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations
NASA Technical Reports Server (NTRS)
Pease, G. E.; Hendrickson, H. T.
1980-01-01
A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.
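The single quantity the sensor combination provides is the Earth-Sun center angle, which in software is one dot product. An illustrative sketch (the vectors are made up; a real system derives them from horizon-crossing times and sun-sensor readings):

```python
# Earth-Sun center angle from two direction vectors (illustrative inputs).
import numpy as np

def earth_sun_angle(r_earth, r_sun):
    u = r_earth / np.linalg.norm(r_earth)   # unit vector to Earth center
    v = r_sun / np.linalg.norm(r_sun)       # unit vector to Sun
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

# spacecraft in LEO: Earth center ~7000 km one way, Sun ~1.5e8 km another
print(earth_sun_angle(np.array([-7000.0, 0.0, 0.0]),
                      np.array([1.0e8, 1.0e8, 0.0])))
```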
NASA Astrophysics Data System (ADS)
Furuichi, Mikito; Nishiura, Daisuke
2017-10-01
We developed dynamic load-balancing algorithms for Particle Simulation Methods (PSM) involving short-range interactions, such as Smoothed Particle Hydrodynamics (SPH), the Moving Particle Semi-implicit method (MPS), and the Discrete Element Method (DEM). These are needed to handle billions of particles modeled in large distributed-memory computer systems. Our method utilizes flexible orthogonal domain decomposition, allowing the sub-domain boundaries in a column to be different for each row. The imbalances in execution time between parallel logical processes are treated as a nonlinear residual. Load-balancing is achieved by minimizing the residual within the framework of an iterative nonlinear solver, combined with a multigrid technique in the local smoother. Our iterative method is suitable for adjusting the sub-domains frequently by monitoring the performance of each computational process, because it is computationally cheaper in terms of communication and memory costs than non-iterative methods. Numerical tests demonstrated the ability of our approach to handle workload imbalances arising from a non-uniform particle distribution, differences in particle types, or heterogeneous computer architecture, which was difficult with previously proposed methods. We analyzed the parallel efficiency and scalability of our method using the Earth Simulator and K computer supercomputer systems.
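A one-dimensional caricature of the idea: treat per-process execution-time imbalance as a residual and relax sub-domain boundaries toward equal cost. This sketch omits the orthogonal row/column decomposition and the multigrid smoother of the actual method:

```python
# 1D load-balancing relaxation sketch (illustrative, not the paper's solver).
import numpy as np

def rebalance(bounds, times, relax=0.5):
    """bounds: sorted sub-domain edges on [0, 1]; times: measured execution
    time per sub-domain. Widths are relaxed toward the mean cost."""
    widths = np.diff(bounds)
    target = times.mean()
    # residual: deviation of each process from the mean cost
    new_widths = widths * (1.0 + relax * (target - times) / times)
    new_widths *= widths.sum() / new_widths.sum()   # keep total extent fixed
    return np.concatenate(([bounds[0]], bounds[0] + np.cumsum(new_widths)))

bounds = np.linspace(0.0, 1.0, 5)          # four equal sub-domains
times = np.array([2.0, 1.0, 1.0, 1.0])     # first process is overloaded
print(rebalance(bounds, times))            # its sub-domain shrinks
```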
GMI-IPS: Python Processing Software for Aircraft Campaigns
NASA Technical Reports Server (NTRS)
Damon, M. R.; Strode, S. A.; Steenrod, S. D.; Prather, M. J.
2018-01-01
NASA's Atmospheric Tomography Mission (ATom) seeks to understand the impact of anthropogenic air pollution on gases in the Earth's atmosphere. Four flight campaigns are being deployed on a seasonal basis to establish a continuous global-scale data set intended to improve the representation of chemically reactive gases in global atmospheric chemistry models. The Global Modeling Initiative (GMI) is creating chemical transport simulations on a global scale for each of the ATom flight campaigns. To meet the computational demands required to translate the GMI simulation data to grids associated with the flights from the ATom campaigns, the GMI ICARTT Processing Software (GMI-IPS) has been developed and is providing key functionality for data processing and analysis in this ongoing effort. The GMI-IPS is written in Python and provides computational kernels for data interpolation and visualization tasks on GMI simulation data. A key feature of the GMI-IPS is its ability to read ICARTT files, a text-based file format for airborne instrument data, and extract the required flight information that defines regional and temporal grid parameters associated with an ATom flight. Perhaps most importantly, the GMI-IPS creates ICARTT files containing GMI simulated data, which are used in collaboration with ATom instrument teams and other modeling groups. The initial main task of the GMI-IPS is to interpolate GMI model data to the finer temporal resolution (1-10 seconds) of a given flight. The model data includes basic fields such as temperature and pressure, but the main focus of this effort is to provide species concentrations of chemical gases for ATom flights. The software, which uses parallel computation techniques for data intensive tasks, linearly interpolates each of the model fields to the time resolution of the flight. The temporally interpolated data is then saved to disk, and is used to create additional derived quantities. In order to translate the GMI model data to the spatial grid of the flight path as defined by the pressure, latitude, and longitude points at each flight time record, a weighted average is then calculated from the nearest neighbors in two dimensions (latitude, longitude). Using SciPy's RegularGridInterpolator, interpolation functions are generated for the GMI model grid and the calculated weighted averages. The flight path points are then extracted from the ATom ICARTT instrument file, and are sent to the multi-dimensional interpolating functions to generate GMI field quantities along the spatial path of the flight. The interpolated field quantities are then written to an ICARTT data file, which is stored for further manipulation. The GMI-IPS is aware of a generic ATom ICARTT header format, containing basic information for all flight campaigns. The GMI-IPS includes logic to edit metadata for the derived field quantities, as well as modify the generic header data such as processing dates and associated instrument files. The ICARTT interpolated data is then appended to the modified header data, and the ICARTT processing is complete for the given flight and ready for collaboration. The output ICARTT data adheres to the ICARTT file format standards V1.1. The visualization component of the GMI-IPS uses Matplotlib extensively and has several functions ranging in complexity. First, it creates a model background curtain for the flight (time versus model eta levels) with the interpolated flight data superimposed on the curtain.
Second, it creates a time-series plot of the interpolated flight data. Last, the visualization component creates averaged 2D model slices (longitude versus latitude) with overlaid flight track circles at key pressure levels. The GMI-IPS consists of a handful of classes and supporting functionality that have been generalized to be compatible with any ICARTT file that adheres to the base class definition. The base class represents a generic ICARTT entry, defining only a single time entry and 3D spatial positioning parameters. Other classes inherit from this base class: several classes for input ICARTT instrument files, which contain the necessary flight positioning information as a basis for data processing, as well as other classes for output ICARTT files, which contain the interpolated model data. Utility classes provide functionality for routine procedures such as comparing field names among ICARTT files, reading ICARTT entries from a data file and storing them in data structures, and returning a reduced spatial grid based on a collection of ICARTT entries. Although the GMI-IPS is compatible with GMI model data, it can be adapted with reasonable effort to any simulation that creates Hierarchical Data Format (HDF) files. The same can be said of its adaptability to ICARTT files outside the context of the ATom mission. The GMI-IPS contains just under 30,000 lines of code, eight classes, and a dozen drivers and utility programs. It is maintained with Git source code management and has been used to deliver processed GMI model data for the ATom campaigns that have taken place to date.
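The spatial interpolation step reduces to a few lines with SciPy. A condensed sketch, with a synthetic field and track standing in for GMI output and an ATom ICARTT flight path:

```python
# Sampling a gridded model field along a flight track with SciPy's
# RegularGridInterpolator (synthetic field and track for illustration).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

lats = np.linspace(-90.0, 90.0, 91)
lons = np.linspace(-180.0, 180.0, 181)
# synthetic 2D model field on the regular lat/lon grid
field = np.sin(np.radians(lats))[:, None] * np.cos(np.radians(lons))[None, :]

interp = RegularGridInterpolator((lats, lons), field)

# flight track: one (lat, lon) pair per time record
track = np.column_stack([np.linspace(10.0, 40.0, 5),
                         np.linspace(-150.0, -120.0, 5)])
print(interp(track))   # model field sampled along the flight path
```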
NASA Technical Reports Server (NTRS)
Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard
2003-01-01
The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation and technology development. The complexity of this space structure is unprecedented, and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for design of experiments, testing equipment integration capability and optimizing the procedures astronauts will use. This is done through the 3D, desk-top sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and defusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space-based applications, can serve a dual use facilitating homeland security here on Earth.
Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing
Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil
2016-01-01
The cellular process responsible for providing energy for most life on Earth, namely photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers. PMID:27274603
Rebecca Ralston; Joseph Buongiorno; Benedict Schulte; Jeremy Fried
2003-01-01
WestPro is an add-in program designed to work with Microsoft Excel to simulate the growth of uneven-aged Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) stands in the Pacific Northwest region of the United States. Given the initial stand state, defined as the number of softwood and hardwood trees per acre by diameter class, WestPro predicts the...
An adaptive replacement algorithm for paged-memory computer systems.
NASA Technical Reports Server (NTRS)
Thorington, J. M., Jr.; Irwin, J. D.
1972-01-01
A general class of adaptive replacement schemes for use in paged memories is developed. One such algorithm, called SIM, is simulated using a probability model that generates memory traces, and the results of the simulation of this adaptive scheme are compared with those obtained using the best nonlookahead algorithms. A technique for implementing this type of adaptive replacement algorithm with state-of-the-art digital hardware is also presented.
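The experimental setup — a probabilistic trace generator feeding a replacement policy and counting faults — is easy to reproduce. A toy version with an LRU baseline follows; the paper's adaptive SIM policy would plug into the same loop but tune its victim selection from observed behavior:

```python
# Toy page-reference simulator: probabilistic trace model plus an LRU
# baseline policy (the adaptive SIM scheme is not specified in the abstract).
import random
from collections import OrderedDict

def make_trace(n_pages=20, length=5000, locality=0.85, seed=1):
    random.seed(seed)
    trace, current = [], 0
    for _ in range(length):
        if random.random() < locality:
            # with high probability stay near the current page (locality)
            current = max(0, min(n_pages - 1, current + random.choice((-1, 0, 1))))
        else:
            current = random.randrange(n_pages)
        trace.append(current)
    return trace

def lru_faults(trace, frames):
    mem, faults = OrderedDict(), 0
    for page in trace:
        if page in mem:
            mem.move_to_end(page)
        else:
            faults += 1
            if len(mem) == frames:
                mem.popitem(last=False)   # evict least recently used
            mem[page] = True
    return faults

print(lru_faults(make_trace(), frames=5))
```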
Towards high-resolution mantle convection simulations
NASA Astrophysics Data System (ADS)
Höink, T.; Richards, M. A.; Lenardic, A.
2009-12-01
The motion of tectonic plates at the Earth’s surface, earthquakes, most forms of volcanism, the growth and evolution of continents, and the volatile fluxes that govern the composition and evolution of the oceans and atmosphere are all controlled by the process of solid-state thermal convection in the Earth’s rocky mantle, with perhaps a minor contribution from convection in the iron core. Similar processes govern the evolution of other planetary objects such as Mars, Venus, Titan, and Europa, all of which might conceivably shed light on the origin and evolution of life on Earth. Modeling and understanding this complicated dynamical system is one of the true “grand challenges” of Earth and planetary science. In the past three decades much progress towards understanding the dynamics of mantle convection has been made, with the increasing aid of computational modeling. Numerical sophistication has evolved significantly, and a small number of independent codes have been successfully employed. Computational power continues to increase dramatically, and with it the ability to resolve increasingly finer fluid mechanical structures. Yet perhaps the most often cited limitation in publications based on numerical modeling is still computing power, because resolving thermal boundary layers within the convecting mantle (e.g., lithospheric plates) requires a spatial resolution of ~10 km. At present, the largest supercomputing facilities still barely approach the power to resolve this length scale in mantle convection simulations that include the physics necessary to model plate-like behavior. Our goal is to use supercomputing facilities to perform 3D spherical mantle convection simulations that include the ingredients for plate-like behavior, i.e. strongly temperature- and stress-dependent viscosity, at Earth-like convective vigor with a global resolution of order 10 km. In order to qualify to use such facilities, it is also necessary to demonstrate good parallel efficiency. Here we will present two kinds of results: (1) scaling properties of the community code CitcomS on DOE/NERSC's supercomputer Franklin for up to ~6000 processors, and (2) preliminary simulations that illustrate the role of a low-viscosity asthenosphere in plate-like behavior in mantle convection.
A self-paced motor imagery based brain-computer interface for robotic wheelchair control.
Tsui, Chun Sing Louis; Gan, John Q; Hu, Huosheng
2011-10-01
This paper presents a simple self-paced motor imagery based brain-computer interface (BCI) to control a robotic wheelchair. An innovative control protocol is proposed to enable a 2-class self-paced BCI for wheelchair control, in which the user makes path planning and fully controls the wheelchair except for the automatic obstacle avoidance based on a laser range finder when necessary. In order for the users to train their motor imagery control online safely and easily, simulated robot navigation in a specially designed environment was developed. This allowed the users to practice motor imagery control with the core self-paced BCI system in a simulated scenario before controlling the wheelchair. The self-paced BCI can then be applied to control a real robotic wheelchair using a protocol similar to that controlling the simulated robot. Our emphasis is on allowing more potential users to use the BCI-controlled wheelchair with minimal training; a simple 2-class self-paced system is adequate with the novel control protocol, resulting in a better transition from offline training to online control. Experimental results have demonstrated the usefulness of the online practice under the simulated scenario, and the effectiveness of the proposed self-paced BCI for robotic wheelchair control.
2001 Flight Mechanics Symposium
NASA Technical Reports Server (NTRS)
Lynch, John P. (Editor)
2001-01-01
This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.
Implications of a quadratic stream definition in radiative transfer theory.
NASA Technical Reports Server (NTRS)
Whitney, C.
1972-01-01
An explicit definition of the radiation-stream concept is stated and applied to approximate the integro-differential equation of radiative transfer with a set of twelve coupled differential equations. Computational efficiency is enhanced by distributing the corresponding streams in three-dimensional space in a totally symmetric way. Polarization is then incorporated in this model. A computer program based on the model is briefly compared with a Monte Carlo program for simulation of horizon scans of the earth's atmosphere. It is found to be considerably faster.
Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators
NASA Astrophysics Data System (ADS)
Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.
2015-12-01
Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversion in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images and answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited by high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators, e.g. AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.
Integration of Extended MHD and Kinetic Effects in Global Magnetosphere Models
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Wang, L.; Maynard, K. R. M.; Raeder, J.; Bhattacharjee, A.
2015-12-01
Computational models of Earth's geospace environment are an important tool to investigate the science of the coupled solar-wind -- magnetosphere -- ionosphere system, complementing satellite and ground observations with a global perspective. They are also crucial in understanding and predicting space weather, in particular under extreme conditions. Traditionally, global models have employed the one-fluid MHD approximation, which captures large-scale dynamics quite well. However, in Earth's nearly collisionless plasma environment it breaks down on small scales, where ion and electron dynamics and kinetic effects become important, and greatly change the reconnection dynamics. A number of approaches have recently been taken to advance global modeling, e.g., including multiple ion species, adding Hall physics in a Generalized Ohm's Law, embedding local PIC simulations into a larger fluid domain, and also some work on simulating the entire system with hybrid or fully kinetic models, the latter, however, being too computationally expensive to run at realistic parameters. We will present an alternate approach, i.e., a multi-fluid moment model that is derived rigorously from the Vlasov-Maxwell system. The advantage is that the computational cost remains manageable, as we are still solving fluid equations. While the evolution equation for each moment is exact, it depends on the next higher-order moment, so that truncating the hierarchy and closing the system to capture the essential kinetic physics is crucial. We implement 5-moment (density, momentum, scalar pressure) and 10-moment (includes pressure tensor) versions of the model, and use local approximations for the heat flux to close the system. We test these closures by local simulations where we can compare directly to PIC / hybrid codes, and employ them in global simulations using the next-generation OpenGGCM to contrast them to MHD / Hall-MHD results and compare with observations.
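For reference, a standard textbook form of the five-moment (per species) fluid equations the abstract refers to, with the heat-flux term that must be closed; notation is conventional and is an assumption here, not taken from the OpenGGCM implementation (q_s is the species charge, \mathbf{q}_s the heat flux):

```latex
% Five-moment model for species s (density n, velocity u, scalar pressure p);
% the pressure equation depends on the heat flux, which requires a closure.
\begin{align}
  \partial_t n_s + \nabla\cdot(n_s \mathbf{u}_s) &= 0,\\
  m_s n_s \left(\partial_t \mathbf{u}_s + \mathbf{u}_s\cdot\nabla\mathbf{u}_s\right)
    &= q_s n_s \left(\mathbf{E} + \mathbf{u}_s\times\mathbf{B}\right) - \nabla p_s,\\
  \partial_t p_s + \mathbf{u}_s\cdot\nabla p_s + \gamma\, p_s\, \nabla\cdot\mathbf{u}_s
    &= -(\gamma - 1)\,\nabla\cdot\mathbf{q}_s .
\end{align}
```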
A salt bridge turns off the foot-pocket in class-II HDACs.
Zhou, Jingwei; Yang, Zuolong; Zhang, Fan; Luo, Hai-Bin; Li, Min; Wu, Ruibo
2016-08-21
Histone deacetylases (HDACs) are promising anticancer targets, and several selective inhibitors have been created based on the architectural differences of foot-pockets among HDACs. However, the "gate-keeper" of foot-pockets is still controversial. Herein, it is revealed for the first time, by computational simulations, that a conserved R-E salt bridge plays a critical role in keeping foot-pockets closed in class-II HDACs. This finding is further substantiated by our mutagenesis experiments.
Investigation of models for large-scale meteorological prediction experiments
NASA Technical Reports Server (NTRS)
Spar, J.
1981-01-01
An attempt is made to compute the contributions of various surface boundary conditions to the monthly mean states generated by the 7-layer, 8 x 10 GISS climate model (Hansen et al., 1980), and also to examine the influence of initial conditions on the model climate simulations. Obvious climatic controls such as the shape and rotation of the Earth, the solar radiation, and the dry composition of the atmosphere are fixed, and only the surface boundary conditions are altered in the various climate simulations.
Analysis of self-oscillating dc-to-dc converters
NASA Technical Reports Server (NTRS)
Burger, P.
1974-01-01
The basic operational characteristics of dc-to-dc converters are analyzed along with the basic physical characteristics of power converters. A simple class of dc-to-dc power converters is chosen that could satisfy any set of operating requirements, and three different controlling methods in this class are described in detail. Necessary conditions for the stability of these converters are measured through analog computer simulation, whose output curves are related to other operational characteristics such as ripple and regulation. Further research is suggested on the solution of absolute stability and efficient physical design of this class of power converters.
Applying ``intelligent`` materials for materials education: The Labless Lab{trademark}
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrade, J.D.; Scheer, R.
1994-12-31
A very large number of science and engineering courses taught in colleges and universities today do not involve laboratories. Although good instructors incorporate class demonstrations, hands-on homework, and various teaching aids, including computer simulations, the fact is that students in such courses often accept key concepts and experimental results without discovering them for themselves. The only partial solution to this problem has been increasing use of class demonstrations and computer simulations. The authors feel strongly that many complex concepts can be observed and assimilated through experimentation with properly designed materials. They propose the development of materials and specimens designed specifically for education purposes. Intelligent and communicative materials are ideal for this purpose. Specimens which respond in an observable fashion to new environments and situations provided by the student/experimenter provide a far more effective materials science and engineering experience than readouts and data generated by complex and expensive machines, particularly in an introductory course. Modern materials can be designed to literally communicate with the observer. The authors embarked on a project to develop a series of Labless Labs{trademark} utilizing various degrees and levels of intelligence in materials. It is expected that such Labless Labs{trademark} will be complementary to textbooks and computer simulations and will be used to provide a reality for students in courses and other learning situations where access to a laboratory is non-existent or limited.
Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory
NASA Astrophysics Data System (ADS)
Brown, A. L.; Nunn, J. A.; Sears, S. O.
2008-12-01
Twenty-two licenses for Petrel software acquired through a grant from Schlumberger are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly, and enabling them to manipulate larger integrated data-sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to simultaneously work with the software to visually interrogate a 3D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found MS-Windows based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults using seismic and well data but also compute volumetrics. Exam results indicated that while students could complete sophisticated exercises using the software, their understanding of key concepts such as conservation of volume in a palinspastic reconstruction or association of structures with a particular stress regime was limited. Future classes will incorporate more paper and pencil exercises to illustrate basic concepts. The equipment, software, and exercises developed will be used in additional upper level undergraduate and graduate classes.
NASA Astrophysics Data System (ADS)
Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi
2015-06-01
Nowadays, researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century. The evolution of simulation tools has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high performance simulation tools that were previously too cost prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB in real time are provided in detail.
EOG-sEMG Human Interface for Communication
Tamura, Hiroki; Yan, Mingmin; Sakurai, Keiko; Tanno, Koichi
2016-01-01
The aim of this study is to present electrooculogram (EOG) and surface electromyogram (sEMG) signals that can be used as a human-computer interface. Establishing an efficient alternative channel for communication without overt speech and hand movements is important for increasing the quality of life for patients suffering from amyotrophic lateral sclerosis, muscular dystrophy, or other illnesses. In this paper, we propose an EOG-sEMG human-computer interface system for communication using both cross-channels and parallel lines channels on the face with the same electrodes. This system can record EOG and sEMG signals as “dual-modality” for pattern recognition simultaneously. Although as many as four patterns could be recognized, in consideration of the patients' condition we chose only two classes of EOG (left and right motion) and two classes of sEMG (left blink and right blink), which are easy to realize for simulation and monitoring tasks. From the simulation results, our system achieved four-pattern classification with an accuracy of 95.1%. PMID:27418924
NASA Technical Reports Server (NTRS)
Nuth, Joseph A.
2009-01-01
Studies of meteorites have yielded a wealth of scientific information based on highly detailed chemical and isotopic studies possible only in sophisticated terrestrial laboratories. Telescopic studies have revealed an enormous (greater than 10^5) number of physical objects ranging in size from a few tens of meters to several hundred kilometers, orbiting not only in the traditional asteroid belt between Mars and Jupiter but also throughout the inner solar system. Many of the largest asteroids are classed into taxonomic groups based on their observed spectral properties and are designated as C, D, X, S or V types (as well as a wide range of sub-types). These objects are certainly the sources for the meteorites in our laboratories, but which asteroids are the sources for which meteorites? Spectral classes are nominally correlated to the chemical composition and physical characteristics of the asteroid itself based on studies of the spectral changes induced in meteorites due to exposure to a simulated space environment. While laboratory studies have produced some notable successes (e.g. the identification of the asteroid Vesta as the source of the HED meteorite classes: the howardites, eucrites and diogenites), it is unlikely that we have samples of each asteroidal spectral type in our meteorite collection. The correlation of spectral type and composition for many objects will therefore remain uncertain until we can return samples of specific asteroid types to Earth for analyses. The best candidates for sample return are asteroids that already come close to the Earth. Asteroids in orbit near 1 A.U. have been classified into three groups (Aten, Apollo & Amor) based on their orbital characteristics. These Near Earth Objects (NEOs) contain representatives of virtually all spectral types and sub-types of the asteroid population identified to date. Because of their close proximity to Earth, NEOs are prime targets for asteroid missions such as the NEAR-Shoemaker NASA Discovery Mission to Eros and the Japanese Hayabusa Mission to Itokawa. Also due to their close proximity to Earth, NEOs constitute the most likely set of celestial objects that will impact us in the relatively near future.
Geospace simulations on the Cell BE processor
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Raeder, J.; Larson, D.
2008-12-01
OpenGGCM (Open Geospace General Circulation Model) is an established numerical code that simulates the Earth's space environment. The most computing-intensive part is the MHD (magnetohydrodynamics) solver that models the plasma surrounding Earth and its interaction with Earth's magnetic field and the solar wind flowing in from the Sun. Like other global magnetosphere codes, OpenGGCM's realism is limited by computational constraints on grid resolution. We investigate porting of the MHD solver to the Cell BE architecture, a novel inhomogeneous multicore architecture capable of up to 230 GFlops per processor. Realizing this high performance on the Cell processor is a programming challenge, though. We implemented the MHD solver using a multi-level parallel approach: On the coarsest level, the problem is distributed to processors based upon the usual domain decomposition approach. Then, on each processor, the problem is divided into 3D columns, each of which is handled by the memory-limited SPEs (synergistic processing elements) slice by slice. Finally, SIMD instructions are used to fully exploit the vector/SIMD FPUs in each SPE. Memory management must be handled explicitly by the code, using DMA to move data from main memory to the per-SPE local store and vice versa. We obtained excellent performance numbers, a speed-up of a factor of 25 compared to using only the main processor, while still keeping the numerical implementation details of the code maintainable.
Control aspects of quantum computing using pure and mixed states.
Schulte-Herbrüggen, Thomas; Marx, Raimund; Fahmy, Amr; Kauffman, Louis; Lomonaco, Samuel; Khaneja, Navin; Glaser, Steffen J
2012-10-13
Steering quantum dynamics such that the target states solve classically hard problems is paramount to quantum simulation and computation. And beyond, quantum control is also essential to pave the way to quantum technologies. Here, important control techniques are reviewed and presented in a unified frame covering quantum computational gate synthesis and spectroscopic state transfer alike. We emphasize that it does not matter whether the quantum states of interest are pure or not. While pure states underlie the design of quantum circuits, ensemble mixtures of quantum states can be exploited in a more recent class of algorithms: it is illustrated by characterizing the Jones polynomial in order to distinguish between different (classes of) knots. Further applications include Josephson elements, cavity grids, ion traps and nitrogen vacancy centres in scenarios of closed as well as open quantum systems.
Advanced Computational Methods for Thermal Radiative Heat Transfer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.
2016-10-01
Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to do routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
Advances in target imaging of deep Earth structure
NASA Astrophysics Data System (ADS)
Masson, Y.; Romanowicz, B. A.; Clouzet, P.
2015-12-01
A new generation of global tomographic models (Lekić and Romanowicz, 2011; French et al., 2013, 2014) has emerged with the development of accurate numerical wavefield computations in a 3D earth combined with access to enhanced HPC capabilities. These models have sharpened up mantle images and unveiled relatively small scale structures that were blurred out in previous generation models. Fingerlike structures have been found at the base of the oceanic asthenosphere, and vertically oriented broad low velocity plume conduits extend throughout the lower mantle beneath those major hotspots that are located within the perimeter of the deep mantle large low shear velocity provinces (LLSVPs). While these models provide new insights into our understanding of mantle dynamics, resolving the detailed morphology of these features requires further efforts to obtain higher resolution images. The focus of our ongoing effort is to develop advanced tomographic methods to image remote regions of the Earth at fine scales. We have developed an approach in which distant sources (located outside of the target region) are replaced by an equivalent set of local sources located at the border of the computational domain (Masson et al., 2014). A limited number of global simulations in a reference 3D earth model is then required. These simulations are computed prior to the regional inversion, while iterations of the model need to be performed only within the region of interest, potentially allowing us to include shorter periods at limited additional computational cost. Until now, the application was limited to a distribution of receivers inside the target region. This is particularly suitable for studies of upper mantle structure in regions with dense arrays (e.g. see our companion presentation Clouzet et al., this Fall AGU). Here we present our latest development that now can include teleseismic data recorded outside the imaged region. This allows us to perform regional waveform tomography in the situation where neither earthquakes nor seismological stations are present within the region of interest, such as would be desirable for the study of a region in the deep mantle. We present benchmark tests showing how the uncertainties in the reference 3D model employed outside of the target region affect the quality of the regional tomographic images obtained.
Results of Formal Evaluation of a Data and Modeling Driven Hydrology Learning Module
NASA Astrophysics Data System (ADS)
Ruddell, B. L.; Sanchez, C. A.; Schiesser, R.; Merwade, V.
2014-12-01
New hydrologists should not only develop a well-defined knowledge base of basic hydrological concepts, but also synthesize this factual learning with more authentic 'real-world' knowledge gained from the interpretation and analysis of data from hydrological models (Merwade and Ruddell, 2012; Wagener et al., 2007). However, hydrological instruction is often implemented using a traditional teacher-centered approach (e.g., lectures) (Wagener, 2007). The emergence of rich and dynamic computer simulation techniques allows students the opportunity for more authentic application of knowledge (Merwade & Ruddell, 2012). This study evaluates the efficacy of using such data-driven simulations to increase the understanding of the field of hydrology in the lower-division undergraduate geoscience classroom. In this study, 88 students at a local community college who were enrolled in an Introductory Earth Science class were evaluated on their learning performance in a unit on applying the Rational Method to estimate hydrographs and flooding in urban areas. Students were either presented with a data and visualization rich computer module (n=52), or with paper and pencil calculation activities (n=36). All conceptual material presented in lecture was consistent across these two conditions. Students were evaluated not only for changes in their knowledge and application of the concepts within the unit (e.g., effects of urbanization and impervious cover, discharge rates), but also for their broad "T-shaped" profile of professional knowledge and skills. While results showed significant (p<.05) increases from pre- to post-assessments in all learning areas for both groups, there was a significantly larger benefit for the data module group when it came to (1) understanding the effects of urbanization and impervious cover on flooding, (2) applying consistent vocabulary appropriately within context, and (3) explaining the roles and responsibilities of hydrologists and flood managers.
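The Rational Method at the center of the unit is a one-line computation: Q = C i A, with Q in cubic feet per second when the rainfall intensity i is in inches per hour and the drainage area A in acres. A minimal sketch contrasting pre- and post-urbanization runoff coefficients (the coefficient values are illustrative, not those used in the study):

```python
# Rational Method peak discharge: Q = C * i * A
# (Q in cfs when i is in in/hr and A in acres, since 1 acre-in/hr ~ 1.008 cfs).
def rational_peak_discharge(c, intensity_in_hr, area_acres):
    return c * intensity_in_hr * area_acres

area = 120.0        # acres
intensity = 2.0     # in/hr design storm
print(rational_peak_discharge(0.25, intensity, area))  # pervious, pre-development
print(rational_peak_discharge(0.85, intensity, area))  # urbanized, impervious
```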
A global map of rainfed cropland areas (GMRCA) at the end of last millennium using remote sensing
Biradar, C.M.; Thenkabail, P.S.; Noojipady, P.; Li, Y.; Dheeravath, V.; Turral, H.; Velpuri, M.; Gumma, M.K.; Gangalakunta, O.R.P.; Cai, X.L.; Xiao, X.; Schull, M.A.; Alankara, R.D.; Gunasinghe, S.; Mohideen, S.
2009-01-01
The overarching goal of this study was to produce a global map of rainfed cropland areas (GMRCA) and calculate country-by-country rainfed area statistics using remote sensing data. A suite of spatial datasets, methods, and protocols for mapping GMRCA is described. These consist of: (a) data fusion and composition of a multi-resolution time-series mega-file data-cube (MFDC), (b) image segmentation based on precipitation, temperature, and elevation zones, (c) spectral correlation similarity (SCS), (d) protocols for class identification and labeling through use of SCS R2-values, bi-spectral plots, space-time spiral curves (ST-SCs), a rich source of field-plot data, and zoom-in views of Google Earth (GE), and (e) techniques for resolving mixed classes by decision-tree algorithms and spatial modeling. The outcome was a 9-class GMRCA from which country-by-country rainfed area statistics were computed for the end of the last millennium. The global rainfed cropland area estimate from the GMRCA 9-class map was 1.13 billion hectares (Bha). The total global cropland area (rainfed plus irrigated) was 1.53 Bha, which was close to national statistics compiled by FAOSTAT (1.51 Bha). The accuracies and errors of GMRCA were assessed using field-plot and Google Earth data points. The accuracy varied between 92 and 98% with a kappa value of about 0.76, errors of omission of 2-8%, and errors of commission of 19-36%. © 2008 Elsevier B.V.
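Step (c), spectral correlation similarity, is essentially a correlation between a pixel's spectral time series and a class ideal signature, with the reported SCS R2-values being the square of that correlation. A minimal sketch of a correlation-based matcher in that spirit (function names and the time-series framing are assumptions, not the paper's code):

    import numpy as np

    def spectral_correlation_similarity(pixel_ts, class_ts):
        """Correlation between a pixel's spectral time series and a class
        ideal signature; values near 1 indicate a close class match."""
        p = pixel_ts - pixel_ts.mean()
        c = class_ts - class_ts.mean()
        return float(np.dot(p, c) / (np.linalg.norm(p) * np.linalg.norm(c)))

    def best_class(pixel_ts, class_signatures):
        """Label a pixel with the best-matching class signature."""
        scores = {name: spectral_correlation_similarity(pixel_ts, ts)
                  for name, ts in class_signatures.items()}
        return max(scores, key=scores.get)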
A fast exact simulation method for a class of Markov jump processes.
Li, Yao; Hu, Lili
2015-11-14
This paper presents a new stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulation of a class of Markov jump processes. The HLM has a conditionally constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly apply a hash-table-like bucket sort to all event times falling within a time step of length τ. This paper serves as an introduction to the new SSA method: we introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large-scale problems.
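An illustrative reconstruction of the bucket-sort ("hashing") idea may help: clock firings inside a leap window of length τ are hashed into equally spaced buckets and drained in order. The sketch below holds the rates constant, so the event statistics are exact even though it skips the re-insertion bookkeeping of the real method; names and the window-length heuristic are assumptions.

    import random

    def bucket_leap_events(rates, t_end, n_buckets=64):
        """Record firing events of independent exponential clocks up to
        t_end using per-window bucket sorting (illustrative sketch)."""
        tau = n_buckets / sum(rates)                 # leap window length
        next_fire = [random.expovariate(r) for r in rates]
        events, t = [], 0.0
        while t < t_end:
            window_end = t + tau
            buckets = [[] for _ in range(n_buckets)]
            for i, tf in enumerate(next_fire):
                if tf < window_end:                  # hash into a bucket
                    b = min(int((tf - t) / tau * n_buckets), n_buckets - 1)
                    buckets[b].append(i)
            for bucket in buckets:
                for i in bucket:
                    while next_fire[i] < window_end: # fire, then redraw
                        events.append((next_fire[i], i))
                        next_fire[i] += random.expovariate(rates[i])
            t = window_end
        events.sort()                                # restore global time order
        return [e for e in events if e[0] <= t_end]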
Towards a comprehensive model of Earth's disk-integrated Stokes vector
NASA Astrophysics Data System (ADS)
García Muñoz, A.
2015-07-01
A significant body of work on simulating the remote appearance of Earth-like exoplanets has been done over the last decade. The research is driven by the prospect of characterizing habitable planets beyond the Solar System in the near future. In this work, I present a method to produce the disk-integrated signature of planets that are described in their three-dimensional complexity, i.e. with both horizontal and vertical variations in the optical properties of their envelopes. The approach is based on Pre-conditioned Backward Monte Carlo integration of the vector Radiative Transport Equation and yields the full Stokes vector for outgoing reflected radiation. The method is demonstrated through selected examples inspired by published work at wavelengths from the visible to the near infrared and terrestrial prescriptions of both cloud and surface albedo maps. I explore the performance of the method in terms of computational time and accuracy. A clear strength of this approach is that its computational cost does not appear to be significantly affected by non-uniformities in the planet optical properties. Earth's simulated appearance is strongly dependent on wavelength; both brightness and polarization undergo diurnal variations arising from changes in the planet cover, but polarization yields a better insight into variations with phase angle. There is partial cancellation of the polarized signal from the northern and southern hemispheres so that the outgoing polarization vector lies preferentially either in the plane parallel or perpendicular to the planet scattering plane, also for non-uniform cloud and albedo properties and various levels of absorption within the atmosphere. The evaluation of circular polarization is challenging; a number of one-photon experiments of 10^9 or more is needed to resolve hemispherically integrated degrees of circular polarization of a few times 10^-5. Last, I introduce brightness curves of Earth obtained with one of the Messenger cameras at three wavelengths (0.48, 0.56 and 0.63 μm) during a flyby in 2005. The light curves show distinct structure associated with the varying aspect of the Earth's visible disk (phases of 98-107°) as the planet undergoes a full 24 h rotation; the structure is reasonably well reproduced with model simulations.
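The photon budget quoted for circular polarization follows from the usual Monte Carlo error scaling; a back-of-the-envelope check, under the simplifying assumption that each one-photon experiment returns an O(1)-variance sample (the sigma below is an assumed value, not one from the paper):

    import math

    # Monte Carlo standard error ~ sigma / sqrt(N): the ~1e-5 signal is
    # only resolved once N approaches 1e9 histories or more.
    signal, sigma = 3e-5, 0.3

    for n in (10**6, 10**8, 10**9, 10**10):
        stderr = sigma / math.sqrt(n)
        verdict = "resolves" if stderr < signal else "swamps"
        print(f"N={n:.0e}: standard error {stderr:.1e} {verdict} a {signal:.0e} signal")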
Computational Physics' Greatest Hits
NASA Astrophysics Data System (ADS)
Bug, Amy
2011-03-01
The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session "What is computational physics?" speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session "Great Advances...Past, Present and Future" in which five dramatic areas of discovery (five of our "greatest hits") are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify "greatest hits" in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.
Network-based stochastic semisupervised learning.
Silva, Thiago Christiano; Zhao, Liang
2012-03-01
Semisupervised learning is a machine learning approach that is able to employ both labeled and unlabeled samples in the training process. In this paper, we propose a semisupervised data classification model based on a combined random-preferential walk of particles in a network (graph) constructed from the input dataset. Particles of the same class cooperate among themselves, while particles of different classes compete with each other to propagate class labels to the whole network. A rigorous model definition is provided via a nonlinear stochastic dynamical system, and a mathematical analysis of its behavior is carried out. A numerical validation presented in this paper confirms the theoretical predictions. An interesting feature brought by the competitive-cooperative mechanism is that the proposed model can achieve good classification rates while exhibiting low computational complexity in comparison to other network-based semisupervised algorithms. Computer simulations conducted on synthetic and real-world datasets reveal the effectiveness of the model.
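The paper's particle dynamics do not fit in a few lines, but the graph-based setting they operate in can be sketched with plain label propagation on a kNN graph built from the dataset; a minimal, illustrative substitute (not the authors' algorithm):

    import numpy as np

    def knn_graph(X, k=5):
        """Symmetric kNN adjacency matrix from an (n, d) data matrix."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        W = np.zeros_like(d)
        for i in range(len(X)):
            for j in np.argsort(d[i])[1:k + 1]:   # skip self at position 0
                W[i, j] = W[j, i] = 1.0
        return W

    def propagate_labels(W, y, n_classes, iters=200):
        """y: one int label per node, -1 for unlabeled nodes."""
        mask = y >= 0
        F = np.zeros((len(y), n_classes))
        F[mask, y[mask]] = 1.0                    # clamp the known labels
        P = W / W.sum(axis=1, keepdims=True)      # row-stochastic transitions
        for _ in range(iters):
            F = P @ F                             # diffuse labels over edges
            F[mask] = 0.0
            F[mask, y[mask]] = 1.0                # re-clamp after each sweep
        return F.argmax(axis=1)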
NASA Technical Reports Server (NTRS)
Mahan, J. R.; Tira, Nour E.
1991-01-01
An improved dynamic electrothermal model for the Earth Radiation Budget Experiment (ERBE) total, nonscanning channels is formulated. This model is then used to accurately simulate two types of dynamic solar observation: the solar calibration and the so-called pitchover maneuver. Using a second model, the nonscanner active cavity radiometer (ACR) thermal noise is studied. This study reveals that radiative emission and scattering by the surrounding parts of the nonscanner cavity are acceptably small. The dynamic electrothermal model is also used to compute the ACR instrument transfer function. Accurate in-flight measurement of this transfer function is shown to depend on the energy distribution over the frequency spectrum of the radiation input function. A new array-type field-of-view limiter, whose geometry controls the input function, is proposed for in-flight calibration of an ACR and other types of radiometers. The point spread function (PSF) of the ERBE and the Clouds and Earth's Radiant Energy System (CERES) scanning radiometers is computed. The PSF is useful in characterizing the channel optics. It also has potential for recovering the distribution of the radiative flux from Earth by deconvolution.
Computational complexity of the landscape II-Cosmological considerations
NASA Astrophysics Data System (ADS)
Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire
2018-05-01
We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.
Magnetospheric Reconnection in Modified Current-Sheet Equilibria
NASA Astrophysics Data System (ADS)
Newman, D. L.; Goldman, M. V.; Lapenta, G.; Markidis, S.
2012-10-01
Particle simulations of magnetic reconnection in Earth's magnetosphere are frequently initialized with a current-carrying Harris equilibrium superposed on a current-free uniform background plasma. The Harris equilibrium satisfies local charge neutrality, but requires that the sheet current be dominated by the hotter species -- often the ions in Earth's magnetosphere. This constraint is not necessarily consistent with observations. A modified kinetic equilibrium that relaxes this constraint on the currents was proposed by Yamada et al. [Phys. Plasmas, 7, 1781 (2000)] with no background population. These modified equilibria were characterized by an asymptotic converging or diverging electrostatic field normal to the current sheet. By reintroducing the background plasma, we have developed new families of equilibria where the asymptotic fields are suppressed by Debye shielding. Because the electrostatic potential profiles of these new equilibria contain wells and/or barriers capable of spatially isolating different populations of electrons and/or ions, these solutions can be further generalized to include classes of asymmetric kinetic equilibria. Examples of both symmetric and asymmetric equilibria will be presented. The dynamical evolution of these equilibria, when perturbed, will be further explored by means of implicit 2D PIC reconnection simulations, including comparisons with simulations employing standard Harris-equilibrium initializations.
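For reference, the Harris equilibrium mentioned above has the standard profile below, where z is the coordinate normal to the sheet, λ the sheet half-thickness, and n_b the uniform background density (symbols assumed here; the pressure-balance condition fixes the central density):

    B_x(z) = B_0 \tanh(z/\lambda), \qquad
    n(z) = n_0\,\mathrm{sech}^2(z/\lambda) + n_b, \qquad
    \frac{B_0^2}{2\mu_0} = n_0 k_B (T_i + T_e)

The constraint the abstract refers to arises because the species drift speeds in the Harris solution satisfy u_i/u_e = -T_i/T_e, tying the partition of the sheet current to the temperature ratio, so the hotter species necessarily carries more of the current.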
Processing for spaceborne synthetic aperture radar imagery
NASA Technical Reports Server (NTRS)
Lybanon, M.
1973-01-01
The data handling and processing involved in using synthetic aperture radar as a satellite-borne Earth resources remote sensor are considered. The discussion covers the nature of the problem, the theory, both conventional and potential advanced processing techniques, and a complete computer simulation. It is shown that digital processing is a real possibility, and some future directions for research are suggested.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Atul K.
The overall objective of this DOE-funded project is to combine scientific and computational challenges in climate modeling by expanding our understanding of the biogeophysical-biogeochemical processes and their interactions in the northern high latitudes (NHLs) using an earth system modeling (ESM) approach, and by adopting an adaptive parallel runtime system in an ESM to achieve efficient and scalable climate simulations through improved load-balancing algorithms.
A Computer Simulation of Organizational Decision-Making.
1979-12-01
future research into one class of manpower models. In choosing the voting scenario I was more interested in the long-term process of political socialization, rather than the prediction of the outcome of a particular election. Successive elections are like successive learning trials. The analysis did
Mars Science Laboratory Workstation Test Set
NASA Technical Reports Server (NTRS)
Henriquez, David A.; Canham, Timothy K.; Chang, Johnny T.; Villaume, Nathaniel
2009-01-01
The Mars Science Laboratory-developed Workstation Test Set (WSTS) is a computer program that enables flight software development on virtual MSL avionics. The WSTS is a non-real-time flight avionics simulator that is designed to be completely software-based and to run on a workstation-class Linux PC.
The GOCE end-to-end system simulator
NASA Astrophysics Data System (ADS)
Catastini, G.; Cesare, S.; de Sanctis, S.; Detoma, E.; Dumontel, M.; Floberghagen, R.; Parisch, M.; Sechi, G.; Anselmi, A.
2003-04-01
The idea of an end-to-end simulator was conceived in the early stages of the GOCE programme as an essential tool for assessing the satellite system performance, which cannot be fully tested on the ground. The simulator in its present form has been under development at Alenia Spazio for ESA since the beginning of Phase B and is being used for checking the consistency of the spacecraft and payload specifications with the overall system requirements, supporting trade-off, sensitivity and worst-case analyses, and preparing and testing the on-ground and in-flight calibration concepts. The software simulates the GOCE flight along an orbit resulting from the application of Earth's gravity field, non-conservative environmental disturbances (atmospheric drag, coupling with Earth's magnetic field, etc.) and control forces/torques. The drag-free control forces as well as the attitude control torques are generated by the current design of the dedicated algorithms. Realistic sensor models (star tracker, GPS receiver and gravity gradiometer) feed the control algorithms, and the commanded forces are applied through realistic thruster models. The output of this stage of the simulator is a time series of Level-0 data, namely the gradiometer raw measurements and spacecraft ancillary data. The next stage of the simulator transforms Level-0 data into Level-1b (gravity gradient tensor) data by implementing the following steps:
- transformation of the raw measurements of each pair of accelerometers into common and differential accelerations;
- calibration of the common and differential accelerations;
- application of the post-facto algorithm to rectify the phase of the accelerations and to estimate the GOCE angular velocity and attitude;
- computation of the Level-1b gravity gradient tensor from the calibrated accelerations and estimated angular velocity in different reference frames (orbital, inertial, earth-fixed); computation of the spectral density of the error of the tensor diagonal components (measured gravity gradient minus input gravity gradient) in order to verify the requirement on the gravity gradient error of 4 mE/sqrt(Hz) within the gradiometer measurement bandwidth (5 to 100 mHz); computation of the spectral density of the tensor trace in order to verify the requirement of 4 sqrt(3) mE/sqrt(Hz) within the measurement bandwidth;
- processing of GPS observations for orbit reconstruction within the required 10 m accuracy and for gradiometer measurement geolocation.
The current version of the end-to-end simulator, essentially focusing on the gradiometer payload, is undergoing detailed testing based on a time span of 10 days of simulated flight. This testing phase, ending in January 2003, will verify the current implementation and conclude the assessment of numerical stability and precision. Following that, the exercise will be repeated on a longer-duration simulated flight, and the lessons learnt so far will be exploited to further improve the simulator's fidelity. The paper will describe the simulator's current status and will illustrate its capabilities for supporting the assessment of the quality of the scientific products resulting from the current spacecraft and payload design.
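The first Level-0-to-Level-1b step above, forming common and differential accelerations from each accelerometer pair, is simple to state; a minimal sketch (sign and scaling conventions are assumed, not taken from the GOCE processing specification):

    import numpy as np

    def common_and_differential(a1, a2):
        """a1, a2: (N, 3) acceleration time series of one accelerometer pair.
        The common mode drives drag-free control; the differential mode,
        scaled by the pair's baseline, feeds the gravity-gradient estimate."""
        a_common = 0.5 * (a1 + a2)
        a_diff = 0.5 * (a1 - a2)
        return a_common, a_diff

    # For an in-line pair separated by baseline L (metres), the raw in-line
    # gradient component scales as ~ 2 * a_diff / L, before calibration and
    # removal of the angular (centrifugal and Euler) terms.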
Scaling a Convection-Resolving RCM to Near-Global Scales
NASA Astrophysics Data System (ADS)
Leutwyler, D.; Fuhrer, O.; Chadha, T.; Kwasniewski, G.; Hoefler, T.; Lapillonne, X.; Lüthi, D.; Osuna, C.; Schar, C.; Schulthess, T. C.; Vogt, H.
2017-12-01
In recent years, the first decade-long kilometer-scale-resolution RCM simulations have been performed on continental-scale computational domains. However, the planet Earth is still an order of magnitude larger, and thus the computational implications of performing global climate simulations at this resolution are challenging. We explore the gap between the currently established RCM simulations and global simulations by scaling the GPU-accelerated version of the COSMO model to a near-global computational domain. To this end, the evolution of an idealized moist baroclinic wave has been simulated over the course of 10 days with a grid spacing of down to 930 m. The computational mesh employs 36'000 x 16'001 x 60 grid points and covers 98.4% of the planet's surface. The code shows perfect weak scaling up to 4'888 nodes of the Piz Daint supercomputer and yields 0.043 simulated years per day (SYPD), which is approximately one seventh of the 0.2-0.3 SYPD required to conduct AMIP-type simulations. However, at half the resolution (1.9 km) we observed 0.23 SYPD. Besides the formation of frontal precipitating systems containing embedded explicitly resolved convective motions, the simulations reveal a secondary instability that leads to cut-off warm-core cyclonic vortices in the cyclone's core once the grid spacing is refined to the kilometer scale. The explicit representation of embedded moist convection and the representation of the previously unresolved instabilities exhibit physically different behavior in comparison to coarser-resolution simulations. The study demonstrates that global climate simulations using kilometer-scale resolution are imminent and serves as a baseline benchmark for global climate model applications and future exascale supercomputing systems.
A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network
NASA Astrophysics Data System (ADS)
Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.
A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low-Earth-orbit objects relies mainly on ground-based radar; owing to the capability limitations of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (station deployment) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for this optimization is to simulate detection for all possible stations using cataloged data, make a comprehensive comparative analysis of the various simulation results with a combinatorial method, and then select an optimal result as the station layout scheme. This method is time consuming for a single simulation and computationally complex for the combinatorial analysis; as the number of stations increases, the complexity of the optimization problem grows exponentially, so the traditional method cannot solve it, and no better way has been available until now. In this paper, the target detection procedure was simplified. First, the space coverage of ground-based radar was simplified and a space-coverage projection model of radar facilities at different orbit altitudes was built; then a simplified model of objects crossing the radar coverage was established according to the characteristics of space-object orbital motion. After these two simplifications, the computational complexity of target detection was greatly reduced, and simulation results confirmed the correctness of the simplified model. In addition, the detection areas of a ground-based radar network can easily be computed with the simplified model, and the deployment of the network can then be optimized with an artificial intelligence algorithm, which greatly reduces the computational complexity. Compared with the traditional method, the proposed method greatly improves computational efficiency.
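To make the deployment-optimization step concrete, here is an illustrative sketch in the spirit described above: each candidate radar site is scored by how many simulated object passes its precomputed, simplified coverage area would detect, and sites are then picked greedily. The names and the greedy strategy are assumptions for illustration; the paper uses an artificial intelligence algorithm.

    def greedy_station_layout(candidates, passes, covers, n_stations):
        """candidates: list of site ids; passes: list of object-pass ids;
        covers(site, pass) -> bool, from the simplified coverage model."""
        chosen, uncovered = [], set(passes)
        for _ in range(n_stations):
            # pick the site that detects the most still-uncovered passes
            best = max(candidates,
                       key=lambda s: sum(covers(s, p) for p in uncovered))
            chosen.append(best)
            uncovered -= {p for p in uncovered if covers(best, p)}
            candidates = [s for s in candidates if s != best]
        return chosen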
A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations
Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; ...
2015-06-01
Extensive research efforts have been invested in reducing model errors to improve the predictive ability of biogeochemical earth and environmental system simulators, with applications ranging from contaminant transport and remediation to impacts of biogeochemical elemental cycling (e.g., carbon and nitrogen) on local ecosystems and regional to global climate. While the bulk of this research has focused on improving model parameterizations in the face of observational limitations, the more challenging type of model error/uncertainty to identify and quantify is model structural error which arises from incorrect mathematical representations of (or failure to consider) important physical, chemical, or biological processes, properties, or system states in model formulations. While improved process understanding can be achieved through scientific study, such understanding is usually developed at small scales. Process-based numerical models are typically designed for a particular characteristic length and time scale. For application-relevant scales, it is generally necessary to introduce approximations and empirical parameterizations to describe complex systems or processes. This single-scale approach has been the best available to date because of limited understanding of process coupling combined with practical limitations on system characterization and computation. While computational power is increasing significantly and our understanding of biological and environmental processes at fundamental scales is accelerating, using this information to advance our knowledge of the larger system behavior requires the development of multiscale simulators. Accordingly there has been much recent interest in novel multiscale methods in which microscale and macroscale models are explicitly coupled in a single hybrid multiscale simulation. A limited number of hybrid multiscale simulations have been developed for biogeochemical earth systems, but they mostly utilize application-specific and sometimes ad-hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater – river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.
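A minimal sketch of the hierarchical coupling pattern described above, in the style of the Heterogeneous Multiscale Method: the macroscale solver pauses where its closure is unreliable and calls a microscale model to supply the missing quantity. All names here are illustrative assumptions; in the paper, the orchestration of the at-scale codes is handled by Swift.

    def hybrid_multiscale_step(macro_state, macro_step, needs_closure,
                               run_micro, upscale):
        """One macro time step of an HMM-style hybrid simulation.

        macro_step(state, closures) -> new state
        needs_closure(state) -> regions lacking a valid closure
        run_micro(region, state) -> microscale result (e.g. pore-scale rates)
        upscale(result) -> effective macroscale parameter for that region
        """
        closures = {}
        for region in needs_closure(macro_state):
            micro_result = run_micro(region, macro_state)  # expensive; parallel
            closures[region] = upscale(micro_result)       # e.g. averaged flux
        return macro_step(macro_state, closures)

    # A driver would loop this over time, with the workflow engine farming
    # out the run_micro calls to HPC resources.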
IPSL-CM5A2. An Earth System Model designed to run long simulations for past and future climates.
NASA Astrophysics Data System (ADS)
Sepulchre, Pierre; Caubel, Arnaud; Marti, Olivier; Hourdin, Frédéric; Dufresne, Jean-Louis; Boucher, Olivier
2017-04-01
The IPSL-CM5A model was developed and released in 2013 "to study the long-term response of the climate system to natural and anthropogenic forcings as part of the 5th Phase of the Coupled Model Intercomparison Project (CMIP5)" [Dufresne et al., 2013]. Although this model has also been used for numerous paleoclimate studies, a major limitation was its computation time, which averaged 10 model-years/day on 32 cores of the Curie supercomputer (at the TGCC computing centre, France). Such performance was compatible with the experimental designs of intercomparison projects (e.g. CMIP, PMIP) but became limiting for modelling activities involving several multi-millennial experiments, which are typical for Quaternary or "deep-time" paleoclimate studies, in which a fully equilibrated deep ocean is mandatory. Here we present the Earth System model IPSL-CM5A2. Based on IPSL-CM5A, technical developments have been performed both on separate components and on the coupling system in order to speed up the whole coupled model. These developments include the integration of hybrid MPI-OpenMP parallelization in the LMDz atmospheric component, the use of a new input-output library to perform parallel asynchronous input/output by using computing cores as "I/O servers", and the use of a parallel coupling library between the ocean and atmospheric components. Running on 304 cores, the model can now simulate 55 years per day, opening the way towards multi-millennial simulations. Apart from obtaining better computing performance, one aim of setting up IPSL-CM5A2 was also to overcome the cold bias in global surface air temperature (t2m) depicted by IPSL-CM5A. We present the tuning strategy used to overcome this bias as well as the main characteristics (including biases) of the pre-industrial climate simulated by IPSL-CM5A2. Lastly, we briefly present paleoclimate simulations run with this model, for the Holocene and for deeper timescales in the Cenozoic, for which the particular continental configuration was accommodated by a new design of the ocean tripolar grid.
Plasma Sheet Circulation Pathways
NASA Technical Reports Server (NTRS)
Moore, Thomas E.; Delcourt, D. C.; Slinker, S. P.; Fedder, J. A.; Damiano, P.; Lotko, W.
2008-01-01
Global simulations of Earth's magnetosphere in the solar wind compute the pathways of plasma circulation through the plasma sheet. We address the pathways that supply and drain the plasma sheet by coupling single-fluid simulations with Global Ion Kinetic simulations of the outer magnetosphere and the Comprehensive Ring Current Model of the inner magnetosphere, including plasmaspheric plasmas. We find that the plasma sheet is supplied with solar wind plasmas via the magnetospheric flanks, and that this supply is most effective for northward IMF. For southward IMF, the innermost plasma sheet and ring current region are directly supplied from the flanks, with an asymmetry of single-particle entry favoring the dawn flank. The central plasma sheet (near midnight) is supplied, as expected, from the lobes and polar cusps, but the near-Earth supply consists mainly of slowly moving ionospheric outflows for typical conditions. Work with the recently developed multi-fluid LFM simulation shows transport via plasma "fingers" extending Earthward from the flanks, suggestive of an interchange instability. We investigate this with solar wind ion trajectories, seeking to understand the fingering mechanisms and effects on transport rates.
Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis
Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...
2017-09-01
Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-point-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) system at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
Paleo-environment Simulation using GIS based on Shell Mounds
NASA Astrophysics Data System (ADS)
Uchiyama, T.; Asanuma, I.; Harada, E.
2016-02-01
Paleo-coastlines are simulated using a geographic information system (GIS) based on shell mounds, as a study of the paleo-environment of the Tsubaki-no-umi ('Ocean of Camellia' in Japanese), a paleo-ocean in Japan. The shell mounds, which are introduced in paleo-studies in junior and senior high school history classes, are used to estimate the paleo-coastlines. The paleo-coastlines are simulated as a function of sea level relative to the current sea level for 6000 to 3000 BP on the digital elevation map of the GIS. The polygon for a simulated sea level 10 m above the present level captured the shell mounds dating from 6000 to 5500 BP as the result of the spatial operation, consistent with previous studies. A simulated sea level 5.5 m above the present level traced the paleo-coastline during 3600 to 3220 BP, when the Tsubaki-no-umi turned into a brackish-water lake, partly isolated from the ocean. The simulation of sea levels with GIS could be brought into junior and senior high school classes with minimal effort by teachers, using available computer and software environments.
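The core GIS operation described above amounts to thresholding a digital elevation model at a relative sea level and testing which shell-mound sites sit just above the resulting paleo-coast; a minimal sketch (arrays, band width, and site coordinates are illustrative assumptions):

    import numpy as np

    def paleo_sea_mask(dem, sea_level_m):
        """True where the DEM lies at or below the given relative sea level."""
        return dem <= sea_level_m

    def sites_near_coast(dem, sites_rc, sea_level_m, band_m=5.0):
        """Shell mounds are expected near, but above, the paleo-coastline:
        keep sites whose elevation is within band_m above sea level."""
        elev = np.array([dem[r, c] for r, c in sites_rc])
        keep = (elev > sea_level_m) & (elev <= sea_level_m + band_m)
        return [s for s, k in zip(sites_rc, keep) if k]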
Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems
NASA Technical Reports Server (NTRS)
Steinthorsson, Erlendur; Modiano, David
1995-01-01
Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.
Numerical investigation for the impact of CO2 geologic sequestration on regional groundwater flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamamoto, H.; Zhang, K.; Karasaki, K.
Large-scale storage of carbon dioxide in saline aquifers may cause considerable pressure perturbation and brine migration in deep rock formations, which may have a significant influence on the regional groundwater system. With the help of parallel computing techniques, we conducted a comprehensive, large-scale numerical simulation of CO2 geologic storage that predicts not only CO2 migration, but also its impact on regional groundwater flow. As a case study, a hypothetical industrial-scale CO2 injection in Tokyo Bay, which is surrounded by the most heavily industrialized area in Japan, was considered, and the impact of CO2 injection on near-surface aquifers was investigated, assuming relatively high seal-layer permeability (higher than 10 microdarcy). A regional hydrogeological model with an area of about 60 km x 70 km around Tokyo Bay was discretized into about 10 million gridblocks. To solve the high-resolution model efficiently, we used the parallelized multiphase flow simulator TOUGH2-MP/ECO2N on a world-class high-performance supercomputer in Japan, the Earth Simulator. In this simulation, CO2 was injected into a storage aquifer at about 1 km depth under Tokyo Bay from 10 wells, at a total rate of 10 million tons/year for 100 years. Through the model, we can examine regional groundwater pressure buildup and groundwater migration to the land surface. The results suggest that even if containment of the CO2 plume is ensured, pressure buildup on the order of a few bars can occur in the shallow confined aquifers over extensive regions, including urban inlands.
NASA Astrophysics Data System (ADS)
Choudhury, Diptyajit; Angeloski, Aleksandar; Ziah, Haseeb; Buchholz, Hilmar; Landsman, Andre; Gupta, Amitava; Mitra, Tiyasa
Lunar explorations often involve the use of a lunar lander, a rover [1],[2] and an orbiter which rotates around the moon with a fixed radius. The orbiters are usually lunar satellites orbiting along a polar orbit to ensure visibility with respect to the rover and the Earth Station, although with varying latency. Communication in such deep space missions is usually done using a specialized protocol like Proximity-1 [3]. MATLAB simulation of Proximity-1 has been attempted by some contemporary researchers [4] to simulate features like transmission control, delay, etc. In this paper it is attempted to simulate, in real time, the communication between a tracking station on earth (earth station), a lunar orbiter and a lunar rover using concepts of Distributed Real-time Simulation (DRTS). The objective of the simulation is to simulate, in real time, the time-varying communication delays associated with the communicating elements, with a facility to integrate specific simulation modules to study different aspects, e.g. the response due to a specific control command from the earth station to be executed by the rover. The hardware platform comprises four single-board computers operating as stand-alone real-time systems (developed with the MATLAB xPC Target and inter-networked using the UDP-IP protocol). A time-triggered DRTS approach is adopted. The earth station, the orbiter and the rover are programmed as three standalone real-time processes representing the communicating elements in the system. Communication from one communicating element to another constitutes an event which passes a state message from one element to another, augmenting the state of the latter. These events are handled by an event scheduler, which is the fourth real-time process. The event scheduler simulates the delay in space communication taking into consideration the distance between the communicating elements. A unique time synchronization algorithm is developed which takes into account the large latencies in space communication. The DRTS setup thus developed serves as an important and inexpensive test bench for trying out remote-controlled applications on the rover, for example, from an earth station. The simulation is modular and the system is composable. Each of the processes can be augmented with relevant simulation modules that handle the events to simulate specific functionalities. With stringent energy saving requirements on most rovers, such a simulation setup, for example, can be used to design optimal rover movement control strategies from the orbiter in conjunction with autonomous systems on the rover itself. References 1. Lunar and Planetary Department, Moscow University, Lunokhod 1, "http://selena.sai.msu.ru/Home/Spa 2. NASA History Office, Guidelines for Advanced Manned Space Vehicle Program, "http://history.nasa.gov 35ann/AMSVPguidelines/top.htm" 3. Consultative Committee For Space Data Systems, "Proximity-1 Space Link Protocol", CCSDS 211.0-B-1 Blue Book, October 2002. 4. Segui, J. and Jennings, E., "Delay Tolerant Networking-Bundle Protocol Simulation", in Proceedings of the 2nd IEEE International Conference on Space Mission Challenges for Information Technology, 2006.
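The heart of such a setup is the event scheduler that delays each state message by the light time over the current range between elements; a minimal sketch (the API and the use of one-way light time are illustrative assumptions, not the paper's implementation):

    import heapq

    C_KM_S = 299_792.458  # speed of light, km/s

    class DelayScheduler:
        """Queue messages with a delivery time offset by the light time
        over the current range between communicating elements."""

        def __init__(self):
            self._queue = []   # (delivery_time, seq, destination, message)
            self._seq = 0      # tie-breaker so the heap never compares messages

        def send(self, t_now, range_km, dest, msg):
            delay = range_km / C_KM_S          # one-way light time, seconds
            heapq.heappush(self._queue, (t_now + delay, self._seq, dest, msg))
            self._seq += 1

        def due(self, t_now):
            """Pop every message whose delivery time has arrived."""
            out = []
            while self._queue and self._queue[0][0] <= t_now:
                out.append(heapq.heappop(self._queue))
            return out

    # An earth-station command relayed to the rover via the orbiter would be
    # two send() hops, each with its own time-varying range.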
Precise attitude control of the Stanford relativity satellite.
NASA Technical Reports Server (NTRS)
Bull, J. S.; Debra, D. B.
1973-01-01
A satellite being designed at Stanford University to measure (with extremely high precision) the effects of general relativity is described. Specifically, the satellite will measure two relativistic precessions predicted by the theory: the geodetic effect (6.9 arcsec/yr), due solely to motion about the earth, and the motional effect (0.05 arcsec/yr), due to rotation of the earth. The gyro design requirements, including the requirement for precise attitude control and a dynamic model for attitude control synthesis, are discussed. Closed-loop simulation of the satellite's natural dynamics on an analog computer is described.
Visualizing ultrasound through computational modeling
NASA Technical Reports Server (NTRS)
Guo, Theresa W.
2004-01-01
The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the constraints of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on earth for trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results and thus help the research group make informed decisions before and during experimentation. There are several existing MATLAB-based computer programs available that are designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project's needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and specify transmitting and receiving signals, 2) the program must be able to simulate ultrasound signals through different attenuating mediums, 3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow, and 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.
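The physics the simulated Doppler signals must capture is the classical frequency shift from moving scatterers (red blood cells); a minimal sketch of the standard relation, with illustrative parameter values:

    import math

    # Doppler shift from a moving scatterer:
    #   f_d = 2 * v * cos(theta) * f0 / c
    # v: blood speed, theta: beam-to-flow angle,
    # f0: transmit frequency, c: speed of sound in tissue (~1540 m/s).

    def doppler_shift_hz(v_m_s, theta_deg, f0_hz, c_m_s=1540.0):
        return 2.0 * v_m_s * math.cos(math.radians(theta_deg)) * f0_hz / c_m_s

    # e.g. 0.5 m/s flow, 60 degree angle, 5 MHz transducer:
    print(doppler_shift_hz(0.5, 60.0, 5e6))  # ~1.6 kHz, in the audible band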
The origins of computer weather prediction and climate modeling
NASA Astrophysics Data System (ADS)
Lynch, Peter
2008-03-01
Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.
Developments in Atmosphere Revitalization Modeling and Simulation
NASA Technical Reports Server (NTRS)
Knox, James C.; Kittredge, Kenneth; Coker, Robert F.; Cummings, Ramona; Gomez, Carlos F.
2012-01-01
"NASA's Advanced Exploration Systems (AES) program is pioneering new approaches for rapidly developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit" (NASA 2012). These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must not only blast out of earth's gravity well as during the Apollo moon missions, but also launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach, which is then implemented in a full-scale integrated atmosphere revitalization test. This paper describes the development of atmosphere revitalization models and simulations. A companion paper discusses the hardware design and sorbent screening and characterization effort in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.
Experimental and simulation study results for video landmark acquisition and tracking technology
NASA Technical Reports Server (NTRS)
Schappell, R. T.; Tietz, J. C.; Thomas, H. M.; Lowrie, J. W.
1979-01-01
A synopsis of related Earth observation technology is provided and includes surface-feature tracking, generic feature classification and landmark identification, and navigation by multicolor correlation. With the advent of the Space Shuttle era, the NASA role takes on new significance in that one can now conceive of dedicated Earth resources missions. Space Shuttle also provides a unique test bed for evaluating advanced sensor technology like that described in this report. As a result of this type of rationale, the FILE OSTA-1 Shuttle experiment, which grew out of the Video Landmark Acquisition and Tracking (VILAT) activity, was developed and is described in this report along with the relevant tradeoffs. In addition, a synopsis of FILE computer simulation activity is included. This synopsis relates to future required capabilities such as landmark registration, reacquisition, and tracking.
A Study of Fluid Interface Configurations in Exploration Vehicle Propellant Tanks
NASA Technical Reports Server (NTRS)
Zimmerli, Gregory A.; Asipauskas, Marius; Chen, Yongkang; Weislogel, Mark M.
2010-01-01
The equilibrium shape and location of fluid interfaces in spacecraft propellant tanks while in low-gravity is of interest to system designers, but can be challenging to predict. The propellant position can affect many aspects of the spacecraft such as the spacecraft center of mass, response to thruster firing due to sloshing, liquid acquisition, propellant mass gauging, and thermal control systems. We use Surface Evolver, a fluid interface energy minimizing algorithm, to investigate theoretical equilibrium liquid-vapor interfaces for spacecraft propellant tanks similar to those that have been considered for NASA's new class of Exploration vehicles. The choice of tank design parameters we consider are derived from the NASA Exploration Systems Architecture Study report. The local acceleration vector employed in the computations is determined by estimating low-Earth orbit (LEO) atmospheric drag effects and centrifugal forces due to a fixed spacecraft orientation with respect to the Earth or Moon, and rotisserie-type spacecraft rotation. Propellant/vapor interface positions are computed for the Earth Departure Stage and Altair lunar lander descent and ascent stage tanks for propellant loads applicable to LEO and low-lunar orbit. In some of the cases investigated the vapor ullage bubble is located at the drain end of the tank, where propellant management device hardware is often located.
A class of hybrid finite element methods for electromagnetics: A review
NASA Technical Reports Server (NTRS)
Volakis, J. L.; Chatterjee, A.; Gong, J.
1993-01-01
Integral equation methods have generally been the workhorse for antenna and scattering computations. In the case of antennas, they continue to be the prominent computational approach, but for scattering applications the requirement for large-scale computations has turned researchers' attention to near neighbor methods such as the finite element method, which has low O(N) storage requirements and is readily adaptable in modeling complex geometrical features and material inhomogeneities. In this paper, we review three hybrid finite element methods for simulating composite scatterers, conformal microstrip antennas, and finite periodic arrays. Specifically, we discuss the finite element method and its application to electromagnetic problems when combined with the boundary integral, absorbing boundary conditions, and artificial absorbers for terminating the mesh. Particular attention is given to large-scale simulations, methods, and solvers for achieving low memory requirements and code performance on parallel computing architectures.
Toward 10-km mesh global climate simulations
NASA Astrophysics Data System (ADS)
Ohfuchi, W.; Enomoto, T.; Takaya, K.; Yoshioka, M. K.
2002-12-01
An atmospheric general circulation model (AGCM) that runs very efficiently on the Earth Simulator (ES) was developed. The ES is a gigantic vector-parallel computer with a peak performance of 40 Tflops. The AGCM, named AFES (AGCM for ES), was based on version 5.4.02 of an AGCM developed jointly by the Center for Climate System Research, the University of Tokyo, and the Japanese National Institute for Environmental Sciences. The AFES was, however, totally rewritten in FORTRAN90 and MPI, while the original AGCM was written in FORTRAN77 and not capable of parallel computing. The AFES achieved 26 Tflops (about 65% of the peak performance of the ES) at a resolution of T1279L96 (10-km horizontal resolution and 500-m vertical resolution from the middle troposphere to the lower stratosphere). Some results of 10- to 20-day global simulations will be presented. At this moment, only short-term simulations are possible due to data storage limitations. Now that tens-of-teraflops computing has been achieved, petabyte-scale data storage is necessary to conduct climate-type runs at this super-high global resolution. Some possibilities for future research topics in global super-high-resolution climate simulations will be discussed. Target topics include mesoscale structures and self-organization of the Baiu-Meiyu front over Japan, cyclogenesis over the North Pacific, and typhoons around the Japan area. Improvements in local precipitation with increasing horizontal resolution will also be demonstrated.
Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing
Stone, John E.; Sener, Melih; Vandivort, Kirby L.; ...
2015-12-12
The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. In this paper, we present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. Finally, we describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.
NASA Technical Reports Server (NTRS)
Williams, Jessica L.; Bhat, Ramachandra S.; You, Tung-Han
2012-01-01
The Soil Moisture Active Passive (SMAP) mission will perform soil moisture content and freeze/thaw state observations from a low-Earth orbit. The observatory is scheduled to launch in October 2014 and will perform observations from a near-polar, frozen, and sun-synchronous Science Orbit during a 3-year data collection mission. At launch, the observatory is delivered to an Injection Orbit that is biased below the Science Orbit; the spacecraft will maneuver to the Science Orbit during the mission Commissioning Phase. The delta-V needed to maneuver from the Injection Orbit to the Science Orbit is computed statistically via a Monte Carlo simulation; the 99th-percentile delta-V (ΔV99) is carried as a line item in the mission delta-V budget. This paper details the simulation and analysis performed to compute this figure and the ΔV99 computed for current mission parameters.
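The statistical sizing step is easy to illustrate: sample the injection-orbit dispersions, compute the correction delta-V for each sample, and carry the 99th percentile. A minimal sketch; the dispersion model and numbers below are stand-ins, not the SMAP error model:

    import numpy as np

    rng = np.random.default_rng(1)

    n = 100_000
    dv_fixed = 25.0                                 # deterministic raise, m/s (assumed)
    dv_disp = np.abs(rng.normal(0.0, 4.0, size=n))  # injection-error correction, m/s
    dv_total = dv_fixed + dv_disp

    dv99 = np.percentile(dv_total, 99)              # the 99th-percentile line item
    print(f"dV99 = {dv99:.1f} m/s")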
NASA Astrophysics Data System (ADS)
Lyon, Ellen Beth
1998-09-01
This research project investigated the influence of homogeneous (like-ability) review pairs coupled with heterogeneous (mixed-ability) cooperative learning groups using computer-assisted instruction (CAI) on academic achievement and attitude toward science in eighth grade Earth science students. Subjects were placed into academic quartiles (Hi, Med-Hi, Med-Lo, and Lo) based on achievement. Cooperative learning groups of four (one student from each academic quartile) were formed in all classes, within which students completed CAI through a software package entitled Geoscience Education Through Interactive Technology, or GETIT™. Each day, when computer activities were completed, students in the experimental classes were divided into homogeneous review pairs to review their work. The students in the control classes were divided into heterogeneous review pairs to review their work. The effects of the experimental treatment were measured by pretest, posttest, and delayed posttest measures, by pre- and post-student attitude scales, and by evaluation of amendments students made to their work during the time spent in review pairs. Results showed that student achievement was not significantly influenced by placement in homogeneous or heterogeneous review pairs, regardless of academic quartile assignment. Student attitude toward science as a school subject did not change significantly due to the experimental treatment. Achievement retention of students in experimental and control groups within each quartile showed no significant difference. Notebook amendment patterns showed significant differences in a few categories. For the Hi quartile, there were significant differences in the numbers of deletion amendments and substitution amendments between the experimental and the control group; in both cases, subjects in the experimental group (homogeneous review pairs) made a greater number of amendments than those in the control group (heterogeneous review pairs). For the Lo quartile, there was a significant difference in the number of grammar/usage/mechanics (GUM) amendments between the experimental and control groups; the experimental group made far more GUM amendments than the control group. This research highlights that many factors may influence a learning environment in which CAI is successfully implemented. Educational research projects should be designed and used to help teachers create learning environments in which the benefits of CAI are maximized.
Neurophysiological model of the normal and abnormal human pupil
NASA Technical Reports Server (NTRS)
Krenz, W.; Robin, M.; Barez, S.; Stark, L.
1985-01-01
Anatomical, experimental, and computer simulation studies were used to determine the structure of the neurophysiological model of the pupil size control system. The computer simulation of this model demonstrates the role played by each of the elements in the neurological pathways influencing the size of the pupil. Simulations of the effects of drugs and common abnormalities in the system help to illustrate the workings of the pathways and processes involved. The simulation program allows the user to select the pupil condition (normal or an abnormality), the specific site along the neurological pathway (retina, hypothalamus, etc.), the drug class input (barbiturate, narcotic, etc.), the stimulus/response mode, the display mode, the stimulus type and input waveform, the stimulus or background intensity and frequency, the input and output conditions, and the response at the neuroanatomical site. The model can be used as a teaching aid or as a tool for testing hypotheses regarding the system.
Large-eddy simulation of a boundary layer with concave streamwise curvature
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1994-01-01
Turbulence modeling continues to be one of the most difficult problems in fluid mechanics. Existing prediction methods are well developed for certain classes of simple equilibrium flows, but are still not entirely satisfactory for a large category of complex non-equilibrium flows found in engineering practice. Direct and large-eddy simulation (LES) approaches have long been believed to have great potential for the accurate prediction of difficult turbulent flows, but the associated computational cost has been prohibitive for practical problems. This remains true for direct simulation but is no longer clear for large-eddy simulation. Advances in computer hardware, numerical methods, and subgrid-scale modeling have made it possible to conduct LES for flows of practical interest at Reynolds numbers in the range of laboratory experiments. The objective of this work is to apply LES and the dynamic subgrid-scale model to the flow of a boundary layer over a concave surface.
Running SW4 On New Commodity Technology Systems (CTS-1) Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodgers, Arthur J.; Petersson, N. Anders; Pitarka, Arben
We have recently been running earthquake ground motion simulations with SW4 on the new capacity computing systems, called the Commodity Technology Systems - 1 (CTS-1) at Lawrence Livermore National Laboratory (LLNL). SW4 is a fourth order time domain finite difference code developed by LLNL and distributed by the Computational Infrastructure for Geodynamics (CIG). SW4 simulates seismic wave propagation in complex three-dimensional Earth models including anelasticity and surface topography. We are modeling near-fault earthquake strong ground motions for the purposes of evaluating the response of engineered structures, such as nuclear power plants and other critical infrastructure. Engineering analysis of structures requires the inclusion of high frequencies which can cause damage, but are often difficult to include in simulations because of the need for large memory to model fine grid spacing on large domains.
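To illustrate the kind of scheme SW4 belongs to, here is a minimal Python sketch of a fourth-order-in-space finite difference update for the 1D scalar wave equation. It is only a toy: SW4's actual discretization (3D, anelastic, with topography and its own time stepping) is far more elaborate, and the grid parameters below are arbitrary.

```python
import numpy as np

# 1D scalar wave equation u_tt = c^2 u_xx with a 4th-order spatial stencil.
nx, c, dx = 400, 3000.0, 100.0        # grid points, wavespeed (m/s), spacing (m)
dt = 0.4 * dx / c                     # time step satisfying a CFL-like bound
u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                      # crude point disturbance as the "source"

for _ in range(500):
    lap = np.zeros(nx)
    # 4th-order central difference:
    # (-u[i-2] + 16 u[i-1] - 30 u[i] + 16 u[i+1] - u[i+2]) / (12 dx^2)
    lap[2:-2] = (-u[:-4] + 16*u[1:-3] - 30*u[2:-2] + 16*u[3:-1] - u[4:]) / (12 * dx**2)
    u_next = 2*u - u_prev + (c * dt)**2 * lap   # 2nd-order leapfrog in time
    u_prev, u = u, u_next
```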
NASA Astrophysics Data System (ADS)
Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.
2015-12-01
Seismology addresses both fundamental problems in understanding the Earth's internal wave sources and structures and societal applications, such as earthquake and tsunami hazard assessment and risk mitigation, and it puts a premium on open data accessible through the Federated Digital Seismological Networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs), along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by getting data from multiple sources, managing Earth models and meshes, distilling them into input data, and capturing results with metadata. The dispel4py data-intensive framework allows for developing data-analysis applications using Python and the ObsPy library, which can be executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observation System (EPOS) and is part of the EPOS implementation phase, in which its cross-disciplinary capabilities are being extended.
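As a concrete taste of the ObsPy layer mentioned above, the following minimal sketch reads ObsPy's bundled example waveforms and applies simple preprocessing of the sort a dispel4py analysis pipeline might wrap. It assumes only that the obspy package is installed and shows none of the VERCE gateway or DCI machinery.

```python
from obspy import read

# ObsPy ships a small example data set; read() with no arguments loads it.
st = read()                                        # Stream of three example traces
st.detrend("demean")                               # remove the mean from each trace
st.filter("bandpass", freqmin=1.0, freqmax=10.0)   # band-limit for analysis
for tr in st:
    print(tr.id, tr.stats.sampling_rate, tr.data.max())
```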
ERIC Educational Resources Information Center
Fogarty, Ian; Geelan, David
2013-01-01
Students in 4 Canadian high school physics classes completed instructional sequences in two key physics topics related to motion--Straight Line Motion and Newton's First Law. Different sequences of laboratory investigation, teacher explanation (lecture) and the use of computer-based scientific visualizations (animations and simulations) were…
Mars exploration, Venus swingby and conjunction class mission modes, time period 2000 to 2045
NASA Technical Reports Server (NTRS)
Young, A. C.; Mulqueen, J. A.; Skinner, J. E.
1984-01-01
Trajectory and mission requirement data are presented for Earth-Mars opposition class and conjunction class round trip stopover mission opportunities available during the time period from year 2000 to year 2045. The opposition class mission employs the gravitational field of Venus to accelerate the space vehicle on either the outbound or inbound leg; this Venus gravity assist reduces the propulsion requirement associated with the opposition class mission. Representative space vehicle systems are sized to compare the initial mass required in low Earth orbit for one mission opportunity with that for another. The interplanetary space vehicle is made up of the spacecraft and the space vehicle acceleration system. The space vehicle acceleration system consists of three propulsion stages. The first propulsion stage performs the Earth escape maneuver; the second stage brakes the spacecraft and Earth braking stage into the Mars elliptical orbit and effects the escape maneuver from the Mars elliptical orbit. The third propulsion stage brakes the mission module into an elliptical orbit at Earth return. The interplanetary space vehicle is assumed to be assembled in and depart from the space station circular orbit.
NASA Astrophysics Data System (ADS)
Lin, Mingpei; Xu, Ming; Fu, Xiaoyu
2017-05-01
Currently, a tremendous amount of space debris in Earth's orbit imperils operational spacecraft. It is essential to undertake risk assessments of collisions and predict dangerous encounters in space. However, collision predictions for an enormous amount of space debris give rise to large-scale computations. In this paper, a parallel algorithm is established on the Compute Unified Device Architecture (CUDA) platform of NVIDIA Corporation for collision prediction. According to the parallel structure of NVIDIA graphics processors, a block decomposition strategy is adopted in the algorithm. Space debris is divided into batches, and the computation and data transfer operations of adjacent batches overlap. As a consequence, the latency to access shared memory during the entire computing process is significantly reduced, and a higher computing speed is reached. Theoretically, a simulation of collision prediction for space debris of any amount and for any time span can be executed. To verify this algorithm, a simulation example including 1382 pieces of debris, whose operational time scales vary from 1 min to 3 days, is conducted on Tesla C2075 of NVIDIA. The simulation results demonstrate that with the same computational accuracy as that of a CPU, the computing speed of the parallel algorithm on a GPU is 30 times that on a CPU. Based on this algorithm, collision prediction of over 150 Chinese spacecraft for a time span of 3 days can be completed in less than 3 h on a single computer, which meets the timeliness requirement of the initial screening task. Furthermore, the algorithm can be adapted for multiple tasks, including particle filtering, constellation design, and Monte Carlo simulations of orbital computations.
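The batching idea can be sketched independently of CUDA. The toy Python below double-buffers hypothetical "upload" and "screen" steps with a thread pool so that the transfer of batch k+1 overlaps the computation on batch k; the real implementation uses CUDA streams and GPU kernels, and every function and threshold here is a stand-in.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def upload(batch):
    """Stand-in for the host-to-device transfer of one debris batch."""
    return batch.copy()

def propagate_and_screen(batch_on_device):
    """Stand-in kernel: propagate states and flag close approaches."""
    return np.linalg.norm(batch_on_device, axis=1) < 7.0e6  # hypothetical threshold (m)

debris = np.random.rand(1382, 3) * 1.0e7       # toy positions for 1382 objects
batches = np.array_split(debris, 16)           # block decomposition into batches

results = []
with ThreadPoolExecutor(max_workers=2) as pool:
    pending = pool.submit(upload, batches[0])
    for nxt in batches[1:]:
        on_device = pending.result()
        pending = pool.submit(upload, nxt)                # transfer of batch k+1 ...
        results.append(propagate_and_screen(on_device))   # ... overlaps compute on batch k
    results.append(propagate_and_screen(pending.result()))
```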
NASA Astrophysics Data System (ADS)
Fairley, J. P.; Hinds, J. J.
2003-12-01
The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.
ORNL Cray X1 evaluation status report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, P.K.; Alexander, R.A.; Apra, E.
2004-05-01
On August 15, 2002 the Department of Energy (DOE) selected the Center for Computational Sciences (CCS) at Oak Ridge National Laboratory (ORNL) to deploy a new scalable vector supercomputer architecture for solving important scientific problems in climate, fusion, biology, nanoscale materials and astrophysics. "This program is one of the first steps in an initiative designed to provide U.S. scientists with the computational power that is essential to 21st century scientific leadership," said Dr. Raymond L. Orbach, director of the department's Office of Science. In FY03, CCS procured a 256-processor Cray X1 to evaluate the processors, memory subsystem, scalability of the architecture, and software environment, and to predict the expected sustained performance on key DOE application codes. The results of the micro-benchmarks and kernel benchmarks show the architecture of the Cray X1 to be exceptionally fast for most operations. The best results are shown on large problems, where it is not possible to fit the entire problem into the cache of the processors. These large problems are exactly the types of problems that are important for the DOE and ultra-scale simulation. Application performance is found to be markedly improved by this architecture: large-scale simulations of high-temperature superconductors run 25 times faster than on an IBM Power4 cluster using the same number of processors; the best performance of the Parallel Ocean Program (POP v1.4.3) is 50 percent higher than on Japan's Earth Simulator and 5 times higher than on an IBM Power4 cluster; a fusion application, global GYRO transport, was found to be 16 times faster on the X1 than on an IBM Power3, and the increased performance allowed simulations to fully resolve questions raised by a prior study; the transport kernel in the AGILE-BOLTZTRAN astrophysics code runs 15 times faster than on an IBM Power4 cluster using the same number of processors; and molecular dynamics simulations related to the phenomenon of photon echo run 8 times faster than previously achieved. Even at 256 processors, the Cray X1 system is already outperforming other supercomputers with thousands of processors for a certain class of applications, such as climate modeling and some fusion applications. This evaluation is the outcome of a number of meetings with both high-performance computing (HPC) system vendors and application experts over the past 9 months and has received broad-based support from the scientific community and other agencies.
Computing Radiative Transfer in a 3D Medium
NASA Technical Reports Server (NTRS)
Von Allmen, Paul; Lee, Seungwon
2012-01-01
A package of software computes the time-dependent propagation of a narrow laser beam in an arbitrary three-dimensional (3D) medium with absorption and scattering, using the transient-discrete-ordinates method and a direct integration method. Unlike prior software that utilizes a Monte Carlo method, this software enables simulation at very small signal-to-noise ratios. The ability to simulate propagation of a narrow laser beam in a 3D medium is an improvement over other discrete-ordinate software. Unlike other direct-integration software, this software is not limited to simulation of propagation of thermal radiation with broad angular spread in three dimensions or of a laser pulse with narrow angular spread in two dimensions. Uses for this software include (1) computing scattering of a pulsed laser beam on a material having given elastic scattering and absorption profiles, and (2) evaluating concepts for laser-based instruments for sensing oceanic turbulence and related measurements of oceanic mixed-layer depths. With suitable augmentation, this software could be used to compute radiative transfer in ultrasound imaging in biological tissues, radiative transfer in the upper Earth crust for oil exploration, and propagation of laser pulses in telecommunication applications.
The Role of Transfer in Designing Games and Simulations for Health: Systematic Review
Terlouw, Gijs; Wartena, Bard O; van 't Veer, Job TB; Prins, Jelle T; Pierie, Jean Pierre EN
2017-01-01
Background The usefulness and importance of serious games and simulations in learning and behavior change for health and health-related issues are widely recognized. Studies have addressed games and simulations as interventions, mostly in comparison with their analog counterparts. Numerous complex design choices have to be made with serious games and simulations for health, including choices that directly contribute to the effects of the intervention. One of these decisions is the way an intervention is expected to lead to desirable transfer effects. Most designs adopt a first-class transfer rationale, whereas the second class of transfer types seems a rarity in serious games and simulations for health. Objective This study sought to review the literature specifically on the second class of transfer types in the design of serious games and simulations. Focusing on game-like interventions for health and health care, this study aimed to (1) determine whether the second class of transfer is recognized as a road for transfer in game-like interventions, (2) review the application of the second class of transfer type in designing game-like interventions, and (3) assess studies that include second-class transfer types reporting transfer outcomes. Methods A total of 6 Web-based databases were systematically searched by titles, abstracts, and keywords using the search strategy (video games OR game OR games OR gaming OR computer simulation*) AND (software design OR design) AND (fidelity OR fidelities OR transfer* OR behaviour OR behavior). The databases searched were identified as relevant to health, education, and social science. Results A total of 15 relevant studies were included, covering a range of game-like interventions, all more or less mentioning design parameters aimed at transfer. We found 9 studies where first-class transfer was part of the design of the intervention. In total, 8 studies dealt with transfer concepts and fidelity types in game-like intervention design in general; 3 studies dealt with the concept of second-class transfer types and reported effects, and 2 of those recognized transfer as a design parameter. Conclusions In studies on game-like interventions for health and health care, transfer is regarded as a desirable effect but not as a basic principle for design. None of the studies determined the second class of transfer or instances thereof, although in 3 cases a nonliteral transfer type was present. We also found that studies on game-like interventions for health do not elucidate design choices made and rarely provide design principles for future work. Games and simulations for health abundantly build upon the principles of first-class transfer, but the adoption of second-class transfer types proves scarce. It is likely to be worthwhile to explore the possibilities of second-class transfer types, as they may considerably influence educational objectives in terms of future serious game design for health. PMID:29175812
MSFC Stream Model Preliminary Results: Modeling Recent Leonid and Perseid Encounters
NASA Technical Reports Server (NTRS)
Cooke, William J.; Moser, Danielle E.
2004-01-01
The cometary meteoroid ejection model of Jones and Brown (1996b) was used to simulate ejection from comets 55P/Tempel-Tuttle during the last 12 revolutions, and the last 9 apparitions of 109P/Swift-Tuttle. Using cometary ephemerides generated by the Jet Propulsion Laboratory's (JPL) HORIZONS Solar System Data and Ephemeris Computation Service, two independent ejection schemes were simulated. In the first case, ejection was simulated in 1 hour time steps along the comet's orbit while it was within 2.5 AU of the Sun. In the second case, ejection was simulated to occur at the hour the comet reached perihelion. A 4th order variable step-size Runge-Kutta integrator was then used to integrate meteoroid position and velocity forward in time, accounting for the effects of radiation pressure, Poynting-Robertson drag, and the gravitational forces of the planets, which were computed using JPL's DE406 planetary ephemerides. An impact parameter was computed for each particle approaching the Earth to create a flux profile, and the results compared to observations of the 1998 and 1999 Leonid showers, and the 1993 and 2004 Perseids.
MSFC Stream Model Preliminary Results: Modeling Recent Leonid and Perseid Encounters
NASA Astrophysics Data System (ADS)
Moser, Danielle E.; Cooke, William J.
2004-12-01
The cometary meteoroid ejection model of Jones and Brown [Physics, Chemistry, and Dynamics of Interplanetary Dust, ASP Conference Series 104 (1996b) 137] was used to simulate ejection from comets 55P/Tempel-Tuttle during the last 12 revolutions, and the last 9 apparitions of 109P/Swift-Tuttle. Using cometary ephemerides generated by the Jet Propulsion Laboratory’s (JPL) HORIZONS Solar System Data and Ephemeris Computation Service, two independent ejection schemes were simulated. In the first case, ejection was simulated in 1 h time steps along the comet’s orbit while it was within 2.5 AU of the Sun. In the second case, ejection was simulated to occur at the hour the comet reached perihelion. A 4th order variable step-size Runge-Kutta integrator was then used to integrate meteoroid position and velocity forward in time, accounting for the effects of radiation pressure, Poynting-Robertson drag, and the gravitational forces of the planets, which were computed using JPL’s DE406 planetary ephemerides. An impact parameter (IP) was computed for each particle approaching the Earth to create a flux profile, and the results compared to observations of the 1998 and 1999 Leonid showers, and the 1993 and 2004 Perseids.
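A minimal fixed-step version of the integration stage in the two entries above can be written in a few lines of Python. The sketch propagates one meteoroid under solar gravity reduced by a radiation pressure factor β; the papers' variable step sizes, Poynting-Robertson drag, and planetary perturbations are omitted, and the β value and initial state are illustrative only.

```python
import numpy as np

GM_SUN = 1.32712440018e20     # m^3/s^2
BETA = 0.01                   # hypothetical radiation-pressure-to-gravity ratio

def accel(r):
    # Solar gravity reduced by radiation pressure; Poynting-Robertson drag
    # and planetary terms from the papers are omitted in this sketch.
    return -(1.0 - BETA) * GM_SUN * r / np.linalg.norm(r)**3

def rk4_step(r, v, dt):
    """One classical 4th-order Runge-Kutta step for position and velocity."""
    k1r, k1v = v, accel(r)
    k2r, k2v = v + 0.5*dt*k1v, accel(r + 0.5*dt*k1r)
    k3r, k3v = v + 0.5*dt*k2v, accel(r + 0.5*dt*k2r)
    k4r, k4v = v + dt*k3v, accel(r + dt*k3r)
    r_new = r + dt/6.0 * (k1r + 2*k2r + 2*k3r + k4r)
    v_new = v + dt/6.0 * (k1v + 2*k2v + 2*k3v + k4v)
    return r_new, v_new

r = np.array([1.496e11, 0.0, 0.0])        # 1 AU from the Sun
v = np.array([0.0, 29.8e3, 0.0])          # roughly circular heliocentric speed
for _ in range(365):
    r, v = rk4_step(r, v, 86400.0)        # one-day steps for a year
```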
Mantle convection on modern supercomputers
NASA Astrophysics Data System (ADS)
Weismüller, Jens; Gmeiner, Björn; Mohr, Marcus; Waluga, Christian; Wohlmuth, Barbara; Rüde, Ulrich; Bunge, Hans-Peter
2015-04-01
Mantle convection is the cause of plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic of mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures demand an interdisciplinary co-design. Here we report on recent advances of the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups in computer science, mathematics and geophysical applications under the leadership of FAU Erlangen. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection, assessing the impact of small-scale processes on global mantle flow.
Gene regulatory networks: a coarse-grained, equation-free approach to multiscale computation.
Erban, Radek; Kevrekidis, Ioannis G; Adalsteinsson, David; Elston, Timothy C
2006-02-28
We present computer-assisted methods for analyzing stochastic models of gene regulatory networks. The main idea that underlies this equation-free analysis is the design and execution of appropriately initialized short bursts of stochastic simulations; the results of these are processed to estimate coarse-grained quantities of interest, such as mesoscopic transport coefficients. In particular, using a simple model of a genetic toggle switch, we illustrate the computation of an effective free energy Phi and of a state-dependent effective diffusion coefficient D that characterize an unavailable effective Fokker-Planck equation. Additionally we illustrate the linking of equation-free techniques with continuation methods for performing a form of stochastic "bifurcation analysis"; estimation of mean switching times in the case of a bistable switch is also implemented in this equation-free context. The accuracy of our methods is tested by direct comparison with long-time stochastic simulations. This type of equation-free analysis appears to be a promising approach to computing features of the long-time, coarse-grained behavior of certain classes of complex stochastic models of gene regulatory networks, circumventing the need for long Monte Carlo simulations.
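The short-burst idea is easy to demonstrate on a caricature of the problem. In the Python sketch below, a fine-scale simulator (here a cheap SDE integrator standing in for the paper's Gillespie-type stochastic simulations) is run in many appropriately initialized short bursts, and the effective drift and diffusion at a point are estimated from the first two moments of the final states; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def short_burst(x0, t_burst, dt):
    """Stand-in fine-scale simulator (an SDE here; an SSA in the paper)."""
    x = x0
    for _ in range(int(t_burst / dt)):
        x += (x - x**3) * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()
    return x

def estimate_coefficients(x0, n_replicas=2000, t_burst=0.05, dt=1e-3):
    """Coarse-grained drift and diffusion from an ensemble of short bursts."""
    finals = np.array([short_burst(x0, t_burst, dt) for _ in range(n_replicas)])
    drift = (finals.mean() - x0) / t_burst       # effective drift at x0
    diff = finals.var() / (2.0 * t_burst)        # effective diffusion D(x0)
    return drift, diff

for x0 in (-1.0, 0.0, 1.0):
    print(x0, estimate_coefficients(x0))
```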
A fast exact simulation method for a class of Markov jump processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yao, E-mail: yaoli@math.umass.edu; Hu, Lili, E-mail: lilyhu86@gmail.com
2015-11-14
A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.
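A loose Python sketch of the bucket-sort ingredient is given below: all exponential-clock firing times that land inside one leap window [t0, t0 + τ) are hashed into buckets and then visited in order. This shows only the hashing/leaping bookkeeping, not the full HLM event loop with clock resets or the per-event cost analysis.

```python
import numpy as np

rng = np.random.default_rng(2)

def leap(firing_times, t0, tau, n_buckets=64):
    """Hash all firing times inside [t0, t0 + tau) into buckets, then
    visit the buckets in order -- a loose sketch of the HLM idea."""
    buckets = [[] for _ in range(n_buckets)]
    for clock_id, t in enumerate(firing_times):
        if t0 <= t < t0 + tau:
            buckets[int((t - t0) / tau * n_buckets)].append((t, clock_id))
    for bucket in buckets:
        for t, clock_id in sorted(bucket):   # buckets are small, sorting is cheap
            yield t, clock_id

# 10,000 exponential clocks; visit the events of one leap window in order.
times = rng.exponential(1.0, size=10_000)
for t, cid in leap(times, 0.0, 0.05):
    pass  # here the jump for clock `cid` would be executed and its clock reset
```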
NASA Astrophysics Data System (ADS)
Cardall, Christian Y.; Budiardja, Reuben D.
2018-01-01
The large-scale computer simulation of a system of physical fields governed by partial differential equations requires some means of approximating the mathematical limit of continuity. For example, conservation laws are often treated with a 'finite-volume' approach in which space is partitioned into a large number of small 'cells,' with fluxes through cell faces providing an intuitive discretization modeled on the mathematical definition of the divergence operator. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of simple meshes and the evolution of generic conserved currents thereon, along with individual 'unit test' programs and larger example problems demonstrating their use. These classes inaugurate the Mathematics division of our developing astrophysics simulation code GENASIS (General Astrophysical Simulation System), which will be expanded over time to include additional meshing options, mathematical operations, solver types, and solver variations appropriate for many multiphysics applications.
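The finite-volume pattern the entry describes (cell averages updated by fluxes through cell faces) can be shown in a few lines. The Python sketch below advances 1D linear advection with upwind fluxes on a periodic grid; GENASIS's Fortran 2003 class hierarchy, generality, and solvers are of course not represented.

```python
import numpy as np

# 1D finite-volume update for u_t + (a u)_x = 0 with upwind fluxes.
n, a = 200, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a                          # CFL-stable step
x = np.linspace(0.0, 1.0, n)
u = np.where(np.abs(x - 0.5) < 0.1, 1.0, 0.0)   # square pulse of cell averages

for _ in range(100):
    flux = a * u                           # upwind flux at each cell's right face (a > 0)
    flux_left = np.roll(flux, 1)           # flux entering through the left face (periodic)
    u = u - dt / dx * (flux - flux_left)   # conserved update: flux in minus flux out
```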
Atomistic Structure, Strength, and Kinetic Properties of Intergranular Films in Ceramics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garofalini, Stephen H
2015-01-08
Intergranular films (IGFs) present in polycrystalline oxide and nitride ceramics provide an excellent example of nanoconfined glasses that occupy only a small volume percentage of the bulk ceramic, but can significantly influence various mechanical, thermal, chemical, and optical properties. By employing molecular dynamics computer simulations, we have been able to predict structures and the locations of atoms at the crystal/IGF interface that were subsequently verified with the newest electron microscopies. Modification of the chemistry of the crystal surface in the simulations provided the necessary mechanism for adsorption of specific rare earth ions from the IGF in the liquid state to the crystal surface. Such results had eluded other computational approaches, such as ab initio calculations, because of the need to include not only the modified chemistry of the crystal surfaces but also an accurate description of the adjoining glassy IGF. This segregation of certain ions from the IGF to the crystal caused changes in the local chemistry of the IGF that affected fracture behavior in the simulations. Additional work with the rare earth ions La and Lu in silicon oxynitride IGFs showed the mechanisms for their different effects on crystal growth, even though both types of ions are seen adhering to a bounding crystal surface, which would normally imply equivalent effects on grain growth.
Transient thermal modeling of the nonscanning ERBE detector
NASA Technical Reports Server (NTRS)
Mahan, J. R.
1983-01-01
A numerical model to predict the transient thermal response of the ERBE nonscanning wide field of view total radiometer channel was developed. The model, which uses Monte Carlo techniques to characterize the radiative component of heat transfer, is described and a listing of the computer program is provided. Application of the model to simulate the actual blackbody calibration procedure is discussed. The use of the model to establish a real time flight data interpretation strategy is recommended. Modification of the model to include a simulated Earth radiation source field and a filter dome is indicated.
ATHLETE: Trading Complexity for Mass in Roving Vehicles
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
2013-01-01
This paper describes a scaling analysis of ATHLETE for exploration of the moon, Mars and Near-Earth Asteroids (NEAs) in comparison to a more conventional vehicle configuration. Recently, the focus of human exploration beyond LEO has been on NEAs. A low gravity testbed has been constructed in the ATHLETE lab, with six computer-controlled winches able to lift ATHLETE and payloads so as to simulate the motion of the system in the vicinity of a NEA or to simulate ATHLETE on extreme terrain in lunar or Mars gravity. Test results from this system are described.
PACES Participation in Educational Outreach Programs at the University of Texas at El Paso
NASA Technical Reports Server (NTRS)
Dodge, Rebecca L.
1997-01-01
The University of Texas at El Paso (UTEP) is involved in several initiatives to improve science education within the El Paso area public schools. These include outreach efforts into the K-12 classrooms; training programs for in-service teachers; and the introduction of a strong science core curriculum within the College of Education. The Pan American Center for Earth and Environmental Studies (PACES), a NASA-funded University Research Center, will leverage off the goals of these existing initiatives to provide curriculum support materials at all levels. We will use currently available Mission to Planet Earth (MTPE) materials as well as new materials developed specifically for this region, in an effort to introduce the Earth System Science perspective into these programs. In addition, we are developing curriculum support materials and classes within the Geology and Computer Departments, to provide education in the area of remote sensing and GIS applications at the undergraduate and graduate levels.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction of simulators for a large class of stochastic processes, hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25-degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. Application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25-degree configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high throughput I/O in CESM, based on the Parallel IO Library (PIO), is being further augmented to support even higher resolutions and enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow to facilitate high-resolution climate modelling. A high-speed center-wide parallel file system, called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archiving is facilitated by the HPSS storage system. The Earth System Grid (ESG), featuring search and discovery, is also used to deliver data. The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
Multicategory Composite Least Squares Classifiers
Park, Seo Young; Liu, Yufeng; Liu, Dacheng; Scholl, Paul
2010-01-01
Classification is a very useful statistical tool for information extraction. In particular, multicategory classification is commonly seen in various applications. Although binary classification problems are heavily studied, extensions to the multicategory case are much less so. In view of the increased complexity and volume of modern statistical problems, it is desirable to have multicategory classifiers that are able to handle problems with high dimensions and with a large number of classes. Moreover, it is necessary to have sound theoretical properties for the multicategory classifiers. In the literature, there exist several different versions of simultaneous multicategory Support Vector Machines (SVMs). However, the computation of the SVM can be difficult for large scale problems, especially for problems with a large number of classes. Furthermore, the SVM cannot produce class probability estimates directly. In this article, we propose a novel efficient multicategory composite least squares classifier (CLS classifier), which utilizes a new composite squared loss function. The proposed CLS classifier has several important merits: efficient computation for problems with a large number of classes, asymptotic consistency, ability to handle high-dimensional data, and simple conditional class probability estimation. Our simulated and real examples demonstrate competitive performance of the proposed approach. PMID:21218128
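The flavor of a least-squares multicategory classifier can be conveyed with a generic sketch; note that this is a plain one-vs-rest regularized least squares baseline, not the authors' composite squared loss. Each class weight vector comes from one closed-form linear solve, which is what makes this family of methods cheap when the number of classes is large.

```python
import numpy as np

def fit_ls_classifier(X, y, n_classes, lam=1e-2):
    """One-vs-rest regularized least squares: one closed-form solve per class."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append an intercept column
    Y = np.eye(n_classes)[y] * 2.0 - 1.0            # +/-1 class coding
    A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
    return np.linalg.solve(A, Xb.T @ Y)             # weights, one column per class

def predict(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ W).argmax(axis=1)                  # highest score wins

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)  # 3 toy classes
W = fit_ls_classifier(X, y, n_classes=3)
print("train accuracy:", (predict(W, X) == y).mean())
```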
Large-Scale NASA Science Applications on the Columbia Supercluster
NASA Technical Reports Server (NTRS)
Brooks, Walter
2005-01-01
Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.
The ab initio simulation of the Earth's core.
Alfè, D; Gillan, M J; Vocadlo, L; Brodholt, J; Price, G D
2002-06-15
The Earth has a liquid outer and solid inner core. It is predominantly composed of Fe, alloyed with small amounts of light elements, such as S, O and Si. The detailed chemical and thermal structure of the core is poorly constrained, and it is difficult to perform experiments to establish the properties of core-forming phases at the pressures (ca. 300 GPa) and temperatures (ca. 5000-6000 K) to be found in the core. Here we present some major advances that have been made in using quantum mechanical methods to simulate the high-P/T properties of Fe alloys, which have been made possible by recent developments in high-performance computing. Specifically, we outline how we have calculated the Gibbs free energies of the crystalline and liquid forms of Fe alloys, and so conclude that the inner core of the Earth is composed of hexagonal close packed Fe containing ca. 8.5% S (or Si) and 0.2% O in equilibrium at 5600 K at the boundary between the inner and outer cores with a liquid Fe containing ca. 10% S (or Si) and 8% O.
Fast rotation of a subkilometer-sized near-Earth object 2011 XA3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urakawa, Seitaro; Ohtsuka, Katsuhito; Abe, Shinsuke
2014-05-01
We present light curve observations and their multiband photometry for near-Earth object (NEO) 2011 XA3. The light curve shows a periodicity of 0.0304 ± 0.0003 days (= 43.8 ± 0.4 minutes). The fast rotation shows that 2011 XA3 is in a state of tension (i.e., a monolithic asteroid) and cannot be held together by self-gravitation. Moreover, the multiband photometric analysis indicates that the taxonomic class of 2011 XA3 is S-complex or V-type. Its estimated effective diameter is 225 ± 97 m (S-complex) or 166 ± 63 m (V-type), respectively. Therefore, 2011 XA3 is a candidate for the second-largest fast-rotating monolithic asteroid. Moreover, the orbital parameters of 2011 XA3 are apparently similar to those of NEO (3200) Phaethon, which is, however, an F/B-type object. We computed the orbital evolutions of 2011 XA3 and Phaethon; the results of the computation and the distinct taxonomy indicate that the two asteroids are not of common origin.
Arrieta-Camacho, Juan José; Biegler, Lorenz T
2005-12-01
Real time optimal guidance is considered for a class of low thrust spacecraft. In particular, nonlinear model predictive control (NMPC) is utilized for computing the optimal control actions required to transfer a spacecraft from a low Earth orbit to a mission orbit. The NMPC methodology presented is able to cope with unmodeled disturbances. The dynamics of the transfer are modeled using a set of modified equinoctial elements because they do not exhibit singularities for zero inclination and zero eccentricity. The idea behind NMPC is the repeated solution of optimal control problems; at each time step, a new control action is computed. The optimal control problem is solved using a direct method, fully discretizing the equations of motion. The large scale nonlinear program resulting from the discretization procedure is solved using IPOPT, a primal-dual interior point algorithm. Stability and robustness characteristics of the NMPC algorithm are reviewed. A numerical example is presented that encourages further development of the proposed methodology: the transfer from low Earth orbit to a Molniya orbit.
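The receding-horizon loop at the heart of NMPC can be sketched on a toy plant. Below, a double integrator stands in for the equinoctial-element dynamics, and scipy.optimize.minimize stands in for IPOPT: at each step an optimal control problem over a short horizon is solved, only the first control is applied, and the shifted solution warm-starts the next solve. All dynamics, costs, and horizons are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

dt, horizon = 0.1, 10

def rollout(x0, controls):
    """Cost of applying a control sequence to the toy plant x'' = u."""
    x = np.array(x0, dtype=float)
    cost = 0.0
    for u in controls:
        x = x + dt * np.array([x[1], u])      # Euler step of the toy dynamics
        cost += x @ x + 0.1 * u * u           # track the origin, penalize effort
    return cost

x = np.array([1.0, 0.0])                      # initial state
u_guess = np.zeros(horizon)
for step in range(50):
    res = minimize(lambda u: rollout(x, u), u_guess)   # solve the OCP at this step
    u_apply = res.x[0]                        # apply only the first control
    x = x + dt * np.array([x[1], u_apply])    # the plant moves one step
    u_guess = np.roll(res.x, -1)              # shifted solution warm-starts next solve
print("final state:", x)
```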
Onboard Algorithms for Data Prioritization and Summarization of Aerial Imagery
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Hayden, David; Thompson, David R.; Castano, Rebecca
2013-01-01
Many current and future NASA missions are capable of collecting enormous amounts of data, of which only a small portion can be transmitted to Earth. Communications are limited due to distance, visibility constraints, and competing mission downlinks. Long missions and high-resolution, multispectral imaging devices easily produce data exceeding the available bandwidth. To address this situation, computationally efficient algorithms were developed for analyzing science imagery onboard the spacecraft. These algorithms autonomously cluster the data into classes of similar imagery, enabling selective downlink of representatives of each class and a map classifying the imaged terrain rather than the full dataset, reducing the volume of the downlinked data. A range of approaches was examined, including k-means clustering using image features based on color, texture, and temporal and spatial arrangement.
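A minimal version of the clustering-for-downlink idea follows, using k-means on stand-in feature vectors and picking the image nearest each cluster center as the class representative to downlink. The feature construction of the actual onboard system is not modeled; this sketch assumes scikit-learn is available.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
features = rng.normal(size=(500, 8))   # stand-in per-image feature vectors
                                       # (color, texture, etc. in the paper)

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)

# Downlink one representative per class: the image nearest each cluster center.
reps = [int(np.argmin(np.linalg.norm(features - c, axis=1)))
        for c in km.cluster_centers_]
print("class map:", km.labels_[:20], "... representatives:", reps)
```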
Simulation of Earth-Moon-Mars Environments for the Assessment of Organ Doses
NASA Astrophysics Data System (ADS)
Kim, M. Y.; Schwadron, N. A.; Townsend, L.; Cucinotta, F. A.
2010-12-01
Space radiation environments for historically large solar particle events (SPE) and galactic cosmic rays (GCR) at solar minimum and solar maximum are simulated in order to characterize exposures to radio-sensitive organs for missions to low-Earth orbit (LEO), moon, and Mars. Primary and secondary particles for SPE and GCR are transported through the respective atmosphere of Earth or Mars, space vehicle, and astronaut’s body tissues using the HZETRN/QMSFRG computer code. In LEO, exposures are reduced compared to deep space because particles are deflected by the Earth’s magnetic field and absorbed by the solid body of the Earth. Geomagnetic transmission function as a function of altitude was applied for the particle flux of charged particles, and the shift of the organ exposures to higher velocity or lower stopping powers compared to those in deep space was analyzed. In the transport through Mars atmosphere, a vertical distribution of atmospheric thickness was calculated from the temperature and pressure data of Mars Global Surveyor, and the directional cosine distribution was implemented to describe the spherically distributed atmospheric distance along the slant path at each altitude. The resultant directional shielding by Mars atmosphere at solar minimum and solar maximum was used for the particle flux simulation at various altitudes on the Martian surface. Finally, atmospheric shielding was coupled with vehicle and body shielding for organ dose estimates. We made predictions of radiation dose equivalents and evaluated acute symptoms at LEO, moon, and Mars at solar minimum and solar maximum.
ERIC Educational Resources Information Center
Siegel-Causey, Ellin; McMorris, Carol; McGowen, Susan; Sands-Buss, Sue
1998-01-01
This case study of a 14-year-old boy with severe disabilities describes the collaboration of a team of educators who sought to include him in eighth-grade general-education classes. His inclusion plan included four steps: planning, selecting classes, accommodating, and collaborating. The accomplishments of the student's inclusion in earth science…
76 FR 9965 - Amendment of Class E Airspace and Revocation of Class E Airspace; Easton, MD
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-23
...-5588. SUPPLEMENTARY INFORMATION: History On October 22, 2010, the FAA published in the Federal Register...] * * * * * Paragraph 6005 Class E Airspace Areas Extending Upward from 700 feet or More Above the Surface of the Earth...]04'08'' W.) That airspace extending upward from 700 feet above the surface of the Earth within a 6.5...
NASA Astrophysics Data System (ADS)
Saha, S.; Basak, S.; Safonova, M.; Bora, K.; Agrawal, S.; Sarkar, P.; Murthy, J.
2018-04-01
Seven Earth-sized planets, known as the TRAPPIST-1 system, were discovered with great fanfare in the last week of February 2017. Three of these planets are in the habitable zone of their star, making them potentially habitable planets (PHPs) a mere 40 light years away. The discovery of the closest potentially habitable planet to us just a year before, Proxima b, and the realization that Earth-type planets in circumstellar habitable zones are a common occurrence provide the impetus for the existing pursuit of life outside the Solar System. The search for life has two goals essentially: looking for planets with Earth-like conditions (Earth similarity) and looking for the possibility of life in some form (habitability). An index was recently developed, the Cobb-Douglas Habitability Score (CDHS), based on the Cobb-Douglas habitability production function (CD-HPF), which computes the habitability score by using measured and estimated planetary parameters. As an initial set, the radius, density, escape velocity and surface temperature of a planet were used. The proposed metric, with exponents accounting for metric elasticity, is endowed with analytical properties that ensure global optima and can be scaled to accommodate a finite number of input parameters. We show here that the model is elastic, and the conditions on elasticity to ensure global maxima can scale as the number of predictor parameters increases. The K-NN (K-Nearest Neighbor) classification algorithm, embellished with probabilistic herding and thresholding restriction, utilizes CDHS scores and labels exoplanets into appropriate classes via feature-learning methods, yielding granular clusters of habitability. The algorithm works on top of a decision-theoretical model using the power of convex optimization and machine learning. The goal is to characterize the recently discovered exoplanets into an "Earth League" and several other classes based on their CDHS values. A second approach, based on a novel feature-learning and tree-building method, classifies the same planets without computing the CDHS of the planets and produces a similar outcome. For this, we use XGBoosted trees. The convergence of the outcomes of the two different approaches indicates the strength of the proposed solution scheme and the likelihood of the potential habitability of the recently announced discoveries.
CD-HPF: New habitability score via data analytic modeling
NASA Astrophysics Data System (ADS)
Bora, K.; Saha, S.; Agrawal, S.; Safonova, M.; Routh, S.; Narasimhamurthy, A.
2016-10-01
The search for life on the planets outside the Solar System can be broadly classified into the following: looking for Earth-like conditions or the planets similar to the Earth (Earth similarity), and looking for the possibility of life in a form known or unknown to us (habitability). The two frequently used indices, Earth Similarity Index (ESI) and Planetary Habitability Index (PHI), describe heuristic methods to score habitability in the efforts to categorize different exoplanets (or exomoons). ESI, in particular, considers Earth as the reference frame for habitability, and is a quick screening tool to categorize and measure physical similarity of any planetary body with the Earth. The PHI assesses the potential habitability of any given planet, and is based on the essential requirements of known life: presence of a stable and protected substrate, energy, appropriate chemistry and a liquid medium. We propose here a different metric, a Cobb-Douglas Habitability Score (CDHS), based on Cobb-Douglas habitability production function (CD-HPF), which computes the habitability score by using measured and estimated planetary input parameters. As an initial set, we used radius, density, escape velocity and surface temperature of a planet. The values of the input parameters are normalized to the Earth Units (EU). The proposed metric, with exponents accounting for metric elasticity, is endowed with analytical properties that ensure global optima, and scales up to accommodate finitely many input parameters. The model is elastic, and, as we discovered, the standard PHI turns out to be a special case of the CDHS. Computed CDHS scores are fed to K-NN (K-Nearest Neighbor) classification algorithm with probabilistic herding that facilitates the assignment of exoplanets to appropriate classes via supervised feature learning methods, producing granular clusters of habitability. The proposed work describes a decision-theoretical model using the power of convex optimization and algorithmic machine learning.
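Read at face value, CD-HPF is a weighted geometric mean of planetary parameters in Earth Units. The sketch below computes such a score with equal, illustrative elasticities summing to one (the constant-returns-to-scale case); the exponents actually used in the papers' optimization are not reproduced here.

```python
import numpy as np

def cdhs(r, d, ve, ts, a=0.25, b=0.25, c=0.25, e=0.25):
    """Cobb-Douglas-style habitability score y = r^a * d^b * ve^c * ts^e.
    Inputs are in Earth Units; a + b + c + e = 1 gives constant returns
    to scale. Equal exponents here are illustrative, not the papers' values."""
    assert abs(a + b + c + e - 1.0) < 1e-9
    return r**a * d**b * ve**c * ts**e

# Earth itself scores 1.0 in Earth Units by construction.
print(cdhs(1.0, 1.0, 1.0, 1.0))
# A hypothetical super-Earth: 1.6 R_E, 1.2 rho_E, 1.75 v_esc,E, 0.9 T_s,E.
print(cdhs(1.6, 1.2, 1.75, 0.9))
```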
NASA Technical Reports Server (NTRS)
Greenstadt, E. W.; Moses, S. L.; Coroniti, F. V.; Farris, M. H.; Russell, C. T.
1993-01-01
ULF waves in Earth's foreshock cause the instantaneous angle θBn between the upstream magnetic field and the shock normal to deviate from its average value. Close to the quasi-parallel (Q-parallel) shock, the transverse components of the waves become so large that the orientation of the field to the normal becomes quasi-perpendicular (Q-perpendicular) during applicable phases of each wave cycle. Large upstream pulses of B were observed completely enclosed in excursions of θBn into the Q-perpendicular range. A recent numerical simulation included θBn among the parameters examined in Q-parallel runs, and described a similar coincidence as intrinsic to a stage in the development of the reformation process of such shocks. Thus, the natural environment of the Q-perpendicular section of Earth's bow shock seems to include an identifiable class of enlarged magnetic pulses for which local Q-perpendicular geometry is a necessary association.
NASA Astrophysics Data System (ADS)
Cook, G. W.
2012-12-01
At the University of California, San Diego, I teach a quarter-long, introductory Earth Science class titled "Volcanoes," which is, in essence, a functional class in volcanology designed specifically for non-majors. This large-format (enrollment ~ 85), lecture-based class provides students from an assortment of backgrounds an opportunity to acquire much-needed (and sometimes dreaded) area credits in science, while also serving as an introduction to the Earth Science major at UCSD (offered through Scripps Institution of Oceanography). The overall goal of the course is to provide students with a stimulating and exciting general science option that, using an inherently interesting topic, introduces them to the fundamentals of geoscience. A secondary goal is to promote general science and geoscience literacy among the general population of UCSD. Student evaluations of this course unequivocally indicate a high degree of learning and interest in the material. The majority of students in the class (>80%) are non-science majors and very few students (<3%) are Earth science degree-seeking students. In addition, only a handful of students have typically had any form of geology class beyond high school level Earth Science. Consequently, there are challenges associated with teaching the class. Perhaps most significantly, students have very little background—background that is necessary for understanding the processes involved in volcanic eruptions. Second, many non-science students have built-in anxieties with respect to math and science, anxieties that must be considered when designing curriculum and syllabi. It is essential to provide the right balance of technical information while remaining in touch with the audience. My approach to the class involves a dynamic lecture format that incorporates a wide array of multimedia, analogue demonstrations of volcanic processes, and small-group discussions of topics and concepts. In addition to teaching about volcanoes—a fascinating subject in and of itself—I take the opportunity in the first two weeks to introduce students to basic geology, including tectonics, earth materials, surface processes, and geologic time. In fact, this is a vital segment of the class, as the students need this background for the latter portions of the class. A side benefit is that students are provided with a "mini" education in geology whether they know it or not and take this knowledge with them into other classes, and ultimately, their futures. Student satisfaction is uniformly very high with this class. 100% of students agreed that the course material was intellectually stimulating; 95% of students agreed that they learned a great deal from the course; 100% of students stated that they would recommend the class to other students. Overall, the class highlights the role that non-major introductory-level geoscience classes, in particular ones with interesting topics, can serve in educating college-level students about Earth Science. They may also serve as a gateway into the Earth Sciences for students who previously had no such inclination.
Nonrecursive formulations of multibody dynamics and concurrent multiprocessing
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Menon, Ramesh
1993-01-01
Since the late 1980s, research in recursive formulations of multibody dynamics has flourished. Historically, much of this research can be traced to applications of low dimensionality in mechanism and vehicle dynamics. Indeed, there is little doubt that recursive order N methods are the method of choice for this class of systems. This approach has the advantage that a minimal number of coordinates is utilized, parallelism can be induced for certain system topologies, and the method is of order N computational cost for systems of N rigid bodies. Despite the fact that many authors have dismissed redundant coordinate formulations as being of order N^3, and hence less attractive than recursive formulations, we present recent research demonstrating that at least three distinct classes of redundant, nonrecursive multibody formulations consistently achieve order N computational cost for systems of rigid and/or flexible bodies. These formulations are as follows: (1) the preconditioned range space formulation; (2) penalty methods; and (3) augmented Lagrangian methods for nonlinear multibody dynamics. The first method can be traced to its foundation in equality constrained quadratic optimization, while the last two methods have been studied extensively in the context of coercive variational boundary value problems in computational mechanics. Until recently, however, they had not been investigated in the context of multibody simulation, and they present theoretical questions unique to nonlinear dynamics. All of these nonrecursive methods have additional advantages with respect to recursive order N methods: (1) the formalisms retain the highly desirable order N computational cost; (2) the techniques are amenable to concurrent simulation strategies; (3) the approaches do not depend upon system topology to induce concurrency; and (4) the methods can be derived to balance the computational load automatically on concurrent multiprocessors. In addition to the presentation of the fundamental formulations, this paper presents new theoretical results regarding the rate of convergence of order N constraint stabilization schemes associated with the newly introduced class of methods.
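Of the three redundant formulations listed, the penalty method is the easiest to illustrate: constraints are replaced by stiff restoring forces, so no constraint equations need be solved and every body can be advanced independently. Below is a minimal Python sketch of that general idea (not the paper's formulation) for a planar pendulum treated as a free point mass; the stiffness k, damping c, and time step are illustrative values invented here.

```python
# A minimal penalty-method sketch: a planar pendulum modeled as a free
# point mass plus a stiff penalty force enforcing the length constraint
# g(q) = |q| - L = 0. Parameter values are illustrative assumptions.
import numpy as np

L, m, g0 = 1.0, 1.0, 9.81       # rod length, mass, gravity
k, c = 1.0e6, 1.0e3             # penalty stiffness and damping (assumed)
q = np.array([L, 0.0])          # start horizontal
v = np.zeros(2)
dt = 1.0e-4

for step in range(int(2.0 / dt)):
    r = np.linalg.norm(q)
    n = q / r                            # constraint gradient direction
    gval = r - L                         # constraint violation
    gdot = n @ v                         # violation rate
    F = -m * g0 * np.array([0.0, 1.0])   # gravity (downward along -y)
    F += -(k * gval + c * gdot) * n      # penalty restoring force
    v += dt * F / m                      # semi-implicit Euler step
    q += dt * v

print("final |q| - L =", np.linalg.norm(q) - L)  # stays near zero
```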
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kumar, U.; Nemani, R. R.; Kalia, S.; Michaelis, A.
2016-12-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of substrate (S), vegetation (V), and dark object (D) classes. Because of the sheer volume of data and the associated compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into four classes, namely forest, farmland, water, and urban areas (with NPP-VIIRS, the National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite, nighttime lights data) over California, USA, using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91% was achieved, a 6% improvement of unmixing-based classification over per-pixel based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis, and societal and policy-relevant applications needed at the watershed scale.
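For readers unfamiliar with fully constrained unmixing, the sketch below shows the core computation on a single pixel: nonnegative abundances via NNLS, with the sum-to-one constraint imposed approximately by an appended, heavily weighted row (a common implementation trick). It is a toy stand-in for the NEX pipeline described above, and the endmember spectra are invented.

```python
# Fully constrained least-squares unmixing sketch: nonnegativity via NNLS,
# sum-to-one via delta-augmentation. Endmember spectra are toy values.
import numpy as np
from scipy.optimize import nnls

E = np.array([[0.30, 0.05, 0.02],   # columns: substrate, vegetation, dark
              [0.35, 0.08, 0.02],   # rows: spectral bands (invented)
              [0.40, 0.45, 0.03],
              [0.45, 0.50, 0.03]])
delta = 1e3                          # weight of the sum-to-one row
E_aug = np.vstack([E, delta * np.ones(E.shape[1])])

pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]  # synthetic mixture
b_aug = np.append(pixel, delta)

abund, _ = nnls(E_aug, b_aug)
print(abund, abund.sum())            # ~ [0.6, 0.3, 0.1], sums to ~1
```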
Future remote-sensing programs
NASA Technical Reports Server (NTRS)
Schweickart, R. L.
1975-01-01
User requirements and methods developed to fulfill them are discussed. Quick-look data, data storage on computer-compatible tape, and an integrated capability for production of images from the whole class of earth-viewing satellites are among the new developments briefly described. The increased capability of LANDSAT-C and Nimbus G and the needs of specialized applications such as urban land use planning, cartography, accurate measurement of small agricultural fields, thermal mapping, and coastal zone management are examined. The effect of the space shuttle on remote sensing technology through increased capability is considered.
The ERTS-1 investigation (ER-600). Volume 3: ERTS-1 forest analysis
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The Forest Analysis Team of the Lyndon B. Johnson Space Center Earth Observations Division conducted a year's investigation of LANDSAT 1 multispectral data to determine the size of forest features that could be detected and to determine the suitability for making forest classification maps. The Sam Houston National Forest of Texas was used as the test site. Using conventional interpretation and computer aided techniques, the team was able to differentiate up to 14 classes of forest features to an accuracy ranging between 55 and 84 percent.
Multistage classification of multispectral Earth observational data: The design approach
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Muasher, M. J.; Landgrebe, D. A.
1981-01-01
An algorithm is proposed which predicts the optimal features at every node in a binary tree procedure. The algorithm estimates the probability of error by approximating the area under the likelihood ratio function for two classes, taking into account the number of training samples used in estimating each of these two classes. Some results on feature selection techniques, particularly in the presence of a very limited set of training samples, are presented. Probabilities of error predicted by the proposed algorithm as a function of dimensionality are compared with experimental observations for aircraft and LANDSAT data. Results are obtained for both real and simulated data. Finally, two binary tree examples which use the algorithm are presented to illustrate the usefulness of the procedure.
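As a concrete (if simplified) stand-in for the node-level computation, the sketch below estimates a two-class probability of error for each candidate feature from training statistics, using the Bhattacharyya bound for one-dimensional Gaussian class models, and selects the feature with the smallest estimate. The bound, equal priors, and data are assumptions for illustration, not the paper's exact likelihood-ratio-area approximation.

```python
# Per-node feature selection sketch: pick the feature whose estimated
# two-class error (Bhattacharyya bound, Gaussian fit) is smallest.
import numpy as np

def bhattacharyya_error_bound(x1, x2):
    """Upper bound sqrt(p1*p2)*exp(-B) on two-class error, 1-D Gaussian fit."""
    m1, m2 = x1.mean(), x2.mean()
    v1, v2 = x1.var(ddof=1), x2.var(ddof=1)
    B = 0.25 * (m1 - m2) ** 2 / (v1 + v2) \
        + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2)))
    return 0.5 * np.exp(-B)          # assumes equal priors p1 = p2 = 0.5

rng = np.random.default_rng(0)
class1 = rng.normal([0.0, 0.0], [1.0, 1.0], size=(100, 2))  # 100 samples, 2 features
class2 = rng.normal([2.0, 0.1], [1.0, 1.0], size=(100, 2))

bounds = [bhattacharyya_error_bound(class1[:, j], class2[:, j])
          for j in range(2)]
print("per-feature error bounds:", bounds)   # feature 0 separates best
print("choose feature", int(np.argmin(bounds)))
```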
Characterization of Earth as an exoplanet on the basis of VIRTIS-Venus Express data analysis.
NASA Astrophysics Data System (ADS)
Oliva, Fabrizio; Piccioni, Giuseppe; D'Aversa, Emiliano; Bellucci, Giancarlo; Sindoni, Giuseppe; Grassi, Davide; Filacchione, Gianrico; Tosi, Federico; Capaccioni, Fabrizio
2017-04-01
The Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS, Piccioni et al., 2007) on board the Venus Express spacecraft observed the planet Earth several times in the course of the mission. In particular, a subset of 48 observations was taken from a distance at which our planet is imaged at sub-pixel size, as exoplanets are observed using current technologies. We studied this full subset to understand which spectral signatures, related to different surface and cloud types, can be identified from the integrated planet spectrum. As expected, we found that the cloud coverage has a key role in the identification of surface features and that vegetation is very difficult to detect. To validate our results we built a simple tool capable of simulating observations of an Earth-like planet as seen by a VIRTIS-like spectrometer in the 0.3-5.0 μm range. The illumination and viewing geometries, along with the spectrometer's instantaneous field of view, spectral grid, and sampling, can be defined by the user. The spectral endmembers used to generate the planet were selected from an observation of Earth acquired by the similar VIRTIS instrument on board ESA's Rosetta mission during the spacecraft's third Earth flyby in November 2009. Hence, we simulated planets made of vegetation, desert, ocean, water ice clouds, and liquid water clouds. Using different amounts of each spectral class, we inferred the percentages required to identify each class when all the spectral information is integrated into a single pixel. The outcome of this analysis confirms that clouds are not a negligible issue in the search for spectral signatures, in particular those related to the habitability of a planet and its climate conditions, even when the cloud coverage is not very high. Acknowledgements: This study has been performed within the WOW project financed by INAF and thanks to the support from the Italian Space Agency to VIRTIS Venus Express and Rosetta. References: Piccioni, G., et al., 2007. VIRTIS: The Visible and Infrared Thermal Imaging Spectrometer. ESA Special Publication, SP-1295, 1-27.
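The disk-integration idea is easy to demonstrate: the unresolved-planet spectrum is an area-fraction-weighted sum of endmember spectra. The toy sketch below (not the WOW project tool) uses invented placeholder spectra rather than VIRTIS data, but shows how even a modest cloud fraction dilutes a surface feature such as the vegetation red edge.

```python
# Disk-integrated spectrum as an area-weighted mixture of endmembers.
# All spectra below are invented placeholders, not VIRTIS measurements.
import numpy as np

wavelength = np.linspace(0.3, 5.0, 470)            # microns
endmembers = {
    "ocean":      0.05 + 0.02 * np.exp(-wavelength),
    "desert":     0.25 + 0.05 * wavelength / 5.0,
    "vegetation": np.where(wavelength > 0.7, 0.45, 0.08),  # crude red edge
    "cloud":      0.70 * np.ones_like(wavelength),
}
fractions = {"ocean": 0.45, "desert": 0.15, "vegetation": 0.10, "cloud": 0.30}

integrated = sum(fractions[k] * endmembers[k] for k in endmembers)
# With 30% cloud cover, the vegetation red edge is already hard to see:
print(integrated[wavelength < 0.7].mean(), integrated[wavelength > 0.7].mean())
```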
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goal of contributing to the national objective in Science, Technology, Engineering, and Mathematics (STEM). The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in the AIP/IEEE magazine Computing in Science and Engineering. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after an introduction to the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and previously published approaches. Then, after learning about the NASA application and the related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications into computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program, to the point where every major university in the U.S. would use at least one of these case studies in one of its computational courses, and where every NASA scientist and engineer facing a computational challenge (without the resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back solutions and ideas.
Using Adaptive Mesh Refinement to Simulate Storm Surge
NASA Astrophysics Data System (ADS)
Mandli, K. T.; Dawson, C.
2012-12-01
Coastal hazards related to strong storms such as hurricanes and typhoons are among the most frequently recurring and widespread hazards to coastal communities. Storm surges are among the most devastating effects of these storms, and their prediction and mitigation through numerical simulation is of great interest to coastal communities that need to plan for the rise in sea level during these storms. Unfortunately, these simulations require a large amount of resolution in regions of interest to capture relevant effects, resulting in a computational cost that may be intractable. This problem is exacerbated in situations where a large number of similar runs is needed, such as in the design of infrastructure or in forecasting with ensembles of probable storms. One solution to the problem of computational cost is to employ adaptive mesh refinement (AMR) algorithms. AMR functions by decomposing the computational domain into regions whose resolution may vary as time proceeds. Decomposing the domain as the flow evolves makes this class of methods effective at ensuring that computational effort is spent only where it is needed. AMR also places computational resolution where the dynamics of the flow or particular regions of interest, such as harbors, require it, without the user having to anticipate these needs. Many applications have been made practical only by AMR-type algorithms, which allow otherwise intractable simulations to be performed at far less computational expense. Our work involves studying how storm surge simulations can be improved with AMR algorithms. We have implemented the relevant storm surge physics in the GeoClaw package and tested how Hurricane Ike's surge into Galveston Bay and up the Houston Ship Channel compares to available tide gauge data. We will also discuss issues dealing with refinement criteria, optimal resolution and refinement ratios, and inundation.
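A schematic sketch of the flag-and-refine step at the heart of AMR (not GeoClaw itself) is shown below: cells whose solution gradient exceeds a tolerance are flagged for refinement, so a sharp surge-like front receives fine cells while the rest of the domain stays coarse. The tolerance and refinement ratio are illustrative assumptions.

```python
# Flag-and-refine sketch: refine only where the solution gradient is large.
import numpy as np

x = np.linspace(0.0, 1.0, 65)                  # coarse cell edges
eta = np.tanh((x - 0.7) / 0.02)                # surge-like sharp front
tol = 0.5                                      # refinement tolerance (assumed)

grad = np.abs(np.diff(eta)) / np.diff(x)       # per-cell gradient estimate
flagged = grad > tol * grad.max()              # refinement criterion

ratio = 4                                      # refinement ratio (assumed)
fine_cells = flagged.sum() * ratio
total = (~flagged).sum() + fine_cells
print(f"{flagged.sum()} of {flagged.size} coarse cells refined; "
      f"{total} cells vs {flagged.size * ratio} if uniformly fine")
```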
Golfing with protons: using research grade simulation algorithms for online games
NASA Astrophysics Data System (ADS)
Harold, J.
2004-12-01
Scientists have long known the power of simulations. By modeling a system in a computer, researchers can experiment at will, developing an intuitive sense of how the system behaves. The rapid increase in the power of personal computers, combined with technologies such as Flash, Shockwave, and Java, allows us to bring research simulations into the education world by creating exploratory environments for the public. This approach is illustrated by a project funded by a small grant from NSF's Informal Science Education program, through an opportunity that provides education supplements to existing research awards. Using techniques adapted from a magnetospheric research program, several Flash-based interactives have been developed that allow web site visitors to explore the motion of particles in the Earth's magnetosphere. These pieces were folded into a larger Space Weather Center web project at the Space Science Institute (www.spaceweathercenter.org). Rather than presenting these interactives as plasma simulations per se, the research algorithms were used to create games such as "Magneto Mini Golf", where the balls are protons moving in combined electric and magnetic fields. The "holes" increase in complexity, beginning with no fields and progressing towards a simple model of Earth's magnetosphere. The emphasis of the activity is gameplay, but because it is at its core a plasma simulation, users develop an intuitive sense of charged particle motion as they progress. Meanwhile, the pieces contain embedded assessments that are measurable through a database-driven tracking system. Mining that database not only provides helpful usability information but allows us to examine whether users are meeting the learning goals of the activities. We will discuss the development and evaluation results of the project, as well as the potential for these types of activities to shift expectations of what a web site can and should provide educationally.
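A minimal sketch of the kind of particle-motion algorithm such games can wrap is given below: the standard Boris push for a proton in static electric and magnetic fields, which reproduces gyration plus the E-cross-B drift. The field values and time step are arbitrary demo numbers, not those used in the project.

```python
# Boris push for a proton in static E and B fields (demo values only).
import numpy as np

q_m = 9.58e7                        # proton charge-to-mass ratio (C/kg)
E = np.array([0.0, 1.0e-3, 0.0])    # V/m
B = np.array([0.0, 0.0, 5.0e-5])    # T (roughly Earth-surface magnitude)
dt = 1.0e-6

x = np.zeros(3)
v = np.array([1.0e5, 0.0, 0.0])     # m/s

for _ in range(10000):
    v_minus = v + 0.5 * dt * q_m * E          # half electric kick
    t = 0.5 * dt * q_m * B                    # rotation vector
    s = 2.0 * t / (1.0 + t @ t)
    v_prime = v_minus + np.cross(v_minus, t)  # magnetic rotation
    v_plus = v_minus + np.cross(v_prime, s)
    v = v_plus + 0.5 * dt * q_m * E           # second half kick
    x = x + dt * v

print("gyration plus E-cross-B drift:", x)
```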
Toon, Owen B.; Bardeen, Charles G.; Mills, Michael J.; Fan, Tianyi; English, Jason M.; Neely, Ryan R.
2015-01-01
A sectional aerosol model (CARMA) has been developed and coupled with the Community Earth System Model (CESM1). Aerosol microphysics, radiative properties, and interactions with clouds are simulated in the size-resolving model. The model described here uses 20 particle size bins for each aerosol component, including freshly nucleated sulfate particles, as well as mixed particles containing sulfate, primary organics, black carbon, dust, and sea salt. The model also includes five types of bulk secondary organic aerosols with four volatility bins. The overall cost of CESM1-CARMA is approximately 2.6 times as much computer time as the standard three-mode aerosol model in CESM1 (CESM1-MAM3) and twice as much computer time as the seven-mode aerosol model in CESM1 (CESM1-MAM7), using similar gas phase chemistry codes. Aerosol spatial-temporal distributions are simulated and compared with a large set of observations from satellites, ground-based measurements, and airborne field campaigns. Simulated annual average aerosol optical depths are lower than MODIS/MISR satellite observations and AERONET observations by ∼32%; this difference is within the uncertainty of the satellite observations. CESM1/CARMA reproduces sulfate aerosol mass within 8%, organic aerosol mass within 20%, and black carbon aerosol mass within 50% compared with a multiyear average of the IMPROVE/EPA data over the United States, but differences vary considerably at individual locations. Other data sets show similar levels of agreement with the model simulations. The model suggests that, in addition to sulfate, organic aerosols also contribute significantly to aerosol mass in the tropical UTLS, which is consistent with limited data. PMID:27668039
Using Virtual Reality To Bring Your Instruction to Life.
ERIC Educational Resources Information Center
Gaddis, Tony
Prepared by the manager of a virtual reality (VR) laboratory at North Carolina's Haywood Community College, the three papers collected in this document are designed to help instructors incorporate VR into their classes. The first paper reviews the characteristics of VR, defining it as a computer-generated simulation of a three-dimensional…
Learning Science through Computer Games and Simulations
ERIC Educational Resources Information Center
Honey, Margaret A., Ed.; Hilton, Margaret, Ed.
2011-01-01
At a time when scientific and technological competence is vital to the nation's future, the weak performance of U.S. students in science reflects the uneven quality of current science education. Although young children come to school with innate curiosity and intuitive ideas about the world around them, science classes rarely tap this potential.…
Exploring Complex Social Phenomena with Computer Simulations
ERIC Educational Resources Information Center
Berson, Ilene R.; Berson, Michael J.
2007-01-01
In social studies classes, there is a longstanding interest in how societies evolve and change over time. However, as stories of the past unfold, it is often difficult to identify a direct link between causes and effects, so students are forced to accept at face value the interpretations of economists, political scientists, historians,…
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
NASA Astrophysics Data System (ADS)
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength-infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on the fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
NASA Astrophysics Data System (ADS)
Baker Metzler-Winslow, Elizabeth; Terebey, Susan
2018-06-01
This project examines the Class 0/Class I protostar L1527-IRS (hereafter referred to as L1527) in the interest of creating a more accurate computational model. In a Class 0/Class I protostar like L1527, the envelope is massive, the protostar is growing in mass, and the disk is a small fraction of the protostar mass. Recent work based on ALMA data indicates that L1527, located in the constellation Taurus (about 140 parsecs from Earth), has a mass of about 0.44 solar masses. Existing models were able to fit the spectral energy distribution of L1527 by assuming a puffed-up inner disk. However, the inclusion of the puffed-up disk results in a portion of the disk coinciding with the outflow cavities, a physically unsatisfying arrangement. This project tests models that decrease the size of the disk and increase the density of the outflow cavities (hypothesizing that some dust from the walls of the outflow cavities is swept up into the cavity itself) against existing observational data, and finds that these models fit the data relatively well.
Enabling Earth Science: The Facilities and People of the NCCS
NASA Technical Reports Server (NTRS)
2002-01-01
The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly tied to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists can research are often limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.
NASA Astrophysics Data System (ADS)
Sempau, Josep; Wilderman, Scott J.; Bielajew, Alex F.
2000-08-01
A new Monte Carlo (MC) algorithm, the 'dose planning method' (DPM), and its associated computer program for simulating the transport of electrons and photons in radiotherapy-class problems employing primary electron beams, is presented. DPM is intended to be a high-accuracy MC alternative to the current generation of treatment planning codes, which rely on analytical algorithms based on an approximate solution of the photon/electron Boltzmann transport equation. For primary electron beams, DPM is capable of computing 3D dose distributions (in 1 mm³ voxels) which agree to within 1% in dose maximum with widely used and exhaustively benchmarked general-purpose public-domain MC codes in only a fraction of the CPU time. A representative problem, the simulation of 1 million 10 MeV electrons impinging upon a water phantom of 128³ voxels of 1 mm on a side, can be performed by DPM in roughly 3 min on a modern desktop workstation. DPM achieves this performance by employing transport mechanics and electron multiple scattering distribution functions which have been derived to permit long transport steps (of the order of 5 mm) which can cross heterogeneity boundaries. The underlying algorithm is a 'mixed' class simulation scheme, with differential cross sections for hard inelastic collisions and bremsstrahlung events described in an approximate manner to simplify their sampling. The continuous energy loss approximation is employed for energy losses below some predefined thresholds, and photon transport (including Compton, photoelectric absorption and pair production) is simulated in an analogue manner. The δ-scattering method (Woodcock tracking) is adopted to minimize the computational costs of transporting photons across voxels.
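The δ-scattering idea is compact enough to sketch: photons are stepped using the maximum attenuation coefficient of the whole grid, and "virtual" collisions are rejected with probability 1 - μ(x)/μ_max, so steps never have to stop at voxel boundaries. The sketch below illustrates the principle only; the attenuation values are invented, not dosimetric data.

```python
# Woodcock (delta-scattering) tracking sketch across a 1-D voxel phantom.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.02, 0.02, 0.20, 0.02, 0.02])   # per-voxel mu (1/mm), invented
voxel_size, mu_max = 1.0, mu.max()

def depth_of_first_real_collision():
    x = 0.0
    while True:
        x += -np.log(rng.random()) / mu_max      # step with majorant mu_max
        i = int(x / voxel_size)
        if i >= mu.size:
            return None                          # escaped the phantom
        if rng.random() < mu[i] / mu_max:        # accept: real collision
            return x                             # else: virtual, keep going

depths = [depth_of_first_real_collision() for _ in range(20000)]
hits = [d for d in depths if d is not None]
print(f"{len(hits)} collisions, mean depth {np.mean(hits):.2f} mm")
```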
Modeling in the Classroom: An Evolving Learning Tool
NASA Astrophysics Data System (ADS)
Few, A. A.; Marlino, M. R.; Low, R.
2006-12-01
Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations, and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player", is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth science system follows this general format: (1) carefully define the question(s) for which you seek answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) that displays all of the system's central questions, components, relationships, and required inputs. At this stage in the process the conceptual model of the system is complete and a clear understanding of how the system works is achieved. When appropriate software is available, advanced classes can proceed to (5) creating a computer model of the system and testing the conceptual model. Classes lacking these advanced capabilities may view and run models using the free isee Player and shared working models. In any event there is understanding to be gained in every step of the procedure outlined above. You can view some examples at http://www.ruf.rice.edu/~few/. We plan to populate this site with samples of Earth science systems for use in Earth system science education.
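Step (5) of the procedure amounts to the stock-and-flow arithmetic behind tools like STELLA, which is simple enough to sketch directly: a reservoir is integrated by repeated small updates of inflow minus outflow. The example system below (a lake with constant inflow and storage-proportional outflow) is invented for illustration.

```python
# Stock-and-flow sketch: one reservoir, Euler-integrated. Values invented.
stock = 100.0          # initial storage (e.g., km^3 of water)
inflow = 5.0           # constant inflow per year
k_out = 0.08           # outflow rate constant (1/yr)
dt, years = 0.1, 50

t = 0.0
while t < years:
    outflow = k_out * stock
    stock += dt * (inflow - outflow)   # Euler update of the stock
    t += dt

print(f"storage after {years} yr: {stock:.1f} (steady state {inflow/k_out:.1f})")
```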
Undergraduate Course on Global Concerns
NASA Astrophysics Data System (ADS)
Richard, G. A.; Weidner, D. J.
2008-12-01
GEO 311: Geoscience and Global Concerns is an undergraduate course taught at Stony Brook University during each fall semester. The class meets twice per week, with one session consisting of a lecture and the other, an interactive activity in a computer laboratory that engages the students in exploring real world problems. A specific concern or issue serves as a focus during each session. The students are asked to develop answers to a series of questions that engage them in identifying causes of the problem, connections with the Earth system, relationships to other problems, and possible solutions on both a global and local scale. The questions are designed to facilitate an integrated view of the Earth system. Examples of topics that the students explore during the laboratory sessions are: 1) fossil fuel reserves and consumption rates and the effect of their use on climate, 2) alternative sources of energy and associated technologies, such as solar photovoltaics, nuclear energy, tidal power, geothermal energy, and wind power, 3) effects of tsunamis and earthquakes on human populations and infrastructure, 4) climate change, and 5) hurricanes and storms. The selection and scheduling of topics often takes advantage of the occurrence of media attention or events that can serve as case studies. Tools used during the computer sessions include Google Earth, ArcGIS, spreadsheets, and web sites that offer data and maps. The students use Google Earth or ArcGIS to map events such as earthquakes, storms, tsunamis, and changes in the extent of polar ice. Spreadsheets are employed to discern trends in fossil fuel supply and consumption, and to experiment with models that make predictions for the future. We present examples of several of these activities and discuss how they facilitate an understanding of interrelationships within the Earth system.
FSD- FLEXIBLE SPACECRAFT DYNAMICS
NASA Technical Reports Server (NTRS)
Fedor, J. V.
1994-01-01
The Flexible Spacecraft Dynamics and Control program (FSD) was developed to aid in the simulation of a large class of flexible and rigid spacecraft. FSD is extremely versatile and can be used in attitude dynamics and control analysis as well as in-orbit support of deployment and control of spacecraft. FSD has been used to analyze the in-orbit attitude performance and antenna deployment of the RAE and IMP class satellites, and the HAWKEYE, SCATHA, EXOS-B, and Dynamics Explorer flight programs. FSD is applicable to inertially oriented spinning, Earth-oriented, or gravity-gradient stabilized spacecraft. The spacecraft flexibility is treated in a continuous manner (instead of by finite elements) by employing a series of shape functions for the flexible elements. Torsion, bending, and three flexible modes can be simulated for every flexible element. FSD can handle up to ten tubular elements in an arbitrary orientation. FSD is appropriate for studies involving the active control of pointed instruments, with options for digital PID (proportional, integral, derivative) error feedback controllers and control actuators such as thrusters and momentum wheels. The input to FSD is in four parts: 1) Orbit construction - FSD calculates a Keplerian orbit with environmental effects such as drag, magnetic torque, solar pressure, thermal effects, and thruster adjustments; or the user can supply a GTDS format orbit tape for a particular satellite/time-span; 2) Control words - for options such as gravity gradient effects, control torques, and integration ranges; 3) Mathematical descriptions of spacecraft, appendages, and control systems - including element geometry, properties, attitudes, libration damping, tip mass inertia, thermal expansion, magnetic tracking, and gimbal simulation options; and 4) Desired state variables to output, i.e., geometries, bending moments, fast Fourier transform plots, gimbal rotation, filter vectors, etc. All FSD input is of free-format, namelist construction. FSD is written in FORTRAN 77, PASCAL, and MACRO assembler for batch execution and has been implemented on a DEC VAX series computer operating under VMS. The PASCAL and MACRO routines (in addition to the FORTRAN program) are supplied as both source and object code, so the PASCAL compiler is not required for implementation. This program was last updated in 1985.
Roxo, Sónia; de Almeida, José António; Matias, Filipa Vieira; Mata-Lima, Herlander; Barbosa, Sofia
2016-03-01
This paper proposes a multistep approach for creating a 3D stochastic model of total petroleum hydrocarbon (TPH) grade in potentially polluted soils of a deactivated oil storage site, using chemical analysis results as primary or hard data and classes of sensory perception variables as secondary or soft data. First, the statistical relationship between the sensory perception variables (e.g. colour, odour and oil-water reaction) and TPH grade is analysed, after which the sensory perception variable exhibiting the highest correlation is selected (oil-water reaction in this case study). The probabilities of cells belonging to classes of oil-water reaction are then estimated for the entire soil volume using indicator kriging. Next, local histograms of TPH grade for each grid cell are computed, combining the probabilities of belonging to a specific sensory perception indicator class, conditional on the simulated values of TPH grade. Finally, simulated images of TPH grade are generated by using the P-field simulation algorithm, utilising the local histograms of TPH grade for each grid cell. The set of simulated TPH values allows several calculations to be performed, such as average values, local uncertainties and the probability of the TPH grade of the soil exceeding a specific threshold value.
Classical problems in computational aero-acoustics
NASA Technical Reports Server (NTRS)
Hardin, Jay C.
1996-01-01
In the early development of computational aeroacoustics (CAA), the preliminary applications were to classical problems whose known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems, such as direct simulations, acoustic analogies, and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. Scattering problems are considered and simple sources are discussed.
A Simplified Guidance for Target Missiles Used in Ballistic Missile Defence Evaluation
NASA Astrophysics Data System (ADS)
Prabhakar, N.; Kumar, I. D.; Tata, S. K.; Vaithiyanathan, V.
2013-01-01
A simplified guidance scheme for the target missiles used in Ballistic Missile Defence evaluation is presented in this paper. The proposed method has two major components: a Ground Guidance Computation (GGC) and an In-Flight Guidance Computation. The GGC, which runs on the ground, uses a missile model to generate an attitude history in the pitch plane and computes the launch azimuth of the missile to compensate for the effect of Earth rotation. The vehicle follows the pre-launch-computed attitude (theta) history in the pitch plane and applies course corrections in the azimuth plane based on its deviation from the pre-launch-computed azimuth plane. This scheme requires few computations and counters in-flight disturbances, such as wind and gusts, quite efficiently. The simulation results show that the proposed method provides satisfactory performance and robustness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, M.J.; Brawer, S.A.
1982-07-02
The local structure at individual ion sites in simple and multicomponent glasses is simulated using methods of molecular dynamics. Computer simulations of fluoroberyllate glasses predict a range of ion separations and coordination numbers that increases with increasing complexity of the glass composition. This occurs at both glass forming and glass modifying cation sites. Laser-induced fluorescence line-narrowing techniques provide a unique probe of the local environments of selected subsets of ions and are used to measure site to site variations in the electronic energy levels and transition probabilities of rare earth ions. These and additional results from EXAFS, neutron and x-ray diffraction, and NMR experiments are compared with simulated glass structures.
VizieR Online Data Catalog: Simulation data for 50 planetary model systems (Hansen+, 2015)
NASA Astrophysics Data System (ADS)
Hansen, B. M. S.; Murray, N.
2017-11-01
We have used the results (after 10 Myr of evolution) of 50 model realizations of the 20 M_Earth rocky planet systems from Hansen & Murray (2013ApJ...775...53H) to define the initial state of our systems, given in Table A1. We assume all the planets are of terrestrial class, in the sense that they obey terrestrial tidal dissipation, and evolve them for 10 Gyr according to our model for tidal+secular evolution. The final configurations are given in Table A2. (2 data files).
An Educational Model for Hands-On Hydrology Education
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.
2014-12-01
This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practice, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration, and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics, including uncertainty analysis and ensemble simulation. Both models have been used in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
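To indicate the level of model the toolbox teaches, here is a schematic, HBV-flavored single-bucket rainfall-runoff sketch; it is not the HBV-EDU code, and all parameters and forcing data are invented.

```python
# Conceptual bucket-model sketch: soil moisture accounting plus a linear
# groundwater reservoir. Parameters and forcings are invented examples.
import numpy as np

rng = np.random.default_rng(7)
precip = rng.gamma(0.5, 4.0, size=365)     # daily precipitation (mm)
pet = 2.0 + 1.5 * np.sin(np.arange(365) * 2 * np.pi / 365)  # potential ET

fc, beta, k = 150.0, 2.0, 0.05             # field capacity, shape, recession
soil, gw = 50.0, 20.0                      # initial storages (mm)
runoff = np.zeros(365)

for d in range(365):
    recharge = precip[d] * (soil / fc) ** beta   # wetter soil -> more recharge
    et = pet[d] * min(soil / fc, 1.0)            # moisture-limited ET
    soil = max(soil + precip[d] - recharge - et, 0.0)
    gw += recharge
    runoff[d] = k * gw                           # linear-reservoir outflow
    gw -= runoff[d]

print(f"annual precipitation {precip.sum():.0f} mm, runoff {runoff.sum():.0f} mm")
```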
Computational Earth Science: Big Data Transformed Into Insight
NASA Astrophysics Data System (ADS)
Sellars, Scott; Nguyen, Phu; Chu, Wei; Gao, Xiaogang; Hsu, Kuo-lin; Sorooshian, Soroosh
2013-08-01
More than ever in the history of science, researchers have at their fingertips an unprecedented wealth of data from continuously orbiting satellites, weather monitoring instruments, ecological observatories, seismic stations, moored buoys, floats, and even model simulations and forecasts. With just an internet connection, scientists and engineers can access atmospheric and oceanic gridded data and time series observations, seismographs from around the world, minute-by-minute conditions of the near-Earth space environment, and other data streams that provide information on events across local, regional, and global scales. These data sets have become essential for monitoring and understanding the associated impacts of geological and environmental phenomena on society.
Comprehensive evaluation of attitude and orbit estimation using real earth magnetic field data
NASA Technical Reports Server (NTRS)
Deutschmann, Julie; Bar-Itzhack, Itzhack
1997-01-01
A single, augmented extended Kalman filter (EKF), which simultaneously and autonomously estimates spacecraft attitude and orbit, was developed and tested with simulated and real magnetometer and rate data. Since the earth's magnetic field is a function of time and position, and since time is accurately known, the differences between the computed and measured magnetic field components throughout the spacecraft's entire orbit are a function of orbit and attitude errors. These differences can be used to estimate the orbit and attitude. The test results of the EKF with magnetometer and gyro data from three NASA satellites are presented and evaluated.
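The measurement-update idea in the abstract, a residual between measured and model-computed magnetic field correcting the state, is sketched below in stripped-down form. The state vector, placeholder field model, and covariances are toy assumptions, not the flight filter, and the Jacobian is formed numerically for brevity.

```python
# EKF measurement-update sketch with a toy 2-state model and a placeholder
# "magnetic field" measurement function h(x).
import numpy as np

def h(state):
    """Placeholder field model: maps state to a 3-vector 'B' prediction."""
    return np.array([np.sin(state[0]), np.cos(state[1]), state[0] * state[1]])

x = np.array([0.1, 0.2])                 # state estimate (toy)
P = np.eye(2) * 0.5                      # state covariance
R = np.eye(3) * 1e-4                     # magnetometer noise covariance
z = np.array([0.15, 0.97, 0.03])         # "measured" field components

# Numerical Jacobian H = dh/dx about the current estimate
eps = 1e-6
H = np.column_stack([(h(x + eps * np.eye(2)[:, j]) - h(x)) / eps
                     for j in range(2)])

y = z - h(x)                             # innovation: measured minus computed B
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
x = x + K @ y
P = (np.eye(2) - K @ H) @ P
print("updated state:", x)
```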
ERIC Educational Resources Information Center
Agne, Russell M.
1972-01-01
Students in classes using a self-instructional unit on meteorology and climatology which provided research data from which generalizations could be drawn increased their critical thinking skills more than groups using conventional earth science texts but did not differ significantly in performance on an achievement test. (AL)
Large-Eddy Simulations of Dust Devils and Convective Vortices
NASA Astrophysics Data System (ADS)
Spiga, Aymeric; Barth, Erika; Gu, Zhaolin; Hoffmann, Fabian; Ito, Junshi; Jemmett-Smith, Bradley; Klose, Martina; Nishizawa, Seiya; Raasch, Siegfried; Rafkin, Scot; Takemi, Tetsuya; Tyler, Daniel; Wei, Wei
2016-11-01
In this review, we address the use of numerical computations called Large-Eddy Simulations (LES) to study dust devils, and the more general class of atmospheric phenomena they belong to (convective vortices). We describe the main elements of the LES methodology. We review the properties, statistics, and variability of dust devils and convective vortices resolved by LES in both terrestrial and Martian environments. The current challenges faced by modelers using LES for dust devils are also discussed in detail.
Multispectral image fusion using neural networks
NASA Technical Reports Server (NTRS)
Kagel, J. H.; Platt, C. A.; Donaven, T. W.; Samstad, E. A.
1990-01-01
A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.
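As a rough illustration of per-pixel band fusion with a small network (weights are random here, where a real system would train them, and the prototype above uses dedicated neural network hardware), the sketch below compresses seven input bands into a single fused channel.

```python
# Per-pixel band-fusion sketch: a tiny two-layer network applied to every
# pixel of a 7-band image. Weights are random placeholders, not trained.
import numpy as np

rng = np.random.default_rng(3)
bands = rng.random((7, 64, 64))               # 7 spectral bands, 64x64 image

W1 = rng.normal(0, 0.5, (8, 7))               # hidden layer: 8 units
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (1, 8))               # output: 1 fused channel
b2 = np.zeros(1)

pixels = bands.reshape(7, -1).T               # (4096, 7): one row per pixel
hidden = np.tanh(pixels @ W1.T + b1)
fused = (hidden @ W2.T + b2).reshape(64, 64)  # fused single-band image
print(fused.shape, float(fused.min()), float(fused.max()))
```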
Infrastructure for Training and Partnershipes: California Water and Coastal Ocean Resources
NASA Technical Reports Server (NTRS)
Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John
2000-01-01
The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modem file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media in addition to a 622Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) Portable Batch System (PBS) computational task scheduler, and vi) two Compaq/Digital Alpha XP1000 compute servers each with 1.5 GB of RAM along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Singh, Upendra N.; Koch, Grady J.; Yu, Jirong; Frehlich, Rod G.
2009-01-01
We present preliminary results of computer simulations of the error in measuring carbon dioxide mixing ratio profiles from earth orbit. The simulated sensor is a pulsed, 2-micron, coherent-detection lidar alternately operating on at least two wavelengths. The simulated geometry is a nadir viewing lidar measuring the column content signal. Atmospheric absorption is modeled using FASCODE3P software with the HITRAN 2004 absorption line data base. Lidar shot accumulation is employed up to the horizontal resolution limit. Horizontal resolutions of 50, 100, and 200 km are shown. Assuming a 400 km spacecraft orbit, the horizontal resolutions correspond to measurement times of about 7, 14, and 28 s. We simulate laser pulse-pair repetition frequencies from 1 Hz to 100 kHz. The range of shot accumulation is 7 to 2.8 million pulse-pairs. The resultant error is shown as a function of horizontal resolution, laser pulse-pair repetition frequency, and laser pulse energy. The effect of different on and off pulse energies is explored. The results are compared to simulation results of others and to demonstrated 2-micron operating points at NASA Langley.
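The shot-accumulation trade described above follows the 1/sqrt(N) averaging of random error. The back-of-envelope sketch below uses the abstract's averaging times together with an assumed PRF and an assumed single-pair error (neither taken from the paper) to show how each horizontal resolution maps to an accumulated-error scale.

```python
# Shot-accumulation scaling sketch: error shrinks like 1/sqrt(N pairs).
import numpy as np

prf_hz = 1.0e3                     # example pulse-pair repetition frequency
single_pair_error = 5.0            # % random error for one pulse pair (assumed)

for res_km, t_s in [(50, 7.0), (100, 14.0), (200, 28.0)]:
    n_pairs = prf_hz * t_s         # shots accumulated over the averaging time
    err = single_pair_error / np.sqrt(n_pairs)
    print(f"{res_km:4d} km -> {n_pairs:8.0f} pairs, error {err:.3f}%")
```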
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing us on planet Earth in the 21st century. Scientists build many models to simulate the past and predict climate change for the coming decades or century. Most of the models run at low resolution, with some targeting high resolution in support of practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA effort on the Climate@Home project to build a virtual supercomputer based on advanced computing technologies, such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching and collecting model run results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations, and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow Twitter and Facebook for the latest news about the project. This paper introduces the latest progress of the project; the operational system will be demonstrated during the AGU fall meeting. It also discusses how this technology can become a trailblazer for other climate studies and relevant sciences, and shares how the challenges in computation and software integration were solved.
A generic biogeochemical module for earth system models
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.
2013-06-01
Physical and biogeochemical processes regulate soil carbon dynamics and the CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into earth system models (e.g., the Community Land Model, CLM), however, currently faces three major challenges: (1) extensive effort is required to modify model structures and rewrite computer programs to incorporate new or updated processes as new knowledge is generated; (2) the computational cost of simulating biogeochemical processes in land models is prohibitively expensive owing to large variations in the rates of biogeochemical processes; and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The framework consists of a new biogeochemical module with a generic algorithm and a reaction database, so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database describes the flow of nutrients through the terrestrial ecosystem in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study is then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
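The "reaction database" idea can be sketched generically: reactions are data (a stoichiometry column plus a rate law), and the ODE right-hand side is assembled as S @ r(c), so adding a process means adding data rather than rewriting solver code. The two-pool decomposition network below is invented for illustration and is not the module's actual database.

```python
# Generic reaction-database sketch: the RHS is stoichiometry times rates.
import numpy as np
from scipy.integrate import solve_ivp

species = ["litterC", "soilC", "CO2"]
# Stoichiometry matrix: columns are reactions, rows are species.
S = np.array([[-1.0,  0.0],    # litter decay consumes litterC
              [ 0.6, -1.0],    # 60% of decayed litter becomes soilC
              [ 0.4,  1.0]])   # the rest (and soilC decay) is respired
k = np.array([0.10, 0.01])     # first-order rate constants (1/yr, assumed)

def rhs(t, c):
    rates = k * c[:2]          # rate law: first order in each source pool
    return S @ rates

sol = solve_ivp(rhs, (0.0, 100.0), [100.0, 50.0, 0.0], rtol=1e-8)
print({s: round(v, 2) for s, v in zip(species, sol.y[:, -1])})
```

Note that each column of S sums to zero, so carbon is conserved by construction; a new process is added by appending a column and a rate entry, with no change to the solver.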
NASA Astrophysics Data System (ADS)
Lanzano, Alexander
2016-10-01
Given recent discoveries, there is a very real potential for tidally locked Earth-like planets to exist in orbit around M stars. To determine whether these planets may be habitable, it is necessary to understand the nature of their atmospheres. In our investigation we simulate the evolution of present-day Earth placed in a tidally locked orbit (meaning the same side of the planet always faces the star) around an M dwarf star. We are particularly interested in the evolution of the planet's ozone layer and whether it will shield the planet, and therefore life, from harmful radiation. To accomplish these objectives we use a state-of-the-art 3-D terrestrial model, the Whole Atmosphere Community Climate Model (WACCM), which fully couples chemistry and climate and therefore allows self-consistent simulations of atmospheric constituents and their effects on a planet's climate, surface radiation, and thus habitability. Preliminary results show that this model is stable and that a tidally locked Earth is protected from the harmful UV radiation produced by G stars. The next step will be to adapt this model for an M star by including its UV and visible spectrum. This investigation will provide insight into the potential for habitable exoplanets and further define the nature of the habitable zones for M-class stars. We will also be able to narrow the definition of the habitable zones around distant stars, which will help us identify these planets in the future. Furthermore, this project will allow for a more thorough analysis of data from past and future exoplanet-observing missions by defining the atmospheric composition of Earth-like planets around a variety of types of stars.
An Invitation to Kitchen Earth Sciences, an Example of MISO Soup Convection Experiment in Classroom
NASA Astrophysics Data System (ADS)
Kurita, K.; Kumagai, I.; Davaille, A.
2008-12-01
In recent frontiers of the earth sciences, such as computer simulation and large-scale observation and experiment, the researchers involved are usually remote from their targets and find it difficult to get a sense of touching the phenomena with their own hands. This leads to a loss of feeling for natural phenomena, particularly among young researchers, which we consider a serious problem. We believe that analog experiments, such as the "kitchen earth science" subjects proposed here, can be a remedy for this. Analog experiments have been used as an important tool in various research fields of earth science, particularly in developing new ideas. The experiment by H. Ramberg using silicone putty is famous for guiding the concept of mantle dynamics. The term "analog" means something not directly related to the target of the research, but for which parallel comparison is possible in an analogical sense. The advantages of analog experiments, however, seem to have been overshadowed by the rapid progress of computer simulation. Although we still believe in their present-day value, we are recognizing another aspect of their significance. The essence of "kitchen earth science" as analog experimentation is that the experimental setups and materials come easily from the kitchen, so that everyone can start experiments and participate in the discussion without special preparation, because the materials are matters of daily experience. Here we show one such example, which can be used as a heuristic subject in classrooms at the introductory level of earth science as well as in the lunch breaks of advanced researchers. In heated miso soup, the fluid motion can be easily traced by the motion of miso "particles". When strongly heated, the immiscible part of the miso convects with the aqueous fluid. At intermediate heating, the miso precipitates to form a sediment layer at the bottom. This layered structure is destroyed regularly, in bursts, by an instability caused by heat accumulated in the miso layer. Using video of this behavior, we will discuss the characteristics of the bursting and its possible implications for understanding layered systems in planetary interiors, in the style of a lunchtime discussion.
[Computer simulation of a clinical magnet resonance tomography scanner for training purposes].
Hackländer, T; Mertens, H; Cramer, B M
2004-08-01
The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner by means of a computer program running on a commercially available computer. The simulation was programmed in pure Java under the GNU General Public License and is freely available for any commercially available computer with a Windows, Macintosh, or Linux operating system. The graphical user interface is modeled on that of a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel, on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Seven classes of pulse sequences are implemented, and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation takes only a few seconds, interactive working is possible. The simulation has been used in university education for more than a year, successfully illustrating the dependence of MR images on the measurement parameters. This should facilitate students' approach to understanding MR imaging in the future.
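The pixel-by-pixel computation such a simulator performs can be indicated with the standard spin-echo signal equation S = PD · (1 - exp(-TR/T1)) · exp(-TE/T2). In the sketch below, the PD/T1/T2 parameter maps are random stand-ins for the clinically derived images the program uses.

```python
# Spin-echo signal sketch: compute an image pixel by pixel from PD/T1/T2
# parameter maps (random placeholders here) and sequence parameters TR/TE.
import numpy as np

rng = np.random.default_rng(5)
PD = rng.uniform(0.2, 1.0, (128, 128))        # proton density map
T1 = rng.uniform(300.0, 1500.0, (128, 128))   # ms
T2 = rng.uniform(40.0, 200.0, (128, 128))     # ms

def spin_echo(TR, TE):
    """Simulated magnitude image for one choice of sequence parameters."""
    return PD * (1.0 - np.exp(-TR / T1)) * np.exp(-TE / T2)

t1_weighted = spin_echo(TR=500.0, TE=15.0)    # short TR/TE -> T1 contrast
t2_weighted = spin_echo(TR=4000.0, TE=100.0)  # long TR/TE -> T2 contrast
print(t1_weighted.mean(), t2_weighted.mean())
```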
NASA Astrophysics Data System (ADS)
Trottier, Olivier; Ganguly, Sujoy; Bowne-Anderson, Hugo; Liang, Xin; Howard, Jonathon
For the last 120 years, the development of neuronal shapes has been of great interest to the scientific community. Over the last 30 years, significant work has been done on the molecular processes responsible for dendritic development. In our ongoing research, we use the class IV sensory neurons of the Drosophila melanogaster larva as a model system to understand the growth of dendritic arbors. Our main goal is to elucidate the mechanisms that the neuron uses to determine the shape of its dendritic tree. We have observed the development of the class IV neuron's dendritic tree in the larval stage and have concluded that morphogenesis is defined by 3 distinct processes: 1) branch growth, 2) branching and 3) branch retraction. As the first step towards understanding dendritic growth, we have implemented these three processes in a computational model. Our simulations are able to reproduce the branch length distribution, number of branches and fractal dimension of the class IV neurons for a small range of parameters.
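A minimal stochastic sketch of the three processes named above is given below; the growth, branching, and retraction rates are arbitrary choices for illustration and are not fitted to class IV neuron data.

```python
# Stochastic dendritic-arbor sketch: tips grow, branch, and retract.
import numpy as np

rng = np.random.default_rng(11)
growth = 0.5            # mean tip elongation per step (microns, assumed)
p_branch = 0.002        # per-branch branching probability per step
p_retract = 0.001       # per-branch retraction probability per step
branches = [0.0]        # list of branch lengths, starting from one tip

for step in range(2000):
    new = []
    for i, length in enumerate(branches):
        length += growth * rng.random()          # 1) branch growth
        if rng.random() < p_branch:              # 2) branching
            new.append(0.0)                      #    a new daughter tip
        if rng.random() < p_retract:             # 3) branch retraction
            length = max(length - 5.0, 0.0)
        branches[i] = length
    branches.extend(new)

lengths = np.array(branches)
print(f"{lengths.size} branches, mean length {lengths.mean():.1f} microns")
```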
The Role of Transfer in Designing Games and Simulations for Health: Systematic Review.
Kuipers, Derek A; Terlouw, Gijs; Wartena, Bard O; van 't Veer, Job Tb; Prins, Jelle T; Pierie, Jean Pierre En
2017-11-24
The usefulness and importance of serious games and simulations in learning and behavior change for health and health-related issues are widely recognized. Studies have addressed games and simulations as interventions, mostly in comparison with their analog counterparts. Numerous complex design choices have to be made with serious games and simulations for health, including choices that directly contribute to the effects of the intervention. One of these decisions is the way an intervention is expected to lead to desirable transfer effects. Most designs adopt a first-class transfer rationale, whereas the second class of transfer types seems a rarity in serious games and simulations for health. This study sought to review the literature specifically on the second class of transfer types in the design of serious games and simulations. Focusing on game-like interventions for health and health care, this study aimed to (1) determine whether the second class of transfer is recognized as a road for transfer in game-like interventions, (2) review the application of the second class of transfer type in designing game-like interventions, and (3) assess studies that include second-class transfer types reporting transfer outcomes. A total of 6 Web-based databases were systematically searched by titles, abstracts, and keywords using the search strategy (video games OR game OR games OR gaming OR computer simulation*) AND (software design OR design) AND (fidelity OR fidelities OR transfer* OR behaviour OR behavior). The databases searched were identified as relevant to health, education, and social science. A total of 15 relevant studies were included, covering a range of game-like interventions, all more or less mentioning design parameters aimed at transfer. We found 9 studies where first-class transfer was part of the design of the intervention. In total, 8 studies dealt with transfer concepts and fidelity types in game-like intervention design in general; 3 studies dealt with the concept of second-class transfer types and reported effects, and 2 of those recognized transfer as a design parameter. In studies on game-like interventions for health and health care, transfer is regarded as a desirable effect but not as a basic principle for design. None of the studies determined the second class of transfer or instances thereof, although in 3 cases a nonliteral transfer type was present. We also found that studies on game-like interventions for health do not elucidate design choices made and rarely provide design principles for future work. Games and simulations for health abundantly build upon the principles of first-class transfer, but the adoption of second-class transfer types proves scarce. It is likely to be worthwhile to explore the possibilities of second-class transfer types, as they may considerably influence educational objectives in terms of future serious game design for health. ©Derek A Kuipers, Gijs Terlouw, Bard O Wartena, Job TB van 't Veer, Jelle T Prins, Jean Pierre EN Pierie. Originally published in JMIR Serious Games (http://games.jmir.org), 24.11.2017.
Time reversal imaging, Inverse problems and Adjoint Tomography
NASA Astrophysics Data System (ADS)
Montagner, J.; Larmat, C. S.; Capdeville, Y.; Kawakatsu, H.; Fink, M.
2010-12-01
With the increasing power of computers and numerical techniques (such as spectral element methods), it is possible to address a new class of seismological problems. The propagation of seismic waves in heterogeneous media is simulated more and more accurately, and new applications are being developed, in particular time reversal methods and adjoint tomography in the three-dimensional Earth. Since the pioneering work of J. Claerbout, theorized by A. Tarantola, many similarities have been found between time-reversal methods, cross-correlation techniques, inverse problems and adjoint tomography. By using normal mode theory, we generalize the scalar approach of Draeger and Fink (1999) and Lobkis and Weaver (2001) to the 3D elastic Earth, to provide a theoretical understanding of the time-reversal method at the global scale. It is shown how to relate time-reversal methods, on the one hand, with auto-correlations of seismograms for source imaging and, on the other hand, with cross-correlations between receivers for structural imaging and retrieval of the Green function. Time-reversal methods have been successfully applied in the past to acoustic waves in many fields, such as medical imaging, underwater acoustics and non-destructive testing, and to seismic waves in seismology for earthquake imaging. In the case of source imaging, time reversal techniques make possible an automatic location in time and space, as well as retrieval of the focal mechanism, of earthquakes or unknown environmental sources. We present here some applications of these techniques at the global scale, on synthetic tests and on real data, such as the Sumatra-Andaman (Dec. 2004) and Haiti (Jan. 2010) events, as well as glacial earthquakes and seismic hum.
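The refocusing principle behind time-reversal source imaging can be illustrated with a one-dimensional scalar analogue (the paper itself works with the full 3D elastic Earth and normal mode theory). In the sketch below, the grid sizes, source and receiver positions are assumed purely for illustration: a wavefield is recorded at two receivers, the time-reversed records are re-injected, and the energy approximately refocuses at the original source location.

```python
import numpy as np

def step(u_prev, u, coef):
    """One leapfrog step of the 1D wave equation, fixed (zero) ends."""
    u_next = np.zeros_like(u)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + coef * (u[2:] - 2 * u[1:-1] + u[:-2]))
    return u_next

nx, nt, c, dx = 400, 1200, 1.0, 1.0
dt = 0.5 * dx / c                 # CFL-stable time step
coef = (c * dt / dx) ** 2
src, receivers = 120, [30, 370]   # assumed source / receiver positions

# Forward run: impulsive source, record seismograms at the receivers
u_prev, u = np.zeros(nx), np.zeros(nx)
u[src] = 1.0
records = np.zeros((nt, len(receivers)))
for it in range(nt):
    records[it] = u[receivers]
    u_prev, u = u, step(u_prev, u, coef)

# Time-reversal run: inject the reversed records back at the receivers
u_prev, u = np.zeros(nx), np.zeros(nx)
peak = np.zeros(nx)
for it in range(nt):
    u[receivers] += records[nt - 1 - it]
    u_prev, u = u, step(u_prev, u, coef)
    peak = np.maximum(peak, np.abs(u))

print("peak refocusing at x =", int(np.argmax(peak)), "(source was at", src, ")")
```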
Cerebral hierarchies: predictive processing, precision and the pulvinar
Kanai, Ryota; Komura, Yutaka; Shipp, Stewart; Friston, Karl
2015-01-01
This paper considers neuronal architectures from a computational perspective and asks what aspects of neuroanatomy and neurophysiology can be disclosed by the nature of neuronal computations? In particular, we extend current formulations of the brain as an organ of inference—based upon hierarchical predictive coding—and consider how these inferences are orchestrated. In other words, what would the brain require to dynamically coordinate and contextualize its message passing to optimize its computational goals? The answer that emerges rests on the delicate (modulatory) gain control of neuronal populations that select and coordinate (prediction error) signals that ascend cortical hierarchies. This is important because it speaks to a hierarchical anatomy of extrinsic (between region) connections that form two distinct classes, namely a class of driving (first-order) connections that are concerned with encoding the content of neuronal representations and a class of modulatory (second-order) connections that establish context—in the form of the salience or precision ascribed to content. We explore the implications of this distinction from a formal perspective (using simulations of feature–ground segregation) and consider the neurobiological substrates of the ensuing precision-engineered dynamics, with a special focus on the pulvinar and attention. PMID:25823866
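The paper's central computational claim, that modulatory gain (precision) determines which ascending prediction errors drive inference, can be stated compactly. The sketch below is a single-level toy, not the paper's simulation; the generative mapping g, the precision values and the learning rate are illustrative assumptions.

```python
import numpy as np

def infer(y, mu_prior=0.0, pi_sensory=4.0, pi_prior=1.0,
          lr=0.05, iters=200):
    """Infer a hidden cause mu of datum y = g(mu) + noise by gradient
    descent on precision-weighted prediction errors; pi_sensory plays
    the role of the modulatory (second-order) gain assigned to the
    ascending (first-order) error signal."""
    g = np.tanh
    dg = lambda m: 1.0 - np.tanh(m) ** 2
    mu = mu_prior
    for _ in range(iters):
        eps_y = y - g(mu)        # sensory prediction error (content)
        eps_p = mu - mu_prior    # prior prediction error
        mu += lr * (pi_sensory * eps_y * dg(mu) - pi_prior * eps_p)
    return mu

# High sensory precision pulls the estimate toward the data;
# low sensory precision leaves it near the prior.
print(infer(0.8, pi_sensory=8.0), infer(0.8, pi_sensory=0.1))
```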
The Cloud Feedback Model Intercomparison Project (CFMIP) contribution to CMIP6
NASA Astrophysics Data System (ADS)
Webb, Mark J.; Andrews, Timothy; Bodas-Salcedo, Alejandro; Bony, Sandrine; Bretherton, Christopher S.; Chadwick, Robin; Chepfer, Hélène; Douville, Hervé; Good, Peter; Kay, Jennifer E.; Klein, Stephen A.; Marchand, Roger; Medeiros, Brian; Pier Siebesma, A.; Skinner, Christopher B.; Stevens, Bjorn; Tselioudis, George; Tsushima, Yoko; Watanabe, Masahiro
2017-01-01
The primary objective of CFMIP is to inform future assessments of cloud feedbacks through improved understanding of cloud-climate feedback mechanisms and better evaluation of cloud processes and cloud feedbacks in climate models. However, the CFMIP approach is also increasingly being used to understand other aspects of climate change, and so a second objective has now been introduced: to improve understanding of circulation, regional-scale precipitation, and non-linear changes. CFMIP is supporting ongoing model inter-comparison activities by coordinating a hierarchy of targeted experiments for CMIP6, along with a set of cloud-related output diagnostics. CFMIP contributes primarily to addressing the CMIP6 questions "How does the Earth system respond to forcing?" and "What are the origins and consequences of systematic model biases?", and supports the activities of the WCRP Grand Challenge on Clouds, Circulation and Climate Sensitivity. A compact set of Tier 1 experiments is proposed for CMIP6 to address the following question: (1) What are the physical mechanisms underlying the range of cloud feedbacks and cloud adjustments predicted by climate models, and which models have the most credible cloud feedbacks? Additional Tier 2 experiments are proposed to address the following questions. (2) Are cloud feedbacks consistent for climate cooling and warming, and if not, why? (3) How do cloud-radiative effects impact the structure, the strength and the variability of the general atmospheric circulation in present and future climates? (4) How do responses in the climate system due to changes in solar forcing differ from changes due to CO2, and is the response sensitive to the sign of the forcing? (5) To what extent is regional climate change per CO2 doubling state-dependent (non-linear), and why? (6) Are climate feedbacks during the 20th century different to those acting on long-term climate change and climate sensitivity? (7) How do regional climate responses (e.g. in precipitation) and their uncertainties in coupled models arise from the combination of different aspects of CO2 forcing and sea surface warming? CFMIP also proposes a number of additional model outputs in the CMIP DECK, CMIP6 Historical and CMIP6 CFMIP experiments, including COSP simulator outputs and process diagnostics, to address the following questions: How well do clouds and other relevant variables simulated by models agree with observations? What physical processes and mechanisms are important for a credible simulation of clouds, cloud feedbacks and cloud adjustments in climate models? Which models have the most credible representations of processes relevant to the simulation of clouds? How do clouds and their changes interact with other elements of the climate system?
NASA Astrophysics Data System (ADS)
Marchand, R.; Purschke, D.; Samson, J.
2013-03-01
Understanding the physics of interaction between satellites and the space environment is essential in planning and exploiting space missions. Several computer models have been developed over the years to study this interaction. In all cases, simulations are carried out in the reference frame of the spacecraft, and effects such as charging and the formation of electrostatic sheaths and wakes are calculated for given conditions of the space environment. In this paper we present a program used to compute magnetic fields and a number of space plasma and space environment parameters relevant to Low Earth Orbit (LEO) spacecraft-plasma interaction modeling. Magnetic fields are obtained from the International Geomagnetic Reference Field (IGRF) and plasma parameters are obtained from the International Reference Ionosphere (IRI) model. All parameters are computed in the spacecraft frame of reference as a function of its six Keplerian elements. They are presented in a format that can be used directly in most spacecraft-plasma interaction models.
Catalogue identifier: AENY_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENY_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 270308
No. of bytes in distributed program, including test data, etc.: 2323222
Distribution format: tar.gz
Programming language: FORTRAN 90
Computer: Non-specific
Operating system: Non-specific
RAM: 7.1 MB
Classification: 19, 4.14
External routines: IRI, IGRF (included in the package)
Nature of problem: Compute magnetic field components, direction of the Sun, Sun visibility factor and approximate plasma parameters in the reference frame of a Low Earth Orbit satellite.
Solution method: Orbit integration, calls to the IGRF and IRI libraries, and transformation of coordinates from the geocentric frame to the spacecraft reference frame.
Restrictions: Low Earth orbits, altitudes between 150 and 2000 km.
Running time: Approximately two seconds to parameterize a full orbit with 1000 points.
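The distributed program is FORTRAN 90, but the coordinate chain it implements can be illustrated compactly. The sketch below (not the distributed code) converts the six classical Keplerian elements to an Earth-centered inertial position and evaluates a centered-dipole field in place of the full IGRF model; the constants and the dipole simplification are assumptions for illustration only.

```python
import numpy as np

B0, RE = 3.12e-5, 6.371e6   # mean equatorial dipole field (T), Earth radius (m)

def kepler_to_eci(a, e, i, raan, argp, nu):
    """Position (m) in Earth-centered inertial axes from the six classical
    Keplerian elements (semi-major axis a in m, angles in radians)."""
    r = a * (1 - e**2) / (1 + e * np.cos(nu))
    x_p, y_p = r * np.cos(nu), r * np.sin(nu)      # perifocal coordinates
    cO, sO = np.cos(raan), np.sin(raan)
    ci, si = np.cos(i), np.sin(i)
    cw, sw = np.cos(argp), np.sin(argp)
    R = np.array([[cO*cw - sO*sw*ci, -cO*sw - sO*cw*ci,  sO*si],
                  [sO*cw + cO*sw*ci, -sO*sw + cO*cw*ci, -cO*si],
                  [sw*si,             cw*si,              ci]])
    return R @ np.array([x_p, y_p, 0.0])

def dipole_field(r_vec):
    """Centered-dipole approximation of the geomagnetic field (T); the
    distributed program evaluates the full IGRF model instead."""
    m = np.array([0.0, 0.0, -B0 * RE**3])          # dipole moment along -z
    r = np.linalg.norm(r_vec)
    rhat = r_vec / r
    return (3.0 * rhat * np.dot(m, rhat) - m) / r**3

# Illustrative 500 km, 51.6-degree orbit
r_eci = kepler_to_eci(a=RE + 500e3, e=0.001, i=np.radians(51.6),
                      raan=0.0, argp=0.0, nu=0.0)
print(dipole_field(r_eci))   # on the order of 3e-5 T at LEO altitude
```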
NASA Astrophysics Data System (ADS)
Newman, David L.
2006-10-01
Kinetic plasma simulations in which the phase-space distribution functions are advanced directly via the coupled Vlasov and Poisson (or Maxwell) equations (better known simply as Vlasov simulations) provide a valuable low-noise complement to the more commonly employed Particle-in-Cell (PIC) simulations. However, in more than one spatial dimension Vlasov simulations become numerically demanding due to the high dimensionality of x-v phase space. Methods that can reduce this computational demand are therefore highly desirable. Several such methods will be presented, which treat the phase-space dynamics along a dominant dimension (e.g., parallel to a beam or current) with the full Vlasov propagator, while employing a reduced description, such as moment equations, for the evolution perpendicular to the dominant dimension. A key difference between the moment-based (and other reduced) methods considered here and standard fluid methods is that the moments are now functions of a phase-space coordinate (e.g., moments of v_y in z-v_z-y phase space, where z is the dominant dimension), rather than functions of spatial coordinates alone. Of course, moment-based methods require closure. For effectively unmagnetized species, new dissipative closure methods inspired by those of Hammett and Perkins [PRL, 64, 3019 (1990)] have been developed, which exactly reproduce the linear electrostatic response for a broad class of distributions with power-law tails, as are commonly measured in space plasmas. The nonlinear response, which requires more care, will also be discussed. For weakly magnetized species, an alternative algorithm has been developed in which the distributions are assumed to gyrate about the magnetic field with a fixed nominal perpendicular "thermal" velocity, thereby reducing the required phase-space dimension by one. These reduced algorithms have been incorporated into 2-D codes used to study the evolution of nonlinear structures such as double layers and electron holes in Earth's auroral zone.
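For context, the full-propagator baseline that such reduced methods aim to accelerate can be written very compactly in one spatial dimension. The sketch below is a standard split-step (semi-Lagrangian) 1D-1V electrostatic Vlasov-Poisson solver for a weak Landau damping test case, not the reduced algorithm of the abstract; grid sizes, time step and the initial perturbation are illustrative assumptions.

```python
import numpy as np

# Grid and parameters (illustrative): weak Landau damping test problem
nx, nv, L, vmax = 64, 128, 4 * np.pi, 6.0
dt, nt = 0.05, 400
x = np.linspace(0, L, nx, endpoint=False)
v = np.linspace(-vmax, vmax, nv)
dv = v[1] - v[0]
kx = 2 * np.pi * np.fft.fftfreq(nx, d=L / nx)

# f(x, v, t=0): Maxwellian with a small density perturbation
f = (1 + 0.01 * np.cos(0.5 * x))[:, None] \
    * np.exp(-v**2 / 2)[None, :] / np.sqrt(2 * np.pi)

def advect_x(f, tau):
    """Free streaming f(x,v) -> f(x - v*tau, v), exact via Fourier shift."""
    fk = np.fft.fft(f, axis=0)
    shift = np.exp(-1j * kx[:, None] * v[None, :] * tau)
    return np.real(np.fft.ifft(fk * shift, axis=0))

def efield(f):
    """Solve dE/dx = 1 - n_e (fixed neutralizing ion background)."""
    rhok = np.fft.fft(1.0 - f.sum(axis=1) * dv)
    Ek = np.zeros_like(rhok)
    Ek[1:] = rhok[1:] / (1j * kx[1:])
    return np.real(np.fft.ifft(Ek))

def advect_v(f, E, tau):
    """Semi-Lagrangian shift f(x,v) -> f(x, v - a*tau), electron a = -E."""
    out = np.empty_like(f)
    for i in range(nx):
        out[i] = np.interp(v + E[i] * tau, v, f[i], left=0.0, right=0.0)
    return out

# Strang-split time loop: x half-step, v full step, x half-step
for _ in range(nt):
    f = advect_x(f, dt / 2)
    f = advect_v(f, efield(f), dt)
    f = advect_x(f, dt / 2)
```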
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, Scott R.
2009-01-01
Increases in computational resources have allowed operational forecast centers to pursue experimental, high resolution simulations that resolve the microphysical characteristics of clouds and precipitation. These experiments are motivated by a desire to improve the representation of weather and climate, but they will also benefit current and future satellite campaigns, which often use forecast model output to guide the retrieval process. The combination of reliable cloud microphysics and radar reflectivity may constrain radiative transfer models used in satellite simulators during future missions, including EarthCARE and the NASA Global Precipitation Measurement mission. Aircraft, surface and radar data from the Canadian CloudSat/CALIPSO Validation Project are used to check the validity of size distribution and density characteristics for snowfall simulated by the NASA Goddard six-class, single-moment bulk water microphysics scheme, currently available within the Weather Research and Forecasting (WRF) Model. Widespread snowfall developed across the region on January 22, 2007, forced by the passage of a mid-latitude cyclone, and was observed by the dual-polarimetric C-band radar at King City, Ontario, as well as the NASA 94 GHz CloudSat Cloud Profiling Radar. Combined, these data sets provide key metrics for validating model output: estimates of size distribution parameters fit to the inverse-exponential equations prescribed within the model, bulk density and crystal habit characteristics sampled by the aircraft, and representation of size characteristics as inferred from the radar reflectivity at C- and W-band. Specified constants for the distribution intercept and density differ significantly from observations throughout much of the cloud depth. Alternate parameterizations are explored, using column-integrated values of vapor excess to avoid problems encountered with temperature-based parameterizations in an environment where inversions and isothermal layers are present. Simulation of CloudSat reflectivity is performed by adopting the discrete-dipole parameterizations and databases provided in the literature, and demonstrates an improved capability in simulating radar reflectivity at W-band versus Mie scattering assumptions.
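Fitting the inverse-exponential form N(D) = N0 exp(-lambda*D) to an observed size spectrum, the comparison underlying the intercept and density evaluation described above, reduces to a log-linear least-squares fit. The sketch below uses synthetic data purely for illustration; units and values are assumptions.

```python
import numpy as np

def fit_exponential_psd(D, N):
    """Fit N(D) = N0 * exp(-lam * D) by least squares in log space,
    using only bins with nonzero counts."""
    mask = N > 0
    slope, intercept = np.polyfit(D[mask], np.log(N[mask]), 1)
    return np.exp(intercept), -slope   # N0 (intercept), lam (slope)

# Synthetic spectrum for illustration: D in mm, N in m^-3 mm^-1
rng = np.random.default_rng(1)
D = np.linspace(0.2, 5.0, 25)
N = 8.0e3 * np.exp(-1.6 * D) * np.exp(rng.normal(0.0, 0.1, D.size))
N0, lam = fit_exponential_psd(D, N)
print(f"N0 = {N0:.3g} m^-3 mm^-1, lambda = {lam:.3g} mm^-1")
```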
The Programming Language Python In Earth System Simulations
NASA Astrophysics Data System (ADS)
Gross, L.; Imranullah, A.; Mora, P.; Saez, E.; Smillie, J.; Wang, C.
2004-12-01
Mathematical models in the earth sciences are based on the solution of systems of coupled, non-linear, time-dependent partial differential equations (PDEs). The spatial and temporal scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault-system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) and the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment Python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we show the basic concepts of escript and how escript is used to implement a simulation code for interacting fault systems. We show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
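A minimal sketch of the two key concepts for a Poisson-type problem, following the conventions of the published escript documentation (esys.escript / esys.finley); exact class names and argument signatures may differ between escript versions, so treat this as illustrative only.

```python
# Solve -div(grad u) = 1 on the unit square with u = 0 on the x0 = 0 face.
from esys.escript import kronecker, whereZero
from esys.escript.linearPDEs import LinearPDE
from esys.finley import Rectangle

domain = Rectangle(40, 40, l0=1., l1=1.)   # 2D finite element mesh from finley
x = domain.getX()

# linearPDE object: coefficients A and Y define -div(A grad u) = Y;
# q marks constrained nodes and r the value imposed there.
pde = LinearPDE(domain)
pde.setValue(A=kronecker(domain), Y=1., q=whereZero(x[0]), r=0.)
u = pde.getSolution()                      # a Data object on the mesh
```

The point of the design is that the same high-level script runs unchanged if finley is swapped for another discretization backend, which is what allows rapid development of new solution algorithms.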
Multiple point statistical simulation using uncertain (soft) conditional data
NASA Astrophysics Data System (ADS)
Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou
2018-05-01
Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, in which more informed model parameters are visited before less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for the integration of information about a reservoir model.
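A preferential simulation path of the kind described can be sketched as follows: rank the grid nodes by the Shannon entropy of their local soft probabilities and visit the most informed (lowest-entropy) nodes first. The function below is an illustrative assumption of how such an ordering could be computed, not the authors' implementation.

```python
import numpy as np

def preferential_path(soft_prob, rng=None):
    """Return a visiting order for the grid nodes: nodes whose soft
    probability distribution has the lowest Shannon entropy (i.e. the
    most informed nodes) come first; ties are broken at random.
    soft_prob has shape (n_nodes, n_categories)."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.clip(soft_prob, 1e-12, 1.0)
    H = -(p * np.log(p)).sum(axis=1)          # entropy per node
    return np.argsort(H + rng.uniform(0.0, 1e-9, H.shape))

# Example: a hard datum (entropy 0) is visited before vaguer soft data
probs = np.array([[0.5, 0.5], [1.0, 0.0], [0.7, 0.3]])
print(preferential_path(probs))   # node 1 comes first
```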