Science.gov

Sample records for energy analysis simulations

  1. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objectives of this report were to develop a validation methodology for building energy analysis simulations, to collect high-quality, unambiguous empirical data for validation, and to apply the methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. The report covers background information, a literature survey, the validation methodology, comparative studies, analytical verification, empirical validation, a comparative evaluation of the codes, and conclusions.

  2. Energy Navigation: Simulation Evaluation and Benefit Analysis

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Oseguera-Lohr, Rosa M.; Lewis, Elliot T.

    2011-01-01

    This paper presents results from two simulation studies investigating the use of advanced flight-deck-based energy navigation (ENAV) and conventional transport-category vertical navigation (VNAV) for conducting a descent through a busy terminal area using Continuous Descent Arrival (CDA) procedures. This research was part of the Low Noise Flight Procedures (LNFP) element within the Quiet Aircraft Technology (QAT) Project and the subsequent Airspace Super Density Operations (ASDO) research focus area of the Airspace Project. A piloted simulation study addressed the development of flight guidance and supporting pilot and Air Traffic Control (ATC) procedures for high-density terminal operations. The procedures and charts were designed to be easy to understand and to make it easy for the crew to enter changes via the Flight Management Computer Control-Display Unit (FMC-CDU) to accommodate amendments from ATC.

  3. Analysis and Simulation of a Blue Energy Cycle

    DOE PAGES Beta

    Sharma, Ms. Ketki; Kim, Yong-Ha; Yiacoumi, Sotira; Gabitto, Jorge; Bilheux, Hassina Z.; Santodonato, Louis J.; Mayes, Richard T.; Dai, Sheng; Tsouris, Costas

    2016-01-30

    The mixing process of fresh water and seawater releases a significant amount of energy and is a potential source of renewable energy. The so-called ‘blue energy’, or salinity-gradient energy, can be harvested by a device consisting of carbon electrodes immersed in an electrolyte solution, based on the principle of capacitive double layer expansion (CDLE). In this study, we have investigated the feasibility of energy production based on the CDLE principle. Experiments and computer simulations were used to study the process. Mesoporous carbon materials, synthesized at the Oak Ridge National Laboratory, were used as electrode materials in the experiments. Neutron imaging of the blue energy cycle was conducted with cylindrical mesoporous carbon electrodes and 0.5 M lithium chloride as the electrolyte solution. For experiments conducted at 0.6 V and 0.9 V applied potential, a voltage increase of 0.061 V and 0.054 V was observed, respectively. From sequences of neutron images obtained for each step of the blue energy cycle, information on the direction and magnitude of lithium ion transport was obtained. A computer code was developed to simulate the process. Experimental data and computer simulations allowed us to predict energy production.
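
    The cycle just described lends itself to a back-of-the-envelope energy estimate. The sketch below (an illustration only, not code or data from the paper) treats the electrode pair as a constant capacitance charged at a fixed voltage in the concentrated solution; switching to the dilute solution raises the open-circuit voltage at constant charge, and discharging at the higher voltage yields the net work. Only the quoted voltage rises come from the record; the cell capacitance is a hypothetical placeholder.

    ```python
    # Illustrative ideal-work estimate for one CDLE cycle (sketch, not the
    # authors' code).  Only the voltage rises (0.061 V and 0.054 V) come from
    # the abstract; the cell capacitance is a made-up placeholder.

    def cdle_cycle_work(capacitance_f: float, v_charge: float, dv: float) -> float:
        """Ideal net work (J) per cycle for a constant-capacitance electrode pair."""
        charge = capacitance_f * v_charge   # charge stored at the end of charging
        return charge * dv                  # extra energy from discharging at V + dV

    if __name__ == "__main__":
        C_CELL = 10.0  # F, hypothetical cell capacitance
        for v_charge, dv in [(0.6, 0.061), (0.9, 0.054)]:
            w = cdle_cycle_work(C_CELL, v_charge, dv)
            print(f"V = {v_charge} V, dV = {dv} V -> ~{w:.2f} J per cycle")
    ```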

  4. Regional Dynamic Simulation Modeling and Analysis of Integrated Energy Futures

    SciTech Connect

    Malczynski, Leonard A.; Beyeler, Walter E.; Conrad, Stephen H.; Harris, David B.; Rexroth, Paul E.; Baker, Arnold B.

    2002-11-01

    The Global Energy Futures Model (GEFM) is a demand-based, gross domestic product (GDP)-driven, dynamic simulation tool that provides an integrated framework to model key aspects of energy, nuclear-materials storage and disposition, environmental effluents from fossil and non-fossil energy, and global nuclear-materials management. Based entirely on public source data, it links oil, natural gas, coal, nuclear and renewable energy dynamically to greenhouse-gas emissions and 12 other measures of environmental impact. It includes historical data from 1990 to 2000, is benchmarked to the DOE/EIA/IEO 2001 [5] Reference Case for 2000 to 2020, and extrapolates energy demand through the year 2050. The GEFM is globally integrated and breaks out five regions of the world: the United States of America (USA), the People's Republic of China (China), the former Soviet Union (FSU), the Organization for Economic Cooperation and Development (OECD) nations excluding the USA (other industrialized countries), and the rest of the world (ROW) (essentially the developing world). The GEFM allows the user to examine a very wide range of "what if" scenarios through 2050 and to view the potential effects across widely dispersed but interrelated areas. The authors believe that this high-level learning tool will help to stimulate public policy debate on energy, environment, economic, and national security issues.

  5. Survey and Analysis of Weather Data for Building Energy Simulations

    SciTech Connect

    Bhandari, Mahabir S; Shrestha, Som S; New, Joshua Ryan

    2012-01-01

    In recent years, calibrated energy modeling of residential and commercial buildings has gained importance in a retrofit-dominated market. Accurate weather data plays an important role in this calibration process and in projected energy savings. It would be ideal to measure weather data at the building location to capture relevant microclimate variation, but this is generally considered cost-prohibitive. There are publicly available data sources with high temporal sampling rates but relatively poor geospatial sampling locations. To overcome this limitation, there are a growing number of service providers that claim to provide real-time and historical weather data on a 20-35 km² grid across the globe. Unfortunately, there is limited documentation from third-party sources attesting to the accuracy of these data. This paper compares provided weather characteristics with data collected from a weather station inaccessible to the service providers. Monthly average dry bulb temperature; relative humidity; direct, diffuse and horizontal solar radiation; and wind speed are statistically compared. Moreover, we ascertain the relative contributions of each weather variable and its impact on building loads. Annual simulations are calculated for three different building types, including a closely monitored and automated energy-efficient research building. The comparison shows that the difference for an individual variable can be as high as 90%. In addition, annual building energy consumption can vary by 7%, while monthly building loads can vary by 40%, as a function of the provided location's weather data.

  6. Spatial analysis of world energy markets: estimation of the demand for energy and variable supply simulation

    SciTech Connect

    Al-Sahlawi, M.A.

    1985-01-01

    This study develops a theoretical and empirical model to estimate the demand for oil, natural gas, and coal in the developing countries and then to simulate the response of spatially separated world energy markets to supply shocks. It is found that the demands for oil, natural gas, and coal in the developing countries are highly price-inelastic in the short run, in keeping with elasticity estimates for the developed countries. The model simulates two kinds of supply shocks: first, partial blocking of the Strait of Hormuz; and second, the return of Iraqi and Iranian oil supplies to their pre-war levels. Additionally, the impact of economic growth on energy demand is simulated when supplies are variable. The simulation results reveal that the problems created by an oil supply disruption or an oil supply surplus can be solved by varying the supplies of energy substitutes. However, the impact of economic growth appears to be minimal.

  7. Building energy analysis of Electrical Engineering Building from DesignBuilder tool: calibration and simulations

    NASA Astrophysics Data System (ADS)

    Cárdenas, J.; Osma, G.; Caicedo, C.; Torres, A.; Sánchez, S.; Ordóñez, G.

    2016-07-01

    This research presents the energy analysis of the Electrical Engineering Building, located on the campus of the Industrial University of Santander in Bucaramanga, Colombia. This building is a green pilot for analysing energy-saving strategies such as solar pipes, green roofs, daylighting, and automation, among others. The energy analysis was performed by means of the DesignBuilder software using a virtual model of the building. Several variables were analysed, such as air temperature, relative humidity, air velocity, daylighting, and energy consumption. According to two criteria, thermal load and energy consumption, critical areas were defined. The calibration and validation of the virtual model yielded errors below 5% in comparison with measured values. The simulations show that the average indoor temperature in the critical areas of the building was 27°C, whilst relative humidity reached values near 70% over the year. The most critical discomfort conditions were found in the area with the greatest concentration of people, which has an average annual temperature of 30°C. Solar pipes can increase daylight levels by 33% in the areas located on the upper floors of the building. In the case of the green roofs, the simulated results show that they reduce internal heat gains through the roof by nearly 31% and decrease the energy consumption related to air conditioning by 5% for some areas on the fourth and fifth floors. The estimated energy consumption of the building was 69 283 kWh per year.

  8. NPTool: a simulation and analysis framework for low-energy nuclear physics experiments

    NASA Astrophysics Data System (ADS)

    Matta, A.; Morfouace, P.; de Séréville, N.; Flavigny, F.; Labiche, M.; Shearman, R.

    2016-08-01

    The Nuclear Physics Tool (NPTool) is an open source data analysis and Monte Carlo simulation framework that has been developed for low-energy nuclear physics experiments with an emphasis on radioactive beam experiments. The NPTool offers a unified framework for designing, preparing and analyzing complex experiments employing multiple detectors, each of which may comprise some hundreds of channels. The framework has been successfully used for the analysis and simulation of experiments at facilities including GANIL, RIKEN, ALTO and TRIUMF, using both stable and radioactive beams. This paper details the NPTool philosophy together with an overview of the workflow. The framework has been benchmarked through the comparison of simulated and experimental data for a variety of detectors used in charged particle and gamma-ray spectroscopy.

  9. Simulation Speed Analysis and Improvements of Modelica Models for Building Energy Simulation

    SciTech Connect

    Jorissen, Filip; Wetter, Michael; Helsen, Lieve

    2015-09-21

    This paper presents an approach for speeding up Modelica models. Insight is provided into how Modelica models are solved and what determines the tool’s computational speed. Aspects such as algebraic loops, code efficiency and integrator choice are discussed. This is illustrated using simple building simulation examples and Dymola. The generality of the work is in some cases verified using OpenModelica. Using this approach, a medium sized office building including building envelope, heating ventilation and air conditioning (HVAC) systems and control strategy can be simulated at a speed five hundred times faster than real time.

  10. An Exploratory Energy Analysis of Electrochromic Windows in Small and Medium Office Buildings - Simulated Results Using EnergyPlus

    SciTech Connect

    Belzer, David B.

    2010-08-01

    The Department of Energy’s (DOE) Building Technologies Program (BTP) has had an active research program supporting the development of electrochromic (EC) windows. Electrochromic glazings used in these windows have the capability of varying the transmittance of light and heat in response to an applied voltage. This dynamic property allows these windows to reduce lighting, cooling, and heating energy in buildings where they are employed. The exploratory analysis described in this report examined three different variants of EC glazings, characterized by the amount of visible light and solar heat gain (as measured by the solar heat gain coefficient [SHGC] in their “clear” or transparent states). For these EC glazings, the dynamic ranges of the SHGC between the “dark” (or tinted) state and the clear state were: 0.22-0.70 (termed “high” SHGC); 0.16-0.39 (termed “low” SHGC); and 0.13-0.19 (termed “very low” SHGC). These glazings are compared to conventional (static) glazing that meets the ASHRAE Standard 90.1-2004 energy standard for five different locations in the U.S. All analysis used the EnergyPlus building energy simulation program for modeling EC windows and alternative control strategies. The simulations were conducted for a small and a medium office building, where engineering specifications were taken from the set of Commercial Building Benchmark building models developed by BTP. On the basis of these simulations, total source-level savings in these buildings were estimated to range between 2% and 7%, depending on the amount of window area and building location.

  11. Using the BEopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    SciTech Connect

    Tabares-Velasco, P. C.; Maguire, J.; Horowitz, S.; Christensen, C.

    2014-09-01

    Verification and validation are crucial software quality control procedures when developing and implementing models. This is particularly important as a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes models that isolate the impacts of specific building components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models; these discrepancies are caused by differences in the models used by the engines or coding errors.

  12. Using the Beopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    SciTech Connect

    Tabares-Velasco, Paulo Cesar; Maguire, Jeff; Horowitz, Scott; Christensen, Craig

    2014-09-01

    Verification and validation are crucial software quality control procedures to follow when developing and implementing models. This is particularly important because a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes building models that isolate the impacts of specific components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models. These discrepancies are caused by differences in the algorithms used by the engines or coding errors.

  13. General purpose computational tools for simulation and analysis of medium-energy backscattering spectra

    NASA Astrophysics Data System (ADS)

    Weller, Robert A.

    1999-06-01

    This paper describes a suite of computational tools for general-purpose ion-solid calculations, which has been implemented in the platform-independent computational environment Mathematica®. Although originally developed for medium energy work (beam energies < 300 keV), they are suitable for general, classical, non-relativistic calculations. Routines are available for stopping power, Rutherford and Lenz-Jensen (screened) cross sections, sputtering yields, small-angle multiple scattering, and back-scattering-spectrum simulation and analysis. Also included are a full range of supporting functions, as well as easily accessible atomic mass and other data on all the stable isotopes in the periodic table. The functions use common calling protocols, recognize elements and isotopes by symbolic names and, wherever possible, return symbolic results for symbolic inputs, thereby facilitating further computation. A new paradigm for the representation of backscattering spectra is introduced.

  14. Uncertainty analysis of numerical model simulations and HFR measurements during high energy events

    NASA Astrophysics Data System (ADS)

    Donncha, Fearghal O.; Ragnoli, Emanuele; Suits, Frank; Updyke, Teresa; Roarty, Hugh

    2013-04-01

    The identification and decomposition of sensor and model shortcomings is a fundamental component of any coastal monitoring and predictive system. In this research, numerical model simulations are combined with high-frequency radar (HFR) measurements to provide insights into the statistical accuracy of the remote sensing unit. A combination of classical tidal analysis and quantitative measures of correlation evaluates the performance of both across the bay. A network of high-frequency radars is deployed within the Chesapeake study site, on the East Coast of the United States, as a backbone component of the Integrated Ocean Observing System (IOOS). This system provides real-time synoptic measurements of surface currents in the zonal and meridional directions at hourly intervals in areas where at least two stations overlap, and radial components elsewhere. In conjunction with this, numerical simulations using EFDC (Environmental Fluid Dynamics Code), an advanced three-dimensional model, provide additional details on flows, encompassing both surface dynamics and volumetric transports, while eliminating certain fundamental errors inherent in the HFR system such as geometric dilution of precision (GDOP) and range dependencies. The aim of this research is an uncertainty estimate of both these datasets, allowing for a degree of inaccuracy in both. The analysis focuses on comparisons between both the vector and radial components of flows returned by the HFR relative to numerical predictions. The analysis provides insight into the reported accuracy of both the raw radial data and the post-processed vector current data computed by combining the radial data. Of interest is any loss of accuracy due to this post-processing. Linear regression techniques decompose the surface currents based on dominant flow processes (tide and wind); statistical analysis and cross-correlation techniques measure agreement between the processed signal and dominant forcing parameters. The tidal signal
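
    As a concrete illustration of the linear decomposition mentioned at the end of the abstract, the sketch below fits one surface-current component as an M2 tidal harmonic plus a wind-proportional term by ordinary least squares. It is a minimal sketch under those assumptions, not the authors' analysis code; the function name and the choice of a single tidal constituent are mine.

    ```python
    # Minimal sketch: split an hourly current component into an M2 tidal
    # harmonic, a wind-proportional part, and a residual (illustrative only).
    import numpy as np

    def decompose_current(t_hours, u, wind):
        """Least-squares split of current component u(t) into tide, wind, residual."""
        omega_m2 = 2.0 * np.pi / 12.42  # M2 tidal frequency, rad/hour
        A = np.column_stack([np.cos(omega_m2 * t_hours),
                             np.sin(omega_m2 * t_hours),
                             wind,
                             np.ones_like(t_hours)])
        coeffs, *_ = np.linalg.lstsq(A, u, rcond=None)
        tidal = A[:, :2] @ coeffs[:2]
        wind_part = A[:, 2] * coeffs[2]
        residual = u - A @ coeffs
        return tidal, wind_part, residual
    ```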

  15. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor in controlling car interior aerodynamic noise at high frequencies and high speeds. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After the subsystems whose power contributions are most sensitive for car interior noise are identified by the comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that it is possible to improve the interior acoustic performance by using the detailed SEA model, which comprises more than 80 subsystems, together with the unsteady aerodynamic pressure calculation on body surfaces and improvements in the materials' sound and damping properties. A reduction of more than 2 dB is achieved at center frequencies above 800 Hz in the spectrum. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.
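
    For orientation, the power balance that an SEA model solves for each subsystem has the standard textbook form below; this is the generic SEA relation, not the specific 80-subsystem model of the paper.

    ```latex
    % Generic SEA power balance for subsystem i (textbook form):
    %   P_{i,in}  - external power input to subsystem i
    %   E_i, n_i  - subsystem energy and modal density
    %   eta_i     - damping loss factor; eta_ij - coupling loss factor
    \begin{equation}
      P_{i,\mathrm{in}}
        = \omega\,\eta_i E_i
        + \sum_{j \neq i} \omega \left( \eta_{ij} E_i - \eta_{ji} E_j \right),
      \qquad
      \eta_{ij}\, n_i = \eta_{ji}\, n_j .
    \end{equation}
    ```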

  16. Energy Simulator Residential Buildings

    Energy Science and Technology Software Center (ESTSC)

    1992-02-24

    SERI-RES performs thermal energy analysis of residential or small commercial buildings and has the capability of modeling passive solar equipment such as rock beds, Trombe walls, and phase change material. The analysis is accomplished by simulation. A thermal model of the building is created by the user and translated into mathematical form by the program. The mathematical equations are solved repeatedly at time intervals of one hour or less for the period of simulation. The mathematical representation of the building is a thermal network with nonlinear, temperature-dependent controls. A combination of forward finite differences, Jacobian iteration, and constrained optimization techniques is used to obtain a solution. An auxiliary interactive editing program, EDITOR, is included for creating building descriptions. EDITOR checks the validity of the input data and also provides facilities for storing and referencing several types of building description files. Some of the data files used by SERI-RES need to be implemented as direct-access files. Programs are included to convert sequential files to direct-access files and vice versa.
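
    The "thermal network with forward finite differences" description translates into a very small explicit update rule. The sketch below shows that rule for a generic network of node capacitances and inter-node conductances; it illustrates the method only and is not SERI-RES source code, and the function and variable names are mine.

    ```python
    # Explicit (forward finite-difference) step for a thermal RC network
    # (illustrative sketch of the method, not SERI-RES code).
    import numpy as np

    def step_thermal_network(T, C, U, Q, dt):
        """Advance node temperatures by one explicit time step.

        T  : node temperatures (n,), K
        C  : node thermal capacitances (n,), J/K
        U  : conductances (n, n), W/K; U[i, j] couples nodes i and j, zero diagonal
        Q  : direct heat inputs per node, W (solar gains, internal gains, ...)
        dt : time step, s (must respect the explicit stability limit)
        """
        net_flow = U @ T - U.sum(axis=1) * T   # sum_j U_ij * (T_j - T_i) for each node
        return T + dt * (net_flow + Q) / C
    ```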

  17. Free energy landscape of the Michaelis complex of lactate dehydrogenase: A network analysis of atomistic simulations

    NASA Astrophysics Data System (ADS)

    Pan, Xiaoliang; Schwartz, Steven

    2015-03-01

    It has long been recognized that the structure of a protein is a hierarchy of conformations interconverting on multiple time scales. However, this conformational heterogeneity is rarely considered in the context of enzymatic catalysis, in which the reactant is usually represented by a single conformation of the enzyme/substrate complex. Lactate dehydrogenase (LDH) catalyzes the interconversion of pyruvate and lactate with concomitant interconversion of two forms of the cofactor nicotinamide adenine dinucleotide (NADH and NAD+). Recent experimental results suggest that multiple substates exist within the Michaelis complex of LDH and that they are catalytically competent at different reaction rates. In this study, millisecond-scale all-atom molecular dynamics simulations were performed on LDH to explore the free energy landscape of the Michaelis complex, and network analysis was used to characterize the distribution of the conformations. Our results provide a detailed view of the kinetic network of the Michaelis complex and the structures of the substates at the atomistic scale. They also shed light on the complete picture of the catalytic mechanism of LDH.

  18. Energy harvesting using rattleback: Theoretical analysis and simulations of spin resonance

    NASA Astrophysics Data System (ADS)

    Nanda, Aditya; Singla, Puneet; Karami, M. Amin

    2016-05-01

    This paper investigates the spin resonance of a rattleback subjected to base oscillations, which is able to transduce vibrations into continuous rotary motion and, therefore, is ideal for applications in energy harvesting and vibration sensing. The rattleback is a toy with some curious properties. When placed on a surface with reasonable friction, the rattleback has a preferred direction of spin. If rotated counter to it, longitudinal vibrations are set up and the spin direction is reversed. In this paper, the dynamics of a rattleback placed on a sinusoidally vibrating platform are simulated. Base vibrations can be expected to excite the pitch motion of the rattleback, which, because of the coupling between pitch and spin motion, should cause the rattleback to spin. Results are presented which show that this is indeed the case: the rattleback has a single-peak spin resonance with respect to base vibrations. The dynamic response of the rattleback was found to be composed of two principal frequencies that appeared in the pitch and rolling motions. One of the frequencies was found to have a large coupling with the spin of the rattleback. Spin resonance was found to occur when the base oscillation frequency was twice the value of the coupled frequency. A linearized model is developed which can predict the values of the two frequencies accurately, and analytical expressions for them in terms of the parameters of the rattleback have been derived. The analysis thus forms an effective and easy method for obtaining the spin resonant frequency of a given rattleback. Novel ideas for applications utilizing the phenomenon of spin resonance, for example an energy harvester composed of a magnetized rattleback surrounded by ferromagnetic walls and a small-scale vibration sensor comprising an array of several magnetized rattlebacks, are included.

  19. Software interoperability for energy simulation

    SciTech Connect

    Hitchcock, Robert J.

    2002-07-31

    This paper provides an overview of software interoperability as it relates to the energy simulation of buildings. The paper begins with a discussion of the difficulties in using sophisticated analysis tools like energy simulation at various stages in the building life cycle, and the potential for interoperability to help overcome these difficulties. An overview of the Industry Foundation Classes (IFC), a common data model for supporting interoperability under continuing development by the International Alliance for Interoperability (IAI) is then given. The process of creating interoperable software is described next, followed by specific details for energy simulation tools. The paper closes with the current status of, and future plans for, the ongoing efforts to achieve software interoperability.

  20. PSTAR: Primary and secondary terms analysis and renormalization: A unified approach to building energy simulations and short-term monitoring

    SciTech Connect

    Subbarao, K.

    1988-09-01

    This report presents a unified method for hourly simulation of a building and analysis of performance data. The method is called Primary and Secondary Terms Analysis and Renormalization (PSTAR). In the PSTAR method, renormalized parameters are introduced for the primary terms such that the renormalized energy balance equation is best satisfied in the least squares sense; hence the name PSTAR. PSTAR allows extraction of building characteristics from short-term tests on a small number of data channels. These can be used for long-term performance prediction ("ratings"), diagnostics, control of heating, ventilating, and air conditioning (HVAC) systems, comparison of design versus actual performance, etc. By combining realistic building models, simple test procedures, and analysis involving linear equations, PSTAR provides a powerful tool for analyzing building energy as well as for testing and monitoring. It forms the basis for the Short-Term Energy Monitoring (STEM) project at SERI.
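
    One way to read the renormalization step described above is as a small linear least-squares problem: given simulated time series of the primary terms of the energy balance and the measured heating input, find multiplicative factors on the primary terms that close the balance as well as possible. The sketch below illustrates that reading only; it is not the PSTAR/STEM implementation, and the array names are assumptions.

    ```python
    # Illustrative least-squares renormalization of primary energy-balance terms
    # (a sketch of the idea, not the PSTAR code).
    import numpy as np

    def renormalize_primary_terms(primary_terms, secondary_terms, measured_heat):
        """Fit factors so that primary_terms @ factors + sum(secondary) ~ measured_heat.

        primary_terms   : (n_hours, n_primary) audit-model time series
        secondary_terms : (n_hours, n_secondary) terms kept at their audit values
        measured_heat   : (n_hours,) measured heating input
        """
        residual = measured_heat - secondary_terms.sum(axis=1)
        factors, *_ = np.linalg.lstsq(primary_terms, residual, rcond=None)
        return factors
    ```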

  1. Energy analysis of electric vehicles using batteries or fuel cells through well-to-wheel driving cycle simulations

    NASA Astrophysics Data System (ADS)

    Campanari, Stefano; Manzolini, Giampaolo; Garcia de la Iglesia, Fernando

    This work presents a study of the energy and environmental balances for electric vehicles using batteries or fuel cells, through the methodology of well-to-wheel (WTW) analysis applied to ECE-EUDC driving cycle simulations. Well-to-wheel balances are carried out considering different scenarios for the primary energy supply. The fuel cell electric vehicles (FCEV) are based on polymer electrolyte membrane (PEM) technology, and the paper discusses the possibility of feeding the fuel cell with (i) hydrogen stored directly onboard and generated separately by water electrolysis (using renewable energy sources) or by conversion processes using coal or natural gas as the primary energy source (through gasification or reforming), or (ii) hydrogen generated onboard with a fuel processor fed by natural gas, ethanol, methanol or gasoline. The battery electric vehicles (BEV) are based on Li-ion batteries charged with electricity generated by central power stations, either based on renewable energy, coal, or natural gas, or reflecting the average EU power generation feedstock. A further alternative is considered: the integration of a small battery into the FCEV, exploiting a hybrid solution that allows energy recovery during decelerations and substantially improves the system energy efficiency. After a preliminary WTW analysis carried out under nominal operating conditions, the work discusses the simulation of the vehicles' energy consumption when following the standardized ECE-EUDC driving cycle. The analysis is carried out considering different hypotheses about the vehicle driving range, the maximum speed requirements, and the possibility of sustaining more aggressive driving cycles. The analysis leads to interesting conclusions, with the best results achieved by BEVs only for very limited driving range requirements, while the fuel cell solutions yield the best performance for more extended driving ranges, where the battery weight becomes too high. Results are finally compared to those of conventional internal
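
    The well-to-wheel bookkeeping used in studies of this kind reduces to chaining stage efficiencies: the WTW efficiency is the product of the well-to-tank (fuel or electricity supply) stage and the tank-to-wheel (vehicle) stage. The sketch below is a hypothetical worked example; the stage efficiencies are placeholders, not values from the paper.

    ```python
    # Hypothetical worked example of well-to-wheel (WTW) efficiency chaining.
    # All stage efficiencies below are placeholders, not the paper's results.

    def wtw_efficiency(well_to_tank: float, tank_to_wheel: float) -> float:
        """Overall primary-energy-to-wheel efficiency."""
        return well_to_tank * tank_to_wheel

    pathways = {
        "BEV, natural-gas electricity": wtw_efficiency(0.50, 0.70),
        "FCEV, H2 from NG reforming":   wtw_efficiency(0.60, 0.45),
    }
    for name, eta in pathways.items():
        print(f"{name}: WTW efficiency ~ {eta:.2f}")
    ```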

  2. Analysis of vibrational-translational energy transfer using the direct simulation Monte Carlo method

    NASA Technical Reports Server (NTRS)

    Boyd, Iain D.

    1991-01-01

    A new model is proposed for energy transfer between the vibrational and translational modes for use in the direct simulation Monte Carlo method (DSMC). The model modifies the Landau-Teller theory for a harmonic oscillator and the rate transition is related to an experimental correlation for the vibrational relaxation time. Assessment of the model is made with respect to three different computations: relaxation in a heat bath, a one-dimensional shock wave, and hypersonic flow over a two-dimensional wedge. These studies verify that the model achieves detailed balance, and excellent agreement with experimental data is obtained in the shock wave calculation. The wedge flow computation reveals that the usual phenomenological method for simulating vibrational nonequilibrium in the DSMC technique predicts much higher vibrational temperatures in the wake region.
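
    For context, the Landau-Teller picture that the model modifies describes relaxation of the mean vibrational energy toward its equilibrium value at the local translational temperature. The standard form is reproduced below; the paper's DSMC-specific modification is not.

    ```latex
    % Landau--Teller relaxation of the mean vibrational energy (standard form):
    %   E_v^*(T) - equilibrium vibrational energy at translational temperature T
    %   \tau_v   - vibrational relaxation time from an experimental correlation
    \begin{equation}
      \frac{dE_v}{dt} \;=\; \frac{E_v^{*}(T) - E_v}{\tau_v(T, p)} .
    \end{equation}
    ```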

  3. Monte Carlo simulation of Li+ motion in polyethylene based on polarization energy calculations and informed by data compression analysis

    NASA Astrophysics Data System (ADS)

    Scarle, S.; Sterzel, M.; Eilmes, A.; Munn, R. W.

    2005-10-01

    We present an n-fold way kinetic Monte Carlo simulation of the hopping motion of Li+ ions in polyethylene on a grid of mesh 0.36Å superimposed on the voids of the rigid polymer. The structure of the polymer is derived from a higher-order simulation, and the energy of the ion at each site is derived by the self-consistent polarization field method. The ion motion evolves in time from free flight through anomalous diffusion to normal diffusion, with the average energy tending to decrease with increasing temperature through thermal annealing. We compare the results with those of hopping models with probabilistic energy distributions of increasing complexity by analyzing the mean-square displacement and the average energy of an ensemble of ions. The Gumbel distribution describes the ion energy statistics in this system better than the usual Gaussian distribution does; including energy correlation greatly affects the ion dynamics. The analysis uses the standard data compression program GZIP, which proves to be a powerful tool for data analysis by giving a measure of recurrences in the ion path.
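
    The n-fold way (BKL) step underlying a simulation like this is compact enough to sketch: one hop out of the current site is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time. This is the generic algorithm, not the authors' code; the construction of the rates from the polarization energies is omitted.

    ```python
    # Generic n-fold-way (BKL) kinetic Monte Carlo step (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(0)

    def kmc_step(rates: np.ndarray) -> tuple[int, float]:
        """Given hop rates out of the current site, return (chosen hop, time increment)."""
        total = rates.sum()
        chosen = int(np.searchsorted(np.cumsum(rates), rng.random() * total))
        dt = -np.log(1.0 - rng.random()) / total   # exponential waiting time
        return chosen, dt
    ```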

  4. Spectral energy analysis of locally resonant nanophononic metamaterials by molecular simulations

    NASA Astrophysics Data System (ADS)

    Honarvar, Hossein; Hussein, Mahmoud I.

    2016-02-01

    A nanophononic metamaterial is a new type of nanostructured material that features an array, or a forest, of intrinsically distributed resonating substructures. Each substructure exhibits numerous local resonances, each of which may hybridize with the phonon dispersion of the underlying host material, causing significant reductions in the group velocities and consequently a reduction in the lattice thermal conductivity. In this Rapid Communication, molecular dynamics simulations are utilized to investigate both the dynamics and the thermal transport properties of a nanophononic metamaterial configuration consisting of a freely suspended silicon membrane with an array of silicon nanopillars standing on the surface. The simulations yield results consistent with earlier lattice-dynamics-based predictions which showed a reduction in the thermal conductivity due to the presence of the local resonators. Using a spectral energy density approach, in which only simulation data are utilized and no a priori information on the nanostructure resonant phonon modes is provided, we show direct evidence of the existence of resonance hybridizations as an inherent mechanism contributing to the slowing down of thermal transport in the host medium.

  5. Pellet plant energy simulator

    NASA Astrophysics Data System (ADS)

    Bordeasu, D.; Vasquez Pulido, T.; Nielsen, C.

    2016-02-01

    The Pellet Plant Energy Simulator is a software tool based on advanced algorithms whose main purpose is to predict the response of a pellet plant under given site conditions. It combines the energy provided by a combined heat and power unit and/or a combustion chamber with the energy consumption of the pellet factory and information on weather conditions in order to predict the biomass consumption of the pellet factory together with the combined heat and power unit, and/or the biomass consumption of the combustion chamber. The user of the software will not only be able to plan biomass acquisition intelligently and estimate its cost, but also to plan preventive maintenance (charcoal cleaning in the case of a gasification plant) and to run the pellet plant at maximum output given the weather conditions and biomass moisture. The software can also be used to carry out a more precise feasibility study for a pellet plant in a given location. The paper outlines the algorithm that supports the Pellet Plant Energy Simulator idea and presents preliminary test results that support the discussion and implementation of the system.

  6. Molecular dynamics simulations and binding free energy analysis of DNA minor groove complexes of curcumin.

    PubMed

    Koonammackal, Mathew Varghese; Nellipparambil, Unnikrishnan Viswambharan Nair; Sudarsanakumar, Chellappanpillai

    2011-11-01

    Curcumin is a natural phytochemical that exhibits a wide range of pharmacological properties, including antitumor and anticancer activities. The similarity in shape of curcumin to DNA minor groove binding drugs is the motivation for exploring its binding affinity in the minor grooves of DNA sequences. Interactions of curcumin with DNA have not been extensively examined, while its pharmacological activities have been studied and documented in depth. Curcumin was docked with two DNA duplexes, d(GTATATAC)₂ and d(CGCGATATCGCG)₂, and molecular dynamics simulations of the complexes were performed in explicit solvent to determine the stability of the binding. In all systems, the curcumin is positioned in the minor groove in the A·T region and was stably bound throughout the simulation, causing only minor modifications to the structural parameters of the DNA. Water molecules were found to contribute to the stability of the binding of the ligand. Free energy analyses of the complexes were performed with MM-PBSA, and the calculated binding affinities are comparable to the values reported for other similar nucleic acid-ligand systems, indicating that curcumin is a suitable natural molecule for the development of minor groove binding drugs. PMID:21287216
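
    The MM-PBSA estimate referred to above follows the standard decomposition, reproduced here in its generic textbook form (not the paper's specific parameterization): the binding free energy is the difference of ensemble-averaged free energies of complex, DNA and ligand, each split into molecular-mechanics, polar solvation, nonpolar solvation and entropy terms.

    ```latex
    % Standard MM-PBSA decomposition (generic form):
    \begin{align}
      \Delta G_{\mathrm{bind}} &= \langle G_{\mathrm{complex}} \rangle
                                 - \langle G_{\mathrm{DNA}} \rangle
                                 - \langle G_{\mathrm{ligand}} \rangle , \\
      G &= E_{\mathrm{MM}} + G_{\mathrm{PB}} + G_{\mathrm{SA}} - T S ,
    \end{align}
    % E_MM: molecular-mechanics energy; G_PB: polar solvation term from the
    % Poisson--Boltzmann equation; G_SA: nonpolar (surface-area) term;
    % TS: solute entropy contribution.
    ```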

  7. Analysis of Voyager Observed High-Energy Electron Fluxes in the Heliosheath Using MHD Simulations

    NASA Technical Reports Server (NTRS)

    Washimi, Haruichi; Webber, W. R.; Zank, Gary P.; Hu, Qiang; Florinski, Vladimir; Adams, James; Kubo, Yuki

    2011-01-01

    The Voyager spacecraft (V1 and V2) observed electrons of 6-14 MeV in the heliosheath, which showed several incidences of flux variation relative to a background of gradually increasing flux with distance from the Sun. The increasing flux of background electrons is thought to result from inward radial diffusion. We compare the temporal electron flux variation with dynamical phenomena in the heliosheath that are obtained from our MHD simulations. Because our simulation is based on V2-observed plasma data before V2 crossed the termination shock, this analysis is effective up to late 2008, i.e., about a year after the V2 crossing, during which disturbances driven prior to the crossing time survived in the heliosheath. Several electron flux variations correspond to times directly associated with interplanetary shock events. One noteworthy example corresponds to various times associated with the March 2006 interplanetary shock, these being the collision with the termination shock, the passage past the V1 spacecraft, and the collision with the region near the heliopause, as identified by W. R. Webber et al. for protons/helium of 7-200 MeV. Our simulations indicate that all other electron flux variations, except one, correspond well to the times when a shock-driven magnetosonic pulse and its reflection in the heliosheath either passed across V1/V2, or collided with the termination shock or with the plasma sheet near the heliopause. This result suggests that variations in the electron flux are due to either direct or indirect effects of magnetosonic pulses in the heliosheath driven by interplanetary shocks.

  8. A Labor Market Analysis of the Electricity Sector for 2030 using the National Energy with Weather System Simulator

    NASA Astrophysics Data System (ADS)

    Terry, L.; Clack, C.; Marquis, M.; Paine, J.; Picciano, P.

    2015-12-01

    We conducted an analysis that utilized the National Renewable Energy Laboratory's (NREL) Jobs and Economic Development Impact (JEDI) models to estimate the temporary and permanent jobs, earnings, and state sales tax revenues that would be created by various scenarios of the National Energy with Weather System (NEWS) simulator. This simulator was created by a collaboration between the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado and the Earth System Research Laboratory (ESRL, NOAA). The NEWS simulator used three years of high-resolution (13-km, hourly) weather and power data to select the most cost-efficient, resource-maximizing, and complementary locations for wind, solar photovoltaic, and natural gas power plants along with high-voltage direct-current transmission, thereby providing the cheapest possible electricity grid that facilitates the incorporation of large amounts of wind and solar PV. We applied various assumptions to ensure that we produced conservative estimates, while keeping costs in line with those of the NEWS simulator. Our analysis shows that under the lowest carbon-emitting scenario of the NEWS carried out (an 80% reduction in CO2 compared with 1990 levels), almost ten million new jobs could be created by 2030. Of those jobs, over 400,000 would permanently support the operations of the power plants. That particular scenario would also add over $500 billion to the paychecks of American workers and $75 billion to state tax revenues by 2030. All of this is achieved with average electricity costs of 10.7¢/kWh, because the electric system relies less heavily on fuel and more on jobs constructing, operating, and maintaining infrastructure. We use the current presentation to describe the methods used to reach these findings and examine some potential impacts of our estimates on public policy. Although we are able to identify some systematic problems with the JEDI model, we find that these problems

  9. Exergy and Energy analysis of a ground-source heat pump for domestic water heating under simulated occupancy conditions

    SciTech Connect

    Ally, Moonis Raza; Munk, Jeffrey D; Baxter, Van D; Gehl, Anthony C

    2012-01-01

    This paper presents a detailed analysis of a water-to-water ground source heat pump (WW-GSHP) used to provide all the hot water needs in a 345 m² house located in DOE climate zone 4 (mixed-humid). The protocol for hot water use is based on the Building America Research Benchmark Definition (Hendron 2008; Hendron and Engebrecht 2010), which aims to capture the living habits of the average American household and its impact on energy consumption. The entire house was operated under simulated occupancy conditions. Detailed energy and exergy analysis provides a complete set of information on system efficiency and sources of irreversibility, the main cause of wasted energy. The WW-GSHP was sized at 5.275 kW (1.5-ton) for this house and supplied hot water to a 303 L (80 gal) water storage tank. The WW-GSHP shared the same ground loop with a 7.56 kW (2.1-ton) water-to-air ground source heat pump (WA-GSHP), which provided the space-conditioning needs of the entire house. The data, analyses, and measures of performance for the WW-GSHP in this paper complement the results for the WA-GSHP published in this journal (Ally, Munk et al. 2012). Understanding the performance of GSHPs is vital if the ground is to be used as a viable renewable energy resource.
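
    The exergy bookkeeping in analyses of this type rests on two textbook relations, reproduced below for orientation only (they are generic forms, not results from the paper): the Gouy-Stodola theorem relating exergy destruction to entropy generation, and the second-law efficiency comparing recovered to supplied exergy.

    ```latex
    % Generic exergy relations (textbook forms):
    \begin{equation}
      \dot{X}_{\mathrm{dest}} = T_0\, \dot{S}_{\mathrm{gen}} ,
      \qquad
      \eta_{\mathrm{II}} = \frac{\dot{X}_{\mathrm{recovered}}}{\dot{X}_{\mathrm{supplied}}} ,
    \end{equation}
    % with T_0 the dead-state (ambient) temperature.
    ```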

  10. Overview of validation procedures for building energy-analysis simulation codes [SUNCAT 2.4, DEROB 4, DOE 2.1, BLAST]

    SciTech Connect

    Wortman, D.; O'Doherty, B.; Judkoff, R.

    1981-03-01

    SERI is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy which could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts, and these procedures have been implemented on a sampling of the major BEAS. Results from this work have shown major problems in two of the BEAS tested. Furthermore, when one building design was run on several of the BEAS, there were large differences in the predicted annual heating loads. The empirical validation procedure will be developed when high quality empirical data become available.

  11. Investigation of allosteric modulation mechanism of metabotropic glutamate receptor 1 by molecular dynamics simulations, free energy and weak interaction analysis

    PubMed Central

    Bai, Qifeng; Yao, Xiaojun

    2016-01-01

    Metabotropic glutamate receptor 1 (mGlu1), which belongs to the class C G protein-coupled receptors (GPCRs), can be coupled with a G protein to transfer extracellular signals through dimerization and allosteric regulation. Unraveling the dimer packing and the allosteric mechanism can be of great help in understanding the specific regulatory mechanism and in designing more potential negative allosteric modulators (NAMs). Here, we report molecular dynamics simulation studies of the modulation mechanism of FITM on the wild type, T815M and Y805A mutants of mGlu1 through weak interaction analysis and free energy calculation. The weak interaction analysis demonstrates that van der Waals (vdW) interactions and hydrogen bonding play an important role in the dimer packing between six cholesterol molecules and mGlu1, as well as in the interaction between the allosteric sites T815 and Y805 and FITM in the wild type, T815M and Y805A mutants of mGlu1. In addition, the results of the free energy calculations indicate that a secondary binding pocket, apart from the FITM-bound pocket in the crystal structure, is mainly formed by the residues Thr748, Cys746, Lys811 and Ser735. Our results not only reveal the dimer packing and allosteric regulation mechanism, but also supply useful information for the design of potential NAMs of mGlu1. PMID:26887338

  12. Investigation of allosteric modulation mechanism of metabotropic glutamate receptor 1 by molecular dynamics simulations, free energy and weak interaction analysis

    NASA Astrophysics Data System (ADS)

    Bai, Qifeng; Yao, Xiaojun

    2016-02-01

    Metabotropic glutamate receptor 1 (mGlu1), which belongs to the class C G protein-coupled receptors (GPCRs), can be coupled with a G protein to transfer extracellular signals through dimerization and allosteric regulation. Unraveling the dimer packing and the allosteric mechanism can be of great help in understanding the specific regulatory mechanism and in designing more potential negative allosteric modulators (NAMs). Here, we report molecular dynamics simulation studies of the modulation mechanism of FITM on the wild type, T815M and Y805A mutants of mGlu1 through weak interaction analysis and free energy calculation. The weak interaction analysis demonstrates that van der Waals (vdW) interactions and hydrogen bonding play an important role in the dimer packing between six cholesterol molecules and mGlu1, as well as in the interaction between the allosteric sites T815 and Y805 and FITM in the wild type, T815M and Y805A mutants of mGlu1. In addition, the results of the free energy calculations indicate that a secondary binding pocket, apart from the FITM-bound pocket in the crystal structure, is mainly formed by the residues Thr748, Cys746, Lys811 and Ser735. Our results not only reveal the dimer packing and allosteric regulation mechanism, but also supply useful information for the design of potential NAMs of mGlu1.

  13. HEAP: Heat Energy Analysis Program, a computer model simulating solar receivers [solving the heat transfer problem]

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1979-01-01

    HEAP is a computer program that can distinguish between different receiver designs and predict transient performance under variable solar flux, ambient temperature, etc. It has a basic structure that fits a general heat transfer problem, but with specific features that are custom-made for solar receivers. The code is written in the MBASIC computer language. The methodology followed in solving the heat transfer problem is explained. A program flow chart, an explanation of input and output tables, and an example of the simulation of a cavity-type solar receiver are included.

  14. Analysis and simulation of energy use and cost at a municipal wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Feng, Yilu

    2011-12-01

    The cost of electricity, a major operating cost of municipal wastewater treatment plants, is related to influent flow rate, power price, and power load. With knowledge of inflow and price patterns, plant operators can manage processes to reduce electricity costs. Records of influent flow, power price, and load are evaluated for the Blue Plains Advanced Wastewater Treatment Plant. Diurnal and seasonal trends are analyzed. Power usage is broken down among treatment processes. A simulation model of influent pumping, a large power user, is developed. It predicts pump discharge and power usage based on wet-well level. Individual pump characteristics are tested in the plant. The model accurately simulates plant inflow and power use for two pumping stations [R² = 0.68, 0.93 (inflow); R² = 0.94, 0.91 (power)]. The wet-well stage-storage relationship is estimated from data. Time-varying wet-well level is added to the model. A synthetic example demonstrates application in managing pumps to reduce electricity cost.
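
    The power side of an influent-pumping model of this kind reduces, at its core, to standard hydraulics: electrical power follows from flow rate, total dynamic head, and a wire-to-water efficiency. The sketch below is illustrative only (not the thesis model), and the example numbers are hypothetical.

    ```python
    # Standard pump power relation (illustrative; not the thesis code).
    RHO_WATER = 1000.0   # kg/m^3
    G = 9.81             # m/s^2

    def pump_power_kw(flow_m3s: float, head_m: float, efficiency: float) -> float:
        """Electrical power (kW) to lift a flow against a head at a wire-to-water efficiency."""
        return RHO_WATER * G * flow_m3s * head_m / efficiency / 1000.0

    # Hypothetical example: 5 m^3/s lifted 15 m at 75% wire-to-water efficiency
    print(f"{pump_power_kw(5.0, 15.0, 0.75):.0f} kW")   # ~981 kW
    ```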

  15. Numerical simulation and analysis of energy loss in a nanosecond spark gap switch

    NASA Astrophysics Data System (ADS)

    Lavrinovich, I. V.; Oreshkin, V. I.

    2014-11-01

    A system of differential equations for the RLC circuit of a capacitor-switch assembly was derived and supplemented with an equation for the spark resistance of the switch in accordance with the Braginsky model. The parameters that affect the solutions of the equations for circuits with parallel or series connection of several capacitor-switch assemblies to a common inductive load were determined. Based on numerical solution of the system of equations, the dependence of the energy E_S released in the spark within the first half-period on the discharge circuit and switch parameters was found.
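
    The governing system has the familiar series-RLC form with a time-varying spark resistance; the generic equations are written out below for orientation (the Braginsky channel-expansion law for R_s(t) itself is not reproduced), together with the definition of the spark energy E_S over the first half-period used above.

    ```latex
    % Series RLC discharge with a time-varying spark resistance R_s(t):
    \begin{equation}
      L \frac{dI}{dt} + \bigl( R + R_s(t) \bigr) I + \frac{q}{C} = 0 ,
      \qquad
      \frac{dq}{dt} = I ,
      \qquad
      E_S = \int_0^{T/2} R_s(t)\, I^2(t)\, dt .
    \end{equation}
    ```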

  16. Empirical validation of building energy-analysis simulation programs: a status report

    SciTech Connect

    Judkoff, R.; Wortman, D.; Burch, J.

    1982-09-01

    Under the auspices of the DOE Passive/Hybrid Solar Division Class A Monitoring and Validation Program, SERI has engaged in several areas of research in fiscal year 1982. This research has included: (1) development of a validation methodology, (2) development of a performance monitoring methodology designed to meet the specific data needs for validation of analysis/design tools, (3) construction and monitoring of a 1000-ft² multizone skin-load-dominated test facility, (4) construction and monitoring of a two-zone test cell, and (5) sample validation studies using the DOE-2.1, BLAST-3.0, and SERIRES-1.0 computer programs. The status of these activities is reported, and the validation methodology and the Class A data acquisition capabilities at SERI are described briefly.

  17. Energy Analysis.

    ERIC Educational Resources Information Center

    Bazjanac, Vladimir

    1981-01-01

    The Aquatic Center at Corvallis (Oregon) is analyzed for energy use. Energy conservation in the building would be accomplished best through heavy insulation of exterior surfaces and the maximization of passive solar gain. (Author/MLF)

  18. Role of the Closing Base Pair for d(GCA) Hairpin Stability: Free Energy Analysis and Folding Simulations

    SciTech Connect

    Kannan, Srinivasaraghavan; Zacharias, Martin W.

    2011-06-30

    Hairpin loops belong to the most important structural motifs in folded nucleic acids. The d(GNA) sequence in DNA can form very stable trinucleotide hairpin loops, depending, however, strongly on the closing base pair. Replica-exchange molecular dynamics (REMD) simulations were employed to study hairpin folding of two DNA sequences, d(gcGCAgc) and d(cgGCAcg), with the same central loop motif but different closing base pairs, starting from single-stranded structures. In both cases, conformations of the most populated conformational cluster at the lowest temperature showed close agreement with available experimental structures. For the loop sequence with the less stable G:C closing base pair, an alternative loop topology accumulated as the second most populated conformational state, indicating a possible loop structural heterogeneity. Comparative free energy simulations on induced loop unfolding indicated higher stability of the loop with a C:G closing base pair by 3 kcal mol⁻¹ (compared to a G:C closing base pair), in very good agreement with experiment. The comparative energetic analysis of sampled unfolded, intermediate and folded conformational states identified electrostatic and packing interactions as the main contributions to the closing-base-pair dependence of the d(GCA) loop stability.

  19. Building Energy Consumption Analysis

    Energy Science and Technology Software Center (ESTSC)

    2005-03-02

    DOE2.1E-121SUNOS is a set of modules for energy analysis in buildings. Modules are included to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating, cooling and ventilation to the building (SYSTEMS), to model energy conversion equipment that uses fuel or electricity to provide the required heating, cooling and electricity (PLANT), and to compute the cost of energy and building operation based on utility rate schedule and economic parameters (ECONOMICS).

  20. Validation Methodology to Allow Simulated Peak Reduction and Energy Performance Analysis of Residential Building Envelope with Phase Change Materials: Preprint

    SciTech Connect

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCMs) represent a potential technology to reduce peak loads and HVAC energy consumption in residential buildings. This paper summarizes NREL efforts to obtain accurate energy simulations when PCMs are modeled in residential buildings; the overall methodology to verify and validate the Conduction Finite Difference (CondFD) and PCM algorithms in EnergyPlus is presented in this study. It also shows preliminary results for three residential building enclosure technologies containing PCM: PCM-enhanced insulation, PCM-impregnated drywall, and thin PCM layers. The results are compared on the basis of predicted peak reduction and energy savings using two algorithms in EnergyPlus: the PCM and Conduction Finite Difference (CondFD) algorithms.

  1. Suitability of Synthetic Driving Profiles from Traffic Micro-Simulation for Real-World Energy Analysis: Preprint

    SciTech Connect

    Hou, Yunfei; Wood, Eric; Burton, Evan; Gonder, Jeffrey

    2015-10-14

    A shift towards increased levels of driving automation is generally expected to result in improved safety and traffic congestion outcomes. However, little empirical data exists to estimate the impact that automated driving could have on energy consumption and greenhouse gas emissions. In the absence of empirical data on differences between drive cycles from present-day vehicles (primarily operated by humans) and future vehicles (partially or fully operated by computers), one approach is to model both situations over identical traffic conditions. Such an exercise requires traffic micro-simulation to not only accurately model vehicle operation under high levels of automation, but also (and potentially more challenging) vehicle operation under present-day human drivers. This work seeks to quantify the ability of a commercial traffic micro-simulation program to accurately model real-world drive cycles in vehicles operated primarily by humans in terms of driving speed, acceleration, and simulated fuel economy. Synthetic profiles from models of freeway and arterial facilities near Atlanta, Georgia, are compared to empirical data collected from real-world drivers on the same facilities. Empirical and synthetic drive cycles are then simulated in a powertrain efficiency model to enable comparison on the basis of fuel economy. Synthetic profiles from traffic micro-simulation were found to exhibit low levels of transient behavior relative to the empirical data. Even with these differences, the synthetic and empirical data in this study agree well in terms of driving speed and simulated fuel economy. The differences in transient behavior between simulated and empirical data suggest that larger stochastic contributions in traffic micro-simulation (relative to those present in the traffic micro-simulation tool used in this study) are required to fully capture the arbitrary elements of human driving. Interestingly, the lack of stochastic contributions from models of human drivers
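
    A minimal version of the transient-behavior comparison described above can be expressed as summary statistics over a uniformly sampled speed trace, computed for both the synthetic and the empirical cycles. The sketch below is illustrative only; the metric names and the 1.5 m/s^2 threshold are my own choices, not the study's.

    ```python
    # Illustrative drive-cycle summary statistics (not the study's methodology).
    import numpy as np

    def cycle_stats(speed_mps: np.ndarray, dt_s: float = 1.0) -> dict:
        """Speed and acceleration summary for a uniformly sampled speed trace."""
        accel = np.diff(speed_mps) / dt_s
        return {
            "mean_speed_mps": float(speed_mps.mean()),
            "std_accel_mps2": float(accel.std()),                    # transient intensity
            "frac_hard_accel": float((np.abs(accel) > 1.5).mean()),  # share of |a| > 1.5 m/s^2
        }

    # Usage: compare cycle_stats(synthetic_trace) against cycle_stats(empirical_trace).
    ```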

  2. Bound Flavin-Cytochrome Model of Extracellular Electron Transfer in Shewanella oneidensis: Analysis by Free Energy Molecular Dynamics Simulations.

    PubMed

    Hong, Gongyi; Pachter, Ruth

    2016-06-30

    Flavins are known to enhance extracellular electron transfer (EET) in Shewanella oneidensis MR-1 bacteria, which reduce electron acceptors through outer-membrane (OM) cytochromes c. Free-shuttle and bound-redox-cofactor mechanisms were proposed to explain this enhancement, but recent electrochemical reports favor a flavin-bound model, proposing two one-electron reductions of flavin, namely oxidized (Ox) to semiquinone (Sq) and semiquinone to hydroquinone (Hq), at anodic and cathodic conditions, respectively. In this work, to provide a mechanistic understanding of riboflavin (RF) binding at the multiheme OM cytochrome OmcA, we explored binding configurations at hemes 2, 5, 7, and 10. Subsequently, on the basis of molecular dynamics (MD) simulations, binding free energies and redox potential shifts upon RF binding for the Ox/Sq and Sq/Hq reductions were analyzed. Our results demonstrated an upshift in the Ox/Sq and a downshift in the Sq/Hq redox potentials, consistent with a bound RF-OmcA model. Furthermore, binding free energy MD simulations indicated an RF binding preference at heme 7. MD simulations of the OmcA-MtrC complex interfacing at heme 5 revealed a small interprotein redox potential difference with an electron transfer rate of 10⁷-10⁸ s⁻¹. PMID:27266856

  3. Free-energy analysis of lysozyme-triNAG binding modes with all-atom molecular dynamics simulation combined with the solution theory in the energy representation

    NASA Astrophysics Data System (ADS)

    Takemura, Kazuhiro; Burri, Raghunadha Reddy; Ishikawa, Takeshi; Ishikura, Takakazu; Sakuraba, Shun; Matubayasi, Nobuyuki; Kuwata, Kazuo; Kitao, Akio

    2013-02-01

    We propose a method for calculating the binding free energy of protein-ligand complexes using all-atom molecular dynamics simulation combined with the solution theory in the energy representation. Four distinct modes for the binding of tri-N-acetyl-D-glucosamine (triNAG) to hen egg-white lysozyme were investigated, one from the crystal structure and three generated by docking predictions. The proposed method was shown to distinguish the most plausible binding mode (the crystal model) as the one with the lowest binding free energy.

  4. Thyristor converter simulation and analysis

    SciTech Connect

    Zhang, S.Y.

    1991-01-01

    In this paper we present a simulation of thyristor converters. The simulation captures nonlinearity, non-uniform firing, and commutation effects. Several applications, including current regulation, converter frequency-characteristics analysis, and power-line disturbance analysis, are presented. 4 refs., 4 figs.

  5. J. J. Sakurai Prize for Theoretical Particle Physics Lecture: Improving the precision of high-energy simulation and analysis tools

    NASA Astrophysics Data System (ADS)

    Webber, Bryan

    2012-03-01

    Comparing theoretical predictions with experimental data on particle collisions like those at the Large Hadron Collider is far from straightforward. The predictions usually concern fundamental objects (quarks, gluons, leptons, ...) whereas the colliding hadrons are complicated bound states. Furthermore, final states of interest often contain high-energy jets of many hadrons, together with underlying lower-energy hadron production. The jets may come from primary interactions producing energetic quarks and gluons, or from the decays of heavy or highly boosted objects, possibly new forms of matter. I will discuss the development of computer simulations of jet production in hard collisions, and of jet-finding algorithms that aim to reconstruct the fundamental collision and decay dynamics from hadronic final states. In both cases, improvements in the underlying theoretical framework have led to a better description of Standard Model processes at the LHC, and better tools for the discovery of any new processes that may lie within its reach.
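
    The abstract does not name a particular jet finder; for illustration, the sketch below implements the anti-kt distance measure that has become the standard sequential-recombination algorithm at the LHC. It is a brute-force toy with a simplified pt-weighted recombination scheme, not the FastJet implementation used in real analyses.

```python
# Toy anti-kt sequential-recombination jet finder (illustrative only).
import math

def anti_kt(particles, R=0.4):
    """particles: list of (pt, y, phi) tuples. Returns clustered jets as (pt, y, phi)."""
    objs = [list(p) for p in particles]
    jets = []

    def pair_dist(a, b):
        dphi = abs(a[2] - b[2])
        dphi = min(dphi, 2 * math.pi - dphi)
        dr2 = (a[1] - b[1]) ** 2 + dphi ** 2
        return min(a[0] ** -2, b[0] ** -2) * dr2 / R ** 2

    while objs:
        # Smallest beam distance d_iB = pt_i^-2 ...
        i_beam = min(range(len(objs)), key=lambda i: objs[i][0] ** -2)
        dmin, merge = objs[i_beam][0] ** -2, None
        # ... unless a pairwise distance d_ij is smaller.
        for i in range(len(objs)):
            for j in range(i + 1, len(objs)):
                d = pair_dist(objs[i], objs[j])
                if d < dmin:
                    dmin, merge = d, (i, j)
        if merge is None:                      # beam wins: promote object to a jet
            jets.append(tuple(objs.pop(i_beam)))
        else:                                  # pair wins: recombine i and j
            i, j = merge
            a, b = objs[i], objs[j]
            pt = a[0] + b[0]                   # simplified pt-weighted recombination
            y = (a[0] * a[1] + b[0] * b[1]) / pt
            phi = (a[0] * a[2] + b[0] * b[2]) / pt
            objs = [o for k, o in enumerate(objs) if k not in (i, j)] + [[pt, y, phi]]
    return jets

# Toy event: two nearby hard particles plus one well-separated softer one.
print(anti_kt([(50.0, 0.00, 0.10), (30.0, 0.05, 0.15), (20.0, 2.00, 3.00)]))
```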

  6. Energy losing rate and open-circuit voltage analysis of organic solar cells based on detailed photocurrent simulation

    SciTech Connect

    Yu Junsheng; Huang Jiang; Zhang Lei; Jiang Yadong

    2009-09-15

    The J-V characteristics and photovoltaic response of indium tin oxide/pentacene (d nm)/C{sub 60} (40 nm)/BCP (10 nm)/Ag (100 nm) devices have been systematically analyzed. By fitting the J-V characteristics of the fabricated devices, photocurrent densities J{sub ph} were obtained. Meanwhile, we proposed a modified optical transfer matrix theory to reproduce a physically reasonable trend between P{sub 0}R{sub 0} and the film thickness of the pentacene layers. Then, we revealed that an accurate rate of energy loss can be defined as E{sub loss}=1-betaJ{sub e}/P{sub 0}R{sub 0}. Also, the relationship between open-circuit voltage V{sub OC}, compensation voltage V{sub 0} and initial polaron-pair binding energy E{sub B} was determined based on the detailed study and simulation of device photocurrent.
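
    Expanding the database's {sub} markup, the energy-loss rate defined above reads:

```latex
E_{\mathrm{loss}} \;=\; 1 \;-\; \frac{\beta\,J_{e}}{P_{0}\,R_{0}}
```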

  7. Building Energy Consumption Analysis

    Energy Science and Technology Software Center (ESTSC)

    2005-01-24

    DOE2.1E-121 is a set of modules for energy analysis in buildings. Modules are included to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating, cooling and ventilation to the building (SYSTEMS), to model energy conversion equipment that uses fuel or electricity to provide the required heating, cooling and electricity (PLANT), and to compute the cost of energy and building operation based on utility rate schedule and economic parameters (ECONOMICS). DOE2.1E-121 contains modifications to DOE2.1E which allow 1000 zones to be modeled.

  8. Dynamic Analysis of Nuclear Energy System Strategies

    SciTech Connect

    Den Durpel, Luc Van

    2004-06-17

    DANESS is an integrated process model for nuclear energy systems allowing the simulation of multiple reactors and fuel cycles in a continuously changing nuclear reactor park configuration. The model is energy-demand driven and simulates all nuclear fuel cycle facilities, up to 10 reactors and fuels. Reactor and fuel cycle facility history are traced and the cost of generating energy is calculated per reactor and for the total nuclear energy system. The DANESS model aims at performing dynamic systems analysis of nuclear energy development and is used for integrated analysis of development paths for nuclear energy, parameter scoping for new nuclear energy systems, economic analysis of nuclear energy, government role analysis, and education.

  9. Free-energy analysis of water affinity in polymer studied by atomistic molecular simulation combined with the theory of solutions in the energy representation

    NASA Astrophysics Data System (ADS)

    Kawakami, Tomonori; Shigemoto, Isamu; Matubayasi, Nobuyuki

    2012-12-01

    The affinity of a small molecule to a polymer is an essential property for designing polymer materials with tuned permeability. In the present work, we develop a computational approach to the free energy ΔG of binding a small solute molecule into polymer using the atomistic molecular dynamics (MD) simulation combined with the method of energy representation. The binding free energy ΔG is obtained by viewing a single polymer as a collection of fragments and employing an approximate functional constructed from distribution functions of the interaction energy between solute and the fragment obtained from MD simulation. The binding of water is then examined against 9 typical polymers. The relationship is addressed between the fragment size and the calculated ΔG, and a useful fragment size is identified that balances the performance of the free-energy functional against the sampling efficiency. It is found with the appropriate fragment size that the ΔG convergence at a statistical error of ˜0.2 kcal/mol is reached at ˜4 ns of replica-exchange MD of the water-polymer system and that the mean absolute deviation of the computational ΔG from the experimental value is 0.5 kcal/mol. The connection is further discussed between the polymer structure and the thermodynamic ΔG.
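
    For orientation only: the paper evaluates ΔG with the energy-representation functional, which is not reproduced here. The sketch below instead applies the much simpler exponential-averaging (Zwanzig) estimator to hypothetical samples of the solute-fragment interaction energy, purely to illustrate how a free energy is extracted from sampled interaction-energy distributions.

```python
# Illustrative Zwanzig exponential averaging over sampled interaction energies
# (kcal/mol). Not the energy-representation functional used in the paper.
import numpy as np

def zwanzig_free_energy(interaction_energies_kcal, T=300.0):
    """DeltaG = -kT * ln< exp(-dU/kT) >, averaged over reference-ensemble samples."""
    kT = 0.0019872041 * T  # Boltzmann constant in kcal/(mol*K), times T
    dU = np.asarray(interaction_energies_kcal)
    x = -dU / kT
    m = x.max()                      # log-sum-exp for numerical stability
    return -kT * (m + np.log(np.mean(np.exp(x - m))))

# Hypothetical samples of a water-fragment interaction energy from an MD run.
samples = np.random.default_rng(1).normal(loc=-1.0, scale=1.5, size=5000)
print(f"Estimated insertion free energy: {zwanzig_free_energy(samples):.2f} kcal/mol")
```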

  10. Community Energy Consumption Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-02-21

    The TDIST3 program performs an analysis of large integrated community total energy systems (TES) supplying thermal and electrical energy from one or more power stations. The program models the time-dependent energy demands of a group of representative building types, distributes the thermal demands within a thermal utility system (TUS), simulates the dynamic response of a group of power stations in meeting the TUS demands, and designs an optimal base-loaded (electrically) power plant and thermal energy storage reservoir combination. The capital cost of the TES is evaluated. The program was developed primarily to analyze thermal utility systems supplied with high temperature water (HTW) from more than one power plant. The TUS consists of a transmission loop and secondary loops with a heat exchanger linking each secondary loop to the transmission loop. The power stations' electrical output supplies all community buildings and the HTW supplies the thermal demand of the buildings connected through the TUS, a piping network. Basic components of the TES model are one or more power stations connected to the transmission loop. These may be dual-purpose, producing electricity and HTW, or just heating plants producing HTW. A thermal storage reservoir is located at one power station. The secondary loops may have heating plants connected to them. The transmission loop delivers HTW to local districts; the secondary loops deliver the energy to the individual buildings in a district.

  11. BLAST: Building energy simulation in Hong Kong

    NASA Astrophysics Data System (ADS)

    Fong, Sai-Keung

    1999-11-01

    evaluated. It was found that the supply of cool air to the lower portion of the compartment provided effective space cooling. The mathematical relationships between different shading patterns and different glass window-to-wall ratios of single compartments were established to provide a guide for easy approximation of energy use under similar conditions. In addition, the Overall Thermal Transfer Values (OTTV) for the compartments were studied. The monthly and annual energy use of three realistic buildings was investigated. They were a commercial building, an industrial building and a dual-purpose building. The cooling loads per floor area for the buildings were studied and the OTTV were evaluated by two different methods. Sensitivity analysis was carried out to investigate the impact of the parameters of internal heat gains on the energy use of an academic building. It was found that the indoor temperature setting had a major influence on building energy use. The performance of using the local weather data file of TMY and the example weather years 1980 and 1989 was evaluated. TMY was found to be the most suitable for energy simulation while the weather years 1980 and 1989 yielded good results.

  12. JASMINE: Data analysis and simulation

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Sako, Nobutada; Jasmine Working Group

    JASMINE will study the structure and evolution of the Milky Way Galaxy. To accomplish these objectives JASMINE will measure trigonometric parallaxes, positions and proper motions of about 10 million stars with a precision of 10 μas at z = 14 mag. In this paper methods for data analysis and error budgets, on-board data handling such as sampling strategy and data compression, and simulation software for end-to-end simulation are presented.

  13. Simulation of low-energy ion scattering

    NASA Astrophysics Data System (ADS)

    Langelaar, M. H.; Breeman, M.; Mijiritskii, A. V.; Boerma, D. O.

    A new simulation program `MATCH' has been developed for a detailed analysis of low-energy ion scattering (LEIS) and recoiling data. Instead of performing the full calculation of the three-dimensional trajectories through the sample from the ion source towards the detector, incoming trajectories as well as reversed-time outgoing trajectories are calculated separately. Finally, these trajectories are matched to obtain the yield. The program has been tested for spectra and azimuthal scans of scattering and recoiling events of various sample species in different scattering geometries.

  14. Energy analysis sample building data

    NASA Astrophysics Data System (ADS)

    1981-03-01

    Sample building data for energy calculations necessary for the comparative analysis between the proposed energy calculation procedure and the procedures using comprehensive hourly simulation of HVAC systems are presented. The comparison calculation includes data for the terminal reheat system, double-duct system, heat reclaim system, and standard VAV system for a hypothetical 20-story office building in Washington, DC. Each is evaluated in conjunction with an electric centrifugal chiller and a gas-fired boiler.

  15. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  16. WATER QUALITY ANALYSIS SIMULATION PROGRAM

    EPA Science Inventory

    The Water Quality Analysis Simulation Program (WASP6) is an enhancement of the original WASP (Di Toro et al., 1983; Connolly and Winfield, 1984; Ambrose, R.B. et al., 1988). This model helps users interpret and predict water quality responses to natural phenomena and man-made polluti...

  17. An atmospheric energy analysis of the impact of satellite lidar winds and TIROS temperatures in global simulations

    NASA Technical Reports Server (NTRS)

    Keller, Linda M.; Johnson, Donald R.

    1992-01-01

    A study of the effects on forecast accuracy of adding wind-profiler data is conducted. An observing system simulation test is employed that assumes a sufficient concentration of aerosols to provide global wind profiles (a best case scenario). The simulated data for the series of five day forecasts are produced from a twenty day integration utilizing the ECMWF model, which is also employed to produce the verification forecast for the five day period.

  18. Turbulence flight director analysis and preliminary simulation

    NASA Technical Reports Server (NTRS)

    Johnson, D. E.; Klein, R. E.

    1974-01-01

    A control column and throttle flight director display system is synthesized for use during flight through severe turbulence. The column system is designed to minimize airspeed excursions without overdriving attitude. The throttle system is designed to augment the airspeed regulation and provide an indication of the trim thrust required for any desired flight path angle. Together they form an energy management system to provide harmonious display indications of current aircraft motions and required corrective action, minimize gust upset tendencies, minimize unsafe aircraft excursions, and maintain satisfactory ride qualities. A preliminary fixed-base piloted simulation verified the analysis and provided a shakedown for a more sophisticated moving-base simulation to be accomplished next. This preliminary simulation utilized a flight scenario concept combining piloting tasks, random turbulence, and discrete gusts to create a high but realistic pilot workload conducive to pilot error and potential upset. The turbulence director (energy management) system significantly reduced pilot workload and minimized unsafe aircraft excursions.

  19. Data, exergy, and energy analysis of a vertical-bore, ground-source heat pump for domestic water heating under simulated occupancy conditions

    DOE PAGESBeta

    Ally, Moonis Raza; Munk, Jeffrey D.; Baxter, Van D.; Gehl, Anthony C.

    2015-05-27

    Evidence is provided to support the view that greater than two-thirds of the energy required to produce domestic hot water may be extracted from the ground, which serves as a renewable energy resource. The case refers to a 345 m2 research house located in Oak Ridge, Tennessee, 36.01 N 84.26 W, in a mixed-humid climate with HDD of 2218 C-days (3993 F-days) and CDD of 723 C-days (1301 F-days). The house is operated under simulated occupancy conditions in which the hot water use protocol is based on the Building America Research Benchmark Definition (Hendron 2008; Hendron and Engebrecht 2010), which captures the water consumption lifestyles of the average family in the United States. The 5.275 kW (1.5-ton) water-to-water ground source heat pump (WW-GSHP) shared the same vertical bore with a 7.56 kW water-to-air ground source heat pump for space conditioning the same house. Energy and exergy analysis of data collected continuously over a twelve-month period provides performance metrics and sources of inherent systemic inefficiencies. Data and analyses are vital to better understand how WW-GSHPs may be further improved to enable the ground to be used as a renewable energy resource.

  20. Data, exergy, and energy analysis of a vertical-bore, ground-source heat pump for domestic water heating under simulated occupancy conditions

    SciTech Connect

    Ally, Moonis Raza; Munk, Jeffrey D.; Baxter, Van D.; Gehl, Anthony C.

    2015-05-27

    Evidence is provided to support the view that greater than two-thirds of the energy required to produce domestic hot water may be extracted from the ground, which serves as a renewable energy resource. The case refers to a 345 m2 research house located in Oak Ridge, Tennessee, 36.01 N 84.26 W, in a mixed-humid climate with HDD of 2218 C-days (3993 F-days) and CDD of 723 C-days (1301 F-days). The house is operated under simulated occupancy conditions in which the hot water use protocol is based on the Building America Research Benchmark Definition (Hendron 2008; Hendron and Engebrecht 2010), which captures the water consumption lifestyles of the average family in the United States. The 5.275 kW (1.5-ton) water-to-water ground source heat pump (WW-GSHP) shared the same vertical bore with a 7.56 kW water-to-air ground source heat pump for space conditioning the same house. Energy and exergy analysis of data collected continuously over a twelve-month period provides performance metrics and sources of inherent systemic inefficiencies. Data and analyses are vital to better understand how WW-GSHPs may be further improved to enable the ground to be used as a renewable energy resource.
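
    A back-of-the-envelope sketch of the quantities behind such an energy/exergy analysis, using hypothetical numbers rather than the ORNL data set: the specific flow-exergy definition, the first-law COP, and the share of delivered heat drawn from the ground loop (which is what supports the "greater than two-thirds" statement whenever the COP exceeds 3).

```python
# Sketch only: energy/exergy bookkeeping for a water-heating heat pump.
T0 = 288.15          # dead-state (ambient) temperature, K

def flow_exergy(h, s, h0, s0, T0=T0):
    """Specific flow exergy, kJ/kg: ex = (h - h0) - T0*(s - s0)."""
    return (h - h0) - T0 * (s - s0)

# Hypothetical hot-water state (~50 C) referenced to ~15 C water.
ex_hot = flow_exergy(h=209.3, s=0.7038, h0=62.98, s0=0.2245)

# Hypothetical monthly totals: first-law COP and the ground-sourced share of heat.
Q_water = 10.0       # heat delivered to domestic hot water, kWh
W_elec = 2.8         # compressor + pump electric input, kWh
Q_ground = Q_water - W_elec

print(f"flow exergy   = {ex_hot:.1f} kJ/kg of delivered hot water")
print(f"COP           = {Q_water / W_elec:.2f}")
print(f"ground share  = {Q_ground / Q_water:.1%} of delivered heat")
```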

  1. The mesoscale forcing of a midlatitude upper-tropospheric jet streak by a simulated convective system. 2: Kinetic energy and resolution analysis

    NASA Technical Reports Server (NTRS)

    Wolf, Bart J.; Johnson, D. R.

    1995-01-01

    A kinetic energy (KE) analysis of the forcing of a mesoscale upper-tropospheric jet streak by organized diabatic processes within the simulated convective system (SCS) that was discussed in Part 1 is presented in this study. The relative contributions of the ageostrophic components of motion to the generation of KE of the convectively generated jet streak are compared, along with the KE generation by the rotational (nondivergent) and irrotational (divergent) mass transport. The sensitivity of the numerical simulations of SCS development to resolution is also briefly examined. Analysis within isentropic coordinates provides for an explicit determination of the influence of the diabatic processes on the generation of KE. The upper-level production of specific KE is due predominantly to the inertial advective ageostrophic component (IAD), and as such represents the primary process through which the KE of the convectively generated jet streak is realized. A secondary contribution by the inertial diabatic (IDI) term is observed. Partitioning the KE generation into its rotational and irrotational components reveals that the latter, which is directly linked to the diabatic heating within the SCS through isentropic continuity requirements, is the ultimate source of KE generation as the global area integral of generation by the rotational component vanishes. Comparison with an identical dry simulation reveals that the net generation of KE must be attributed to latent heating. Both the IAD and IDI ageostrophic components play important roles in this regard. Examination of results from simulations conducted at several resolutions supports the previous findings in that the effects of diabatic processes and ageostrophic motion on KE generation remain consistent. Resolution does impact the location and timing of SCS development, a result that has important implications in forecasting the onset of convection that develops from evolution of the large-scale flow and moisture
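
    For reference, the partitioning described above is usually written by decomposing the horizontal wind into nondivergent and irrotational parts, V = V_psi + V_chi; the kinetic-energy generation is the work done against the geopotential gradient, and the rotational contribution vanishes when integrated over a closed global domain. This is a standard identity, not an equation quoted from the paper:

```latex
G \;=\; -\,\mathbf{V}\cdot\nabla\Phi
  \;=\; -\,\mathbf{V}_{\psi}\cdot\nabla\Phi \;-\; \mathbf{V}_{\chi}\cdot\nabla\Phi,
\qquad
\int_{A}\mathbf{V}_{\psi}\cdot\nabla\Phi\,dA
  \;=\; \int_{A}\nabla\!\cdot\!\bigl(\Phi\,\mathbf{V}_{\psi}\bigr)\,dA \;=\; 0
```

    since the nondivergent part satisfies ∇·V_ψ = 0 and boundary contributions vanish on a closed (global) domain, leaving the irrotational (divergent) part as the net source of kinetic energy.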

  2. JASMINE -- Data analysis and simulation --

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Kobayashi, Y.; Yano, T.; Jasmine Working Group

    The Japan Astrometry Satellite Mission for INfrared Exploration (JASMINE) project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations of error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. We construct a simulation system that includes all elements of JASMINE, such as observation techniques, models of instruments and bus design, orbit, data transfer, and data analysis, in order to resolve all issues that can be anticipated beforehand and to make it easier to cope with unexpected problems that might occur during the mission. For position reconstruction, we must determine the positions of stars and the instrument attitudes simultaneously; doing this requires overlapping focal-plane images with high accuracy. We use a statistical method, as used in Support Vector Machines, for optimizing the observation program. To obtain high-accuracy observations we are also required to detect attitude errors from the mission data; Principal Component Analysis is effective for this purpose. In this poster presentation we explain topics on the data analysis and system simulation for the JASMINE project.

  3. Energy Sector Market Analysis

    SciTech Connect

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and Intergovernmental Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  4. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  5. A Simulated Growth Hormone Analysis

    NASA Astrophysics Data System (ADS)

    Harris, Mary

    1996-08-01

    Growth hormone is a drug that is sometimes abused by amateur or professional athletes for performance enhancement. This laboratory is a semimicroscale simulation analysis of a sample of "urine" to detect proteins of two very different molecular weights. Gel filtration uses a 10 mL disposable pipette packed with Sephadex. Students analyze the fractions from the filtration by comparing the color of the Coomassie Brilliant Blue dye, as it interacts with the proteins in the sample, against a set of standards of known protein concentration treated with the dye. The simulated analysis of growth hormone is intended to be included in a unit on organic chemistry or in the second year of high school chemistry.
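
    A small illustration of the comparison step, assuming a Bradford-type standard curve with hypothetical absorbance values (not numbers from the published activity): a line is fit to standards of known protein concentration and the unknown's concentration is read off the fit.

```python
# Illustrative standard-curve fit for a Coomassie (Bradford-type) protein assay.
# All absorbance and concentration values are hypothetical.
import numpy as np

std_conc = np.array([0.0, 0.2, 0.4, 0.6, 0.8])       # mg/mL protein standards
std_abs  = np.array([0.02, 0.15, 0.29, 0.42, 0.55])  # absorbance at 595 nm

slope, intercept = np.polyfit(std_conc, std_abs, 1)  # linear calibration

sample_abs = 0.33
sample_conc = (sample_abs - intercept) / slope
print(f"Estimated protein concentration: {sample_conc:.2f} mg/mL")
```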

  6. Multiphysics simulations of nanoarchitectures and analysis of germanium core-shell anode nanostructure for lithium-ion energy storage applications

    NASA Astrophysics Data System (ADS)

    Clancy, T.; Rohan, J. F.

    2015-12-01

    This paper reports multiphysics simulations (COMSOL) of relatively low-conductivity cathode oxide materials in nanoarchitectures that operate within the appropriate potential range (cut-off voltage 2.5 V) at 3 times the C-rate of micron-scale thin-film materials while still accessing 90% of the material. This paper also reports a novel anode fabrication of Ge sputtered on a Cu nanotube current collector for lithium-ion batteries. Ge on Cu nanotubes is shown to alleviate the effect of volume expansion, enhancing mechanical stability at the nanoscale and improving the electronic characteristics for increased rate capability.

  7. Simulation based analysis of laser beam brazing

    NASA Astrophysics Data System (ADS)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are demonstrated using the laser power as an example. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  8. Dynamic Analysis of Nuclear Energy System Strategies

    Energy Science and Technology Software Center (ESTSC)

    2004-06-17

    DANESS is an integrated process model for nuclear energy systems allowing the simulation of multiple reactors and fuel cycles in a continuously changing nuclear reactor park configuration. The model is energy-demand driven and simulates all nuclear fuel cycle facilities, up to 10 reactors and fuels. Reactor and fuel cycle facility history are traced and the cost of generating energy is calculated per reactor and for the total nuclear energy system. The DANESS model aims at performing dynamic systems analysis of nuclear energy development and is used for integrated analysis of development paths for nuclear energy, parameter scoping for new nuclear energy systems, economic analysis of nuclear energy, government role analysis, and education.

  9. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    SciTech Connect

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  10. Crisis on Mars: Classroom Energy Simulation.

    ERIC Educational Resources Information Center

    Pribble, Donald A.

    1979-01-01

    Described in this article is an energy conservation simulation game in which students participate in a space mission to Mars. Activities such as decision making, valuing, and problem solving occur during the game. (SA)

  11. Influence of the R823W mutation on the interaction of the ANKS6-ANKS3: insights from molecular dynamics simulation and free energy analysis.

    PubMed

    Kan, Wei; Fang, Fengqin; Chen, Lin; Wang, Ruige; Deng, Qigang

    2016-05-01

    The sterile alpha motif (SAM) domain of the protein ANKS6, a protein-protein interaction domain, is responsible for autosomal dominant polycystic kidney disease. Although the disease is the result of the R823W point mutation in the SAM domain of the protein ANKS6, the molecular details are still unclear. We applied molecular dynamics simulations, principal component analysis, and the molecular mechanics Poisson-Boltzmann surface area binding free energy calculation to explore the structural and dynamic effects of the R823W point mutation on the complex ANKS6-ANKS3 (PDB ID: 4NL9) in comparison to the wild-type proteins. The energetic analysis indicates that the wild type has a more stable structure than the mutant. The R823W point mutation not only disrupts the structure of the ANKS6 SAM domain but also negatively affects the interaction of the ANKS6-ANKS3. These results complement previous experiments toward a comprehensive understanding of the ANKS6-ANKS3 interaction. In summary, this study provides useful insight into the interaction of these proteins and their deleterious effect on kidney function. PMID:26295479
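
    The molecular mechanics Poisson-Boltzmann surface area (MM-PBSA) estimate referenced above has the standard form shown below, with ensemble averages taken over MD snapshots; the entropy term is often approximated by normal-mode analysis or omitted (the paper's specific settings are not given in the abstract).

```latex
\Delta G_{\mathrm{bind}} \;=\; \langle G_{\mathrm{complex}}\rangle
  \;-\; \langle G_{\mathrm{receptor}}\rangle \;-\; \langle G_{\mathrm{ligand}}\rangle,
\qquad
G \;=\; E_{\mathrm{MM}} \;+\; G_{\mathrm{PB}} \;+\; G_{\mathrm{SA}} \;-\; T\,S
```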

  12. Simulation of proton-induced energy deposition in integrated circuits

    NASA Technical Reports Server (NTRS)

    Fernald, Kenneth W.; Kerns, Sherra E.

    1988-01-01

    A time-efficient simulation technique was developed for modeling the energy deposition by incident protons in modern integrated circuits. To avoid the excessive computer time required by many proton-effects simulators, a stochastic method was chosen to model the various physical effects responsible for energy deposition by incident protons. Using probability density functions to describe the nuclear reactions responsible for most proton-induced memory upsets, the simulator determines the probability of a proton hit depositing the energy necessary for circuit destabilization. This factor is combined with various circuit parameters to determine the expected error rate in a given proton environment. An analysis of transient or dose-rate effects is also performed. A comparison to experimental energy-deposition data proves the simulator to be quite accurate for predicting the expected number of events in certain integrated circuits.
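
    A minimal Monte Carlo sketch of the stochastic approach described above, with an assumed exponential deposited-energy spectrum and hypothetical reaction probability and critical energy (none of these numbers come from the paper): sample the energy deposited by reaction products and count events that exceed the critical deposition.

```python
# Sketch only: stochastic estimate of proton-induced upset probability.
import numpy as np

rng = np.random.default_rng(42)
n_protons = 1_000_000
reaction_prob = 1e-5        # chance a proton reacts inside the sensitive volume (assumed)
mean_deposit_MeV = 2.0      # assumed mean of the deposited-energy spectrum
critical_MeV = 5.0          # deposition needed to upset the memory cell (assumed)

n_reactions = rng.binomial(n_protons, reaction_prob)
deposits = rng.exponential(mean_deposit_MeV, size=n_reactions)
upsets = int(np.sum(deposits > critical_MeV))

print(f"{n_reactions} reactions, {upsets} upsets "
      f"-> upset probability per proton ~ {upsets / n_protons:.2e}")
```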

  13. Advancement of DOE's EnergyPlus Building Energy Simulation Program

    SciTech Connect

    Gu, Lixing; Shirey, Don; Raustad, Richard; Nigusse, Bereket; Sharma, Chandan; Lawrie, Linda; Strand, Rick; Pedersen, Curt; Fisher, Dan; Lee, Edwin; Witte, Mike; Glazer, Jason; Barnaby, Chip

    2011-09-30

    EnergyPlus{sup TM} is a new generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making of zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. The 5-year project was managed by the National Energy Technology Laboratory and was divided into 5 budget periods between 2006 and 2011. During the project period, 11 versions of EnergyPlus were released. This report summarizes work performed by an EnergyPlus development team led by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC). The team members consist of DHL Consulting, C. O. Pedersen Associates, University of Illinois at Urbana-Champaign, Oklahoma State University, GARD Analytics, Inc., and WrightSoft Corporation. The project tasks involved new feature development, testing and validation, user support and training, and general EnergyPlus support. The team developed 146 new features during the 5-year period to advance the EnergyPlus capabilities. Annual contributions of new features are 7 in budget period 1, 19 in period 2, 36 in period 3, 41 in period 4, and 43 in period 5, respectively. The testing and validation task focused on running the test suite and publishing reports, developing new IEA test suite cases, testing and validating new source code, addressing change requests, and creating and testing installation packages. The user support and training task provided support for users and interface developers, and organized and taught workshops. The general support task involved upgrading StarTeam (team sharing) software and updating existing utility software. The project met the DOE objectives and completed all tasks successfully. Although the EnergyPlus software was enhanced significantly

  14. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    NASA Technical Reports Server (NTRS)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  15. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems and has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.

  16. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    SciTech Connect

    Not Available

    2012-02-01

    A new test procedure evaluates the quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
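
    A sketch of the kind of acceptance check BESTEST-EX enables, with hypothetical numbers rather than published reference results: a tool's predicted retrofit savings is compared against the range spanned by the reference simulation engines.

```python
# Sketch only: check a predicted retrofit savings value against a reference range.
def within_reference(predicted, reference_results):
    lo, hi = min(reference_results), max(reference_results)
    return lo <= predicted <= hi, (lo, hi)

# Hypothetical reference-engine results and tool prediction (kWh/yr saved).
reference_savings_kwh = [2100.0, 2350.0, 2480.0]
ok, (lo, hi) = within_reference(2300.0, reference_savings_kwh)
print(f"prediction within reference range [{lo}, {hi}]: {ok}")
```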

  17. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    PubMed

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present the calculation of high-precision Arrhenius plots with Qgui, obtained by running a large number of EVB simulations, to extract the thermodynamic activation enthalpy and entropy. PMID:26080356
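
    The linear interaction energy (LIE) estimate named above has the standard form below, where the averages are ligand-surrounding van der Waals and electrostatic interaction energies taken over MD of the bound and free states, and α, β, γ are empirical coefficients (the abstract does not quote the specific parameterization used).

```latex
\Delta G_{\mathrm{bind}} \;\approx\;
  \alpha\left(\langle V^{\mathrm{vdW}}_{\ell\text{-}s}\rangle_{\mathrm{bound}}
            - \langle V^{\mathrm{vdW}}_{\ell\text{-}s}\rangle_{\mathrm{free}}\right)
+ \beta \left(\langle V^{\mathrm{el}}_{\ell\text{-}s}\rangle_{\mathrm{bound}}
            - \langle V^{\mathrm{el}}_{\ell\text{-}s}\rangle_{\mathrm{free}}\right)
+ \gamma
```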

  18. Performance calculation and simulation system of high energy laser weapon

    NASA Astrophysics Data System (ADS)

    Wang, Pei; Liu, Min; Su, Yu; Zhang, Ke

    2014-12-01

    High energy laser weapons are ready for some of today's most challenging military applications. Based on an analysis of the main tactical/technical indices and the engagement process of a high energy laser weapon, a performance calculation and simulation system was established. Firstly, the index decomposition and workflow of the high energy laser weapon were proposed. The entire system is composed of six parts: a classical target module, a laser weapon platform module, a detection sensor module, a tracking and pointing control module, a laser atmospheric propagation module, and a damage assessment module. Then, the index calculation modules were designed. Finally, an anti-missile interception simulation was performed. The system can provide a reference and basis for the analysis and evaluation of high energy laser weapon efficiency.

  19. Revolutions in energy through modeling and simulation

    SciTech Connect

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies for all aspects from generation to storage have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened their extent and use and, through the increased fidelity (i.e., accuracy) of the models, have shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower cost projects.

  20. Simulation of Energy Management Systems in EnergyPlus

    SciTech Connect

    Ellis, P. G.; Torcellini, P. A.; Crawley, D.

    2008-01-01

    An energy management system (EMS) is a dedicated computer that can be programmed to control all of a building's energy-related systems, including heating, cooling, ventilation, hot water, interior lighting, exterior lighting, on-site power generation, and mechanized systems for shading devices, window actuators, and double facade elements. Recently a new module for simulating an EMS was added to the EnergyPlus whole-building energy simulation program. An essential part of the EMS module is the EnergyPlus Runtime Language (ERL), which is a simple programming language that is used to specify the EMS control algorithms. The new EMS controls and the flexibility of ERL allow EnergyPlus to simulate many novel control strategies that are not possible with the previous generation of building energy simulation programs. This paper surveys the standard controls in EnergyPlus, presents the new EMS features, describes the implementation of the module, and explores some of the possible applications for the new EMS capabilities in EnergyPlus.
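
    The control algorithms themselves are written in the EnergyPlus Runtime Language; the Python sketch below is not ERL but only illustrates the kind of supervisory rule an EMS program expresses (relaxing a cooling setpoint under hot outdoor conditions as a simple demand-limiting strategy). All names and thresholds are hypothetical.

```python
# Illustration only (not ERL): a supervisory setpoint-override rule of the kind
# an EMS program would encode. Thresholds and offsets are hypothetical.
def cooling_setpoint_override(outdoor_temp_c, base_setpoint_c=24.0):
    """Return the cooling setpoint a supervisory rule would command."""
    if outdoor_temp_c > 32.0:        # peak conditions: shed cooling load
        return base_setpoint_c + 2.0
    if outdoor_temp_c < 18.0:        # mild conditions: little mechanical cooling needed
        return base_setpoint_c + 4.0
    return base_setpoint_c

for t_out in (15.0, 25.0, 35.0):
    print(f"outdoor {t_out:.0f} C -> cooling setpoint {cooling_setpoint_override(t_out):.1f} C")
```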

  1. Simulation of Flywheel Energy Storage System Controls

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Wolff, Frederick J.; Dravid, Narayan

    2001-01-01

    This paper presents the progress made in the controller design and operation of a flywheel energy storage system. The switching logic for the converter bridge circuit has been redefined to reduce line current harmonics, even at the highest operating speed of the permanent magnet motor-generator. An electromechanical machine model is utilized to simulate charging and discharging of the inertial energy stored in the flywheel. Controlling the magnitude of phase currents regulates the rate of charge and discharge. The resulting improvements are demonstrated by simulation.
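
    A back-of-the-envelope sketch of the energy balance such a simulation tracks, with illustrative numbers rather than the NASA hardware parameters: stored energy E = ½Jω² and a charge/discharge rate set by the commanded torque, which the controller realizes by regulating the phase currents.

```python
# Sketch only: flywheel energy storage bookkeeping with hypothetical parameters.
import math

J = 0.5                       # rotor moment of inertia, kg*m^2 (assumed)
omega = 2 * math.pi * 500.0   # 500 rev/s = 30,000 rpm, in rad/s (assumed)
torque = 0.2                  # commanded motor torque, N*m (assumed)

energy_J = 0.5 * J * omega**2      # stored kinetic energy
charge_rate_W = torque * omega     # dE/dt = tau * omega

print(f"stored energy ~ {energy_J / 1000:.0f} kJ, charging at ~ {charge_rate_W:.0f} W")
```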

  2. Simulation and Analysis of SNRs in LAT Data Challenge 2

    SciTech Connect

    Tibolla, O.; Busetto, G.; Digel, S.; Longo, F.

    2007-07-12

    In 2006 the GLAST LAT collaboration organized a detailed simulation of 55 days of the gamma-ray sky and particle background in orbit to test the simulation and analysis tools of the collaboration. For this simulation, designated Data Challenge 2 (DC2), empirical models for SNRs as gamma-ray sources in the energy range of the LAT were developed, in most cases informed by X-ray or gamma-ray observations. The development of these models and an example of analysis of one of the simulated SNRs are described here.

  3. Simulation And Analysis of SNRs in LAT Data Challenge 2

    SciTech Connect

    Tibolla, O.; Busetto, G.; Digel, S.; Longo, F.; /Trieste U. /INFN, Trieste

    2007-11-13

    In 2006 the GLAST LAT collaboration organized a detailed simulation of 55 days of the gamma-ray sky and particle background in orbit to test the simulation and analysis tools of the collaboration. For this simulation, designated Data Challenge 2 (DC2), empirical models for SNRs as gamma-ray sources in the energy range of the LAT were developed, in most cases informed by X-ray or gamma-ray observations. The development of these models and an example of analysis of one of the simulated SNRs are described here.

  4. Accuracy analysis of distributed simulation systems

    NASA Astrophysics Data System (ADS)

    Lin, Qi; Guo, Jing

    2010-08-01

    Existing simulation work tends to emphasize procedural verification, which puts too much focus on the simulation models rather than on the simulation itself. As a result, research on improving simulation accuracy has often been limited to individual aspects. Because accuracy is key to simulation credibility assessment and fidelity studies, it is important to give an all-round discussion of the accuracy of distributed simulation systems themselves. First, the major elements of distributed simulation systems are summarized; these serve as the basis for defining, classifying, and describing the accuracy of distributed simulation systems. In Part 2, the framework of accuracy of distributed simulation systems is presented in a comprehensive way, which makes it easier to analyze and assess the uncertainty of such systems. In Part 3, the concept of accuracy of distributed simulation systems is decomposed into four factors, each analyzed in turn. In Part 4, based on the formalized description of the accuracy-analysis framework for distributed simulation systems, a practical approach is put forward that can be applied to study unexpected or inaccurate simulation results. Following this, a real distributed simulation system based on HLA is taken as an example to verify the usefulness of the proposed approach. The results show that the method works well and is applicable to the accuracy analysis of distributed simulation systems.

  5. Residential Building Energy Analysis

    Energy Science and Technology Software Center (ESTSC)

    1990-09-01

    PEAR (Program for Energy Analysis of Residences) provides an easy-to-use and accurate method of estimating the energy and cost savings associated with various energy conservation measures in site-built single-family homes. Measures such as ceiling, wall, and floor insulation; different window type and glazing layers; infiltration levels; and equipment efficiency can be considered. PEAR also allows the user to consider the effects of roof and wall color, movable night insulation on the windows, reflective and heat absorbing glass, an attached sunspace, and use of a night temperature setback. Regression techniques permit adjustments for different building geometries, window areas and orientations, wall construction, and extension of the data to 880 U.S. locations determined by climate parameters. Based on annual energy savings, user-specified costs of conservation measures, fuel, lifetime of measure, loan period, and fuel escalation and interest rates, PEAR calculates two economic indicators: the Simple Payback Period (SPP) and the Savings-to-Investment Ratio (SIR). Energy and cost savings of different sets of conservation measures can be compared in a single run. The program can be used both as a research tool by energy policy analysts and as a method for nontechnical energy calculation by architects, home builders, home owners, and others in the building industry.

  6. Residential Building Energy Analysis

    SciTech Connect

    Ritschard, R. L.

    1990-09-01

    PEAR (Program for Energy Analysis of Residences) provides an easy-to-use and accurate method of estimating the energy and cost savings associated with various energy conservation measures in site-built single-family homes. Measures such as ceiling, wall, and floor insulation; different window type and glazing layers; infiltration levels; and equipment efficiency can be considered. PEAR also allows the user to consider the effects of roof and wall color, movable night insulation on the windows, reflective and heat absorbing glass, an attached sunspace, and use of a night temperature setback. Regression techniques permit adjustments for different building geometries, window areas and orientations, wall construction, and extension of the data to 880 U.S. locations determined by climate parameters. Based on annual energy savings, user-specified costs of conservation measures, fuel, lifetime of measure, loan period, and fuel escalation and interest rates, PEAR calculates two economic indicators: the Simple Payback Period (SPP) and the Savings-to-Investment Ratio (SIR). Energy and cost savings of different sets of conservation measures can be compared in a single run. The program can be used both as a research tool by energy policy analysts and as a method for nontechnical energy calculation by architects, home builders, home owners, and others in the building industry.
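
    A minimal sketch of the two economic indicators PEAR reports, using their standard definitions and illustrative inputs (the escalation and discount handling here is a generic present-value form, not necessarily PEAR's exact formulation).

```python
# Sketch only: Simple Payback Period and Savings-to-Investment Ratio.
def simple_payback(cost, annual_savings):
    return cost / annual_savings

def savings_to_investment(cost, annual_savings, years, discount_rate, fuel_escalation):
    """Present value of escalating fuel savings divided by first cost."""
    pv = sum(annual_savings * (1 + fuel_escalation) ** t / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv / cost

# Hypothetical measure: ceiling insulation costing $1200, saving $180/yr.
cost, savings = 1200.0, 180.0
print(f"SPP = {simple_payback(cost, savings):.1f} yr")
print(f"SIR = {savings_to_investment(cost, savings, 20, 0.05, 0.03):.2f}")
```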

  7. WEC-Sim (Wave Energy Converter - SIMulator)

    Energy Science and Technology Software Center (ESTSC)

    2014-11-26

    WEC-Sim (Wave Energy Converter SIMulator) is a code developed by Sandia National Laboratories and the National Renewable Energy Laboratory to model wave energy converters (WECs) when they are subject to operational waves. The code is a time-domain modeling tool developed in MATLAB/Simulink using the multi-body dynamics solver SimMechanics. In WEC-Sim, WECs are modeled by connecting rigid bodies to one another with joint or constraint blocks from the WEC-Sim library. WEC-Sim is a publicly available, open-source code to model WECs.

  8. WEC-Sim (Wave Energy Converter - SIMulator)

    SciTech Connect

    2014-11-26

    WEC-Sim (Wave Energy Converter SIMulator) is a code developed by Sandia National Laboratories and the National Renewable Energy Laboratory to model wave energy converters (WECs) when they are subject to operational waves. The code is a time-domain modeling tool developed in MATLAB/Simulink using the multi-body dynamics solver SimMechanics. In WEC-Sim, WECs are modeled by connecting rigid bodies to one another with joint or constraint blocks from the WEC-Sim library. WEC-Sim is a publicly available, open-source code to model WECs.

  9. Numerical simulation of magma energy extraction

    SciTech Connect

    Hickox, C.E.

    1991-01-01

    The Magma Energy Program is a speculative endeavor regarding the practical utility of electrical power production from the thermal energy which resides in magma. The systematic investigation has identified a number of research areas which have application to the utilization of magma energy and to the field of geothermal energy. Eight topics were identified which involve thermal processes and which are areas for the application of the techniques of numerical simulation. These areas are: (1) two-phase flow of the working fluid in the wellbore, (2) thermodynamic cycles for the production of electrical power, (3) optimization of the entire system, (4) solidification and fracturing of the magma caused by the energy extraction process, (5) heat transfer and fluid flow within an open, direct-contact heat exchanger, (6) thermal convection in the overlying geothermal region, (7) thermal convection within the magma body, and (8) induced natural convection near the thermal energy extraction device. Modeling issues have been identified which will require systematic investigation in order to develop the most appropriate strategies for numerical simulation. It appears that numerical simulations will be of ever increasing importance to the study of geothermal processes as the size and complexity of the systems of interest increase. It is anticipated that, in the future, greater emphasis will be placed on the numerical simulation of large-scale, three-dimensional, transient, mixed convection in viscous flows and porous media. Increased computational capabilities, e.g., massively parallel computers, will allow for the detailed study of specific processes in fractured media, non-Darcy effects in porous media, and non-Newtonian effects. 23 refs., 13 figs., 1 tab.

  10. Simulation of High Energy Density Laboratory Plasmas

    NASA Astrophysics Data System (ADS)

    Guzik, Joyce

    2004-05-01

    High Energy Density plasmas are found in astrophysical environments, have been generated in past underground nuclear tests, and can be created in the laboratory by, e.g., laser or pulsed-power experiments. These experiments can be used to validate simulation capabilities that are being developed to advance our understanding of plasma physics, and to develop predictive capabilities for HED plasma applications such as fusion energy. In this talk we will briefly introduce the subject of simulating HED plasmas using radiation hydrodynamics codes. We will give examples of simple test problems, showing how a problem is approached, including geometry specifications, simplifying assumptions, zoning, initial and boundary conditions, basic data on opacities and EOS, and illustrate sensitivities of results to variations. We will also show highlights of work at Los Alamos to validate codes, provide basic data, and develop applications, for example: 1) studying phenomena such as Rayleigh-Taylor and Richtmyer-Meshkov instabilities, ablation, and supersonic jets at the Omega laser in Rochester and the Sandia Z Machine; 2) quantum molecular dynamics simulations which have recently led to a semi-classical, particle-particle particle-mesh code that allows ultra-fast simulations involving tens of thousands of particles to calculate properties of hot dense plasmas; 3) efforts to experimentally demonstrate the physics basis for magnetized target fusion (MTF), a potentially low cost path to fusion, intermediate in plasma regime between magnetic and inertial fusion energy.

  11. A simulation of high energy cosmic ray propagation 1

    NASA Technical Reports Server (NTRS)

    Honda, M.; Kifune, T.; Matsubara, Y.; Mori, M.; Nishijima, K.; Teshima, M.

    1985-01-01

    High-energy cosmic-ray propagation in the energy region 10^14.5 - 10^18 eV is simulated under interstellar conditions. In conclusion, the diffusion process in turbulent magnetic fields is classified into several regimes by the ratio of the gyroradius to the scale of the turbulence. When the ratio becomes larger than 10^-0.5, an analysis under the assumption of point scattering can be applied, with the mean free path scaling as E^2. However, when the ratio is smaller than 10^-0.5, a more complicated analysis or simulation is needed. Assuming the turbulence scale of the Galactic magnetic field is 10-30 pc and the mean magnetic field strength is 3 microgauss, the energy of a cosmic ray with that gyroradius is about 10^16.5 eV.
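
    A quick numerical check of the closing estimate, assuming an ultra-relativistic proton with gyroradius r_g = E/(ZeBc): with B = 3 microgauss, gyroradii of 10-30 pc indeed correspond to energies of roughly 10^16.4-10^16.9 eV, consistent with the quoted ~10^16.5 eV.

```python
# Gyroradius of an ultra-relativistic proton: r_g = E / (Z e B c).
# With E given in eV, the elementary charge cancels numerically.
C = 3.0e8        # speed of light, m/s
PC = 3.086e16    # metres per parsec
B = 3e-10        # 3 microgauss, in tesla

def gyroradius_pc(energy_eV, Z=1):
    return energy_eV / (Z * B * C) / PC

for exponent in (16.0, 16.5, 17.0):
    E = 10 ** exponent
    print(f"E = 10^{exponent} eV -> r_g ~ {gyroradius_pc(E):.1f} pc")
```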

  12. Simulating the energy performance of holographic glazings

    NASA Astrophysics Data System (ADS)

    Papamichael, K.; Beltran, L.; Furler, Reto; Lee, E. S.; Selkowitz, Steven E.; Rubin, Michael

    1994-09-01

    The light diffraction properties of holographic diffractive structures present an opportunity to improve the daylight performance in side-lit office spaces by redirecting and reflecting sunlight off the ceiling, providing adequate daylight illumination up to 30 ft (9.14 m) from the window wall. Prior studies of prototypical holographic glazings, installed above conventional `view' windows, have shown increased daylight levels over a deeper perimeter area than clear glass, for selected sun positions. In this study, we report on the simulation of the energy performance of prototypical holographic glazings assuming a commercial office building in the inland Los Angeles climate. The simulation of the energy performance involved determination of both luminous and thermal performance. Since the optical complexity of holographic glazings prevented the use of conventional algorithms for the simulation of their luminous performance, we used a newly developed method that combines experimentally determined directional workplane illuminance coefficients with computer-based analytical routines to determine a comprehensive set of daylight factors for many sun positions. These daylight factors were then used within the DOE-2.1D energy simulation program to determine hourly daylight and energy performance over the course of an entire year for four window orientations. Since the prototypical holographic diffractive structures considered in this study were applied on single pane clear glass, we also simulated the performance of hypothetical glazings, assuming the daylight performance of the prototype holographic glazings and the thermal performance of double-pane and low-e glazings. Finally, we addressed various design and implementation issues towards potential performance improvement.

  13. Reasoning about energy in qualitative simulation

    NASA Technical Reports Server (NTRS)

    Fouche, Pierre; Kuipers, Benjamin J.

    1992-01-01

    While the possible behaviors of a mechanism that are consistent with an incomplete state of knowledge can be predicted through qualitative modeling and simulation, spurious behaviors corresponding to no solution of any ordinary differential equation consistent with the model may be generated. The present method for energy-related reasoning eliminates an important source of spurious behaviors, as demonstrated by its application to a nonlinear, proportional-integral controller. It is shown that qualitative properties of such a system, such as stability and zero-offset control, are captured by the simulation.

  14. Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing

    SciTech Connect

    Polly, B.

    2012-01-09

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market.

  15. Error analysis using organizational simulation.

    PubMed Central

    Fridsma, D. B.

    2000-01-01

    Organizational simulations have been used by project organizations in civil and aerospace industries to identify work processes and organizational structures that are likely to fail under certain conditions. Using a simulation system based on Galbraith's information-processing theory and Simon's notion of bounded-rationality, we retrospectively modeled a chemotherapy administration error that occurred in a hospital setting. Our simulation suggested that when there is a high rate of unexpected events, the oncology fellow was differentially backlogged with work when compared with other organizational members. Alternative scenarios suggested that providing more knowledge resources to the oncology fellow improved her performance more effectively than adding additional staff to the organization. Although it is not possible to know whether this might have prevented the error, organizational simulation may be an effective tool to prospectively evaluate organizational "weak links", and explore alternative scenarios to correct potential organizational problems before they generate errors. PMID:11079885

  16. Free energy of steps using atomistic simulations

    NASA Astrophysics Data System (ADS)

    Freitas, Rodrigo; Frolov, Timofey; Asta, Mark

    The properties of solid-liquid interfaces are known to play critical roles in solidification processes. Particular importance is attached to the thermodynamic quantities that describe the equilibrium state of these surfaces. For example, in the solid-liquid-vapor heteroepitaxial growth of semiconductor nanowires, the crystal nucleation process on the faceted solid-liquid interface is influenced by the solid-liquid and vapor-solid interfacial free energies, and also by the free energies of the associated steps at these faceted interfaces. Crystal-growth theories and mesoscale simulation methods depend on quantitative information about these properties, which are often poorly characterized by experimental measurements. In this work we propose an extension of the capillary fluctuation method for the calculation of the free energy of steps on faceted crystal surfaces. From equilibrium atomistic simulations of steps on (111) surfaces of copper we accurately computed the step free energy for different step orientations. We show that the step free energy remains finite at all temperatures up to the melting point and that the results obtained agree with the better-established method of thermodynamic integration once finite-size effects are taken into account. The research of RF and MA at UC Berkeley was supported by the US National Science Foundation (Grant No. DMR-1105409). TF acknowledges support through a postdoctoral fellowship from the Miller Institute for Basic Research in Science.
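
    In the capillary fluctuation approach, the equilibrium Fourier spectrum of the step-position profile is related to the step stiffness through an equipartition relation of the form <|h_k|^2> = k_B T / (L * stiffness * k^2). The sketch below illustrates how a stiffness estimate could be extracted from sampled step profiles; the data arrays and the normalization convention are assumptions for illustration, not the authors' code.

        # Illustrative sketch (hypothetical data and normalization convention; not the
        # authors' code): estimate a step stiffness from the capillary fluctuation
        # spectrum <|h_k|^2> ~ kB*T / (L * stiffness * k^2).
        import numpy as np

        def step_stiffness(profiles, L, kB_T):
            """profiles: (n_samples, n_bins) array of step positions h(x) along a step of length L."""
            n_bins = profiles.shape[1]
            dx = L / n_bins
            hk = np.fft.rfft(profiles, axis=1) * dx            # Fourier modes of each sampled profile
            k = 2.0 * np.pi * np.fft.rfftfreq(n_bins, d=dx)    # wave numbers
            spectrum = np.mean(np.abs(hk) ** 2, axis=0) / L    # <|h_k|^2>, averaged over samples
            long_wave = slice(1, 6)                            # fit long-wavelength modes only (skip k = 0)
            return float(np.mean(kB_T / (spectrum[long_wave] * k[long_wave] ** 2)))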

  17. Plans for wind energy system simulation

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.

    1978-01-01

    A digital computer code and a special-purpose hybrid computer were introduced. The digital computer program, the Root Perturbation Method or RPM, is an implementation of the classic Floquet procedure which circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time domain simulator (WEST), yields real-time loads and deformation information essential to design and system stability investigations.

  18. Analysis and Optimization of Building Energy Consumption

    NASA Astrophysics Data System (ADS)

    Chuah, Jun Wei

    Energy is one of the most important resources required by modern human society. In 2010, energy expenditures represented 10% of global gross domestic product (GDP). By 2035, global energy consumption is expected to increase by more than 50% from current levels. The increased pace of global energy consumption leads to significant environmental and socioeconomic issues: (i) carbon emissions, from the burning of fossil fuels for energy, contribute to global warming, and (ii) increased energy expenditures lead to reduced standard of living. Efficient use of energy, through energy conservation measures, is an important step toward mitigating these effects. Residential and commercial buildings represent a prime target for energy conservation, comprising 21% of global energy consumption and 40% of the total energy consumption in the United States. This thesis describes techniques for the analysis and optimization of building energy consumption. The thesis focuses on building retrofits and building energy simulation as key areas in building energy optimization and analysis. The thesis first discusses and evaluates building-level renewable energy generation as a solution toward building energy optimization. The thesis next describes a novel heating system, called localized heating. Under localized heating, building occupants are heated individually by directed radiant heaters, resulting in a considerably reduced heated space and significant heating energy savings. To support localized heating, a minimally-intrusive indoor occupant positioning system is described. The thesis then discusses occupant-level sensing (OLS) as the next frontier in building energy optimization. OLS captures the exact environmental conditions faced by each building occupant, using sensors that are carried by all building occupants. The information provided by OLS enables fine-grained optimization for unprecedented levels of energy efficiency and occupant comfort. The thesis also describes a retrofit

  19. Simulation and analysis of solenoidal ion sources

    SciTech Connect

    Alderwick, A. R.; Jardine, A. P.; Hedgeland, H.; MacLaren, D. A.; Allison, W.; Ellis, J.

    2008-12-15

    We present a detailed analysis and simulation of solenoidal, magnetically confined electron bombardment ion sources, aimed at molecular beam detection. The aim is to achieve high efficiency for singly ionized species while minimizing multiple ionization. Electron space charge plays a major role and we apply combined ray tracing and finite element simulations to determine the properties of a realistic geometry. The factors controlling electron injection and ion extraction are discussed. The results from simulations are benchmarked against experimental measurements on a prototype source.

  20. Using Delft3D to Simulate Current Energy Conversion

    NASA Astrophysics Data System (ADS)

    James, S. C.; Chartrand, C.; Roberts, J.

    2015-12-01

    As public concern with renewable energy increases, current energy conversion (CEC) technology is being developed to optimize energy output and minimize environmental impact. CEC turbines generate energy from tidal and current systems and create wakes that interact with turbines located downstream of a device. The placement of devices can greatly influence power generation and structural reliability. CECs can also alter the ecosystem processes surrounding the turbines, such as flow regimes, sediment dynamics, and water quality. Software is needed to investigate specific CEC sites to simulate power generation and hydrodynamic responses of flow through a CEC turbine array. This work validates Delft3D against several flume experiments by simulating the power generation and hydrodynamic response of flow through a turbine or actuator disc(s). Model parameters are then calibrated against these data sets to reproduce momentum removal and wake recovery data with 3-D flow simulations. Simulated wake profiles and turbulence intensities compare favorably to the experimental data and demonstrate the utility and accuracy of a fast-running tool for future siting and analysis of CEC arrays in complex domains.
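
    In actuator-disc representations of this kind, the turbine is modeled as a momentum sink characterized by a thrust coefficient, with the extracted power following from a power coefficient. The sketch below states the basic one-dimensional relations with assumed parameter values; it is purely illustrative and does not reflect Delft3D internals or the calibrated values from this work.

        # Minimal sketch of 1-D actuator-disc relations (assumed parameter values;
        # not Delft3D internals): thrust removed from the flow and power extracted
        # by a turbine in a steady current.
        import math

        RHO_WATER = 1025.0              # sea-water density [kg/m^3]

        def actuator_disc(u0, diameter, ct=0.8, cp=0.4):
            """Return (thrust [N], power [W]) for free-stream speed u0 [m/s]."""
            area = math.pi * (diameter / 2.0) ** 2
            thrust = 0.5 * RHO_WATER * ct * area * u0 ** 2    # momentum sink applied to the flow
            power = 0.5 * RHO_WATER * cp * area * u0 ** 3     # energy extracted by the device
            return thrust, power

        print(actuator_disc(u0=2.0, diameter=10.0))           # roughly 1.3e5 N and 1.3e5 W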

  1. SIMWEST - A simulation model for wind energy storage systems

    NASA Technical Reports Server (NTRS)

    Edsinger, R. W.; Warren, A. W.; Gordon, L. H.; Chang, G. C.

    1978-01-01

    This paper describes a comprehensive and efficient computer program for the modeling of wind energy systems with storage. The level of detail of SIMWEST (SImulation Model for Wind Energy STorage) is consistent with evaluating the economic feasibility as well as the general performance of wind energy systems with energy storage options. The software package consists of two basic programs and a library of system, environmental, and control components. The first program is a precompiler which allows the library components to be put together in building block form. The second program performs the technoeconomic system analysis with the required input/output, and the integration of system dynamics. An example of the application of the SIMWEST program to a current 100 kW wind energy storage system is given.

  2. Structure of the antiviral stavudine using quantum chemical methods: Complete conformational space analysis, 3D potential energy surfaces and solid state simulations

    NASA Astrophysics Data System (ADS)

    Alcolea Palafox, M.; Iza, N.

    2012-11-01

    The molecular structure and energy of the anti-HIV nucleoside analogue 2',3'-didehydro-3'-deoxythymidine (D4T, stavudine or Zerit) were determined using the MP2, B3LYP and B971 quantum chemical methods. The global minimum was determined through 3D potential energy surfaces (PES). These surfaces were built by rotation of the exocyclic χ, γ and β torsional angles, in steps of 20°, with full optimization of the remaining parameters. As a consequence, 5832 geometries were fully optimized. The search located 25 local minima, 4 of which lie, at the MP2 level, within a 2 kcal/mol electronic energy range of the global minimum. The full set of conformational parameters, as well as P and νmax, was analyzed for all the stable conformers. The global minimum by MP2 corresponds to the calculated exocyclic torsional angles χ = -103.6°, β = 63.8° and γ = 60.6°. The results obtained are in accordance with those found for thymidine and for related anti-HIV nucleoside analogues. The effect of hydration on the two most stable conformers is analyzed with continuum and discrete models with up to 20 water molecules. The solid state was also simulated. The dimer forms found in the crystal unit cell were accurately determined and are in accordance with the X-ray data.
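
    The count of 5832 starting geometries follows directly from scanning three torsions in 20° steps (360/20 = 18 values per torsion, and 18^3 = 5832). A short, purely illustrative sketch of enumerating such a grid (not the authors' actual workflow or software) is:

        # Illustrative sketch (not the authors' workflow): enumerate the 20-degree grid
        # over the three exocyclic torsions; 18 * 18 * 18 = 5832 starting geometries.
        from itertools import product

        STEP = 20
        angles = range(-180, 180, STEP)                  # 18 values per torsion
        grid = list(product(angles, repeat=3))           # (chi, beta, gamma) triples
        print(len(grid))                                 # 5832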

  3. Integrating software architectures for distributed simulations and simulation analysis communities.

    SciTech Connect

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  4. Simulating granular materials by energy minimization

    NASA Astrophysics Data System (ADS)

    Krijgsman, D.; Luding, S.

    2016-03-01

    Discrete element methods are extremely helpful in understanding the complex behaviors of granular media, as they give valuable insight into all internal variables of the system. In this paper, a novel discrete element method for performing simulations of granular media is presented, based on the minimization of the potential energy in the system. Contrary to most discrete element methods (i.e., the soft-particle method, the event-driven method, and non-smooth contact dynamics), the system does not evolve by (approximately) integrating Newton's equations of motion in time, but rather by searching for mechanical equilibrium solutions for the positions of all particles in the system, which is mathematically equivalent to locally minimizing the potential energy. The new method allows for the rapid creation of jammed initial conditions (to be used for further studies) and for the simulation of quasi-static deformation problems. The major advantage of the new method is that it allows for truly static deformations. The system does not evolve with time, but rather with the externally applied strain or load, so that there is no kinetic energy in the system, in contrast to other quasi-static methods. The performance of the algorithm is tested for both types of application by examining the number of iterations required for the system to converge to a stable solution. For each single iteration, the required computational effort scales linearly with the number of particles. During the creation of initial conditions, the required number of iterations for two-dimensional systems scales with the square root of the number of particles in the system. The required number of iterations increases for systems closer to the jamming packing fraction. For a quasi-static pure shear deformation simulation, the results of the new method are validated by regular soft-particle dynamics simulations. The energy minimization algorithm is able to capture the evolution of the
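
    The central idea, finding particle positions that locally minimize the total potential energy rather than integrating equations of motion, can be illustrated with a toy two-dimensional system of soft discs interacting through harmonic overlaps. The sketch below uses a generic off-the-shelf minimizer and made-up parameters; it is a conceptual illustration, not the authors' algorithm or an efficient implementation.

        # Conceptual sketch (hypothetical parameters, generic minimizer; not the authors'
        # algorithm): find a mechanical-equilibrium configuration of soft discs by
        # minimizing the total potential energy of pairwise harmonic overlaps.
        import numpy as np
        from scipy.optimize import minimize

        N, RADIUS, BOX, K = 30, 0.5, 5.0, 1.0

        def potential_energy(flat_positions):
            pos = flat_positions.reshape(N, 2)
            energy = 0.0
            for i in range(N):
                for j in range(i + 1, N):
                    overlap = 2.0 * RADIUS - np.linalg.norm(pos[i] - pos[j])
                    if overlap > 0.0:
                        energy += 0.5 * K * overlap ** 2     # harmonic contact energy
            return energy

        rng = np.random.default_rng(0)
        x0 = rng.uniform(0.0, BOX, size=2 * N)               # random initial positions
        result = minimize(potential_energy, x0, method="L-BFGS-B")
        print("residual potential energy:", result.fun)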

  5. Simulation modeling and analysis with Arena

    SciTech Connect

    Tayfur Altiok; Benjamin Melamed

    2007-06-15

    This textbook treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Section 13.3.3 covers coal loading operations on barges/tugboats.

  6. Visual analysis and steering of flooding simulations.

    PubMed

    Ribičić, Hrvoje; Waser, Jürgen; Fuchs, Raphael; Blöschl, Günter; Gröller, Eduard

    2013-06-01

    We present a visualization tool for the real-time analysis of interactively steered ensemble-simulation runs, and apply it to flooding simulations. Simulations are performed on-the-fly, generating large quantities of data. The user wants to make sense of the data as it is created. The tool facilitates understanding of what happens in all scenarios, where important events occur, and how simulation runs are related. We combine different approaches to achieve this goal. To maintain an overview, data are aggregated and embedded into the simulation rendering, showing trends, outliers, and robustness. For a detailed view, we use information-visualization views and interactive visual analysis techniques. A selection mechanism connects the two approaches. Points of interest are selected by clicking on aggregates, supplying data for visual analysis. This allows the user to maintain an overview of the ensemble and perform analysis even as new data are supplied through simulation steering. Unexpected or unwanted developments are detected easily, and the user can focus the exploration on them. The solution was evaluated with two case studies focusing on placing and testing flood defense measures. Both were evaluated by a consortium of flood simulation and defense experts, who found the system to be both intuitive and relevant. PMID:23559514

  7. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage, as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and the urgency of the competing processes this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  8. Scalable Quantum Simulation of Molecular Energies

    NASA Astrophysics Data System (ADS)

    O'Malley, P. J. J.; Babbush, R.; Kivlichan, I. D.; Romero, J.; McClean, J. R.; Barends, R.; Kelly, J.; Roushan, P.; Tranter, A.; Ding, N.; Campbell, B.; Chen, Y.; Chen, Z.; Chiaro, B.; Dunsworth, A.; Fowler, A. G.; Jeffrey, E.; Lucero, E.; Megrant, A.; Mutus, J. Y.; Neeley, M.; Neill, C.; Quintana, C.; Sank, D.; Vainsencher, A.; Wenner, J.; White, T. C.; Coveney, P. V.; Love, P. J.; Neven, H.; Aspuru-Guzik, A.; Martinis, J. M.

    2016-07-01

    We report the first electronic structure calculation performed on a quantum computer without exponentially costly precompilation. We use a programmable array of superconducting qubits to compute the energy surface of molecular hydrogen using two distinct quantum algorithms. First, we experimentally execute the unitary coupled cluster method using the variational quantum eigensolver. Our efficient implementation predicts the correct dissociation energy to within chemical accuracy of the numerically exact result. Second, we experimentally demonstrate the canonical quantum algorithm for chemistry, which consists of Trotterization and quantum phase estimation. We compare the experimental performance of these approaches to show clear evidence that the variational quantum eigensolver is robust to certain errors. This error tolerance inspires hope that variational quantum simulations of classically intractable molecules may be viable in the near future.
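
    The variational quantum eigensolver referenced above minimizes the expectation value <psi(theta)|H|psi(theta)> over a parametrized state and feeds the measured energy to a classical optimizer. The sketch below reproduces that optimization loop classically for a single-qubit Hamiltonian with illustrative, made-up coefficients; it is not the paper's molecular-hydrogen Hamiltonian, ansatz, or hardware implementation.

        # Classical toy sketch of the VQE loop (illustrative single-qubit Hamiltonian with
        # made-up coefficients; not the paper's H2 Hamiltonian or hardware implementation).
        import numpy as np
        from scipy.optimize import minimize_scalar

        I = np.eye(2)
        X = np.array([[0.0, 1.0], [1.0, 0.0]])
        Z = np.array([[1.0, 0.0], [0.0, -1.0]])
        H = -0.5 * I + 0.8 * Z + 0.3 * X                 # illustrative coefficients only

        def energy(theta):
            # simple ansatz: |psi(theta)> = Ry(theta)|0>
            psi = np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])
            return float(psi @ H @ psi)

        result = minimize_scalar(energy, bounds=(0.0, 2.0 * np.pi), method="bounded")
        print(result.fun, np.linalg.eigvalsh(H)[0])      # variational minimum vs exact ground state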

  9. Testing simulation and structural models with applications to energy demand

    NASA Astrophysics Data System (ADS)

    Wolff, Hendrik

    2007-12-01

    This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand and examples illustrate the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis theory restricts the shape as well as other characteristics of functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity preserving point estimates, (c) to avoid biases existent in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory to a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, this is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters used in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations. Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality

  10. The Energy-Environment Simulator as a Classroom Aid.

    ERIC Educational Resources Information Center

    Sell, Nancy J.; Van Koevering, Thomas E.

    1981-01-01

    Describes the use, availability, and flexibility of the Energy-Environment Simulator, a specially designed analog computer which simulates the real-world energy situation and which is programmed with estimated United States and world supplies of energy sources and estimated United States energy demands. (MP)

  11. Automated Comparison of Building Energy Simulation Engines (Presentation)

    SciTech Connect

    Polly, B.; Horowitz, S.; Booten, B.; Kruis, N.; Christensen, C.

    2012-08-01

    This presentation describes the BEopt comparative test suite, which is a tool that facilitates the automated comparison of building energy simulation engines. It also demonstrates how the test suite is improving the accuracy of building energy simulation programs. Building energy simulation programs inform energy efficient design for new homes and energy efficient upgrades for existing homes. Stakeholders rely on accurate predictions from simulation programs. Previous research indicates that software tends to over-predict energy usage for poorly-insulated leaky homes. NREL is identifying, investigating, and resolving software inaccuracy issues. Comparative software testing is one method of many that NREL uses to identify potential software issues.

  12. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.

  13. Multiresolution simulated annealing for brain image analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Majcenic, Zoran

    1999-05-01

    Analysis of biomedical images is an important step in the quantification of various diseases such as human spontaneous intracerebral hemorrhage (ICH). In particular, the study of outcome in patients having ICH requires measurements of various ICH parameters, such as hemorrhage volume, and their change over time. A multiresolution probabilistic approach for segmentation of CT head images is presented in this work. This method views the segmentation problem as a pixel labeling problem. In this application the labels are: background, skull, brain tissue, and ICH. The proposed method is based on the Maximum A Posteriori (MAP) estimation of the unknown pixel labels. The MAP method maximizes the a posteriori probability of the segmented image given the observed (input) image. A Markov random field (MRF) model has been used for the posterior distribution. The MAP estimate of the segmented image has been determined using the simulated annealing (SA) algorithm, which minimizes the energy function associated with the MRF posterior distribution. A multiresolution SA (MSA) has been developed to speed up the annealing process and is presented in detail in this work. A knowledge-based classification based on brightness, size, shape and relative position with respect to other regions is performed at the end of the procedure. The regions are identified as background, skull, brain, ICH and calcifications.
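
    A compact way to see how simulated annealing drives the MAP labeling is a Metropolis-style sweep over pixels with an energy made of a data (likelihood) term plus a Potts-style smoothness prior. The sketch below is a simplified, single-resolution illustration with hypothetical parameters; it is not the authors' multiresolution implementation.

        # Simplified sketch of simulated annealing for MAP pixel labeling with a
        # Potts-style MRF prior (hypothetical parameters, single resolution;
        # not the authors' multiresolution implementation).
        import numpy as np

        def anneal(image, class_means, beta=1.0, t0=4.0, cooling=0.97, sweeps=100, seed=0):
            rng = np.random.default_rng(seed)
            labels = rng.integers(len(class_means), size=image.shape)
            T = t0
            for _ in range(sweeps):
                for _ in range(image.size):                        # random-site visitation
                    i = int(rng.integers(image.shape[0]))
                    j = int(rng.integers(image.shape[1]))
                    def local_energy(lab):
                        data = (image[i, j] - class_means[lab]) ** 2          # likelihood term
                        prior = sum(lab != labels[x, y]                       # Potts smoothness term
                                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                                    if 0 <= x < image.shape[0] and 0 <= y < image.shape[1])
                        return data + beta * prior
                    new = int(rng.integers(len(class_means)))
                    dE = local_energy(new) - local_energy(labels[i, j])
                    if dE <= 0 or rng.random() < np.exp(-dE / T):  # Metropolis acceptance rule
                        labels[i, j] = new
                T *= cooling                                       # geometric cooling schedule
            return labels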

  14. Energy analysis program. 1994 annual report

    SciTech Connect

    Levine, M.D.

    1995-04-01

    This report provides an energy analysis overview. The following topics are described: building energy analysis; urban and energy environmental issues; appliance energy efficiency standards; utility planning and policy; energy efficiency, economics, and policy issues; and international energy and environmental issues.

  15. Data Intensive Analysis of Biomolecular Simulations

    SciTech Connect

    Straatsma, TP

    2008-03-01

    The advances in biomolecular modeling and simulation made possible by the availability of increasingly powerful high performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much-needed, rigorous, computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an

  16. Data Intensive Analysis of Biomolecular Simulations

    SciTech Connect

    Straatsma, TP; Soares, Thereza A.

    2007-12-01

    The advances in biomolecular modeling and simulation made possible by the availability of increasingly powerful high performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much-needed, rigorous, computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an

  17. Energy-Systems Economic Analysis

    NASA Technical Reports Server (NTRS)

    Doane, J.; Slonski, M. L.; Borden, C. S.

    1982-01-01

    The Energy Systems Economic Analysis (ESEA) program is a flexible analytical tool for the rank ordering of alternative energy systems. The basic ESEA approach derives an estimate of the costs incurred as a result of purchasing, installing and operating an energy system. These costs, suitably aggregated into yearly costs over the lifetime of the system, are divided by the expected yearly energy output to determine busbar energy costs. ESEA, developed in 1979, is written in FORTRAN IV for batch execution.
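
    The busbar cost described above is a levelized figure: annualized life-cycle costs divided by annual energy output. The sketch below shows that arithmetic with hypothetical inputs and a simple capital-recovery annualization; it is an illustration of the concept, not the ESEA FORTRAN implementation.

        # Minimal sketch of a busbar (levelized) energy cost calculation (hypothetical
        # inputs, simple capital-recovery annualization; not the ESEA FORTRAN code).
        def busbar_cost(capital_cost, annual_om_cost, annual_energy_kwh,
                        discount_rate=0.08, lifetime_years=20):
            # capital recovery factor spreads the up-front cost over the system lifetime
            crf = (discount_rate * (1 + discount_rate) ** lifetime_years /
                   ((1 + discount_rate) ** lifetime_years - 1))
            annualized_cost = capital_cost * crf + annual_om_cost
            return annualized_cost / annual_energy_kwh          # cost per kWh

        print(busbar_cost(capital_cost=2.0e6, annual_om_cost=5.0e4,
                          annual_energy_kwh=1.5e6))             # roughly 0.17 per kWh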

  18. Efficient evaluation of collisional energy transfer terms for plasma particle simulations

    NASA Astrophysics Data System (ADS)

    Turrell, A. E.; Sherlock, M.; Rose, S. J.

    2016-01-01

    Particle-based simulations, such as in particle-in-cell (PIC) codes, are widely used in plasma physics research. The analysis of particle energy transfers, as described by the second moment of the Boltzmann equation, is often necessary within these simulations. We present computationally efficient, analytically derived equations for evaluating collisional energy transfer terms from simulations using discrete particles. The equations are expressed as a sum over the properties of the discrete particles.

  19. Astronomy and Astrophysics Simulation and Analysis at NERSC

    NASA Astrophysics Data System (ADS)

    Gerber, Richard

    2014-04-01

    The National Energy Research Scientific Computing Center (NERSC) is the production High Performance Computing and Storage center for the DOE Office of Science. NERSC supports 4,500 users and 700 projects in the physical and biological sciences, including a number of highly successful, high-impact physics and astronomy projects in both simulation and data analysis. Recent successes at NERSC include Kepler spacecraft data analysis, creation of mock catalogs for BOSS, IceCube's discovery of high-energy astrophysical neutrinos, data analysis and simulation in support of the Planck satellite, and supernova discoveries from the PTF pipeline. In this presentation I will share some of these success stories and discuss what resources NERSC has available to support simulation, collaboration, data analysis, automated data transfers, data archiving, complex workflows, and data sharing/analysis through web portals. I will also summarize the findings from recent NERSC requirements planning reviews with the DOE Office of High Energy Physics and the national computing infrastructure section of the recent Snowmass report. Finally, I will describe how resources are allocated at NERSC and opportunities for collaborations with the center.

  20. Virtual Simulation of Vision 21 Energy Plants

    SciTech Connect

    Syamlal, Madhava; Felix, Paul E.; Osawe, Maxwell O.; Fiveland, Woodrow A.; Sloan, David G.; Zitney, Stephen E.; Joop, Frank; Cleetus, Joseph; Lapshin, Igor B.

    2001-11-06

    The Vision 21 Energy plants will be designed by combining several individual power, chemical, and fuel-conversion technologies. These independently developed technologies or technology modules can be interchanged and combined to form the complete Vision 21 plant that achieves the needed level of efficiency and environmental performance at affordable costs. The knowledge about each technology module must be captured in computer models so that the models can be linked together to simulate the entire Vision 21 power plant in a Virtual Simulation environment. Eventually the Virtual Simulation will find application in conceptual design, final design, plant operation and control, and operator training. In this project we take the first step towards developing such a Vision 21 Simulator. There are two main knowledge domains of a plant--the process domain (what is in the pipes), and the physical domain (the pipes and equipment that make up the plant). Over the past few decades, commercial software tools have been developed for each of these functions. However, there are three main problems that inhibit the design and operation of power plants: (1) Many of these tools, largely developed for chemicals and refining, have not been widely adopted in the power industry. (2) Tools are not integrated across functions. For example, the knowledge represented by computational fluid dynamics (CFD) models of equipment is not used in process-level simulations. (3) No tool exists for readily integrating the design and behavioral knowledge about components. These problems must be overcome to develop the Vision 21 Simulator. In this project our major objective is to achieve a seamless integration of equipment-level and process-level models and apply the integrated software to power plant simulations. Specifically we are developing user-friendly tools for linking process models (Aspen Plus) with detailed equipment models (FLUENT CFD and other proprietary models). Such integration will

  1. Miscible Applied Simulation Techniques for Energy Recovery

    Energy Science and Technology Software Center (ESTSC)

    2005-07-01

    During the use of MASTER at the New Mexico Petroleum Recovery Research Center (PRRC), a research division of New Mexico Institute of Mining and Technology, a number of modifications have been made to the original MASTER. We have worked at minimizing programming errors and incorporating a foaming option for surfactant solution (aqueous phase) injection alternating with gas (SAG). The original program checks and modifications performed at PRRC were under the direction of Dr. Shih-Hsien Chang under previous DOE contracts. The final modifications and completion of the documentation were performed by Dr. Zhengwen Zeng under DOE Contract Number DE-FG26-01BC15364. Drs. Chang and Zeng worked under Dr. Reid B. Grigg in the Gas Flooding Processes and Flow Heterogeneities Section of PRRC. This work is not intended to have any long-term support from the PRRC, but any errors should be reported to the Department of Energy for inclusion in future releases of MASTER. MASTER is an effective reservoir simulator for modeling a number of fluid flow problems and is a straightforward and economical program. We thank the Department of Energy for the original development of this program and its availability for our use.

  2. Predesign energy analysis

    SciTech Connect

    1980-09-01

    A new graphic technique developed to help architects and engineers design more energy-efficient buildings is presented. An energy-efficient design includes two interrelated elements: physical design characteristics which minimize heating, cooling, and lighting loads; and mechanical and electrical subsystems which meet energy loads efficiently. The technique focuses on the manipulation of design variables to effectively reduce excessive heat gains and losses. The technique, a visual one, is designed to show how a building uses energy. The technique described can also be done manually.

  3. Calibrated Ultra Fast Image Simulations for the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Bruderer, Claudio; Chang, Chihway; Refregier, Alexandre; Amara, Adam; Bergé, Joel; Gamper, Lukas

    2016-01-01

    Image simulations are becoming increasingly important in understanding the measurement process of the shapes of galaxies for weak lensing and the associated systematic effects. For this purpose we present the first implementation of the Monte Carlo Control Loops (MCCL), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig) and the image analysis software SExtractor. We apply this framework to a subset of the data taken during the Science Verification period (SV) of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with one of the SV images, which covers ∼0.5 square degrees. We then perform tolerance analyses by perturbing six simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative importance of different parameters. For spatially constant systematic errors and point-spread function, the calibration of the simulation reaches the weak lensing precision needed for the DES SV survey area. Furthermore, we find a sensitivity of the shear measurement to the intrinsic ellipticity distribution, and an interplay between the magnitude-size and the pixel value diagnostics in constraining the noise model. This work is the first application of the MCCL framework to data and shows how it can be used to methodically study the impact of systematics on the cosmic shear measurement.

  4. TRANSIMS: Transportation analysis and simulation system

    SciTech Connect

    Smith, L.; Beckman, R.; Baggerly, K.

    1995-07-01

    This document summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  5. TRANSIMS: TRansportation ANalysis and SIMulation System

    SciTech Connect

    Smith, L.; Beckman, R.; Anson, D.; Nagel, K.; Williams, M.

    1995-08-01

    This paper summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  6. Guidelines for Energy Simulation of Commercial Buildings: Final.

    SciTech Connect

    Kaplan, Michael; Caner, Phoebe

    1992-03-01

    This report distills the experience gained from intensive computer building simulation work for the Energy Edge project. The purpose of this report is twofold: to use that experience to guide conservation program managers in their use of modeling, and to improve the accuracy of design-phase computer models. Though the main emphasis of the report is on new commercial construction, it also addresses modeling as it pertains to retrofit construction. To achieve these purposes, this report will: (1) discuss the value of modeling for energy conservation programs; (2) discuss strengths and weaknesses of computer models; (3) provide specific guidelines for model input; (4) discuss input topics that are unusually large drivers of energy use and model inaccuracy; (5) provide guidelines for developing baseline models; (6) discuss types of energy conservation measures (ECMs) and building operation that are not suitable to modeling and present possible alternatives to modeling for analysis; and (7) provide basic requirements for model documentation. This project was initiated to determine whether commercial buildings can be designed and constructed to use at least 30% less energy than if they were designed and built to meet the current regional model energy code, the Model Conservation Standards (MCS) developed by the Pacific Northwest Electric Power and Conservation Planning Council. Secondary objectives of the project are to determine the incremental energy savings of a wide variety of ECMs and to compare the predictive accuracy of design-phase models with models that are carefully tuned to monitored building data.

  7. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool gives structural analysts a means to interrogate their structural design, based on their mathematical description of the design problem, using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
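
    The Monte Carlo approach described above amounts to sampling each design input from its tolerance (scatter) distribution, evaluating the response for each sample, and ranking inputs by their influence on the outputs. The sketch below illustrates that loop with a stand-in response function and made-up distributions; it is not MSC.Robust Design or a finite element analysis.

        # Illustrative sketch of Monte Carlo scatter propagation and influence ranking
        # (stand-in response function and made-up tolerances; not MSC.Robust Design
        # or a finite element run).
        import numpy as np

        def response(thickness, modulus, load):
            # stand-in for a structural response metric, e.g. a panel deflection
            return load / (modulus * thickness ** 3)

        rng = np.random.default_rng(1)
        n = 5000
        inputs = {
            "thickness": rng.normal(2.0, 0.05, n),       # design tolerances modeled as scatter
            "modulus":   rng.normal(70.0e3, 2.0e3, n),
            "load":      rng.normal(1.0e3, 1.0e2, n),
        }
        out = response(**inputs)
        for name, samples in inputs.items():
            corr = np.corrcoef(samples, out)[0, 1]       # simple influence measure
            print(f"{name:10s} correlation with response: {corr:+.2f}")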

  8. Energy dispersive X-ray fluorescence spectroscopy/Monte Carlo simulation approach for the non-destructive analysis of corrosion patina-bearing alloys in archaeological bronzes: The case of the bowl from the Fareleira 3 site (Vidigueira, South Portugal)

    NASA Astrophysics Data System (ADS)

    Bottaini, C.; Mirão, J.; Figuereido, M.; Candeias, A.; Brunetti, A.; Schiavon, N.

    2015-01-01

    Energy dispersive X-ray fluorescence (EDXRF) is a well-known technique for non-destructive and in situ analysis of archaeological artifacts, in terms of both the qualitative and quantitative elemental composition, because of its rapidity and non-destructiveness. In this study EDXRF and realistic Monte Carlo simulation using the X-ray Monte Carlo (XRMC) code package have been combined to characterize a Cu-based bowl from the Iron Age burial of Fareleira 3 (Southern Portugal). The artifact displays a multilayered structure made up of three distinct layers: a) the alloy substrate; b) a green oxidized corrosion patina; and c) a brownish carbonate soil-derived crust. To assess the reliability of Monte Carlo simulation in reproducing the composition of the bulk metal of the object without resorting to potentially damaging removal of the patina and crust, portable EDXRF analysis was performed on cleaned and patina/crust-coated areas of the artifact. The patina has been characterized by micro X-ray Diffractometry (μXRD) and Back-Scattered Scanning Electron Microscopy + Energy Dispersive Spectroscopy (BSEM + EDS). Results indicate that the EDXRF/Monte Carlo protocol is well suited when a two-layered model is considered, whereas in areas where the patina + crust surface coating is too thick, X-rays from the alloy substrate are not able to exit the sample.

  9. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions within a real-time model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model, such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, the authors describe the urgent needs and challenges for in-situ data analysis for ALM simulations, and lay out methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe the biogeophysical and biogeochemical processes during an ALM simulation. The key components of this framework are automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploration toolkit. This effort is developed by leveraging several active projects, including a scientific unit testing platform, a common communication interface, and an extreme-scale data exploration toolkit. The authors believe that, based on advanced computing technologies, such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only the much needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.

  10. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    SciTech Connect

    Shahidehpour, Mohammad

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called the Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called the POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) an integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) an existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  11. Efficient Analysis of Simulations of the Sun's Magnetic Field

    NASA Astrophysics Data System (ADS)

    Scarborough, C. W.; Martínez-Sykora, J.

    2014-12-01

    Dynamics in the solar atmosphere, including solar flares, coronal mass ejections, micro-flares, and different types of jets, are powered by the evolution of the sun's intense magnetic field. 3D Radiative Magnetohydrodynamics (MHD) computer simulations have furthered our understanding of the processes involved: when non-aligned magnetic field lines reconnect, the alteration of the magnetic topology causes stored magnetic energy to be converted into thermal and kinetic energy. Detailed analysis of this evolution entails tracing magnetic field lines, an operation which is not time-efficient on a single processor. By utilizing a graphics card (GPU) to trace lines in parallel, conducting such analysis is made feasible. We applied our GPU implementation to the most advanced 3D radiative-MHD simulations (Bifrost, Gudicksen et al. 2011) of the solar atmosphere in order to better understand the evolution of the modeled field lines.
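
    Field-line tracing integrates dx/ds = B(x)/|B(x)| from each seed point, and because every seed is independent the work parallelizes naturally across GPU threads. The sketch below shows the per-seed integration serially, with a hypothetical field sampler standing in for interpolation from the simulation cube; it is an illustration of the technique, not the authors' GPU code.

        # Serial illustrative sketch of field-line tracing by integrating
        # dx/ds = B(x)/|B(x)| (hypothetical field sampler; in a GPU implementation
        # each seed would be handled by an independent thread).
        import numpy as np

        def trace_field_line(seed, sample_B, ds=0.1, n_steps=1000):
            """Midpoint (RK2) integration of a single field line starting at `seed`."""
            line = [np.asarray(seed, dtype=float)]
            for _ in range(n_steps):
                x = line[-1]
                b1 = sample_B(x)
                b1 = b1 / np.linalg.norm(b1)
                b2 = sample_B(x + 0.5 * ds * b1)          # midpoint evaluation
                line.append(x + ds * b2 / np.linalg.norm(b2))
            return np.array(line)

        # Example with a uniform, inclined field standing in for the simulation data:
        print(trace_field_line([0.0, 0.0, 0.0], lambda x: np.array([1.0, 0.0, 0.5]))[-1])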

  12. The perceived value of using BIM for energy simulation

    NASA Astrophysics Data System (ADS)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. Some of the benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information about a model within the model itself and that can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be a time-consuming activity due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the amount of time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how perceptions associated with leveraging BIM for energy simulation differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using

  13. Building Energy Monitoring and Analysis

    SciTech Connect

    Hong, Tianzhen; Feng, Wei; Lu, Alison; Xia, Jianjun; Yang, Le; Shen, Qi; Im, Piljae; Bhandari, Mahabir

    2013-06-01

    This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.

  14. Exchange interaction energy in magnetic recording simulation

    SciTech Connect

    Igarashi, Masukazu; Tonooka, Shun; Katada, Hiroyuki; Maeda, Maki; Hara, Miki; Wood, Roger

    2015-05-07

    Based on a phenomenological theory, micromagnetic simulations and experiments are used to evaluate an improved function for the exchange interaction between magnetic particles in perpendicular recording media. Assuming diluted spin layers in the particle boundary and a gradual rather than abrupt rotation of magnetization between grain cores, the exchange energy is better described by an even power series in θ than by a cosine function. The conventional cosine function does not have a restoring torque near θ = π, and adjacent grains tend to align strictly antiparallel. In contrast, using a power series in θ, adjacent grains tend to align at a small angle away from θ = π. This gives rise to a small in-plane magnetization component, and therefore a distinct peak in the in-plane susceptibility is observed around H = 0. From magnetization measurements of a real medium, a peak is observed around H = 0, which matches an assumption of 2 or 3 spin layers. In some situations, the exchange interaction between discretized cells in a numerical calculation is also better described by a power series than by a cosine function.
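
    To make the contrast between the two functional forms explicit, they can be written, together with their torques at θ = π, as follows. This is a generic restatement in LaTeX of the forms named in the abstract; the expansion coefficients a_n depend on the assumed number of diluted spin layers and are not reproduced here.

        \begin{align*}
          E_{\mathrm{ex}}^{\cos}(\theta) &= J\,(1-\cos\theta), &
          \left.\frac{dE_{\mathrm{ex}}^{\cos}}{d\theta}\right|_{\theta=\pi} &= J\sin\pi = 0
          \quad \text{(no restoring torque at antiparallel alignment)}, \\
          E_{\mathrm{ex}}^{\mathrm{series}}(\theta) &= \sum_{n\ge 1} a_n\,\theta^{2n}, &
          \left.\frac{dE_{\mathrm{ex}}^{\mathrm{series}}}{d\theta}\right|_{\theta=\pi} &=
          \sum_{n\ge 1} 2n\,a_n\,\pi^{2n-1} \neq 0
          \quad \text{(torque persists, so grains settle slightly away from } \theta=\pi\text{)}.
        \end{align*}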

  15. NANA Strategic Energy Plan & Energy Options Analysis

    SciTech Connect

    Jay Hermanson; Brian Yanity

    2008-12-31

    Biomass feasibility analysis in the upper Kobuk; • Run-of-the-river hydroelectric development for the Upper Kobuk; • Solar photovoltaic (PV) power demonstration projects for Noatak, Ambler, Selawik, Kiana, and Noorvik; • Heat recovery for several communities. In September 2008, the NRC team participated in the Alaska Rural Energy Conference in Girdwood, Alaska. In November 2008, the NRC team gave a presentation on the NANA regional energy plans at a DOE Tribal Energy Program conference in Denver, Colorado. In January 2009, the final SEP report was submitted to NRC.

  16. Fluid Flow Simulation and Energetic Analysis of Anomalocarididae Locomotion

    NASA Astrophysics Data System (ADS)

    Mikel-Stites, Maxwell; Staples, Anne

    2014-11-01

    While an abundance of animal locomotion simulations has been performed modeling the motions of living arthropods and aquatic animals, little quantitative simulation and reconstruction of gait parameters has been done to model the locomotion of extinct animals, many of which bear little physical resemblance to their modern descendants. To that end, this project seeks to analyze potential swimming patterns used by the anomalocaridid family (specifically Anomalocaris canadensis, a Cambrian-era aquatic predator) and to determine the most probable modes of movement. This will serve to either verify or cast into question the currently assumed movement patterns and properties of these animals and create a bridge between similar flexible-bodied swimmers and their robotic counterparts. This will be accomplished by particle-based fluid flow simulations of the flow around the fins of the animal, as well as an energy analysis of a variety of sample gaits. The energy analysis will then be compared to the extant information regarding speed/energy use curves in an attempt to determine which modes of swimming were most energy efficient for a given range of speeds. These results will provide a better understanding of how these long-extinct animals moved, possibly allowing an improved understanding of their behavioral patterns, and may also lead to a novel potential platform for bio-inspired underwater autonomous vehicles (UAVs).

  17. Building Energy Monitoring and Analysis

    SciTech Connect

    Hong, Tianzhen; Feng, Wei; Lu, Alison; Xia, Jianjun; Yang, Le; Shen, Qi; Im, Piljae; Bhandari, Mahabir

    2013-06-01

    The U.S. and China are the world's top two economies. Together they consumed one-third of the world's primary energy. It is an unprecedented opportunity and challenge for governments, researchers, and industries in both countries to join together to address energy issues and global climate change. Such joint collaboration has huge potential in creating new jobs in energy technologies and services. Buildings in the U.S. and China consumed about 40% and 25% of the primary energy in their respective countries in 2010. Worldwide, the building sector is the largest contributor to greenhouse gas emissions. Better understanding and improving the energy performance of buildings is a critical step towards sustainable development and mitigation of global climate change. This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.

  18. Simulating drought impacts on energy balance in an Amazonian rainforest

    NASA Astrophysics Data System (ADS)

    Imbuzeiro, H. A.; Costa, M. H.; Galbraith, D.; Christoffersen, B. O.; Powell, T.; Harper, A. B.; Levine, N. M.; Rowland, L.; Moorcroft, P. R.; Benezoli, V. H.; Meir, P.; da Costa, A. C. L.; Brando, P. M.; Malhi, Y.; Saleska, S. R.; Williams, M. D.

    2014-12-01

    Studies of the interaction between vegetation and climate change in the Amazon Basin indicate that up to half of the region's forests may be displaced by savanna vegetation by the end of the century. Additional analyses suggest that complex interactions among land use, fire frequency, and episodic drought are driving an even more rapid process of forest impoverishment and displacement, referred to here as "savannization". But it is not clear whether surface/ecosystem models are suitable for analyzing extreme events such as drought. Long-term simulations of throughfall exclusion experiments have provided unique insights into the energy dynamics of Amazonian rainforests during drought conditions. In this study, we evaluate how well six surface/ecosystem models quantify the energy dynamics from two Amazonian throughfall exclusion experiments. All models were run for the Tapajós and Caxiuanã sites with one control plot using normal precipitation (i.e., no drought imposed), and the drought manipulation was then imposed for several drought treatments (10 to 90% rainfall exclusion). The sap flow, net radiation (Rn), sensible (H), latent (LE), and ground (G) heat fluxes are used to analyze whether the models are able to capture the dynamics of water stress and what the implications for the energy dynamics are. With respect to model validation, when observed sap flow is compared with simulated transpiration, the models are more accurate for control plots than for drought treatments (50% rainfall exclusion). The results show that the models overestimate the sap flow data during drought conditions, but they were able to capture the changes in the main energy balance components for the different drought treatments. Rn and LE decreased and H increased with increasing drought intensity. The model sensitivity analysis indicates that the models are more sensitive to drought when more than 60% of rainfall is excluded and when this reduction occurs during the dry season.

  19. Computer simulations of glasses: the potential energy landscape

    NASA Astrophysics Data System (ADS)

    Raza, Zamaan; Alling, Björn; Abrikosov, Igor A.

    2015-07-01

    We review the current state of research on glasses, discussing the theoretical background and computational models employed to describe them. This article focuses on the use of the potential energy landscape (PEL) paradigm to account for the phenomenology of glassy systems, and the way in which it can be applied in simulations and the interpretation of their results. This article provides a broad overview of the rich phenomenology of glasses, followed by a summary of the theoretical frameworks developed to describe this phenomenology. We discuss the background of the PEL in detail, the onerous task of how to generate computer models of glasses, various methods of analysing numerical simulations, and the literature on the most commonly used model systems. Finally, we tackle the problem of how to distinguish a good glass former from a good crystal former from an analysis of the PEL. In summarising the state of the potential energy landscape picture, we develop the foundations for new theoretical methods that allow the ab initio prediction of the glass-forming ability of new materials by analysis of the PEL.

  20. Computer simulations of glasses: the potential energy landscape.

    PubMed

    Raza, Zamaan; Alling, Björn; Abrikosov, Igor A

    2015-07-29

    We review the current state of research on glasses, discussing the theoretical background and computational models employed to describe them. This article focuses on the use of the potential energy landscape (PEL) paradigm to account for the phenomenology of glassy systems, and the way in which it can be applied in simulations and the interpretation of their results. This article provides a broad overview of the rich phenomenology of glasses, followed by a summary of the theoretical frameworks developed to describe this phenomenology. We discuss the background of the PEL in detail, the onerous task of how to generate computer models of glasses, various methods of analysing numerical simulations, and the literature on the most commonly used model systems. Finally, we tackle the problem of how to distinguish a good glass former from a good crystal former from an analysis of the PEL. In summarising the state of the potential energy landscape picture, we develop the foundations for new theoretical methods that allow the ab initio prediction of the glass-forming ability of new materials by analysis of the PEL. PMID:26139691

  1. Discrete Kinetic Models from Funneled Energy Landscape Simulations

    PubMed Central

    Burger, Anat; Craig, Patricio O.; Komives, Elizabeth A.; Wolynes, Peter G.

    2012-01-01

    A general method for facilitating the interpretation of computer simulations of protein folding with minimally frustrated energy landscapes is detailed and applied to a designed ankyrin repeat protein (4ANK). In the method, groups of residues are assigned to foldons and these foldons are used to map the conformational space of the protein onto a set of discrete macrobasins. The free energies of the individual macrobasins are then calculated, informing practical kinetic analysis. Two simple assumptions about the universality of the rate for downhill transitions between macrobasins and the natural local connectivity between macrobasins lead to a scheme for predicting overall folding and unfolding rates, generating chevron plots under varying thermodynamic conditions, and inferring dominant kinetic folding pathways. To illustrate the approach, free energies of macrobasins were calculated from biased simulations of a non-additive structure-based model using two structurally motivated foldon definitions at the full and half ankyrin repeat resolutions. The calculated chevrons have features consistent with those measured in stopped flow chemical denaturation experiments. The dominant inferred folding pathway has an “inside-out”, nucleation-propagation like character. PMID:23251375
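
    The following sketch illustrates the general scheme described above under stated assumptions: a handful of macrobasins on a locally connected (linear) pathway, a universal rate for downhill transitions, activated rates for uphill transitions, and an observed rate taken as the slowest nonzero relaxation rate of the master equation. The free energies and denaturant m-values are invented for illustration; they are not the values computed for 4ANK in the paper.

```python
# Minimal sketch of a discrete macrobasin kinetic model of the kind described
# above: macrobasins on a linear (locally connected) pathway, a universal rate
# k0 for downhill transitions, and activated rates for uphill transitions.
import numpy as np

kT = 1.0
k0 = 1.0e6                                   # universal downhill rate (arbitrary units)
G0 = np.array([0.0, 3.0, 1.0, 4.0, -2.0])    # intrinsic macrobasin free energies (kT)
m = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # sensitivity to denaturant (invented)

def slowest_rate(denaturant):
    """Smallest nonzero relaxation rate of the master equation dP/dt = K P."""
    G = G0 + m * denaturant
    n = len(G)
    K = np.zeros((n, n))
    for i in range(n - 1):
        for a, b in ((i, i + 1), (i + 1, i)):      # local connectivity only
            dG = G[b] - G[a]
            K[b, a] = k0 if dG <= 0 else k0 * np.exp(-dG / kT)
    K -= np.diag(K.sum(axis=0))                    # columns sum to zero
    rates = np.sort(-np.linalg.eigvals(K).real)
    return rates[1]                                # skip the zero (equilibrium) mode

# The observed rate versus denaturant traces out a chevron-like curve.
for d in np.linspace(0.0, 4.0, 9):
    print("denaturant %.1f  ln(k_obs) = %.2f" % (d, np.log(slowest_rate(d))))
```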

  2. National Infrastructure Simulation and Analysis Center Overview

    SciTech Connect

    Berscheid, Alan P.

    2012-07-30

    The National Infrastructure Simulation and Analysis Center (NISAC) mission is to: (1) improve the understanding, preparation, and mitigation of the consequences of infrastructure disruption; (2) provide a common, comprehensive view of U.S. infrastructure and its response to disruptions, at a scale and resolution appropriate to the issues and all threats; and (3) build an operations-tested DHS capability to respond quickly to urgent infrastructure protection issues.

  3. Strategic Energy Analysis (Fact Sheet)

    SciTech Connect

    Not Available

    2014-02-01

    NREL complements its scientific research with high-quality, credible, technology-neutral, objective analysis that informs policy and investment decisions as renewable energy and energy efficiency technologies move from innovation through integration. This sheet highlights NREL's analytical capabilities and achievements.

  4. Numerical analysis of applied magnetic field dependence in Malmberg-Penning Trap for compact simulator of energy driver in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Sato, T.; Park, Y.; Soga, Y.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob

    2016-05-01

    To simulate the pulse compression process of space-charge-dominated beams in heavy ion fusion, we have carried out a multi-particle numerical simulation of an equivalent beam in the Malmberg-Penning trap device. The results show that both transverse and longitudinal velocities increase during the longitudinal compression, with a dependence on the external magnetic field strength. The influence of the space-charge effect, which is related to the external magnetic field, was observed as an increase of high-velocity particles at weak external magnetic field.

  5. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    among input parameters and objective functions. In addition, reduced-order emulation models resulting from this analysis can be used for optimal control of hydraulic fracturing. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  6. Observationally-Motivated Analysis of Simulated Galaxies

    NASA Astrophysics Data System (ADS)

    Miranda, M. S.; Macfarlane, B. A.; Gibson, B. K.

    The spatial and temporal relationships between stellar age, kinematics, and chemistry are a fundamental tool for uncovering the physics driving galaxy formation and evolution. Observationally, these trends are derived using carefully selected samples isolated via the application of appropriate magnitude, colour, and gravity selection functions of individual stars; conversely, the analysis of chemodynamical simulations of galaxies has traditionally been restricted to the age, metallicity, and kinematics of 'composite' stellar particles comprised of open cluster-mass simple stellar populations. As we enter the Gaia era, it is crucial that this approach changes, with simulations confronting data in a manner which better mimics the methodology employed by observers. Here, we use the SynCMD synthetic stellar populations tool to analyse the metallicity distribution function of a Milky Way-like simulated galaxy, employing an apparent magnitude plus gravity selection function similar to that employed by the RAdial Velocity Experiment (RAVE); we compare such an observationally-motivated approach with that traditionally adopted - i.e., spatial cuts alone - in order to illustrate the point that how one analyses a simulation can be, in some cases, just as important as the underlying sub-grid physics employed.

  7. Multiscale DSA simulations for efficient hotspot analysis

    NASA Astrophysics Data System (ADS)

    Hori, Yoshihiro; Yoshimoto, Kenji; Taniguchi, Takashi; Ohshima, Masahiro

    2014-03-01

    In this study, we have investigated how to link "large-scale simulations with the simplified models" to "mesoscale simulations with the detailed models." For the simplified model, we applied the so-called generalized Ohta-Kawasaki (gOK) model. Our simulation flow was implemented in two steps: 1) parallel computations of block copolymer annealing with the simplified model, and 2) detailed analysis of the defects with the SCFT. The local volumetric densities of block copolymers calculated by the simplified model were used as an input for the SCFT. The SCFT simulations were then performed under constraints in which the density field was driven to the one obtained from the simplified model. Using the resultant partition functions, we were able to obtain spatial distributions of the free chain ends and the connection points of the blocks. Note that the chain conformation of the block copolymer is an important but missing component of the simplified models; this multi-scale approach is expected to be useful for further understanding the origin and stability of DSA defects.

  8. Isentropic Analysis of a Simulated Hurricane

    NASA Technical Reports Server (NTRS)

    Mrowiec, Agnieszka A.; Pauluis, Olivier; Zhang, Fuqing

    2016-01-01

    Hurricanes, like many other atmospheric flows, are associated with turbulent motions over a wide range of scales. Here the authors adapt a new technique based on the isentropic analysis of convective motions to study the thermodynamic structure of the overturning circulation in hurricane simulations. This approach separates the vertical mass transport in terms of the equivalent potential temperature of air parcels. In doing so, one separates the rising air parcels at high entropy from the subsiding air at low entropy. This technique filters out oscillatory motions associated with gravity waves and separates convective overturning from the secondary circulation. This approach is applied here to study the flow of an idealized hurricane simulation with the Weather Research and Forecasting (WRF) Model. The isentropic circulation for a hurricane exhibits similar characteristics to that of moist convection, with a maximum mass transport near the surface associated with a shallow convection and entrainment. There are also important differences. For instance, ascent in the eyewall can be readily identified in the isentropic analysis as an upward mass flux of air with unusually high equivalent potential temperature. The isentropic circulation is further compared here to the Eulerian secondary circulation of the simulated hurricane to show that the mass transport in the isentropic circulation is much larger than the one in secondary circulation. This difference can be directly attributed to the mass transport by convection in the outer rainband and confirms that, even for a strongly organized flow like a hurricane, most of the atmospheric overturning is tied to the smaller scales.
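
    A minimal sketch of the central binning step of such an isentropic analysis is given below; synthetic fields stand in for actual WRF output, and the θe bin width is an arbitrary choice.

```python
# Sketch of the core step of an isentropic analysis of convection: at one model
# level, sort the vertical mass flux rho*w of each grid column into bins of
# equivalent potential temperature theta_e, separating high-entropy ascent from
# low-entropy descent. Synthetic fields stand in for actual WRF output.
import numpy as np

rng = np.random.default_rng(0)
ncol = 50_000                                       # grid columns at one level
theta_e = rng.normal(345.0, 8.0, ncol)              # equivalent potential temperature (K)
w = rng.normal(0.0, 0.5, ncol) + 0.004 * (theta_e - 345.0)   # vertical velocity (m/s)
rho = 1.0                                           # air density (kg/m3), constant here

bins = np.arange(320.0, 372.0, 2.0)                 # theta_e bins (K)
idx = np.digitize(theta_e, bins)
# Domain-averaged vertical mass flux contributed by each theta_e bin.
flux = np.array([(rho * w[idx == k]).sum() / ncol for k in range(1, len(bins))])

for lo, mf in zip(bins[:-1], flux):
    bar = "#" * int(40 * abs(mf) / (abs(flux).max() + 1e-12))
    print(f"{lo:5.1f}-{lo + 2:5.1f} K  {mf:+8.4f} kg m-2 s-1  {bar}")
# Ascent concentrates at high theta_e and descent at low theta_e, which is the
# separation the isentropic circulation is built on.
```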

  9. Acquisition of building geometry in the simulation of energy performance

    SciTech Connect

    Bazjanac, Vladimir

    2001-06-28

    Building geometry is essential to any simulation of building performance. This paper examines the importing of building geometry into simulation of energy performance from the users' point of view. It lists performance requirements for graphic user interfaces that input building geometry, and discusses the basic options in moving from two- to three-dimensional definition of geometry and the ways to import that geometry into energy simulation. The obvious answer lies in software interoperability. With the BLIS group of interoperable software one can interactively import building geometry from CAD into EnergyPlus and dramatically reduce the effort otherwise needed for manual input. The resulting savings may greatly increase the value obtained from simulation and the number of projects in which energy performance simulation is used, and may expedite decision making in the design process.

  10. Simulation for analysis and control of superplastic forming. Final report

    SciTech Connect

    Zacharia, T.; Aramayo, G.A.; Simunovic, S.; Ludtka, G.M.; Khaleel, M.; Johnson, K.I.; Smith, M.T.; Van Arsdale, G.L.; Lavender, C.A.

    1996-08-01

    A joint study was conducted by Oak Ridge National Laboratory (ORNL) and the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy-Lightweight Materials (DOE-LWM) Program. The purpose of the study was to assess and benchmark the current modeling capabilities with respect to accuracy of predictions and simulation time. Two simulation platforms were considered in this study: the LS-DYNA3D code installed on ORNL's high-performance computers and the finite element code MARC used at PNL. Both ORNL and PNL performed superplastic forming (SPF) analysis on a standard butter-tray geometry, defined by PNL, to better understand the capabilities of the respective models. The specific geometry was selected and formed at PNL, and the experimental results, such as forming time and thickness at specific locations, were provided for comparison with numerical predictions. Furthermore, comparisons were made between the ORNL simulation results, using elasto-plastic analysis, and PNL's results, using rigid-plastic flow analysis.

  11. Simulated Patient Studies: An Ethical Analysis

    PubMed Central

    Rhodes, Karin V; Miller, Franklin G

    2012-01-01

    Context In connection with health care reform, the U.S. Department of Health and Human Services commissioned a “mystery shopper,” or simulated patient study, to measure access to primary care. But the study was shelved because of public controversy over “government spying” on doctors. Opponents of the study also raised ethical concerns about the use of deception with human subjects without soliciting their informed consent. Methods We undertook an ethical analysis of the use of simulated patient techniques in health services research, with a particular focus on research measuring access to care. Using a case study, we explored relevant methodological considerations and ethical principles relating to deceptive research without informed consent, as well as U.S. federal regulations permitting exceptions to consent. Findings Several relevant considerations both favor and oppose soliciting consent for simulated patient studies. Making research participation conditional on informed consent protects the autonomy of research subjects and shields them from unreasonable exposure to research risks. However, scientific validity is also an important ethical principle of human subjects research, as the net risks to subjects must be justified by the value to society of the knowledge to be gained. The use of simulated patients to monitor access is a naturalistic and scientifically sound experimental design that can answer important policy-relevant questions, with minimal risks to human subjects. As interaction between researchers and subjects increases, however, so does the need for consent. Conclusions As long as adequate protections of confidentiality of research data are in place, minimally intrusive simulated patient research that gathers policy-relevant data on the health system without the consent of individuals working in that system can be ethically justified when the risks and burdens to research subjects are minimal and the research has the potential to generate

  12. Sample Analysis at Mars Instrument Simulator

    NASA Technical Reports Server (NTRS)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to plan and validate operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of a multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even eliminating entirely the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses. This module executes

  13. Institutional analysis for energy policy

    SciTech Connect

    Morris, F.A.; Cole, R.J.

    1980-07-01

    This report summarizes principles, techniques, and other information for doing institutional analyses in the area of energy policy. The report was prepared to support DOE's Regional Issues Identification and Assessment (RIIA) program. RIIA identifies environmental, health, safety, socioeconomic, and institutional issues that could accompany hypothetical future scenarios for energy consumption and production on a regional basis. Chapter 1 provides some theoretical grounding in institutional analysis. Chapter 2 provides information on constructing institutional maps of the processes for bringing on line energy technologies and facilities contemplated in RIIA scenarios. Chapter 3 assesses the institutional constraints, opportunities, and impacts that affect whether these technologies and facilities would in fact be developed. Chapters 4 and 5 show how institutional analysis can support use of exercises such as RIIA in planning institutional change and making energy policy choices.

  14. Energy simulation and optimization for a small commercial building through Modelica

    NASA Astrophysics Data System (ADS)

    Rivas, Bryan

    Small commercial buildings make up the majority of buildings in the United States. Energy consumed by these buildings is expected to drastically increase in the next few decades, with a large percentage of the energy consumed attributed to cooling systems. This work presents the simulation and optimization of a thermostat schedule to minimize energy consumption in a small commercial building test bed during the cooling season. The simulation occurs through the use of the multi-engineering domain Dymola environment based on the Modelica open source programming language and is optimized with the Java based optimization program GenOpt. The simulation uses both physically based modeling utilizing heat transfer principles for the building and regression analysis for energy consumption. GenOpt is dynamically coupled to Dymola through various interface files. There are very few studies that have coupled GenOpt to a building simulation program and even fewer studies have used Dymola for building simulation as extensively as the work presented here. The work presented proves Dymola as a viable alternative to other building simulation programs such as EnergyPlus and MatLab. The model developed is used to simulate the energy consumption of a test bed, a commissioned real world small commercial building, while maintaining indoor thermal comfort. Potential applications include smart or intelligent building systems, predictive simulation of small commercial buildings, and building diagnostics.

  15. Database for Simulation of Electron Spectra for Surface Analysis (SESSA)

    National Institute of Standards and Technology Data Gateway

    SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase)   This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.

  16. Design, modeling, simulation and evaluation of a distributed energy system

    NASA Astrophysics Data System (ADS)

    Cultura, Ambrosio B., II

    This dissertation presents the design, modeling, simulation and evaluation of distributed energy resources (DER) consisting of photovoltaics (PV), wind turbines, batteries, a PEM fuel cell and supercapacitors. The distributed energy resources installed at UMass Lowell consist of the following: 2.5kW PV, 44kWhr lead acid batteries and 1500W, 500W & 300W wind turbines, which were installed before year 2000. Recently added to that are the following: 10.56 kW PV array, 2.4 kW wind turbine, 29 kWhr Lead acid batteries, a 1.2 kW PEM fuel cell and 4-140F supercapacitors. Each newly added energy resource has been designed, modeled, simulated and evaluated before its integration into the existing PV/Wind grid-connected system. The Mathematical and Simulink model of each system was derived and validated by comparing the simulated and experimental results. The Simulated results of energy generated from a 10.56kW PV system are in good agreement with the experimental results. A detailed electrical model of a 2.4kW wind turbine system equipped with a permanent magnet generator, diode rectifier, boost converter and inverter is presented. The analysis of the results demonstrates the effectiveness of the constructed simulink model, and can be used to predict the performance of the wind turbine. It was observed that a PEM fuel cell has a very fast response to load changes. Moreover, the model has validated the actual operation of the PEM fuel cell, showing that the simulated results in Matlab Simulink are consistent with the experimental results. The equivalent mathematical equation, derived from an electrical model of the supercapacitor, is used to simulate its voltage response. The model is completely capable of simulating its voltage behavior, and can predict the charge time and discharge time of voltages on the supercapacitor. The bi-directional dc-dc converter was designed in order to connect the 48V battery bank storage to the 24V battery bank storage. This connection was

  17. Guidelines for the analysis of free energy calculations.

    PubMed

    Klimovich, Pavel V; Shirts, Michael R; Mobley, David L

    2015-05-01

    Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
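
    As a generic illustration of the two estimator families mentioned above (and not of the alchemical-analysis.py interface itself), the sketch below computes a free energy difference from synthetic data by thermodynamic integration and by the exponential (Zwanzig) free energy perturbation formula.

```python
# Generic illustration of TI and FEP estimators on synthetic data with a known
# answer; this is not the alchemical-analysis.py interface.
import numpy as np

rng = np.random.default_rng(1)
beta = 1.0 / 0.596                       # 1/kT in (kcal/mol)^-1 near 300 K
lambdas = np.linspace(0.0, 1.0, 11)
sigma = 0.5                              # spread of the synthetic Delta U samples

# Synthetic data: <dU/dlambda> = 5*lambda, so the exact answer is
# Delta G = integral of 5*lambda d(lambda) = 2.5 kcal/mol.
dudl_samples = [rng.normal(5.0 * l, 1.0, 5000) for l in lambdas]

# Forward energy differences between neighbouring windows. For Gaussian Delta U
# the Zwanzig estimator converges to mu - beta*sigma^2/2, so the mean is shifted
# to make each window's free energy difference exactly 2.5*(l1^2 - l0^2).
dU_fwd = [rng.normal(2.5 * (l1**2 - l0**2) + beta * sigma**2 / 2.0, sigma, 5000)
          for l0, l1 in zip(lambdas[:-1], lambdas[1:])]

# Thermodynamic integration: trapezoid rule over the mean dU/dlambda curve.
dG_TI = np.trapz([s.mean() for s in dudl_samples], lambdas)

# Free energy perturbation (Zwanzig): -kT ln < exp(-beta*dU) >, window by window.
dG_FEP = sum(-np.log(np.mean(np.exp(-beta * du))) / beta for du in dU_fwd)

print(f"Delta G (TI)  = {dG_TI:.2f} kcal/mol")    # ~2.5
print(f"Delta G (FEP) = {dG_FEP:.2f} kcal/mol")   # ~2.5
```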

  18. Guidelines for the analysis of free energy calculations

    PubMed Central

    Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.

    2015-01-01

    Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134

  19. Simulation model for Vuilleumier cycle machines and analysis of characteristics

    NASA Astrophysics Data System (ADS)

    Sekiya, Hiroshi; Terada, Fusao

    1992-11-01

    Numerical analysis using the computer is useful in predicting and evaluating the performance of the Vuilleumier (VM) cycle machine in research and development. The 3rd-order method must be employed particularly in the case of detailed analysis of performance and design optimization. This paper describes our simulation model for the VM machine, which is based on that method. The working space is divided into thirty-eight control volumes for the VM heat pump test machine, and the fundamental equations are derived rigorously by applying the conservation equations of mass, momentum, and energy to each control volume, using a staggered mesh. These equations are solved simultaneously by the Adams-Moulton method. Then, the test machine is investigated in terms of the pressure and temperature fluctuations of the working gas, the energy flow, and the performance at each speed of revolution. The calculated results are examined in comparison with the experimental ones.
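
    A minimal sketch of the integration scheme named above is given below: a two-step Adams-Bashforth predictor with an Adams-Moulton (trapezoidal) corrector, applied to a single lumped control-volume energy balance. The thirty-eight-volume VM model couples many such balances; the parameter values here are illustrative only.

```python
# Minimal predictor-corrector sketch: Adams-Bashforth 2 predictor followed by an
# Adams-Moulton 2 (trapezoidal) corrector, applied to one lumped control-volume
# energy balance dT/dt = (Q_in - UA*(T - T_amb)) / (m*c). Values are illustrative.
Q_in, UA, m_c, T_amb = 150.0, 4.0, 500.0, 293.0   # W, W/K, J/K, K

def f(T):
    return (Q_in - UA * (T - T_amb)) / m_c

h, T = 0.5, [293.0]                     # time step (s) and temperature history
T.append(T[-1] + h * f(T[-1]))          # start-up step: forward Euler
for n in range(1, 2000):
    fn, fnm1 = f(T[n]), f(T[n - 1])
    T_pred = T[n] + h * (1.5 * fn - 0.5 * fnm1)   # Adams-Bashforth 2 predictor
    T_new = T[n] + 0.5 * h * (fn + f(T_pred))     # Adams-Moulton 2 corrector
    T.append(T_new)

print(f"temperature after {len(T) * h:.0f} s: {T[-1]:.2f} K "
      f"(analytic steady state = {T_amb + Q_in / UA:.2f} K)")
```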

  20. Kinetic energy of rainfall simulation nozzles

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Different spray nozzles are used frequently to simulate natural rain for soil erosion and chemical transport, particularly phosphorous (P), studies. Oscillating VeeJet nozzles are used mostly in soil erosion research while constant spray FullJet nozzles are commonly used for P transport. Several ch...

  1. DOE-2 Building Energy Analysis Program

    SciTech Connect

    Curtis, R.B.; Birdsall, B.; Buhl, W.F.; Erdem, E.; Eto, J.; Hirsch, J.J.; Olson, K.H.; Winkelmann, F.C.

    1984-04-01

    The DOE-2 Building Energy Analysis Program was designed to allow engineers and architects to perform design studies of whole-building energy use under actual weather conditions. Its development was guided by several objectives: (1) that the description of the building entered by the user be readily understood by non-computer scientists, (2) that, when available, the calculations be based upon well established algorithms, (3) that it permit the simulation of commonly available heating, ventilating, and air-conditioning (HVAC) equipment, (4) that the computer costs of the program be minimal, and (5) that the predicted energy use of a building be acceptably close to measured values. These objectives have been met. An overview of the program upon completion of the DOE-2.1C edition is given.

  2. Teaching a Model-based Climatology Using Energy Balance Simulation.

    ERIC Educational Resources Information Center

    Unwin, David

    1981-01-01

    After outlining the difficulties of teaching climatology within an undergraduate geography curriculum, the author describes and evaluates the use of a computer assisted simulation to model surface energy balance and the effects of land use changes on local climate. (AM)

  3. An energy balance simulation tool for TOMS-EP

    SciTech Connect

    Mackowski, M.J.; Martin, D.K.

    1996-12-31

    A computer analysis tool has been developed to perform energy balance simulations of a spacecraft power subsystem. The purpose of the tool is to predict the battery state-of-charge as a function of time for different mission scenarios, particularly during the first few orbits. The load profile (power use versus time) and the solar array power available for charging the battery were both time-varying functions that were different for each scenario. Therefore an analysis tool was needed that could easily make changes to the load profile and select different levels of solar array power. This was accomplished by developing a simple spreadsheet that defined the load profiles, which would then be imported into another spreadsheet that performed the energy balance calculations, including the adjustments to the solar array output. The development of these relatively simple spreadsheets replaced a laborious manual process of defining the load profiles, which were then used in a less sophisticated spreadsheet. The improved version also added a capability to include loads prior to satellite separation from the launch vehicle. A more elaborate simulation program had also been used in the past, but it was inconvenient to use and was not as precise as the new spreadsheet. In summary, the new tool made it easy to quickly develop and evaluate many different operational scenarios. This process has been used to evaluate responses to various failure modes and to develop contingency plans for the first few orbits of the Total Ozone Mapping Spectrometer--Earth Probe (TOMS-EP) mission.
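
    A sketch of the kind of energy-balance bookkeeping such a spreadsheet performs is shown below; all numbers (battery capacity, efficiency, load and array profiles) are invented for illustration and are not TOMS-EP values.

```python
# Sketch of an orbit-by-orbit energy balance: charge the battery with whatever
# solar-array power is left after the loads, discharge it in eclipse, and track
# state of charge. All numbers are illustrative, not TOMS-EP data.
def simulate_soc(load_w, array_w, dt_h=1.0 / 60.0,
                 capacity_wh=300.0, soc0_wh=250.0, charge_eff=0.9):
    """Return the battery state of charge (Wh) after each time step."""
    soc, history = soc0_wh, []
    for p_load, p_array in zip(load_w, array_w):
        net = p_array - p_load                    # + charging, - discharging
        soc += charge_eff * net * dt_h if net >= 0.0 else net * dt_h
        soc = min(max(soc, 0.0), capacity_wh)     # clamp to physical limits
        history.append(soc)
    return history

# One 90-minute orbit: 60 min in sunlight (array on), 30 min in eclipse.
load = [180.0] * 90                               # constant 180 W load
array = [350.0] * 60 + [0.0] * 30                 # W available from the array
soc = simulate_soc(load, array)
print(f"minimum SOC over the orbit: {min(soc):.0f} Wh (capacity 300 Wh)")
```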

  4. An Energy Crisis Management Simulation for the State of California

    NASA Astrophysics Data System (ADS)

    Thomas, M. A.

    1982-08-01

    The Rand Corporation hosted an energy crisis management workshop on January 27-29, 1982. The workshop was sponsored by the California Energy Commission and was designed to give participants experience in managing an energy emergency. The vehicle for providing this experience was a gaming exercise that simulated an international energy emergency. The structure of the game used in the workshop and the game results are documented.

  5. Molecular dynamics simulation of threshold displacement energies in zircon

    SciTech Connect

    Moreira, Pedro A.; Devanathan, Ramaswami; Yu, Jianguo; Weber, William J.

    2009-10-15

    Molecular-dynamics simulations were used to examine the displacement threshold energy (Ed) surface for Zr, Si and O in zircon using two different interatomic potentials. For each sublattice, the simulation was repeated from different initial conditions to estimate the uncertainty in the calculated value of Ed. The displacement threshold energies vary considerably with crystallographic direction and sublattice. The average displacement energy calculated with a recently developed transferable potential is about 120 and 60 eV for cations and anions, respectively. The oxygen displacement energy shows good agreement with experimental estimates in ceramics.

  6. Simulation Toolkit for Renewable Energy Advanced Materials Modeling

    Energy Science and Technology Software Center (ESTSC)

    2013-11-13

    STREAMM is a collection of python classes and scripts that enables and eases the setup of input files and configuration files for simulations of advanced energy materials. The core STREAMM python classes provide a general framework for storing, manipulating and analyzing atomic/molecular coordinates to be used in quantum chemistry and classical molecular dynamics simulations of soft materials systems. The design focuses on enabling the interoperability of materials simulation codes such as GROMACS, LAMMPS and Gaussian.
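
    The sketch below is purely hypothetical: the class and method names are not the actual STREAMM API, and only illustrate the kind of coordinate container such a toolkit builds around, holding atomic data once and writing it out in a format that different simulation codes can read.

```python
# Hypothetical sketch only: these class and method names are NOT the STREAMM
# API; they merely illustrate a minimal container for atomic coordinates that
# can be exported in a common interchange format (.xyz).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Atom:
    symbol: str
    position: Tuple[float, float, float]   # Angstrom
    charge: float = 0.0

@dataclass
class Structure:
    atoms: List[Atom] = field(default_factory=list)

    def add_atom(self, symbol, position, charge=0.0):
        self.atoms.append(Atom(symbol, tuple(position), charge))

    def to_xyz(self) -> str:
        """Plain .xyz text, readable by most MD and quantum-chemistry tools."""
        lines = [str(len(self.atoms)), "generated by illustrative container"]
        lines += [f"{a.symbol} {a.position[0]:.4f} {a.position[1]:.4f} {a.position[2]:.4f}"
                  for a in self.atoms]
        return "\n".join(lines)

mol = Structure()
mol.add_atom("C", (0.0, 0.0, 0.0))
mol.add_atom("O", (1.16, 0.0, 0.0))
print(mol.to_xyz())
```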

  7. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis for a decentralized energy system of the North was carried out. Based on an analysis of the damage caused by accidents in energy systems, a structure of risk indicators was selected, and a method for determining North energy system risk was proposed.

  8. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
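
    The toy calibration below is offered in the spirit of the utility bill test cases described above, not as one of the BESTEST-EX reference programs: a two-parameter heating degree-day model is fitted to synthetic monthly bills and then used to predict retrofit savings.

```python
# Toy utility-bill calibration: fit the envelope UA and a constant baseload of a
# heating degree-day model to synthetic monthly bills, then predict savings for
# a retrofit that reduces the calibrated UA. Not a BESTEST-EX reference program.
import numpy as np
from scipy.optimize import least_squares

hdd = np.array([800, 650, 500, 280, 120, 30, 10, 20, 90, 300, 550, 750])  # degC-day/month
true_ua, true_base = 0.9, 450.0                   # kWh/(degC-day), kWh/month (synthetic truth)
bills = true_ua * hdd + true_base + np.random.default_rng(2).normal(0, 40, 12)

def residuals(p):
    ua, base = p
    return ua * hdd + base - bills

fit = least_squares(residuals, x0=[0.5, 200.0])
ua_fit, base_fit = fit.x
print(f"calibrated UA = {ua_fit:.2f} kWh/(degC-day), baseload = {base_fit:.0f} kWh/month")

# Predicted annual savings for a retrofit that cuts the calibrated UA by 30%.
savings = 0.30 * ua_fit * hdd.sum()
print(f"predicted retrofit savings ~ {savings:.0f} kWh/yr")
```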

  9. Evaluation on Influence of Unstable Primary-Energy Price in a Deregulated Electric Power Market—Analysis based on a simulation model approach—

    NASA Astrophysics Data System (ADS)

    Maitani, Tatsuyuki; Tezuka, Tetsuo

    The electric power market of Japan has been locally monopolized for a long time. But, like many countries, Japan is moving forward with the deregulation of its electric power industry so that any power generation company can sell electric power in the market. The power price, however, will inevitably fluctuate to balance power supply and demand. An appropriate market design is indispensable when introducing new market mechanisms into the electric power market in order to avoid undesirable outcomes. The first stage of deregulation will be competition between an existing large-scale power utility and a new power generation company. In this paper we have investigated a wholesale market in which these two power companies compete, based on a simulation model approach. Under this competitive situation the effects of exogenous disturbances may be serious, and we estimated the influence on the market when the price of fossil fuel rises. The conclusion of this study is that several types of Nash equilibria were found in the market: the larger the new power generation company becomes, the higher the electricity price at the Nash equilibria rises. Because of the difference in the structure of their generation capacity, the existing large-scale power utility gains more profit while the new power generation company loses profit when the price of fossil fuel rises.
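
    A stylized two-firm Cournot game (not the paper's simulation model) can illustrate how a fuel price rise that raises the entrant's marginal cost more than the incumbent's shifts the equilibrium price and profits; the demand and cost numbers below are arbitrary.

```python
# Stylized illustration only: a two-firm Cournot game with linear demand
# p = a - b*(q1 + q2) and constant marginal costs. Firm 1 stands in for the
# incumbent with a mixed generation portfolio, firm 2 for the fossil-heavy
# entrant, so a fuel price rise raises c2 more than c1.
def cournot(a, b, c1, c2):
    """Closed-form Cournot-Nash quantities and price for linear demand."""
    q1 = (a - 2.0 * c1 + c2) / (3.0 * b)
    q2 = (a - 2.0 * c2 + c1) / (3.0 * b)
    p = a - b * (q1 + q2)
    return q1, q2, p

a, b = 100.0, 1.0
for label, (c1, c2) in {"base fuel price": (20.0, 25.0),
                        "high fuel price": (22.0, 40.0)}.items():
    q1, q2, p = cournot(a, b, c1, c2)
    print(f"{label}: price={p:.1f}  incumbent profit={(p - c1) * q1:.0f}  "
          f"entrant profit={(p - c2) * q2:.0f}")
# The incumbent's profit rises while the entrant's falls, echoing the asymmetry
# described in the abstract (with invented numbers).
```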

  10. Simulating the Value of Concentrating Solar Power with Thermal Energy Storage in a Production Cost Model

    SciTech Connect

    Denholm, P.; Hummon, M.

    2012-11-01

    Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically had limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.
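
    The greedy heuristic below only illustrates why dispatchability via TES adds value against an hourly price signal; it is not the commercial production cost model used in the report, and the price and plant parameters are invented.

```python
# Schematic dispatch of CSP with thermal energy storage: thermal energy
# collected during the day is held and the power block is run in the
# highest-value hours. Storage losses and tank capacity limits are ignored.
import numpy as np

hours = np.arange(24)
price = 30.0 + 25.0 * np.exp(-((hours - 18) ** 2) / 8.0)            # $/MWh, evening peak
solar_thermal = np.where((hours >= 8) & (hours <= 16), 200.0, 0.0)  # MWt collected

eta, p_max = 0.40, 100.0                 # power-block efficiency and rating (MWe)

available = solar_thermal.sum()          # MWht routed through storage
dispatch = np.zeros(24)
for h in np.argsort(price)[::-1]:        # fill the most valuable hours first
    e_th = min(p_max / eta, available)   # thermal energy one hour of full output needs
    dispatch[h] = eta * e_th
    available -= e_th
    if available <= 0.0:
        break

print(f"revenue, dispatched via TES : ${np.sum(dispatch * price):,.0f}")
print(f"revenue, solar-following    : ${np.sum(eta * solar_thermal * price):,.0f}")
```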

  11. The Lorenz energy cycle in simulated rotating annulus flows

    NASA Astrophysics Data System (ADS)

    Young, R. M. B.

    2014-05-01

    Lorenz energy cycles are presented for a series of simulated differentially heated rotating annulus flows, in the axisymmetric, steady, amplitude vacillating, and structurally vacillating flow regimes. The simulation allows contributions to the energy diagnostics to be identified in parts of the fluid that cannot be measured in experiments. These energy diagnostics are compared with laboratory experiments studying amplitude vacillation, and agree well with experimental time series of kinetic and potential energy, as well as conversions between them. Two of the three major energy transfer paradigms of the Lorenz energy cycle are identified—a Hadley-cell overturning circulation, and baroclinic instability. The third, barotropic instability, was never dominant, but increased in strength as rotation rate increased. For structurally vacillating flow, which matches the Earth's thermal Rossby number well, the ratio between energy conversions associated with baroclinic and barotropic instabilities was similar to the measured ratio in the Earth's mid-latitudes.

  12. Simulaid: a simulation facilitator and analysis program.

    PubMed

    Mezei, Mihaly

    2010-11-15

    Simulaid performs a large number of simulation-related tasks: interconversion and modification of structure and trajectory files, optimization of orientation, and a large variety of analysis functions. The program can handle structures in PDB (Berman et al., Nucleic Acids Res 2000, 28, 235), Charmm (Brooks et al., J Comput Chem 4, 187) CRD, Amber (Case et al.), Macromodel (Mohamadi et al., J Comput Chem 1990, 11, 440), Gromos/Gromacs (Hess et al.), InsightII (InsightII. Accelrys Inc.: San Diego, 2005), Grasp (Nicholls et al., Proteins: Struct Funct Genet 1991, 11, 281) .crg, Tripos (Tripos International, S. H. R., St. Louis, MO) .mol2 (input only), and in the MMC (Mezei, M.; MMC: Monte Carlo program for molecular assemblies. Available at: http://inka.mssm.edu/~mezei/mmc) formats; and trajectories in the formats of Charmm, Amber, Macromodel, and MMC. Analysis features include (but are not limited to): (1) simple distance calculations and hydrogen-bond analysis, (2) calculation of 2-D RMSD maps (produced both as text file with the data and as a color-coded matrix) and cross RMSD maps between trajectories, (3) clustering based on RMSD maps, (4) analysis of torsion angles, Ramachandran (Ramachandran and Sasiskharan, Adv Protein Chem 1968, 23, 283) angles, proline kink (Visiers et al., Protein Eng 2000, 13, 603) angles, pseudorotational (Altona and Sundaralingam, J Am Chem Soc 1972, 94, 8205; Cremer and Pople, J Am Chem Soc 1975, 97, 1354) angles, and (5) analysis based on circular variance (Mezei, J Mol Graphics Model 2003, 21, 463). Torsion angle evolutions are presented in dial plots (Ravishanker et al., J Biomol Struct Dyn 1989, 6, 669). Several of these features are unique to Simulaid. PMID:20740566
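
    The NumPy sketch below shows the computation behind a 2-D RMSD map and a simple RMSD-threshold clustering for pre-aligned trajectory frames; it is an illustration of the calculation, not Simulaid's own code.

```python
# Illustration of a 2-D RMSD map and threshold clustering for pre-aligned
# trajectory frames stored as an array of shape (n_frames, n_atoms, 3).
import numpy as np

rng = np.random.default_rng(3)
traj = rng.normal(size=(40, 25, 3))
traj[20:] += 2.0                          # crude second "conformational cluster"

def rmsd_matrix(x):
    n = len(x)
    m = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            m[i, j] = m[j, i] = np.sqrt(((x[i] - x[j]) ** 2).sum(axis=1).mean())
    return m

def threshold_clusters(m, cutoff):
    """Greedy clustering: each frame joins the first cluster whose
    representative frame lies within the RMSD cutoff."""
    reps, labels = [], []
    for i in range(len(m)):
        for c, r in enumerate(reps):
            if m[i, r] < cutoff:
                labels.append(c)
                break
        else:
            reps.append(i)
            labels.append(len(reps) - 1)
    return labels

m = rmsd_matrix(traj)
labels = threshold_clusters(m, cutoff=3.0)
print("number of clusters found:", len(set(labels)))
```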

  13. Thermal analysis of simulated Pantex pit storage

    SciTech Connect

    Aceves, S.M.; Kornblum, B.T.

    1996-10-01

    This report investigates potential pit storage configurations that could be used at the Mason and Hanger Pantex Plant. The study utilizes data from a thermal test series performed at Lawrence Livermore National Laboratory (LLNL) that simulated these storage configurations. The heat output values used in the LLNL test series do not represent actual pits but are rounded numbers that were chosen for convenience to allow parameter excursions. Specifically in this project, we are modeling the heat transfer and air flow around cylindrical storage containers in Pantex magazines in order to predict container temperatures. This difficult problem in thermal-fluid mechanics involves transient, three-dimensional (3-D) natural convection and thermal radiation around interacting containers with various heat generation rates. Our approach is to link together two computational methods in order to synthesize a modeling procedure for a large array of pit storage containers. The approach employs a finite element analysis of a few containers, followed by a lumped-parameter model of an array of containers. The modeling procedure we developed was applied in the simulation of a recent experiment where temperatures of pit storage containers were monitored in a steady-state, controlled environment. Our calculated pit container temperatures are comparable with data from that experiment. We found it absolutely necessary to include thermal radiation between containers in order to predict temperatures accurately, although the assumption of black-body radiation appears to be sufficient. When radiation is neglected the calculated temperatures are 4 to 6 °C higher than temperature data from the experiment. We also investigated our model's sensitivity to variations in the natural convection heat transfer coefficient and found that with a 50% drop in the coefficient, calculated temperatures are approximately 1 °C higher. Finally, with a modified lumped-parameter model, we
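
    The toy lumped-parameter balance below, with invented (non-Pantex) parameters, shows the effect reported above: dropping the grey-body radiation term from a single container's energy balance raises its predicted steady-state temperature by a few degrees Celsius.

```python
# Toy lumped-parameter model of one heat-generating storage container in still
# air: convection to the magazine air plus grey-body radiation exchange with the
# surrounding walls. Parameter values are invented, not Pantex data.
from scipy.optimize import brentq

SIGMA = 5.67e-8          # W/m2K4, Stefan-Boltzmann constant
Q = 20.0                 # W, container heat generation
A = 1.2                  # m2, container surface area
h = 3.0                  # W/m2K, natural-convection coefficient
eps = 0.9                # surface emissivity (near black-body)
T_air = T_wall = 300.0   # K

def balance(T, radiation=True):
    q_conv = h * A * (T - T_air)
    q_rad = eps * SIGMA * A * (T**4 - T_wall**4) if radiation else 0.0
    return Q - q_conv - q_rad

T_with = brentq(lambda T: balance(T, True), 300.0, 400.0)
T_without = brentq(lambda T: balance(T, False), 300.0, 400.0)
print(f"steady container temperature with radiation:    {T_with - 273.15:.1f} C")
print(f"steady container temperature without radiation: {T_without - 273.15:.1f} C")
```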

  14. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    SciTech Connect

    Boring, Ronald Laurids; Shirley, Rachel Elizabeth; Joe, Jeffrey Clark; Mandelli, Diego

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  15. Visual cavity analysis in molecular simulations

    PubMed Central

    2013-01-01

    Molecular surfaces provide a useful means for analyzing interactions between biomolecules, such as the identification and characterization of ligand binding sites on a host macromolecule. We present a novel technique which extracts potential binding sites, represented by cavities, and characterizes them by 3D graphs and by amino acids. The binding sites are extracted using implicit function sampling and graph algorithms. We propose an advanced cavity exploration technique based on the graph parameters and associated amino acids. Additionally, we interactively visualize the graphs in the context of the molecular surface. We apply our method to the analysis of MD simulations of Proteinase 3, where we verify the previously described cavities and suggest a new potential cavity to be studied. PMID:24564409

  16. Simulation and Analysis of Launch Teams (SALT)

    NASA Technical Reports Server (NTRS)

    2008-01-01

    A SALT effort was initiated in late 2005 with seed funding from the Office of Safety and Mission Assurance Human Factors organization. Its objectives included demonstrating human behavior and performance modeling and simulation technologies for launch team analysis, training, and evaluation. The goal of the research is to improve future NASA operations and training. The project employed an iterative approach, with the first iteration focusing on the last 70 minutes of a nominal-case Space Shuttle countdown, the second iteration focusing on aborts and launch commit criteria violations, the third iteration focusing on Ares I-X communications, and the fourth iteration focusing on Ares I-X Firing Room configurations. SALT applied new commercial off-the-shelf technologies from industry and the Department of Defense in the spaceport domain.

  17. Radiation and ionization energy loss simulation for the GDH sum rule experiment in Hall-A at Jefferson Lab

    DOE PAGESBeta

    Yan, Xin-Hu; Ye, Yun-Xiu; Chen, Jian-Ping; Lu, Hai-Jiang; Zhu, Peng-Jia; Jiang, Feng-Jian

    2015-07-17

    The radiation and ionization energy loss are presented for the single-arm Monte Carlo simulation for the GDH sum rule experiment in Hall-A at Jefferson Lab. Radiation and ionization energy loss are discussed for the $^{12}$C elastic scattering simulation. The relative momentum ratio $\frac{\Delta p}{p}$ and the $^{12}$C elastic cross section are compared with and without radiation energy loss, and a reasonable shape is obtained by the simulation. The total energy loss distribution is obtained, showing a Landau shape for $^{12}$C elastic scattering. This simulation work will give good support for the radiation correction analysis of the GDH sum rule experiment.

  18. Radiation and ionization energy loss simulation for the GDH sum rule experiment in Hall-A at Jefferson Lab

    SciTech Connect

    Yan, Xin-Hu; Ye, Yun-Xiu; Chen, Jian-Ping; Lu, Hai-Jiang; Zhu, Peng-Jia; Jiang, Feng-Jian

    2015-07-17

    The radiation and ionization energy loss are presented for the single-arm Monte Carlo simulation for the GDH sum rule experiment in Hall-A at Jefferson Lab. Radiation and ionization energy loss are discussed for the $^{12}$C elastic scattering simulation. The relative momentum ratio $\frac{\Delta p}{p}$ and the $^{12}$C elastic cross section are compared with and without radiation energy loss, and a reasonable shape is obtained by the simulation. The total energy loss distribution is obtained, showing a Landau shape for $^{12}$C elastic scattering. This simulation work will give good support for the radiation correction analysis of the GDH sum rule experiment.
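
    As a rough illustration of the Landau-shaped total energy loss, the sketch below samples a Moyal distribution, a common closed-form approximation to the Landau shape; the location and scale values are arbitrary and are not fitted to the Hall-A data.

```python
# Illustrative sketch: sample a Moyal distribution (a common closed-form
# approximation to the Landau shape) to show the long high-loss tail of a total
# energy loss distribution. Location/scale values are arbitrary, not fitted.
import numpy as np
from scipy.stats import moyal

losses = moyal.rvs(loc=2.0, scale=0.4, size=100_000, random_state=4)  # arbitrary units

hist, edges = np.histogram(losses, bins=np.linspace(0.0, 8.0, 33))
for lo, n in zip(edges[:-1], hist):
    print(f"{lo:4.2f} |{'#' * int(60 * n // hist.max())}")
```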

  19. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.
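
    The following is only a schematic numerical illustration of the central idea, not the AMA derivation itself: the band-averaged mean-square response of a many-mode system is computed once by summing every mode with its own natural frequency (the CMA way) and once by multiplying the in-band mode count by a single representative mode whose parameters are evaluated only at the band-center frequency (the AMA way).

```python
# Schematic illustration of the asymptotic idea: full modal sum versus a
# mode-count times band-center estimate. Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(5)
zeta = 0.03                                    # modal damping ratio
wn = np.sort(rng.uniform(500.0, 1500.0, 250))  # natural frequencies (rad/s)
w1, w2 = 800.0, 1200.0                         # analysis band
w = np.linspace(w1, w2, 4000)                  # excitation frequencies in the band

def modal_term(wn_i, w):
    """Mean-square contribution of one unit-excited mode at frequencies w."""
    return 1.0 / ((wn_i**2 - w**2) ** 2 + (2.0 * zeta * wn_i * w) ** 2)

# (a) Classical modal analysis: every mode, then average over the band.
cma = sum(modal_term(wi, w) for wi in wn).mean()

# (b) Asymptotic estimate: in-band mode count times one representative mode
#     whose parameters are evaluated only at the band-center frequency.
n_band = np.count_nonzero((wn >= w1) & (wn <= w2))
wc = 0.5 * (w1 + w2)
ama = n_band * modal_term(wc, w).mean()

print(f"band-averaged mean-square response, CMA sum: {cma:.3e}")
print(f"band-averaged mean-square response, AMA est: {ama:.3e}")
# The two estimates approach one another as the number of modes in the band
# grows, which is the asymptotic limit the method exploits.
```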

  20. Sampling errors in free energy simulations of small molecules in lipid bilayers.

    PubMed

    Neale, Chris; Pomès, Régis

    2016-10-01

    Free energy simulations are a powerful tool for evaluating the interactions of molecular solutes with lipid bilayers as mimetics of cellular membranes. However, these simulations are frequently hindered by systematic sampling errors. This review highlights recent progress in computing free energy profiles for inserting molecular solutes into lipid bilayers. Particular emphasis is placed on a systematic analysis of the free energy profiles, identifying the sources of sampling errors that reduce computational efficiency, and highlighting methodological advances that may alleviate sampling deficiencies. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg. PMID:26952019

  1. First assessment of continental energy storage in CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Cuesta-Valero, Francisco José; García-García, Almudena; Beltrami, Hugo; Smerdon, Jason E.

    2016-05-01

    Although much of the energy gained by the climate system over the last century has been stored in the oceans, continental energy storage remains important to estimate the Earth's energy imbalance and also because crucial positive climate feedback processes such as soil carbon and permafrost stability depend on continental energy storage. Here for the first time, 32 general circulation model simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) are examined to assess their ability to characterize the continental energy storage. Results display a consistently lower magnitude of continental energy storage in CMIP5 simulations than the estimates from geothermal data. A large range in heat storage is present across the model ensemble, which is largely explained by the substantial differences in the bottom boundary depths used in each land surface component.

  2. Image simulation for electron energy loss spectroscopy

    SciTech Connect

    Oxley, Mark P.; Pennycook, Stephen J.

    2007-10-22

    In this paper, aberration correction of the probe-forming optics of the scanning transmission electron microscope has allowed the probe-forming aperture to be increased in size, resulting in probes of the order of 1 Å in diameter. The next generation of correctors promises even smaller probes. Improved spectrometer optics also offers the possibility of larger electron energy loss spectrometry detectors. The localization of images based on core-loss electron energy loss spectroscopy is examined as a function of both probe-forming aperture and detector size. The effective ionization is nonlocal in nature, and two common local approximations are compared to full nonlocal calculations. Finally, the effect of the channelling of the electron probe within the sample is also discussed.

  3. Image simulation for electron energy loss spectroscopy

    DOE PAGESBeta

    Oxley, Mark P.; Pennycook, Stephen J.

    2007-10-22

    In this paper, aberration correction of the probe-forming optics of the scanning transmission electron microscope has allowed the probe-forming aperture to be increased in size, resulting in probes of the order of 1 Å in diameter. The next generation of correctors promises even smaller probes. Improved spectrometer optics also offers the possibility of larger electron energy loss spectrometry detectors. The localization of images based on core-loss electron energy loss spectroscopy is examined as a function of both probe-forming aperture and detector size. The effective ionization is nonlocal in nature, and two common local approximations are compared to full nonlocal calculations. Finally, the effect of the channelling of the electron probe within the sample is also discussed.

  4. Fast Monte Carlo for ion beam analysis simulations

    NASA Astrophysics Data System (ADS)

    Schiettekatte, François

    2008-04-01

    A Monte Carlo program for the simulation of ion beam analysis data is presented. It combines mainly four features: (i) ion slowdown is computed separately from the main scattering/recoil event, which is directed towards the detector. (ii) A virtual detector, that is, a detector larger than the actual one, can be used, followed by trajectory correction. (iii) For each collision during ion slowdown, scattering angle components are extracted from tables. (iv) Tables of scattering angle components, stopping power and energy straggling are indexed using the binary representation of floating point numbers, which allows logarithmic distribution of these tables without the computation of logarithms to access them. Tables are sufficiently fine-grained that interpolation is not necessary. Ion slowdown computation thus avoids trigonometric, inverse and transcendental function calls and, as much as possible, divisions. All these improvements make possible the computation of 10⁷ collisions/s on current PCs. Results for transmitted ions of several masses in various substrates are well comparable to those obtained using SRIM-2006 in terms of both angular and energy distributions, as long as a sufficiently large number of collisions is considered for each ion. Examples of simulated spectra show good agreement with experimental data, although a large detector rather than the virtual detector has to be used to properly simulate background signals that are due to plural collisions. The program, written in standard C, is open-source and distributed under the terms of the GNU General Public License.
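
    The fourth feature above, indexing log-spaced tables by the binary representation of floating-point numbers, can be sketched in a few lines: because the IEEE-754 exponent field already encodes floor(log2(x)), shifting the raw bit pattern yields a logarithmically distributed index with no log() call. The sketch below (Python rather than the program's standard C, with dummy table contents) is only an illustration of the trick, not code from the program.

      # Log-spaced table lookup keyed on the bit pattern of a positive float.
      import struct
      import numpy as np

      MANT_BITS = 23        # IEEE-754 single precision
      SUB_BITS = 4          # keep 4 mantissa bits -> 16 sub-bins per octave

      def float_index(value: float) -> int:
          """Map a positive float to a table index using its raw bits (no log call)."""
          bits = struct.unpack("<I", struct.pack("<f", value))[0]
          return bits >> (MANT_BITS - SUB_BITS)

      # Dummy "stopping power" table, dense enough that no interpolation is needed.
      table = 0.1 * np.sqrt(np.arange(2 ** (8 + 1 + SUB_BITS)))   # placeholder values

      for e in (1.0, 10.0, 100.0, 1000.0):                        # energies, arbitrary units
          i = float_index(e)
          print(f"E = {e:7.1f} -> index {i:4d} -> S = {table[i]:.2f}")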

  5. Electromyographic analysis on a windsurfing simulator

    PubMed Central

    Campillo, Philippe; Leszczynski, Barbara; Marthe, Cédric; Hespel, Jean Michel

    2007-01-01

    Recent technical innovations in windsurfing have concentrated on the evolution of the sails and the board. Only recently have manufacturers become interested in the wishbones, which have evolved to become thinner and lighter than in the past. A group of six experienced windsurfers participated in an experiment on a land-based windsurfing simulator. The goal of the study was to analyze the muscular force used with different techniques for holding onto the wishbone. The test consisted of recording the global electromyographic activity of several forearm muscles using surface electrodes. The simulator provided two wind-force conditions: medium (15 kg) and strong (25 kg). Three wishbone diameters were tested (28, 30 and 32 mm). Four hand positions on the wishbone were analyzed: leading hand and/or following hand in pronation and/or supination. The electrical muscular activity varied significantly (p < 0.05) with the type of grip and with the diameter of the wishbone. The position with both hands in supination on a wishbone of 28 mm in diameter was the most economical in muscular terms, notably for the forearm flexors. These results should lead windsurfers to reconsider the positioning of the wishbone and the adopted posture so as to expend as little energy as possible. PMID:24149235

  6. Electromyographic analysis on a windsurfing simulator.

    PubMed

    Campillo, Philippe; Leszczynski, Barbara; Marthe, Cédric; Hespel, Jean Michel

    2007-01-01

    Recent technical innovations in windsurfing have concentrated on the evolution of the sails and the board. Only recently have manufacturers become interested in the wishbones, which have evolved to become thinner and lighter than in the past. A group of six experienced windsurfers participated in an experiment on a land-based windsurfing simulator. The goal of the study was to analyze the muscular force used with different techniques for holding onto the wishbone. The test consisted of recording the global electromyographic activity of several forearm muscles using surface electrodes. The simulator provided two wind-force conditions: medium (15 kg) and strong (25 kg). Three wishbone diameters were tested (28, 30 and 32 mm). Four hand positions on the wishbone were analyzed: leading hand and/or following hand in pronation and/or supination. The electrical muscular activity varied significantly (p < 0.05) with the type of grip and with the diameter of the wishbone. The position with both hands in supination on a wishbone of 28 mm in diameter was the most economical in muscular terms, notably for the forearm flexors. These results should lead windsurfers to reconsider the positioning of the wishbone and the adopted posture so as to expend as little energy as possible. PMID:24149235

  7. Analysis of five simulated straw harvest scenarios

    SciTech Connect

    Sokhansanj, Shahabaddine; Turhollow Jr, Anthony F; Stephen, Jamie; Stumborg, Mark; Fenton, James; Mani, Sudhagar

    2008-01-01

    Almost 36 million tonnes (t) of cereal grains are harvested annually on more than 16 million hectares (ha) in Canada. The net straw production varies year by year depending upon weather patterns, crop fertility, soil conservation measures, harvest method, and plant variety. The net yield of straw, after discounting for soil conservation, averages approximately 2.5 dry tonnes (dt) ha⁻¹. Efficient equipment is needed to collect and package the material as a feedstock for industrial applications. This paper investigates the costs, energy input, and emissions from power equipment used for harvesting straw. Five scenarios were investigated: (1) large square bales, (2) round bales, (3) large compacted stacks (loafs), (4) dried chops, and (5) wet chops. The baled or loafed biomass is stacked next to the farm. Dry chop is collected in a large pile and wet chop is ensiled. The baling and stacking cost was $21.47 dt⁻¹, with little difference between round and large square baling. Loafing was the cheapest option at $17.08 dt⁻¹. Dry chop and piling was $23.90 dt⁻¹ and wet chop followed by ensiling was $59.75 dt⁻¹. A significant portion of the wet chop cost was in ensiling. Energy input and emissions were proportional to the costs for each system, except for loafing, which required more energy input than the baling systems. As a fraction of the energy content of biomass (roughly 16 GJ dt⁻¹), the energy input ranged from 1.2% for baling to 3.2% for ensiling. Emissions from the power equipment ranged from 20.3 kg CO2e dt⁻¹ to more than 40 kg CO2e dt⁻¹. A sensitivity analysis on the effect of yield on collection costs showed that a 33% increase in yield reduced the cost by 20%. Similarly, a sensitivity analysis on weather conditions showed that a 10 °C cooler climate extended the harvest period by 5-10 days, whereas a 10 °C warmer climate shortened the harvest period by 2-3 days.

  8. Transport Energy Impact Analysis; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Gonder, J.

    2015-05-13

    Presented at the Sustainable Transportation Energy Pathways Spring 2015 Symposium on May 13, 2015, this presentation by Jeff Gonder of the National Renewable Energy Laboratory (NREL) provides information about NREL's transportation energy impact analysis of connected and automated vehicles.

  9. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  10. Estimating Continental Energy Storage from CMIP5 Simulations

    NASA Astrophysics Data System (ADS)

    José Cuesta-Valero, Francisco; García-García, Almudena; Beltrami, Hugo; Smerdon, Jason

    2016-04-01

    The Earth's energy imbalance is a critical metric for understanding the current state of the Earth's climate system and its future evolution. Although much of the energy gained by the climate system over the last century has been stored in the oceans, the continental subsurface energy storage remains important because climate feedback processes such as soil carbon and permafrost stability depend on long-term subsurface energy storage. Here, for the first time, thirty-two General Circulation Model (GCM) simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) were examined to assess their ability to account for the continental energy storage. The magnitude of the subsurface heat content derived from GCM simulations is consistently lower than the estimates from borehole temperature data for the second half of the 20th century. The estimates of continental heat storage from CMIP5 simulations also display a large range of variability, which may be partially due to (1) the different bottom boundary depth of each GCM land surface component, limiting the subsurface heat storage, (2) the different energy exchange parameterizations between the lower atmosphere and the ground within each model, and (3) the different sensitivity of models to external forcings. Our results suggest that a deeper bottom boundary placement in the land surface component could improve the estimates of subsurface energy content within the GCM simulations.
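
    For orientation, the continental heat storage being compared here is essentially the depth integral of the subsurface temperature anomaly weighted by volumetric heat capacity, Q = A ∫ ρc ΔT(z) dz. The sketch below evaluates that integral for an assumed anomaly profile and assumed rock properties; none of the numbers come from the study.

      # Back-of-the-envelope continental heat storage from a temperature anomaly profile.
      import numpy as np

      rho_c = 2.5e6                      # volumetric heat capacity of rock, J m^-3 K^-1 (assumed)
      area = 1.5e14                      # continental area, m^2 (approximate)

      z = np.linspace(0.0, 300.0, 601)   # depth, m
      dT = 0.8 * np.exp(-z / 100.0)      # assumed warming anomaly profile, K

      # trapezoidal integral of rho_c * dT over depth
      q_per_area = rho_c * np.sum(0.5 * (dT[1:] + dT[:-1]) * np.diff(z))   # J m^-2
      print(f"heat stored per unit area: {q_per_area / 1e9:.2f} GJ m^-2")
      print(f"continental heat storage : {q_per_area * area / 1e21:.1f} ZJ")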

  11. Performance of statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. F.; Hines, D. E.

    1973-01-01

    Statistical energy analysis (SEA) methods have been developed for high frequency modal analyses on random vibration environments. These SEA methods are evaluated by comparing analytical predictions to test results. Simple test methods are developed for establishing SEA parameter values. Techniques are presented, based on the comparison of the predictions with test values, for estimating SEA accuracy as a function of frequency for a general structure.

  12. Current work in energy analysis

    SciTech Connect

    1998-03-01

    This report describes recent work performed at Berkeley Lab. One of the Lab's accomplishments is the publication of Scenarios of US Carbon Reductions, an analysis of the potential of energy technologies to reduce carbon emissions in the US. This analysis is described and played a key role in shaping the US position on climate change in the Kyoto Protocol negotiations. The Lab's participation in the fundamental characterization of the climate change issue by the IPCC is described. Also described is a study of leaking electricity, which is stimulating an international campaign for a one-watt ceiling for standby electricity losses from appliances. This ceiling has the potential to save two-thirds of the 5% of US residential electricity currently expended on standby losses. The 54 vignettes contained in the report summarize results of research activities ranging in scale from calculating the efficacy of individual lamp ballasts to estimating the cost-effectiveness of the national Energy Star® labeling program, and ranging in location from a scoping study of energy-efficiency market transformation in California to development of an energy-efficiency project in the auto parts industry in Shandong Province, China.

  13. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.; Peretti, Linda F.

    1990-01-01

    The sound field of a structural-acoustic enclosure was subject to experimental analysis and theoretical description in order to develop an efficient and accurate method for predicting sound pressure levels in enclosures such as aircraft fuselages. Asymptotic Modal Analysis (AMA) is the method under investigation. AMA is derived from classical modal analysis (CMA) by considering the asymptotic limit of the sound pressure level as the number of acoustic and/or structural modes approaches infinity. Using AMA, results identical to those of Statistical Energy Analysis (SEA) were obtained for the spatially-averaged sound pressure levels in the interior. AMA is systematically derived from CMA and therefore the degree of generality of the end result can be adjusted through the choice of appropriate simplifying assumptions. For example, AMA can be used to obtain local sound pressure levels at particular points inside the enclosure, or to include the effects of varying the size and/or location of the sound source. AMA theoretical results were compared with CMA theory and also with experiment for the case where the structural-acoustic enclosure is a rectangular cavity with part of one wall flexible and vibrating, while the rest of the cavity is rigid.

  14. Methodology for analysis and simulation of large multidisciplinary problems

    NASA Technical Reports Server (NTRS)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  15. Combustion irreversibilities: Numerical simulation and analysis

    NASA Astrophysics Data System (ADS)

    Silva, Valter; Rouboa, Abel

    2012-08-01

    An exergy analysis was performed for the combustion of methane and of agro-industrial residues produced in Portugal (forest residues and vine prunings). Because the irreversibilities of a thermodynamic process are path dependent, the combustion process was considered as resulting from different hypothetical paths, each characterized by four main sub-processes: reactant mixing, fuel oxidation, internal thermal energy exchange (heat transfer), and product mixing. The exergetic efficiency was computed using a zero-dimensional model implemented in an in-house Visual Basic code. It was concluded that the exergy losses were mainly due to the internal thermal energy exchange sub-process. The exergy losses from this sub-process are higher when the reactants are preheated up to the ignition temperature without previous fuel oxidation. On the other hand, the global exergy destruction can be reduced by increasing the pressure, the reactant temperature, and the oxygen content of the oxidant stream. This methodology identifies the phenomena and processes with the largest exergy losses, and clarifies why these losses occur and how the exergy changes with the parameters of each system, which is crucial for establishing syngas combustion from biomass products as a competitive technology.
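
    As a minimal numerical illustration of why the internal thermal-energy-exchange sub-process dominates the losses, the sketch below applies the Gouy-Stodola relation, X_dest = T0 * S_gen, to heat transferred across a finite temperature difference. The temperatures and heat duty are invented for illustration and are not taken from the paper.

      # Exergy destroyed by heat transfer across a finite temperature difference
      # (Gouy-Stodola theorem); all numbers are illustrative assumptions.
      T0 = 298.15        # dead-state temperature, K
      T_hot = 2000.0     # assumed combustion-product temperature, K
      T_cold = 600.0     # assumed reactant preheat temperature, K
      Q = 50.0e3         # heat transferred, J per mole of fuel (assumed)

      S_gen = Q * (1.0 / T_cold - 1.0 / T_hot)   # entropy generated, J/K
      X_dest = T0 * S_gen                        # exergy destroyed, J
      print(f"S_gen = {S_gen:.1f} J/K, X_dest = {X_dest / 1e3:.1f} kJ "
            f"({100 * X_dest / Q:.0f}% of the heat transferred)")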

  16. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1988-01-01

    Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focusses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December, 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.

  17. Balancing Accuracy and Cost of Confinement Simulations by Interpolation and Extrapolation of Confinement Energies.

    PubMed

    Villemot, François; Capelli, Riccardo; Colombo, Giorgio; van der Vaart, Arjan

    2016-06-14

    Improvements to the confinement method for the calculation of conformational free energy differences are presented. By taking advantage of phase space overlap between simulations at different frequencies, significant gains in accuracy and speed are reached. The optimal frequency spacing for the simulations is obtained from extrapolations of the confinement energy, and relaxation time analysis is used to determine time steps, simulation lengths, and friction coefficients. At postprocessing, interpolation of confinement energies is used to significantly reduce discretization errors in the calculation of conformational free energies. The efficiency of this protocol is illustrated by applications to alanine n-peptides and lactoferricin. For the alanine-n-peptide, errors were reduced between 2- and 10-fold and sampling times between 8- and 67-fold, while for lactoferricin the long sampling times at low frequencies were reduced 10-100-fold. PMID:27120438
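
    A generic sketch of the underlying quadrature (not the authors' protocol): for a harmonic confinement potential the free energy change obeys dG/d(ln k) = <E_conf>, so the conformational free energy is obtained by integrating the mean confinement energy over log-spaced force constants, and interpolating <E_conf> onto a finer grid before integrating reduces the discretization error the abstract refers to. The <E_conf>(k) data below are synthetic.

      # Confinement-style quadrature with and without interpolation of the integrand.
      import numpy as np
      from scipy.interpolate import CubicSpline

      k_sim = np.logspace(-2, 3, 4)               # coarse set of simulated force constants
      e_conf = 1.2 * k_sim / (k_sim + 0.5)        # synthetic mean confinement energies

      lnk = np.log(k_sim)
      g_coarse = np.sum(0.5 * (e_conf[1:] + e_conf[:-1]) * np.diff(lnk))   # trapezoid

      spline = CubicSpline(lnk, e_conf)           # interpolate, then integrate the spline
      g_interp = spline.integrate(lnk[0], lnk[-1])

      g_exact = 1.2 * np.log((k_sim[-1] + 0.5) / (k_sim[0] + 0.5))         # exact for this toy curve
      print(f"coarse trapezoid: {g_coarse:.3f}  interpolated: {g_interp:.3f}  exact: {g_exact:.3f}")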

  18. Contribution to solving the energy crisis - Simulating the prospects for low cost energy through silicon solar cells

    NASA Technical Reports Server (NTRS)

    Kran, A.

    1978-01-01

    PECAN (Photovoltaic Energy Conversion Analysis) is a highly interactive decision analysis and support system. It simulates the prospects for widespread use of solar cells for the generation of electrical power. PECAN consists of a set of integrated APL functions for evaluating the potential of terrestrial photovoltaics. Specifically, the system is a deterministic simulator, which translates present and future manufacturing technology into economic and financial terms, using the production unit concept. It guides solar cell development in three areas: tactical decision making, strategic planning, and the formulation of alternative options.

  19. Plasma simulations of emission line regions in high energy environments

    NASA Astrophysics Data System (ADS)

    Richardson, Chris T.

    This dissertation focuses on understanding two different, but in each case extreme, astrophysical environments: the Crab Nebula and emission line galaxies. These relatively local objects are well constrained by observations and are test cases of phenomena seen at high-z where detailed observations are rare. The tool used to study these objects is the plasma simulation code known as Cloudy. The introduction provides a brief summary of relevant physical concepts in nebular astrophysics and presents the basic features and assumptions of Cloudy. The first object investigated with Cloudy, the Crab Nebula, is a nearby supernova remnant that previously has been subject to photoionization modeling to reproduce the ionized emission seen in the nebula's filamentary structure. However, there are still several unanswered questions: (1) What excites the H2 emitting gas? (2) How much mass is in the molecular component? (3) How did the H2 form? (4) What is nature of the dust grains? A large suite of observations including long slit optical and NIR spectra over ionized, neutral and molecular gas in addition to HST and NIR ground based images constrain a particularly bright region of H2 emission, Knot 51, which exhibits a high excitation temperature of ˜3000 K. Simulations of K51 revealed that only a trace amount of H2 is needed to reproduce the observed emission and that H2 forms through an uncommon nebular process known as associative detachment. The final chapters of this dissertation focus on interpreting the narrow line region (NLR) in low-z emission line galaxies selected by a novel technique known as mean field independent component analysis (MFICA). A mixture of starlight and radiation from an AGN excites the gas present in galaxies. MFICA separates galaxies over a wide range of ionization into subsets of pure AGN and pure star forming galaxies allowing simulations to reveal the properties responsible for their observed variation in ionization. Emission line ratios can

  20. On energy and momentum conservation in particle-in-cell plasma simulation

    NASA Astrophysics Data System (ADS)

    Brackbill, J. U.

    2016-07-01

    Particle-in-cell (PIC) plasma simulations are a productive and valued tool for the study of nonlinear plasma phenomena, yet there are basic questions about the simulation methods themselves that remain unanswered. Here we study energy and momentum conservation by PIC. We employ both analysis and simulations of one-dimensional, electrostatic plasmas to understand why PIC simulations are either energy or momentum conserving but not both, what role numerical stability plays in non-conservation, and how errors in conservation scale with the numerical parameters. Conserving both momentum and energy makes it possible to model problems such as Jeans'-type equilibria. Avoiding numerical instability is useful, but so is being able to identify when its effect on the results may be important. Designing simulations to achieve the best possible accuracy with the least expenditure of effort requires results on the scaling of error with the numerical parameters. Our results identify the central role of Gauss' law in conservation of both momentum and energy, and the significant differences in numerical stability and error scaling between energy-conserving and momentum-conserving simulations.
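
    To make the trade-off concrete, here is a minimal one-dimensional electrostatic PIC sketch of my own (cloud-in-cell weighting, FFT Poisson solve, leapfrog push, periodic boundaries, normalized units) that prints kinetic, field, and total energy so the energy drift of a momentum-conserving scheme can be observed. It illustrates the class of methods analyzed, not the paper's code.

      # Minimal 1-D electrostatic PIC with energy diagnostics (normalized units).
      import numpy as np

      ng, L, npart, dt, nsteps = 64, 2 * np.pi, 20000, 0.05, 400
      dx = L / ng
      rng = np.random.default_rng(1)
      x = rng.uniform(0, L, npart)                 # electron positions
      v = 0.1 * rng.standard_normal(npart)         # thermal velocities
      qp, mp = -L / npart, L / npart               # particle charge and mass (q/m = -1)

      def deposit(x):
          """Cloud-in-cell charge density, plus a uniform neutralizing ion background."""
          rho = np.zeros(ng)
          g = x / dx
          i = np.floor(g).astype(int) % ng
          w = g - np.floor(g)
          np.add.at(rho, i, qp * (1 - w) / dx)
          np.add.at(rho, (i + 1) % ng, qp * w / dx)
          return rho + 1.0

      def field(rho):
          """Solve d2(phi)/dx2 = -rho spectrally; E = -d(phi)/dx."""
          k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
          rhok = np.fft.fft(rho)
          phik = np.zeros_like(rhok)
          phik[1:] = rhok[1:] / k[1:] ** 2
          return np.real(np.fft.ifft(-1j * k * phik))

      def gather(E, x):
          g = x / dx
          i = np.floor(g).astype(int) % ng
          w = g - np.floor(g)
          return E[i] * (1 - w) + E[(i + 1) % ng] * w

      v += 0.5 * dt * gather(field(deposit(x)), x)     # stagger v to t = -dt/2 (dv/dt = -E)
      for step in range(nsteps + 1):
          E = field(deposit(x))
          v -= dt * gather(E, x)                       # accelerate (q/m = -1)
          x = (x + v * dt) % L
          if step % 100 == 0:
              ke = 0.5 * mp * np.sum(v ** 2)
              fe = 0.5 * dx * np.sum(E ** 2)
              print(f"step {step:4d}  kinetic={ke:.5f}  field={fe:.5f}  total={ke + fe:.5f}")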

  1. NREL Develops Diagnostic Test Cases to Improve Building Energy Simulation Programs (Fact Sheet)

    SciTech Connect

    Not Available

    2011-12-01

    This technical highlight describes NREL research to develop a set of diagnostic test cases for building energy simulations in order to achieve more accurate energy use and savings predictions. The National Renewable Energy Laboratory (NREL) Residential and Commercial Buildings research groups developed a set of diagnostic test cases for building energy simulations. Eight test cases were developed to test surface conduction heat transfer algorithms of building envelopes in building energy simulation programs. These algorithms are used to predict energy flow through external opaque surfaces such as walls, ceilings, and floors. The test cases consist of analytical and vetted numerical heat transfer solutions that have been available for decades, which increases confidence in test results. NREL researchers adapted these solutions for comparisons with building energy simulation results. Testing the new cases with EnergyPlus identified issues with the conduction finite difference (CondFD) heat transfer algorithm in versions 5 and 6. NREL researchers resolved these issues for EnergyPlus version 7. The new test cases will help users and developers of EnergyPlus and other building energy tools to identify and fix problems associated with solid conduction heat transfer algorithms of building envelopes and their boundary conditions. In the long term, improvements to software algorithms will result in more accurate energy use and savings predictions. NREL researchers plan to document the set of test cases and make them available for future consideration by validation standards such as ASHRAE Standard 140: Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. EnergyPlus users will also have access to the improved CondFD model in version 7 after its next scheduled release.
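
    In the same spirit as these diagnostic cases, the sketch below compares an explicit finite-difference solution of one-dimensional transient conduction against the classical semi-infinite-solid solution T(x,t) = Ts + (Ti - Ts) * erf(x / (2 * sqrt(alpha * t))). It only illustrates the analytical-versus-numerical comparison such test cases formalize; the material properties and discretization are arbitrary choices, and the cases in the NREL work are more general.

      # Explicit 1-D conduction vs. the semi-infinite analytical solution.
      import numpy as np
      from math import erf, sqrt

      alpha = 1e-6                       # thermal diffusivity, m^2/s (assumed)
      L, nx = 0.5, 101                   # slab deep enough to act semi-infinite here
      dx = L / (nx - 1)
      dt = 0.4 * dx ** 2 / alpha         # explicit stability (Fourier number 0.4 < 0.5)
      t_end = 3600.0                     # one hour
      Ti, Ts = 20.0, 50.0                # initial and step surface temperatures, C

      T = np.full(nx, Ti)
      T[0] = Ts                          # sudden surface temperature change
      t = 0.0
      while t < t_end:
          T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])
          T[-1] = Ti                     # far boundary held at the initial temperature
          t += dt

      x = np.linspace(0, L, nx)
      T_exact = Ts + (Ti - Ts) * np.array([erf(xi / (2 * sqrt(alpha * t))) for xi in x])
      print(f"max |finite difference - analytical| = {np.max(np.abs(T - T_exact)):.3f} C")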

  2. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    applicable to the earliest stage of design, where more informed analysis of possible alternatives could yield the most benefit and the greatest cost savings both economic and environmental. This is where computer modeling and simulation can really lead to better and energy efficient buildings. Both apply to internal environment and human comfort, and environmental impact from surroundings.

  3. Simulating a Nationally Representative Housing Sample Using EnergyPlus

    SciTech Connect

    Hopkins, Asa S.; Lekov, Alex; Lutz, James; Rosenquist, Gregory; Gu, Lixing

    2011-03-04

    This report presents a new simulation tool under development at Lawrence Berkeley National Laboratory (LBNL). This tool uses EnergyPlus to simulate each single-family home in the Residential Energy Consumption Survey (RECS), and generates a calibrated, nationally representative set of simulated homes whose energy use is statistically indistinguishable from the energy use of the single-family homes in the RECS sample. This research builds upon earlier work by Ritchard et al. for the Gas Research Institute and Huang et al. for LBNL. A representative national sample allows us to evaluate the variance in energy use between individual homes, regions, or other subsamples; using this tool, we can also evaluate how that variance affects the impacts of potential policies. The RECS contains information regarding the construction and location of each sampled home, as well as its appliances and other energy-using equipment. We combined this data with the home simulation prototypes developed by Huang et al. to simulate homes that match the RECS sample wherever possible. Where data was not available, we used distributions, calibrated using the RECS energy use data. Each home was assigned a best-fit location for the purposes of weather and some construction characteristics. RECS provides some detail on the type and age of heating, ventilation, and air-conditioning (HVAC) equipment in each home; we developed EnergyPlus models capable of reproducing the variety of technologies and efficiencies represented in the national sample. This includes electric, gas, and oil furnaces, central and window air conditioners, central heat pumps, and baseboard heaters. We also developed a model of duct system performance, based on in-home measurements, and integrated this with fan performance to capture the energy use of single- and variable-speed furnace fans, as well as the interaction of duct and fan performance with the efficiency of heating and cooling equipment. Comparison with RECS revealed

  4. Clean Energy Manufacturing Analysis Center (CEMAC)

    SciTech Connect

    2015-12-01

    The U.S. Department of Energy's Clean Energy Manufacturing Analysis Center (CEMAC) provides objective analysis and up-to-date data on global supply chains and manufacturing of clean energy technologies. Policymakers and industry leaders seek CEMAC insights to inform choices to promote economic growth and the transition to a clean energy economy.

  5. SUPERNOVA SIMULATIONS AND STRATEGIES FOR THE DARK ENERGY SURVEY

    SciTech Connect

    Bernstein, J. P.; Kuhlmann, S.; Biswas, R.; Kovacs, E.; Crane, I.; Hufford, T.; Kessler, R.; Frieman, J. A.; Aldering, G.; Kim, A. G.; Nugent, P.; D'Andrea, C. B.; Nichol, R. C.; Finley, D. A.; Marriner, J.; Reis, R. R. R.; Jarvis, M. J.; Mukherjee, P.; Parkinson, D.; Sako, M.; and others

    2012-07-10

    We present an analysis of supernova light curves simulated for the upcoming Dark Energy Survey (DES) supernova search. The simulations employ a code suite that generates and fits realistic light curves in order to obtain distance modulus/redshift pairs that are passed to a cosmology fitter. We investigated several different survey strategies including field selection, supernova selection biases, and photometric redshift measurements. Using the results of this study, we chose a 30 deg² search area in the griz filter set. We forecast (1) that this survey will provide a homogeneous sample of up to 4000 Type Ia supernovae in the redshift range 0.05
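
    The distance modulus/redshift pairs mentioned above follow from the standard flat LambdaCDM relation d_L(z) = (1+z)(c/H0) * integral_0^z dz'/E(z'), with mu = 5 log10(d_L / 10 pc). The sketch below evaluates it for a few redshifts with assumed cosmological parameters; it illustrates the forward relation only and is unrelated to the DES analysis code.

      # Distance modulus in flat LambdaCDM (assumed H0 and Omega_m, illustrative only).
      import numpy as np

      c_km_s, H0, Om = 299792.458, 70.0, 0.3

      def distance_modulus(z, n=2000):
          zp = np.linspace(0.0, z, n)
          inv_E = 1.0 / np.sqrt(Om * (1 + zp) ** 3 + (1 - Om))
          dc = c_km_s / H0 * np.sum(0.5 * (inv_E[1:] + inv_E[:-1]) * np.diff(zp))  # Mpc
          dl = (1 + z) * dc                                                        # Mpc
          return 5.0 * np.log10(dl * 1e6 / 10.0)                                   # 10 pc reference

      for z in (0.05, 0.5, 1.0):
          print(f"z = {z:4.2f}  mu = {distance_modulus(z):.2f} mag")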

  6. Practical Integration Approach and Whole Building Energy Simulation of Three Energy Efficient Building Technologies: Preprint

    SciTech Connect

    Miller, J. P.; Zhivov, A.; Heron, D.; Deru, M.; Benne, K.

    2010-08-01

    Three technologies that have potential to save energy and improve sustainability of buildings are dedicated outdoor air systems, radiant heating and cooling systems and tighter building envelopes. To investigate the energy savings potential of these three technologies, whole building energy simulations were performed for a barracks facility and an administration facility in 15 U.S. climate zones and 16 international locations.

  7. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    NASA Technical Reports Server (NTRS)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy saving due to energy conservation measures. Predicted energy saving can then be compared with actual saving to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.

  8. SimUGV: a simulator for analyzing energy dynamics and locomotion for unmanned ground vehicles (UGV)

    NASA Astrophysics Data System (ADS)

    Sinha, Aakash K.; Vashishtha, Jyoti

    2006-05-01

    In the area of research on unmanned ground vehicles (UGV), one major problem is the limited operating duration of robotic vehicles due to energy losses. There is a need for systematic analysis of locomotion and energy dynamics, which would enable an efficient design of the vehicle. For this purpose, a multifunction simulator tool is required which can read several input variables that describe the vehicle and compute a detailed analysis of its energy dynamics. This research presents a generic locomotion simulator for a UGV (SimUGV). SimUGV's goal is to help vehicle designers develop efficient vehicles by optimizing design variables to minimize the energy losses for the vehicle. SimUGV has a powerful GUI interface which allows users to compare multiple test runs and visualize the data in a variety of ways. To illustrate the capabilities of the simulator, we present a case study conducted on the energy dynamics of a skid-steering robotic vehicle. The two major components of energy losses/consumption for a skid-steering vehicle are losses in skid-steer turning and losses in rolling. Using SimUGV, we present a detailed energy loss analysis of the vehicle's different turning modes: elastic mode steering, half-slip steering, skid turns, low-radius turns, and zero-radius turns. Each of the energy loss components is modeled from physics in terms of the design variables. The effect of design variables on the total energy losses/consumption is then studied using simulated data for different types of surfaces, i.e., hard surfaces and muddy surfaces. Finally, we make suggestions about efficient vehicle design choices in terms of the design variables.

  9. Multiphysics Simulation in the Development of Thermoelectric Energy Harvesting Systems

    NASA Astrophysics Data System (ADS)

    Nesarajah, Marco; Frey, Georg

    2016-03-01

    This contribution presents a model-based development process for thermoelectric energy harvesting systems. Such systems convert thermal energy into electrical energy and produce enough energy to supply low-power devices. Realizations require three main challenges to be solved: to guarantee optimal thermal connection of the thermoelectric generators, to find a good design for the energy harvesting system, and to find an optimal electrical connection. Therefore, a development process is presented here. The process is divided into different steps and supports the developer in finding an optimal thermoelectric energy harvesting system for a given heat source and given objectives (technical and economical). During the process, several steps are supported by simulation models. Based on developed model libraries in Modelica®/Dymola®, thermal, thermoelectrical, electrical, and control components can be modeled, integrated into different variants, and verified step by step before the system is physically built and finally validated. The process is illustrated by an example through all the steps.
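
    The basic sizing question such model libraries answer can be illustrated with the textbook expression for module output, P = (alpha * dT)^2 * R_load / (R_int + R_load)^2, which peaks when the load matches the internal resistance. The device parameters below are invented for illustration and do not describe any module from the paper.

      # Thermoelectric generator output vs. electrical load (illustrative parameters).
      alpha = 0.05      # effective module Seebeck coefficient, V/K (assumed)
      R_int = 2.0       # internal electrical resistance, ohm (assumed)
      dT = 40.0         # temperature difference across the module, K

      for R_load in (0.5, 1.0, 2.0, 4.0, 8.0):
          P = (alpha * dT) ** 2 * R_load / (R_int + R_load) ** 2
          print(f"R_load = {R_load:3.1f} ohm -> P = {1e3 * P:6.1f} mW")
      # maximum power transfer occurs at R_load = R_int (here 2 ohm, 500 mW)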

  10. Hybrid Simulation Modeling to Estimate U.S. Energy Elasticities

    NASA Astrophysics Data System (ADS)

    Baylin-Stern, Adam C.

    This paper demonstrates how a U.S. application of CIMS, a technologically explicit and behaviourally realistic energy-economy simulation model which includes macro-economic feedbacks, can be used to derive estimates of elasticity of substitution (ESUB) and autonomous energy efficiency index (AEEI) parameters. The ability of economies to reduce greenhouse gas emissions depends on the potential for households and industry to decrease overall energy usage, and move from higher to lower emissions fuels. Energy economists commonly refer to ESUB estimates to understand the degree of responsiveness of various sectors of an economy, and use estimates to inform computable general equilibrium models used to study climate policies. Using CIMS, I have generated a set of future 'pseudo-data' based on a series of simulations in which I vary energy and capital input prices over a wide range. I then used this data set to estimate the parameters for transcendental logarithmic production functions using regression techniques. From the production function parameter estimates, I calculated an array of elasticity of substitution values between input pairs. Additionally, this paper demonstrates how CIMS can be used to calculate price-independent changes in energy efficiency in the form of the AEEI, by comparing energy consumption between technologically frozen and 'business as usual' simulations. The paper concludes with some ideas for model and methodological improvement, and how these might figure into future work in the estimation of ESUBs from CIMS. Keywords: Elasticity of substitution; hybrid energy-economy model; translog; autonomous energy efficiency index; rebound effect; fuel switching.
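
    As a much-simplified stand-in for the estimation idea (not the translog procedure used in the thesis), the sketch below generates CES 'pseudo-data' with a known elasticity of substitution and recovers it by regressing the log input ratio on the log price ratio, which is the first-order relationship a cost-minimizing CES technology implies.

      # Recover a known elasticity of substitution from synthetic pseudo-data.
      import numpy as np

      rng = np.random.default_rng(42)
      sigma_true, n = 0.7, 200
      pK = rng.uniform(0.5, 2.0, n)      # capital price scenarios
      pE = rng.uniform(0.5, 2.0, n)      # energy price scenarios

      # Cost-minimizing CES input ratio: ln(E/K) = const + sigma * ln(pK/pE) + noise
      lnEK = 0.2 + sigma_true * np.log(pK / pE) + 0.05 * rng.standard_normal(n)

      X = np.column_stack([np.ones(n), np.log(pK / pE)])
      coef, *_ = np.linalg.lstsq(X, lnEK, rcond=None)
      print(f"true sigma = {sigma_true}, estimated sigma = {coef[1]:.3f}")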

  11. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
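
    For readers unfamiliar with the method, the core FDTD update can be sketched in one dimension: electric and magnetic fields live on staggered grids and are advanced in a leapfrog fashion. The toy Cartesian sketch below (normalized units, soft Gaussian source, reflecting ends) only illustrates that update pattern; it is not the spherical-coordinate formulation derived in the report.

      # Toy 1-D Yee/FDTD update with a Gaussian (UWB-like) pulse, normalized units.
      import numpy as np

      nz, nsteps, src = 400, 800, 100
      ez = np.zeros(nz)          # electric field
      hy = np.zeros(nz)          # magnetic field (staggered half a cell)

      for n in range(nsteps):
          ez[1:] += 0.5 * (hy[:-1] - hy[1:])                   # Courant number 0.5 folded in
          ez[src] += np.exp(-0.5 * ((n - 60) / 15.0) ** 2)     # soft Gaussian source
          hy[:-1] += 0.5 * (ez[:-1] - ez[1:])
          # ez[0] and hy[-1] are never updated, so the grid ends act as simple reflectors

      print(f"peak |Ez| on the grid after {nsteps} steps: {np.abs(ez).max():.3f}")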

  12. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  13. A New Model to Simulate Energy Performance of VRF Systems

    SciTech Connect

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51% to 85%, while the TDV (Time Dependent Valuation) energy savings range from 31% to 66% compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in Fresno climate followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
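
    The capacity modifier functions referred to above are typically low-order polynomial performance curves. The sketch below shows the general shape of such a curve, here a biquadratic in indoor-air wet-bulb temperature and evaporating temperature; the coefficients and rated capacity are made-up placeholders, not values from the EnergyPlus implementation described in the paper.

      # Biquadratic capacity-modifier curve with placeholder coefficients.
      def cap_ft(t_wb, t_evap, coeffs=(0.50, 0.02, -1.0e-4, 0.015, -2.0e-4, 3.0e-4)):
          a, b, c, d, e, f = coeffs
          return a + b * t_wb + c * t_wb ** 2 + d * t_evap + e * t_evap ** 2 + f * t_wb * t_evap

      rated_capacity_kw = 7.0                 # assumed indoor-unit rated cooling capacity
      for t_evap in (4.0, 6.0, 8.0):          # evaporating temperature, C
          modifier = cap_ft(t_wb=19.0, t_evap=t_evap)
          print(f"Tevap = {t_evap:3.1f} C -> available capacity = {rated_capacity_kw * modifier:.2f} kW")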

  14. Simulation and Big Data Challenges in Tuning Building Energy Models

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan

    2013-01-01

    EnergyPlus is the flagship building energy simulation software used to model whole-building energy consumption for residential and commercial establishments. A typical input to the program often has hundreds, sometimes thousands, of parameters which are typically tweaked by a buildings expert to get it 'right'. This process can sometimes take months. Autotune is an ongoing research effort employing machine learning techniques to automate the tuning of the input parameters for an EnergyPlus input description of a building. Even with automation, the computational challenge faced to run the tuning simulation ensemble is daunting and requires the use of supercomputers to make it tractable in time. In this proposal, we describe the scope of the problem, the technical challenges faced and overcome, the machine learning techniques developed and employed, and the software infrastructure developed or in development when taking the EnergyPlus engine, which was primarily designed to run on desktops, and scaling it to run on shared-memory supercomputers (Nautilus) and distributed-memory supercomputers (Frost and Titan). The parametric simulations produce data on the order of tens to a couple of hundred terabytes. We describe the approaches employed to streamline and reduce bottlenecks in the workflow for this data, which is subsequently being made available for the tuning effort as well as made available publicly for open science.

  15. Energy analysis of wave and tidal power

    NASA Astrophysics Data System (ADS)

    Harrison, R.; Smith, K. G.; Varley, J. S.

    1980-06-01

    Energy requirements for building wave- and tidal-power systems are estimated and the relationship between energy requirements and extraction efficiency is examined for wavepower systems. It is found that a point of maximum net output is reached, beyond which further increases in extraction efficiency result in decreased net energy. In this manner, the energy analysis identifies a limit on the energy which could, in principle, be extracted by a wave-energy system. Finally, it is noted that although similar limits could be identified for other types of energy sources, the tidal power analysis is confined to a brief comparison of energy inputs and outputs.
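
    The existence of a maximum net output can be shown with a toy model in which gross output rises linearly with extraction efficiency while the energy required to build and maintain the device rises faster; past that point, extra efficiency costs more energy than it returns. The functional forms and constants below are invented purely to illustrate that point.

      # Toy net-energy curve with an interior maximum (illustrative numbers only).
      import numpy as np

      eta = np.linspace(0.05, 0.95, 19)        # extraction efficiency
      gross = 100.0 * eta                      # lifetime energy output, arbitrary units
      invest = 20.0 + 60.0 * eta ** 3          # energy required to build and maintain
      net = gross - invest

      best = int(np.argmax(net))
      print(f"net output peaks at eta = {eta[best]:.2f} "
            f"(net = {net[best]:.1f}, gross = {gross[best]:.1f})")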

  16. Simulated galaxy interactions as probes of merger spectral energy distributions

    SciTech Connect

    Lanz, Lauranne; Zezas, Andreas; Smith, Howard A.; Ashby, Matthew L. N.; Fazio, Giovanni G.; Hernquist, Lars; Hayward, Christopher C.; Brassington, Nicola

    2014-04-10

    We present the first systematic comparison of ultraviolet-millimeter spectral energy distributions (SEDs) of observed and simulated interacting galaxies. Our sample is drawn from the Spitzer Interacting Galaxy Survey and probes a range of galaxy interaction parameters. We use 31 galaxies in 14 systems which have been observed with Herschel, Spitzer, GALEX, and 2MASS. We create a suite of GADGET-3 hydrodynamic simulations of isolated and interacting galaxies with stellar masses comparable to those in our sample of interacting galaxies. Photometry for the simulated systems is then calculated with the SUNRISE radiative transfer code for comparison with the observed systems. For most of the observed systems, one or more of the simulated SEDs match reasonably well. The best matches recover the infrared luminosity and the star formation rate of the observed systems, and the more massive systems preferentially match SEDs from simulations of more massive galaxies. The most morphologically distorted systems in our sample are best matched to the simulated SEDs that are close to coalescence, while less evolved systems match well with the SEDs over a wide range of interaction stages, suggesting that an SED alone is insufficient for identifying the interaction stage except during the most active phases in strongly interacting systems. This result is supported by our finding that the SEDs calculated for simulated systems vary little over the interaction sequence.

  17. Aircraft vulnerability analysis by modeling and simulation

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. ManPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet, not all the launched missiles hit their targets; the miss being either attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  18. Image based SAR product simulation for analysis

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new product simulation method is described that also employs a real SAR image as input; this can be termed 'image-based simulation'. Different methods to perform this SAR prediction are presented, and their advantages and disadvantages are discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used to verify the concept: input images from ascending orbits were converted into images from a descending orbit, and the results were compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  19. Multimodel Simulation of Water Flow: Uncertainty Analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Simulations of soil water flow require measurements of soil hydraulic properties which are particularly difficult at the field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic pr...

  20. Co-Simulation of Building Energy and Control Systems with the Building Controls Virtual Test Bed

    SciTech Connect

    Wetter, Michael

    2010-08-22

    This article describes the implementation of the Building Controls Virtual Test Bed (BCVTB). The BCVTB is a software environment that allows connecting different simulation programs to exchange data during the time integration, and that allows conducting hardware in the loop simulation. The software architecture is a modular design based on Ptolemy II, a software environment for design and analysis of heterogeneous systems. Ptolemy II provides a graphical model building environment, synchronizes the exchanged data and visualizes the system evolution during run-time. The BCVTB provides additions to Ptolemy II that allow the run-time coupling of different simulation programs for data exchange, including EnergyPlus, MATLAB, Simulink and the Modelica modelling and simulation environment Dymola. The additions also allow executing system commands, such as a script that executes a Radiance simulation. In this article, the software architecture is presented and the mathematical model used to implement the co-simulation is discussed. The simulation program interface that the BCVTB provides is explained. The article concludes by presenting applications in which different state of the art simulation programs are linked for run-time data exchange. This link allows the use of the simulation program that is best suited for the particular problem to model building heat transfer, HVAC system dynamics and control algorithms, and to compute a solution to the coupled problem using co-simulation.
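
    The essence of the co-simulation loop is that each program advances one synchronization step at a time and the middleware passes the interface variables between them. The toy loop below couples a lumped zone model to a proportional controller purely to illustrate that loose-coupling pattern; it does not use the BCVTB or Ptolemy II APIs, and all coefficients are invented.

      # Toy loose-coupling loop: two "simulators" exchange data once per time step.
      def building_step(T_room, heating_power, dt=60.0):
          # lumped first-order zone model with arbitrary coefficients
          T_out, UA, C = 0.0, 150.0, 5.0e6
          return T_room + dt * (heating_power - UA * (T_room - T_out)) / C

      def controller_step(T_room, setpoint=21.0, gain=2000.0, max_power=10000.0):
          # proportional heating controller with saturation
          return min(max_power, max(0.0, gain * (setpoint - T_room)))

      T, power = 15.0, 0.0
      for step in range(121):                  # two simulated hours, 1-minute steps
          power = controller_step(T)           # controller reads the last zone state
          T = building_step(T, power)          # zone model reads the controller output
          if step % 30 == 0:
              print(f"t = {step:3d} min  T = {T:5.2f} C  P = {power:7.1f} W")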

  1. Self-affine analysis of protein energy

    NASA Astrophysics Data System (ADS)

    Figueirêdo, P. H.; Moret, M. A.; Pascutti, P. G.; Nogueira, E.; Coutinho, S.

    2010-07-01

    We study the time series of the total energy of polypeptides and proteins. These time series were generated by molecular dynamics methods and analyzed by applying detrended fluctuation analysis to estimate the long-range power-law correlation, i.e. to measure the scaling exponent α. Such exponents were calculated for all systems, and their values depend on environmental conditions: they are temperature dependent and, in a continuum-medium approach, also vary with the dielectric constant (we simulated ε=2 and ε=80). The procedure was applied to investigate polyalanines and other realistic models of proteins (Insect Defensin A and Hemoglobin). The present findings are consistent with previous results obtained by other methodologies.
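
    For readers unfamiliar with the technique, detrended fluctuation analysis integrates the (mean-removed) series, measures the root-mean-square residual around a local linear trend at several window sizes, and reads the exponent α from the log-log slope. The compact sketch below runs on synthetic correlated noise rather than molecular-dynamics energies, and the window sizes are arbitrary choices.

      # Compact detrended fluctuation analysis (DFA) on a synthetic series.
      import numpy as np

      def dfa_exponent(x, scales):
          y = np.cumsum(x - np.mean(x))                   # integrated profile
          F = []
          for s in scales:
              nseg = len(y) // s
              resid = []
              for i in range(nseg):
                  seg = y[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                  resid.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(resid)))
          return np.polyfit(np.log(scales), np.log(F), 1)[0]     # slope = alpha

      rng = np.random.default_rng(3)
      series = np.cumsum(rng.standard_normal(2 ** 14))    # Brownian-like signal, alpha ~ 1.5
      print(f"estimated DFA exponent alpha = {dfa_exponent(series, [16, 32, 64, 128, 256, 512]):.2f}")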

  2. HEAO-1 analysis of Low Energy Detectors (LED)

    NASA Technical Reports Server (NTRS)

    Nousek, John A.

    1992-01-01

    The activities at Penn State University are described. During the period Oct. 1990 to Dec. 1991 work on HEAO-1 analysis of the Low Energy Detectors (LED) concentrated on using the improved detector spectral simulation model and fitting diffuse x-ray background spectral data. Spectral fitting results, x-ray point sources, and diffuse x-ray sources are described.

  3. Analysis of Boundary Conditions for Crystal Defect Atomistic Simulations

    NASA Astrophysics Data System (ADS)

    Ehrlacher, V.; Ortner, C.; Shapeev, A. V.

    2016-06-01

    Numerical simulations of crystal defects are necessarily restricted to finite computational domains, supplying artificial boundary conditions that emulate the effect of embedding the defect in an effectively infinite crystalline environment. This work develops a rigorous framework within which the accuracy of different types of boundary conditions can be precisely assessed. We formulate the equilibration of crystal defects as variational problems in a discrete energy space and establish qualitatively sharp regularity estimates for minimisers. Using this foundation we then present rigorous error estimates for (i) a truncation method (Dirichlet boundary conditions), (ii) periodic boundary conditions, (iii) boundary conditions from linear elasticity, and (iv) boundary conditions from nonlinear elasticity. Numerical results confirm the sharpness of the analysis.

  4. Complexity analysis of simulations with analytic bond-order potentials

    NASA Astrophysics Data System (ADS)

    Teijeiro, Carlos; Hammerschmidt, Thomas; Seiser, Bernhard; Drautz, Ralf; Sutmann, Godehard

    2016-02-01

    The modeling of materials at the atomistic level with interatomic potentials requires a reliable description of different bonding situations and relevant system properties. For this purpose, analytic bond-order potentials (BOPs) provide a systematic and robust approximation to density functional theory (DFT) and tight binding (TB) calculations at reasonable computational cost. This paper presents a formal analysis of the computational complexity of analytic BOP simulations, based on a detailed assessment of the most computationally intensive parts. Different implementation algorithms are presented alongside with optimizations for efficient numerical processing. The theoretical complexity study is complemented by systematic benchmarks of the scalability of the algorithms with increasing system size and accuracy level of the BOP approximation. Both approaches demonstrate that the computation of atomic forces in analytic BOPs can be performed with a similar scaling as the computation of atomic energies.

  5. Protein thermostability calculations using alchemical free energy simulations.

    PubMed

    Seeliger, Daniel; de Groot, Bert L

    2010-05-19

    Thermal stability of proteins is crucial for both biotechnological and therapeutic applications. Rational protein engineering therefore frequently aims at increasing thermal stability by introducing stabilizing mutations. The accurate prediction of the thermodynamic consequences of mutations, however, is highly challenging, as thermal stability changes are caused by alterations in the free energy of folding. Growing computational power increasingly allows us to use alchemical free energy simulations, such as free energy perturbation or thermodynamic integration, to calculate free energy differences with relatively high accuracy. In this article, we present an automated protocol for setting up alchemical free energy calculations for mutations of naturally occurring amino acids (except for proline) that allows an unprecedented, automated screening of large mutant libraries. To validate the developed protocol, we calculated thermodynamic stability differences for 109 mutations in the microbial ribonuclease barnase. The quantitative agreement with experimental data illustrates the potential of the approach in protein engineering and design. PMID:20483340
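
    As an editorial illustration of the free energy perturbation idea mentioned above (not the authors' protocol), a minimal sketch of the Zwanzig exponential-averaging estimator follows; the harmonic toy system and sample size are assumptions chosen so that the exact answer (0.5 ln 2 in kT units) is known.

        import numpy as np

        def fep_delta_f(delta_u, kT=1.0):
            """Zwanzig exponential-averaging estimate of Delta F from energy differences
            delta_u = U_B - U_A sampled in state A (log-sum-exp for numerical stability)."""
            beta = 1.0 / kT
            m = np.max(-beta * delta_u)
            return -kT * (m + np.log(np.mean(np.exp(-beta * delta_u - m))))

        # Toy check: U_A = 0.5 x^2 and U_B = x^2 (1-D harmonic wells, kT = 1), for which
        # the exact free energy difference is 0.5 ln 2 ~ 0.347.
        rng = np.random.default_rng(2)
        x = rng.normal(0.0, 1.0, size=200_000)    # Boltzmann samples from state A
        delta_u = 0.5 * x ** 2                    # U_B - U_A evaluated on those samples
        print(fep_delta_f(delta_u))               # ~0.347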

  6. Nucleation Rate Analysis of Methane Hydrate from Molecular Dynamics Simulations

    DOE PAGESBeta

    Yuhara, Daisuke; Barnes, Brian C.; Suh, Donguk; Knott, Brandon C.; Beckham, Gregg T.; Yasuoka, Kenji; Wu, David T.; Amadeu K. Sum

    2015-01-06

    Clathrate hydrates are solid crystalline structures most commonly formed from solutions that have nucleated to form a mixed solid composed of water and gas. Understanding the mechanism of clathrate hydrate nucleation is essential to grasp the fundamental chemistry of these complex structures and their applications. Molecular dynamics (MD) simulation is an ideal method to study nucleation at the molecular level because the size of the critical nucleus and formation rate occur on the nano scale. Moreover, various analysis methods for nucleation have been developed through MD to analyze nucleation. In particular, the mean first-passage time (MFPT) and survival probability (SP) methods have proven to be effective in procuring the nucleation rate and critical nucleus size for monatomic systems. This study assesses the MFPT and SP methods, previously used for monatomic systems, when applied to analyzing clathrate hydrate nucleation. Because clathrate hydrate nucleation is relatively difficult to observe in MD simulations (due to its high free energy barrier), these methods have yet to be applied to clathrate hydrate systems. In this study, we have analyzed the nucleation rate and critical nucleus size of methane hydrate using MFPT and SP methods from data generated by MD simulations at 255 K and 50 MPa. MFPT was modified for clathrate hydrate from the original version by adding the maximum likelihood estimate and growth effect term. The nucleation rates were calculated by MFPT and SP methods and are within 5%; the critical nucleus size estimated by the MFPT method was 50% higher than values obtained through other more rigorous but computationally expensive estimates. These methods can also be extended to the analysis of other clathrate hydrates.

  7. Nucleation Rate Analysis of Methane Hydrate from Molecular Dynamics Simulations

    SciTech Connect

    Yuhara, Daisuke; Barnes, Brian C.; Suh, Donguk; Knott, Brandon C.; Beckham, Gregg T.; Yasuoka, Kenji; Wu, David T.; Amadeu K. Sum

    2015-01-06

    Clathrate hydrates are solid crystalline structures most commonly formed from solutions that have nucleated to form a mixed solid composed of water and gas. Understanding the mechanism of clathrate hydrate nucleation is essential to grasp the fundamental chemistry of these complex structures and their applications. Molecular dynamics (MD) simulation is an ideal method to study nucleation at the molecular level because the size of the critical nucleus and formation rate occur on the nano scale. Moreover, various analysis methods for nucleation have been developed through MD to analyze nucleation. In particular, the mean first-passage time (MFPT) and survival probability (SP) methods have proven to be effective in procuring the nucleation rate and critical nucleus size for monatomic systems. This study assesses the MFPT and SP methods, previously used for monatomic systems, when applied to analyzing clathrate hydrate nucleation. Because clathrate hydrate nucleation is relatively difficult to observe in MD simulations (due to its high free energy barrier), these methods have yet to be applied to clathrate hydrate systems. In this study, we have analyzed the nucleation rate and critical nucleus size of methane hydrate using MFPT and SP methods from data generated by MD simulations at 255 K and 50 MPa. MFPT was modified for clathrate hydrate from the original version by adding the maximum likelihood estimate and growth effect term. The nucleation rates were calculated by MFPT and SP methods and are within 5%; the critical nucleus size estimated by the MFPT method was 50% higher than values obtained through other more rigorous but computationally expensive estimates. These methods can also be extended to the analysis of other clathrate hydrates.
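
    As an editorial illustration of the MFPT fit mentioned above (a Wedekind-type sigmoidal form, not the authors' modified version), a minimal sketch follows; the synthetic data, fit parameters, and box volume are placeholder assumptions.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erf

        def mfpt_model(n, tau_j, n_star, c):
            """Sigmoidal mean-first-passage-time curve commonly fitted in nucleation
            studies: tau(n) = 0.5 * tau_J * (1 + erf(c * (n - n_star)))."""
            return 0.5 * tau_j * (1.0 + erf(c * (n - n_star)))

        # Synthetic MFPT data: largest-cluster size n versus mean first-passage time.
        n = np.arange(5, 200, dtype=float)
        rng = np.random.default_rng(3)
        tau_obs = mfpt_model(n, 80.0, 60.0, 0.05) + rng.normal(0.0, 1.0, n.size)

        (tau_j, n_star, c), _ = curve_fit(mfpt_model, n, tau_obs, p0=[50.0, 50.0, 0.1])
        volume = 1.0e-24                      # placeholder simulation-box volume, m^3
        rate = 1.0 / (tau_j * volume)         # J = 1 / (tau_J * V)
        print(f"critical nucleus ~ {n_star:.1f} molecules, rate ~ {rate:.2e}")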

  8. Experimental spectra analysis in THM with the help of simulation based on the Geant4 framework

    NASA Astrophysics Data System (ADS)

    Li, Cheng-Bo; Wen, Qun-Gang; Zhou, Shu-Hua; Jiang, Zong-Jun; Fu, Yuan-Yong; Zhou, Jing; Meng, Qiu-Ying; Wang, Xiao-Lian

    2015-05-01

    The Coulomb barrier and electron screening make it difficult to measure directly the cross sections of charged-particle nuclear reactions at astrophysical energies. The Trojan-horse method (THM) is a powerful indirect tool that has been introduced to overcome these difficulties. In order to understand the experimental spectra better, Geant4 is employed to simulate the method. The validity and reliability of the simulated data are examined by comparing them with the experimental data. The Geant4 simulation of the THM improves the data analysis and benefits the design of future related experiments. Supported by National Natural Science Foundation of China (11075218, 10575132) and Beijing Natural Science Foundation (1122017)

  9. Simulation of diurnal thermal energy storage systems: Preliminary results

    NASA Astrophysics Data System (ADS)

    Katipamula, S.; Somasundaram, S.; Williams, H. R.

    1994-12-01

    This report describes the results of a simulation of thermal energy storage (TES) integrated with a simple-cycle gas turbine cogeneration system. Integrating TES with cogeneration can serve the electrical and thermal loads independently while firing all fuel in the gas turbine. The detailed engineering and economic feasibility of diurnal TES systems integrated with cogeneration systems has been described in two previous PNL reports. The objective of this study was to lay the groundwork for optimization of the TES system designs using a simulation tool called TRNSYS (TRaNsient SYstem Simulation). TRNSYS is a transient simulation program with a sequential-modular structure developed at the Solar Energy Laboratory, University of Wisconsin-Madison. The two TES systems selected for the base-case simulations were: (1) a one-tank storage model to represent the oil/rock TES system; and (2) a two-tank storage model to represent the molten nitrate salt TES system. Results of the study clearly indicate that an engineering optimization of the TES system using TRNSYS is possible. The one-tank stratified oil/rock storage model described here is a good starting point for parametric studies of a TES system. Further developments to the TRNSYS library of available models (economizer, evaporator, gas turbine, etc.) are recommended so that phase-change processes are accurately treated.
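
    As an editorial illustration of what a one-tank stratified storage model amounts to (not the TRNSYS component itself), a minimal explicit-Euler energy balance follows; all parameter values are invented placeholders.

        import numpy as np

        def stratified_tank(n_layers=10, hours=6.0, dt=10.0, t_in=300.0, t_init=60.0,
                            mdot=2.0, cp=2000.0, layer_mass=500.0, ua_layer=5.0, t_amb=25.0):
            """Explicit-Euler energy balance for a one-tank stratified store: hot fluid
            enters the top layer and plug-flows downward, and every layer loses heat
            to ambient through ua_layer.  All values are illustrative (SI units, deg C)."""
            T = np.full(n_layers, t_init, dtype=float)
            for _ in range(int(hours * 3600.0 / dt)):
                upstream = np.concatenate(([t_in], T[:-1]))      # fluid arriving from above
                T = T + dt * (mdot * cp * (upstream - T)         # inter-layer advection
                              - ua_layer * (T - t_amb)) / (layer_mass * cp)
            return T                                             # index 0 = top layer

        print(np.round(stratified_tank(), 1))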

  10. Finecasting for renewable energy with large-eddy simulation

    NASA Astrophysics Data System (ADS)

    Jonker, Harmen; Verzijlbergh, Remco

    2016-04-01

    We present results of a single, continuous Large-Eddy Simulation of actual weather conditions during the timespan of a full year, made possible through recent computational developments (Schalkwijk et al., MWR, 2015). The simulation is coupled to a regional weather model in order to provide an LES dataset that is representative of the daily weather of the year 2012 around Cabauw, the Netherlands. This location is chosen such that LES results can be compared with both the regional weather model and observations from the Cabauw observational supersite. The run was made possible by porting our Large-Eddy Simulation program to run completely on the GPU (Schalkwijk et al., BAMS, 2012). GPU adaptation allows us to reach much improved time-to-solution ratios (i.e. simulation speedup versus real time). As a result, one can perform runs with a much longer timespan than previously feasible. The dataset resulting from the LES run provides many avenues for further study. First, it can provide a more statistical approach to boundary-layer turbulence than the more common case-studies by simulating a diverse but representative set of situations, as well as the transition between situations. This has advantages in designing and evaluating parameterizations. In addition, we discuss the opportunities of high-resolution forecasts for the renewable energy sector, e.g. wind and solar energy production.

  11. Symmetry energy impact in simulations of core-collapse supernovae

    NASA Astrophysics Data System (ADS)

    Fischer, Tobias; Hempel, Matthias; Sagert, Irina; Suwa, Yudai; Schaffner-Bielich, Jürgen

    2014-02-01

    We present a review of a broad selection of nuclear matter equations of state (EOSs) applicable in core-collapse supernova studies. The large variety of nuclear matter properties, such as the symmetry energy, which are covered by these EOSs leads to distinct outcomes in supernova simulations. Many of the currently used EOS models can be ruled out by nuclear experiments, nuclear many-body calculations, and observations of neutron stars. In particular, the two classical supernova EOSs describe neutron matter poorly. Nevertheless, we explore their impact in supernova simulations since they are commonly used in astrophysics. They serve as extremely soft and stiff representative nuclear models. The corresponding supernova simulations represent two extreme cases, e.g., with respect to the protoneutron star (PNS) compactness and shock evolution. Moreover, in multi-dimensional supernova simulations EOS differences have a strong effect on the explosion dynamics. Because of the extreme behaviors of the classical supernova EOSs we also include DD2, a relativistic mean field EOS with density-dependent couplings, which is in satisfactory agreement with many current nuclear and observational constraints. This is the first time that DD2 is applied to supernova simulations and compared with the classical supernova EOSs. We find that the overall behaviour of the latter EOS in supernova simulations lies in between the two extreme classical EOSs. As pointed out in previous studies, we confirm the impact of the symmetry energy on the electron fraction. Furthermore, we find that the symmetry energy becomes less important during the post-bounce evolution, where conversely the symmetric part of the EOS becomes increasingly dominating, which is related to the high temperatures obtained. Moreover, we study the possible impact of quark matter at high densities and light nuclear clusters at low and intermediate densities.

  12. National Geo-Database for Biofuel Simulations and Regional Analysis

    SciTech Connect

    Izaurralde, Roberto C.; Zhang, Xuesong; Sahajpal, Ritvik; Manowitz, David H.

    2012-04-01

    The goal of this project undertaken by GLBRC (Great Lakes Bioenergy Research Center) Area 4 (Sustainability) modelers is to develop a national capability to model feedstock supply, ethanol production, and biogeochemical impacts of cellulosic biofuels. The results of this project contribute to sustainability goals of the GLBRC; i.e. to contribute to developing a sustainable bioenergy economy: one that is profitable to farmers and refiners, acceptable to society, and environmentally sound. A sustainable bioenergy economy will also contribute, in a fundamental way, to meeting national objectives on energy security and climate mitigation. The specific objectives of this study are to: (1) develop a spatially explicit national geodatabase for conducting biofuel simulation studies; (2) model biomass productivity and associated environmental impacts of annual cellulosic feedstocks; (3) simulate production of perennial biomass feedstocks grown on marginal lands; and (4) locate possible sites for the establishment of cellulosic ethanol biorefineries. To address the first objective, we developed SENGBEM (Spatially Explicit National Geodatabase for Biofuel and Environmental Modeling), a 60-m resolution geodatabase of the conterminous USA containing data on: (1) climate, (2) soils, (3) topography, (4) hydrography, (5) land cover/ land use (LCLU), and (6) ancillary data (e.g., road networks, federal and state lands, national and state parks, etc.). A unique feature of SENGBEM is its 2008-2010 crop rotation data, a crucially important component for simulating productivity and biogeochemical cycles as well as land-use changes associated with biofuel cropping. We used the EPIC (Environmental Policy Integrated Climate) model to simulate biomass productivity and environmental impacts of annual and perennial cellulosic feedstocks across much of the USA on both croplands and marginal lands. We used data from LTER and eddy-covariance experiments within the study region to test the

  13. Scripted Building Energy Modeling and Analysis (Presentation)

    SciTech Connect

    Macumber, D.

    2012-10-01

    Building energy analysis is often time-intensive, error-prone, and non-reproducible. Entire energy analyses can be scripted end-to-end using the OpenStudio Ruby API. Common tasks within an analysis can be automated using OpenStudio Measures. Graphical user interfaces (GUIs) and component libraries reduce time, decrease errors, and improve repeatability in energy modeling.

  14. San Carlos Apache Tribe - Energy Organizational Analysis

    SciTech Connect

    Rapp, James; Albert, Steve

    2012-04-01

    The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late 2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: (1) the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); (2) start-up staffing and other costs associated with the Phase 1 SCAT energy organization; (3) an intern program; (4) staff training; and (5) tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.

  15. Scenario simulation based assessment of subsurface energy storage

    NASA Astrophysics Data System (ADS)

    Beyer, C.; Bauer, S.; Dahmke, A.

    2014-12-01

    Energy production from renewable sources such as solar or wind power is characterized by temporally varying power supply. The politically intended transition towards renewable energies in Germany („Energiewende") hence requires the installation of energy storage technologies to compensate for the fluctuating production. In this context, subsurface energy storage represents a viable option due to large potential storage capacities and the wide prevalence of suitable geological formations. Technologies for subsurface energy storage comprise cavern or deep porous media storage of synthetic hydrogen or methane from electrolysis and methanization, or compressed air, as well as heat storage in shallow or moderately deep porous formations. Pressure build-up, fluid displacement or temperature changes induced by such operations may affect local and regional groundwater flow, geomechanical behavior, groundwater geochemistry and microbiology. Moreover, subsurface energy storage may interact and possibly be in conflict with other "uses" like drinking water abstraction or ecological goods and functions. Utilization of the subsurface for energy storage therefore requires an adequate system and process understanding for the evaluation and assessment of possible impacts of specific storage operations on other types of subsurface use, the affected environment and protected entities. This contribution presents the framework of the ANGUS+ project, in which tools and methods are developed for these types of assessments. Synthetic but still realistic scenarios of geological energy storage are derived and parameterized for representative North German storage sites by data acquisition and evaluation, and experimental work. Coupled numerical hydraulic, thermal, mechanical and reactive transport (THMC) simulation tools are developed and applied to simulate the energy storage and subsurface usage scenarios, which are analyzed for an assessment and generalization of the imposed THMC

  16. Collection and analysis of training simulator data

    SciTech Connect

    Krois, P.A.; Haas, P.M.

    1985-01-01

    The purposes of this paper are: (1) to review the objectives, approach, and results of a series of research experiments performed on nuclear power plant training simulators in support of regulatory and research programs of the US Nuclear Regulatory Commission (NRC), and (2) to identify general research issues that may lead to an improved research methodology using the training simulator as a field setting. Research products consist of a refined field research methodology, a data store on operator performance, and specific results pertinent to NRC regulatory positions. Issues and potential advances in operator performance measurement are discussed.

  17. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property databases have been acquired. Recently we have initiated a project to integrate these codes and databases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials database including high-rate-of-strain data.

  18. Merging Energy Policy Decision Support, Education, and Communication: The 'World Energy' Simulation Role-Playing Game

    NASA Astrophysics Data System (ADS)

    Rooney-varga, J. N.; Franck, T.; Jones, A.; Sterman, J.; Sawin, E.

    2013-12-01

    To meet international goals for climate change mitigation and adaptation, as well as energy access and equity, there is an urgent need to explore and define energy policy paths forward. Despite this need, students, citizens, and decision-makers often hold deeply flawed mental models of the energy and climate systems. Here we describe a simulation role-playing game, World Energy, that provides an immersive learning experience in which participants can create their own path forward for global energy policy and learn about the impact of their policy choices on carbon dioxide emissions, temperature rise, energy supply mix, energy prices, and energy demand. The game puts players in the decision-making roles of advisors to the United Nations Sustainable Energy for All Initiative (drawn from international leaders from industry, governments, intergovernmental organizations, and citizens groups) and, using a state-of-the-art decision-support simulator, asks them to negotiate a plan for global energy policy. We use the En-ROADS (Energy Rapid Overview and Decision Support) simulator, which runs on a laptop computer in <0.1 sec. En-ROADS enables users to specify many factors, including R&D-driven cost reductions in fossil fuel-based, renewable, or carbon-neutral energy technologies; taxes and subsidies for different energy sources; performance standards and energy efficiency; emissions prices; policies to address other greenhouse gas emissions (e.g., methane, nitrous oxide, chlorofluorocarbons, etc.); and assumptions about GDP and population. In World Energy, participants must balance climate change mitigation goals with equity, prices and access to energy, and the political feasibility of policies. Initial results indicate participants gain insights into the dynamics of the energy and climate systems and greater understanding of the potential impacts of policies.

  19. Energy utilization and efficiency analysis for hydrogen fuel cell vehicles

    NASA Astrophysics Data System (ADS)

    Moore, R. M.; Hauer, K. H.; Ramaswamy, S.; Cunningham, J. M.

    This paper presents the results of an energy analysis for load-following versus battery-hybrid direct-hydrogen fuel cell vehicles. The analysis utilizes dynamic fuel cell vehicle simulation tools previously presented [R.M. Moore, K.H. Hauer, J. Cunningham, S. Ramaswamy, A dynamic simulation tool for the battery-hybrid hydrogen fuel cell vehicle, Fuel Cells, submitted for publication; R.M. Moore, K.H. Hauer, D.J. Friedman, J.M. Cunningham, P. Badrinarayanan, S.X. Ramaswamy, A. Eggert, A dynamic simulation tool for hydrogen fuel cell vehicles, J. Power Sources, 141 (2005) 272-285], and evaluates energy utilization and efficiency for standardized drive cycles used in the US, Europe and Japan.

  20. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer aided design. Typical input and output are illustrated for a sample problem to…

  1. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  2. Modeling and simulation of a transcutaneous energy transmission system used in artificial organ implants.

    PubMed

    Fang, Wan; Liu, Wei; Qian, Jie; Tang, Houjun; Ye, Pengsheng

    2009-12-01

    We present a mathematical model to simulate transcutaneous energy transmission systems. Treating such systems as resonant power electronic converters, we develop the equivalent circuit equations, expand the circuit variables as Fourier series, and apply a multi-frequency averaging method. Keeping terms up to first order, the analysis produces a dynamic and harmonic model describing these energy transmission systems. With appropriate values for the circuit parameters, numerical results are compared with those of the exact time domain model. This comparison verifies that our model can adequately represent such energy-transmission systems to first order. PMID:19958349
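
    As an editorial illustration (not the authors' averaged model), a minimal time-domain sketch of the kind of series-resonant equivalent circuit such averaging starts from; the component values and drive voltage are placeholder assumptions.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Series-resonant equivalent circuit driven by a square-wave inverter voltage.
        L, C, R = 20e-6, 100e-9, 5.0                    # placeholder component values
        f_sw = 1.0 / (2.0 * np.pi * np.sqrt(L * C))     # switch near the resonant frequency

        def rhs(t, y):
            i, v_c = y                                  # coil current, capacitor voltage
            v_in = 12.0 * np.sign(np.sin(2.0 * np.pi * f_sw * t))
            return [(v_in - v_c - R * i) / L, i / C]

        sol = solve_ivp(rhs, (0.0, 200e-6), [0.0, 0.0], max_step=1e-7)
        print("peak coil current [A]:", float(np.abs(sol.y[0]).max()))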

  3. In situ and in-transit analysis of cosmological simulations

    NASA Astrophysics Data System (ADS)

    Friesen, Brian; Almgren, Ann; Lukić, Zarija; Weber, Gunther; Morozov, Dmitriy; Beckner, Vincent; Day, Marcus

    2016-08-01

    Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent 'offline' analysis. We demonstrate this approach in the compressible gasdynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own ('in situ'). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other 'sidecar' group, which post-processes it while the simulation continues ('in-transit'). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.
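
    As an editorial illustration of the in-transit pattern described above (disjoint MPI groups with asynchronous hand-off), a minimal mpi4py sketch follows; the group sizes, message contents, and the toy "analysis" are placeholders, not the Nyx/BoxLib implementation.

        # Run with e.g.:  mpirun -n 4 python in_transit_sketch.py
        from mpi4py import MPI
        import numpy as np

        world = MPI.COMM_WORLD
        rank, size = world.Get_rank(), world.Get_size()

        # Partition ranks into a large simulation group and a small 'sidecar' analysis group.
        n_sidecar = max(1, size // 4)
        is_sim = rank < size - n_sidecar
        group_comm = world.Split(1 if is_sim else 0, rank)   # intra-group communicator

        if is_sim:
            for step in range(5):
                field = np.random.rand(1000) + step           # stand-in for simulation output
                dest = size - n_sidecar + rank % n_sidecar    # my assigned sidecar rank
                world.send((step, field), dest=dest, tag=step)
                # ... the simulation would keep advancing while the sidecar works ...
        else:
            j = rank - (size - n_sidecar)
            senders = [k for k in range(size - n_sidecar) if k % n_sidecar == j]
            for step in range(5):
                for src in senders:
                    step_id, field = world.recv(source=src, tag=step)
                    # stand-in for halo finding / PDF and power-spectrum post-processing
                    print(f"sidecar rank {rank}: step {step_id} from {src}, mean={field.mean():.3f}")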

  4. Simulation of a passive solar energy system. Master's thesis

    SciTech Connect

    Slate, M.P.

    1982-12-01

    A simple lumped capacitance-resistance model is used to simulate heat flow in a residential size structure heated passively by the sun. The model takes the form of an analogous electrical circuit. A computer program was written to analyse the circuit. By altering the input parameters of the program, the thermal performance of a wide variety of passive solar designs can be investigated for any geographical location. By comparing program-generated data to data taken from experimental test cells in Los Alamos, New Mexico, it was found that the simulation program predicted energy use to within 4 percent of measured values. Also, the computer program predicted temperature swings to within 16 percent of measured swings. Correlation with empirical methods of calculating monthly and annual savings in fuel use for heating was poor. Using the simulation calculations as a base, the predictions of annual savings differed by as much as 76 percent.
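
    As an editorial illustration of what a lumped capacitance-resistance model looks like in code (not the thesis program itself), a minimal single-node sketch follows; R, C, weather, and solar gain profiles are invented placeholders.

        import numpy as np

        def passive_zone(hours=72, dt=60.0, R=0.005, C=2.0e7,
                         t_out_mean=5.0, t_out_amp=8.0, q_solar_peak=2000.0):
            """Lumped capacitance-resistance zone model, C dT/dt = (T_out - T)/R + Q_solar,
            integrated with explicit Euler.  R [K/W] and C [J/K] are placeholder values."""
            T, temps = 18.0, []
            for k in range(int(hours * 3600.0 / dt)):
                hour = (k * dt / 3600.0) % 24.0
                t_out = t_out_mean + t_out_amp * np.sin(2.0 * np.pi * (hour - 9.0) / 24.0)
                q_sol = max(0.0, q_solar_peak * np.sin(np.pi * (hour - 6.0) / 12.0))  # 06-18 h
                T += dt * ((t_out - T) / R + q_sol) / C
                temps.append(T)
            return np.array(temps)

        temps = passive_zone()
        print(f"indoor swing: {temps.min():.1f} to {temps.max():.1f} degC")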

  5. Simulating and validating coastal gradients in wind energy resources

    NASA Astrophysics Data System (ADS)

    Hahmann, Andrea; Floors, Rogier; Karagali, Ioanna; Vasiljevic, Nikola; Lea, Guillaume; Simon, Elliot; Courtney, Michael; Badger, Merete; Peña, Alfredo; Hasager, Charlotte

    2016-04-01

    The experimental campaign of the RUNE (Reducing Uncertainty of Near-shore wind resource Estimates) project took place on the western coast of Denmark during the winter 2015-2016. The campaign used onshore scanning lidar technology combined with ocean and satellite information and produced a unique dataset to study the transition in boundary layer dynamics across the coastal zone. The RUNE project aims at reducing the uncertainty of near-shore wind resource estimates produced by mesoscale modeling. With this in mind, simulations using the Weather Research and Forecasting (WRF) model were performed to identify the sensitivity in the coastal gradients of wind energy resources to various model parameters and model inputs. Among these are the model horizontal grid spacing and the planetary boundary-layer and surface-layer schemes. We report on the differences amongst these simulations and preliminary results from the comparison of the model simulations with the RUNE lidar and satellite measurements and a near-coastal tall mast.

  6. Structural, Physical, and Compositional Analysis of Lunar Simulants and Regolith

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul; Street, Kenneth W.; Gaier, James

    2008-01-01

    Relative to the prior manned Apollo and unmanned robotic missions, planned Lunar initiatives are comparatively complex and longer in duration. Individual crew rotations are envisioned to span several months, and various surface systems must function in the Lunar environment for periods of years. As a consequence, an increased understanding of the surface environment is required to engineer and test the associated materials, components, and systems necessary to sustain human habitation and surface operations. The effort described here concerns the analysis of existing simulant materials, with application to Lunar return samples. The interplay between these analyses fulfills the objective of ascertaining the critical properties of regolith itself, and the parallel objective of developing suitable simulant materials for a variety of engineering applications. Presented here are measurements of the basic physical attributes, i.e. particle size distributions and general shape factors. Also discussed are structural and chemical properties, as determined through a variety of techniques, such as optical microscopy, SEM and TEM microscopy, Mössbauer spectroscopy, X-ray diffraction, Raman microspectroscopy, inductively coupled argon plasma emission spectroscopy and energy dispersive X-ray fluorescence mapping. A comparative description of currently available simulant materials is discussed, with implications for more detailed analyses, as well as the requirements for continued refinement of methods for simulant production.

  7. Energy flux simulation in heterogeneous cropland - a two year study

    NASA Astrophysics Data System (ADS)

    Klein, Christian; Thieme, Christoph; Biernath, Christian; Heinlein, Florian; Priesack, Eckart

    2016-04-01

    Recent studies show that uncertainties in regional and global climate and weather simulations are partly due to inadequate descriptions of the energy flux exchanges between the land surface and the atmosphere [Stainforth et al. 2005]. One major shortcoming is the limitation of the grid-cell resolution, which is recommended to be about at least 3x3 km² in most models due to limitations in the model physics. To represent each individual grid cell most models select one dominant soil type and one dominant land use type. This resolution, however, is often too coarse in regions where the spatial heterogeneity of soil and land use types is high, e.g. in Central Europe. The relevance of vegetation (e.g. crops), ground cover, and soil properties to the moisture and energy exchanges between the land surface and the atmosphere is well known [McPherson 2007], but the impact of vegetation growth dynamics on energy fluxes is only partly understood [Gayler et al. 2014]. An elegant method to avoid the shortcoming of grid cell resolution is the so called mosaic approach. This approach is part of the recently developed ecosystem model framework Expert-N [Biernath et al. 2013]. The aim of this study was to analyze the impact of the characteristics of five managed field plots, planted with winter wheat, potato and maize, on the near-surface soil moisture and on the near-surface energy flux exchanges of the soil-plant-atmosphere interface. The simulated energy fluxes were compared with eddy flux tower measurements between the respective fields at the research farm Scheyern, North-West of Munich, Germany. To perform these simulations, we coupled the ecosystem model Expert-N to an analytical footprint model [Mauder & Foken 2011]. The coupled model system has the ability to calculate the mixing ratio of the surface energy fluxes at a given point within one grid cell (in this case at the flux tower between the two fields). The approach accounts for the temporally and spatially
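
    As a purely editorial illustration of footprint-weighted mixing of per-plot fluxes into a tower-equivalent signal (all weights and flux values are invented placeholders, not Scheyern data):

        import numpy as np

        # Each field plot contributes to the tower signal according to its footprint weight.
        weights  = np.array([0.45, 0.30, 0.15, 0.07, 0.03])        # footprint fractions (sum to 1)
        sensible = np.array([120.0, 95.0, 140.0, 60.0, 80.0])      # simulated H per plot, W m-2
        latent   = np.array([210.0, 260.0, 180.0, 300.0, 240.0])   # simulated LE per plot, W m-2

        print("tower-equivalent H :", np.dot(weights, sensible), "W m-2")
        print("tower-equivalent LE:", np.dot(weights, latent), "W m-2")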

  8. Analysis of DOE's Roof Savings Calculator with Comparison to Other Simulation Engines

    SciTech Connect

    New, Joshua Ryan; Huang, Yu; Levinson, Ronnen; Mellot, Joe; Sanyal, Jibonananda; Childs, Kenneth W

    2014-01-01

    A web-based Roof Savings Calculator (RSC) has been deployed for the Department of Energy as an industry-consensus tool to help building owners, manufacturers, distributors, contractors and researchers easily run complex roof and attic simulations. This tool employs the latest web technologies and usability design to provide an easy input interface to an annual simulation of hour-by-hour, whole-building performance using the world-class simulation tools DOE-2.1E and AtticSim. Building defaults were assigned based on national averages and can provide estimated annual energy and cost savings after the user selects nothing more than building location. In addition to cool reflective roofs, the RSC tool can simulate multiple roof and attic configurations including different roof slopes, above sheathing ventilation, radiant barriers, low-emittance surfaces, HVAC duct location, duct leakage rates, multiple layers of building materials, ceiling and deck insulation levels, and other parameters. A base case and energy-efficient alternative can be compared side-by-side to generate an energy/cost savings estimate between two buildings. The RSC tool was benchmarked against field data for demonstration homes in Ft. Irwin, CA. However, RSC gives different energy savings estimates than previous cool roof simulation tools, so more thorough software and empirical validation proved necessary. This report consolidates much of the preliminary analysis for comparison of RSC's projected energy savings to that from other simulation engines.

  9. EnergyPlus Run Time Analysis

    SciTech Connect

    Hong, Tianzhen; Buhl, Fred; Haves, Philip

    2008-09-20

    EnergyPlus is a new generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations integrating building components in sub-hourly time steps. However, EnergyPlus runs much slower than the current generation simulation programs. This has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify key issues and challenges of speeding up EnergyPlus: studying the historical trends of EnergyPlus run time based on the advancement of computers and code improvements to EnergyPlus, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most amount of run time. This paper provides recommendations to improve EnergyPlus run time from the modeler's perspective and adequate computing platforms. Suggestions of software code and architecture changes to improve EnergyPlus run time based on the code profiling results are also discussed.

  10. Comparative visual analysis of 3D urban wind simulations

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Salim, Mohamed; Grawe, David; Leitl, Bernd; Böttinger, Michael; Schlünzen, Heinke

    2016-04-01

    Climate simulations are conducted in large quantity for a variety of different applications. Many of these simulations focus on global developments and study the Earth's climate system using a coupled atmosphere ocean model. Other simulations are performed on much smaller regional scales, to study very small fine grained climatic effects. These microscale climate simulations pose similar, yet also different, challenges for the visualization and the analysis of the simulation data. Modern interactive visualization and data analysis techniques are very powerful tools to assist the researcher in answering and communicating complex research questions. This presentation discusses comparative visualization for several different wind simulations, which were created using the microscale climate model MITRAS. The simulations differ in wind direction and speed, but are all centered on the same simulation domain: An area of Hamburg-Wilhelmsburg that hosted the IGA/IBA exhibition in 2013. The experiments contain a scenario case to analyze the effects of single buildings, as well as examine the impact of the Coriolis force within the simulation. The scenario case is additionally compared with real measurements from a wind tunnel experiment to ascertain the accuracy of the simulation and the model itself. We also compare different approaches for tree modeling and evaluate the stability of the model. In this presentation, we describe not only our workflow to efficiently and effectively visualize microscale climate simulation data using common 3D visualization and data analysis techniques, but also discuss how to compare variations of a simulation and how to highlight the subtle differences in between them. For the visualizations we use a range of different 3D tools that feature techniques for statistical data analysis, data selection, as well as linking and brushing.

  11. Energy levels scheme simulation of divalent cobalt doped bismuth germanate

    SciTech Connect

    Andreici, Emiliana-Laura; Petkova, Petya; Avram, Nicolae M.

    2015-12-07

    The aim of this paper is to simulate the energy level scheme of Bismuth Germanate (BGO) doped with divalent cobalt, in order to give a reliable explanation of the spectral experimental data. Within semiempirical crystal field theory, we first modeled the Crystal Field Parameters (CFPs) of the BGO:Co^2+ system in the frame of the Exchange Charge Model (ECM), using the actual site symmetry of the impurity ions after doping. The values of the CFPs depend on the geometry of the doped host matrix and on the parameter G of the ECM. We first optimized the geometry of the undoped BGO host matrix and afterwards that of BGO doped with divalent cobalt. The charge effects of the ligands and the covalent bonding between the cobalt cations and oxygen anions were also taken into account in the cluster approach. With the obtained values of the CFPs, we simulated the energy level scheme of the cobalt ions by diagonalizing the Hamiltonian matrix of the doped crystal. The energy levels and estimated Racah parameters B and C were compared with the experimental spectroscopic data and discussed. The comparison shows quite satisfactory agreement, which justifies the model and the simulation scheme used for the title system.
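
    As an editorial illustration of the final step described above (diagonalizing a crystal-field Hamiltonian to obtain a level scheme), a minimal sketch follows; the matrix is a random placeholder, not the actual BGO:Co^2+ Hamiltonian or its CFPs.

        import numpy as np

        # With the crystal-field Hamiltonian assembled in a free-ion basis, the level
        # scheme follows from one Hermitian diagonalization.  Placeholder matrices only.
        rng = np.random.default_rng(4)
        dim = 10                                                   # placeholder basis size
        h_free = np.diag(np.sort(rng.uniform(0.0, 20000.0, dim)))  # free-ion terms, cm^-1
        v = rng.normal(0.0, 300.0, (dim, dim))
        h_cf = 0.5 * (v + v.T)                                     # symmetrized crystal-field part

        levels, vectors = np.linalg.eigh(h_free + h_cf)
        print("levels relative to the ground state (cm^-1):", np.round(levels - levels[0], 1))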

  12. Energy conserving continuum algorithms for kinetic & gyrokinetic simulations of plasmas

    NASA Astrophysics Data System (ADS)

    Hakim, A.; Hammett, G. W.; Shi, E.; Stoltzfus-Dueck, T.

    2015-11-01

    We present high-order, energy conserving, continuum algorithms for the solution of gyrokinetic equations for use in edge turbulence simulations. The distribution function is evolved with a discontinuous Galerkin scheme, while the fields are evolved with a continuous finite-element method. These algorithms work for a general, possibly non-canonical, Poisson bracket operator and conserve energy exactly. Benchmark simulations with ETG turbulence in 3X/2V are shown, as well as initial applications of the algorithms to turbulence in a simplified SOL geometry. Sheath boundary conditions with recycling and secondary electron emission are implemented, and a Lenard-Bernstein collision operator is included. The extension of these algorithms to the full Vlasov-Maxwell equations is also presented. It is shown that with a particular choice of numerical fluxes the total (particle+field) energy is conserved. Algorithms are implemented in a flexible and open-source framework, Gkeyll, which also includes fluid models, allowing potential hybrid simulations of various plasma problems. Supported by the Max-Planck/Princeton Center for Plasma Physics, and DOE Contract DE-AC02-09CH11466.

  13. EnergyPlus Weather Data for use with EnergyPlus Simulation Software

    DOE Data Explorer

    EnergyPlus is simulation software from DOE's Office of Energy Efficiency and Renewable Energy (EE) that models heating, cooling, lighting, ventilating, and other energy flows as well as water in buildings. Because the environment surrounding any building is an important component of the energy choices that go into the building's design and the energy performance of that building thereafter, weather data from all parts of the world are made available through the EnergyPlus web site. The data are collected from more than 2100 locations — 1042 locations in the USA, 71 locations in Canada, and more than 1000 locations in 100 other countries throughout the world. The weather data are arranged by World Meteorological Organization region and Country. In addition to using the weather data via the utility installed automatically with EnergyPlus software, users may view and download EnergyPlus weather data directly using a weather data layer for Google Earth.

  14. Geometric Modeling, Radiation Simulation, Rendering, Analysis Package

    Energy Science and Technology Software Center (ESTSC)

    1995-01-17

    RADIANCE is intended to aid lighting designers and architects by predicting the light levels and appearance of a space prior to construction. The package includes programs for modeling and translating scene geometry, luminaire data and material properties, all of which are needed as input to the simulation. The lighting simulation itself uses ray tracing techniques to compute radiance values (i.e., the quantity of light passing through a specific point in a specific direction), which are typically arranged to form a photographic quality image. The resulting image may be analyzed, displayed and manipulated within the package, and converted to other popular image file formats for export to other packages, facilitating the production of hard copy output.

  15. Real-Time Building Energy Simulation Using EnergyPlus and the Building Controls Test Bed

    SciTech Connect

    Pang, Xiufeng; Bhattachayra, Prajesh; O'Neill, Zheng; Haves, Philip; Wetter, Michael; Bailey, Trevor

    2011-11-01

    Most commercial buildings do not perform as well in practice as intended by the design, and their performance often deteriorates over time. Reasons include faulty construction, malfunctioning equipment, incorrectly configured control systems and inappropriate operating procedures (Haves et al., 2001, Lee et al., 2007). To address this problem, the paper presents a simulation-based whole building performance monitoring tool that allows a comparison of a building's actual and expected performance in real time. The tool continuously acquires relevant building model input variables from the existing Energy Management and Control System (EMCS). It then reports expected energy consumption as simulated by EnergyPlus. The Building Control Virtual Test Bed (BCVTB) is used as the software platform to provide data linkage between the EMCS, an EnergyPlus model, and a database. This paper describes the integrated real-time simulation environment. A proof-of-concept demonstration is also presented in the paper.

  16. Status of the MIND simulation and analysis

    SciTech Connect

    Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.; Soler, F. J. P.; Lindroos, L.

    2010-03-30

    A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. Here described is the status of a modular software framework being developed to accommodate such a study. The results of initial studies of the reconstruction software and expected efficiency curves in the context of the golden channel are given.

  17. Simulation chain for acoustic ultra-high energy neutrino detectors

    NASA Astrophysics Data System (ADS)

    Neff, M.; Anton, G.; Enzenhöfer, A.; Graf, K.; Hößl, J.; Katz, U.; Lahmann, R.

    2013-10-01

    Acoustic neutrino detection is a promising approach for large-scale ultra-high energy neutrino detectors in water. In this paper, a Monte Carlo simulation chain for acoustic neutrino detection devices in water is presented. It is designed within the SeaTray/IceTray software framework. Its modular architecture is highly flexible and makes it easy to adapt to different environmental conditions, detector geometries, and hardware. The simulation chain covers the generation of the acoustic pulse produced by a neutrino interaction and the propagation to the sensors within the detector. In this phase of the development, ambient and transient noise models for the Mediterranean Sea and simulations of the data acquisition hardware, similar to the one used in ANTARES/AMADEUS, are implemented. A pre-selection scheme for neutrino-like signals based on matched filtering is employed, as it can be used for on-line filtering. To simulate the whole processing chain for experimental data, signal classification and acoustic source reconstruction algorithms are integrated. In this contribution, an overview of the design and capabilities of the simulation chain will be given, and some applications and preliminary studies will be presented.
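
    As an editorial illustration of the matched-filter pre-selection mentioned above (not the ANTARES/AMADEUS code), a minimal sketch follows; the pulse shape, sampling rate, noise level, and threshold are invented placeholders.

        import numpy as np

        def matched_filter_trigger(data, template, threshold):
            """Correlate the data stream with a unit-norm template and return the
            filter output plus the sample indices exceeding the threshold."""
            t = template - template.mean()
            t = t / np.linalg.norm(t)
            out = np.correlate(data, t, mode="same")
            return out, np.flatnonzero(out > threshold)

        # Toy bipolar pressure pulse buried in white noise (shapes and levels invented).
        fs = 200_000.0                                   # assumed sampling rate, Hz
        tt = np.arange(-2e-4, 2e-4, 1.0 / fs)
        pulse = -np.gradient(np.exp(-(tt / 5e-5) ** 2))  # bipolar, thermoacoustic-like
        pulse /= np.abs(pulse).max()

        rng = np.random.default_rng(5)
        data = rng.normal(0.0, 1.0, 20_000)
        data[10_000:10_000 + pulse.size] += 4.0 * pulse  # inject a weak signal

        out, hits = matched_filter_trigger(data, pulse, threshold=8.0)
        print("trigger indices near the injected pulse:", hits)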

  18. Model and particle-in-cell simulation of ion energy distribution in collisionless sheath

    SciTech Connect

    Zhou, Zhuwen; Kong, Bo; Luo, Yuee; Chen, Deliang; Wang, Yuansheng

    2015-06-15

    In this paper, we propose a self-consistent theoretical model that describes the ion energy distributions (IEDs) in collisionless sheaths. Analytical results for different combined dc/radio-frequency (rf) capacitively coupled plasma discharge cases, including an analysis of sheath voltage errors, are compared with the results of one-dimensional plane-parallel particle-in-cell (PIC) simulations. The IEDs in collisionless sheaths are obtained for discharges driven by combined dc/rf voltage sources on the electrodes, using argon as the process gas. The ions incident on the grounded electrode are separated according to the different radio frequencies and dc voltages applied to a separate electrode, and the IEDs, the IED widths, and the plasma sheath thickness are discussed. The IEDs, the IED widths, and the sheath voltages from the theoretical model are investigated and show good agreement with the PIC simulations.

  19. HERMES: Simulating the propagation of ultra-high energy cosmic rays

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio

    2013-08-01

    The study of ultra-high energy cosmic rays (UHECR) at Earth cannot be separated from the study of their propagation in the Universe. In this paper, we present HERMES, the ad hoc Monte Carlo code we have developed for the realistic simulation of UHECR propagation. We discuss the modeling adopted to simulate the cosmology, the magnetic fields, the interactions with relic photons and the production of secondary particles. In order to show the potential applications of HERMES for astroparticle studies, we provide an estimation of the surviving probability of UHE protons, the GZK horizons of nuclei and the all-particle spectrum observed at Earth in different astrophysical scenarios. Finally, we show the expected arrival direction distribution of UHECR produced from nearby candidate sources. A stable version of HERMES will be released in the near future for public use, together with libraries of already propagated nuclei, to allow the community to perform mass composition and energy spectrum analyses with our simulator.

  20. SYSTEMATIC SENSITIVITY ANALYSIS OF AIR QUALITY SIMULATION MODELS

    EPA Science Inventory

    This report reviews and assesses systematic sensitivity and uncertainty analysis methods for applications to air quality simulation models. The discussion of the candidate methods presents their basic variables, mathematical foundations, user motivations and preferences, computer...

  1. Consistency analysis on laser signal in laser guided weapon simulation

    NASA Astrophysics Data System (ADS)

    Yin, Ruiguang; Zhang, Wenpan; Guo, Hao; Gan, Lin

    2015-10-01

    Hardware-in-the-loop simulation is widely used in laser semi-active guidance weapon experiments, and the fidelity of the laser guidance signal is the key reliability issue. To evaluate the consistency of the laser guidance signal, this paper analyzes the angle of sight, laser energy density, laser spot size, atmospheric backscattering, solar radiation and signal-to-noise ratio by comparing the working states of the actual system and the hardware-in-the-loop simulation. Based on measured data, mathematical simulation and optical simulation results, the effects of the laser guidance signal on the laser seeker are determined. Using the Monte Carlo method, the laser-guided weapon trajectory and impact-point distribution are obtained, and the influence of systematic errors is analyzed. It is concluded that the differences between the simulation system and the actual system have little influence on normal guidance but a large effect on laser jamming. The research supports the design and evaluation of laser-guided weapon simulation.

  2. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
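
    As an editorial illustration of the kind of degradation sensitivities discussed above (temperature, cycling, depth of discharge), a deliberately simplified capacity-fade sketch follows; the functional form and every coefficient are invented placeholders and are not NREL's BLAST degradation model.

        import numpy as np

        def capacity_fade(years, avg_temp_c, cycles_per_day, dod):
            """Illustrative fade model: a calendar term growing with sqrt(time) and
            accelerating with temperature (Arrhenius-like), plus a cycling term
            proportional to throughput and depth of discharge.  Placeholder coefficients."""
            t_k = avg_temp_c + 273.15
            calendar = 0.02 * np.exp(-3500.0 * (1.0 / t_k - 1.0 / 298.15)) * np.sqrt(years)
            cycling = 1.5e-5 * cycles_per_day * 365.0 * years * dod
            return 1.0 - (calendar + cycling)        # remaining capacity fraction

        for city_temp in (12.0, 20.0, 30.0):         # cool, mild, hot climate averages, degC
            print(city_temp, "degC ->", round(capacity_fade(8.0, city_temp, 1.0, 0.6), 3))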

  3. Radiation and ionization energy loss simulation for the GDH sum rule experiment in Hall-A at Jefferson Lab

    NASA Astrophysics Data System (ADS)

    Yan, Xin-Hu; Ye, Yun-Xiu; Chen, Jian-Ping; Lu, Hai-Jiang; Zhu, Peng-Jia; Jiang, Feng-Jian

    2015-07-01

    The radiation and ionization energy losses are presented for the single-arm Monte Carlo simulation of the GDH sum rule experiment in Hall-A at Jefferson Lab. Radiation and ionization energy losses are discussed for the 12C elastic scattering simulation. The relative momentum ratio Δp/p and the 12C elastic cross section are compared with and without radiative energy loss, and a reasonable shape is obtained from the simulation. The total energy loss distribution is obtained, showing a Landau shape for 12C elastic scattering. This simulation work gives good support for the radiative correction analysis of the GDH sum rule experiment. Supported by National Natural Science Foundation of China (11135002, 11275083), US Department of Energy contract DE-AC05-84ER-40150 under which Jefferson Science Associates operates the Thomas Jefferson National Accelerator Facility and Natural Science Foundation of An'hui Educational Committee (KJ2012B179)

  4. Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software

    NASA Technical Reports Server (NTRS)

    Puckett, Nancy; Pettinger, Kris; Hallstrom,John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole; Vu, Tuan

    2014-01-01

    STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.

  5. Modeling and simulation method of target echo energy detection in laser simulation system

    NASA Astrophysics Data System (ADS)

    Cheng, Ye; Lv, Pin; Sun, Quan

    2015-10-01

    When a laser system is studied by numerical simulation, the energy distribution of the target echo on the detector must be modeled and simulated in order to close the optical path loop. Starting from the Fresnel formulae, a bidirectional reflectance distribution function (BRDF) model is used to calculate the intensity distribution of the target reflection; an expression for the light vector angle describes the phase change between the reflected and incident light at a single medium surface. By setting the position and attitude parameters of the different components in the laser simulation system and working through the geometric relationships, the energy distribution in the view of the detector is obtained. The target surface shape was set to planar, spherical and cylindrical in turn. The influence of the target surface roughness (root mean square, RMS) and of the zenith and azimuth angles of the incident light on the target reflection characteristics was analyzed. Results show that this method can accurately simulate the detection of targets with simple geometric surface shapes in a laser system.

  6. Energy dynamics in a simulation of LAPD turbulence

    SciTech Connect

    Friedman, B.; Carter, T. A.; Schaffner, D.; Umansky, M. V.; Dudson, B.

    2012-10-15

    Energy dynamics calculations in a 3D fluid simulation of drift wave turbulence in the linear Large Plasma Device [W. Gekelman et al., Rev. Sci. Instrum. 62, 2875 (1991)] illuminate processes that drive and dissipate the turbulence. These calculations reveal that a nonlinear instability dominates the injection of energy into the turbulence by overtaking the linear drift wave instability that dominates when fluctuations about the equilibrium are small. The nonlinear instability drives flute-like (k_∥ = 0) density fluctuations using free energy from the background density gradient. Through nonlinear axial wavenumber transfer to k_∥ ≠ 0 fluctuations, the nonlinear instability accesses the adiabatic response, which provides the requisite energy transfer channel from density to potential fluctuations as well as the phase shift that causes instability. The turbulence characteristics in the simulations agree remarkably well with experiment. When the nonlinear instability is artificially removed from the system through suppressing k_∥ = 0 modes, the turbulence develops a coherent frequency spectrum which is inconsistent with experimental data. This indicates the importance of the nonlinear instability in producing experimentally consistent turbulence.

  7. Inter-satellite laser link simulation analysis

    NASA Astrophysics Data System (ADS)

    Tong, Lanjuan; Guan, Hui; Wang, Zhilin

    2015-11-01

    The characteristics of satellite communication links are first described and four application modes are put forward; the comparison suggests that microwave links be used for satellite-to-ground communication and laser links for inter-satellite communication. Next, the conditions for and composition of laser link establishment are analyzed, a laser link model is set up, and the principle and composition of the acquisition, pointing, and tracking (APT) system are described. Finally, based on the STK and MATLAB platforms, the process of inter-satellite laser link establishment is designed, and, taking a TDRS capturing and tracking a user satellite as an example scenario, the simulation is realized and demonstrated.

  8. ORSA: Orbit Reconstruction, Simulation and Analysis

    NASA Astrophysics Data System (ADS)

    Tricarico, Pasquale

    2012-04-01

    ORSA is an interactive tool for scientific-grade celestial mechanics computations. Asteroids, comets, artificial satellites, and solar and extra-solar planetary systems can be accurately reproduced, simulated, and analyzed. The software uses JPL ephemeris files for accurate planet positions and has a Qt-based graphical user interface. It offers an advanced 2D plotting tool, a 3D OpenGL viewer, and the standalone numerical library liborsa, and it can import asteroids and comets from all the known databases (MPC, JPL, Lowell, AstDyS, and NEODyS). In addition, it has an integrated download tool to update the databases.

  9. Performance demonstration program plan for analysis of simulated headspace gases

    SciTech Connect

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP.

  10. GISAXS simulation and analysis on GPU clusters

    NASA Astrophysics Data System (ADS)

    Chourou, Slim; Sarje, Abhinav; Li, Xiaoye; Chan, Elaine; Hexemer, Alexander

    2012-02-01

    We have implemented a flexible Grazing Incidence Small-Angle Scattering (GISAXS) simulation code based on the Distorted Wave Born Approximation (DWBA) theory that effectively utilizes the parallel processing power provided by the GPUs. This constitutes a handy tool for experimentalists facing a massive flux of data, allowing them to accurately simulate the GISAXS process and analyze the produced data. The software computes the diffraction image for any given superposition of custom shapes or morphologies (e.g., obtained graphically via a discretization scheme) in a user-defined region of k-space (or region of the area detector) for all possible grazing incidence angles and in-plane sample rotations. This flexibility makes it easy to tackle a wide range of possible sample geometries, such as nanostructures on top of or embedded in a substrate or a multilayered structure. In cases where the sample displays regions of significant refractive index contrast, an algorithm has been implemented to perform an optimal slicing of the sample along the vertical direction and compute the averaged refractive index profile to be used as the reference geometry of the unperturbed system. Preliminary tests on a single GPU show a speedup of over 200 times compared to the sequential code.
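
    For orientation, the minimal sketch below evaluates only the kinematic (Born) scattering from spherical particles on a detector-like q-grid; a full DWBA calculation, as in the code described above, would add the reflection-modified terms and the substrate optics. The shape, q-ranges, and the 5 nm radius are assumptions chosen only for illustration.

      import numpy as np

      def sphere_form_factor(q, radius):
          """Kinematic (Born) form factor of a sphere; DWBA adds three more
          reflection-modified terms, omitted here for brevity."""
          qr = np.clip(q * radius, 1e-9, None)
          return 4.0 * np.pi * radius**3 * (np.sin(qr) - qr * np.cos(qr)) / qr**3

      # detector region of k-space (assumed ranges, in 1/nm)
      qy = np.linspace(-2.0, 2.0, 512)
      qz = np.linspace(0.01, 3.0, 512)
      QY, QZ = np.meshgrid(qy, qz)
      Q = np.sqrt(QY**2 + QZ**2)

      intensity = np.abs(sphere_form_factor(Q, radius=5.0))**2   # 5 nm nanoparticles
      print(intensity.shape, intensity.max())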

  11. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  12. Simulation of Energy Response of the ATIC Calorimeter

    NASA Technical Reports Server (NTRS)

    Batkov, K. E.; Adams, J. H., Jr.; Ahn, H. S.; Bashindzhagyan, G. L.; Case, G.; Christl, M.; Chang, J.; Fazely, A. R.; Ganel, O.; Granger, D.; Six, N. Frank (Technical Monitor)

    2002-01-01

    ATIC (Advanced Thin Ionization Calorimeter) is a balloon-borne experiment designed to measure the cosmic ray composition for elements from hydrogen to iron and their energy spectra from about 50 GeV to near 100 TeV. It consists of a Si-matrix detector to determine the charge of a cosmic-ray particle, a scintillator hodoscope for tracking, carbon interaction targets, and a fully active BGO calorimeter. ATIC had its first flight from McMurdo, Antarctica, from 28/12/2000 to 13/01/2001 and collected approximately 25 million events. To reconstruct the primary spectra from the spectra of energy deposits measured in the experiment, the correlation between the kinetic energy of a primary particle, E_kin, and the energy deposit in the calorimeter, E_d, must be known. For this purpose, the energy response of the calorimeter to the energy spectra of different nuclei was simulated with the GEANT-3.21 code using the QGSM generator for nucleus-nucleus interactions. The incident flux was taken as isotropic over the ATIC aperture. Primary spectra that are power laws in momentum were used as inputs, in accordance with standard models of cosmic-ray acceleration; these spectra become power laws in kinetic energy at E_kin above roughly 20Mc², where M is the primary nucleus mass. The energy-deposit spectra measured by ATIC show similar behavior. Distributions of the ratio E_kin/E_d are presented for different energy deposits and for a set of primaries. For the power-law regions of the energy spectra, at E_d ≥ 20Mc², the mean value of E_kin/E_d increases from about 2.4 for protons to about 3.1 for iron, while the ratio of the rms to the mean decreases from 50% for protons to about 15% for iron. These values were obtained for the spectral index gamma = 1.6.

  13. Nesting Large-Eddy Simulations Within Mesoscale Simulations for Wind Energy Applications

    NASA Astrophysics Data System (ADS)

    Lundquist, J. K.; Mirocha, J. D.; Chow, F. K.; Kosovic, B.; Lundquist, K. A.

    2008-12-01

    With increasing demand for more accurate atmospheric simulations for wind turbine micrositing, for operational wind power forecasting, and for more reliable turbine design, simulations of atmospheric flow with resolution of tens of meters or higher are required. These time-dependent large-eddy simulations (LES) account for complex terrain and resolve individual atmospheric eddies on length scales smaller than turbine blades. These small-domain, high-resolution simulations are possible with a range of commercial and open-source software, including the Weather Research and Forecasting (WRF) model. In addition to "local" sources of turbulence within an LES domain, changing weather conditions outside the domain can also affect the flow, suggesting that a mesoscale model provide boundary conditions to the large-eddy simulations. Nesting a large-eddy simulation within a mesoscale model requires nuanced representations of turbulence. Our group has improved the Weather Research and Forecasting (WRF) model's LES capability by implementing the Nonlinear Backscatter and Anisotropy (NBA) subfilter stress model following Kosović (1997) and an explicit filtering and reconstruction technique to compute the Resolvable Subfilter-Scale (RSFS) stresses (following Chow et al., 2005). We have also implemented an immersed boundary method (IBM) in WRF to accommodate complex terrain. These new models improve WRF's LES capabilities over complex terrain and in stable atmospheric conditions. We demonstrate approaches to nesting LES within a mesoscale simulation for farms of wind turbines in hilly regions. Results are sensitive to the nesting method, indicating that care must be taken to provide appropriate boundary conditions, and to allow adequate spin-up of turbulence in the LES domain. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
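
    The core idea, identifying which Monte Carlo inputs drive failures, can be illustrated with a simple ranking heuristic. This is not TRAM's actual algorithm; the variable names and the synthetic failure rule below are invented for the example.

      import numpy as np

      def rank_driving_variables(inputs, failed):
          """Rank Monte Carlo input variables by how strongly they separate failed
          from passed cases (difference of means in units of pooled std)."""
          scores = {}
          for name, values in inputs.items():
              f, p = values[failed], values[~failed]
              pooled = np.sqrt(0.5 * (f.var() + p.var())) + 1e-12
              scores[name] = abs(f.mean() - p.mean()) / pooled
          return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

      rng = np.random.default_rng(0)
      n = 5000
      inputs = {"mass_error":  rng.normal(0, 1, n),
                "wind_gust":   rng.normal(0, 1, n),
                "sensor_bias": rng.normal(0, 1, n)}
      failed = (inputs["wind_gust"] + 0.1 * rng.normal(0, 1, n)) > 2.0   # synthetic failure rule
      for name, score in rank_driving_variables(inputs, failed):
          print(f"{name:12s} {score:.2f}")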

  15. Simulating electron energy loss spectroscopy with the MNPBEM toolbox

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich

    2014-03-01

    Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (García de Abajo, 2010). We introduce two classes, eelsret and eelsstat, that allow, in combination with our recently developed MNPBEM toolbox, a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence.
    Catalogue identifier: AEKJ_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 38886
    No. of bytes in distributed program, including test data, etc.: 1222650
    Distribution format: tar.gz
    Programming language: Matlab 7.11.0 (R2010b)
    Computer: Any which supports Matlab 7.11.0 (R2010b)
    Operating system: Any which supports Matlab 7.11.0 (R2010b)
    RAM: ≥ 1 GB
    Classification: 18
    Catalogue identifier of previous version: AEKJ_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370
    External routines: MESH2D, available at www.mathworks.com
    Does the new version supersede the previous version?: Yes
    Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles.
    Solution method: Boundary element method using electromagnetic potentials.
    Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy.

  16. Direct molecular simulation of nitrogen dissociation based on an ab initio potential energy surface

    SciTech Connect

    Valentini, Paolo; Schwartzentruber, Thomas E.; Bender, Jason D.; Nompelis, Ioannis; Candler, Graham V.

    2015-08-15

    The direct molecular simulation (DMS) approach is used to predict the internal energy relaxation and dissociation dynamics of high-temperature nitrogen. An ab initio potential energy surface (PES) is used to calculate the dynamics of two interacting nitrogen molecules by providing forces between the four atoms. In the near-equilibrium limit, it is shown that DMS reproduces the results obtained from well-established quasiclassical trajectory (QCT) analysis, verifying the validity of the approach. DMS is used to predict the vibrational relaxation time constant for N₂–N₂ collisions and its temperature dependence, which are in close agreement with existing experiments and theory. Using both QCT and DMS with the same PES, we find that dissociation significantly depletes the upper vibrational energy levels. As a result, across a wide temperature range, the dissociation rate is found to be approximately 4–5 times lower compared to the rates computed using QCT with Boltzmann energy distributions. DMS calculations predict a quasi-steady-state distribution of rotational and vibrational energies in which the rate of depletion of high-energy states due to dissociation is balanced by their rate of repopulation due to collisional processes. The DMS approach simulates the evolution of internal energy distributions and their coupling to dissociation without the need to precompute rates or cross sections for all possible energy transitions. These benchmark results could be used to develop new computational fluid dynamics models for high-enthalpy flow applications.

  17. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    SciTech Connect

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

    EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
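
    A stripped-down version of the batching idea might look like the sketch below: one CSV row of high-level parameters per building is expanded into an input-file stub. The CSV column name, output layout, and the partial Building object in the template are placeholders for illustration, not the GPARM or NREL Preprocessor templates described above.

      import csv, pathlib

      TEMPLATE = """! Hypothetical fragment only -- a real idf needs many more objects.
      Building,
        {name},          !- Name
        0.0,             !- North Axis {{deg}}
        City,            !- Terrain
        0.04, 0.4,       !- Loads / Temperature Convergence Tolerance
        FullExterior,    !- Solar Distribution
        25, 6;           !- Max Warmup Days, Min Warmup Days
      """

      def batch_generate(csv_path, out_dir):
          """Expand one row of high-level parameters per building into an idf stub."""
          out = pathlib.Path(out_dir)
          out.mkdir(exist_ok=True)
          with open(csv_path, newline="") as fh:
              for row in csv.DictReader(fh):
                  (out / f"{row['name']}.idf").write_text(TEMPLATE.format(name=row["name"]))

      # usage sketch: batch_generate("buildings.csv", "idf_out")  # CSV assumed to have a 'name' column

    The generated stubs could then be dispatched to a cluster scheduler (the paper uses "make" on Linux) and the per-building results aggregated into a single table.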

  18. Simulation of the Atmospheric Boundary Layer for Wind Energy Applications

    NASA Astrophysics Data System (ADS)

    Marjanovic, Nikola

    Energy production from wind is an increasingly important component of overall global power generation, and will likely continue to gain an even greater share of electricity production as world governments attempt to mitigate climate change and wind energy production costs decrease. Wind energy generation depends on wind speed, which is greatly influenced by local and synoptic environmental forcings. Synoptic forcing, such as a cold frontal passage, exists on a large spatial scale while local forcing manifests itself on a much smaller scale and could result from topographic effects or land-surface heat fluxes. Synoptic forcing, if strong enough, may suppress the effects of generally weaker local forcing. At the even smaller scale of a wind farm, upstream turbines generate wakes that decrease the wind speed and increase the atmospheric turbulence at the downwind turbines, thereby reducing power production and increasing fatigue loading that may damage turbine components, respectively. Simulation of atmospheric processes that span a considerable range of spatial and temporal scales is essential to improve wind energy forecasting, wind turbine siting, turbine maintenance scheduling, and wind turbine design. Mesoscale atmospheric models predict atmospheric conditions using observed data, for a wide range of meteorological applications across scales from thousands of kilometers to hundreds of meters. Mesoscale models include parameterizations for the major atmospheric physical processes that modulate wind speed and turbulence dynamics, such as cloud evolution and surface-atmosphere interactions. The Weather Research and Forecasting (WRF) model is used in this dissertation to investigate the effects of model parameters on wind energy forecasting. WRF is used for case study simulations at two West Coast North American wind farms, one with simple and one with complex terrain, during both synoptically and locally-driven weather events. The model's performance with different

  19. Starlight emergence angle error analysis of star simulator

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Zhang, Guo-yu

    2015-10-01

    As the key technologies of star sensors continue to develop, the precision of star simulators must be further improved, because it directly affects the accuracy of star sensor laboratory calibration. Improving the accuracy of a star simulator requires a theoretical accuracy analysis model, which can be established from the ideal imaging model of the simulator. Analysis of this model shows that the deviation of the starlight emergence angle is primarily affected by deviations in star position, principal point position, focal length, distortion, and object-plane tilt. From these contributing factors, a comprehensive deviation model can be established, and the formulas for each individual deviation model and for the comprehensive model can be derived. Analyzing the properties of each factor's deviation model and of the comprehensive model yields the characteristics of each factor and the weight relationships among them. Based on the results of this analysis, reasonable design indexes can be specified by considering the requirements of the star simulator optical system and the achievable precision of machining and alignment. Error analysis of the starlight emergence angle is therefore important for defining and verifying the star simulator's specifications and for analyzing and compensating its errors, providing a theoretical basis for further improving the starlight angle precision of the simulator.
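
    One common way to combine independent deviation sources into a comprehensive figure is a root-sum-square budget; the abstract does not state its combination rule, so the rule and the per-factor values below are assumptions used only to show the bookkeeping.

      # Hypothetical root-sum-square error budget for the starlight emergence angle;
      # the individual deviation values (arcsec) are placeholders, not measured data.
      import math

      budget_arcsec = {"star position": 8.0, "principal point": 5.0,
                       "focal length": 4.0, "distortion": 6.0, "object-plane tilt": 3.0}
      total = math.sqrt(sum(v * v for v in budget_arcsec.values()))
      print(f"comprehensive emergence-angle deviation ~ {total:.1f} arcsec")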

  20. Simulation and analysis of differential GPS

    NASA Astrophysics Data System (ADS)

    Denaro, R. P.

    NASA is conducting a research program to evaluate differential Global Positioning System (GPS) concepts for civil helicopter navigation. It is pointed out that the civil helicopter community will probably be an early user of GPS because of the unique mission operations in areas where precise navigation aids are not available. However, many of these applications involve accuracy requirements which cannot be satisfied by conventional GPS. Such applications include remote area search and rescue, offshore oil platform approach, remote area precision landing, and other precise navigation operations. Differential GPS provides a promising approach for meeting very demanding accuracy requirements. The considered procedure eliminates some of the common bias errors experienced by conventional GPS. This is done by making use of a second GPS receiver. A simulation process is developed as a tool for analyzing various scenarios of GPS-referenced civil aircraft navigation.

  1. Simulation and analysis of differential GPS

    NASA Technical Reports Server (NTRS)

    Denaro, R. P.

    1984-01-01

    NASA is conducting a research program to evaluate differential Global Positioning System (GPS) concepts for civil helicopter navigation. It is pointed out that the civil helicopter community will probably be an early user of GPS because of the unique mission operations in areas where precise navigation aids are not available. However, many of these applications involve accuracy requirements which cannot be satisfied by conventional GPS. Such applications include remote area search and rescue, offshore oil platform approach, remote area precision landing, and other precise navigation operations. Differential GPS provides a promising approach for meeting very demanding accuracy requirements. The considered procedure eliminates some of the common bias errors experienced by conventional GPS. This is done by making use of a second GPS receiver. A simulation process is developed as a tool for analyzing various scenarios of GPS-referenced civil aircraft navigation.
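
    The differential correction idea can be shown with a few lines of arithmetic: a base receiver at a surveyed location estimates the bias common to both receivers for each satellite, and the rover subtracts it. The ranges, bias levels, and noise figures below are invented for illustration and are not taken from the NASA simulation.

      import numpy as np

      rng = np.random.default_rng(1)
      true_range_base = np.array([20200e3, 20450e3, 21000e3, 20750e3])   # m, assumed
      true_range_rover = true_range_base + np.array([1500.0, -900.0, 300.0, 2100.0])

      common_bias = rng.normal(0.0, 10.0, 4)      # ionosphere/troposphere/clock, shared
      noise = lambda: rng.normal(0.0, 0.5, 4)     # receiver noise, uncorrelated

      pr_base = true_range_base + common_bias + noise()
      pr_rover = true_range_rover + common_bias + noise()

      corrections = pr_base - true_range_base     # base knows its surveyed position
      pr_rover_corrected = pr_rover - corrections

      print("raw rover error (m):      ", np.abs(pr_rover - true_range_rover).round(2))
      print("corrected rover error (m):", np.abs(pr_rover_corrected - true_range_rover).round(2))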

  2. Thermostructural analysis of simulated cowl lips

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.

    1988-01-01

    Three dimensional finite element analyses using MSC/NASTRAN and MARC are performed to predict the thermal and structural response of various cooling schemes under high heat loads. Steady state heat transfer analyses and elastic stress analyses are performed using MSC/NASTRAN. Elastic/plastic analyses are done using MARC. To help verify these analyses experimentally, a hydrogen-oxygen rocket engine was modified to use the exhaust stream as a high enthalpy, high heat flux source to evaluate various actively cooled, simulated cowl lip (leading edges) segments as well as flat structural segments. Cross flow and parallel flow cooling configurations were tested and analyzed using cooling fluids of water and gaseous hydrogen. In addition, various material types, including high conductivity copper, nickel, and a copper and graphite metal matrix composite were tested and compared.

  3. Analysis of ship maneuvering data from simulators

    NASA Astrophysics Data System (ADS)

    Frette, V.; Kleppe, G.; Christensen, K.

    2011-03-01

    We analyze complex maneuvering histories of ships obtained from training sessions on bridge simulators. Advanced ships are used in fields like offshore oil exploration: dive support vessels, supply vessels, anchor handling vessels, tugs, cable layers, and multi-purpose vessels. Due to high demands from the operations carried out, these ships need to have very high maneuverability. This is achieved through a propulsion system with several thrusters, water jets, and rudders in addition to standard propellers. For some operations, like subsea maintenance, it is crucial that the ship accurately keeps a fixed position. Therefore, bridge systems usually incorporate equipment for Dynamic Positioning (DP). DP is a method to keep ships and semi-submersible rigs in a fixed position using the propulsion systems instead of anchors. It may also be used for sailing a vessel from one position to another along a predefined route. Like an autopilot on an airplane, DP may operate without human involvement. The method relies on accurate determination of position from external reference systems like GPS, as well as a continuously adjusted mathematical model of the ship and external forces from wind, waves and currents. In a specific simulator exercise for offshore crews, a ship is to be taken up to an installation consisting of three nearby oil platforms connected by bridges (Frigg field, North Sea), where a subsea inspection is to be carried out. Due to the many degrees of freedom during maneuvering, including partial or full use of DP, the chosen routes vary significantly. In this poster we report preliminary results on representations of the complex maneuvering histories; representations that allow comparison between crew groups, and, possibly, sorting of the different strategic choices behind them.

  4. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  5. SIMULATIONS OF THE AGS MMPS STORING ENERGY IN CAPACITOR BANKS

    SciTech Connect

    MARNERIS, I.; BADEA, V.S.; BONATI, R.; ROSER, T.; SANDBERG, J.

    2007-06-25

    The Brookhaven AGS Main Magnet Power Supply (MMPS) is a thyristor-controlled supply rated at 5500 A and +/-9000 V; the peak magnet power is 50 MW. The power supply is fed from a Siemens motor/generator whose generator is a 3-phase, 7500 V machine rated at 50 MVA, and the peak power requirements are met from the energy stored in the rotor of the motor/generator. The motor/generator is about 45 years old, and it is not clear whether similar machines will be manufactured in the future. We are therefore investigating different ways of storing energy for future AGS MMPS operations. This paper presents simulations of a power supply in which energy is stored in capacitor banks. Two dc-to-dc converters are presented along with the control system of the power section; the switching elements are IGCTs made by ABB. The simulation program used is PSIM version 6.1. While the magnets are pulsed at +/-50 MW, the average power drawn from the local power authority is kept constant, and the reactive power is also kept constant, below 1.5 MVAR. Waveforms are presented.
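
    For a sense of scale, the following back-of-the-envelope sketch sizes a capacitor bank from E = ½C(V_max² − V_min²); the pulse duration and voltage swing are assumed values chosen only for illustration, not the actual AGS design parameters.

      # Illustrative sizing sketch (assumed numbers, not the AGS design): if the
      # magnets draw a 50 MW peak for ~1 s while the utility feed is held constant,
      # the capacitor bank must supply the difference from its stored energy.
      peak_power_w = 50e6               # W, assumed pulse power drawn from storage
      pulse_time_s = 1.0                # s, assumed duration of the acceleration ramp
      v_max, v_min = 9000.0, 6000.0     # V, assumed capacitor voltage swing

      energy_needed_j = peak_power_w * pulse_time_s
      capacitance_f = 2.0 * energy_needed_j / (v_max**2 - v_min**2)
      print(f"required capacitance ~ {capacitance_f:.1f} F "
            f"({energy_needed_j/1e6:.0f} MJ swing between {v_max:.0f} V and {v_min:.0f} V)")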

  6. Adolescent girls' energy expenditure during dance simulation active computer gaming.

    PubMed

    Fawkner, Samantha G; Niven, Alisa; Thin, Alasdair G; Macdonald, Mhairi J; Oakes, Jemma R

    2010-01-01

    The objective of this study was to determine the energy expended and intensity of physical activity achieved by adolescent girls while playing on a dance simulation game. Twenty adolescent girls were recruited from a local secondary school. Resting oxygen uptake (VO(2)) and heart rate were analysed while sitting quietly and subsequently during approximately 30 min of game play, with 10 min at each of three increasing levels of difficulty. Energy expenditure was predicted from VO(2) at rest and during game play at three levels of play, from which the metabolic equivalents (METS) of game playing were derived. Mean +/- standard deviation energy expenditure for levels 1, 2, and 3 was 3.63 +/- 0.58, 3.65 +/- 0.54, and 4.14 +/- 0.71 kcal . min(-1) respectively, while mean activity for each level of play was at least of moderate intensity (>3 METS). Dance simulation active computer games provide an opportunity for most adolescent girls to exercise at moderate intensity. Therefore, regular playing might contribute to daily physical activity recommendations for good health in this at-risk population. PMID:20013462
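
    The conversion behind these numbers is standard exercise physiology; the sketch below applies it with an assumed oxygen uptake and body mass (the study's individual data are not reproduced here), using roughly 5 kcal per litre of O2 and 3.5 mL·kg⁻¹·min⁻¹ per MET.

      # Worked conversion sketch (textbook constants, assumed inputs): from measured
      # oxygen uptake to kcal/min and METs, as reported in the dance-game study.
      vo2_ml_kg_min = 14.0      # assumed VO2 during play
      body_mass_kg  = 55.0      # assumed participant mass
      KCAL_PER_L_O2 = 5.0       # ~5 kcal per litre O2 (mixed substrate)
      MET_ML_KG_MIN = 3.5       # 1 MET = 3.5 mL O2 / kg / min

      vo2_l_min = vo2_ml_kg_min * body_mass_kg / 1000.0
      print(f"energy expenditure ~ {vo2_l_min * KCAL_PER_L_O2:.2f} kcal/min")
      print(f"intensity          ~ {vo2_ml_kg_min / MET_ML_KG_MIN:.1f} METs")

    With these assumed inputs the sketch gives about 3.9 kcal/min and 4 METs, consistent in magnitude with the values reported above.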

  7. Energy Analysis Program 1990 annual report

    SciTech Connect

    Not Available

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated resource utility planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  9. Monte Carlo simulation of energy deposition by low-energy electrons in molecular hydrogen

    NASA Technical Reports Server (NTRS)

    Heaps, M. G.; Furman, D. R.; Green, A. E. S.

    1975-01-01

    A set of detailed atomic cross sections has been used to obtain the spatial deposition of energy by 1-20-eV electrons in molecular hydrogen by a Monte Carlo simulation of the actual trajectories. The energy deposition curve (energy per distance traversed) is quite peaked in the forward direction about the entry point for electrons with energies above the threshold of the electronic states, but the peak decreases and broadens noticeably as the electron energy decreases below 10 eV (threshold for the lowest excitable electronic state of H2). The curve also assumes a very symmetrical shape for energies below 10 eV, indicating the increasing importance of elastic collisions in determining the shape of the curve, although not the mode of energy deposition.
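
    To illustrate the flavor of such a calculation, here is a deliberately crude 1-D Monte Carlo with a single assumed mean free path, a fixed energy loss per collision, and isotropic scattering; the detailed hydrogen cross sections used in the paper are not modeled.

      import numpy as np

      rng = np.random.default_rng(2)

      def deposit(e0_ev, mfp=1.0, loss_ev=0.5, n=20000):
          """Track n electrons of initial energy e0_ev; record where each collision
          deposits its (assumed constant) energy loss along the entry axis z."""
          z_dep, e_dep = [], []
          for _ in range(n):
              z, mu, e = 0.0, 1.0, e0_ev          # start at origin moving forward
              while e > 0:
                  z += mu * rng.exponential(mfp)  # free flight to next collision
                  de = min(loss_ev, e)
                  z_dep.append(z)
                  e_dep.append(de)
                  e -= de
                  mu = rng.uniform(-1.0, 1.0)     # isotropic scattering (crude)
          return np.array(z_dep), np.array(e_dep)

      z, de = deposit(10.0)
      hist, edges = np.histogram(z, bins=40, weights=de)
      print("peak deposition near z =", edges[np.argmax(hist)].round(2), "mean free paths")

    Even this toy version reproduces the qualitative point of the abstract: lower-energy electrons scatter more symmetrically about the entry point, broadening the deposition curve.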

  10. Low-Energy Impacts onto Lunar Regolith Simulant

    NASA Astrophysics Data System (ADS)

    Seward, Laura M.; Colwell, J.; Mellon, M.; Stemm, B.

    2012-10-01

    Low-Energy Impacts onto Lunar Regolith Simulant Laura M. Seward1, Joshua E. Colwell1, Michael T. Mellon2, and Bradley A. Stemm1, 1Department of Physics, University of Central Florida, Orlando, Florida, 2Southwest Research Institute, Boulder, Colorado. Impacts and cratering in space play important roles in the formation and evolution of planetary bodies. Low-velocity impacts and disturbances to planetary regolith are also a consequence of manned and robotic exploration of planetary bodies such as the Moon, Mars, and asteroids. We are conducting a program of laboratory experiments to study low-velocity impacts of 1 to 5 m/s into JSC-1 lunar regolith simulant, JSC-Mars-1 Martian regolith simulant, and silica targets under 1 g. We use direct measurement of ejecta mass and high-resolution video tracking of ejecta particle trajectories to derive ejecta mass velocity distributions. Additionally, we conduct similar experiments under microgravity conditions in a laboratory drop tower and on parabolic aircraft with velocities as low as 10 cm/s. We wish to characterize and understand the collision parameters that control the outcome of low-velocity impacts into regolith, including impact velocity, impactor mass, target shape and size distribution, regolith depth, target relative density, and crater depth, and to experimentally determine the functional dependencies of the outcomes of low-velocity collisions (ejecta mass and ejecta velocities) on the controlling parameters of the collision. We present results from our ongoing study showing the positive correlation between impact energy and ejecta mass. The total ejecta mass is also dependent on the packing density (porosity) of the regolith. We find that ejecta mass velocity fits a power-law or broken power-law distribution. Our goal is to understand the physics of ejecta production and regolith compaction in low-energy impacts and experimentally validate predictive models for dust flow and deposition. We will present our

  11. Analysis & Simulation of Dynamics in Supercooled Liquids

    NASA Astrophysics Data System (ADS)

    Elmatad, Yael Sarah

    2011-12-01

    The nature of supercooled liquids and the glass transition has been debated by many scientists. Several theories have been put forth to describe the remarkable properties of this out-of-equilibrium material. Each of these theories makes specific predictions as to how the scaling of various transport properties in supercooled materials should behave. Given access to a large pool of high-quality supercooled liquid data we seek to compare these theories to one another. Moreover, we explore properties of a pair of models which are the basis for one particularly attractive theory---Chandler-Garrahan theory---and discuss the models' behavior in space-time and possible implications to the behavior of experimental supercooled liquids. Here we investigate the nature of dynamics in supercooled liquids using a two pronged approach. First we analyze the transport properties found in experiments and simulations of supercooled liquids. Then, we analyze simulation trajectories for lattice models which reproduce many of the interesting properties of supercooled liquids. In doing so, we illuminate several glass universalities, common properties of a wide variety of glass formers. By analyzing relaxation time and viscosity data for over 50 data sets and 1200 points, we find that relaxation time can be collapsed onto a single, parabolic curve. This collapse supports a theory of universal glass behavior based on facilitated models proposed by David Chandler and Juan Garrahan in 2003. We then show that the parabolic fit parameters for any particular liquid are a material property: they converge fast and are capable of predicting behavior in regions beyond the included data sets. We compare this property to other popular fitting schemes such as the Vogel-Fulcher, double exponential, and fractional exponential forms and conclude that these three forms result in parameters which are non predictive and therefore not material properties. Additionally, we examine the role of attractive
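
    The parabolic collapse referred to above can be reproduced in a few lines by fitting log τ = log τ₀ + J²(1/T − 1/T₀)² below the onset temperature; the data in the sketch are synthetic, and only the functional form follows the parabolic law.

      import numpy as np
      from scipy.optimize import curve_fit

      def parabolic(inv_T, log_tau0, J, inv_T0):
          # parabolic law; the clip switches the fit off above the onset (T > T0)
          return log_tau0 + J**2 * np.clip(inv_T - inv_T0, 0.0, None)**2

      # synthetic "measurements" with scatter, in reduced units (not experimental data)
      inv_T = np.linspace(1.0, 2.5, 25)
      log_tau = parabolic(inv_T, -1.0, 4.0, 1.1) + np.random.default_rng(3).normal(0, 0.05, 25)

      popt, _ = curve_fit(parabolic, inv_T, log_tau, p0=[-1.0, 3.0, 1.0])
      print("fitted log_tau0, J, 1/T0 =", np.round(popt, 2))

    The fitted parameters (τ₀, J, T₀) play the role of the material properties discussed above: once converged, they can be used to extrapolate relaxation times beyond the fitted range.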

  12. Simulation and evaluation of latent heat thermal energy storage

    NASA Technical Reports Server (NTRS)

    Sigmon, T. W.

    1980-01-01

    The relative value of thermal energy storage (TES) for heat pump systems (heating and cooling) was derived as a function of storage temperature, mode of storage (hot-side or cold-side), geographic location, and utility time-of-use rate structure. The computer models used to simulate the performance of a number of TES/heat pump configurations are described. The models are based on existing performance data for heat pump components, available building thermal load computational procedures, and a generalized TES subsystem design. Life-cycle costs computed for each site, configuration, and rate structure are discussed.

  13. Simulating distributed reinforcement effects in concrete analysis

    SciTech Connect

    Marchertas, A.H.

    1985-01-01

    The effect of the bond slip is brought into the TEMP-STRESS finite element code by relaxing the equal strain condition between concrete and reinforcement. This is done for the elements adjacent to the element which is cracked. A parabolic differential strain variation is assumed along the reinforcement from the crack, which is taken to be at the centroid of the cracked element, to the point where perfect bonding exists. This strain relationship is used to increase the strain of the reinforcement in the as yet uncracked elements located adjacent to a crack. By the same token the corresponding concrete strain is decreased. This estimate is made assuming preservation of strain energy in the element. The effectiveness of the model is shown by examples. Comparison of analytical results is made with structural test data. The influence of the bonding model on cracking is portrayed pictorially. 5 refs., 6 figs.

  14. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    SciTech Connect

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen; Guo, Siyue

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
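
    In the spirit of the model described above, a sampler that draws the number of overtime workers from a binomial distribution and their stay lengths from an exponential distribution can be written as follows; the occupant count, probability, and mean duration are assumptions, not the calibrated values from the measured building.

      import numpy as np

      rng = np.random.default_rng(4)
      N_OCCUPANTS = 60      # people normally present (assumed)
      P_OVERTIME  = 0.15    # probability a given occupant stays late on a given day (assumed)
      MEAN_HOURS  = 1.8     # mean overtime duration, hours (assumed)

      def daily_overtime_schedule(step_h=0.25, horizon_h=6.0):
          """Fraction of occupants still present at each time step after normal hours."""
          n_stay = rng.binomial(N_OCCUPANTS, P_OVERTIME)
          durations = rng.exponential(MEAN_HOURS, n_stay)
          t = np.arange(0.0, horizon_h, step_h)
          present = (durations[:, None] > t[None, :]).sum(axis=0)
          return t, present / N_OCCUPANTS

      t, frac = daily_overtime_schedule()
      print(np.round(frac[:8], 3))   # occupancy fraction for the first two hours

    Repeating the draw for each workday yields the stochastic overtime schedules that would be fed into the building energy model as occupancy, lighting, and plug-load inputs.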

  15. Simulation of the Atmospheric Boundary Layer for Wind Energy Applications

    NASA Astrophysics Data System (ADS)

    Marjanovic, Nikola

    Energy production from wind is an increasingly important component of overall global power generation, and will likely continue to gain an even greater share of electricity production as world governments attempt to mitigate climate change and wind energy production costs decrease. Wind energy generation depends on wind speed, which is greatly influenced by local and synoptic environmental forcings. Synoptic forcing, such as a cold frontal passage, exists on a large spatial scale while local forcing manifests itself on a much smaller scale and could result from topographic effects or land-surface heat fluxes. Synoptic forcing, if strong enough, may suppress the effects of generally weaker local forcing. At the even smaller scale of a wind farm, upstream turbines generate wakes that decrease the wind speed and increase the atmospheric turbulence at the downwind turbines, thereby reducing power production and increasing fatigue loading that may damage turbine components, respectively. Simulation of atmospheric processes that span a considerable range of spatial and temporal scales is essential to improve wind energy forecasting, wind turbine siting, turbine maintenance scheduling, and wind turbine design. Mesoscale atmospheric models predict atmospheric conditions using observed data, for a wide range of meteorological applications across scales from thousands of kilometers to hundreds of meters. Mesoscale models include parameterizations for the major atmospheric physical processes that modulate wind speed and turbulence dynamics, such as cloud evolution and surface-atmosphere interactions. The Weather Research and Forecasting (WRF) model is used in this dissertation to investigate the effects of model parameters on wind energy forecasting. WRF is used for case study simulations at two West Coast North American wind farms, one with simple and one with complex terrain, during both synoptically and locally-driven weather events. The model's performance with different

  16. Ray-tracing simulations of coupled dark energy models

    NASA Astrophysics Data System (ADS)

    Pace, Francesco; Baldi, Marco; Moscardini, Lauro; Bacon, David; Crittenden, Robert

    2015-02-01

    Dark matter and dark energy are usually assumed to couple only gravitationally. An extension to this picture is to model dark energy as a scalar field coupled directly to cold dark matter. This coupling leads to new physical effects, such as a fifth force and a time-dependent dark matter particle mass. In this work we examine the impact that coupling has on weak lensing statistics by constructing realistic simulated weak lensing maps using ray-tracing techniques through N-body cosmological simulations. We construct maps for different lensing quantities, covering a range of scales from a few arcminutes to several degrees. The concordance Λ cold dark matter (ΛCDM) model is compared to different coupled dark energy models, described either by an exponential scalar field potential (standard coupled dark energy scenario) or by a SUGRA potential (bouncing model). We analyse several statistical quantities and our results, with sources at low redshifts are largely consistent with previous work on cosmic microwave background lensing by Carbone et al. The most significant differences from the ΛCDM model are due to the enhanced growth of the perturbations and to the effective friction term in non-linear dynamics. For the most extreme models, we see differences in the power spectra up to 40 per cent compared to the ΛCDM model. The different time evolution of the linear matter overdensity can account for most of the differences, but when controlling for this using a ΛCDM model having the same normalization, the overall signal is smaller due to the effect of the friction term appearing in the equation of motion for dark matter particles.

  17. Summary of: Simulating the Value of Concentrating Solar Power with Thermal Energy Storage in a Production Cost Model (Presentation)

    SciTech Connect

    Denholm, P.; Hummon, M.

    2013-02-01

    Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically received limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.

  18. Solvation free energies from a coupled reference interaction site model/simulation approach

    NASA Astrophysics Data System (ADS)

    Freedman, Holly

    2005-12-01

    Accounting for solvation in thermodynamic studies is one issue requiring further work so that computational models may be more advantageously applied to studies of systems in solution, especially studies of large systems such as aqueous biosystems. In the area of applied molecular biology, this capability is very significant, since computational thermodynamic studies can supply valuable information to assist in the design of molecules with specific desired binding or conformational properties, such as inhibitors. This dissertation addresses the issue by suggesting a new approach for the calculation of solvation free energies using a modified reference interaction site model (RISM) integral equation approach in which molecular simulations are used to provide the solvent structure around a solute at infinite dilution in the form of radial distribution functions. The intent is to compensate for insufficiencies arising in the standard RISM approach, and in this way to establish an alternative to the very time consuming free energy simulations, which are usually depended upon for high accuracy and have seen some recent applications, although falling somewhat short as far as practicality. Chapter 2 of this dissertation investigates how it is possible to implement such a scheme, with consideration given to the inherent errors associated with such an approach. In Chapter 3, the resulting coupled RISM simulation methodology is first tested by its application to determine absolute solvation free energies of some small molecules. Applications of relative solvation free energy determination by the coupled RISM/simulation methodology to the conformational analysis of the alanine dipeptide, and to the tautomeric equilibria of the DNA base cytosine and one of its analogues, are then described in Chapters 4 and 5, respectively. Chapter 6 is concerned with the determination of a potential of mean force profile for a simple atom transfer reaction in aqueous solution. By means of

  19. Efficient Coalescent Simulation and Genealogical Analysis for Large Sample Sizes

    PubMed Central

    Kelleher, Jerome; Etheridge, Alison M; McVean, Gilean

    2016-01-01

    A central challenge in the analysis of genetic variation is to provide realistic genome simulation across millions of samples. Present day coalescent simulations do not scale well, or use approximations that fail to capture important long-range linkage properties. Analysing the results of simulations also presents a substantial challenge, as current methods to store genealogies consume a great deal of space, are slow to parse and do not take advantage of shared structure in correlated trees. We solve these problems by introducing sparse trees and coalescence records as the key units of genealogical analysis. Using these tools, exact simulation of the coalescent with recombination for chromosome-sized regions over hundreds of thousands of samples is possible, and substantially faster than present-day approximate methods. We can also analyse the results orders of magnitude more quickly than with existing methods. PMID:27145223

  20. Annual Energy Consumption Analysis Report for Richland Middle School

    SciTech Connect

    Liu, Bing

    2003-12-18

    Richland Middle School is a single-story, 90,000-square-foot new school located in Richland, WA. The design team proposed four HVAC system options to serve the building: (1) 4-pipe fan coil units served by an electrical chiller and gas-fired boilers, (2) ground-source closed-loop water heat pumps, (3) water-loop heat pumps with a boiler and cooling tower, and (4) a VAV system served by an electrical chiller and gas-fired boiler. This analysis estimates the annual energy consumption and costs of each system option, in order to provide the design team with a reasonable basis for determining which system is most life-cycle cost effective. eQuest (version 3.37), a computer-based energy simulation program that uses the DOE-2 simulation engine, was used to estimate the annual energy costs.

  1. Calculation of free-energy differences by confinement simulations. Application to peptide conformers.

    PubMed

    Cecchini, M; Krivov, S V; Spichty, M; Karplus, M

    2009-07-23

    Conformational free-energy differences are key quantities for understanding important phenomena in molecular biology that involve large structural changes of macromolecules. In this paper, an improved version of the confinement approach, which is based on earlier developments, is used to determine the free energy of individual molecular states by progressively restraining the corresponding molecular structures to pure harmonic basins, whose absolute free energy can be computed by normal-mode analysis. The method is used to calculate the free-energy difference between two conformational states of the alanine dipeptide in vacuo, and of the beta-hairpin from protein G with an implicit solvation model. In all cases, the confinement results are in excellent agreement with the ones obtained from converged equilibrium molecular dynamics simulations, which have a much larger computational cost. The systematic and statistical errors of the results are evaluated and the origin of the errors is identified. The sensitivity of the calculated free-energy differences to structure-based definitions of the molecular states is discussed. A variant of the method, which closes the thermodynamic cycle by a quasi-harmonic rather than harmonic analysis, is introduced. The latter is proposed for possible use with explicit solvent simulations. PMID:19552392
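
    Schematically, the thermodynamic cycle can be written as below (notation introduced here, not taken from the paper, and assuming classical harmonic basins whose absolute free energies follow from normal-mode frequencies ω_{i,X}); ΔF^conf_X denotes the reversible work of progressively restraining state X into its harmonic basin.

      \Delta F_{A \to B} \;=\; \Delta F^{\mathrm{conf}}_{A}
                         \;+\; \bigl( F^{\mathrm{harm}}_{B} - F^{\mathrm{harm}}_{A} \bigr)
                         \;-\; \Delta F^{\mathrm{conf}}_{B},
      \qquad
      F^{\mathrm{harm}}_{X} \;=\; U^{\min}_{X}
                         \;+\; k_{\mathrm{B}} T \sum_{i} \ln \frac{\hbar\,\omega_{i,X}}{k_{\mathrm{B}} T}

    The second expression is the standard classical harmonic-oscillator free energy evaluated at the minimum energy U^min_X of the basin; the quasi-harmonic variant mentioned above replaces this normal-mode step with frequencies estimated from the simulation covariance.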

  2. Simulation analysis of internodal sodium channel function

    NASA Astrophysics Data System (ADS)

    Zeng, Shangyou; Jung, Peter

    2008-12-01

    Although most sodium ion channels clustered in nodes of Ranvier provide the physiological basis for saltatory conduction, sodium ion channels cannot be excluded from internodal regions completely. The density of internodal sodium ion channels is of the order of 10/μm2 . The function of internodal sodium ion channels has been neglected for a long time; however, experimental and theoretical results show that internodal sodium ion channels play an important role in action potential propagation. In this paper, based on the compartment model, we investigate the function of internodal sodium ion channels. We find that internodal sodium ion channels can promote action potential propagation, enlarge the maximal internodal distance guaranteeing stable action potential propagation, and increase the propagation speed of action potentials. In this paper, we find an optimal conductance of internodal sodium ion channels (4-5mS/cm2) , which accords with the active internodal sodium ion conductance in a real myelinated axon. With the optimal conductance, the average sodium ion channel conductance of the axon is minimal, and the metabolic energy consumption due to ion channels is also minimal.

  3. Simulation analysis of internodal sodium channel function.

    PubMed

    Zeng, Shangyou; Jung, Peter

    2008-12-01

    Although most sodium ion channels clustered in nodes of Ranvier provide the physiological basis for saltatory conduction, sodium ion channels cannot be excluded from internodal regions completely. The density of internodal sodium ion channels is of the order of 10/microm2. The function of internodal sodium ion channels has been neglected for a long time; however, experimental and theoretical results show that internodal sodium ion channels play an important role in action potential propagation. In this paper, based on the compartment model, we investigate the function of internodal sodium ion channels. We find that internodal sodium ion channels can promote action potential propagation, enlarge the maximal internodal distance guaranteeing stable action potential propagation, and increase the propagation speed of action potentials. In this paper, we find an optimal conductance of internodal sodium ion channels (4-5 mS/cm2), which accords with the active internodal sodium ion conductance in a real myelinated axon. With the optimal conductance, the average sodium ion channel conductance of the axon is minimal, and the metabolic energy consumption due to ion channels is also minimal. PMID:19256877

  4. Analysis and simulation of a city bus route

    NASA Astrophysics Data System (ADS)

    Kar, Leow Soo

    2014-12-01

    Public transport in crowded cities, in particular bus services, plays an essential role in the mobility of their citizens. However, an efficient, reliable, and safe bus system is still a distant dream for many cities. This paper uses a simulation approach to provide some insight into the factors that contribute to the service quality of a bus system. SAS Simulation Studio is used to model a city bus route. The simulation model consists of a bus depot, bus stops, a terminal station, and the bus route itself. Parameters used in the simulation include the number of buses serving the route, the maximum bus capacity, the inter-departure time of buses, the travel time between stops, and the numbers of passengers boarding and alighting. The simulation is applied to a real bus route in the Kuala Lumpur city center, and a sensitivity analysis is performed to evaluate how the different variables affect the service quality of the bus system.
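
    A toy discrete-event version of such a model, written in plain Python rather than SAS Simulation Studio, is sketched below; the headway, capacity, travel-time, and passenger-count distributions are invented only to illustrate the inputs listed above.

      import random
      random.seed(5)

      N_STOPS, CAPACITY, HEADWAY_MIN, N_BUSES = 12, 80, 10, 6

      def run_bus(departure_min):
          """Advance one bus along the route; return trip time and passengers left behind."""
          t, load, left_behind = departure_min, 0, 0
          for stop in range(N_STOPS):
              t += random.uniform(3, 7)                    # travel time to next stop (min)
              load -= random.randint(0, min(load, 15))     # alighting passengers
              waiting = random.randint(0, 25)              # passengers waiting at the stop
              boarded = min(waiting, CAPACITY - load)
              load += boarded
              left_behind += waiting - boarded
              t += 0.05 * boarded + 0.5                    # dwell time
          return t - departure_min, left_behind

      trips = [run_bus(i * HEADWAY_MIN) for i in range(N_BUSES)]
      print("mean trip time (min): %.1f" % (sum(t for t, _ in trips) / N_BUSES))
      print("passengers left behind:", sum(l for _, l in trips))

    A sensitivity analysis like the one described above would rerun this loop while varying the headway, fleet size, or capacity and compare the resulting wait times and passengers left behind.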

  5. Dual Energy Method for Breast Imaging: A Simulation Study

    PubMed Central

    Koukou, V.; Martini, N.; Michail, C.; Sotiropoulou, P.; Fountzoula, C.; Kalyvas, N.; Kandarakis, I.; Nikiforidis, G.; Fountos, G.

    2015-01-01

    Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast to noise ratio (CNRtc) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging with mean glandular dose values within acceptable levels. PMID:26246848
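
    For illustration only, the sketch below applies a generic weighted log subtraction and contrast-to-noise calculation to synthetic low/high-energy images; the weighting scheme, attenuation values and noise model are textbook-style assumptions, not the specific analytical model of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic low/high-energy transmission images: a smoothly varying "tissue"
    # background plus a small calcification. Attenuation values, thickness and
    # photon counts are arbitrary illustrative numbers, not measured coefficients.
    shape = (256, 256)
    mu_tissue_low, mu_tissue_high = 0.80, 0.45   # background attenuation (1/cm)
    mu_calc_low, mu_calc_high = 1.60, 1.20       # calcification attenuation (1/cm)

    thickness = 4.0 + 0.5 * np.sin(np.linspace(0, 3 * np.pi, shape[1]))[None, :]
    calc = np.zeros(shape)
    calc[120:136, 120:136] = 0.05                # calcification "thickness" (cm)

    def transmission(mu_t, mu_c, n_photons=5e4):
        expected = n_photons * np.exp(-(mu_t * thickness + mu_c * calc))
        return rng.poisson(expected).astype(float)   # quantum (Poisson) noise

    I_low = transmission(mu_tissue_low, mu_calc_low)
    I_high = transmission(mu_tissue_high, mu_calc_high)

    # Weighted log subtraction: the weight w cancels the tissue-thickness signal,
    # leaving mainly the calcification contrast in the subtracted image.
    w = mu_tissue_high / mu_tissue_low
    dual = w * np.log(I_low) - np.log(I_high)

    calc_mask = calc > 0
    cnr = abs(dual[calc_mask].mean() - dual[~calc_mask].mean()) / dual[~calc_mask].std()
    print("CNR of the dual-energy subtracted image: %.2f" % cnr)
    ```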

  6. Simulating Environmental Changes Due to Hydrokinetic Energy Installations

    NASA Astrophysics Data System (ADS)

    James, S. C.; Jones, C. A.; Roberts, J. D.

    2010-12-01

    Marine and hydrokinetic (MHK) power projects will extract energy from ocean currents and tides, thereby altering water velocities and currents in the site’s waterway. These hydrodynamic changes can potentially affect the ecosystem, both near the MHK installation and in surrounding (i.e., far field) regions. In both marine and freshwater environments, devices will remove energy (momentum) from the system, potentially altering water quality and sediment dynamics. In estuaries, tidal ranges and residence times could change (either increasing or decreasing depending on system flow properties and where the effects are being measured). Effects will be proportional to the number and size of structures installed, with large MHK projects having the greatest potential effects and requiring the most in-depth analyses. This work implements modifications to an existing flow, sediment dynamics, and water-quality code (SNL-EFDC) to qualify, quantify, and visualize the influence of MHK-device momentum/energy extraction at a representative site. New algorithms simulate changes to system fluid dynamics due to removal of momentum and reflect commensurate changes in turbulent kinetic energy and its dissipation rate. A generic model is developed to demonstrate corresponding changes to erosion, sediment dynamics, and water quality. Also, bed-slope effects on sediment erosion and bedload velocity are incorporated to better understand scour potential.
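
    The sketch below shows a generic actuator-disk-style momentum sink of the kind used to represent device energy extraction in a flow model; it is not the SNL-EFDC implementation, and the thrust coefficient, device area and cell size are assumed values.

    ```python
    # Generic actuator-disk-style representation of MHK-device momentum
    # extraction for a single grid cell; thrust coefficient, device area and
    # cell size are illustrative assumptions, not the SNL-EFDC algorithm.
    rho = 1025.0                        # seawater density (kg/m^3)
    C_t = 0.8                           # device thrust coefficient (assumed)
    A_dev = 100.0                       # device projected area (m^2)
    cell_volume = 20.0 * 20.0 * 10.0    # grid-cell volume (m^3)

    def momentum_sink(u):
        """Momentum sink per unit volume (N/m^3) opposing the local velocity u (m/s)."""
        thrust = 0.5 * rho * C_t * A_dev * abs(u) * u    # N, opposes the flow
        return -thrust / cell_volume

    def power_extracted(u, C_p=0.4):
        """Power removed from the flow (W) for an assumed power coefficient C_p."""
        return 0.5 * rho * C_p * A_dev * abs(u) ** 3

    for u in (0.5, 1.0, 1.5, 2.0):
        print("u = %.1f m/s: sink = %8.2f N/m^3, power = %8.1f kW"
              % (u, momentum_sink(u), power_extracted(u) / 1e3))
    ```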

  7. Dual Energy Method for Breast Imaging: A Simulation Study.

    PubMed

    Koukou, V; Martini, N; Michail, C; Sotiropoulou, P; Fountzoula, C; Kalyvas, N; Kandarakis, I; Nikiforidis, G; Fountos, G

    2015-01-01

    Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast to noise ratio (CNRtc) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging with mean glandular dose values within acceptable levels. PMID:26246848

  8. Simulating environmental changes due to marine hydrokinetic energy installations.

    SciTech Connect

    Jones, Craig A.; James, Scott Carlton; Roberts, Jesse Daniel; Seetho, Eddy

    2010-08-01

    Marine hydrokinetic (MHK) projects will extract energy from ocean currents and tides, thereby altering water velocities and currents in the site's waterway. These hydrodynamic changes can potentially affect the ecosystem, both near the MHK installation and in surrounding (i.e., far field) regions. In both marine and freshwater environments, devices will remove energy (momentum) from the system, potentially altering water quality and sediment dynamics. In estuaries, tidal ranges and residence times could change (either increasing or decreasing depending on system flow properties and where the effects are being measured). Effects will be proportional to the number and size of structures installed, with large MHK projects having the greatest potential effects and requiring the most in-depth analyses. This work implements modifications to an existing flow, sediment dynamics, and water-quality code (SNL-EFDC) to qualify, quantify, and visualize the influence of MHK-device momentum/energy extraction at a representative site. New algorithms simulate changes to system fluid dynamics due to removal of momentum and reflect commensurate changes in turbulent kinetic energy and its dissipation rate. A generic model is developed to demonstrate corresponding changes to erosion, sediment dynamics, and water quality. Also, bed-slope effects on sediment erosion and bedload velocity are incorporated to better understand scour potential.

  9. In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units

    SciTech Connect

    Ranjan, Niloo; Sanyal, Jibonananda; New, Joshua Ryan

    2013-08-01

    Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is a part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data covering monthly, daily, hourly, and 15-minute time steps for each annual simulation are produced. A total of 270 TB+ of data has been produced. In this project, the simulation data is statistically analyzed in-situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
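
    A minimal NumPy sketch of the kind of single-pass reduction described above (sum, mean, variance, standard deviation over chunked simulation output); the CUDA/MPI implementation on Titan is not reproduced here, and the chunked data are synthetic.

    ```python
    import numpy as np

    def streaming_stats(chunks):
        """Single-pass sum / mean / variance / standard deviation over data chunks.

        `chunks` is any iterable of 1-D arrays (e.g. one simulation output column
        read file-by-file); only running accumulators are kept in memory.
        """
        n, s, ss = 0, 0.0, 0.0
        for chunk in chunks:
            chunk = np.asarray(chunk, dtype=np.float64)
            n += chunk.size
            s += chunk.sum()
            ss += np.square(chunk).sum()
        mean = s / n
        var = ss / n - mean ** 2          # population variance (may lose precision
                                          # for poorly scaled data; Welford's method
                                          # is the more robust alternative)
        return {"sum": s, "mean": mean, "variance": var, "std": np.sqrt(max(var, 0.0))}

    # Example: fake "simulation output" arriving in chunks.
    rng = np.random.default_rng(42)
    fake_chunks = (rng.normal(loc=21.0, scale=2.5, size=100_000) for _ in range(64))
    print(streaming_stats(fake_chunks))
    ```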

  10. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  11. Analysis of low energy electrons

    NASA Technical Reports Server (NTRS)

    Sharp, R. D.

    1973-01-01

    Simultaneous observations of low energy electrons in the plasma sheet and in the auroral zone were analyzed. Data from the MIT plasma experiment on the OGO-3 satellite and from the Lockheed experiment on the OV1-18 satellite were processed and compared. The OV1-18 carried thirteen magnetic electron spectrometers designed to measure the intensity, angular, and energy distributions of the auroral electrons and protons in the energy range below 50 keV. Two computer programs were developed for reduction of the OV1-18 data. One program computed the various plasma properties at one second intervals as a function of Universal Time and pitch angle; the other program produced survey plots showing the outputs of the various detectors on the satellite as a function of time on a scale of approximately 100 seconds per cm. The OV1-18 data exhibit the high degree of variability associated with substorm controlled phenomena.

  12. Feasibility of generating quantitative composition images in dual energy mammography: a simulation study

    NASA Astrophysics Data System (ADS)

    Lee, Donghoon; Kim, Ye-seul; Choi, Sunghoon; Lee, Haenghwa; Choi, Seungyeon; Kim, Hee-Joung

    2016-03-01

    Breast cancer is one of the most common malignancies in women. For years, mammography has been used as the gold standard for localizing breast cancer, despite its limitation in determining cancer composition. Therefore, the purpose of this simulation study is to confirm the feasibility of obtaining tumor composition using dual energy digital mammography. To generate X-ray sources for dual energy mammography, 26 kVp and 39 kVp voltages were generated for low and high energy beams, respectively. Additionally, the energy subtraction and inverse mapping functions were applied to provide compositional images. The resultant images showed that the breast composition obtained by the inverse mapping function with cubic fitting achieved the highest accuracy and least noise. Furthermore, breast density analysis with cubic fitting showed less than 10% error compared to the true values. In conclusion, this study demonstrated the feasibility of creating individual compositional images and the capability of analyzing breast density effectively.

  13. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
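
    A simplified sketch of the kind of triage such a tool performs: flag failed Monte Carlo cases and rank input variables by how strongly their failed-case distributions separate from the passed-case ones. The variable names, failure rule and screening statistic below are illustrative assumptions, not the NASA tool itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Fake Monte Carlo data set: each row is one dispersed run, each column one
    # input/dispersion variable; names and the failure rule are purely invented.
    n_runs, names = 5000, ["mass_disp", "cg_offset", "wind_gust", "sensor_bias"]
    X = rng.normal(size=(n_runs, len(names)))
    # Pretend failures are driven mostly by large wind gusts plus some noise.
    failed = (X[:, 2] + 0.3 * rng.normal(size=n_runs)) > 1.8

    def rank_drivers(X, failed, names):
        """Rank variables by the separation between failed and passed runs."""
        scores = {}
        for j, name in enumerate(names):
            a, b = X[failed, j], X[~failed, j]
            pooled = np.sqrt(0.5 * (a.var() + b.var()))
            # standardized mean difference as a cheap screening statistic
            scores[name] = abs(a.mean() - b.mean()) / pooled
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print("failure rate: %.1f%%" % (100.0 * failed.mean()))
    for name, score in rank_drivers(X, failed, names):
        print("%-12s separation score %.2f" % (name, score))
    ```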

  15. Delight2 Daylighting Analysis in Energy Plus: Integration and Preliminary User Results

    SciTech Connect

    Carroll, William L.; Hitchcock, Robert J.

    2005-04-26

    DElight is a simulation engine for daylight and electric lighting system analysis in buildings. DElight calculates interior illuminance levels from daylight, and the subsequent contribution required from electric lighting to meet a desired interior illuminance. DElight has been specifically designed to integrate with building thermal simulation tools. This paper updates the DElight capability set, the status of integration into the simulation tool EnergyPlus, and describes a sample analysis of a simple model from the user perspective.

  16. Energy Storage Fuel Cell Vehicle Analysis: Preprint

    SciTech Connect

    Markel, T.; Pesaran, A.; Zolot, M.; Sprik, S.; Tataria, H.; Duong, T.

    2005-04-01

    In recent years, hydrogen fuel cell (FC) vehicle technology has received considerable attention as a strategy to decrease oil consumption and reduce harmful emissions. However, the cost, transient response, and cold performance of FC systems may present significant challenges to widespread adoption of the technology for transportation in the next 15 years. The objectives of this effort were to perform energy storage modeling with fuel cell vehicle simulations to quantify the benefits of hybridization and to identify a process for setting the requirements of ES for hydrogen-powered FC vehicles for U.S. Department of Energy's Energy Storage Program.

  17. Energy Storage Fuel Cell Vehicle Analysis

    SciTech Connect

    Pesaran, A; Markel, T; Zolot, M; Sprik, S; Tataria, H; Duong, T

    2005-08-01

    In recent years, hydrogen fuel cell (FC) vehicle technology has received considerable attention as a strategy to decrease oil consumption and reduce harmful emissions. However, the cost, transient response, and cold performance of FC systems may present significant challenges to widespread adoption of the technology for transportation in the next 15 years. The objectives of this effort were to perform energy storage modeling with fuel cell vehicle simulations to quantify the benefits of hybridization and to identify a process for setting the requirements of ES for hydrogen-powered FC vehicles for U.S. Department of Energy's Energy Storage Program.

  18. Cyber security analysis testbed : combining real, emulation, and simulation.

    SciTech Connect

    Villamarin, Charles H.; Eldridge, John M.; Van Leeuwen, Brian P.; Urias, Vincent E.

    2010-07-01

    Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed novel methods to combine these evaluation platforms into a hybrid testbed that combines real, emulated, and simulated components. The combination of real, emulated, and simulated components enables the analysis of security features and components of a networked information system. When performing cyber security analysis on a system of interest, it is critical to realistically represent the subject security components in high fidelity. In some experiments, the security component may be the actual hardware and software with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single, unified computing platform. This provides an 'experiment-in-a-box' capability. The result is rapidly-produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.

  19. Cold Climate Foundation Retrofit Energy Savings. The Simulated Energy and Experimental Hygrothermal Performance of Cold Climate Foundation Wall Insulation Retrofit Measures -- Phase I, Energy Simulation

    SciTech Connect

    Goldberg, Louise F.; Steigauf, Brianna

    2013-04-01

    A split simulation whole building energy / 3-dimensional earth contact model (termed the BUFETS/EnergyPlus Model or BEM) capable of modeling the full range of foundation systems found in the target retrofit housing stock has been extensively tested. These foundation systems, which include above-grade foundation walls, diabatic floors or slabs, as well as lookout or walkout walls, currently cannot be modeled within BEopt.

  20. Cold Climate Foundation Retrofit Energy Savings: The Simulated Energy and Experimental Hygrothermal Performance of Cold Climate Foundation Wall Insulation Retrofit Measures -- Phase I, Energy Simulation

    SciTech Connect

    Goldberg, L. F.; Steigauf, B.

    2013-04-01

    A split simulation whole building energy/3-dimensional earth contact model (termed the BUFETS/EnergyPlus Model or BEM) capable of modeling the full range of foundation systems found in the target retrofit housing stock has been extensively tested. These foundation systems, which include above-grade foundation walls, diabatic floors or slabs, as well as lookout or walkout walls, currently cannot be modeled within BEopt.

  1. Impact of the U.S. National Building Information Model Standard (NBIMS) on Building Energy Performance Simulation

    SciTech Connect

    Bazjanac, Vladimir

    2007-08-01

    The U.S. National Institute for Building Sciences (NIBS) started the development of the National Building Information Model Standard (NBIMS). Its goal is to define standard sets of data required to describe any given building in necessary detail so that any given AECO industry discipline application can find needed data at any point in the building lifecycle. This will include all data that are used in or are pertinent to building energy performance simulation and analysis. This paper describes the background that led to the development of NBIMS, its goals and development methodology, its Part 1 (Version 1.0), and its probable impact on building energy performance simulation and analysis.

  2. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

    The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation, and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized to both build better and more accurate climate models and broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.

  3. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    NASA Astrophysics Data System (ADS)

    Cheng, Tian

    Venetian blinds are popularly used in buildings to control the amount of incoming daylight, improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems can result in significant energy savings in both lighting and cooling. However, no convenient computer tool currently allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for this purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the first two tools give unacceptable accuracy due to the unrealistic assumptions they adopt, while the last may generate large errors under certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient, and it is particularly unsuitable for optimal design of a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulation and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to simulate the daylighting behavior of venetian blinds. Indoor illuminance at any reference point can be computed directly and efficiently. The models have been validated against both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and their accuracy is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus. Two new methods are developed for the thermal simulation of buildings. A
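
    Since the daylighting model above is built on the radiosity method, the sketch below shows a generic radiosity balance B = E + rho * F * B for a handful of surfaces, solved as a linear system; the view factors, reflectances and source term are placeholders, not the model developed in the thesis.

    ```python
    import numpy as np

    # Generic radiosity balance B = E + diag(rho) @ F @ B for a small enclosure,
    # solved as a linear system. F, rho and E are illustrative placeholders.
    F = np.array([            # view factors F[i, j]: fraction of radiation leaving
        [0.0, 0.4, 0.3, 0.3], # surface i that reaches surface j (rows sum to 1 here)
        [0.4, 0.0, 0.3, 0.3],
        [0.3, 0.3, 0.0, 0.4],
        [0.3, 0.3, 0.4, 0.0],
    ])
    rho = np.array([0.7, 0.5, 0.8, 0.2])      # diffuse reflectances
    E = np.array([0.0, 0.0, 0.0, 500.0])      # direct source term (arbitrary units)

    # (I - diag(rho) F) B = E
    B = np.linalg.solve(np.eye(4) - np.diag(rho) @ F, E)
    print("surface radiosities:", np.round(B, 1))
    ```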

  4. Parabolic trough collector power plant performance simulation for an interactive solar energy Atlas of Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ibarra, Mercedes; Frasquet, Miguel; Al Rished, Abdulaziz; Tuomiranta, Arttu; Gasim, Sami; Ghedira, Hosni

    2016-05-01

    The collaboration between the Research Center for Renewable Energy Mapping and Assessment (ReCREMA) at Masdar Institute of Science and Technology and the King Abdullah City for Atomic & Renewable Energy (KACARE) aims to create an interactive web tool integrated in the Renewable Resource Atlas where different solar thermal electricity (STE) utility-scale technologies will be simulated. In this paper, a methodology is presented for sizing and performance simulation of the solar field of parabolic trough collector (PTC) plants. The model is used for a case study analysis of the potential of STE in three sites located in the central, western, and eastern parts of Saudi Arabia. The plant located in the north (Tayma) requires the lowest number of collectors and achieves the best production throughout the year.

  5. A mechanical energy analysis of gait initiation

    NASA Technical Reports Server (NTRS)

    Miller, C. A.; Verstraete, M. C.

    1999-01-01

    The analysis of gait initiation (the transient state between standing and walking) is an important diagnostic tool to study pathologic gait and to evaluate prosthetic devices. While past studies have quantified mechanical energy of the body during steady-state gait, to date no one has computed the mechanical energy of the body during gait initiation. In this study, gait initiation in seven normal male subjects was studied using a mechanical energy analysis to compute total body energy. The data showed three separate states: quiet standing, gait initiation, and steady-state gait. During gait initiation, the trends in the energy data for the individual segments were similar to those seen during steady-state gait (and in Winter DA, Quanbury AO, Reimer GD. Analysis of instantaneous energy of normal gait. J Biomech 1976;9:253-257), but diminished in amplitude. However, these amplitudes increased to those seen in steady-state during the gait initiation event (GIE), with the greatest increase occurring in the second step due to the push-off of the foundation leg. The baseline level of mechanical energy was due to the potential energy of the individual segments, while the cyclic nature of the data was indicative of the kinetic energy of the particular leg in swing phase during that step. The data presented showed differences in energy trends during gait initiation from those of steady state, thereby demonstrating the importance of this event in the study of locomotion.
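
    The total body mechanical energy in this kind of analysis is the sum, over body segments, of gravitational potential, translational kinetic and rotational kinetic energy; the sketch below applies that textbook sum to made-up segment data at a single instant.

    ```python
    G = 9.81  # gravitational acceleration (m/s^2)

    def total_body_energy(segments):
        """Sum of segment potential + translational + rotational kinetic energy (J).

        Each segment is (mass kg, CoM height m, CoM speed m/s,
                         moment of inertia kg*m^2, angular speed rad/s).
        """
        total = 0.0
        for m, h, v, inertia, w in segments:
            total += m * G * h + 0.5 * m * v ** 2 + 0.5 * inertia * w ** 2
        return total

    # Illustrative (made-up) segment states at one instant of a step.
    segments = [
        (32.0, 1.05, 1.3, 1.20, 0.8),   # trunk
        ( 8.0, 0.80, 1.5, 0.15, 2.5),   # thigh
        ( 3.7, 0.45, 1.9, 0.06, 4.0),   # shank
        ( 1.1, 0.10, 2.3, 0.01, 5.0),   # foot
    ]
    print("total body mechanical energy: %.1f J" % total_body_energy(segments))
    ```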

  6. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  7. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Kimberlyn C. Mousseau

    2011-10-01

    The Nuclear Energy Computational Fluid Dynamics Advanced Modeling and Simulation (NE-CAMS) system is being developed at the Idaho National Laboratory (INL) in collaboration with Bettis Laboratory, Sandia National Laboratory (SNL), Argonne National Laboratory (ANL), Utah State University (USU), and other interested parties with the objective of developing and implementing a comprehensive and readily accessible data and information management system for computational fluid dynamics (CFD) verification and validation (V&V) in support of nuclear energy systems design and safety analysis. The two key objectives of the NE-CAMS effort are to identify, collect, assess, store and maintain high resolution and high quality experimental data and related expert knowledge (metadata) for use in CFD V&V assessments specific to the nuclear energy field and to establish a working relationship with the U.S. Nuclear Regulatory Commission (NRC) to develop a CFD V&V database, including benchmark cases, that addresses and supports the associated NRC regulations and policies on the use of CFD analysis. In particular, the NE-CAMS system will support the Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program, which aims to develop and deploy advanced modeling and simulation methods and computational tools for reliable numerical simulation of nuclear reactor systems for design and safety analysis. Primary NE-CAMS elements: there are four primary elements of the NE-CAMS knowledge base designed to support computer modeling and simulation in the nuclear energy arena, as listed below. Element 1: The database will contain experimental data that can be used for CFD validation that is relevant to nuclear reactor and plant processes, particularly those important to the nuclear industry and the NRC. Element 2: Qualification standards for data evaluation and classification will be incorporated and applied such that validation data sets will result in well

  8. Coupled dynamics analysis of wind energy systems

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.

    1977-01-01

    A qualitative description of all key elements of a complete wind energy system computer analysis code is presented. The analysis system addresses the coupled dynamics characteristics of wind energy systems, including the interactions of the rotor, tower, nacelle, power train, control system, and electrical network. The coupled dynamics are analyzed in both the frequency and time domain to provide the basic motions and loads data required for design, performance verification and operations analysis activities. Elements of the coupled analysis code were used to design and analyze candidate rotor articulation concepts. Fundamental results and conclusions derived from these studies are presented.

  9. Energy analysis program. 1995 Annual report

    SciTech Connect

    Levine, M.D.

    1996-05-01

    This year the role of energy technology research and analysis supporting governmental and public interests is again being challenged at high levels of government. This situation is not unlike that of the early 1980s, when the Administration questioned the relevance of a federal commitment to applied energy research, especially for energy efficiency and renewable energy technologies. Then Congress continued to support such activities, deeming them important to the nation's interest. Today, Congress itself is challenging many facets of the federal role in energy. The Administration is also selectively reducing its support, primarily for the pragmatic objective of reducing federal expenditures, rather than because of principles opposing a public role in energy. This report is divided into three sections: International Energy and the global environment; Energy, economics, markets, and policy; and Buildings and their environment.

  10. Aiding Design of Wave Energy Converters via Computational Simulations

    NASA Astrophysics Data System (ADS)

    Jebeli Aqdam, Hejar; Ahmadi, Babak; Raessi, Mehdi; Tootkaboni, Mazdak

    2015-11-01

    With the increasing interest in renewable energy sources, wave energy converters will continue to gain attention as a viable alternative to current electricity production methods. It is therefore crucial to develop computational tools for the design and analysis of wave energy converters. A successful design requires balance between design performance and cost. Here an analytical solution is used for the approximate analysis of interactions between a flap-type wave energy converter (WEC) and waves. The method is verified using other flow solvers and experimental test cases. Then the model is used in conjunction with a powerful heuristic optimization engine, Charged System Search (CSS), to explore the WEC design space. CSS is inspired by the behavior of charged particles. It searches the design space by considering candidate answers as charged particles and moving them based on Coulomb's law of electrostatics and Newton's laws of motion to find the global optimum. Finally, the impacts of changes in different design parameters on the power take-off of the superior WEC designs are investigated. National Science Foundation, CBET-1236462.

  11. A Computer Simulation of the U.S. Energy Crisis, Energy. Teacher Guide. Computer Technology Program Environmental Education Units.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This is the teacher's guide to accompany the student guide which together comprise one of five computer-oriented environmental/energy education units. The computer program, ENERGY, at the base of this unit, simulates the pattern of energy consumption in the United States. The total energy demand is determined by energy use in the various sectors…

  12. Disturbance energy norms: A critical analysis

    NASA Astrophysics Data System (ADS)

    George, K. Joseph; Sujith, R. I.

    2012-03-01

    The question of which norm should be used to characterize the fluctuating disturbance energy in studies on thermoacoustic instabilities remains a topic of continued debate. In this paper we formulate a strategy for developing mathematically consistent norms that do not support spurious growth of disturbance energy in the absence of 'physical' sources of energy. A critical analysis of various disturbance energy norms existing in the literature is conducted, and important conclusions regarding positive definiteness and susceptibility to exhibiting unphysical growth are drawn. It is shown that the energy norm proposed by Cantrell and Hart [Interaction between sound and flow in acoustic cavities: mass, momentum and energy considerations, Journal of the Acoustical Society of America 36 (1964) 697-706] and Myers' first order disturbance energy norm [Transport of energy by disturbances in arbitrary steady flow, Journal of Fluid Mechanics 226 (1991) 383-400] are positive definite if M0 < 1 and M0 < 1/√γ, respectively, where M0 is the magnitude of the local mean flow Mach number, γ is the ratio of specific heats at constant pressure and volume, and the stipulated conditions should be met at every point in the flow domain. Our analysis shows that the disturbance energy norm proposed by Chu [On the energy transfer to small disturbances in fluid flow (part I), Acta Mechanica 1 (1965) 215-234] does not exhibit unphysical growth. It is also shown that this property is not unique to Chu's disturbance energy norm and there exists a family of norms which satisfy this requirement. The analysis also leads to an interesting interpretation of the acoustic energy conservation principle of Cantrell and Hart. The potential held by various disturbance energy norms for exhibiting fictitious growth is quantified using tools from nonmodal stability theory. It is concluded that if the mean flow Mach number is small, Myers' norm is a suitable measure of the disturbance energy.

  13. Geothermal Energy Development in the Eastern United States, Sensitivity analysis-cost of geothermal energy

    SciTech Connect

    Kane, S.M.; Kroll, P.; Nilo, B.

    1982-12-01

    The Geothermal Resources Interactive Temporal Simulation (GRITS) model is a computer code designed to estimate the costs of geothermal energy systems. The interactive program allows the user to vary resource, demand, and financial parameters to observe their effects on delivered costs of direct-use geothermal energy. Due to the large number and interdependent nature of the variables that influence these costs, the variables can be handled practically only through computer modeling. This report documents a sensitivity analysis of the cost of direct-use geothermal energy where each major element is varied to measure the responsiveness of cost to changes in that element. It is hoped that this analysis will assist those persons interested in geothermal energy to understand the most significant cost element as well as those individuals interested in using the GRITS program in the future.

  14. Whole-House Energy Analysis Procedures for Existing Homes: Preprint

    SciTech Connect

    Hendron, R.

    2006-08-01

    This paper describes a proposed set of guidelines for analyzing the energy savings achieved by a package of retrofits or an extensive rehabilitation of an existing home. It also describes certain field test and audit methods that can help establish accurate building system performance characteristics that are needed for a meaningful simulation of whole-house energy use. Several sets of default efficiency values have been developed for older appliances that cannot be easily tested and for which published specifications are not readily available. These proposed analysis procedures are documented more comprehensively in NREL Technical Report TP-550-38238.

  15. NREL's Field Data Repository Supports Accurate Home Energy Analysis (Fact Sheet)

    SciTech Connect

    None, None

    2012-02-01

    This fact sheet discusses NREL's work to develop a repository of research-level residential building characteristics and historical energy use data to support ongoing efforts to improve the accuracy of residential energy analysis tools and the efficiency of energy assessment processes. The objective of this project is to create a robust empirical data source to support the research goals of the Department of Energy's Building America program, which is to improve the efficiency of existing U.S. homes by 30% to 50%. Researchers can use this data source to test the accuracy of building energy simulation software and energy audit procedures, ultimately leading to more credible and less expensive energy analysis.

  16. WRF model performance analysis for a suite of simulation design

    NASA Astrophysics Data System (ADS)

    Mohan, Manju; Sati, Ankur Prabhat

    2016-03-01

    At present, scientists successfully use Numerical Weather Prediction (NWP) models to achieve reliable forecasts. Nested domains with varying grid ratios are preferred by the modelling community because of their wider applicability. The impact of the nesting grid ratio (NGR) on model performance needs systematic analysis and is explored in the present study. WRF is mostly used as a mesoscale model to simulate extreme events or events of short duration, with statistical model evaluation carried out over correspondingly short periods. The influence of the simulation period on model performance has therefore been examined for key meteorological parameters. Earlier work on episodes often involves model runs of longer duration performed as a single continuous simulation. This study scrutinizes the influence on model performance of one single simulation versus several shorter simulations covering the same duration, essentially splitting the run time. The surface wind (i.e., winds at 10 meters), temperature, and relative humidity at 2 meters obtained from the model simulations are compared with observations. The sensitivity to nesting grid ratio, to continuous versus split simulations, and to a realistic simulation period is assessed. It is found that there is no statistically significant difference in the simulated results on changing the nesting grid ratio, while the shorter split schemes (2-day and 4-day schemes, in comparison with 8-day and 16-day continuous runs) improve the results significantly. The impact of an increasing number of observations from different sites on model performance is also scrutinised. Furthermore, a conceptual framework is provided for the optimum simulation period needed to have confidence in statistical model evaluation.
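
    The statistical model evaluation referred to above typically reduces to a few standard verification scores; a minimal sketch comparing a simulated and an observed 10 m wind-speed series follows. The data and the particular score selection are illustrative, not taken from the study.

    ```python
    import numpy as np

    def evaluation_scores(sim, obs):
        """Common verification statistics for paired simulated/observed series."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        bias = np.mean(sim - obs)                       # mean bias
        rmse = np.sqrt(np.mean((sim - obs) ** 2))       # root-mean-square error
        corr = np.corrcoef(sim, obs)[0, 1]              # Pearson correlation
        ioa = 1.0 - np.sum((sim - obs) ** 2) / np.sum(  # Willmott index of agreement
            (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return {"bias": bias, "rmse": rmse, "r": corr, "ioa": ioa}

    # Illustrative hourly 10 m wind speed (m/s): "observations" plus a biased,
    # noisy "model" series standing in for WRF output at one station.
    rng = np.random.default_rng(3)
    obs = 4.0 + 1.5 * np.sin(np.linspace(0, 8 * np.pi, 384)) + rng.normal(0, 0.5, 384)
    sim = obs * 1.1 + 0.3 + rng.normal(0, 0.8, 384)
    print({k: round(v, 3) for k, v in evaluation_scores(sim, obs).items()})
    ```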

  17. Analysis of the Space Shuttle main engine simulation

    NASA Technical Reports Server (NTRS)

    Deabreu-Garcia, J. Alex; Welch, John T.

    1993-01-01

    This is a final report on an analysis of the Space Shuttle Main Engine Program, a digital simulator code written in Fortran. The research was undertaken in ultimate support of future design studies of a shuttle life-extending Intelligent Control System (ICS). These studies are to be conducted by the NASA Lewis Research Center. The primary purpose of the analysis was to define the means to achieve a faster running simulation, and to determine if additional hardware would be necessary for speeding up simulations for the ICS project. In particular, the analysis was to consider the use of custom integrators based on the Matrix Stability Region Placement (MSRP) method. In addition to speed of execution, other qualities of the software were to be examined. Among these are the accuracy of computations, the useability of the simulation system, and the maintainability of the program and data files. Accuracy involves control of truncation error of the methods, and roundoff error induced by floating point operations. It also involves the requirement that the user be fully aware of the model that the simulator is implementing.

  18. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  19. Multifractal analysis and simulation of multifractal random walks

    NASA Astrophysics Data System (ADS)

    Schmitt, Francois G.; Huang, Yongxiang

    2016-04-01

    Multifractal time series, characterized by a scale invariance and large fluctuations at all scales, are found in many fields of natural and applied sciences. They are found, for example, in many geophysical fields, such as atmospheric and oceanic turbulence, hydrology, and the earth sciences. Here we consider a quite general type of multifractal time series, called the multifractal random walk: a non-stationary stochastic process with intermittent stationary increments. We first quickly recall how such time series can be analyzed and characterized, using structure functions and arbitrary-order Hilbert spectral analysis. We then discuss the simulation approach. The main objective is to provide a stochastic process generating time series with the same multiscale properties. We review recent works on this topic, and provide stochastic simulations in order to verify the theoretical predictions. In the lognormal framework we provide an h-μ plane expressing the scale-invariant properties of these simulations. The theoretical plane is compared to simulation results.
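
    A brief sketch of the structure-function analysis mentioned above: estimate S_q(tau) = <|x(t+tau) - x(t)|^q> over a range of lags and fit the scaling exponents zeta(q). Plain Brownian motion is used as a stand-in test signal, since generating a genuine multifractal random walk is beyond this snippet.

    ```python
    import numpy as np

    def structure_functions(x, lags, q_values):
        """S_q(tau) = <|x(t+tau) - x(t)|^q> for each lag tau and order q."""
        S = np.empty((len(q_values), len(lags)))
        for j, tau in enumerate(lags):
            incr = np.abs(x[tau:] - x[:-tau])
            for i, q in enumerate(q_values):
                S[i, j] = np.mean(incr ** q)
        return S

    def scaling_exponents(S, lags):
        """Fit S_q(tau) ~ tau**zeta(q) in log-log coordinates."""
        return np.array([np.polyfit(np.log(lags), np.log(row), 1)[0] for row in S])

    # Test signal: Brownian motion (monofractal, zeta(q) = q/2), standing in for
    # a genuine multifractal random walk generator.
    rng = np.random.default_rng(11)
    x = np.cumsum(rng.normal(size=2 ** 18))

    lags = np.unique(np.logspace(0, 3.5, 25).astype(int))
    q_values = np.array([1.0, 2.0, 3.0, 4.0])
    zeta = scaling_exponents(structure_functions(x, lags, q_values), lags)
    for q, z in zip(q_values, zeta):
        print("q = %.0f: zeta(q) = %.2f (Brownian expectation %.2f)" % (q, z, q / 2))
    ```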

  20. Analysis of utilization of desert habitats with dynamic simulation

    USGS Publications Warehouse

    Williams, B.K.

    1986-01-01

    The effects of climate and herbivores on cool desert shrubs in north-western Utah were investigated with a dynamic simulation model. Cool desert shrublands are extensively managed as grazing lands, and are defoliated annually by domestic livestock. A primary production model was used to simulate harvest yields and shrub responses under a variety of climatic regimes and defoliation patterns. The model consists of six plant components, and it is based on equations of growth analysis. Plant responses were simulated under various combinations of 20 annual weather patterns and 14 defoliation strategies. Results of the simulations exhibit some unexpected linearities in model behavior, and emphasize the importance of both the pattern of climate and the level of plant vigor in determining optimal harvest strategies. Model behaviors are interpreted in terms of shrub morphology, physiology and ecology.

  1. Gyrokinetic theory and simulation of turbulent energy exchange

    SciTech Connect

    Waltz, R. E.; Staebler, G. M.

    2008-01-15

    A previous gyrokinetic theory of turbulent heating [F. L. Hinton and R. E. Waltz, Phys. Plasma 13, 102301 (2006)] is simplified and extended to show that the local radial average of terms in the gyrokinetic turbulent heating (which survive in the drift kinetic limit) are actually closer to a turbulent energy exchange between electrons and ions. The integrated flow for the local exchange is simulated with the GYRO [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] delta-f gyrokinetic code and found to be small in a well studied DIII-D [M. A. Mahdavi and J. L. Luxon, ''DIII-D Tokamak Special Issue'' Fusion Sci. Technol. 48, 2 (2005)] L-mode discharge.

  2. Simulations of ultra-high-energy cosmic rays propagation

    SciTech Connect

    Kalashev, O. E.; Kido, E.

    2015-05-15

    We compare two techniques for simulation of the propagation of ultra-high-energy cosmic rays (UHECR) in intergalactic space: the Monte Carlo approach and a method based on solving transport equations in one dimension. For the former, we adopt the publicly available tool CRPropa and for the latter, we use the code TransportCR, which has been developed by the first author and used in a number of applications, and is made available online with publishing this paper. While the CRPropa code is more universal, the transport equation solver has the advantage of a roughly 100 times higher calculation speed. We conclude that the methods give practically identical results for proton or neutron primaries if some accuracy improvements are introduced to the CRPropa code.

  3. Radiation Hydrodynamic Simulations of an Inertial Fusion Energy Reactor Chamber

    NASA Astrophysics Data System (ADS)

    Sacks, Ryan Foster

    Inertial fusion energy reactors present great promise for the future as they are capable of providing baseline power with no carbon footprint. Simulation work regarding the chamber response and first wall insult is carried out using the 1-D BUCKY radiation hydrodynamics code for a variety of differing chamber fills, radii, chamber obstructions and first wall materials. Discussion of the first wall temperature rise, x-ray spectrum incident on the wall, shock timing and maximum overpressure are presented. An additional discussion of the impact of different gas opacities and their effect on overall chamber dynamics, including the formation of two shock fronts, is also presented. This work is performed under collaboration with Lawrence Livermore National Laboratory at the University of Wisconsin-Madison's Fusion Technology Institute.

  4. Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems.

    SciTech Connect

    Burton, David P.; Van Leeuwen, Brian P.; McDonald, Michael James; Onunkwo, Uzoma A.; Tarman, Thomas David; Urias, Vincent E.

    2009-09-01

    This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information system's security, reliability, and resilience against cyber attack. Today's security analyses utilize real systems such as computers, network routers and other network equipment, computer emulations (e.g., virtual machines), and simulation models separately to analyze the interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches to provide integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments to pass network traffic and perform, from the outside, like real networks. This provides higher fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is to rapidly produce large yet relatively low-cost multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

  5. Digital computer simulation of inductor-energy-storage dc-to-dc converters with closed-loop regulators

    NASA Technical Reports Server (NTRS)

    Ohri, A. K.; Owen, H. A.; Wilson, T. G.; Rodriguez, G. E.

    1974-01-01

    The simulation of converter-controller combinations by means of a flexible digital computer program which produces output to a graphic display is discussed. The procedure is an alternative to mathematical analysis of converter systems. The types of computer programming involved in the simulation are described. Schematic diagrams, state equations, and output equations are displayed for four basic forms of inductor-energy-storage dc to dc converters. Mathematical models are developed to show the relationship of the parameters.
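
    As a minimal illustration of the state-equation approach described above, the sketch below integrates the averaged state equations of a boost converter (one basic inductor-energy-storage topology) with forward Euler; the component values and duty cycle are illustrative assumptions, and the original program's graphic display and closed-loop regulator are omitted.

    ```python
    import numpy as np

    # Averaged state equations of a boost converter integrated with forward
    # Euler; component values and the fixed duty cycle are assumed values.
    V_in = 12.0      # input voltage (V)
    L = 100e-6       # inductance (H)
    C = 470e-6       # output capacitance (F)
    R = 10.0         # load resistance (ohm)
    d = 0.5          # duty cycle -> ideal steady-state output V_in / (1 - d) = 24 V

    dt = 1e-6        # time step (s)
    t_end = 0.05     # simulate 50 ms of start-up
    i_L, v_out = 0.0, 0.0
    history = []

    for step in range(int(t_end / dt)):
        di = (V_in - (1.0 - d) * v_out) / L          # inductor current dynamics
        dv = ((1.0 - d) * i_L - v_out / R) / C       # output capacitor dynamics
        i_L += dt * di
        v_out += dt * dv
        history.append(v_out)

    print("output voltage after start-up transient: %.2f V (ideal 24 V)"
          % np.mean(history[-1000:]))
    ```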

  6. Energy landscape of LeuT from molecular simulations

    NASA Astrophysics Data System (ADS)

    Gur, Mert; Zomot, Elia; Cheng, Mary Hongying; Bahar, Ivet

    2015-12-01

    The bacterial sodium-coupled leucine transporter (LeuT) has been broadly used as a structural model for understanding the structure-dynamics-function of mammalian neurotransmitter transporters as well as other solute carriers that share the same fold (LeuT fold), as the first member of the family crystallographically resolved in multiple states: outward-facing open, outward-facing occluded, and inward-facing open. Yet, a complete picture of the energy landscape of (sub)states visited along the LeuT transport cycle has been elusive. In an attempt to visualize the conformational spectrum of LeuT, we performed extensive simulations of LeuT dimer dynamics in the presence of substrate (Ala or Leu) and co-transported Na+ ions, in explicit membrane and water. We used both conventional molecular dynamics (MD) simulations (with Anton supercomputing machine) and a recently introduced method, collective MD, that takes advantage of collective modes of motions predicted by the anisotropic network model. Free energy landscapes constructed based on ˜40 μs trajectories reveal multiple substates occluded to the extracellular (EC) and/or intracellular (IC) media, varying in the levels of exposure of LeuT to EC or IC vestibules. The IC-facing transmembrane (TM) helical segment TM1a shows an opening, albeit to a smaller extent and in a slightly different direction than that observed in the inward-facing open crystal structure. The study provides insights into the spectrum of conformational substates and paths accessible to LeuT and highlights the differences between Ala- and Leu-bound substates.
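
    Free-energy landscapes of the kind described are commonly obtained by Boltzmann inversion of a histogram over collective variables, F = -kB*T*ln P; the sketch below applies that recipe to synthetic two-dimensional coordinate data standing in for the LeuT trajectories, with made-up basin locations.

    ```python
    import numpy as np

    kB_T = 0.593  # kcal/mol at ~298 K

    def free_energy_landscape(x, y, bins=60):
        """Boltzmann inversion of a 2-D histogram: F = -kB*T*ln P, min(F) set to 0."""
        H, xedges, yedges = np.histogram2d(x, y, bins=bins, density=True)
        with np.errstate(divide="ignore"):
            F = -kB_T * np.log(H)
        F -= F[np.isfinite(F)].min()          # shift the global minimum to zero
        return F, xedges, yedges

    # Synthetic "trajectory" of two collective variables (e.g. EC and IC gate
    # distances) sampling two basins; stands in for the ~40 us LeuT trajectories.
    rng = np.random.default_rng(5)
    basin1 = rng.normal([6.0, 10.0], 0.6, size=(80_000, 2))
    basin2 = rng.normal([10.0, 6.0], 0.8, size=(40_000, 2))
    traj = np.vstack([basin1, basin2])

    F, xe, ye = free_energy_landscape(traj[:, 0], traj[:, 1])
    print("free-energy range sampled: 0 to %.1f kcal/mol" % F[np.isfinite(F)].max())
    ```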

  7. Energy landscape of LeuT from molecular simulations.

    PubMed

    Gur, Mert; Zomot, Elia; Cheng, Mary Hongying; Bahar, Ivet

    2015-12-28

    The bacterial sodium-coupled leucine transporter (LeuT) has been broadly used as a structural model for understanding the structure-dynamics-function of mammalian neurotransmitter transporters as well as other solute carriers that share the same fold (LeuT fold), as the first member of the family crystallographically resolved in multiple states: outward-facing open, outward-facing occluded, and inward-facing open. Yet, a complete picture of the energy landscape of (sub)states visited along the LeuT transport cycle has been elusive. In an attempt to visualize the conformational spectrum of LeuT, we performed extensive simulations of LeuT dimer dynamics in the presence of substrate (Ala or Leu) and co-transported Na(+) ions, in explicit membrane and water. We used both conventional molecular dynamics (MD) simulations (with Anton supercomputing machine) and a recently introduced method, collective MD, that takes advantage of collective modes of motions predicted by the anisotropic network model. Free energy landscapes constructed based on ∼40 μs trajectories reveal multiple substates occluded to the extracellular (EC) and/or intracellular (IC) media, varying in the levels of exposure of LeuT to EC or IC vestibules. The IC-facing transmembrane (TM) helical segment TM1a shows an opening, albeit to a smaller extent and in a slightly different direction than that observed in the inward-facing open crystal structure. The study provides insights into the spectrum of conformational substates and paths accessible to LeuT and highlights the differences between Ala- and Leu-bound substates. PMID:26723619

  8. Digital Simulation-Based Training: A Meta-Analysis

    ERIC Educational Resources Information Center

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  9. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact resilient structures are of great interest in many engineering applications varying from civil, land vehicle, aircraft and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate as a generic structure subjected to impact loading for numerical simulation and parametric study. The analysis will be based on dynamic response analysis. Consideration is given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation, and as a parallel scheme, commercial off-the-shelf numerical code is utilized for parametric study, optimization and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties and composite lay-up, among others. Results will be discussed in view of practical applications.

  10. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high frequency random vibration analysis, (statistical energy analysis (SEA) method) is examined. The SEA method accomplishes high frequency prediction of arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and complete program listing are presented.
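
    The heart of the SEA method is a set of linear power-balance equations coupling subsystem energies. As a minimal sketch of that balance (not the program described above), the following snippet solves a hypothetical two-subsystem case; the loss factors, coupling factors, and input power are illustrative assumptions.

```python
import numpy as np

# Hypothetical two-subsystem SEA power balance (illustrative values only).
omega = 2 * np.pi * 1000.0   # band centre frequency, rad/s
eta1, eta2 = 0.01, 0.02      # damping loss factors
eta12, eta21 = 0.003, 0.001  # coupling loss factors (1 -> 2 and 2 -> 1)
P_in = np.array([1.0, 0.0])  # external input power per subsystem, W

# Steady-state balance: omega * A @ E = P_in, where E holds subsystem energies.
A = np.array([[eta1 + eta12, -eta21],
              [-eta12,       eta2 + eta21]])
E = np.linalg.solve(omega * A, P_in)
print("Subsystem vibrational energies [J]:", E)
```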

  11. Cross-impacts analysis development and energy policy analysis applications

    SciTech Connect

    Roop, J.M.; Scheer, R.M.; Stacey, G.S.

    1986-12-01

    The purpose of this report is to describe the cross-impact analysis process and the microcomputer software developed for the Office of Policy, Planning, and Analysis (PPA) of DOE. First introduced in 1968, cross-impact analysis is a technique that produces scenarios of future conditions and possibilities. Cross-impact analysis has several unique attributes that make it a tool worth examining, especially in the current climate, when the outlook for the economy and several of the key energy markets is uncertain. Cross-impact analysis complements the econometric, engineering, systems dynamics, or trend approaches already in use at DOE. Cross-impact analysis produces self-consistent scenarios in the broadest sense and can include interaction between the economy, technology, society, and the environment. Energy policy analyses that couple broad scenarios of the future with detailed forecasting can produce more powerful results than scenario analysis or forecasts can produce alone.

  12. A thermal, thermoelastic, and wear simulation of a high-energy sliding contact problem

    NASA Technical Reports Server (NTRS)

    Kennedy, F. E., Jr.; Ling, F. F.

    1973-01-01

    This paper describes an investigation of the sliding contact problem encountered in high-energy disk brakes. The analysis includes simulation modeling, using the finite element method, of the thermoelastic instabilities that cause transient changes in contact on the friction surface. To include the effect of wear of the concentrated contacts on the friction surface, a wear criterion is proposed that yields predicted disk brake wear rates quite close to experimentally determined values. The thermal analysis shows that the transient temperature distribution in a disk brake can be determined more accurately by this thermomechanical analysis than by a more conventional analysis that assumes constant contact conditions. It is also shown that lower, more desirable temperatures in disk brakes can be attained by increasing the volume, the thermal conductivity, and, especially, the heat capacity of the brake components.

  13. Simulation-Length Requirements in the Loads Analysis of Offshore Floating Wind Turbines: Preprint

    SciTech Connect

    Haid, L.; Stewart, G.; Jonkman, J.; Robertson, A.; Lackner, M.; Matha, D.

    2013-06-01

    The goal of this paper is to examine the appropriate length of a floating offshore wind turbine (FOWT) simulation - a fundamental question that needs to be answered to develop design requirements. To examine this issue, a loads analysis of an example FOWT was performed in FAST with varying simulation lengths. The offshore wind system used was the OC3-Hywind spar buoy, which was developed for use in the International Energy Agency Code Comparison Collaborative Project and supports NREL's offshore 5-megawatt baseline turbine. Realistic metocean data from the National Oceanic and Atmospheric Administration and repeated periodic wind files were used to excite the structure. The results of the analysis clearly show that loads do not increase for longer simulations. With regard to fatigue, a sensitivity analysis shows that the procedure used for counting half cycles is more important than the simulation length itself. Based on these results, neither the simulation length nor the periodic wind files affect response statistics and loads for FOWTs (at least for the spar studied here); this result is in contrast to the offshore oil and gas industry, where running simulations of at least 3 hours in length is common practice.

  14. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  15. Mechanism of pKID/KIX Association Studied by Molecular Dynamics Free Energy Simulations.

    PubMed

    Bomblies, Rainer; Luitz, Manuel P; Zacharias, Martin

    2016-08-25

    The phosphorylated kinase-inducible domain (pKID) associates with the kinase interacting domain (KIX) via a coupled folding and binding mechanism. The pKID domain is intrinsically disordered when unbound and upon phosphorylation at Ser133 binds to the KIX domain adopting a well-defined kinked two-helix structure. In order to identify putative hot spot residues of binding that could serve as an initial stable anchor, we performed in silico alanine scanning free energy simulations. The simulations indicate that charged residues including the phosphorylated central Ser133 of pKID make significant contributions to binding. However, these are of slightly smaller magnitude compared to several hydrophobic side chains not defining a single dominant binding hot spot. Both continuous molecular dynamics (MD) simulations and free energy analysis demonstrate that phosphorylation significantly stabilizes the central kinked motif around Ser133 of pKID and shifts the conformational equilibrium toward the bound conformation already in the absence of KIX. This result supports a view that pKID/KIX association follows in part a conformational selection process. During a 1.5 μs explicit solvent MD simulation, folding of pKID on the surface of KIX was observed after an initial contact at the bound position of the phosphorylation site was enforced following a sequential process of αA helix association and a stepwise association and folding of the second αB helix compatible with available experimental results. PMID:27054660

  16. Energy consumption program: A computer model simulating energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.

    1978-01-01

    The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. The accuracy of the computations is not sacrificed, however, since the results lie within a ±10 percent margin of those read from energy meters. The program is carefully structured to reduce both the user's time and running cost by requiring minimal information from the user and by reducing many internal time-consuming computational loops. Many unique features not found in any other program were added to handle two-level electronics control rooms.

  17. Dispersion analysis and linear error analysis capabilities of the space vehicle dynamics simulation program

    NASA Technical Reports Server (NTRS)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    Previous error analyses conducted by the Guidance and Dynamics Branch of NASA have used the Guidance Analysis Program (GAP) as the trajectory simulation tool. Plans are made to conduct all future error analyses using the Space Vehicle Dynamics Simulation (SVDS) program. A study was conducted to compare the inertial measurement unit (IMU) error simulations of the two programs. Results of the GAP/SVDS comparison are presented and problem areas encountered while attempting to simulate IMU errors, vehicle performance uncertainties and environmental uncertainties using SVDS are defined. An evaluation of the SVDS linear error analysis capability is also included.

  18. Covariance analysis of symmetry energy observables from heavy ion collision

    NASA Astrophysics Data System (ADS)

    Zhang, Yingxun; Tsang, M. B.; Li, Zhuxia

    2015-10-01

    Using covariance analysis, we quantify the correlations between the interaction parameters in a transport model and the observables commonly used to extract information on the Equation of State of Asymmetric Nuclear Matter in experiments. By simulating 124Sn + 124Sn, 124Sn + 112Sn, and 112Sn + 112Sn reactions at beam energies of 50 and 120 MeV per nucleon, we have identified that the nucleon effective mass splitting is most strongly correlated with the yield ratios of high-kinetic-energy neutrons and protons from central collisions, especially at high incident energy. The best observable for determining the slope of the symmetry energy, L, at saturation density is the isospin diffusion observable, even though the correlation is not very strong (∼0.7). A correlation of similar magnitude but opposite sign exists between isospin diffusion and the nucleon isoscalar effective mass. At 120 MeV/u, the effective mass splitting and the isoscalar effective mass also have correlations of opposite sign for the double n / p and isoscaling p / p yield ratios. By combining data and simulations at different beam energies, it should be possible to place constraints on the slope of the symmetry energy (L) and the effective mass splitting with reasonable uncertainties.
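
    As an illustration of the kind of covariance analysis described above, the sketch below computes parameter-observable correlation coefficients from an ensemble of model runs. The parameter draws and the linear response used to generate the "observables" are purely synthetic stand-ins, not transport-model output.

```python
import numpy as np

# Illustrative stand-in: rows are simulation runs, columns are sampled
# interaction parameters (e.g., L, effective-mass splitting) and observables
# (e.g., high-energy n/p yield ratio, isospin diffusion).
rng = np.random.default_rng(0)
params = rng.normal(size=(200, 2))                       # hypothetical parameter draws
response = np.array([[0.7, -0.6],
                     [0.2,  0.1]])                       # invented linear sensitivity
observables = params @ response + 0.1 * rng.normal(size=(200, 2))

# Pearson correlation between each parameter and each observable.
full = np.corrcoef(np.hstack([params, observables]), rowvar=False)
corr = full[:2, 2:]          # parameter rows vs. observable columns
print(corr)
```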

  19. LENS: μLENS Simulations, Analysis, and Results

    NASA Astrophysics Data System (ADS)

    Rasco, Charles

    2013-04-01

    Simulations of the Low-Energy Neutrino Spectrometer prototype, μLENS, have been performed in order to benchmark the first measurements of the μLENS detector at the Kimballton Underground Research Facility (KURF). μLENS is a 6x6x6-cell scintillation lattice filled with a linear alkylbenzene-based scintillator. We have performed simulations of μLENS using the GEANT4 toolkit. We have measured various radioactive sources, LEDs, and the environmental background radiation at KURF using up to 96 PMTs and a simplified data acquisition system of QDCs and TDCs. In this talk we will demonstrate our understanding of the light propagation and compare simulation results with μLENS detector measurements of the various radioactive sources, LEDs, and the environmental background radiation.

  20. Parallel runway requirement analysis study. Volume 2: Simulation manual

    NASA Technical Reports Server (NTRS)

    Ebrahimi, Yaghoob S.; Chun, Ken S.

    1993-01-01

    This document is a user manual for operating the PLAND_BLUNDER (PLB) simulation program. The simulation is based on two aircraft approaching parallel runways independently and using parallel Instrument Landing System (ILS) equipment during Instrument Meteorological Conditions (IMC). If an aircraft should deviate from its assigned localizer course toward the opposite runway, this constitutes a blunder that could endanger the aircraft on the adjacent path. The worst case scenario would be if the blundering aircraft were unable to recover and continued toward the adjacent runway. PLAND_BLUNDER is a Monte Carlo-type simulation that models the events and aircraft positioning during such a blunder situation. The model simulates two aircraft performing parallel ILS approaches using Instrument Flight Rules (IFR) or visual procedures. PLB uses a simple movement model and control law in three dimensions (X, Y, Z). The parameters of the simulation inputs and outputs are defined in this document along with a sample of the statistical analysis. This document is the second volume of a two-volume set. Volume 1 is a description of the application of the PLB to the analysis of close parallel runway operations.

  1. A simulation model for wind energy storage systems. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Chan, Y. K.

    1977-01-01

    A comprehensive computer program was developed for the modeling of wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic). The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler that generates computer models (in FORTRAN) of complex wind-source storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables. The SIMWEST program, as described, runs on UNIVAC 1100 series computers.

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, cloud computing is at a disadvantage relative to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
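
    A minimal discrete-event sketch of the kind of model described above can be written with the SimPy library: service requests arrive at random, queue for a fixed pool of servers, and hold a server for an exponentially distributed service time. The arrival rate, service rate, and capacity below are invented for illustration and are not taken from the study.

```python
import random
import simpy

def request(env, name, servers, service_rate):
    """A single service request: queue for a server, then hold it."""
    arrival = env.now
    with servers.request() as slot:
        yield slot
        wait = env.now - arrival
        yield env.timeout(random.expovariate(service_rate))
        print(f"{name}: waited {wait:.2f}, finished at {env.now:.2f}")

def generator(env, servers, arrival_rate, service_rate):
    """Spawn requests with exponentially distributed inter-arrival times."""
    i = 0
    while True:
        yield env.timeout(random.expovariate(arrival_rate))
        i += 1
        env.process(request(env, f"req{i}", servers, service_rate))

env = simpy.Environment()
servers = simpy.Resource(env, capacity=4)     # hypothetical server pool size
env.process(generator(env, servers, arrival_rate=3.0, service_rate=1.0))
env.run(until=10)
```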

  3. Abundance recovery error analysis using simulated AVIRIS data

    NASA Technical Reports Server (NTRS)

    Stoner, William W.; Harsanyi, Joseph C.; Farrand, William H.; Wong, Jennifer A.

    1992-01-01

    Measurement noise and imperfect atmospheric correction translate directly into errors in the determination of the surficial abundance of materials from imaging spectrometer data. The effects of errors on abundance recovery were investigated previously by Sabol et al. using Monte Carlo simulation methods. The drawback of the Monte Carlo approach is that thousands of trials are needed to develop good statistics on the probable error in abundance recovery. This computational burden invariably limits the number of scenarios of interest that can practically be investigated. A more efficient approach is based on covariance analysis. The covariance analysis approach expresses errors in abundance as a function of noise in the spectral measurements and provides a closed-form result, eliminating the need for multiple trials. Monte Carlo simulation and covariance analysis are used to predict confidence limits for abundance recovery for a scenario modeled as being derived from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS).
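
    A minimal sketch of the covariance approach for a linear mixing model is shown below: for reflectance r = E a + n with noise standard deviation sigma, the least-squares abundance estimate has covariance sigma^2 (E^T E)^{-1}, which a small Monte Carlo run can cross-check. The endmember matrix and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
bands, n_end = 50, 3
E = rng.uniform(0.1, 0.9, size=(bands, n_end))   # hypothetical endmember spectra
a_true = np.array([0.5, 0.3, 0.2])
sigma = 0.01                                     # assumed measurement noise (reflectance)

# Closed-form covariance of the least-squares abundance estimate.
cov_a = sigma**2 * np.linalg.inv(E.T @ E)
print("1-sigma abundance errors (closed form):", np.sqrt(np.diag(cov_a)))

# Monte Carlo cross-check of the same quantity.
trials = 5000
r = (E @ a_true)[:, None] + sigma * rng.normal(size=(bands, trials))
a_hat = np.linalg.lstsq(E, r, rcond=None)[0]     # shape (n_end, trials)
print("1-sigma abundance errors (Monte Carlo):", a_hat.std(axis=1))
```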

  4. Theory, Image Simulation, and Data Analysis of Chemical Release Experiments

    NASA Technical Reports Server (NTRS)

    Wescott, Eugene M.

    1994-01-01

    The final phase of Grant NAG6-1 involved analysis of the physics of chemical releases in the upper atmosphere and analysis of data obtained from previous NASA-sponsored chemical release rocket experiments. Several lines of investigation of past chemical release experiments and computer simulations have been proceeding in parallel. This report summarizes the work performed and the resulting publications. The following topics are addressed: analysis of the 1987 Greenland rocket experiments; calculation of emission rates for barium, strontium, and calcium; the CRIT 1 and 2 experiments (Collisional Ionization Cross Section experiments); image calibration using background stars; rapid ray motions in ionospheric plasma clouds; and the NOONCUSP rocket experiments.

  5. Computer simulation of energy use, greenhouse gas emissions, and process economics of the fluid milk process.

    PubMed

    Tomasula, P M; Yee, W C F; McAloon, A J; Nutter, D W; Bonnaillie, L M

    2013-05-01

    Energy-savings measures have been implemented in fluid milk plants to lower energy costs and the energy-related carbon dioxide (CO2) emissions. Although these measures have resulted in reductions in steam, electricity, compressed air, and refrigeration use of up to 30%, a benchmarking framework is necessary to examine the implementation of process-specific measures that would lower energy use, costs, and CO2 emissions even further. In this study, using information provided by the dairy industry and equipment vendors, a customizable model of the fluid milk process was developed for use in process design software to benchmark the electrical and fuel energy consumption and CO2 emissions of current processes. It may also be used to test the feasibility of new processing concepts to lower energy and CO2 emissions with calculation of new capital and operating costs. The accuracy of the model in predicting total energy usage of the entire fluid milk process and the pasteurization step was validated using available literature and industry energy data. Computer simulation of small (40.0 million L/yr), medium (113.6 million L/yr), and large (227.1 million L/yr) processing plants predicted the carbon footprint of milk, defined as grams of CO2 equivalents (CO2e) per kilogram of packaged milk, to within 5% of the value of 96 g of CO2e/kg of packaged milk obtained in an industry-conducted life cycle assessment and also showed, in agreement with the same study, that plant size had no effect on the carbon footprint of milk but that larger plants were more cost effective in producing milk. Analysis of the pasteurization step showed that increasing the percentage regeneration of the pasteurizer from 90 to 96% would lower its thermal energy use by almost 60% and that implementation of partial homogenization would lower electrical energy use and CO2e emissions of homogenization by 82 and 5.4%, respectively. It was also demonstrated that implementation of steps to lower non
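
    The reported effect of regeneration on pasteurizer thermal energy follows from a simple energy balance: the heater only has to supply the temperature lift that the regeneration section does not recover. The sketch below reproduces the roughly 60% saving quoted above; the inlet and pasteurization temperatures are illustrative assumptions, not values from the study.

```python
# Heater duty per kg of milk is proportional to the temperature lift that the
# regeneration section does NOT recover, i.e. roughly (1 - R) * (T_past - T_in).
T_IN, T_PAST = 4.0, 72.0          # assumed inlet and pasteurization temperatures, deg C

def heater_lift(regen_fraction):
    """Temperature lift the heating section must supply, deg C."""
    return (1.0 - regen_fraction) * (T_PAST - T_IN)

saving = 1.0 - heater_lift(0.96) / heater_lift(0.90)
print(f"Relative thermal energy saving: {saving:.0%}")   # -> 60%
```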

  6. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    SciTech Connect

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-21

    Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the potential development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a liquid metal cooled reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate imbedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  7. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-01

    Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a SNAP derivative reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate imbedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  8. Energy Analysis Program, 1993 annual report

    SciTech Connect

    Not Available

    1994-06-01

    With the new federal Administration in place, we have observed increasing attention in the areas of research and analysis that have been central to our interests. Energy efficiency is once again a very high priority in the national agenda. This increased emphasis on energy efficiency was already apparent prior to the national elections in the contents of the National Energy Policy Act (EPACT), passed by Congress in 1992. The Climate Change Action Plan, released by the White House in October 1993, strengthens the federal government's leadership role in the design and implementation of energy-efficiency programs. Budget submissions show energy efficiency and renewable energy programs among the relatively small number of discretionary programs at the federal level that are expected to receive large percentage increases.

  9. High resolution simulations of energy absorption in dynamically loaded cellular structures

    NASA Astrophysics Data System (ADS)

    Winter, R. E.; Cotton, M.; Harris, E. J.; Eakins, D. E.; McShane, G.

    2016-04-01

    Cellular materials have potential application as absorbers of energy generated by high velocity impact. CTH, a Sandia National Laboratories code that allows very severe strains to be simulated, has been used to perform very high resolution simulations showing the dynamic crushing of a series of two-dimensional, stainless steel metal structures with varying architectures. The structures are positioned to provide a cushion between a solid stainless steel flyer plate, with velocities ranging from 300 to 900 m/s, and an initially stationary stainless steel target. Each of the alternative architectures under consideration was formed by an array of identical cells, each of which had a constant volume and a constant density. The resolution of the simulations was maximised by choosing a configuration in which one-dimensional conditions persisted for the full period over which the specimen densified, a condition which is most readily met by impacting high density specimens at high velocity. It was found that the total plastic flow and, therefore, the irreversible energy dissipated in the fully densified energy absorbing cell, increase (a) as the structure becomes more rodlike and less platelike and (b) as the impact velocity increases. Sequential CTH images of the deformation processes show that the flow of the cell material may be broadly divided into macroscopic flow perpendicular to the compression direction and jetting-type processes (microkinetic flow) which tend to predominate in rod and rodlike configurations and also tend to play an increasing role at increased strain rates. A very simple analysis of a configuration in which a solid flyer impacts a solid target provides a baseline against which to compare and explain features seen in the simulations. The work provides a basis for the development of energy absorbing structures for application in the 200-1000 m/s impact regime.

  10. Rheological Models of Blood: Sensitivity Analysis and Benchmark Simulations

    NASA Astrophysics Data System (ADS)

    Szeliga, Danuta; Macioł, Piotr; Banas, Krzysztof; Kopernik, Magdalena; Pietrzyk, Maciej

    2010-06-01

    Modeling of blood flow with respect to the rheological parameters of the blood is the objective of this paper. A Casson-type equation was selected as the blood model, and the blood flow was analyzed based on the Backward Facing Step benchmark. The simulations were performed using the ADINA-CFD finite element code. Three output parameters were selected, which characterize the accuracy of the flow simulation. Sensitivity analysis of the results with the Morris design method was performed to identify the rheological parameters and model outputs that control the blood flow to a significant extent. The paper is part of the work on identification of the parameters controlling the process of clotting.
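
    For readers unfamiliar with the Casson model referenced above, the sketch below evaluates its apparent viscosity as a function of shear rate; the yield stress and plastic viscosity are illustrative values of the order commonly quoted for blood, not parameters from the paper.

```python
import numpy as np

def casson_apparent_viscosity(shear_rate, tau_y=0.005, mu_p=0.0035):
    """Apparent viscosity [Pa.s] from the Casson relation
    sqrt(tau) = sqrt(tau_y) + sqrt(mu_p * shear_rate),
    with tau_y the yield stress [Pa] and mu_p the plastic viscosity [Pa.s]
    (both illustrative, blood-like values)."""
    tau = (np.sqrt(tau_y) + np.sqrt(mu_p * shear_rate)) ** 2
    return tau / shear_rate

# Shear-thinning behaviour: apparent viscosity drops as shear rate rises.
for gdot in (1.0, 10.0, 100.0):
    print(f"shear rate {gdot:6.1f} 1/s -> {casson_apparent_viscosity(gdot):.4f} Pa.s")
```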

  11. Computer simulation of ion beam analysis of laterally inhomogeneous materials

    NASA Astrophysics Data System (ADS)

    Mayer, M.

    2016-03-01

    The program STRUCTNRA for the simulation of ion beam analysis charged particle spectra from arbitrary two-dimensional distributions of materials is described. The code is validated by comparison to experimental backscattering data from a silicon grating on tantalum at different orientations and incident angles. Simulated spectra for several types of rough thin layers and a chessboard-like arrangement of materials as example for a multi-phase agglomerate material are presented. Ambiguities between back-scattering spectra from two-dimensional and one-dimensional sample structures are discussed.

  12. Synthetic CT: Simulating low dose single and dual energy protocols from a dual energy scan

    SciTech Connect

    Wang, Adam S.; Pelc, Norbert J.

    2011-10-15

    Purpose: The choice of CT protocol can greatly impact patient dose and image quality. Since acquiring multiple scans at different techniques on a given patient is undesirable, the ability to predict image quality changes starting from a high quality exam can be quite useful. While existing methods allow one to generate simulated images of lower exposure (mAs) from an acquired CT exam, the authors present and validate a new method called synthetic CT that can generate realistic images of a patient at arbitrary low dose protocols (kVp, mAs, and filtration) for both single and dual energy scans. Methods: The synthetic CT algorithm is derived by carefully ensuring that the expected signal and noise are accurate for the simulated protocol. The method relies on the observation that the material decomposition from a dual energy CT scan allows the transmission of an arbitrary spectrum to be predicted. It requires an initial dual energy scan of the patient to either synthesize raw projections of a single energy scan or synthesize the material decompositions of a dual energy scan. The initial dual energy scan contributes inherent noise to the synthesized projections that must be accounted for before adding more noise to simulate low dose protocols. Therefore, synthetic CT is subject to the constraint that the synthesized data have noise greater than the inherent noise. The authors experimentally validated the synthetic CT algorithm across a range of protocols using a dual energy scan of an acrylic phantom with solutions of different iodine concentrations. An initial 80/140 kVp dual energy scan of the phantom provided the material decomposition necessary to synthesize images at 100 kVp and at 120 kVp, across a range of mAs values. They compared these synthesized single energy scans of the phantom to actual scans at the same protocols. Furthermore, material decompositions of a 100/120 kVp dual energy scan are synthesized by adding correlated noise to the initial material
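
    The noise constraint described in the abstract can be illustrated with a short sketch: to emulate a noisier (lower-dose) protocol, only the variance in excess of the inherent noise is injected into the synthesized projections. The function below is a schematic stand-in, not the authors' algorithm, and the noise levels are hypothetical.

```python
import numpy as np

def synthesize_low_dose(projections, inherent_sigma, target_sigma, rng=None):
    """Add zero-mean Gaussian noise so the output reaches the target noise level.

    Only the excess variance (target^2 - inherent^2) is injected, reflecting the
    constraint that the synthesized protocol cannot be less noisy than the
    initial scan. All values here are schematic.
    """
    if target_sigma < inherent_sigma:
        raise ValueError("Target protocol cannot be less noisy than the source scan.")
    if rng is None:
        rng = np.random.default_rng()
    extra_sigma = np.sqrt(target_sigma**2 - inherent_sigma**2)
    return projections + rng.normal(scale=extra_sigma, size=projections.shape)

proj = np.zeros((4, 8))                           # stand-in projection data
low_dose = synthesize_low_dose(proj, inherent_sigma=1.0, target_sigma=2.5)
print(low_dose.std())                             # ~sqrt(2.5^2 - 1.0^2) added noise
```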

  13. A virtual laboratory for the simulation of sustainable energy systems in a low energy building: A case study

    NASA Astrophysics Data System (ADS)

    Breen, M.; O’Donovan, A.; Murphy, M. D.; Delaney, F.; Hill, M.; Sullivan, P. D. O.

    2016-03-01

    The aim of this paper was to develop a virtual laboratory simulation platform of the National Building Retrofit Test-bed at the Cork Institute of Technology, Ireland. The building in question is a low-energy retrofit which is provided with electricity by renewable systems including photovoltaics and wind. It can be thought of as a living laboratory, as a number of internal and external building factors are recorded at regular intervals during human occupation. The analysis carried out in this paper demonstrated that, for the period from April to September 2015, the electricity provided by the renewable systems did not consistently match the building’s electricity requirements due to differing load profiles. It was concluded that the use of load shifting techniques may help to increase the percentage of renewable energy utilisation.
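
    The mismatch between renewable generation and building demand noted above can be quantified with a simple interval-by-interval calculation of the fraction of load met directly by on-site generation (no storage or load shifting). The sketch below uses invented 15-minute profiles purely for illustration.

```python
import numpy as np

def direct_renewable_fraction(load_kw, renew_kw):
    """Fraction of building electricity demand met directly by on-site
    generation, interval by interval (no storage or load shifting)."""
    load_kw = np.asarray(load_kw, dtype=float)
    renew_kw = np.asarray(renew_kw, dtype=float)
    used = np.minimum(load_kw, renew_kw)          # generation usable in each interval
    return used.sum() / load_kw.sum()

# Hypothetical 15-minute profiles (kW) for a short slice of a day.
load = [5.0, 6.0, 7.5, 6.5, 5.5]
pv_plus_wind = [0.0, 2.0, 9.0, 4.0, 1.0]
print(f"{direct_renewable_fraction(load, pv_plus_wind):.0%}")
```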

  14. Simulation and analysis of conjunctive use with MODFLOW's farm process

    USGS Publications Warehouse

    Hanson, R.T.; Schmid, W.; Faunt, C.C.; Lockwood, B.

    2010-01-01

    The extension of MODFLOW onto the landscape with the Farm Process (MF-FMP) facilitates fully coupled simulation of the use and movement of water from precipitation, streamflow and runoff, groundwater flow, and consumption by natural and agricultural vegetation throughout the hydrologic system at all times. This allows for more complete analysis of conjunctive use water-resource systems than previously possible with MODFLOW by combining relevant aspects of the landscape with the groundwater and surface water components. This analysis is accomplished using distributed cell-by-cell supply-constrained and demand-driven components across the landscape within "water-balance subregions" comprised of one or more model cells that can represent a single farm, a group of farms, or other hydrologic or geopolitical entities. Simulations of micro-agriculture in the Pajaro Valley and macro-agriculture in the Central Valley are used to demonstrate the utility of MF-FMP. For Pajaro Valley, the simulation of an aquifer storage and recovery system and related coastal water distribution system to supplant coastal pumpage was analyzed subject to climate variations and additional supplemental sources such as local runoff. For the Central Valley, analysis of conjunctive use from different hydrologic settings of northern and southern subregions shows how and when precipitation, surface water, and groundwater are important to conjunctive use. The examples show that through MF-FMP's ability to simulate natural and anthropogenic components of the hydrologic cycle, the distribution and dynamics of supply and demand can be analyzed, understood, and managed. These components of conjunctive use would be difficult to analyze without embedding them in the simulation and are difficult to estimate a priori. Journal compilation © 2010 National Ground Water Association. No claim to original US government works.

  15. Simulation and analysis of conjunctive use with MODFLOW's farm process.

    PubMed

    Hanson, R T; Schmid, W; Faunt, C C; Lockwood, B

    2010-01-01

    The extension of MODFLOW onto the landscape with the Farm Process (MF-FMP) facilitates fully coupled simulation of the use and movement of water from precipitation, streamflow and runoff, groundwater flow, and consumption by natural and agricultural vegetation throughout the hydrologic system at all times. This allows for more complete analysis of conjunctive use water-resource systems than previously possible with MODFLOW by combining relevant aspects of the landscape with the groundwater and surface water components. This analysis is accomplished using distributed cell-by-cell supply-constrained and demand-driven components across the landscape within "water-balance subregions" comprised of one or more model cells that can represent a single farm, a group of farms, or other hydrologic or geopolitical entities. Simulations of micro-agriculture in the Pajaro Valley and macro-agriculture in the Central Valley are used to demonstrate the utility of MF-FMP. For Pajaro Valley, the simulation of an aquifer storage and recovery system and related coastal water distribution system to supplant coastal pumpage was analyzed subject to climate variations and additional supplemental sources such as local runoff. For the Central Valley, analysis of conjunctive use from different hydrologic settings of northern and southern subregions shows how and when precipitation, surface water, and groundwater are important to conjunctive use. The examples show that through MF-FMP's ability to simulate natural and anthropogenic components of the hydrologic cycle, the distribution and dynamics of supply and demand can be analyzed, understood, and managed. These components of conjunctive use would be difficult to analyze without embedding them in the simulation and are difficult to estimate a priori. PMID:20572873

  16. Filtering analysis of a direct numerical simulation of the turbulent Rayleigh-Benard problem

    NASA Technical Reports Server (NTRS)

    Eidson, T. M.; Hussaini, M. Y.; Zang, T. A.

    1990-01-01

    A filtering analysis of a turbulent flow was developed which provides details of the path of the kinetic energy of the flow from its creation via thermal production to its dissipation. A low-pass spatial filter is used to split the velocity and the temperature field into a filtered component (composed mainly of scales larger than a specific size, nominally the filter width) and a fluctuation component (scales smaller than a specific size). Variables derived from these fields can fall into one of the above two ranges or be composed of a mixture of scales dominated by scales near the specific size. The filter is used to split the kinetic energy equation into three equations corresponding to the three scale ranges described above. The data from a direct simulation of the Rayleigh-Benard problem for conditions where the flow is turbulent are used to calculate the individual terms in the three kinetic energy equations. This is done for a range of filter widths. These results are used to study the spatial location and the scale range of the thermal energy production, the cascading of kinetic energy, the diffusion of kinetic energy, and the energy dissipation. These results are used to evaluate two subgrid models typically used in large-eddy simulations of turbulence. Subgrid models attempt to model the energy below the filter width that is removed by a low-pass filter.
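
    The basic operation underlying this filtering analysis, splitting a field into filtered and fluctuation parts with a low-pass spatial filter and comparing their energy content, can be sketched as follows. A sharp spectral filter applied to a synthetic periodic 2-D field is used here purely for illustration; it is not the filter or data of the study.

```python
import numpy as np

def lowpass_filter(field, k_cutoff):
    """Sharp spectral low-pass filter on a periodic 2-D field."""
    n0, n1 = field.shape
    kx = np.fft.fftfreq(n0) * n0                   # integer wavenumbers
    ky = np.fft.fftfreq(n1) * n1
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    mask = np.sqrt(KX**2 + KY**2) <= k_cutoff
    return np.real(np.fft.ifft2(np.fft.fft2(field) * mask))

rng = np.random.default_rng(2)
u = rng.normal(size=(128, 128))        # stand-in for one velocity component
u_bar = lowpass_filter(u, k_cutoff=16) # filtered (large-scale) part
u_prime = u - u_bar                    # fluctuation (small-scale) part

# Energy split between the two scale ranges (cross term vanishes for a sharp filter).
print("resolved fraction of kinetic energy:", np.mean(u_bar**2) / np.mean(u**2))
```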

  17. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    SciTech Connect

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard. However, the reference software has been subjected to validation testing, including comparisons with empirical data.

  18. Review of simulation techniques for aquifer thermal energy storage (ATES)

    SciTech Connect

    Mercer, J.W.; Faust, C.R.; Miller, W.J.; Pearson, F.J. Jr.

    1981-03-01

    The storage of thermal energy in aquifers has recently received considerable attention as a means to conserve and more efficiently use energy supplies. The analysis of aquifer thermal energy storage (ATES) systems will rely on the results from mathematical and geochemical models. Therefore, the state-of-the-art models relevant to ATES were reviewed and evaluated. These models describe important processes active in ATES, including ground-water flow, heat transport (heat flow), solute transport (movement of contaminants), and geochemical reactions. In general, available models of the saturated ground-water environment are adequate to address most concerns associated with ATES; that is, design, operation, and environmental assessment. In those cases where models are not adequate, development should be preceded by efforts to identify significant physical phenomena and relate model parameters to measurable quantities. Model development can then proceed with the expectation of an adequate data base existing for the model's eventual use. Review of model applications to ATES shows that the major emphasis has been on generic sensitivity analysis and site characterization. Assuming that models are applied appropriately, the primary limitation on model calculations is the data base used to construct the model. Numerical transport models are limited by the uncertainty of subsurface data and the lack of long-term historical data for calibration. Geochemical models are limited by the lack of thermodynamic data for the temperature ranges applicable to ATES. Model applications undertaken with data collection activities on ATES sites should provide the most important contributions to the understanding and utilization of ATES. Therefore, the primary conclusion of this review is that model application to field sites in conjunction with data collection activities is essential to the development of this technology.

  19. Energy and environmental analysis of a linear concentrating photovoltaic system

    NASA Astrophysics Data System (ADS)

    Kerzmann, Tony

    The world is facing an imminent energy supply crisis. In order to sustain and increase our energy supply in an environmentally-conscious manner, it is necessary to advance renewable technologies. Despite this urgency, however, it is paramount to consider the larger environmental effects associated with using renewable energy resources. This research is meant to better understand linear concentrating photovoltaics (LCPVs) from an engineering and environmental standpoint. In order to analyze the LCPV system, a simulation and life cycle assessment (LCA) were developed. The LCPV system serves two major purposes: it produces electricity, and waste heat is collected for heating use. There are three parts to the LCPV simulation. The first part simulates the multijunction cell output so as to calculate the temperature-dependent electricity generation. The second part simulates the cell cooling and waste heat recovery system using a model consisting of heat transfer and fluid flow equations. The waste heat recovery in the LCPV system was linked to a hot water storage system, which was also modeled. Coupling the waste heat recovery simulation and the hot water storage system gives an overall integrated system that is useful for system design, optimization, and acts as a stepping stone for future multijunction cell Photovoltaic/Thermal (PV/T) systems. Finally, all of the LCPV system components were coded in Engineering Equation Solver (EES) and were used in an energy analysis under actual weather and solar conditions for the Phoenix, AZ, region. The life cycle assessment for the LCPV system allowed for an environmental analysis of the system where areas of the highest environmental impact were pinpointed. While conducting the LCA research, each component of the system was analyzed from a resource extraction, production, and use standpoint. The collective production processes of each LCPV system component were gathered into a single inventory of materials and energy flows

  20. Analysis of differences between seating positions in simulators and orbiters

    NASA Technical Reports Server (NTRS)

    Mongan, Philip T.

    1993-01-01

    Crew comments indicate that Space Shuttle simulator seats place crewmembers in a position different from that of the actual Orbiter seats. The crew feel that they launch in a different position, and with a different reach and visibility, from that in which they had trained. This study examined three factors in differences between training and flight positions. Key dimensions, which were considered important to spatial orientation, were compared in the Orbiters and simulators. These were dimensions such as seat back to glare shield and seat pan to overhead. The differences between flight and training crew equipment, and how these differences may contribute to the problem were discussed with engineers and technicians responsible for the equipment. Eye position measurements were taken on subjects to assess any differences that could be attributed to different ingress methods in the Orbiters and the simulators. This report presents the data, analysis, and recommendations.

  1. Review of wind simulation methods for horizontal-axis wind turbine analysis

    NASA Astrophysics Data System (ADS)

    Powell, D. C.; Connell, J. R.

    1986-06-01

    This report reviews three reports on simulation of winds for use in wind turbine fatigue analysis. The three reports are presumed to represent the state of the art. The Purdue and Sandia methods simulate correlated wind data at two points rotating as on the rotor of a horizontal-axis wind turbine. The PNL method at present simulates only one point, which rotates either as on a horizontal-axis wind turbine blade or as on a vertical-axis wind turbine blade. The spectra of simulated data are presented from the Sandia and PNL models under comparable input conditions, and the energy calculated in the rotational spikes in the spectra by the two models is compared. Although agreement between the two methods is not impressive at this time, improvement of the Sandia and PNL methods is recommended as the best way to advance the state of the art. Physical deficiencies of the models are cited in the report and technical recommendations are made for improvement. The report also reviews two general methods for simulating single-point data, called the harmonic method and the white noise method. The harmonic method, which is the basis of all three specific methods reviewed, is recommended over the white noise method in simulating winds for wind turbine analysis.
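
    The harmonic method recommended above can be sketched in a few lines: a zero-mean fluctuation series is built as a sum of cosines whose amplitudes follow a target one-sided spectrum and whose phases are random. The spectrum shape below is a placeholder, not one of the models reviewed.

```python
import numpy as np

def harmonic_wind(spectrum, freqs, duration, dt, rng=None):
    """Synthesize a zero-mean wind fluctuation time series by harmonic
    superposition: each component's amplitude comes from the one-sided
    spectrum S(f) and its phase is drawn uniformly at random."""
    if rng is None:
        rng = np.random.default_rng()
    t = np.arange(0.0, duration, dt)
    df = freqs[1] - freqs[0]
    amps = np.sqrt(2.0 * spectrum * df)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
    u = sum(a * np.cos(2.0 * np.pi * f * t + p)
            for a, f, p in zip(amps, freqs, phases))
    return t, u

freqs = np.linspace(0.01, 1.0, 100)                       # Hz
spectrum = 1.0 / (1.0 + (freqs / 0.05) ** (5.0 / 3.0))    # placeholder spectrum shape
t, u = harmonic_wind(spectrum, freqs, duration=600.0, dt=0.5)
print("fluctuation std dev:", u.std())
```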

  2. Magnetic properties and energy-mapping analysis.

    PubMed

    Xiang, Hongjun; Lee, Changhoon; Koo, Hyun-Joo; Gong, Xingao; Whangbo, Myung-Hwan

    2013-01-28

    The magnetic energy levels of a given magnetic solid are closely packed in energy because the interactions between magnetic ions are weak. Thus, in describing its magnetic properties, one needs to generate its magnetic energy spectrum by employing an appropriate spin Hamiltonian. In this review article we discuss how to determine and specify a necessary spin Hamiltonian in terms of first principles electronic structure calculations on the basis of energy-mapping analysis and briefly survey important concepts and phenomena that one encounters in reading the current literature on magnetic solids. Our discussion is given on a qualitative level from the perspective of magnetic energy levels and electronic structures. The spin Hamiltonian appropriate for a magnetic system should be based on its spin lattice, i.e., the repeat pattern of its strong magnetic bonds (strong spin exchange paths), which requires one to evaluate its Heisenberg spin exchanges on the basis of energy-mapping analysis. Other weaker energy terms such as Dzyaloshinskii-Moriya (DM) spin exchange and magnetocrystalline anisotropy energies, which a spin Hamiltonian must include in certain cases, can also be evaluated by performing energy-mapping analysis. We show that the spin orientation of a transition-metal magnetic ion can be easily explained by considering its split d-block levels as unperturbed states with the spin-orbit coupling (SOC) as perturbation, that the DM exchange between adjacent spin sites can become comparable in strength to the Heisenberg spin exchange when the two spin sites are not chemically equivalent, and that the DM interaction between rare-earth and transition-metal cations is governed largely by the magnetic orbitals of the rare-earth cation. PMID:23128376
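
    The energy-mapping analysis discussed above amounts to equating first-principles total energies of a few ordered spin configurations with the corresponding spin-Hamiltonian expressions and solving for the exchange parameters. The sketch below does this for a schematic two-exchange case; the coefficient pattern and the "DFT" energies are hypothetical numbers for illustration only.

```python
import numpy as np

# Energy-mapping for two exchange parameters J1, J2 plus a reference energy E0.
# Each row gives, for one collinear ordered spin configuration, the coefficients
# multiplying (E0, J1, J2) in the spin-Hamiltonian energy expression; the +-1
# pattern schematically encodes parallel vs. antiparallel bond sums.
coeffs = np.array([
    [1.0, +1.0, +1.0],   # ferromagnetic
    [1.0, -1.0, +1.0],   # antiferromagnetic along J1 bonds
    [1.0, +1.0, -1.0],   # antiferromagnetic along J2 bonds
    [1.0, -1.0, -1.0],   # antiferromagnetic along both
])
energies = np.array([-1.20, -1.35, -1.28, -1.40])   # hypothetical total energies (eV)

E0, J1, J2 = np.linalg.lstsq(coeffs, energies, rcond=None)[0]
print(f"J1 = {J1 * 1e3:.1f} meV, J2 = {J2 * 1e3:.1f} meV")
```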

  3. Simulation, design, and analysis of azeotropic distillation operations

    SciTech Connect

    Bossen, B.S.; Joergensen, S.B.; Gani, R. )

    1993-04-01

    The computational tools needed for simulation, design, and analysis of azeotropic distillation operations are described. These tools include simple methods to identify the existence of binary and ternary azeotropes and to classify ternary mixtures as homogeneous or heterogeneous. The tools also include more complex methods to compute the phase diagram (or a heterogeneous liquid boiling surface), predict liquid-vapor phase equilibrium, and/or predict liquid-liquid-vapor phase equilibrium for simulations of batch and continuous distillation column operations. Important new features of these tools are the incorporation of a fast and efficient method for testing phase stability in simulations of distillation operations, the ability to handle a large range of mixtures (including mixtures with supercritical compounds), and the ability to perform computations covering wide ranges of temperature and pressure. On the basis of these tools, simple and consistent design algorithms are developed. The applicability of the design algorithms is verified through process simulation and analysis of the predicted behavior and data from the open literature. Conditions are given for examples illustrating when and how distillation boundaries can possibly be crossed and how multiple steady states can be obtained. Finally, the effect of changes in operating conditions on the dynamic behavior of azeotropic distillation columns and the sensitivity of the design to the prediction of phase equilibria are presented.

  4. Visualization and analysis of eddies in a global ocean simulation

    SciTech Connect

    Williams, Sean J; Hecht, Matthew W; Petersen, Mark; Strelitz, Richard; Maltrud, Mathew E; Ahrens, James P; Hlawitschka, Mario; Hamann, Bernd

    2010-10-15

    Eddies at a scale of approximately one hundred kilometers have been shown to be surprisingly important to understanding large-scale transport of heat and nutrients in the ocean. Due to difficulties in observing the ocean directly, the behavior of eddies below the surface is not very well understood. To fill this gap, we employ a high-resolution simulation of the ocean developed at Los Alamos National Laboratory. Using large-scale parallel visualization and analysis tools, we produce three-dimensional images of ocean eddies, and also generate a census of eddy distribution and shape averaged over multiple simulation time steps, resulting in a world map of eddy characteristics. As expected from observational studies, our census reveals a higher concentration of eddies at the mid-latitudes than the equator. Our analysis further shows that mid-latitude eddies are thicker, within a range of 1000-2000m, while equatorial eddies are less than 100m thick.

  5. Differential maneuvering simulator data reduction and analysis software

    NASA Technical Reports Server (NTRS)

    Beasley, G. P.; Sigman, R. S.

    1972-01-01

    A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.

  6. Simulation and Analysis of Converging Shock Wave Test Problems

    SciTech Connect

    Ramsey, Scott D.; Shashkov, Mikhail J.

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  7. Information Security Analysis Using Game Theory and Simulation

    SciTech Connect

    Schlicher, Bob G; Abercrombie, Robert K

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent-Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to also address previous limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players' actions are always synchronous; moreover, most such models are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  8. Analysis of Energy Saving Impacts of ASHRAE 90.1-2004 for New York

    SciTech Connect

    Gowri, Krishnan; Halverson, Mark A.; Richman, Eric E.

    2007-08-03

    The New York State Energy Research and Development Authority (NYSERDA) and New York State Department of State (DOS) requested the help of DOE’s Building Energy Codes Program (BECP) in estimating the annual building energy savings and cost impacts of adopting ANSI/ASHRAE/IESNA Standard 90.1-2004 (ASHRAE 2004) requirements. This report summarizes the analysis methodology and results of energy simulation in response to that request.

  9. Analysis of the interrelationship of energy, economy, and environment: A model of a sustainable energy future for Korea

    NASA Astrophysics Data System (ADS)

    Boo, Kyung-Jin

    The primary purpose of this dissertation is to provide the groundwork for a sustainable energy future in Korea. For this purpose, a conceptual framework of sustainable energy development was developed to provide a deeper understanding of interrelationships between energy, the economy, and the environment (E3). Based on this theoretical work, an empirical simulation model was developed to investigate the ways in which E3 interact. This dissertation attempts to develop a unified concept of sustainable energy development by surveying multiple efforts to integrate various definitions of sustainability. Sustainable energy development should be built on the basis of three principles: ecological carrying capacity, economic efficiency, and socio-political equity. Ecological carrying capacity delineates the earth's resource constraints as well as its ability to assimilate wastes. Socio-political equity implies an equitable distribution of the benefits and costs of energy consumption and an equitable distribution of environmental burdens. Economic efficiency dictates efficient allocation of scarce resources. The simulation model is composed of three modules: an energy module, an environmental module and an economic module. Because the model is grounded on economic structural behaviorism, the dynamic nature of the current economy is effectively depicted and simulated through manipulating exogenous policy variables. This macro-economic model is used to simulate six major policy intervention scenarios. Major findings from these policy simulations were: (1) carbon taxes are the most effective means of reducing air-pollutant emissions; (2) sustainable energy development can be achieved through reinvestment of carbon taxes into energy efficiency and renewable energy programs; and (3) carbon taxes would increase a nation's welfare if reinvested in relevant areas. The policy simulation model, because it is based on neoclassical economics, has limitations such that it cannot fully

  10. Validation studies of the DOE-2 Building Energy Simulation Program. Final Report

    SciTech Connect

    Sullivan, R.; Winkelmann, F.

    1998-06-01

    This report documents many of the validation studies (Table 1) of the DOE-2 building energy analysis simulation program that have taken place since 1981. Results for several versions of the program are presented, with the most recent study conducted in 1996 on version DOE-2.1E and the earliest study conducted in 1981 on version DOE-1.3. This work is part of an effort related to continued development of DOE-2, particularly in its use as a simulation engine for new specialized versions of the program such as the recently released RESFEN 3.1. RESFEN 3.1 is a program specifically for analyzing the energy performance of windows in residential buildings. The intent in providing the results of these validation studies is to give potential users of the program a high degree of confidence in the calculated results. Validation studies in which calculated simulation data are compared to measured data have been conducted throughout the development of the DOE-2 program. Discrepancies discovered during the course of such work have resulted in improvements in the simulation algorithms. Table 2 provides a listing of additions and modifications that have been made to various versions of the program since version DOE-2.1A. One of the most significant recent changes in the program occurred with version DOE-2.1E. An improved algorithm for calculating the outside surface film coefficient was implemented. In addition, integration of the WINDOW 4 program was accomplished, resulting in improved ability to analyze window energy performance. Validation and verification of a program as sophisticated as DOE-2 must necessarily be limited because of the approximations inherent in the program. For example, the most accurate model of the heat transfer processes in a building would include a three-dimensional analysis. To justify such detailed algorithmic procedures would correspondingly require detailed information describing the building and/or HVAC system and energy plant parameters

  11. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions mainly of distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and the incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends in energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, to the extent included here, can form part of a planning tool for a space power distribution system.
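
    The dependence of transmission energy loss on distribution voltage discussed above follows the familiar I^2R scaling: for a fixed delivered power, line current and therefore resistive loss fall with the square of the voltage. The sketch below illustrates that scaling with invented numbers; it is not DSAS output.

```python
def line_loss_fraction(power_w, voltage_v, line_resistance_ohm):
    """I^2 R line loss as a fraction of delivered power for a simple feeder."""
    current = power_w / voltage_v
    return current**2 * line_resistance_ohm / power_w

# Illustrative comparison of two candidate distribution voltage levels.
for v in (120.0, 440.0):
    print(f"{v:5.0f} V -> loss fraction {line_loss_fraction(25_000.0, v, 0.05):.2%}")
```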

  12. Simulation and Analysis of the Hybrid Operating Mode in ITER

    SciTech Connect

    Kessel, C.E.; Budny, R.V.; Indireshkumar, K.

    2005-09-22

    The hybrid operating mode in ITER is examined with 0D systems analysis and 1.5D discharge scenario simulations using TSC and TRANSP, and its ideal MHD stability is discussed. The hybrid mode has the potential to provide very long pulses and significant neutron fluence if the physics regime can be produced in ITER. This paper reports progress in establishing the physics basis and engineering limitations for the hybrid mode in ITER.

  13. DOE2.1E. Building Energy Consumption Analysis

    SciTech Connect

    Birdsall, B.; Buhl, W.; Ellington, K.; Erdem, E.; Meldem, R.; Winkelmann, F.; Hirsch, J.; Gates, S.

    1993-10-01

    DOE2.1E is a set of programs for the analysis of energy consumption and cost in residential and commercial buildings. Programs are included to enter input data (Building Description Language Processor), to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS subprogram), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating and cooling to the building (SYSTEMS subprogram), to model primary energy conversion equipment that uses fuel or electricity to provide the required heating, cooling, and electricity (PLANT subprogram), to compute the cost of energy and building operation based on utility rate schedules and economic parameters (ECONOMICS subprogram), to display results (Reports subprogram), and to convert raw weather files into DOE-2 compatible weather files (Weather Processor). A library of materials, constructions, and windows is provided.
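    The subprogram chain described above (BDL input, LOADS, SYSTEMS, PLANT, ECONOMICS) is essentially an hourly data pipeline. The following sketch is not DOE-2; it only mimics that data flow with deliberately trivial physics, and every function, parameter, and number in it is a made-up placeholder.

```python
# Schematic of the LOADS -> SYSTEMS -> PLANT -> ECONOMICS data flow; the
# physics is intentionally trivial and all values are hypothetical.
def loads(hour_temp_c, ua_w_per_k=300.0, setpoint_c=21.0):
    return max(0.0, ua_w_per_k * (setpoint_c - hour_temp_c))      # heating load, W

def systems(load_w, distribution_eff=0.9):
    return load_w / distribution_eff                              # heat the plant must supply, W

def plant(delivered_w, boiler_eff=0.8):
    return delivered_w / boiler_eff / 1000.0                      # fuel input, kW

def economics(fuel_kwh, price_per_kwh=0.04):
    return fuel_kwh * price_per_kwh                               # fuel cost, $

outdoor_temps = [2.0, 1.0, 0.0, -1.0, 3.0, 8.0, 12.0, 15.0]       # hypothetical hourly temps, deg C
fuel_kwh = sum(plant(systems(loads(t))) for t in outdoor_temps)   # kW over 1-h steps = kWh
print(f"fuel use {fuel_kwh:.1f} kWh, cost ${economics(fuel_kwh):.2f}")
```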

  14. DOE2.1E-088. Building Energy Consumption Analysis

    SciTech Connect

    Birdsall, B.; Buhl, W.; Ellington, K.; Erdem, E.; Meldem, R.; Winkelmann, F.; Hirsch, J.; Gates, S.

    1993-10-01

    DOE2.1E-088 is a set of programs for the analysis of energy consumption and cost in residential and commercial buildings. Programs are included to enter input data (Building Description Language Processor), to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS subprogram), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating and cooling to the building (SYSTEMS subprogram), to model primary energy conversion equipment that uses fuel or electricity to provide the required heating, cooling, and electricity (PLANT subprogram), to compute the cost of energy and building operation based on utility rate schedules and economic parameters (ECONOMICS subprogram), to display results (Reports subprogram), and to convert raw weather files into DOE-2 compatible weather files (Weather Processor). A library of materials, constructions, and windows is provided.

  15. DOE2.1F. Building Energy Consumption Analysis

    SciTech Connect

    Birdsall, B.; Buhl, W.; Ellington, K.; Erdem, E; Winkleman, F.; Hirsch, J.; Gates, S.

    1993-10-01

    DOE2.1F is a set of programs for the analysis of energy consumption and cost in residential and commercial buildings. Programs are included to enter input data (Building Description Language Processor), to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS subprogram), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating and cooling to the building (SYSTEMS subprogram), to model primary energy conversion equipment that uses fuel or electricity to provide the required heating, cooling, and electricity (PLANT subprogram), to compute the cost of energy and building operation based on utility rate schedules and economic parameters (ECONOMICS subprogram), to display results (Reports subprogram), and to convert raw weather files into DOE-2.1F compatible weather files (Weather Processor). A library of materials, constructions, and windows is provided.

  16. DOE2.1E. Building Energy Consumption Analysis

    SciTech Connect

    Birdsall, B.; Buhl, W.; Ellington, K.; Erdem, E.; Meldem, R; Winkelmann, F.; Hirsch, J.J.; Gates, S.

    1993-10-01

    DOE2.1E is a set of programs for the analysis of energy consumption in residential and commercial buildings. Programs are included to enter input data (Building Description Language Processor), to calculate the heating and cooling loads for each space in a building for each hour of a year (LOADS subprogram), to simulate the operation and response of the equipment and systems that control temperature and humidity and distribute heating and cooling to the building (SYSTEMS subprogram), to model primary energy conversion equipment that uses fuel or electricity to provide the required heating, cooling, and electricity (PLANT subprogram), to compute the cost of energy and building operation based on utility rate schedules and economic parameters (ECONOMICS subprogram), to display results (Reports subprogram), and to convert raw weather files into DOE-2 compatible weather files (Weather Processor). A library of materials, constructions, and windows is provided.

  17. Dynamic simulation of kinematic Stirling engines: Coupled and decoupled analysis

    SciTech Connect

    Fischer, K.; Lemrani, H.; Stouffs, P.

    1995-12-31

    A coupled analysis modelling method for Stirling engines is presented. The main feature of this modelling method is the use of a software package combining the capabilities of a pre-/post-processor with a differential algebraic equations solver. As a result, modelling is merely a matter of linking appropriate objects from a model library, and the resulting tool is very flexible and powerful. Some simulation results are presented and compared with those obtained from a decoupled analysis. It clearly appears that the main imperfection of the model does not come from the modelling process itself but from the incomplete knowledge of the physics behind Stirling engine operation.

  18. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    NASA Technical Reports Server (NTRS)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project took a cost-effective approach to achieving a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  19. Handbook of Scaling Methods in Aquatic Ecology: Measurement, Analysis, Simulation

    NASA Astrophysics Data System (ADS)

    Marrasé, Celia

    2004-03-01

    Researchers in aquatic sciences have long been interested in describing temporal and biological heterogeneities at different observation scales. During the 1970s, scaling studies received a boost from the application of spectral analysis to ecological sciences. Since then, new insights have evolved in parallel with advances in observation technologies and computing power. In particular, during the last two decades, novel theoretical achievements were facilitated by the use of microstructure profilers, the application of mathematical tools derived from fractal and wavelet analyses, and the increase in computing power that allowed more complex simulations. The idea of publishing the Handbook of Scaling Methods in Aquatic Ecology arose out of a special session of the 2001 Aquatic Science Meeting of the American Society of Limnology and Oceanography. The book is timely because it compiles a good amount of the work done in these last two decades. The book comprises three sections: measurements, analysis, and simulation. Each contains some review chapters and a number of more specialized contributions. The contents are multidisciplinary and focus on biological and physical processes and their interactions over a broad range of scales, from micro-layers to ocean basins. The handbook topics include high-resolution observation methodologies, as well as applications of different mathematical tools for analysis and simulation of spatial structures, time variability of physical and biological processes, and individual organism behavior. The scientific background of the authors is highly diverse, ensuring broad interest for the scientific community.
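    Spectral analysis is one of the scaling tools the handbook covers. As a concrete illustration, the short script below estimates the power-spectrum slope of a synthetic transect (a red-noise random walk standing in for a measured series); the signal, grid spacing, and fitting band are all invented for the example.

```python
import numpy as np

# Minimal power-spectrum estimate of the kind used in scaling studies;
# the transect below is synthetic (integrated white noise).
rng = np.random.default_rng(0)
n, dx = 4096, 1.0                            # samples and spacing (arbitrary units)
signal = np.cumsum(rng.standard_normal(n))   # red-noise transect as a stand-in
signal -= signal.mean()

spec = np.abs(np.fft.rfft(signal)) ** 2 / n
freqs = np.fft.rfftfreq(n, d=dx)

# Fit the spectral slope over an intermediate wavenumber band.
band = (freqs > 1e-3) & (freqs < 1e-1)
slope, _ = np.polyfit(np.log(freqs[band]), np.log(spec[band]), 1)
print(f"estimated spectral slope ~ {slope:.2f}")
```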

  20. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
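    The n-factor combinatorial idea mentioned above can be sketched for n = 2 (pairwise coverage): greedily pick random candidate cases that cover the most not-yet-covered parameter-value pairs. The parameter table below is hypothetical and the algorithm is a generic greedy covering heuristic, not the tool's own generator.

```python
import itertools, random

# Greedy pairwise (2-factor) covering sketch over a hypothetical parameter table.
params = {
    "mass_kg":      [400, 450, 500],
    "thrust_level": ["low", "nominal", "high"],
    "sensor_noise": ["off", "on"],
    "wind":         ["calm", "gusty"],
}
names = list(params)
all_pairs = {((a, va), (b, vb))
             for a, b in itertools.combinations(names, 2)
             for va in params[a] for vb in params[b]}

def pairs_of(case):
    return {((a, case[a]), (b, case[b])) for a, b in itertools.combinations(names, 2)}

rng = random.Random(1)
uncovered, suite = set(all_pairs), []
while uncovered:
    # Sample candidate cases and keep the one covering the most new pairs.
    candidates = [{n: rng.choice(params[n]) for n in names} for _ in range(200)]
    best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(f"{len(suite)} cases cover all {len(all_pairs)} parameter-value pairs")
```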

  1. Uncertainty analysis of geothermal energy economics

    NASA Astrophysics Data System (ADS)

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study a stochastic cost model with incorporated dependence structure is defined and compared with the model where random variables are modeled as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables. The dependence between random input variables will be modeled by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, we also compare the levelized costs of natural gas combined cycle and coal-fired power plants with geothermal power plants. The input data used in the model rely on the cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, National Laboratories, California Energy Commission and Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast of the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
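    A minimal version of the copula-based Monte Carlo idea: draw correlated normals, map them to uniforms (a Gaussian copula), push them through assumed marginal distributions for two cost drivers, and accumulate a levelized-cost distribution. The distributions, correlation, and financing numbers below are assumptions for illustration, not the dissertation's inputs.

```python
import numpy as np
from scipy import stats

# Gaussian-copula Monte Carlo for two correlated inputs: capital cost and
# capacity factor.  All numbers are hypothetical.
rng = np.random.default_rng(42)
n = 100_000
rho = -0.4                                     # assumed: costlier fields tend to perform worse
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)                          # copula step: dependent uniform marginals

capex_per_kw = stats.lognorm(s=0.25, scale=4000).ppf(u[:, 0])   # $/kW
cap_factor   = stats.beta(a=9, b=2).ppf(u[:, 1])                # dimensionless

crf = 0.08 * (1 + 0.08) ** 30 / ((1 + 0.08) ** 30 - 1)          # capital recovery factor, 8%, 30 yr
fixed_om = 120.0                                                # $/kW-yr, assumed
lcoe = (capex_per_kw * crf + fixed_om) / (cap_factor * 8760)    # $/kWh
print(f"LCOE mean {lcoe.mean():.3f} $/kWh, 5-95% range "
      f"[{np.percentile(lcoe, 5):.3f}, {np.percentile(lcoe, 95):.3f}]")
```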

  2. The impact of energy pricing policy on Taiwan's economy: A simulation of CGE model

    SciTech Connect

    Bor, Y.J.

    1995-12-31

    The purpose of this paper is to simulate the impacts of energy pricing policy on Taiwan's economy. Based on a CGE model and utilizing empirical data from the 1989 input-output table, energy balance table and the national income report of Taiwan, this paper simulates a single energy price shock, which is a 1 percent increase in one energy commodity, and examines two real cases of energy price adjustment, on February 16 and August 10, 1994 (a decrease of about 3 percent and an increase of about 3 percent in oil and gas prices). The simulation results are then interpreted. Finally, the conclusion and suggestions for further research are presented.

  3. Entry, concentration and market efficiency: A simulation of the PJM energy market

    NASA Astrophysics Data System (ADS)

    Harvill, Terry

    The rapid and substantial expansion of the PJM energy market during 2004 and 2005 provides a unique opportunity to test the theory of market concentration and its effect on market efficiency. With ten years of operational experience, the PJM energy market is uniquely suited to test the theories of market concentration and efficiency in a natural experiment. This research tests the hypothesis that, for a given number of generating units in the industry, system marginal price will be a decreasing function of the number of owners or generators controlling the units (i.e., the industry concentration ratio). Market simulations are utilized to assess price-cost markups in the PJM energy market during three distinct periods of expansion: (1) pre-Commonwealth Edison integration, (2) pre-American Electric Power (AEP), Dayton Power and Light (DPL), Duquesne Light (Duquesne), and Dominion Virginia Power (Dominion) integration, and (3) post-AEP, DPL, Duquesne, and Dominion integration. The results of the market simulations for the May 1 to August 31 periods for 2003, 2004, and 2005 indicate that the performance of the market improved with the addition of new market participants in 2004 and 2005. The results of the simulation indicate that the load-weighted Lerner index decreased to -3.70 percent in 2005 from 0.92 percent in 2003. Clearly, the addition of Commonwealth Edison in 2004 significantly increased constraints within the PJM energy market and likely impacted the observed prices in PJM during 2004 due to the lack of a significant link to the other PJM market participants. This deficiency was addressed in 2005 with the addition of American Electric Power. The market simulations also highlight the prevalence of computed negative markups in the simulation results. Many of the off-peak periods in particular are characterized by negative markups where the expected marginal cost exceeds the observed price. Unit commitment constraints are believed to largely account for these
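    The markup statistic used in the study can be illustrated directly: a load-weighted Lerner index averages hourly (price - marginal cost)/price terms with system load as the weight. The handful of hours below are invented, not PJM data.

```python
# Load-weighted Lerner index sketch; the hourly triples are hypothetical.
hours = [
    # (load MW, price $/MWh, estimated marginal cost $/MWh)
    (60000, 45.0, 44.0),
    (75000, 62.0, 58.0),
    (90000, 95.0, 88.0),
    (82000, 70.0, 72.0),   # an off-peak style hour with a negative markup
]

num = sum(load * (p - mc) / p for load, p, mc in hours)
den = sum(load for load, _, _ in hours)
print(f"load-weighted Lerner index = {100 * num / den:.2f}%")
```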

  4. Energy performance analysis of prototype electrochromic windows

    SciTech Connect

    Sullivan, R.; Rubin, M.; Selkowitz, S.

    1996-12-01

    This paper presents the results of a study investigating the energy performance of three newly developed prototype electrochromic devices. The DOE-2.1E energy simulation program was used to analyze the annual cooling, lighting, and total electric energy use and peak demand as a function of window type and size. The authors simulated a prototypical commercial office building module located in the cooling-dominated locations of Phoenix, AZ and Miami, FL. Heating energy use was also studied in the heating-dominated location of Madison, WI. Daylight illuminance was used to control electrochromic state-switching. Two types of window systems were analyzed; i.e., the outer pane electrochromic glazing was combined with either a conventional low-E or a spectrally selective inner pane. The properties of the electrochromic glazings are based on measured data of new prototypes developed as part of a cooperative DOE-industry program. The results show the largest difference in annual electric energy performance between the different window types occurs in Phoenix and is about 6.5 kWh/m² floor area (0.60 kWh/ft²), which can represent a cost of about $0.52/m² ($0.05/ft²) using electricity costing $0.08/kWh. In heating-dominated locations, the electrochromic should be maintained in its bleached state during the heating season to take advantage of beneficial solar heat gain which would reduce the amount of required heating. This also means that the electrochromic window with the largest solar heat gain coefficient is best.

  5. Heat-Energy Analysis for Solar Receivers

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1982-01-01

    The heat-energy analysis program (HEAP) solves general heat-transfer problems, with some specific features that are "custom made" for analyzing solar receivers. It can be utilized not only to predict receiver performance under varying solar flux, ambient temperature and local heat-transfer rates, but also to detect locations of hotspots and metallurgical difficulties and to predict the performance sensitivity of neighboring component parameters.

  6. Analysis of Aurora's Performance Simulation Engine for Three Systems

    SciTech Connect

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar’s solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools 1, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  7. Simulation and template generation for LISA Pathfinder Data Analysis

    NASA Astrophysics Data System (ADS)

    Rais, Boutheina; Grynagier, Adrien; Diaz-Aguiló, Marc; Armano, Michele

    The LISA PathFinder (LPF) mission is a technology demonstration mission which aims at testing a number of critical technical challenges that the future LISA (gravitational wave detection in space) mission will face: LPF can be seen as a complex laboratory experiment in space. It is therefore critical to be able to define which measurements and which actuations will be applied during the scientific part of the mission. The LISA Technology Package (LTP), part of ESA's hardware contribution to LPF, hence underlines the importance of developing an appropriate simulation tool in order to test these strategies before launch and to analyse the dynamical behaviour of the system during the mission. The detailed model of the simulation can be used in an off-line mode for further planning: correct estimation of timeline priorities, risk factors, duty cycles, and data analysis readiness. The LISA Technology Package Data Analysis (LTPDA) team has developed an object-oriented MATLAB toolbox for general data analysis needs. However, to meet the specific needs of the LPF mission, a template generation tool has been developed. It provides a recognizable data pattern, avoiding the risk of missing the model during the mission's analysis. The aim of the template generator tool is to provide tools to analyse the LTP system modelled as a State Space Model (SSM). The SSM class, the subject of this poster, includes these tools within the LTPDA toolbox. It can be used to generate the time-domain response for any given actuation and/or noise, the frequency response using Bode diagrams, and the steady state of the system. It allows the user to project noises onto system outputs to get output spectra for given input noise spectra. This class is sufficiently general to be used with a variety of systems once the SSM of the system is provided in the library. Furthermore, one of the main objectives of the data analysis for LPF (the estimation of different parameters of the system) can be achieved by a new

  8. Production and destruction of eddy kinetic energy in forced submesoscale eddy-resolving simulations

    NASA Astrophysics Data System (ADS)

    Mukherjee, Sonaljit; Ramachandran, Sanjiv; Tandon, Amit; Mahadevan, Amala

    2016-09-01

    We study the production and dissipation of the eddy kinetic energy (EKE) in a submesoscale eddy field forced with downfront winds using the Process Study Ocean Model (PSOM) with a horizontal grid resolution of 0.5 km. We simulate an idealized 100 m deep mixed-layer front initially in geostrophic balance with a jet in a domain that permits eddies within a range of O(1 km-100 km). The vertical eddy viscosities and the dissipation are parameterized using four different subgrid vertical mixing parameterizations: the k-ε, the KPP, and two different constant eddy viscosity and diffusivity profiles with a magnitude of O(10⁻² m² s⁻¹) in the mixed layer. Our study shows that strong vertical eddy viscosities near the surface reduce the parameterized dissipation, whereas strong vertical eddy diffusivities reduce the lateral buoyancy gradients and consequently the rate of restratification by mixed-layer instabilities (MLI). Our simulations show that near the surface, the spatial variability of the dissipation along the periphery of the eddies depends on the relative alignment of the ageostrophic and geostrophic shear. Analysis of the resolved EKE budgets in the frontal region from the simulations shows important similarities between the vertical structure of the EKE budget produced by the k-ε and KPP parameterizations, and earlier LES studies. Such an agreement is absent in the simulations using constant eddy-viscosity parameterizations.

  9. Building America Top Innovations 2012: Building Energy Optimization Analysis Method (BEopt)

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes the DOE-sponsored BEopt software, which ensures a consistent analysis platform and accurate simulations. Many BEopt algorithms have been adopted by private-sector HERS software tools that have helped improve the energy efficiency of tens of thousands of ENERGY STAR-certified homes.

  10. MC 93 - Proceedings of the International Conference on Monte Carlo Simulation in High Energy and Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Dragovitsch, Peter; Linn, Stephan L.; Burbank, Mimi

    1994-01-01

    The Table of Contents for the book is as follows: * Preface * Heavy Fragment Production for Hadronic Cascade Codes * Monte Carlo Simulations of Space Radiation Environments * Merging Parton Showers with Higher Order QCD Monte Carlos * An Order-αs Two-Photon Background Study for the Intermediate Mass Higgs Boson * GEANT Simulation of Hall C Detector at CEBAF * Monte Carlo Simulations in Radioecology: Chernobyl Experience * UNIMOD2: Monte Carlo Code for Simulation of High Energy Physics Experiments; Some Special Features * Geometrical Efficiency Analysis for the Gamma-Neutron and Gamma-Proton Reactions * GISMO: An Object-Oriented Approach to Particle Transport and Detector Modeling * Role of MPP Granularity in Optimizing Monte Carlo Programming * Status and Future Trends of the GEANT System * The Binary Sectioning Geometry for Monte Carlo Detector Simulation * A Combined HETC-FLUKA Intranuclear Cascade Event Generator * The HARP Nucleon Polarimeter * Simulation and Data Analysis Software for CLAS * TRAP -- An Optical Ray Tracing Program * Solutions of Inverse and Optimization Problems in High Energy and Nuclear Physics Using Inverse Monte Carlo * FLUKA: Hadronic Benchmarks and Applications * Electron-Photon Transport: Always so Good as We Think? Experience with FLUKA * Simulation of Nuclear Effects in High Energy Hadron-Nucleus Collisions * Monte Carlo Simulations of Medium Energy Detectors at COSY Jülich * Complex-Valued Monte Carlo Method and Path Integrals in the Quantum Theory of Localization in Disordered Systems of Scatterers * Radiation Levels at the SSCL Experimental Halls as Obtained Using the CLOR89 Code System * Overview of Matrix Element Methods in Event Generation * Fast Electromagnetic Showers * GEANT Simulation of the RMC Detector at TRIUMF and Neutrino Beams for KAON * Event Display for the CLAS Detector * Monte Carlo Simulation of High Energy Electrons in Toroidal Geometry * GEANT 3.14 vs. EGS4: A Comparison Using the DØ Uranium/Liquid Argon

  11. Simulation of energy absorption spectrum in NaI crystal detector for multiple gamma energy using Monte Carlo method

    SciTech Connect

    Wirawan, Rahadi; Waris, Abdul; Djamal, Mitra; Handayani, Gunawan

    2015-04-16

    The spectrum of gamma energy absorption in a NaI crystal (scintillation detector) is the result of the interaction of gamma photons with the NaI crystal, and it is associated with the energy of the gamma photons incident on the detector. Through a simulation approach, we can perform an early observation of the gamma energy absorption spectrum in a scintillator crystal detector (NaI) before the experiment is conducted. In this paper, we present the results of a simulation model of the gamma energy absorption spectrum for energies of 100-700 keV (i.e., 297 keV, 400 keV and 662 keV). This simulation was developed based on the concept of a photon beam point source distribution and the photon interaction cross section, using the Monte Carlo method. Our computational code successfully predicts the multi-peak absorption spectrum derived from multiple photon energy sources.
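    A stripped-down version of such a simulation is easy to sketch: for each incident photon, decide between full-energy (photoelectric-like) deposition and a single Compton scatter whose scattered photon escapes, then smear the deposited energy with a Gaussian detector resolution. The interaction probability, the isotropic scattering-angle sampling, and the resolution used below are simplifying assumptions, not the paper's model.

```python
import numpy as np

# Toy Monte Carlo of energy deposited by 662 keV photons in a scintillator.
# Assumptions: fixed photoelectric probability, isotropic (not Klein-Nishina)
# Compton angles, scattered photons always escape, 7% FWHM resolution.
rng = np.random.default_rng(7)
E0, mec2 = 662.0, 511.0                 # incident energy and electron rest mass, keV
n = 200_000
p_photoelectric = 0.4                   # assumed, not a real cross-section ratio

photoelectric = rng.random(n) < p_photoelectric
cos_theta = rng.uniform(-1.0, 1.0, n)
E_scattered = E0 / (1.0 + (E0 / mec2) * (1.0 - cos_theta))   # Compton kinematics
deposited = np.where(photoelectric, E0, E0 - E_scattered)

sigma = 0.07 * E0 / 2.355               # Gaussian smearing for 7% FWHM at 662 keV
measured = deposited + rng.normal(0.0, sigma, n)

counts, edges = np.histogram(measured, bins=256, range=(0.0, 800.0))
print("photopeak bin starts near", edges[counts.argmax()], "keV")
```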

  12. Baroclinic internal wave energy distribution in the Baltic Sea derived from 45 years of circulation simulations

    NASA Astrophysics Data System (ADS)

    Rybin, Artem; Soomere, Tarmo; Kurkina, Oxana; Kurkin, Andrey; Rouvinskaya, Ekaterina; Markus Meier, H. E.

    2016-04-01

    Internal waves and internal tides are an essential component of the functioning of stratified shelf seas. They carry substantial amounts of energy through the water masses, drive key hydrophysical processes such as mixing and overturning, and support the functioning of marine ecosystems in many ways. Their particular impact becomes evident near and at the bottom, where they often create substantial loads on engineering structures and exert a wide range of impacts on the bottom sediments and the evolution of the seabed. We analyse several properties of the spatio-temporal distributions of energy of relatively long-period, large-scale internal wave motions in the Baltic Sea. The analysis is based on numerically simulated pycnocline variations that are extracted from the hydrographic data calculated by the Rossby Centre Ocean circulation model (RCO) for the entire Baltic Sea for 1961-2005. This model has a horizontal resolution of 2 nautical miles and uses 41 vertical layers with a thickness between 3 m close to the surface and 12 m at 250 m depth. The model is forced with atmospheric data derived from the ERA-40 re-analysis using a regional atmosphere model with a horizontal resolution of 25 km. It also accounts for river inflow and water exchange through the Danish Straits. See (Meier, H.E.M., Höglund, A., 2013. Studying the Baltic Sea circulation with Eulerian tracers, in Soomere, T., Quak, E., eds., Preventive Methods for Coastal Protection, Springer, Cham, Heidelberg, 101-130) for a detailed description of the model and its forcing. The resolution of the model output used in this study (once every 6 hours) is sufficient for estimates of spectral amplitudes of the displacements of isopycnal surfaces with a typical period of 2-12 days. We provide the analysis of kinetic and potential energy of motions with these periods. The resulting maps of the maxima of energy and spatial distributions of near-bottom velocities have been evaluated for the entire simulation interval of 45

  13. A simulation model for wind energy storage systems. Volume 2: Operation manual

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.

    1977-01-01

    A comprehensive computer program (SIMWEST) developed for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic) is described. Features of the program include: a precompiler which generates computer models (in FORTRAN) of complex wind source/storage/application systems from user specifications, using the respective library components; a program which provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables; and the capability to evaluate economic feasibility as well as general performance of wind energy systems. The SIMWEST operation manual is presented, and the usage of the SIMWEST program and the design of the library components are described. A number of example simulations intended to familiarize the user with the program's operation are given, along with a listing of each SIMWEST library subroutine.

  14. An expanded system simulation model for solar energy storage (technical report), volume 1

    NASA Technical Reports Server (NTRS)

    Warren, A. W.

    1979-01-01

    The simulation model for wind energy storage (SIMWEST) program now includes wind and/or photovoltaic systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) and is available for the UNIVAC 1100 series and the CDC 6000 series computers. The level of detail is consistent with a role of evaluating the economic feasibility as well as the general performance of wind and/or photovoltaic energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind and/or photovoltaic source/storage/application systems, from user specifications using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables.
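    The source/storage/application idea can be illustrated with a very small energy-balance loop: a wind trace charges a battery when generation exceeds a constant load and discharges it otherwise. The component sizes, efficiency, and wind series below are invented; this is a conceptual sketch, not SIMWEST.

```python
import numpy as np

# Hour-by-hour balance of a wind source, a battery store, and a fixed load.
# All sizes and the wind trace are hypothetical.
rng = np.random.default_rng(3)
hours = 24 * 7
wind_kw = np.clip(30 + 25 * np.sin(np.arange(hours) / 6.0) + rng.normal(0, 8, hours), 0, None)
load_kw = 35.0
cap_kwh, eff = 200.0, 0.85              # battery capacity and charging efficiency (assumed)

soc, unserved = cap_kwh / 2, 0.0
for w in wind_kw:
    surplus = w - load_kw               # kW over a 1-hour step = kWh
    if surplus >= 0:
        soc = min(cap_kwh, soc + surplus * eff)
    else:
        draw = min(-surplus, soc)
        soc -= draw
        unserved += (-surplus) - draw

print(f"unserved energy over the week: {unserved:.1f} kWh")
```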

  15. NREL Evaluates Thermal Performance of Uninsulated Walls to Improve Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-03-01

    NREL researchers discover ways to increase accuracy in building energy simulation tools to improve predictions of potential energy savings in homes. Uninsulated walls are typical in older U.S. homes where the wall cavities were not insulated during construction or where the insulating material has settled. Researchers at the National Renewable Energy Laboratory (NREL) are investigating ways to more accurately calculate heat transfer through building enclosures to verify the benefit of energy efficiency upgrades that reduce energy use in older homes. In this study, scientists used computational fluid dynamics (CFD) analysis to calculate the energy loss/gain through building walls and visualize different heat transfer regimes within the uninsulated cavities. The effects of ambient outdoor temperature, the radiative properties of building materials, insulation levels, and the temperature dependence of conduction through framing members were considered. The research showed that the temperature dependence of conduction through framing members dominated the differences between this study and previous results - an effect not accounted for in existing building energy simulation tools. The study provides correlations for the resistance of the uninsulated assemblies that can be implemented into building simulation tools to increase the accuracy of energy use estimates in older homes, which are currently over-predicted.

  16. Least cost analysis of renewable energy projects

    SciTech Connect

    Cosgrove-Davies, M.; Cabraal, A.

    1994-12-31

    This paper describes the methodology for evaluating dispersed and centralized rural energy options on a least cost basis. In defining the load to be served, each supply alternative must provide equivalent levels of service. The village to be served is defined by the number of loads, load density, distance from the nearest power distribution line, and load growth. Appropriate rural energy alternatives are identified and sized to satisfy the defined load. Lastly, a net present value analysis (including capital, installation, O and M, fuel, and replacement costs, etc.) is performed to identify the least cost option. A spreadsheet-based analytical tool developed by the World Bank's Asia Alternative Energy Unit (ASTAE) incorporates this approach and has been applied to compare photovoltaic solar home systems with other rural energy supply options in Indonesia. Load size and load density are found to be the critical factors in choosing between a grid and off-grid solution.
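    The least-cost screening itself is a present-value comparison of options that deliver equivalent service. The sketch below compares hypothetical PV solar home systems against a hypothetical grid extension; every cost figure and the discount rate are made up for illustration.

```python
# Least-cost screening sketch: present value of two equivalent-service
# options for a small village.  All numbers are hypothetical.
def present_value(capital, annual_cost, years=20, rate=0.10):
    pv = capital
    for t in range(1, years + 1):
        pv += annual_cost / (1 + rate) ** t
    return pv

households = 80
pv_option   = present_value(capital=households * 450.0, annual_cost=households * 25.0)
grid_option = present_value(capital=12000.0 * 6.0 + households * 150.0,   # 6 km of line plus hookups
                            annual_cost=households * 40.0)
best = "PV home systems" if pv_option < grid_option else "grid extension"
print(f"PV ${pv_option:,.0f} vs grid ${grid_option:,.0f} -> least cost: {best}")
```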

  17. Monte Carlo Simulations of Ultra-High Energy Resolution Gamma Detectors for Nuclear Safeguards

    SciTech Connect

    Robles, A; Drury, O B; Friedrich, S

    2009-08-19

    Ultra-high energy resolution superconducting gamma-ray detectors can improve the accuracy of non-destructive analysis for unknown radioactive materials. These detectors offer an order of magnitude improvement in resolution over conventional high purity germanium detectors. The increase in resolution reduces errors from line overlap and allows for the identification of weaker gamma-rays by increasing the magnitude of the peaks above the background. In order to optimize the detector geometry and to understand the spectral response function, Geant4, a Monte Carlo simulation package coded in C++, was used to model the detectors. Using a 1 mm³ Sn absorber and a monochromatic gamma source, different absorber geometries were tested. The simulation was expanded to include the Cu block behind the absorber and four layers of shielding required for detector operation at 0.1 K. The energy spectrum was modeled for an Am-241 and a Cs-137 source, including scattering events in the shielding, and the results were compared to experimental data. For both sources the main spectral features such as the photopeak, the Compton continuum, the escape x-rays and the backscatter peak were identified. Finally, the low energy response of a Pu-239 source was modeled to assess the feasibility of Pu-239 detection in spent fuel. This modeling of superconducting detectors can serve as a guide to optimize the configuration in future spectrometer designs.

  18. Simulation analysis for ion assisted fast ignition using structured targets

    NASA Astrophysics Data System (ADS)

    Sakagami, H.; Johzaki, T.; Sunahara, A.; Nagatomo, H.

    2016-05-01

    The heating efficiency of fast electrons in the fast ignition scheme is estimated to be very low due to their large divergence angle and high energy. To mitigate this problem, a low-density plastic foam, which can generate not only proton (H+) but also carbon (C6+) beams, can be introduced into the currently used cone-guided targets, and additional core heating by ions is expected. According to 2D PIC simulations, it is found that the ion beams also diverge due to the static electric field and concave surface deformation. Thus, structured targets are suggested to optimize the ion beam characteristics, and their improvement and the resulting enhancement of core heating by ion beams are confirmed.

  19. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
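    The core of the simulation step described above is bootstrap resampling of prey signatures followed by mixing with a known diet. The sketch below builds one pseudo-predator signature from synthetic prey libraries; the species names, sample sizes, and diet proportions are placeholders, not the paper's data or its sample-size algorithm.

```python
import numpy as np

# Construct a pseudo-predator fatty acid signature by bootstrap resampling
# synthetic prey signatures and mixing them with a known diet.
rng = np.random.default_rng(11)
n_fa = 10                                           # number of fatty acids
prey_library = {                                    # species -> (n_samples, n_fa) proportions
    sp: rng.dirichlet(np.ones(n_fa), size=40)
    for sp in ("ringed_seal", "bearded_seal", "beluga")
}
true_diet = {"ringed_seal": 0.6, "bearded_seal": 0.3, "beluga": 0.1}
boot_size = 15                                      # bootstrap sample size per species (arbitrary)

mixed = np.zeros(n_fa)
for sp, prop in true_diet.items():
    idx = rng.integers(0, len(prey_library[sp]), size=boot_size)
    mixed += prop * prey_library[sp][idx].mean(axis=0)
pseudo_predator = mixed / mixed.sum()               # renormalize to a signature
print(pseudo_predator.round(3))
```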

  20. Finite element analysis simulations for ultrasonic array NDE inspections

    NASA Astrophysics Data System (ADS)

    Dobson, Jeff; Tweedie, Andrew; Harvey, Gerald; O'Leary, Richard; Mulholland, Anthony; Tant, Katherine; Gachagan, Anthony

    2016-02-01

    Advances in manufacturing techniques and materials have led to an increase in the demand for reliable and robust inspection techniques to maintain safety critical features. The application of modelling methods to develop and evaluate inspections is becoming an essential tool for the NDE community. Current analytical methods are inadequate for simulation of arbitrary components and heterogeneous materials, such as anisotropic welds or composite structures. Finite element analysis software (FEA), such as PZFlex, can provide the ability to simulate the inspection of these arrangements, providing the ability to economically prototype and evaluate improved NDE methods. FEA is often seen as computationally expensive for ultrasound problems; however, advances in computing power have made it a more viable tool. This paper aims to illustrate the capability of appropriate FEA to produce accurate simulations of ultrasonic array inspections - minimizing the requirement for expensive test-piece fabrication. Validation is afforded via corroboration of the FE derived and experimentally generated data sets for a test-block comprising 1D and 2D defects. The modelling approach is extended to consider the more troublesome aspects of heterogeneous materials where defect dimensions can be of the same length scale as the grain structure. The model is used to facilitate the implementation of new ultrasonic array inspection methods for such materials. This is exemplified by considering the simulation of ultrasonic NDE in a weld structure in order to assess new approaches to imaging such structures.

  1. Energy consumption analysis of the Venus Deep Space Station (DSS-13)

    NASA Technical Reports Server (NTRS)

    Hayes, N. V.

    1983-01-01

    This report continues the energy consumption analysis and verification study of the tracking stations of the Goldstone Deep Space Communications Complex, and presents an audit of the Venus Deep Space Station (DSS 13). Due to the non-continuous radioastronomy research and development operations at the station, estimations of energy usage were employed in the energy consumption simulation of both the 9-meter and 26-meter antenna buildings. A 17.9% decrease in station energy consumption was experienced over the 1979-1981 years under study. A comparison of the ECP computer simulations and the station's main watt-hour meter readings showed good agreement.

  2. Database Driven 6-DOF Trajectory Simulation for Debris Transport Analysis

    NASA Technical Reports Server (NTRS)

    West, Jeff

    2008-01-01

    Debris mitigation and risk assessment have been carried out by NASA and its contractors supporting Space Shuttle Return-To-Flight (RTF). As a part of this assessment, analysis of transport potential for debris that may be liberated from the vehicle or from pad facilities prior to tower clear (Lift-Off Debris) is being performed by MSFC. This class of debris includes plume driven and wind driven sources for which lift as well as drag are critical for the determination of the debris trajectory. As a result, NASA MSFC has a need for a debris transport or trajectory simulation that supports the computation of lift effect in addition to drag without the computational expense of fully coupled CFD with 6-DOF. A database driven 6-DOF simulation that uses aerodynamic force and moment coefficients for the debris shape that are interpolated from a database has been developed to meet this need. The design, implementation, and verification of the database driven six degree of freedom (6-DOF) simulation addition to the Lift-Off Debris Transport Analysis (LODTA) software are discussed in this paper.
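    The database-driven idea is that aerodynamic coefficients are looked up and interpolated from a precomputed table at each integration step rather than recomputed with CFD. The planar point-mass sketch below is not the 6-DOF LODTA code; it only shows that lookup-plus-integration pattern, and the coefficient table, debris properties, and fixed attitude are all hypothetical.

```python
import numpy as np

# Table-driven trajectory sketch: interpolate drag and lift coefficients
# versus angle of attack, then integrate a planar point mass with lift
# perpendicular to the velocity.  Everything numeric is made up.
alpha_tab = np.array([0.0, 10.0, 20.0, 30.0, 45.0, 90.0])      # deg
cd_tab    = np.array([0.5, 0.6, 0.8, 1.0, 1.2, 1.5])
cl_tab    = np.array([0.0, 0.3, 0.5, 0.6, 0.4, 0.0])

rho, area, mass, g = 1.225, 0.03, 0.4, 9.81                    # SI units, hypothetical debris
pos = np.array([0.0, 50.0]); vel = np.array([20.0, 0.0]); dt = 0.01
alpha = 15.0                                                   # fixed attitude for the sketch

while pos[1] > 0.0:
    speed = np.linalg.norm(vel)
    q = 0.5 * rho * speed ** 2 * area
    drag = -q * np.interp(alpha, alpha_tab, cd_tab) * vel / speed
    lift = q * np.interp(alpha, alpha_tab, cl_tab) * np.array([-vel[1], vel[0]]) / speed
    acc = (drag + lift) / mass + np.array([0.0, -g])
    vel = vel + acc * dt
    pos = pos + vel * dt

print(f"impact range ~ {pos[0]:.1f} m")
```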

  3. Automated analysis for detecting beams in laser wakefield simulations

    SciTech Connect

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
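    The first stage, locating high-density regions of the particle distribution, can be mimicked with a 2-D histogram and a strict local-maximum test. The synthetic data, bin count, and threshold below are placeholders; the real framework works on particle-in-cell output and adds the lifetime-diagram and clustering analysis on top.

```python
import numpy as np

# Detect dense regions in a synthetic (x, t) particle distribution via
# local maxima of a 2-D histogram.
rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0.2, 0.02, 5000), rng.normal(0.7, 0.03, 8000), rng.uniform(0, 1, 4000)])
t = np.concatenate([rng.normal(0.3, 0.05, 5000), rng.normal(0.6, 0.04, 8000), rng.uniform(0, 1, 4000)])
hist, xe, te = np.histogram2d(x, t, bins=64, range=[[0, 1], [0, 1]])

def local_maxima(h, threshold):
    """Cells denser than all 8 neighbours and above a count threshold."""
    n0, n1 = h.shape
    padded = np.pad(h, 1, constant_values=-np.inf)
    neigh = np.stack([padded[1 + di:1 + di + n0, 1 + dj:1 + dj + n1]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)])
    return np.argwhere((h > neigh.max(axis=0)) & (h > threshold))

for i, j in local_maxima(hist, threshold=30):
    print(f"dense region near x={xe[i]:.2f}, t={te[j]:.2f}, count={int(hist[i, j])}")
```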

  4. Constraint methods that accelerate free-energy simulations of biomolecules.

    PubMed

    Perez, Alberto; MacCallum, Justin L; Coutsias, Evangelos A; Dill, Ken A

    2015-12-28

    Atomistic molecular dynamics simulations of biomolecules are critical for generating narratives about biological mechanisms. The power of atomistic simulations is that these are physics-based methods that satisfy Boltzmann's law, so they can be used to compute populations, dynamics, and mechanisms. But physical simulations are computationally intensive and do not scale well to the sizes of many important biomolecules. One way to speed up physical simulations is by coarse-graining the potential function. Another way is to harness structural knowledge, often by imposing spring-like restraints. But harnessing external knowledge in physical simulations is problematic because knowledge, data, or hunches have errors, noise, and combinatoric uncertainties. Here, we review recent principled methods for imposing restraints to speed up physics-based molecular simulations that promise to scale to larger biomolecules and motions. PMID:26723628

  5. Constraint methods that accelerate free-energy simulations of biomolecules

    SciTech Connect

    Perez, Alberto; MacCallum, Justin L.; Coutsias, Evangelos A.; Dill, Ken A.

    2015-12-28

    Atomistic molecular dynamics simulations of biomolecules are critical for generating narratives about biological mechanisms. The power of atomistic simulations is that these are physics-based methods that satisfy Boltzmann’s law, so they can be used to compute populations, dynamics, and mechanisms. But physical simulations are computationally intensive and do not scale well to the sizes of many important biomolecules. One way to speed up physical simulations is by coarse-graining the potential function. Another way is to harness structural knowledge, often by imposing spring-like restraints. But harnessing external knowledge in physical simulations is problematic because knowledge, data, or hunches have errors, noise, and combinatoric uncertainties. Here, we review recent principled methods for imposing restraints to speed up physics-based molecular simulations that promise to scale to larger biomolecules and motions.

  6. Feature-based statistical analysis of combustion simulation data.

    PubMed

    Bennett, Janine C; Krishnamoorthy, Vaidyanathan; Liu, Shusen; Grout, Ray W; Hawkes, Evatt R; Chen, Jacqueline H; Shepherd, Jason; Pascucci, Valerio; Bremer, Peer-Timo

    2011-12-01

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  7. Building prototypes of damaged systems from analysis simulations

    NASA Astrophysics Data System (ADS)

    Tsai, Cynthia S.; Dolin, Ronald M.; Hefele, Jill

    1997-01-01

    Our rapid prototype of damaged systems project seeks to provide a technology for allowing engineers to build demonstration prototypes of damaged products from analysis post-processing data. Most commercial finite element programs do not have a capability to construct deformed geometry at the conclusion of an analysis simulation. It is therefore not presently possible to build prototypes of predicted states of a product as the result of being subjected to simulated adverse environments. Our approach is to reverse engineer a description of a deformed finite element mesh into a stereolithography format for prototyping using a selective laser sintering (SLS) machine. This stereolithography file can be generated from deformed surface node information as well as from a reconstructed surface defined by inspection data. We are developing software to allow users to represent a part or assembly in a deformed condition. The resulting representation can also be used to create simulated x-rays of a damaged or deformed configuration for comparison with experimental test results or field data. This allows engineers to benchmark their analysis methods and provide increased understanding of analysis results through enhanced visualization. The process of reverse engineering 'in-use' or damaged products allows for a more refined inspection and comparison of imperfect parts. It addresses the issue of whether or not a part will still work when subjected to certain environments or scenarios. Answers to this question can be found using our model reconstruction technique that represents an 'as-built' engineering model configuration. An additional feature of this reverse engineering process is product benchmarking and closer engineer/manufacturer interactions.
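    The central conversion step, turning deformed surface nodes into a stereolithography description, can be sketched as: add the displacement field to the nodal coordinates and write the surface triangles as ASCII STL facets. The two-triangle patch, its displacements, and the output filename below are invented purely to show the file format.

```python
import numpy as np

# Write a deformed surface patch to ASCII STL.  Nodes, displacements, and
# connectivity are hypothetical placeholders for analysis output.
nodes = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
displacements = np.array([[0.0, 0.0, 0.05], [0.0, 0.0, 0.12], [0.0, 0.0, 0.02], [0.0, 0.0, 0.0]])
triangles = [(0, 1, 2), (0, 2, 3)]                 # surface connectivity
deformed = nodes + displacements                   # apply the displacement field

with open("deformed_patch.stl", "w") as f:
    f.write("solid deformed_patch\n")
    for a, b, c in triangles:
        p0, p1, p2 = deformed[a], deformed[b], deformed[c]
        n = np.cross(p1 - p0, p2 - p0)
        n = n / np.linalg.norm(n)                  # unit facet normal
        f.write(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n    outer loop\n")
        for p in (p0, p1, p2):
            f.write(f"      vertex {p[0]:e} {p[1]:e} {p[2]:e}\n")
        f.write("    endloop\n  endfacet\n")
    f.write("endsolid deformed_patch\n")
```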

  8. Building prototypes of damaged systems from analysis simulations

    SciTech Connect

    Tsai, C.S.; Dolin, R.M.; Hefele, J.

    1996-12-31

    Our rapid prototype of damaged systems project seeks to provide a technology for allowing engineers to build demonstration prototypes of damaged products from analysis post-processing data. Most commercial finite element programs do not have a capability to construct deformed geometry at the conclusion of an analysis simulation. It is therefore not presently possible to build prototypes of predicted states of a product as the result of being subjected to simulated adverse environments. Our approach is to reverse engineer a description of a deformed finite element mesh into a stereolithography format for prototyping using a Selective Laser Sintering (SLS) machine. This stereolithography file can be generated from deformed surface node information as well as from a reconstructed surface defined by inspection data. We are developing software to allow users to represent a part or assembly in a deformed condition. The damaged part can then be manufactured using the SLS process for visualization and assessment purposes. The resulting representation can also be used to create simulated X-rays of a damaged or deformed configuration for comparison with experimental test results or field data. This allows engineers to benchmark their analysis methods and provide increased understanding of analysis results through enhanced visualization. The process of reverse engineering 'in-use' or damaged products allows for a more refined inspection and comparison of imperfect parts. It addresses the issue of whether or not a part will still work when subjected to certain environments or scenarios. Answers to this question can be found using our model reconstruction technique that represents an 'as-built' engineering model configuration. An additional feature of this reverse engineering process is product benchmarking and closer engineer/manufacturer interactions.

  9. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  10. A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies

    SciTech Connect

    Woodward, C S; Estep, D; Sandelin, J; Wang, H

    2009-02-26

    This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to stationary Navier-Stokes.
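    For orientation, the primal problem underneath that adjoint analysis is a scalar conservation law solved with the Lax-Friedrichs scheme. The sketch below is only that forward solve for the 1-D inviscid Burgers equation on an illustrative periodic grid; the adjoint operator and the a posteriori error-estimation machinery of the report are not reproduced here.

```python
import numpy as np

# Lax-Friedrichs forward solve of u_t + (u^2/2)_x = 0 with periodic
# boundaries; grid size, CFL number, and initial data are illustrative.
nx, cfl, t_end = 400, 0.4, 0.5
x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]
u = 1.0 + 0.5 * np.sin(x)

t = 0.0
while t < t_end:
    dt = min(cfl * dx / np.max(np.abs(u)), t_end - t)
    f = 0.5 * u ** 2
    up, um = np.roll(u, -1), np.roll(u, 1)     # periodic neighbours
    fp, fm = np.roll(f, -1), np.roll(f, 1)
    u = 0.5 * (up + um) - dt / (2.0 * dx) * (fp - fm)
    t += dt

print("max |u| at t = 0.5:", np.abs(u).max())
```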

  11. Swiftvis: Data Analysis And Visualization For Planetary Science Simulations

    NASA Astrophysics Data System (ADS)

    Lewis, Mark C.; Levison, H. F.; Kavanagh, G.

    2007-07-01

    SwiftVis is a tool originally developed as part of a rewrite of Swift to be used for analysis and plotting of simulations performed with Swift and Swifter. The extensibility built into the design has allowed us to make SwiftVis a general purpose analysis and plotting package customized to be usable by the planetary science community at large. SwiftVis is written in Java and has been tested on Windows, Linux, and Mac platforms. Its graphical interface allows users to do complex analysis and plotting without having to write custom code. The software package and a tutorial can be found at http://www.cs.trinity.edu/ mlewis/SwiftVis/.

  12. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Rich Johnson; Kimberlyn C. Mousseau; Hyung Lee

    2011-09-01

    NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) Establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations, (2) Establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data, (3) Providing readily accessible databases for nuclear energy related experimental and numerical benchmark data that can be used in V&V assessments and computational methods development, (4) Providing a searchable knowledge base of information, documents and data on V&V and UQ, and (5) Providing web-enabled applications, tools and utilities for V&V and UQ activities, data assessment and processing, and information and data searches. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the Consortium for Advanced Simulation of Light Water Reactors (CASL), the Nuclear Energy Advanced Modeling and Simulation (NEAMS), the Light Water Reactor Sustainability (LWRS), the Small Modular Reactors (SMR), and the Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve computational modeling and simulation (M&S) of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs. In addition, from the outset, NE-KAMS will support the use of computational M&S in the nuclear industry by developing guidelines and recommended practices aimed at quantifying the uncertainty and assessing the applicability of existing analysis models and methods. The NE-KAMS effort will initially focus on supporting the use of computational fluid dynamics (CFD) and thermal hydraulics (T/H) analysis for M&S of nuclear

  13. Pedestrian simulation and distribution in urban space based on visibility analysis and agent simulation

    NASA Astrophysics Data System (ADS)

    Ying, Shen; Li, Lin; Gao, Yurong

    2009-10-01

    Spatial visibility analysis is an important approach to studying pedestrian behavior because visual perception of space is the most direct way people acquire environmental information and guide their movements. Based on agent modeling and a top-down method, this paper develops a framework for analyzing pedestrian flow as a function of visibility. We use viewsheds in the visibility analysis and impose the resulting parameters on an agent simulation to direct agent motion through urban space. Pedestrian behavior is analyzed at both the micro-scale and macro-scale of urban open space. At the micro-scale, an individual agent uses visual affordance to determine its direction of motion along a street or within a district. At the macro-scale, we compare the distribution of pedestrian flow with the spatial configuration of the urban environment and examine the relationship between pedestrian flow and the distribution of urban facilities and functions. The paper first computes visibility at vantage points in urban open space, such as along the street network, and quantifies the visibility parameters. Multiple agents then use these visibility parameters to decide their directions of motion, and the simulated pedestrian flow reaches a stable state through the multi-agent simulation. Finally, the morphology of the visibility parameters and the pedestrian distribution are compared with the urban function and facility layout to confirm their consistency, which can be used to support decision making in urban design.
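    A minimal sketch of the visibility-driven agent step discussed above: each agent moves to a neighbouring cell with probability proportional to a precomputed visibility score (for example, viewshed area) of that cell. The grid, scores, and movement rule are illustrative assumptions rather than the paper's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    vis = rng.random((50, 80))            # stand-in for a precomputed viewshed/visibility score per cell

    def step(pos, vis):
        """Move one agent to a 4-neighbour cell with probability proportional to its visibility."""
        i, j = pos
        cands = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= i + di < vis.shape[0] and 0 <= j + dj < vis.shape[1]]
        w = np.array([vis[c] for c in cands], dtype=float)
        return cands[rng.choice(len(cands), p=w / w.sum())]

    agents = [(25, 40)] * 200
    for _ in range(100):                  # iterate until the flow pattern settles
        agents = [step(a, vis) for a in agents]

    counts = np.zeros_like(vis)
    for i, j in agents:
        counts[i, j] += 1                 # resulting pedestrian distribution over the grid
    ```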

  14. Impact of actuarial assumptions on pension costs: A simulation analysis

    NASA Astrophysics Data System (ADS)

    Yusof, Shaira; Ibrahim, Rose Irnawaty

    2013-04-01

    This study investigates the sensitivity of pension costs to changes in the underlying assumptions of a hypothetical pension plan in order to gauge the relative importance of the various actuarial assumptions. Simulation analyses are used to examine the impact of two actuarial assumptions on pension costs: mortality rates and interest rates. Pension costs are calculated using the Accrued Benefit Cost Method with both the constant amount (CA) modification and the constant percentage of salary (CS) modification. The mortality assumptions, and the implied mortality experience of the plan, can have a significant impact on pension costs, while the interest rate assumption is inversely related to pension costs. The results of the study have important implications for analysts of pension costs.
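    The hedged sketch below illustrates the kind of sensitivity analysis described, not the Accrued Benefit Cost Method itself: it propagates random interest-rate and mortality assumptions through a simple present-value pension cost and exhibits the inverse relationship with the interest rate. All numerical values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    benefit = 20_000.0                                    # annual benefit, hypothetical
    years = np.arange(1, 26)                              # 25-year payout horizon
    rate = rng.normal(0.05, 0.01, n)                      # uncertain valuation interest rate
    surv = np.clip(rng.normal(0.98, 0.005, n), 0.0, 1.0)  # crude flat annual survival probability

    # expected present value of the benefit stream for each simulated assumption set
    pv = (benefit * surv[:, None] ** years / (1.0 + rate[:, None]) ** years).sum(axis=1)
    print(f"mean cost {pv.mean():,.0f}, std {pv.std():,.0f}")
    print("corr(interest rate, cost) =", np.corrcoef(rate, pv)[0, 1])   # strongly negative
    ```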

  15. An educational model for ensemble streamflow simulation and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E.

    2013-02-01

    This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  16. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.

  17. Analysis of simulated engine sounds using a psychoacoustic model

    NASA Astrophysics Data System (ADS)

    Duvigneau, Fabian; Liefold, Steffen; Höchstetter, Marius; Verhey, Jesko L.; Gabbert, Ulrich

    2016-03-01

    The aim of the paper is the evaluation and the prediction of the perceived quality of engine sounds, which is predicted in the design process by numerical simulations. Periodic combustion sounds of the operating engine are synthesized with the help of an overall numerical simulation approach before a real prototype exists. The perceived quality of the sound is rated in hearing tests using the method of relative comparison and absolute judgment. Results are transferred into an interval scaled ranking of the stimuli. Based on the data, a psychoacoustic model for sound quality is developed using psychoacoustic parameters. Predictions of this model are used to evaluate the sound quality of several technical design modifications, for example, different engine encapsulations. The results are visualized to allow a simple qualitative analysis of the sound perception. This results in an impartial and objective decision regarding the final design of an acoustic encapsulation with a higher perceived sound quality.

  18. Analysis of a simulation algorithm for direct brain drug delivery

    PubMed Central

    Rosenbluth, Kathryn Hammond; Eschermann, Jan Felix; Mittermeyer, Gabriele; Thomson, Rowena; Mittermeyer, Stephan; Bankiewicz, Krystof S.

    2011-01-01

    Convection enhanced delivery (CED) achieves targeted delivery of drugs with a pressure-driven infusion through a cannula placed stereotactically in the brain. This technique bypasses the blood brain barrier and gives precise distributions of drugs, minimizing off-target effects of compounds such as viral vectors for gene therapy or toxic chemotherapy agents. The exact distribution is affected by the cannula positioning, flow rate and underlying tissue structure. This study presents an analysis of a simulation algorithm for predicting the distribution using baseline MRI images acquired prior to inserting the cannula. The MRI images included diffusion tensor imaging (DTI) to estimate the tissue properties. The algorithm was adapted for the devices and protocols identified for upcoming trials and validated with direct MRI visualization of Gadolinium in 20 infusions in non-human primates. We found strong agreement between the size and location of the simulated and gadolinium volumes, demonstrating the clinical utility of this surgical planning algorithm. PMID:21945468

  19. The Role of Multiphysics Simulation in Multidisciplinary Analysis

    NASA Technical Reports Server (NTRS)

    Rifai, Steven M.; Ferencz, Robert M.; Wang, Wen-Ping; Spyropoulos, Evangelos T.; Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    This article describes the applications of the Spectrum(Tm) Solver in Multidisciplinary Analysis (MDA). Spectrum, a multiphysics simulation software based on the finite element method, addresses compressible and incompressible fluid flow, structural, and thermal modeling as well as the interaction between these disciplines. Multiphysics simulation is based on a single computational framework for the modeling of multiple interacting physical phenomena. Interaction constraints are enforced in a fully-coupled manner using the augmented-Lagrangian method. Within the multiphysics framework, the finite element treatment of fluids is based on Galerkin-Least-Squares (GLS) method with discontinuity capturing operators. The arbitrary-Lagrangian-Eulerian method is utilized to account for deformable fluid domains. The finite element treatment of solids and structures is based on the Hu-Washizu variational principle. The multiphysics architecture lends itself naturally to high-performance parallel computing. Aeroelastic, propulsion, thermal management and manufacturing applications are presented.

  20. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  1. Lower-Energy Requirements for Power-Assist HEV Energy Storage Systems--Analysis and Rationale (Presentation)

    SciTech Connect

    Gonder, J.; Pesaran, A.

    2010-03-18

    Presented at the 27th International Battery Seminar and Exhibit, 15-18 March 2010, Fort Lauderdale, Florida. NREL conducted simulations and analysis of vehicle test data with research partners in response to a USABC request; results suggest that power-assist hybrid electric vehicles (HEVs), like conventional HEVs, can achieve high fuel savings with lower energy requirements at potentially lower cost.

  2. Macro-System Model for Hydrogen Energy Systems Analysis in Transportation: Preprint

    SciTech Connect

    Diakov, V.; Ruth, M.; Sa, T. J.; Goldsby, M. E.

    2012-06-01

    The Hydrogen Macro System Model (MSM) is a simulation tool that links existing and emerging hydrogen-related models to perform rapid, cross-cutting analysis. It allows analysis of the economics, primary energy-source requirements, and emissions of hydrogen production and delivery pathways.

  3. Preliminary analysis of the dynamic heliosphere by MHD simulations

    SciTech Connect

    Washimi, H.; Zank, G. P.; Tanaka, T.

    2006-09-26

    A preliminary analysis of the dynamic heliosphere to estimate the termination shock (TS) distance from the sun around the time when Voyager 1 passed the termination shock on December 16, 2004 is performed using MHD simulations. For input to this simulation, we use the Voyager 2 solar-wind data. We first find a stationary solution of the 3-D outer heliosphere by assigning a set of LISM parameters as our outer boundary conditions, and then the dynamical analysis is performed. The model TS crossing is within 6 months of the observed date. The TS is pushed outward every time a high ram-pressure solar wind pulse arrives. After the end of the high ram-pressure wind, the TS shrinks inward. When the last Halloween event passed through the TS at DOY 250, 2004, the TS began to shrink inward very quickly and crossed V1. The highest inward speed of the TS is over 400 km/s. The high ram-pressure solar wind transmitted through the TS becomes a high thermal-pressure plasma in the heliosheath, acting to push the TS inward. This suggests that the position of the TS is determined not only by the steady-state pressure balance condition between the solar wind ram pressure and the LISM pressure, but by the dynamical ram pressure too. The period when the high ram-pressure solar wind arrives at the TS seems to correspond to the period of the TS particle events (Stone et al., 2005; Decker et al., 2005). The TS crossing date will be revised in future simulations using a more appropriate set of parameters for the LISM. This will enable us to undertake a detailed comparison of the simulation results with the TS particle events.

  4. Impact Testing and Simulation of a Sinusoid Foam Sandwich Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L; Littell, Justin D.

    2015-01-01

    A sinusoidal-shaped foam sandwich energy absorber was developed and evaluated at NASA Langley Research Center through multi-level testing and simulation performed under the Transport Rotorcraft Airframe Crash Testbed (TRACT) research project. The energy absorber, designated the "sinusoid," consisted of hybrid carbon-Kevlar® plain weave fabric face sheets, two layers for each face sheet oriented at +/-45deg with respect to the vertical or crush direction, and a closed-cell ELFOAM(TradeMark) P200 polyisocyanurate (2.0-lb/ft3) foam core. The design goal for the energy absorber was to achieve an average floor-level acceleration of between 25- and 40-g during the full-scale crash test of a retrofitted CH-46E helicopter airframe, designated TRACT 2. Variations in the design were assessed through quasi-static and dynamic crush testing of component specimens. Once the design was finalized, a 5-ft-long subfloor beam was fabricated and retrofitted into a barrel section of a CH-46E helicopter. A vertical drop test of the barrel section was conducted onto concrete to evaluate the performance of the energy absorber prior to retrofit into TRACT 2. Finite element models were developed of all test articles and simulations were performed using LS-DYNA®, a commercial nonlinear explicit transient dynamic finite element code. Test and analysis results are presented for the sinusoid foam sandwich energy absorber as comparisons of load-displacement and acceleration-time-history responses, as well as predicted and experimental structural deformations and progressive damage, for each evaluation level (component testing through barrel section drop testing).

  5. The Communication Link and Error ANalysis (CLEAN) simulator

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.; Crowe, Shane

    1993-01-01

    During the period July 1, 1993 through December 30, 1993, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed and include: (1) Soft decision Viterbi decoding; (2) node synchronization for the Soft decision Viterbi decoder; (3) insertion/deletion error programs; (4) convolutional encoder; (5) programs to investigate new convolutional codes; (6) pseudo-noise sequence generator; (7) soft decision data generator; (8) RICE compression/decompression (integration of RICE code generated by Pen-Shu Yeh at Goddard Space Flight Center); (9) Markov Chain channel modeling; (10) percent complete indicator when a program is executed; (11) header documentation; and (12) help utility. The CLEAN simulation tool is now capable of simulating a very wide variety of satellite communication links including the TDRSS downlink with RFI. The RICE compression/decompression schemes allow studies to be performed on error effects on RICE decompressed data. The Markov Chain modeling programs allow channels with memory to be simulated. Memory results from filtering, forward error correction encoding/decoding, differential encoding/decoding, channel RFI, nonlinear transponders and from many other satellite system processes. Besides the development of the simulator, a study was performed to determine whether the PCI provides a performance improvement for the TDRSS downlink. RFI sources with several duty cycles exist for the TDRSS downlink. We conclude that the PCI does not improve performance for any of these interferers except possibly one which occurs for the TDRS East. Therefore, the usefulness of the PCI is a function of the time spent transmitting data to the WSGT through the TDRS East transponder.
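    The record does not give the generator's internals, so the sketch below is only a generic example of the kind of pseudo-noise (PN) sequence generator listed as item (6): a 7-stage Fibonacci LFSR built on the primitive polynomial x^7 + x^6 + 1, which yields a maximal-length sequence of period 2^7 - 1 = 127.

    ```python
    def pn_bits(seed=0x5A, n=300):
        """Maximal-length PN sequence from a 7-stage Fibonacci LFSR (taps for x^7 + x^6 + 1)."""
        state = seed & 0x7F
        assert state != 0, "LFSR seed must be non-zero"
        out = []
        for _ in range(n):
            out.append(state & 1)                   # output the least-significant stage
            fb = (state ^ (state >> 1)) & 1         # feedback = bit0 XOR bit1 (taps 7 and 6)
            state = (state >> 1) | (fb << 6)        # shift and insert feedback at the top stage
        return out

    seq = pn_bits()
    assert seq[:127] == seq[127:254]                # the sequence repeats with period 127
    ```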

  6. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    SciTech Connect

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat, ,; R.ubel, O.; Weber, G,; Hamann, B.

    2010-05-21

    Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the process involved in plasma wake generation and how electrons are trapped and accelerated by the wake. Such accelerators offer high accelerating gradients, potentially reducing the size and cost of new accelerators, and these simulations support their understanding and development. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, the plasma gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power radiation sources ranging from intense X-rays to terahertz, and are used in many applications including medical radiotherapy and imaging. To extract information from the simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the adequate parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is infeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1GB and 100GB in size, but simulations in 3D are of the order of TBs. The increase in the number of datasets and dataset sizes leads to a need for automatic routines to recognize particle patterns as particle bunches (beam of electrons) for subsequent analysis. Because of the growth in dataset size, the application of machine learning techniques for
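    For orientation only, the sketch below mimics the manual, threshold-based selection the abstract says this work seeks to automate: pick particles above a momentum cut and summarize the resulting bunch. The phase-space data are synthetic and the cut value is arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # synthetic phase space: a broad background plus one compact accelerated bunch
    x  = np.concatenate([rng.normal(0.0, 5.0, 20_000), rng.normal(8.0, 0.3, 2_000)])
    px = np.concatenate([rng.normal(0.0, 1.0, 20_000), rng.normal(50.0, 2.0, 2_000)])

    sel = px > 10.0                                  # momentum threshold isolates the trapped bunch
    print("particles selected:", sel.sum())
    print("bunch position mean/std:", x[sel].mean(), x[sel].std())
    print("relative momentum spread:", px[sel].std() / px[sel].mean())
    ```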

  7. In-wheel hub SRM simulation and analysis

    NASA Astrophysics Data System (ADS)

    Sager, Milton W., III

    Is it feasible to replace the conventional gasoline engine and subsequent drive system in a motorcycle with an electric switched reluctance motor (SRM) by placing the SRM inside the rear wheel, thereby removing the need for things such as a clutch, chain, transmission, gears and sprockets? The goal of this thesis is to study the theoretical aspect of prototyping and analyzing an in-wheel electric hub motor to replace the standard gasoline engine traditionally found on motorcycles. With the recent push for clean energy, electric vehicles are becoming more common. All currently produced electric motorcycles use conventional, prefabricated electric motors connected to the traditional sprocket and chain design. This greatly restricts the efficiency and range of these motorcycles. My design stands apart by turning the rear wheel into a SRM which uses electromagnets around a non-magnetic core to convert electrical energy into mechanical force driving the rear wheel. To my knowledge, there is currently no motorcycle designed with an in-wheel hub SRM. A three-phase SRM and a five-phase SRM will be simulated and analyzed using MATLAB with Simulink. Factors such as friction, weight, power, etc. will be taken into account in order to create a realistic simulation as if it were inside the rear wheel of a motorcycle. Since time and finances will not allow for a full scale build, a scaled model three-phase SRM will be attempted for demonstration purposes.

  8. Exosystem Modeling for Mission Simulation and Survey Analysis

    NASA Astrophysics Data System (ADS)

    Savransky, Dmitry

    In the last twenty years, the existence of exoplanets (planets orbiting stars other than our own sun) has gone from conjecture to established fact. The accelerating rate of exoplanet discovery has generated a wealth of important new knowledge, and is due mainly to the development and maturation of a large number of technologies that drive a variety of planet detection and observation methods. The overall goal of the exoplanet community is to study planets around all types of stars, and across all ranges of planetary mass and orbit size. With this capability we will be able to build confidence in planet formation and evolution theories and learn how our solar system came to exist. Achieving this goal requires creating dedicated instrumentation capable of detecting signals that are a small fraction of the magnitude of signals we can observe today. It also requires analyzing highly noisy data sets for the faint patterns that represent the presence of planets. Accurate modeling and simulation are necessary for both these tasks. With detailed planetary and observation models we can predict the type of data that will be generated when a specific instrument observes a specific planetary system. This allows us to evaluate the performance of both the instrument and the data analysis methods used to extract planet signals from observational data. The same simulations can help optimize observation scheduling and statistical analysis of data sets. The purpose of this thesis is to lay down the groundwork necessary for building simulations of this type, and to demonstrate a few of their many possible applications. First, we show how each of four different detection methods (astrometry, doppler spectroscopy, transit photometry and direct imaging) can be described using a common parameter set which also encodes sufficient information to propagate the described exosystem in time. We analyze this parameter set and derive the distribution functions of several of its elements. These

  9. Uncertainty analysis of penicillin V production using Monte Carlo simulation.

    PubMed

    Biwer, Arno; Griffith, Steve; Cooney, Charles

    2005-04-20

    Uncertainty and variability affect economic and environmental performance in the production of biotechnology and pharmaceutical products. However, commercial process simulation software typically provides analysis that assumes deterministic rather than stochastic process parameters and thus is not capable of dealing with the complexities created by the variance that arises in the decision-making process. Using the production of penicillin V as a case study, this article shows how uncertainty can be quantified and evaluated. The first step is construction of a process model, as well as analysis of its cost structure and environmental impact. The second step is identification of uncertain variables and determination of their probability distributions based on available process and literature data. Finally, Monte Carlo simulations are run to see how these uncertainties propagate through the model and affect key economic and environmental outcomes. Thus, the overall variation of these objective functions is quantified, the technical, supply chain, and market parameters that contribute most to the variance are identified, and the differences between economic and ecological evaluation are analyzed. In our case study analysis, we show that final penicillin and biomass concentrations in the fermenter have the highest contribution to variance for both unit production cost and environmental impact. The penicillin selling price dominates return on investment variance as well as the variance for other revenue-dependent parameters. PMID:15742389
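    A hedged sketch of the Monte Carlo step described above: sample uncertain process and market inputs from assumed distributions, propagate them through a toy unit-production-cost model, and rank their contributions to the output variance. The variable names, distributions, and cost model are illustrative assumptions, not the study's process model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000
    titre       = rng.normal(55.0, 5.0, n)      # final penicillin V concentration, g/L (assumed)
    batches     = rng.normal(140.0, 10.0, n)    # batches per year (assumed)
    annual_cost = rng.normal(30e6, 2e6, n)      # total annual production cost, $/yr (assumed)
    volume      = 100.0                         # fixed working volume, m^3 (assumed)

    kg_per_year = titre * volume * batches      # g/L * m^3 = kg per batch, times batches per year
    unit_cost = annual_cost / kg_per_year       # $/kg
    print(f"unit cost: {unit_cost.mean():.2f} +/- {unit_cost.std():.2f} $/kg")

    # crude ranking of contributions to variance via squared correlation with the output
    for name, v in (("titre", titre), ("batches", batches), ("annual cost", annual_cost)):
        print(name, round(np.corrcoef(v, unit_cost)[0, 1] ** 2, 3))
    ```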

  10. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  11. Energy Analysis Program. 1992 Annual report

    SciTech Connect

    Not Available

    1993-06-01

    The Program became deeply involved in establishing a Washington, D.C., project office during the last few months of fiscal year 1992. This project office, which reports to the Energy & Environment Division, will receive the majority of its support from the Energy Analysis Program. We anticipate having two staff scientists and support personnel in offices within a few blocks of DOE. Our expectation is that this office will carry out a series of projects that are better managed closer to DOE. We also anticipate that our representation in Washington will improve, and we hope to expand the Program, its activities, and its impact in policy-relevant analyses. In spite of the growth that we have achieved, the Program continues to emphasize (1) energy efficiency of buildings, (2) appliance energy efficiency standards, (3) energy demand forecasting, (4) utility policy studies, especially integrated resource planning issues, and (5) international energy studies, with considerable emphasis on developing countries and economies in transition. These continuing interests are reflected in the articles that appear in this report.

  12. NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes. Researchers at the National Renewable Energy Laboratory (NREL) have developed models for evaluating the thermal performance of walls in existing homes that will improve the accuracy of building energy simulation tools when predicting potential energy savings of existing homes. Uninsulated walls are typical in older homes where the wall cavities were not insulated during construction or where the insulating material has settled. Accurate calculation of heat transfer through building enclosures will help determine the benefit of energy efficiency upgrades in order to reduce energy consumption in older American homes. NREL performed detailed computational fluid dynamics (CFD) analysis to quantify the energy loss/gain through the walls and to visualize different airflow regimes within the uninsulated cavities. The effects of ambient outdoor temperature, radiative properties of building materials, and insulation level were investigated. The study showed that multi-dimensional airflows occur in walls with uninsulated cavities and that the thermal resistance is a function of the outdoor temperature - an effect not accounted for in existing building energy simulation tools. The study quantified the difference between CFD prediction and the approach currently used in building energy simulation tools over a wide range of conditions. For example, researchers found that CFD predicted lower heating loads and slightly higher cooling loads. Implementation of CFD results into building energy simulation tools such as DOE2 and EnergyPlus will likely reduce the predicted heating load of homes. Researchers also determined that a small air gap in a partially insulated cavity can lead to a significant reduction in thermal resistance. For instance, a 4-in. tall air gap

  13. Expand the Modeling Capabilities of DOE's EnergyPlus Building Energy Simulation Program

    SciTech Connect

    Don Shirey

    2008-02-28

    EnergyPlus(TM) is a new generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making of zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. Version 1.0 of EnergyPlus was released in April 2001, followed by semiannual updated versions over the ensuing seven-year period. This report summarizes work performed by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC) to expand the modeling capabilities of EnergyPlus. The project tasks involved implementing, testing, and documenting the following new features or enhancement of existing features: (1) A model for packaged terminal heat pumps; (2) A model for gas engine-driven heat pumps with waste heat recovery; (3) Proper modeling of window screens; (4) Integrating and streamlining EnergyPlus air flow modeling capabilities; (5) Comfort-based controls for cooling and heating systems; and (6) An improved model for microturbine power generation with heat recovery. UCF/FSEC located existing mathematical models or generated new models for these features and incorporated them into EnergyPlus. The existing or new models were (re)written using the Fortran 90/95 programming language and were integrated within EnergyPlus in accordance with the EnergyPlus Programming Standard and Module Developer's Guide. Each model/feature was thoroughly tested and identified errors were repaired. Upon completion of each model implementation, the existing EnergyPlus documentation (e.g., Input Output Reference and Engineering Document) was updated with information describing the new or enhanced feature. Reference data sets were generated for several of the features to aid program users in selecting proper model inputs. An

  14. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  15. Large perturbation flow field analysis and simulation for supersonic inlets

    NASA Technical Reports Server (NTRS)

    Varner, M. O.; Martindale, W. R.; Phares, W. J.; Kneile, K. R.; Adams, J. C., Jr.

    1984-01-01

    An analysis technique for simulation of supersonic mixed compression inlets with large flow field perturbations is presented. The approach is based upon a quasi-one-dimensional inviscid unsteady formulation which includes engineering models of unstart/restart, bleed, bypass, and geometry effects. Numerical solution of the governing time dependent equations of motion is accomplished through a shock capturing finite difference algorithm, of which five separate approaches are evaluated. Comparison with experimental supersonic wind tunnel data is presented to verify the present approach for a wide range of transient inlet flow conditions.

  16. Analysis and simulation of a torque assist automated manual transmission

    NASA Astrophysics Data System (ADS)

    Galvagno, E.; Velardocchia, M.; Vigliani, A.

    2011-08-01

    The paper presents the kinematic and dynamic analysis of a power-shift automated manual transmission (AMT) characterised by a wet clutch, called assist clutch (ACL), replacing the fifth gear synchroniser. This torque assist mechanism becomes a torque transfer path during gearshifts, in order to overcome a typical dynamic problem of AMTs, that is, the driving force interruption. The mean power contributions during gearshifts are computed for different engine and ACL interventions, thus allowing useful conclusions to be drawn for developing the control algorithms. The simulation results prove the advantages in terms of gearshift quality and ride comfort of the analysed transmission.

  17. Analysis of motion features for molecular dynamics simulation of proteins

    NASA Astrophysics Data System (ADS)

    Kamada, Mayumi; Toda, Mikito; Sekijima, Masakazu; Takata, Masami; Joe, Kazuki

    2011-01-01

    Recently, a new method for time series analysis using the wavelet transformation has been proposed by Sakurai et al. We apply it to molecular dynamics simulation of Thermomyces lanuginosa lipase (TLL). Introducing indexes to characterize collective motion of the protein, we have obtained the following two results. First, time evolution of the collective motion involves not only the dynamics within a single potential well but also takes place wandering around multiple conformations. Second, correlation of the collective motion between secondary structures shows that collective motion exists involving multiple secondary structures. We discuss future prospects of our study involving 'disordered proteins'.

  18. Bifurcations analysis of turbulent energy cascade

    SciTech Connect

    Divitiis, Nicola de

    2015-03-15

    This note studies the mechanism of the turbulent energy cascade through a suitable bifurcation analysis of the Navier–Stokes equations, and furnishes explanations of the more significant characteristics of turbulence. A statistical bifurcation property of the Navier–Stokes equations in fully developed turbulence is proposed, and a spatial representation of the bifurcations is presented, which is based on a proper definition of the fixed points of the velocity field. The analysis first shows that the local deformation can be much more rapid than the fluid state variables, then explains the mechanism of energy cascade through the aforementioned property of the bifurcations, and gives a reasoned argument that the bifurcation cascade can be expressed in terms of length scales. Furthermore, the study analyzes the characteristic length scales at the transition through global properties of the bifurcations, and estimates the order of magnitude of the critical Taylor-scale Reynolds number and the number of bifurcations at the onset of turbulence.

  19. Analysis of subgrid models using direct and large-eddy simulations of isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Menon, S.; Yeung, P. K.

    1994-12-01

    Direct and large eddy simulations of forced and decaying isotropic turbulence have been performed using a pseudospectral and a finite-difference code. Subgrid models that include a one-equation subgrid kinetic energy model, with and without a stochastic backscatter forcing term, and a new scale similarity model have been analyzed in both Fourier space and physical space. The Fourier space analysis showed that the energy transfer across the cutoff wavenumber k(sub c) is dominated by local interaction. The correlation between the exact and the modeled (by a spectral eddy viscosity) nonlinear terms and the subgrid energy transfer in physical space was found to be quite low. In physical space, a similar correlation analysis was carried out using top hat filtering. Results show that the subgrid stress and the energy flux predicted by the subgrid models correlate very well with the exact data. The scale similarity model showed very high correlation for reasonable grid resolution. However, with decreasing grid resolution, the scale similarity model became less correlated with the exact data than the kinetic energy subgrid model. The subgrid models were then used for large-eddy simulations over a range of Reynolds numbers. It was determined that the dissipation was modeled poorly and that the correlation with the exact results was quite low for all the models. In general, for coarse grid resolution, the scale similarity model consistently showed very low correlation while the kinetic energy model showed a relatively higher correlation. These results suggest that relatively fine grid resolution may be required to use the scale similarity model, whereas the kinetic energy model can be used even on coarse grids.
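    A one-dimensional, much-reduced illustration of the physical-space correlation test described above: apply a top-hat filter to a synthetic multiscale field, form the exact subgrid stress, a Bardina-type scale-similarity estimate of it, and their correlation coefficient. The field, filter width, and 1-D setting are assumptions for illustration; the study itself uses 3-D DNS/LES data.

    ```python
    import numpy as np

    def tophat(u, w):
        """Centred periodic top-hat (moving-average) filter of width 2*w + 1."""
        return sum(np.roll(u, s) for s in range(-w, w + 1)) / (2 * w + 1)

    rng = np.random.default_rng(4)
    N = 1024
    x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
    # synthetic multiscale 1-D "velocity" field with a decaying spectrum
    u = sum(np.sin(k * x + rng.uniform(0, 2 * np.pi)) / k for k in range(1, 64))

    w = 8
    ub = tophat(u, w)
    tau_exact = tophat(u * u, w) - ub * ub                 # exact subgrid stress (1-D surrogate)
    tau_model = tophat(ub * ub, w) - tophat(ub, w) ** 2    # scale-similarity (Bardina-type) estimate
    print("correlation:", np.corrcoef(tau_exact, tau_model)[0, 1])
    ```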

  20. Scripted Building Energy Modeling and Analysis: Preprint

    SciTech Connect

    Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.

    2012-08-01

    Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.

  1. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.

  2. Lac Courte Oreilles Energy Analysis Project

    SciTech Connect

    Leslie Isham; Denise Johnson

    2009-04-01

    and funding to do so will be sought. While we already own a Hydro Dam, it is currently not functioning to its full capacity, and we are seeking operation and maintenance firm proposals and funding sources. One of the biggest accomplishments this project gave us was our total carbon emissions figure of 9,989.45 tons; this will be the baseline from which we measure our reductions. It will help us achieve the goals we have set for ourselves toward meeting the Kyoto Protocol and saving our Earth for future generations. Another major accomplishment and lesson learned is that we need to educate ourselves and our people on how to conserve energy, to benefit both the environment and our own budgets. The Lac Courte Oreilles (LCO) Energy Analysis Project will perform an energy audit to gather information on the Tribe's energy usage and determine the carbon emissions. By performing the audit we will be able to identify areas where conservation efforts are most viable and recommend policies that can be implemented. These steps will enable LCO to begin achieving the goals that have been set by the Tribal Governing Board and adopted through resolutions. The goals are to reduce emissions by 25% and to produce 25% of its energy using sustainable sources. The project objectives were defined to assist the Tribe in achieving its goals of reducing carbon emissions and obtaining a sustainable source of energy. The following objectives were outlined: (1) Coordinate LCO's current and future conservation and renewable energy projects; (2) Establish working relationships with outside entities to share information and collaborate on future projects; (3) Complete an energy audit and analyze LCO's energy load and carbon emissions; (4) Identify policy changes, education programs and conservation efforts which are appropriate for the LCO Reservation; and (5) Create a plan to identify the most cost effective renewable energy options for LCO.

  3. Reactor Subsystem Simulation for Nuclear Hybrid Energy Systems

    SciTech Connect

    Shannon Bragg-Sitton; J. Michael Doster; Alan Rominger

    2012-09-01

    Preliminary system models have been developed by Idaho National Laboratory researchers and are currently being enhanced to assess integrated system performance given multiple sources (e.g., nuclear + wind) and multiple applications (i.e., electricity + process heat). Initial efforts to integrate a Fortran-based simulation of a small modular reactor (SMR) with the balance of plant model have been completed in FY12. This initial effort takes advantage of an existing SMR model developed at North Carolina State University to provide initial integrated system simulation for a relatively low cost. The SMR subsystem simulation details are discussed in this report.

  4. Analysis of lunar regolith thermal energy storage

    SciTech Connect

    Colozza, A.J.

    1991-11-01

    The concept of using lunar regolith as a thermal energy storage medium was evaluated. The concept was examined by mathematically modeling the absorption and transfer of heat by the lunar regolith. Regolith thermal and physical properties were established through various sources as functions of temperature. Two cases were considered: a semi-infinite, constant temperature, cylindrical heat source embedded in a continuum of lunar regolith and a spherically shaped molten zone of lunar regolith set with an initial temperature profile. The cylindrical analysis was performed in order to examine the amount of energy which can be stored in the regolith during the day. At night, the cylinder acted as a perfect insulator. This cycling was performed until a steady state situation was reached in the surrounding regolith. It was determined that a cycling steady state occurs after approximately 15 day/night cycles. Results were obtained for cylinders of various diameters. The spherical molten zone analysis was performed to establish the amount of thermal energy, within the regolith, necessary to maintain some molten material throughout a nighttime period. This surrounding temperature profile was modeled after the cycling steady state temperature profile established by the cylindrical analysis. It was determined that a molten sphere diameter of 4.76 m is needed to maintain a core temperature near the low end of the melting temperature range throughout one nighttime period.
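    A rough numerical sketch of the cylindrical day/night cycling analysis described above, reduced to 1-D radial conduction with constant properties (the report also treats temperature-dependent properties and a molten zone, which this sketch omits). All numerical values below are illustrative assumptions.

    ```python
    import numpy as np

    # 1-D radial conduction around a constant-temperature cylindrical source in lunar regolith,
    # explicit finite differences, with the source treated as adiabatic at night.
    k, rho, cp = 0.014, 1700.0, 840.0             # W/m-K, kg/m^3, J/kg-K (assumed constant properties)
    alpha = k / (rho * cp)
    r = np.linspace(0.5, 10.0, 200)               # source radius to far-field radius [m]
    dr = r[1] - r[0]
    T = np.full(r.size, 250.0)                    # initial regolith temperature [K]
    T_hot, half_cycle = 1500.0, 14.75 * 86400.0   # source temperature [K], lunar day length [s]
    dt = 0.2 * dr**2 / alpha                      # comfortably stable explicit time step

    def advance(T, duration, heated):
        t = 0.0
        while t < duration:
            Tn = T.copy()
            d2 = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dr**2
            d1 = (T[2:] - T[:-2]) / (2.0 * dr * r[1:-1])
            Tn[1:-1] = T[1:-1] + alpha * dt * (d2 + d1)   # dT/dt = alpha (T_rr + T_r / r)
            Tn[0] = T_hot if heated else Tn[1]            # fixed T by day, zero flux at night
            Tn[-1] = 250.0                                # far field held at ambient
            T, t = Tn, t + dt
        return T

    for _ in range(15):                           # cycle toward the approximately periodic state
        T = advance(T, half_cycle, heated=True)
        T = advance(T, half_cycle, heated=False)
    print("temperature near the source after cycling:", T[:5].round(1))
    ```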

  5. Analysis of lunar regolith thermal energy storage

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.

    1991-01-01

    The concept of using lunar regolith as a thermal energy storage medium was evaluated. The concept was examined by mathematically modeling the absorption and transfer of heat by the lunar regolith. Regolith thermal and physical properties were established through various sources as functions of temperature. Two cases were considered: a semi-infinite, constant temperature, cylindrical heat source embedded in a continuum of lunar regolith and a spherically shaped molten zone of lunar regolith set with an initial temperature profile. The cylindrical analysis was performed in order to examine the amount of energy which can be stored in the regolith during the day. At night, the cylinder acted as a perfect insulator. This cycling was performed until a steady state situation was reached in the surrounding regolith. It was determined that a cycling steady state occurs after approximately 15 day/night cycles. Results were obtained for cylinders of various diameters. The spherical molten zone analysis was performed to establish the amount of thermal energy, within the regolith, necessary to maintain some molten material throughout a nighttime period. This surrounding temperature profile was modeled after the cycling steady state temperature profile established by the cylindrical analysis. It was determined that a molten sphere diameter of 4.76 m is needed to maintain a core temperature near the low end of the melting temperature range throughout one nighttime period.

  6. Analysis of Measured and Simulated Supraglottal Acoustic Waves.

    PubMed

    Fraile, Rubén; Evdokimova, Vera V; Evgrafova, Karina V; Godino-Llorente, Juan I; Skrelin, Pavel A

    2016-09-01

    To date, although much attention has been paid to the estimation and modeling of the voice source (ie, the glottal airflow volume velocity), the measurement and characterization of the supraglottal pressure wave have been much less studied. Some previous results have unveiled that the supraglottal pressure wave has some spectral resonances similar to those of the voice pressure wave. This makes the supraglottal wave partially intelligible. Although the explanation for such effect seems to be clearly related to the reflected pressure wave traveling upstream along the vocal tract, the influence that nonlinear source-filter interaction has on it is not as clear. This article provides an insight into this issue by comparing the acoustic analyses of measured and simulated supraglottal and voice waves. Simulations have been performed using a high-dimensional discrete vocal fold model. Results of such comparative analysis indicate that spectral resonances in the supraglottal wave are mainly caused by the regressive pressure wave that travels upstream along the vocal tract and not by source-tract interaction. On the contrary and according to simulation results, source-tract interaction has a role in the loss of intelligibility that happens in the supraglottal wave with respect to the voice wave. This loss of intelligibility mainly corresponds to spectral differences for frequencies above 1500 Hz. PMID:26377510

  7. Bragg's Law diffraction simulations for electron backscatter diffraction analysis.

    PubMed

    Kacher, Josh; Landon, Colin; Adams, Brent L; Fullwood, David

    2009-08-01

    In 2006, Angus Wilkinson introduced a cross-correlation-based electron backscatter diffraction (EBSD) texture analysis system capable of measuring lattice rotations and elastic strains to high resolution. A variation of the cross-correlation method is introduced using Bragg's Law-based simulated EBSD patterns as strain free reference patterns that facilitates the use of the cross-correlation method with polycrystalline materials. The lattice state is found by comparing simulated patterns to collected patterns at a number of regions on the pattern using the cross-correlation function and calculating the deformation from the measured shifts of each region. A new pattern can be simulated at the deformed state, and the process can be iterated a number of times to converge on the absolute lattice state. By analyzing an iteratively rotated single crystal silicon sample and recovering the rotation, this method is shown to have an angular resolution of approximately 0.04 degrees and an elastic strain resolution of approximately 7e-4. As an example of applications, elastic strain and curvature measurements are used to estimate the dislocation density in a single grain of a compressed polycrystalline Mg-based AZ91 alloy. PMID:19520512
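    As a toy version of the cross-correlation step described above, the sketch below recovers an integer pixel shift between a reference pattern and a shifted copy via FFT-based phase correlation; real EBSD analysis measures sub-pixel shifts of many small regions of interest and converts them to a deformation gradient, which is beyond this sketch.

    ```python
    import numpy as np

    def xcorr_shift(ref, img):
        """Integer (dy, dx) shift of img relative to ref by phase correlation."""
        R = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
        R /= np.abs(R) + 1e-12                          # keep only the phase information
        corr = np.fft.ifft2(R).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > ref.shape[0] // 2: dy -= ref.shape[0]   # map wrap-around indices to signed shifts
        if dx > ref.shape[1] // 2: dx -= ref.shape[1]
        return dy, dx

    rng = np.random.default_rng(5)
    reference = rng.random((128, 128))                     # stand-in for a simulated reference pattern
    collected = np.roll(reference, (3, -7), axis=(0, 1))   # "deformed" pattern region
    print(xcorr_shift(reference, collected))               # -> (3, -7)
    ```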

  8. Axisymmetric Plume Simulations with NASA's DSMC Analysis Code

    NASA Technical Reports Server (NTRS)

    Stewart, B. D.; Lumpkin, F. E., III

    2012-01-01

    A comparison of axisymmetric Direct Simulation Monte Carlo (DSMC) Analysis Code (DAC) results to analytic and Computational Fluid Dynamics (CFD) solutions in the near continuum regime and to 3D DAC solutions in the rarefied regime for expansion plumes into a vacuum is performed to investigate the validity of the newest DAC axisymmetric implementation. This new implementation, based on the standard DSMC axisymmetric approach where the representative molecules are allowed to move in all three dimensions but are rotated back to the plane of symmetry by the end of the move step, has been fully integrated into the 3D-based DAC code and therefore retains all of DAC's features, such as being able to compute flow over complex geometries and to model chemistry. Axisymmetric DAC results for a spherically symmetric isentropic expansion are in very good agreement with a source flow analytic solution in the continuum regime and show departure from equilibrium downstream of the estimated breakdown location. Axisymmetric density contours also compare favorably against CFD results for the R1E thruster while temperature contours depart from equilibrium very rapidly away from the estimated breakdown surface. Finally, axisymmetric and 3D DAC results are in very good agreement over the entire plume region and, as expected, this new axisymmetric implementation shows a significant reduction in computer resources required to achieve accurate simulations for this problem over the 3D simulations.
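    The standard axisymmetric DSMC move-and-rotate step mentioned above can be sketched as follows (a generic illustration, not DAC source code): move the representative molecule in all three dimensions, then rotate its position back to the symmetry plane and rotate the velocity vector by the same angle.

    ```python
    import numpy as np

    def move_and_rotate(pos, vel, dt):
        """Advance a particle ballistically, then rotate it back to the z = 0 symmetry plane.
        The axis of symmetry is x; (y, z) are the transverse coordinates."""
        x, y, z = pos + vel * dt                   # free-flight move in full 3-D
        u, v, w = vel
        r = np.hypot(y, z)                         # radial distance from the axis after the move
        if r > 0.0:
            c, s = y / r, z / r                    # cosine/sine of the out-of-plane angle
            v, w = c * v + s * w, -s * v + c * w   # rotate the velocity with the particle
        return np.array([x, r, 0.0]), np.array([u, v, w])

    pos = np.array([0.0, 0.10, 0.0])
    vel = np.array([300.0, 50.0, 80.0])            # m/s, arbitrary demonstration values
    print(move_and_rotate(pos, vel, 1.0e-5))
    ```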

  9. Passivhaus: indoor comfort and energy dynamic analysis.

    NASA Astrophysics Data System (ADS)

    Guida, Antonella; Pagliuca, Antonello; Cardinale, Nicola; Rospi, Gianluca

    2013-04-01

    The research aims to verify the energy performance as well as the indoor comfort of an energy class A+ building, built so that the sum of the passive heat contributions of solar radiation transmitted through the windows and the heat generated inside the building is adequate to compensate for the envelope losses during the cold season. The building, located in Emilia Romagna (Italy), was built using a wooden structure, an envelope realized with pinewood sandwich panels (transmittance U = 0.250 W/m2K) with an internal wool-flax insulation layer, and thermal window frames with low-emissivity glass (U = 0.524 W/m2K). The building design and construction process followed the guidelines set by "CasaClima". The building has been modeled in the dynamic calculation code "Energy Plus" through the Design Builder application and divided into homogeneous thermal zones, characterized by a winter indoor temperature set at 20 °C (±1 °C) and a summer indoor temperature set at 26 °C (±1 °C). The model includes the envelope, as described above, the "free" heat contributions, the air conditioning system, the mechanical ventilation system, and the home automation solutions. The air conditioning system is a heat pump, able to guarantee an optimization of energy consumption (in fact, it uses the "free" heat offered by the external environment for conditioning the indoor environment). As regards the air recirculation system, a mechanical ventilation system with an internal cross-flow heat exchanger, with an efficiency equal to 50%, has been used. The home automation (domotic) solutions, instead, consist of a system for the control of external window screening using reeds, adjustable as a function of incident solar radiation, and a lighting management system adjusted automatically using a dimmer. A building realized in this way meets the requirements imposed by the Italian standards UNI/TS 11300-1, UNI/TS 11300-2 and UNI/TS 11300-3. The analysis was performed according to two different configurations: in "spontaneous

  10. Vibrational energy flow in the villin headpiece subdomain: Master equation simulations

    SciTech Connect

    Leitner, David M. E-mail: stock@physik.uni-freiburg.de; Buchenberg, Sebastian; Brettel, Paul; Stock, Gerhard E-mail: stock@physik.uni-freiburg.de

    2015-02-21

    We examine vibrational energy flow in dehydrated and hydrated villin headpiece subdomain HP36 by master equation simulations. Transition rates used in the simulations are obtained from communication maps calculated for HP36. In addition to energy flow along the main chain, we identify pathways for energy transport in HP36 via hydrogen bonding between residues quite far in sequence space. The results of the master equation simulations compare well with all-atom non-equilibrium simulations to about 1 ps following initial excitation of the protein, and quite well at long times, though for some residues we observe deviations between the master equation and all-atom simulations at intermediate times from about 1–10 ps. Those deviations are less noticeable for hydrated than dehydrated HP36 due to energy flow into the water.
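    To make the master-equation machinery concrete, here is a small generic sketch (not the HP36 communication-map rates, which are not given in the record): a four-site chain with one long-range "hydrogen-bond-like" shortcut, propagated as dP/dt = K P with a matrix exponential.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # k[i, j] = energy-transfer rate from site j to site i, in ps^-1 (assumed toy values)
    k = np.zeros((4, 4))
    for i, j, rate in [(1, 0, 1.0), (0, 1, 0.8), (2, 1, 0.6), (1, 2, 0.6),
                       (3, 2, 1.0), (2, 3, 0.8), (3, 0, 0.3), (0, 3, 0.3)]:
        k[i, j] = rate                       # the (0, 3) pair plays the role of the long-range shortcut

    K = k - np.diag(k.sum(axis=0))           # master-equation generator: dP/dt = K P
    P0 = np.array([1.0, 0.0, 0.0, 0.0])      # all excess vibrational energy starts on site 0

    for t in (0.5, 1.0, 5.0, 20.0):          # times in ps
        print(t, np.round(expm(K * t) @ P0, 3))
    ```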

  11. Probability theory versus simulation of petroleum potential in play analysis

    USGS Publications Warehouse

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
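
    The contrast between the analytic and Monte Carlo treatments can be illustrated with a toy single-play model: resource = Bernoulli(presence) times a lognormal size-if-present. The parameter values are illustrative, not the USGS assessment inputs.

      import numpy as np

      p = 0.35                                  # probability the play is productive
      mu, sigma = 4.0, 1.0                      # lognormal parameters of size if present

      # Analytic conditional moments (closed form)
      m1 = np.exp(mu + 0.5 * sigma**2)          # E[size | present]
      m2 = np.exp(2 * mu + 2 * sigma**2)        # E[size^2 | present]
      mean_a = p * m1
      var_a = p * m2 - (p * m1) ** 2
      print(f"analytic:    mean = {mean_a:.1f}, sd = {np.sqrt(var_a):.1f}")

      # Monte Carlo check
      rng = np.random.default_rng(10)
      n = 200_000
      present = rng.random(n) < p
      size = rng.lognormal(mu, sigma, n) * present
      print(f"Monte Carlo: mean = {size.mean():.1f}, sd = {size.std():.1f}")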

  12. Analysis of forest structure using thematic mapper simulator data

    NASA Technical Reports Server (NTRS)

    Peterson, D. L.; Westman, W. E.; Brass, J. A.; Stephenson, N. J.; Ambrosia, V. G.; Spanner, M. A.

    1986-01-01

    The potential of Thematic Mapper Simulator (TMS) data for sensing forest structure information has been explored by principal components and feature selection techniques. In a survey of forest structural properties conducted for 123 field sites of the Sequoia National Park, the canopy closure could be well estimated (r = 0.62 to 0.69) by a variety of channel bands and band ratios, without reference to the forest type. Estimation of the basal area was less successful (r = 0.51 or less) on the average, but could be improved for certain forest types when data were stratified by floristic composition. To achieve such a stratification, individual sites were ordinated by a detrended correspondence analysis based on the canopy of dominant species. The analysis of forest structure in the Sequoia data suggests that total basal area can be best predicted in stands of lower density, and in younger even-aged managed stands.

  13. Functional analysis of the binding model of microbial inulinases using docking and molecular dynamics simulation.

    PubMed

    Singh, Puneet Kumar; Joseph, Josmi; Goyal, Sukriti; Grover, Abhinav; Shukla, Pratyoosh

    2016-04-01

    Recently inulinase has regained interest due to its usage in the production of fructooligosaccharides and biofuels, and in pharmaceutical industries. Inulinase properties have been reported experimentally in numerous studies, but their characteristics are only partially explained by a few computational investigations. In the present study we have investigated exoinulinase and endoinulinase from different microbial sources with respect to their catalytic activity. Docking and molecular dynamics (MD) simulations were carried out for microbial endoinulinase and exoinulinase docked with 1-kestose and fructose-6-phosphate, respectively. The Pseudomonas mucidolens inulinase docked with fructose-6-phosphate showed the most favorable binding energy (-7.42 kcal mol(-1)); it formed hydrogen bonds with fructose-6-phosphate involving arginine 286, tryptophan 158, and isoleucine 87. After the simulation only tryptophan 158 remained bonded, and additionally valine 156 formed hydrogen bonds with fructose-6-phosphate. The Aspergillus niger enzyme docked with 1-kestose was bonded through threonine 271, aspartate 285, threonine 288, and proline 283; of these, aspartate 285 was retained until the end of the simulation. The present study thus provides a binding analysis of microbial inulinases. PMID:26956120

  14. Multiscale dynamical analysis of a high-resolution numerical model simulation of the Solomon Sea circulation

    NASA Astrophysics Data System (ADS)

    Djath, Bughsin'; Verron, Jacques; Melet, Angelique; Gourdeau, Lionel; Barnier, Bernard; Molines, Jean-Marc

    2014-09-01

    A high-resolution (1/36°) numerical model is used to study the ocean circulation in the Solomon Sea. An evaluation of the model against the few available observations shows that the 1/36° resolution model realistically simulates the Solomon Sea circulation. The model notably reproduces the high levels of mesoscale eddy activity observed in the Solomon Sea. Compared with previous simulations at 1/12° resolution, the average eddy kinetic energy levels are increased by up to ˜30-40% in the present 1/36° simulation, and the enhancement extends to depth. At the surface, the eddy kinetic energy level is maximum in March-April-May and minimum in December-January-February. The high subsurface variability is related to the variability of the western boundary current (New Guinea Coastal Undercurrent). Moreover, the emergence of submesoscales is clearly apparent in the present simulations. A spectral analysis is conducted in order to characterize the modeled submesoscale dynamics and to provide a spectral view of scale interactions. The corresponding spectral slopes show strong consistency with Surface Quasi-Geostrophic turbulence theory.
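
    The spectral-slope diagnostic mentioned above amounts to fitting a power law to a wavenumber spectrum. A minimal sketch follows; the synthetic red-noise signal and the fitted wavenumber band are placeholders for a model velocity or SSH section.

      import numpy as np

      rng = np.random.default_rng(9)
      n, dx = 1024, 2.0e3                                   # samples, grid spacing (m)
      signal = np.cumsum(rng.standard_normal(n))            # red-noise placeholder
      signal -= signal.mean()

      spec = np.abs(np.fft.rfft(signal)) ** 2               # power spectrum
      k = np.fft.rfftfreq(n, d=dx)                          # wavenumber (cycles/m)
      mask = (k > 1e-5) & (k < 1e-4)                        # band chosen for the fit
      slope, _ = np.polyfit(np.log(k[mask]), np.log(spec[mask]), 1)
      print(f"spectral slope over the fitted band: {slope:.2f}")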

  15. Statistical simulation of internal energy exchange in shock waves using explicit transition probabilities

    NASA Astrophysics Data System (ADS)

    Torres, Erik; Magin, Thierry

    2012-11-01

    A statistical model originally developed for electronic-translational energy transfer in atoms having multiple electronic states (Anderson et al., RGD15, 1986) is applied to the study of internal energy exchange in a polyatomic gas. The model is well-suited for gas kinetic simulations, because it provides an explicit expression for the transition probabilities between internal energy levels. All molecules possessing a given internal energy level are treated as a separate chemical species and all collisions involving exchange of internal energy thus become pseudo-chemical reactions. Post-collision energy levels of the two partners are determined by conserving the total energy of the collision pair and taking into account detailed balance. In the present work, DSMC simulations of relaxation in a stationary gas are performed and compared to those obtained by Anderson et al. Additionally, we apply the model to the simulation of rotational relaxation behind a normal shock wave.

  16. Theoretical analysis and simulation for a facilitated asymmetric exclusion process.

    PubMed

    Hao, Qing-Yi; Chen, Zhe; Sun, Xiao-Yan; Liu, Bing-Bing; Wu, Chao-Yun

    2016-08-01

    Driven diffusive systems are important models in nonequilibrium state statistical mechanics. This paper studies an asymmetric exclusion process model with nearest rear neighbor interactions associated with energy. The exact flux expression of the model is obtained by a cluster mean-field method. Based on the flux expression, the properties of the fundamental diagram have been investigated in detail. To probe the energy's influence on the coarsening process of the system, Monte Carlo simulations are carried out to acquire the monotonic phase boundary in energy-density space. Above the phase boundary, the system is inhomogeneous and the normalized residence distribution p(s) is nonmonotonically decreasing. Under the phase boundary, the system is homogeneous and p(s) is monotonically decreasing. Further study comparatively shows that the system has turned into a microscopic inhomogeneous state from a homogeneous state before the system current arrives at maximum, if nearest rear neighbor interactions are strong. Our findings offer insights to deeply understand the dynamic features of nonequilibrium state systems. PMID:27627252
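
    A minimal Monte Carlo sketch of a facilitated exclusion process of the kind studied above is given below: a particle hops to an empty right neighbor, with a rate enhanced when its rear neighbor is occupied (an energy-like coupling). The rates, lattice size and coupling strength are illustrative, not those analyzed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      L, density, attempts = 200, 0.3, 200_000
      eps = 0.5                                    # rear-neighbor coupling strength
      sites = np.zeros(L, dtype=int)
      sites[rng.choice(L, int(density * L), replace=False)] = 1

      hops = 0
      for _ in range(attempts):
          i = rng.integers(L)                      # random-sequential update
          j = (i + 1) % L                          # periodic boundary
          if sites[i] == 1 and sites[j] == 0:
              rate = np.exp(eps) if sites[(i - 1) % L] else 1.0
              if rng.random() < rate / np.exp(eps):    # normalize to a probability
                  sites[i], sites[j] = 0, 1
                  hops += 1
      print("accepted hops per attempted move:", hops / attempts)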

  17. Theoretical analysis and simulation for a facilitated asymmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Hao, Qing-Yi; Chen, Zhe; Sun, Xiao-Yan; Liu, Bing-Bing; Wu, Chao-Yun

    2016-08-01

    Driven diffusive systems are important models in nonequilibrium state statistical mechanics. This paper studies an asymmetric exclusion process model with nearest rear neighbor interactions associated with energy. The exact flux expression of the model is obtained by a cluster mean-field method. Based on the flux expression, the properties of the fundamental diagram have been investigated in detail. To probe the energy's influence on the coarsening process of the system, Monte Carlo simulations are carried out to acquire the monotonic phase boundary in energy-density space. Above the phase boundary, the system is inhomogeneous and the normalized residence distribution p(s) is nonmonotonically decreasing. Under the phase boundary, the system is homogeneous and p(s) is monotonically decreasing. Further study comparatively shows that the system has turned into a microscopic inhomogeneous state from a homogeneous state before the system current arrives at maximum, if nearest rear neighbor interactions are strong. Our findings offer insights to deeply understand the dynamic features of nonequilibrium state systems.

  18. Stochastic algorithms for the analysis of numerical flame simulations

    SciTech Connect

    Bell, John B.; Day, Marcus S.; Grcar, Joseph F.; Lijewski, Michael J.

    2004-04-26

    Recent progress in simulation methodologies and high-performance parallel computers has made it possible to perform detailed simulations of multidimensional reacting flow phenomena using comprehensive kinetics mechanisms. As simulations become larger and more complex, it becomes increasingly difficult to extract useful information from the numerical solution, particularly regarding the interactions of the chemical reaction and diffusion processes. In this paper we present a new diagnostic tool for analysis of numerical simulations of reacting flow. Our approach is based on recasting an Eulerian flow solution in a Lagrangian frame. Unlike a conventional Lagrangian viewpoint that follows the evolution of a volume of the fluid, we instead follow specific chemical elements, e.g., carbon, nitrogen, etc., as they move through the system. From this perspective an "atom" is part of some molecule of a species that is transported through the domain by advection and diffusion. Reactions cause the atom to shift from one chemical host species to another, and the subsequent transport of the atom is given by the movement of the new species. We represent these processes using a stochastic particle formulation that treats advection deterministically and models diffusion and chemistry as stochastic processes. In this paper, we discuss the numerical issues in detail and demonstrate that an ensemble of stochastic trajectories can accurately capture key features of the continuum solution. The capabilities of this diagnostic are then demonstrated by applications to study the modulation of carbon chemistry during a vortex-flame interaction, and the role of cyano chemistry in NOx production for a steady diffusion flame.
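
    The "follow an atom" idea can be sketched in a few lines: advection is deterministic, diffusion is a Brownian increment, and chemistry is a random switch of the host species. The velocity field, diffusivities and switching rate below are illustrative placeholders, not quantities from the flame simulations.

      import numpy as np

      rng = np.random.default_rng(2)
      dt, nsteps = 1e-4, 1000
      D = {"CH4": 2e-5, "CO2": 1.5e-5}           # m^2/s, per host species (placeholder)
      k_switch = 50.0                            # 1/s, CH4 -> CO2 pseudo-reaction rate

      x = np.zeros(2)                            # atom position
      host = "CH4"
      for _ in range(nsteps):
          u = np.array([1.0, 0.2])               # advection velocity (placeholder)
          x += u * dt                            # deterministic advection
          x += np.sqrt(2 * D[host] * dt) * rng.standard_normal(2)   # diffusion step
          if host == "CH4" and rng.random() < 1 - np.exp(-k_switch * dt):
              host = "CO2"                       # stochastic chemistry step
      print("final position:", x, "final host species:", host)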

  19. Regional Scale Analysis of Extremes in an SRM Geoengineering Simulation

    NASA Astrophysics Data System (ADS)

    Muthyala, R.; Bala, G.

    2014-12-01

    Only a few studies in the past have investigated the statistics of extreme events under geoengineering. In this study, a global climate model is used to investigate the impact of solar radiation management on extreme precipitation events at the regional scale. The solar constant was reduced by 2.25% to counteract the global mean surface temperature change caused by a doubling of CO2 (2XCO2) from its preindustrial control value. Using daily precipitation rates, extreme events are defined as those which exceed the 99.9th percentile precipitation threshold. Extremes are substantially reduced in the geoengineering simulation: the magnitude of change is much smaller than in the simulation with doubled CO2. A regional analysis over 22 Giorgi land regions is also performed. Doubling of CO2 leads to an increase in the intensity of extreme (99.9th percentile) precipitation of 17.7% on a global-mean basis, with the maximum increase in intensity (37%) over the South Asian region. In the geoengineering simulation, there is a global-mean reduction in intensity of 3.8%, with a maximum reduction of 8.9% over the Tropical Ocean. Further, we find that the doubled-CO2 simulation shows an increase in the frequency of extremes (>50 mm/day) of 50-200%, with a global-mean increase of 80%. In contrast, in the geoengineered climate there is a decrease in the frequency of extreme events of 20% globally, with a larger decrease of 30% over the Tropical Ocean. In both climate states (2XCO2 and geoengineering) the change in "extremes" is always greater than the change in "means" over large domains. We conclude that changes in precipitation extremes are larger in the 2XCO2 scenario compared to the preindustrial climate, while extremes decline slightly in the geoengineered climate. We are also investigating the changes in extreme statistics for daily maximum and minimum temperature, evapotranspiration and vegetation productivity. Results will be presented at the meeting.
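
    The percentile-based definition of extremes used above can be sketched directly: compute the 99.9th-percentile daily precipitation threshold in a control run and compare exceedance intensity and frequency in a perturbed run. The synthetic daily data are placeholders for model output.

      import numpy as np

      rng = np.random.default_rng(8)
      ctrl = rng.gamma(shape=0.6, scale=4.0, size=50 * 365)   # mm/day, control-like
      pert = rng.gamma(shape=0.6, scale=4.6, size=50 * 365)   # mm/day, 2xCO2-like

      thresh = np.percentile(ctrl, 99.9)                      # extreme-event threshold
      ctrl_ext, pert_ext = ctrl[ctrl > thresh], pert[pert > thresh]
      print(f"threshold: {thresh:.1f} mm/day")
      print(f"intensity change: {100 * (pert_ext.mean() / ctrl_ext.mean() - 1):+.1f}%")
      print(f"frequency change: {100 * (pert_ext.size / ctrl_ext.size - 1):+.1f}%")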

  20. Biomass energy analysis for crop dehydration

    SciTech Connect

    Whittier, J.P.; Haase, S.G.; Quinn, M.W.

    1994-12-31

    In 1994, an agricultural processing facility was constructed in southern New Mexico for spice and herb dehydration. Annual operational costs are dominated by energy costs, due primarily to the energy intensity of dehydration. A feasibility study was performed to determine whether the use of biomass resources as a feedstock for a cogeneration system would be an economical option. The project location allowed access to unusual biomass feedstocks including cotton gin trash, pecan shells and in-house residues. A resource assessment of the immediate project area determined that approximately 120,000 bone dry tons of biomass feedstocks are available annually. Technology characterization for the plant energy requirements indicated that gasification systems offer fuel flexibility advantages over combustion systems, although vendor support and commercial experience are limited. Regulatory siting considerations introduce a level of uncertainty because there is no precedent in New Mexico for gasification technology and because vendors of commercial gasifiers have little experience operating such facilities or gathering emissions data. A public opinion survey indicated considerable support for renewable energy use and biomass energy utilization. However, the survey also revealed limited knowledge of biomass technologies and concerns regarding siting of a biomass facility within the geographic area. The economic analysis conducted for the study is based on equipment vendor quotations and indicates that there will be difficulty competing with current prices of natural gas.

  1. LOOS: an extensible platform for the structural analysis of simulations.

    PubMed

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms. PMID:19965179

  2. HEAT TRANSFER EXPERIMENTS AND ANALYSIS OF A SIMULATED HTS CABLE

    SciTech Connect

    Demko, J. A.; Duckworth, R. C.; Gouge, M. J.; Knoll, D.

    2010-01-01

    Long-length high temperature superconducting (HTS) cable projects, over 1 km, are being designed that are cooled by flowing liquid nitrogen. The compact counter-flow cooling arrangement, which has the supply and return streams in a single cryostat, offers several advantages including the smallest space requirement, the least heat load, and reduced cost since a return cryostat is not required. One issue in long-length HTS cable systems is the magnitude of the heat transfer radially through the cable. It is extremely difficult to instrument an HTS cable in service on the grid with the needed thermometry because of the issues associated with installing thermometers on high voltage components. A 5-meter long test system has been built that simulates a counter-flow cooled HTS cable using a heated tube to simulate the cable. Measurements of the temperatures in the flow stream and on the tube wall can be made and compared to analysis. These data can be used to benchmark different HTS cable heat transfer and fluid flow analysis approaches.

  3. Heat Transfer Experiments and Analysis of a Simulated HTS

    SciTech Connect

    Demko, Jonathan A; Duckworth, Robert C; Gouge, Michael J; Knoll, David

    2010-01-01

    Long-length high temperature superconducting (HTS) cable projects, over 1 km, are being designed that are cooled by flowing liquid nitrogen. The compact counter-flow cooling arrangement which has the supply and return stream in a single cryostat offers several advantages including smallest space requirement, least heat load, and reduced cost since a return cryostat is not required. One issue in long length HTS cable systems is the magnitude of the heat transfer radially through the cable. It is extremely difficult to instrument an HTS cable in service on the grid with the needed thermometry because of the issues associated with installing thermometers on high voltage components. A 5-meter long test system has been built that simulates a counter-flow cooled, HTS cable using a heated tube to simulate the cable. Measurements of the temperatures in the flow stream and on the tube wall are presented and compared to analysis. These data can be used to benchmark different HTS cable heat transfer and fluid flow analysis approaches.
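
    The basic stream energy balance used to interpret such measurements is short enough to sketch: the temperature rise of the liquid-nitrogen coolant over the test section follows from the absorbed heat load. The numbers below are assumed placeholders, not the test-rig values.

      # Counter-flow coolant energy balance for a heated test section.
      m_dot = 0.05          # kg/s, LN2 mass flow rate (assumed)
      cp = 2040.0           # J/(kg K), liquid nitrogen specific heat (approximate)
      q_per_m = 2.0         # W/m, radial heat load into the cable (assumed)
      length = 5.0          # m, heated test section

      dT = q_per_m * length / (m_dot * cp)
      print(f"coolant temperature rise over {length:.0f} m: {dT * 1000:.1f} mK")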

  4. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    PubMed

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. PMID:21500218
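
    A minimal usage sketch of the library's core objects (Universe, atom selections, trajectory iteration) is shown below, using the current MDAnalysis API; the topology and trajectory file names are hypothetical placeholders.

      import MDAnalysis as mda

      # Load a topology/trajectory pair (placeholder file names).
      u = mda.Universe("protein.psf", "trajectory.dcd")
      calphas = u.select_atoms("protein and name CA")   # CHARMM-style selection

      for ts in u.trajectory:
          # radius of gyration of the C-alpha selection at each frame
          print(ts.frame, calphas.radius_of_gyration())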

  5. MDAnalysis: A Toolkit for the Analysis of Molecular Dynamics Simulations

    PubMed Central

    Michaud-Agrawal, Naveen; Denning, Elizabeth J.; Woolf, Thomas B.; Beckstein, Oliver

    2011-01-01

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM’s powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU Public License from http://mdanalysis.googlecode.com. PMID:21500218

  6. Large-scale molecular dynamics simulation: Effect of polarization on thrombin-ligand binding energy.

    PubMed

    Duan, Li L; Feng, Guo Q; Zhang, Qing G

    2016-01-01

    Molecular dynamics (MD) simulations lasting 500 ns were performed in explicit water to investigate the effect of polarization on the binding of ligands to human α-thrombin, based on the standard nonpolarizable AMBER force field and the quantum-derived polarized protein-specific charge (PPC). The PPC includes the electronic polarization effect of the thrombin-ligand complex, which is absent in the standard force field. A detailed analysis and comparison of the MD simulation results with experimental data provided strong evidence that intra-protein and protein-ligand hydrogen bonds and the root-mean-square deviation of backbone atoms were significantly stabilized through electronic polarization. Specifically, two critical hydrogen bonds between thrombin and the ligand were broken at approximately 190 ns when the AMBER force field was used, and the number of intra-protein backbone hydrogen bonds was higher under PPC than under AMBER. The thrombin-ligand binding energy was computed using the molecular mechanics Poisson-Boltzmann surface area (MM/PBSA) method, and the result obtained using PPC was consistent with the experimental value. Because the hydrogen bonds were unstable, the binding affinity could not be predicted under the AMBER force field. Furthermore, the results of the present study revealed that the difference in the binding free energy between AMBER and PPC arises almost entirely from the electrostatic interaction. Thus, this study provides evidence that protein polarization is critical to accurately describe protein-ligand binding. PMID:27507430
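
    A minimal sketch of the standard MM/PBSA combination of energy terms is given below (not the authors' pipeline); the per-frame component energies and the entropy term are hypothetical placeholders.

      import numpy as np

      rng = np.random.default_rng(3)
      nframes = 500
      E_ele = rng.normal(-120.0, 5.0, nframes)   # kcal/mol, electrostatic component
      E_vdw = rng.normal(-45.0, 3.0, nframes)    # van der Waals component
      G_pb  = rng.normal(140.0, 6.0, nframes)    # polar solvation (Poisson-Boltzmann)
      G_sa  = rng.normal(-5.0, 0.5, nframes)     # nonpolar solvation (surface area)
      minus_TdS = 15.0                           # -T*dS entropy penalty (e.g. normal modes)

      # MM/PBSA estimate: ensemble average of the components plus the entropy term.
      dG_bind = (E_ele + E_vdw + G_pb + G_sa).mean() + minus_TdS
      print(f"MM/PBSA binding free energy estimate: {dG_bind:.1f} kcal/mol")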

  7. Large-scale molecular dynamics simulation: Effect of polarization on thrombin-ligand binding energy

    PubMed Central

    Duan, Li L.; Feng, Guo Q.; Zhang, Qing G.

    2016-01-01

    Molecular dynamics (MD) simulations lasting 500 ns were performed in explicit water to investigate the effect of polarization on the binding of ligands to human α-thrombin based on the standard nonpolarizable AMBER force field and the quantum-derived polarized protein-specific charge (PPC). The PPC includes the electronic polarization effect of the thrombin-ligand complex, which is absent in the standard force field. A detailed analysis and comparison of the results of the MD simulation with experimental data provided strong evidence that intra-protein, protein-ligand hydrogen bonds and the root-mean-square deviation of backbone atoms were significantly stabilized through electronic polarization. Specifically, two critical hydrogen bonds between thrombin and the ligand were broken at approximately 190 ns when AMBER force field was used and the number of intra-protein backbone hydrogen bonds was higher under PPC than under AMBER. The thrombin-ligand binding energy was computed using the molecular mechanics Poisson-Boltzmann surface area (MM/PBSA) method, and the results were consistent with the experimental value obtained using PPC. Because hydrogen bonds were unstable, it was failed to predict the binding affinity under the AMBER force field. Furthermore, the results of the present study revealed that differences in the binding free energy between AMBER and PPC almost comes from the electrostatic interaction. Thus, this study provides evidence that protein polarization is critical to accurately describe protein-ligand binding. PMID:27507430

  8. Hydrogen analysis depth calibration by CORTEO Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Moser, M.; Reichart, P.; Bergmaier, A.; Greubel, C.; Schiettekatte, F.; Dollinger, G.

    2016-03-01

    Hydrogen imaging with sub-μm lateral resolution and sub-ppm sensitivity has become possible with coincident proton-proton (pp) scattering analysis (Reichart et al., 2004). Depth information is evaluated from the energy sum signal with respect to the energy loss of both protons on their path through the sample. To first order, there is no angular dependence due to elastic scattering. To second order, a path length effect due to different energy loss on the paths of the protons causes an angular dependence of the energy sum. Therefore, the energy sum signal has to be de-convoluted depending on the matrix composition, i.e. mainly the atomic number Z, in order to get a depth-calibrated hydrogen profile. Although the path effect can be calculated analytically to first order, multiple scattering effects lead to significant deviations in the depth profile. Hence, in our new approach, we use the CORTEO Monte-Carlo code (Schiettekatte, 2008) in order to calculate the depth of a coincidence event depending on the scattering angle. The code takes the individual detector geometry into account. In this paper we show that the code correctly reproduces measured pp-scattering energy spectra with roughness effects considered. With Mylar-sandwich targets (Si, Fe, Ge) more than 100 μm thick, we demonstrate the deconvolution of the energy spectra on our current multistrip detector at the microprobe SNAKE at the Munich tandem accelerator lab. As a result, hydrogen profiles can be evaluated with an accuracy in depth of about 1% of the sample thickness.

  9. Simulation Development and Analysis of Crew Vehicle Ascent Abort

    NASA Technical Reports Server (NTRS)

    Wong, Chi S.

    2016-01-01

    Unlike the coursework I have taken thus far, which focuses on pure logic, simulation code focuses on mimicking the physical world with some approximation and can have inaccuracies or numerical instabilities. Learning from my mistakes, I adopted new methods to analyze these different simulations. One method I used was to numerically plot various physical parameters using MATLAB to confirm the mechanical behavior of the system, in addition to comparing the data to the output from a separate simulation tool called FAST. By having full control over what was being output from the simulation, I could choose which parameters to change and to plot as well as how to plot them, allowing for an in-depth analysis of the data. Another method of analysis was to convert the output data into a graphical animation. Unlike the numerical plots, where all of the physical components were displayed separately, this graphical display allows for a combined look at the simulation output that makes it much easier to see the physical behavior of the model. The process for converting SOMBAT output for EDGE graphical display had to be developed. With some guidance from other EDGE users, I developed a process and created a script that would easily allow one to display simulations graphically. Another limitation with the SOMBAT model was the inability for the capsule to have the main parachutes instantly deployed with a large angle between the airspeed vector and the chute drag vector. To explore this problem, I had to learn about the different coordinate frames used in Guidance, Navigation & Control (J2000, ECEF, ENU, etc.) to describe the motion of a vehicle, and about Euler angles (e.g. roll, pitch, yaw) to describe the orientation of the vehicle. With a thorough explanation from my mentor about each coordinate frame, as well as how to use a direction cosine matrix to transform from one frame to another, I investigated the problem by simulating different capsule orientations. In the end
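
    As an illustration of the frame transformations mentioned above, a standard aerospace 3-2-1 (yaw-pitch-roll) direction cosine matrix can be built in a few lines. This is a generic sketch, not code from SOMBAT or EDGE; the angle values are arbitrary examples.

      import numpy as np

      def dcm_321(roll, pitch, yaw):
          # 3-2-1 rotation sequence: yaw about z, pitch about y, roll about x.
          cr, sr = np.cos(roll), np.sin(roll)
          cp, sp = np.cos(pitch), np.sin(pitch)
          cy, sy = np.cos(yaw), np.sin(yaw)
          Rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
          Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
          Rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
          return Rx @ Ry @ Rz        # maps a vector from the reference frame
                                     # into the body frame

      v_ref = np.array([1.0, 0.0, 0.0])
      v_body = dcm_321(np.radians(5), np.radians(10), np.radians(30)) @ v_ref
      print(v_body)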

  10. Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data

    SciTech Connect

    Agnese, R.

    2015-03-30

    We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from 210Pb decay-chain events, while independent calibration data are used to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we also perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. Finally, we confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs but rather reflects the inadequacy of their background model.
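
    A minimal sketch of an unbinned maximum likelihood fit for a signal fraction is shown below. The spectral shapes, parameters and generated data are illustrative placeholders, not the CDMS II background or signal models.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm, expon

      rng = np.random.default_rng(7)
      bkg = expon(scale=5.0).rvs(870, random_state=rng)        # keV, exponential background
      sig = norm(loc=3.0, scale=0.5).rvs(30, random_state=rng) # keV, Gaussian "signal"
      data = np.concatenate([bkg, sig])

      def neg_log_like(params):
          f_sig = np.clip(params[0], 0.0, 1.0)                 # signal fraction
          pdf = (f_sig * norm.pdf(data, 3.0, 0.5)
                 + (1 - f_sig) * expon.pdf(data, scale=5.0))
          return -np.sum(np.log(pdf + 1e-300))

      res = minimize(neg_log_like, x0=[0.1], bounds=[(0.0, 1.0)])
      print("fitted signal fraction:", res.x[0])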

  11. Dynamical energy analysis for built-up acoustic systems at high frequencies.

    PubMed

    Chappell, D J; Giani, S; Tanner, G

    2011-09-01

    Standard methods for describing the intensity distribution of mechanical and acoustic wave fields in the high frequency asymptotic limit are often based on flow transport equations. Common techniques are statistical energy analysis, employed mostly in the context of vibro-acoustics, and ray tracing, a popular tool in architectural acoustics. Dynamical energy analysis makes it possible to interpolate between standard statistical energy analysis and full ray tracing, containing both of these methods as limiting cases. In this work a version of dynamical energy analysis based on a Chebyshev basis expansion of the Perron-Frobenius operator governing the ray dynamics is introduced. It is shown that the technique can efficiently deal with multi-component systems overcoming typical geometrical limitations present in statistical energy analysis. Results are compared with state-of-the-art hp-adaptive discontinuous Galerkin finite element simulations. PMID:21895083

  12. Analysis on Turbulent Flows using Large-eddy Simulation on the Seaside Complex Terrain

    NASA Astrophysics Data System (ADS)

    Kamio, T.; Iida, M.; Arakawa, C.

    2014-12-01

    The purpose of this study is large-eddy simulation (LES) of turbulent wind over complex terrain; the first results of the simulation are described. The authors applied an LES code, developed as an atmospheric simulator at the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), to wind prediction for wind energy. In such wind simulations the main difficulty is the boundary conditions, and the case in this paper uses simplified ones. The case study considers westerly wind at a complex-terrain site, which for this site is the wind coming from the sea. A steady flow was employed as the inlet condition because wind over the sea has low turbulence, so nearly all of the turbulence is generated by the roughness of the ground surface. A wall function was employed as the boundary condition at the ground surface. The computational domain size was about 8 × 3 × 2.5 km3, and the minimum cell size was about 10 × 10 × 3 m3. The computed vertical profiles of mean wind speed and turbulence intensity agreed with measurements from the meteorological masts. Moreover, the authors analyzed the turbulence characteristics: power spectral density and cross-spectrum analyses provided insight into the turbulence over the complex terrain and guidance for the domain and grid of the numerical analysis.

  13. 76 FR 64931 - Building Energy Codes Cost Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... of Energy Efficiency and Renewable Energy Building Energy Codes Cost Analysis AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of reopening the public... on September 13, 2011. 76 FR 56413. The original comment period closed on October 13, 2011....

  14. Arcing in Leo and Geo Simulated Environments: Comparative Analysis

    NASA Technical Reports Server (NTRS)

    Vayner, Boris V.; Ferguson, Dale C.; Galofaro, Joel T.

    2006-01-01

    Comprehensive tests of two solar array samples in simulated Low Earth Orbit (LEO) and Geosynchronous Orbit (GEO) environments have demonstrated that the arc inception voltage was 2-3 times lower in the LEO plasma than in the GEO vacuum. Arc current pulse wave forms are also essentially different in these environments. Moreover, the wide variation of pulse forms does not allow a "standard arc wave form" to be defined even for GEO conditions. Visual inspection of the samples after testing in a GEO environment revealed considerable damage on coverglass surfaces and interconnects. These harmful consequences can be explained by the discharge energy being one order of magnitude higher in vacuum than in background plasma. The tests also revealed a potential danger of powerful electrostatic discharges that could be initiated on the solar array surface of a satellite in GEO during the ignition of an arcjet thruster.

  15. Two-phase/two-phase heat exchanger simulation analysis

    NASA Technical Reports Server (NTRS)

    Kim, Rhyn H.

    1992-01-01

    The capillary pumped loop (CPL) system is one of the most desirable devices for dissipating heat energy in the radiation environment of the Space Station while providing relatively easy control of temperature. A condenser, a component of the CPL system, is linked with a buffer evaporator in the form of the annulus section of a double-tube heat exchanger arrangement: the concentric core of the double tube is the condenser, while the annulus section is used as a buffer between the conditioned space and the surrounding radiation environment but works as an evaporator. A CPL system with this type of condenser is modeled to simulate its function numerically. Preliminary results for temperature variations of the system are shown, and further investigations are suggested for improvement.

  16. Monte Carlo Simulation of Heavy Nuclei Photofission at Intermediate Energies

    SciTech Connect

    Andrade-II, E.; Freitas, E.; Garcia, F.; Tavares, O. A. P.; Duarte, S. B.

    2009-06-03

    A detailed description of the photofission process at intermediate energies (200 to 1000 MeV) is presented. The study of the reaction is performed by a Monte Carlo method which allows the investigation of properties of residual nuclei and fissioning nuclei. The information obtained indicates that multifragmentation is negligible at the photon energies studied here, and that symmetric fission is dominant. Energy and mass distributions of residual and fissioning nuclei were calculated.

  17. Global sensitivity analysis in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Tsvetkova, O.; Ouarda, T. B.

    2012-12-01

    Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, or output variable. It also provides ways to calculate explicit measures of the importance of input variables (first-order and total-effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study a methodology for conducting global SA of wind energy assessment at a planning stage is proposed. Three sampling strategies which are part of the SA procedure were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS) and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified by ranking the total-effect sensitivity indices. The results of the present
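
    A minimal sketch of Monte Carlo estimation of first-order and total-effect Sobol' indices (standard Saltelli/Jansen estimators) is shown below for a toy annual-energy model. The model and input ranges are illustrative placeholders, not the Masdar City case described above.

      import numpy as np

      rng = np.random.default_rng(4)

      def model(x):
          c, avail, losses = x[:, 0], x[:, 1], x[:, 2]
          return avail * (1.0 - losses) * c**3          # crude energy proxy

      names = ["Weibull scale", "availability", "losses"]
      lo = np.array([6.0, 0.90, 0.05])
      hi = np.array([9.0, 0.99, 0.15])
      N, d = 20_000, 3

      A = lo + (hi - lo) * rng.random((N, d))           # two independent sample matrices
      B = lo + (hi - lo) * rng.random((N, d))
      fA, fB = model(A), model(B)
      varY = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                            # replace column i of A with B's
          fABi = model(ABi)
          S1 = np.mean(fB * (fABi - fA)) / varY          # first-order index
          ST = 0.5 * np.mean((fA - fABi) ** 2) / varY    # total-effect index
          print(f"{names[i]:14s}  S1 = {S1:5.2f}   ST = {ST:5.2f}")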

  18. Stochastic Simulations and Sensitivity Analysis of Plasma Flow

    SciTech Connect

    Lin, Guang; Karniadakis, George E.

    2008-08-01

    For complex physical systems with a large number of random inputs, it is very expensive to perform stochastic simulations for all of the random inputs. Stochastic sensitivity analysis is introduced in this paper to rank the significance of random inputs, providing information on which random input has more influence on the system outputs and on the coupling or interaction effects among different random inputs. There are two types of numerical methods in stochastic sensitivity analysis: local and global methods. The local approach, which relies on a partial derivative of output with respect to parameters, is used to measure the sensitivity around a local operating point. When the system has strong nonlinearities and parameters fluctuate within a wide range from their nominal values, the local sensitivity does not provide full information to the system operators. In contrast, the global approach examines the sensitivity over the entire range of the parameter variations. Global screening methods, based on One-At-a-Time (OAT) perturbation of parameters, rank the significant parameters and identify their interactions among a large number of parameters. Several screening methods have been proposed in the literature, e.g., the Morris method, Cotter's method, factorial experimentation, and iterated fractional factorial design. In this paper, the Morris method, the Monte Carlo sampling method, the Quasi-Monte Carlo method and a collocation method based on sparse grids are studied. Additionally, two MHD examples are presented to demonstrate the capability and efficiency of the stochastic sensitivity analysis, which can be used as a pre-screening technique for reducing the dimensionality and hence the cost of stochastic simulations.
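
    A simplified sketch of Morris-style OAT screening follows: build random trajectories, compute elementary effects, and report the mean absolute effect (mu*) and its standard deviation for each input. The test function is an arbitrary placeholder, not one of the MHD examples, and continuous steps are used instead of the usual grid levels.

      import numpy as np

      rng = np.random.default_rng(5)

      def f(x):
          return x[0] + 2.0 * x[1] ** 2 + x[0] * x[2]    # toy nonlinear model

      d, r, delta = 3, 50, 0.25                          # inputs, trajectories, step size
      EE = np.zeros((r, d))
      for t in range(r):
          x = rng.random(d) * (1.0 - delta)              # base point in the unit cube
          for i in rng.permutation(d):                   # perturb one input at a time
              x_new = x.copy()
              x_new[i] += delta
              EE[t, i] = (f(x_new) - f(x)) / delta       # elementary effect
              x = x_new
      mu_star = np.abs(EE).mean(axis=0)
      sigma = EE.std(axis=0)
      for i in range(d):
          print(f"x{i}: mu* = {mu_star[i]:.2f}  sigma = {sigma[i]:.2f}")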

  19. Global sensitivity analysis for DSMC simulations of hypersonic shocks

    NASA Astrophysics Data System (ADS)

    Strand, James S.; Goldstein, David B.

    2013-08-01

    Two global, Monte Carlo based sensitivity analyses were performed to determine which reaction rates most affect the results of Direct Simulation Monte Carlo (DSMC) simulations for a hypersonic shock in five-species air. The DSMC code was written and optimized with shock tube simulations in mind, and includes modifications to allow for the efficient simulation of a 1D hypersonic shock. The TCE model is used to convert Arrhenius-form reaction rate constants into reaction cross-sections, after modification to allow accurate modeling of reactions with arbitrarily large rates relative to the VHS collision rate. The square of the Pearson correlation coefficient was used as the measure for sensitivity in the first of the analyses, and the mutual information was used as the measure in the second. The quantity of interest (QoI) for these analyses was the NO density profile across a 1D shock at ˜8000 m/s (M∞ ≈ 23). This vector QoI was broken into a set of scalar QoIs, each representing the density of NO at a specific point downstream of the shock, and sensitivities were calculated for each scalar QoI based on both measures of sensitivity. Profiles of sensitivity vs. location downstream of the shock were then integrated to determine an overall sensitivity for each reaction. A weighting function was used in the integration in order to emphasize sensitivities in the region of greatest thermal and chemical non-equilibrium. Both sensitivity analysis methods agree on the six reactions which most strongly affect the density of NO. These six reactions are the N2 dissociation reaction N2 + N ⇄ 3N, the O2 dissociation reaction O2 + O ⇄ 3O, the NO dissociation reactions NO + N ⇄ 2N + O and NO + O ⇄ N + 2O, and the exchange reactions N2 + O ⇄ NO + N and NO + O ⇄ O2 + N. This analysis lays the groundwork for the application of Bayesian statistical methods for the calibration of parameters relevant to modeling a hypersonic shock layer with the DSMC method.
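
    The correlation-based sensitivity measure used above can be sketched in a few lines: sample the uncertain reaction rates, evaluate a scalar quantity of interest for each sample, and rank inputs by the squared Pearson correlation coefficient. The "simulation" below is a cheap stand-in for a DSMC run, with made-up coefficients.

      import numpy as np

      rng = np.random.default_rng(6)
      n_samples, n_rates = 500, 6
      rates = rng.lognormal(mean=0.0, sigma=0.3, size=(n_samples, n_rates))

      def qoi(r):
          # Placeholder for the NO density at a point behind the shock.
          return 0.8 * r[0] - 0.5 * r[3] + 0.1 * r[1] * r[4] + 0.05 * rng.normal()

      y = np.array([qoi(r) for r in rates])
      r2 = np.array([np.corrcoef(rates[:, i], y)[0, 1] ** 2 for i in range(n_rates)])
      for i in np.argsort(r2)[::-1]:
          print(f"reaction {i}: r^2 = {r2[i]:.3f}")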

  20. Research on simulation system with the wide range and high-precision laser energy characteristics

    NASA Astrophysics Data System (ADS)

    Dong, Ke-yan; Lou, Yan; He, Jing-yi; Tong, Shou-feng; Jiang, Hui-lin

    2012-10-01

    The hardware-in-the-loop (HWIL) simulation test is one of the important parts of the development and performance testing of semi-active laser-guided weapons. In order to obtain accurate results, a high-confidence representation of the target environment should be provided to the seeker during the HWIL simulation test, and one of the important simulation parameters is the laser energy characteristic. In this paper, based on semi-active laser guidance principles, an important simulation parameter that affects the confidence of the energy characteristics in HWIL performance testing is analyzed. Following the principle that the seeker should receive the same energy in the HWIL simulation as in practical application, an HWIL energy-characteristics simulation system with a crystal absorption structure was designed. On this basis, the optimal design of the optical system was also analyzed. The measured results show that the dynamic attenuation range of the system energy is greater than 50 dB, the dynamic attenuation stability is less than 5%, and the maximum energy changing rate driven by the servo motor is greater than 20 dB/s.